2015-04-13
cope with dynamic, online optimisation problems with uncertainty, we developed some powerful and sophisticated techniques for learning heuristics... (Performing organization: National ICT Australia (NICTA), Locked Bag 6016, Kensington...) Optimization solvers should learn to improve their performance over time. By learning both during the course of solving an optimization
Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall
2016-01-01
Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States' NGHGI do not specifically incorporate methods to address error in tree-scale biomass...
Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates
Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...
Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty
NASA Technical Reports Server (NTRS)
Mather, Janice L.; Taylor, Shawn C.
2015-01-01
In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
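The root-sum-square combination described above can be sketched in a few lines of Python; the component values below are hypothetical placeholders for illustration, not data from the report:

```python
import math

def total_uncertainty(components):
    """Combine independent error components via the root-sum-square method."""
    return math.sqrt(sum(c ** 2 for c in components))

# Hypothetical component values (all in the same relative units); these are
# placeholders for illustration, not values from the report.
resolution = 1.0
repeatability = 2.0
hysteresis = 0.5
drift = 0.5
calibration_standard = 1.5

u_total = total_uncertainty(
    [resolution, repeatability, hysteresis, drift, calibration_standard])
```

Because the components are squared before summing, the largest component (here repeatability) dominates the total, which mirrors the report's finding that precision contributes more than bias without making bias negligible.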
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yishen; Zhou, Zhi; Liu, Cong
2016-08-01
As more wind power and other renewable resources are integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factor and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty; however, these approaches require much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarez-Ramirez, J.; Aguilar, R.; Lopez-Isunza, F.
FCC processes involve complex interactive dynamics that make the units difficult to operate and control, as well as poorly known reaction kinetics. This work concerns the synthesis of temperature controllers for FCC units. The problem is addressed first for the case where perfect knowledge of the reaction kinetics is assumed, leading to an input-output linearizing state feedback. However, in most industrial FCC units, perfect knowledge of reaction kinetics and composition measurements is not available. To address the problem of robustness against uncertainties in the reaction kinetics, an adaptive model-based nonlinear controller with simplified reaction models is presented. The adaptive strategy makes use of estimates of uncertainties derived from calorimetric (energy) balances. The resulting controller is similar in form to standard input-output linearizing controllers and can be tuned analogously. Alternatively, the controller can be tuned using a single gain parameter and is computationally efficient. The performance of the closed-loop system and the controller design procedure are shown with simulations.
2011-01-01
(Performing organization: Rand Corporation, Arroyo Center, PO Box 2138, 1776 Main Street, Santa Monica, CA 90407-2138.) ...research, development, test, and evaluation programs; and those who are interested in the optimal allocation of funds among different programs and/or
Fuel cycle cost uncertainty from nuclear fuel cycle comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, J.; McNelis, D.; Yim, M.S.
2013-07-01
This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis: the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across modeling approaches, an observation consistent with the results of the sensitivity study for the discount rate. Two different data sets with uncertainty ranges for unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC; but depending on the uranium price, the FR cycle could have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
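The effect of including the time value of money in a fuel cycle cost comparison can be illustrated with a simple present-value calculation; the cash flows and discount rate below are hypothetical, not the study's data:

```python
def present_value(cash_flows, rate):
    """Discount a list of yearly cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical unit-cost stream (arbitrary $/year) for one fuel cycle option.
costs = [100.0, 100.0, 100.0, 100.0]

pv_no_discount = present_value(costs, 0.0)   # no time value of money
pv_discounted = present_value(costs, 0.05)   # with a 5% discount rate
```

Discounting shrinks the later cash flows, but because it shrinks them for every fuel cycle option in a similar way, the cost *ratios* between options change far less than the absolute costs, consistent with the paper's observation.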
Neoliberalism and Illusion: The Importance of Preparing Students to Live in the 21st Century
ERIC Educational Resources Information Center
Fitzner, Jennifer
2017-01-01
While this paper discusses the history of neoliberalism with emphasis on the role of the United States, it also addresses the challenges neoliberalism poses for individuals. Additionally, the paper discusses the failure of school curriculum to prepare youth in the United States for growing economic uncertainty, as well as media's role in hindering…
CODATA Fundamental Physical Constants
National Institute of Standards and Technology Data Gateway
SRD 121 NIST CODATA Fundamental Physical Constants (Web, free access) This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youngren, M.A.
1989-11-01
An analytic probability model of tactical nuclear warfare in the theater is presented in this paper. The model addresses major problems associated with representing nuclear warfare in the theater. Current theater representations of a potential nuclear battlefield are developed in the context of low-resolution, theater-level models or scenarios, which provide insufficient resolution in time and space for modeling a nuclear exchange. The model presented in this paper handles the spatial uncertainty in potentially targeted unit locations by proposing two-dimensional multivariate probability models for the actual and perceived locations of units subordinate to the major (division-level) units represented in theater scenarios. The temporal uncertainty in the activities of interest represented in our theater-level Force Evaluation Model (FORCEM) is handled through probability models of the acquisition and movement of potential nuclear target units.
Air Superiority by the Numbers: Cutting Combat Air Forces in a Time of Uncertainty
2014-06-01
(Performing organization: School of Advanced Air and Space Studies, Air University, Maxwell AFB.) ...interdiction role. Finally, in the midst of a growing Soviet radar-guided surface-to-air threat, "Skunk Works" developed the first stealth attack
Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment
Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.
2012-01-01
Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
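The propagation-of-variance idea in the abstract above can be sketched with a small Monte Carlo simulation: when a modeled exposure is (on a log scale) a sum of independent inputs, the input variances add, so the input with the largest variance, here the emission rate, dominates the output uncertainty. The standard deviations below are invented for this sketch, not taken from the paper:

```python
import random
import statistics

random.seed(42)

# Illustrative log-scale standard deviations for three independent inputs;
# the values are invented for this sketch, not from the paper.
SD_LOG_EMISSION = 1.0    # chemical emission rate (dominant contributor)
SD_LOG_HALFLIFE = 0.5    # biotransformation half-life
SD_LOG_PARTITION = 0.2   # partition coefficient

def sample_log_exposure():
    """One Monte Carlo draw of log-exposure as a sum of independent log-scale inputs."""
    return (random.gauss(0.0, SD_LOG_EMISSION)
            + random.gauss(0.0, SD_LOG_HALFLIFE)
            + random.gauss(0.0, SD_LOG_PARTITION))

samples = [sample_log_exposure() for _ in range(50_000)]
var_total = statistics.pvariance(samples)
# For independent inputs the variances add: 1.0 + 0.25 + 0.04 = 1.29,
# so the emission rate alone accounts for roughly 78% of the output variance.
```

This is why the paper finds emission rate and biotransformation half-life to be the leading uncertainty sources while partition coefficients contribute comparatively little.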
Crisis Speeches Delivered during World War II: A Historical and Rhetorical Perspective
ERIC Educational Resources Information Center
Ramos, Tomas E.
2010-01-01
Rhetorical analyses of speeches made by United States presidents and world leaders abound, particularly studies about addresses to nations in times of crisis. These are important because what presidents say amidst uncertainty and chaos defines their leadership in the eyes of the public. But with new forms of crisis rhetoric, our understanding of…
Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Christopher R. Webster; Brian J. Palik; Mark J. Twery; John B. Bradford; Linda R. Parker; Andrea T. Hille; Sheela M. Johnson
2011-01-01
Land managers across the country face the immense challenge of developing and applying appropriate management strategies as forests respond to climate change. We hosted a workshop to explore silvicultural strategies for addressing the uncertainties surrounding climate change and forest response in the northeastern and north-central United States. Outcomes of this...
Life cycle analysis of fuel production from fast pyrolysis of biomass.
Han, Jeongwoo; Elgowainy, Amgad; Dunn, Jennifer B; Wang, Michael Q
2013-04-01
A well-to-wheels (WTW) analysis of pyrolysis-based gasoline was conducted and compared with petroleum gasoline. To address the variation and uncertainty in the pyrolysis pathways, probability distributions for key parameters were developed with data from literature. The impacts of two different hydrogen sources for pyrolysis oil upgrading and of two bio-char co-product applications were investigated. Reforming fuel gas/natural gas for H2 reduces WTW GHG emissions by 60% (range of 55-64%) compared to the mean of petroleum fuels. Reforming pyrolysis oil for H2 increases the WTW GHG emissions reduction up to 112% (range of 97-126%), but reduces petroleum savings per unit of biomass used due to the dramatic decline in the liquid fuel yield. Thus, the hydrogen source causes a trade-off between GHG reduction per unit fuel output and petroleum displacement per unit biomass used. Soil application of biochar could provide significant carbon sequestration with large uncertainty. Copyright © 2013 Elsevier Ltd. All rights reserved.
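The reduction percentages quoted above follow from a simple relation to the petroleum baseline; a value above 100% means the pathway is net-negative. The intensities below are illustrative placeholders chosen to reproduce the quoted percentages, not the paper's underlying data:

```python
def wtw_reduction_pct(pathway_ghg, baseline_ghg):
    """Percent reduction in well-to-wheels GHG emissions relative to a baseline.
    Values above 100% imply net-negative emissions (e.g. via carbon sequestration)."""
    return 100.0 * (baseline_ghg - pathway_ghg) / baseline_ghg

# Hypothetical intensities in gCO2e/MJ; illustrative only, not the paper's data.
petroleum_baseline = 93.0
pyrolysis_fuel_gas_h2 = 37.2    # corresponds to a 60% reduction
pyrolysis_oil_h2 = -11.16       # net-negative: a 112% reduction
```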
2012-12-01
(Authors: Mark Tschopp, Fei Gao, and Xin Sun. Performing organization: U.S. Army Research Laboratory, ATTN: RDRL-WMM-F, Aberdeen Proving Ground.) References cited include: ...M. A.; Horstemeyer, M. F.; Gao, F.; Sun, X.; Khaleel, M. Scripta Materialia 2011, 64, 908. Plimpton, S. Journal of Computational Physics...
Tian, Zhen; Yuan, Jingqi; Zhang, Xiang; Kong, Lei; Wang, Jingcheng
2018-05-01
The coordinated control system (CCS) plays an important role in load regulation, efficiency optimization, and pollutant reduction for coal-fired power plants. The CCS faces tough challenges, such as wide-range load variation and various uncertainties and constraints. This paper aims to improve the load tracking ability and robustness of boiler-turbine units under wide-range operation. To capture the key dynamics of the ultra-supercritical boiler-turbine system, a nonlinear control-oriented model is developed based on mechanism analysis and model reduction techniques, and is validated against historical operating data from a real 1000 MW unit. To simultaneously address the issues of uncertainties and input constraints, a discrete-time sliding mode predictive controller (SMPC) is designed with a dual-mode control law. Moreover, the input-to-state stability and robustness of the closed-loop system are proved. Simulation results are presented to illustrate the effectiveness of the proposed control scheme, which achieves good tracking performance, disturbance rejection, and compatibility with input constraints. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Human Health Risk Assessment of Pharmaceuticals in Water: Issues and Challenges Ahead
Kumar, Arun; Chang, Biao; Xagoraraki, Irene
2010-01-01
This study identified existing issues related to quantitative pharmaceutical risk assessment (QPhRA, hereafter) for pharmaceuticals in water and proposed possible solutions by analyzing the methodologies and findings of different published QPhRA studies. Retrospective site-specific QPhRA studies from different parts of the world (U.S.A., United Kingdom, Europe, India, etc.) were reviewed in a structured manner to understand the different assumptions, outcomes, and issues identified, addressed, or raised by these studies. To date, most published studies have concluded that there is no appreciable risk to human health from environmental exposures to pharmaceuticals; however, attention is still required on the following identified issues: (1) use of measured versus predicted pharmaceutical concentrations, (2) identification of pharmaceuticals-of-concern and compounds needing special consideration, (3) use of source water versus finished drinking water exposure scenarios, (4) selection of representative exposure routes, (5) valuation of uncertainty factors, and (6) risk assessment for mixtures of chemicals. To close the existing data and methodology gaps, this study proposed possible ways to address and/or incorporate these considerations within the QPhRA framework; however, more research is still required on issues such as short-term to long-term extrapolation and mixture effects. Specifically, this study proposed the development of a new "mixture effects-related uncertainty factor" for mixtures of chemicals (i.e., mixUFcomposite), analogous to the uncertainty factor for a single chemical, within the QPhRA framework. In addition to the five traditionally used uncertainty factors, this factor is proposed to account for concentration effects arising from the range of concentration levels of pharmaceuticals in a mixture. However, further work is required to determine the values of all six uncertainty factors and to incorporate them in the estimation of point-of-departure values within the QPhRA framework. PMID:21139869
Evaluation of the Uncertainty in JP-7 Kinetics Models Applied to Scramjets
NASA Technical Reports Server (NTRS)
Norris, A. T.
2017-01-01
One of the challenges of designing and flying a scramjet-powered vehicle is the difficulty of preflight testing. Ground tests at realistic flight conditions introduce several sources of uncertainty to the flow that must be addressed. For example, the scales of the available facilities limit the size of vehicles that can be tested, and so performance metrics for larger flight vehicles must be extrapolated from ground tests at smaller scales. To create the correct flow enthalpy for higher Mach number flows, most tunnels use a heater that introduces vitiates into the flow. At these conditions, the effects of the vitiates on the combustion process are of particular interest to the engine designer, where the ground test results must be extrapolated to flight conditions. In this paper, the uncertainty of the cracked JP-7 chemical kinetics used in the modeling of a hydrocarbon-fueled scramjet was investigated. The factors identified as contributing to uncertainty in the combustion process were the level of flow vitiation, the uncertainty of the kinetic model coefficients, and the variation of flow properties between ground testing and flight. The method employed was to run simulations of small, unit problems and identify which variables were the principal sources of uncertainty for the mixture temperature. Then, using this resulting subset of all the variables, the effects of the uncertainty caused by the chemical kinetics on a representative scramjet flow-path for both vitiated (ground) and nonvitiated (flight) flows were investigated. The simulations showed that only a few of the kinetic rate equations contribute to the uncertainty in the unit problem results, and when applied to the representative scramjet flowpath, the resulting temperature variability was on the order of 100 K. Both the vitiated and clean air results showed very similar levels of uncertainty, and the differences between the mean properties were generally within the range of uncertainty predicted.
Task uncertainty and communication during nursing shift handovers.
Mayor, Eric; Bangerter, Adrian; Aribot, Myriam
2012-09-01
We explore variations in handover duration and communication in nursing units. We hypothesize that duration per patient is higher in units facing high task uncertainty. We expect both topics and functions of communication to vary depending on task uncertainty. Handovers are changing in modern healthcare organizations, where standardized procedures are increasingly advocated for efficiency and reliability reasons. However, redesign of handover should take environmental contingencies of different clinical unit types into account. An important contingency in institutions is task uncertainty, which may affect how communicative routines like handover are accomplished. Nurse unit managers of 80 care units in 18 hospitals were interviewed in 2008 about topics and functions of handover communication and duration in their unit. Interviews were content-analysed. Clinical units were classified into a theory-based typology (unit type) that gradually increases on task uncertainty. Quantitative analyses were performed. Unit type affected resource allocation. Unit types facing higher uncertainty had higher handover duration per patient. As expected, unit type also affected communication content. Clinical units facing higher uncertainty discussed fewer topics, discussing treatment and care and organization of work less frequently. Finally, unit type affected functions of handover: sharing emotions was less often mentioned in unit types facing higher uncertainty. Task uncertainty and its relationship with functions and topics of handover should be taken into account during the design of handover procedures. © 2011 Blackwell Publishing Ltd.
Report from the Workshop on Coregonine Restoration Science
Bronte, Charles R.; Bunnell, David B.; David, Solomon R.; Gordon, Roger; Gorsky, Dimitry; Millard, Michael J.; Read, Jennifer; Stein, Roy A.; Vaccaro, Lynn
2017-08-03
Summary: Great Lakes fishery managers have the opportunity and have expressed interest in reestablishing a native forage base in the Great Lakes consisting of various forms and species within the genus Coregonus. This report summarizes the proceedings of a workshop focused on a subset of the genus, and the term “coregonines” is used to refer to several species of deepwater ciscoes (also known as “chubs”) and the one more pelagic-oriented cisco species (Coregonus artedi, also known as “lake herring”). As the principal conservation agency for the United States Government, the Department of the Interior (DOI) has unique and significant authorities and capacities to support a coregonine reestablishment program in the Great Lakes. To identify and discuss key uncertainties associated with such a program and develop a coordinated approach, the U.S. Geological Survey (USGS) and the U.S. Fish and Wildlife Service (FWS), the principal DOI bureaus that address Great Lakes fishery issues, held the first of a series of workshops on coregonine science in Ann Arbor, Michigan, on October 11–13, 2016. Workshop objectives were to identify (1) perceived key uncertainties associated with coregonine restoration in the Great Lakes and (2) DOI capacities for addressing these key uncertainties.
Addressing uncertainty in vulnerability assessments [Chapter 5
Linda Joyce; Molly Cross; Evan Girvatz
2011-01-01
This chapter addresses issues and approaches for dealing with uncertainty specifically within the context of conducting climate change vulnerability assessments (i.e., uncertainties related to identifying and modeling the sensitivities, levels of exposure, and adaptive capacity of the assessment targets).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
Cramer, C.H.
2006-01-01
The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.
US Efforts in Support of Examinations at Fukushima Daiichi – 2016 Evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amway, P.; Andrews, N.; Bixby, Willis
Although it is clear that the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full-scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform Decontamination and Decommissioning activities, improving the ability of the Tokyo Electric Power Company Holdings (TEPCO) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document reports recent results from the US Forensics Effort to use information obtained by TEPCO to enhance the safety of existing and future nuclear power plant designs. This Forensics Effort, which is sponsored by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of US experts in LWR safety and plant operations that have identified examination needs and are evaluating TEPCO information from Daiichi that address these needs.
Examples presented in this report demonstrate that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident modeling progression, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities.
Research strategies for addressing uncertainties
Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah
2013-01-01
Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Z.; Liu, C.; Botterud, A.
Renewable energy resources have been rapidly integrated into power systems in many parts of the world, contributing to a cleaner and more sustainable supply of electricity. Wind and solar resources also introduce new challenges for system operations and planning in terms of economics and reliability because of their variability and uncertainty. Operational strategies based on stochastic optimization have been developed recently to address these challenges. In general terms, these stochastic strategies either embed uncertainties into the scheduling formulations (e.g., the unit commitment [UC] problem) in probabilistic forms or develop more appropriate operating reserve strategies to take advantage of advanced forecasting techniques. Other approaches to address uncertainty are also proposed, where operational feasibility is ensured within an uncertainty set of forecasting intervals. In this report, a comprehensive review is conducted to present the state of the art through Spring 2015 in the area of stochastic methods applied to power system operations with high penetration of renewable energy. Chapters 1 and 2 give a brief introduction and overview of power system and electricity market operations, as well as the impact of renewable energy and how this impact is typically considered in modeling tools. Chapter 3 reviews relevant literature on operating reserves and specifically probabilistic methods to estimate the need for system reserve requirements. Chapter 4 looks at stochastic programming formulations of the UC and economic dispatch (ED) problems, highlighting benefits reported in the literature as well as recent industry developments. Chapter 5 briefly introduces alternative formulations of UC under uncertainty, such as robust, chance-constrained, and interval programming. Finally, in Chapter 6, we conclude with the main observations from our review and important directions for future work.
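The core idea of stochastic unit commitment, choosing on/off decisions that minimize cost in expectation over a scenario set rather than against a single forecast, can be sketched with a toy two-unit example. Every number below is invented for illustration:

```python
# Three wind outcomes give three net-load scenarios: (probability, MW).
scenarios = [(0.3, 80.0), (0.4, 100.0), (0.3, 120.0)]

CHEAP_CAP, CHEAP_COST = 100.0, 20.0   # baseload unit: capacity (MW), $/MWh
PEAK_CAP, PEAK_COST = 50.0, 80.0      # peaker unit: capacity (MW), $/MWh
PEAK_COMMIT_COST = 500.0              # peaker startup/no-load cost ($)

def expected_cost(commit_peaker):
    """Expected dispatch cost of a commitment decision over all scenarios."""
    total = PEAK_COMMIT_COST if commit_peaker else 0.0
    for prob, load in scenarios:
        cheap = min(load, CHEAP_CAP)
        residual = load - cheap
        if residual > 0 and not commit_peaker:
            return float('inf')   # unserved load: infeasible schedule
        total += prob * (cheap * CHEAP_COST + residual * PEAK_COST)
    return total

# The stochastic choice commits the peaker because one scenario requires it.
best = min([False, True], key=expected_cost)
```

A deterministic schedule built only against the 100 MW expected load would leave the high-net-load scenario unserved; the stochastic formulation accepts the commitment cost up front because it evaluates feasibility and cost across the whole scenario set.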
NASA Technical Reports Server (NTRS)
Ganguly, Sangram; Kalia, Subodh; Li, Shuang; Michaelis, Andrew; Nemani, Ramakrishna R.; Saatchi, Sassan A
2017-01-01
Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first of its kind Continental United States (CONUS) tree cover map at a spatial resolution of 1-m for the 2010-2012 epoch using the USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, ranging from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open-source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.
Huang, Guowen; Lee, Duncan; Scott, E Marian
2018-03-30
The long-term health effects of air pollution are often estimated using a spatio-temporal ecological areal unit study, but this design leads to the following statistical challenges: (1) how to estimate spatially representative pollution concentrations for each areal unit; (2) how to allow for the uncertainty in these estimated concentrations when estimating their health effects; and (3) how to simultaneously estimate the joint effects of multiple correlated pollutants. This article proposes a novel two-stage Bayesian hierarchical model for addressing these three challenges, with inference based on Markov chain Monte Carlo simulation. The first stage is a multivariate spatio-temporal fusion model for predicting areal level average concentrations of multiple pollutants from both monitored and modelled pollution data. The second stage is a spatio-temporal model for estimating the health impact of multiple correlated pollutants simultaneously, which accounts for the uncertainty in the estimated pollution concentrations. The novel methodology is motivated by a new study of the impact of both particulate matter and nitrogen dioxide concentrations on respiratory hospital admissions in Scotland between 2007 and 2011, and the results suggest that both pollutants exhibit substantial and independent health effects. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
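The second-stage idea, carrying first-stage concentration uncertainty into the health-effect estimate rather than plugging in a point estimate, can be illustrated with a crude Monte Carlo analogue. This is not the authors' fully Bayesian MCMC model; the data are simulated and all numbers are hypothetical:

```python
import numpy as np

# Crude Monte Carlo analogue of propagating stage-1 concentration uncertainty
# into a stage-2 health-effect regression: re-estimate the slope for many
# plausible concentration fields and inspect the spread. Simulated data only.
rng = np.random.default_rng(0)

n_areas = 50
true_conc = rng.uniform(5, 25, n_areas)                      # "true" pollutant level per areal unit
log_rate = 0.02 * true_conc + rng.normal(0, 0.05, n_areas)   # log admission rate

conc_mean = true_conc                     # stage-1 posterior mean (idealized)
conc_sd = np.full(n_areas, 2.0)           # stage-1 posterior sd (hypothetical)

slopes = []
for _ in range(500):
    conc_draw = rng.normal(conc_mean, conc_sd)   # one plausible concentration field
    X = np.column_stack([np.ones(n_areas), conc_draw])
    beta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
    slopes.append(beta[1])

slope_mean, slope_sd = np.mean(slopes), np.std(slopes)
# slope_sd reflects how stage-1 uncertainty widens the health-effect estimate.
```

A plug-in analysis would report only a single slope; the spread across draws is the extra uncertainty that the two-stage model formally accounts for.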
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kalia, S.; Li, S.; Michaelis, A.; Nemani, R. R.; Saatchi, S.
2017-12-01
Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first of its kind Continental United States (CONUS) tree cover map at a spatial resolution of 1-m for the 2010-2012 epoch using the USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, ranging from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open-source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti
2017-08-01
Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.
A Practical Approach to Address Uncertainty in Stakeholder Deliberations.
Gregory, Robin; Keeney, Ralph L
2017-03-01
This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decision makers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
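The certainty-equivalent construct recommended above collapses a lottery over consequences to a single sure value that a stakeholder regards as equally good. A minimal sketch under an assumed exponential utility function; the risk tolerance and the lottery are hypothetical, and real deliberations would elicit utilities rather than assume them:

```python
import math

# Certainty equivalent under exponential utility u(x) = 1 - exp(-x / r):
# the sure amount CE whose utility equals the lottery's expected utility.
# The risk tolerance r and the lottery below are hypothetical.

def certainty_equivalent(outcomes, probs, risk_tolerance):
    eu = sum(p * (1 - math.exp(-x / risk_tolerance))
             for x, p in zip(outcomes, probs))
    return -risk_tolerance * math.log(1 - eu)   # invert the utility function

lottery = ([100.0, 0.0], [0.5, 0.5])            # 50/50 chance of 100 or 0
ce = certainty_equivalent(*lottery, risk_tolerance=50.0)
# For a risk-averse stakeholder, ce falls below the expected value of 50.
```

As the risk tolerance grows, the certainty equivalent approaches the expected value, recovering the risk-neutral case.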
Two-Stage Bayesian Model Averaging in Endogenous Variable Models*
Lenkoski, Alex; Eicher, Theo S.; Raftery, Adrian E.
2013-01-01
Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471
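For readers unfamiliar with the baseline that 2SBMA extends, classic 2SLS can be sketched on simulated data: regress the endogenous regressor on the instruments, then use its fitted values in the outcome equation. The instruments, coefficients, and sample size below are hypothetical, and no model-averaging step is shown:

```python
import numpy as np

# Classic Two-Stage Least Squares on simulated data with a hidden confounder.
# Plain OLS is biased because the confounder enters both x and y; 2SLS
# recovers the true effect via the instruments. All numbers are hypothetical.
rng = np.random.default_rng(1)
n = 2000
z = rng.normal(size=(n, 2))                             # instruments
u = rng.normal(size=n)                                  # unobserved confounder
x = z @ np.array([1.0, 0.5]) + u + rng.normal(size=n)   # endogenous regressor
y = 2.0 * x + u + rng.normal(size=n)                    # true effect of x is 2.0

def two_stage_least_squares(y, x, z):
    Z = np.column_stack([np.ones(len(y)), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]    # first stage
    X = np.column_stack([np.ones(len(y)), x_hat])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]      # second-stage slope

beta_2sls = two_stage_least_squares(y, x, z)
beta_ols = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0][1]
# beta_ols is biased upward by the confounder; beta_2sls is close to 2.0.
```

2SBMA replaces each least-squares stage with Bayesian model averaging over instrument and covariate subsets, which is what sharpens the coefficient estimates the abstract describes.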
Visual Semiotics & Uncertainty Visualization: An Empirical Study.
MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M
2012-12-01
This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.
Seidl, Rupert; Lexer, Manfred J
2013-01-15
The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. 
This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chiu, W. A.; Bachmaier, J.; Bastian, R.; Hogan, R.; Lenhart, T.; Schmidt, D.; Wolbarst, A.; Wood, R.; Yu, C.
2002-05-01
Managing municipal wastewater at publicly owned treatment works (POTWs) leads to the production of considerable amounts of residual solid material, which is known as sewage sludge or biosolids. If the wastewater entering a POTW contains radioactive material, then the treatment process may concentrate radionuclides in the sludge, leading to possible exposure of the general public or the POTW workers. The Sewage Sludge Subcommittee of the Interagency Steering Committee on Radiation Standards (ISCORS), which consists of representatives from the Environmental Protection Agency, the Nuclear Regulatory Commission, the Department of Energy, and several other federal, state, and local agencies, is developing guidance for POTWs on the management of sewage sludge that may contain radioactive materials. As part of this effort, they are conducting an assessment of potential radiation exposures using the Department of Energy's RESidual RADioactivity (RESRAD) family of computer codes developed by Argonne National Laboratory. This poster describes several approaches used by the Subcommittee to address the uncertainties associated with their assessment. For instance, uncertainties in the source term are addressed through a combination of analytic and deterministic computer code calculations. Uncertainties in the exposure pathways are addressed through the specification of a number of hypothetical scenarios, some of which can be scaled to address changes in exposure parameters. In addition, the uncertainty in some physical and behavioral parameters is addressed through probabilistic methods.
NASA Technical Reports Server (NTRS)
Schubert, Siegfried; Wang, Hailan; Suarez, Max; Koster, Randal
2010-01-01
The US CLIVAR working group on drought recently initiated a series of global climate model simulations forced with idealized SST anomaly patterns, designed to address a number of uncertainties regarding the impact of SST forcing and the role of land-atmosphere feedbacks on regional drought. The runs were done with several global atmospheric models including NASA/NSIPP-1, NCEP/GFS, GFDL/AM2, and NCAR CCM3 and CAM3.5. Specific questions that the runs are designed to address include: What are the mechanisms that maintain drought across the seasonal cycle and from one year to the next? To what extent can droughts develop independently of ocean variability, due to year-to-year memory that may be inherent to the land? What is the role of the different ocean basins? Here we focus on the potential predictability of drought conditions over the United States. Specific issues addressed include the seasonality and regionality of the signal-to-noise ratios associated with Pacific and Atlantic SST forcing, and the sensitivity of the results to the climatological stationary waves simulated by the different AGCMs.
NASA Astrophysics Data System (ADS)
Sleeter, B. M.; Rayfield, B.; Liu, J.; Sherba, J.; Daniel, C.; Frid, L.; Wilson, T. S.; Zhu, Z.
2016-12-01
Since 1970, the combined changes in land use, land management, climate, and natural disturbances have dramatically altered land cover in the United States, resulting in the potential for significant changes in terrestrial carbon storage and flux between ecosystems and the atmosphere. Processes including urbanization, agricultural expansion and contraction, and forest management have had impacts - both positive and negative - on the amount of natural vegetation, the age structure of forests, and the amount of impervious cover. Anthropogenic change coupled with climate-driven changes in natural disturbance regimes, particularly the frequency and severity of wildfire, together determine the spatio-temporal patterns of land change and contribute to changing ecosystem carbon dynamics. Quantifying this effect and its associated uncertainties is fundamental to developing a rigorous and transparent carbon monitoring and assessment program. However, large-scale systematic inventories of historical land change and their associated uncertainties are sparse. To address this need, we present a newly developed modeling framework, the Land Use and Carbon Scenario Simulator (LUCAS). The LUCAS model integrates readily available high quality, empirical land-change data into a stochastic space-time simulation model representing land change feedbacks on carbon cycling in terrestrial ecosystems. We applied the LUCAS model to estimate regional scale changes in carbon storage, atmospheric flux, and net biome production in 84 ecological regions of the conterminous United States for the period 1970-2015. The model was parameterized using a newly available set of high resolution (30 m) land-change data, compiled from Landsat remote sensing imagery, including estimates of uncertainty. Carbon flux parameters for each ecological region were derived from the IBIS dynamic global vegetation model with full carbon cycle accounting.
This paper presents our initial findings describing regional and temporal changes and variability in carbon storage and flux resulting from land use change and disturbance between 1973 and 2015. Additionally, based on stochastic simulations we quantify and present key sources of uncertainty in the estimation of terrestrial ecosystem carbon dynamics.
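The stochastic space-time simulation idea can be illustrated with a minimal state-and-transition sketch in the spirit of a LUCAS-style model (not the actual model): cells move between land-cover classes under annual transition probabilities, and Monte Carlo replicates give an uncertainty range on the resulting carbon stock. The states, probabilities, and carbon densities are hypothetical:

```python
import random

# Minimal stochastic state-and-transition land-change sketch. Each replicate
# evolves 1000 cells for 45 years; the spread across replicates is a crude
# uncertainty estimate on the ending carbon stock. All numbers hypothetical.
CARBON = {"forest": 120.0, "agriculture": 40.0, "developed": 10.0}   # tC/ha
TRANSITIONS = {                      # P(next state | current state)
    "forest": [("forest", 0.97), ("agriculture", 0.02), ("developed", 0.01)],
    "agriculture": [("agriculture", 0.95), ("forest", 0.03), ("developed", 0.02)],
    "developed": [("developed", 1.0)],          # development is absorbing
}

def simulate(n_cells=1000, years=45, seed=0):
    rng = random.Random(seed)
    cells = ["forest"] * (n_cells // 2) + ["agriculture"] * (n_cells // 2)
    for _ in range(years):
        for i, state in enumerate(cells):
            nxt, r = state, rng.random()
            for candidate, p in TRANSITIONS[state]:   # cumulative sampling
                if r < p:
                    nxt = candidate
                    break
                r -= p
            cells[i] = nxt
    return sum(CARBON[s] for s in cells)              # ending stock, tC

stocks = [simulate(seed=s) for s in range(20)]        # Monte Carlo replicates
lo, hi = min(stocks), max(stocks)
```

The replicate-to-replicate spread is the kind of stochastic-simulation uncertainty the paper quantifies, alongside uncertainty in the input land-change data themselves.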
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
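The block bootstrap mentioned above resamples contiguous blocks of a time series, preserving the short-range autocorrelation that an i.i.d. bootstrap would destroy. A few-line sketch; the series and block length are hypothetical:

```python
import random

# Moving-block bootstrap of a time series: resample overlapping blocks of
# consecutive observations until the replicate reaches the original length.
# The rainfall series and block length below are hypothetical.

def block_bootstrap(series, block_len, rng):
    blocks = [series[i:i + block_len]
              for i in range(len(series) - block_len + 1)]
    out = []
    while len(out) < len(series):
        out.extend(rng.choice(blocks))
    return out[:len(series)]

rng = random.Random(42)
rainfall = [3.1, 2.8, 0.0, 0.4, 5.2, 6.0, 1.1, 0.9, 2.5, 3.3]   # e.g. mm/day
replicates = [block_bootstrap(rainfall, block_len=3, rng=rng)
              for _ in range(200)]
means = sorted(sum(r) / len(r) for r in replicates)
ci = (means[4], means[-5])   # rough 95% interval for the mean
```

Each replicate can then be fed through the coupled model to turn input-series uncertainty into uncertainty on the model outputs.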
NASA atmospheric effects of aviation projects: Status and plans
NASA Technical Reports Server (NTRS)
Wesoky, Howard L.; Thompson, Anne M.; Stolarski, Richard S.
1994-01-01
NASA's Atmospheric Effects of Aviation Project is developing a scientific basis for assessment of the atmospheric impact of subsonic and supersonic aviation. Issues addressed include predicted ozone changes and climatic impact, and related uncertainties. A primary goal is to assist assessments of United Nations scientific organizations and, hence, consideration of emission standards by the International Civil Aviation Organization. Project focus is on simulation of atmospheric processes by computer models, but studies of aircraft operations, laboratory studies, and remote and in situ observations of chemical, dynamic, and radiative processes are also included.
Linear Programming Problems for Generalized Uncertainty
ERIC Educational Resources Information Center
Thipwiwatpotjana, Phantipa
2010-01-01
Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
Numerical Uncertainty Quantification for Radiation Analysis Tools
NASA Technical Reports Server (NTRS)
Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha
2007-01-01
Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
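The convergence-testing idea for the dose-vs-depth interpolation can be illustrated with a stand-in attenuation curve: interpolate from progressively denser thickness grids and watch the worst-case error fall. The exponential dose model is a hypothetical substitute for real transport-code output:

```python
import math

# Convergence test for interpolation over a dose-vs-depth curve: as the
# thickness grid is refined, the worst-case linear-interpolation error
# shrinks. The dose function below is a hypothetical stand-in.

def dose(depth_g_cm2):
    return 50.0 * math.exp(-depth_g_cm2 / 8.0)   # hypothetical attenuation

def max_interp_error(n_points, d_max=30.0, n_checks=300):
    xs = [d_max * i / (n_points - 1) for i in range(n_points)]
    ys = [dose(x) for x in xs]
    worst = 0.0
    for j in range(n_checks):
        d = d_max * j / (n_checks - 1)
        k = min(int(d / (xs[1] - xs[0])), n_points - 2)  # bracketing interval
        t = (d - xs[k]) / (xs[k + 1] - xs[k])
        est = ys[k] + t * (ys[k + 1] - ys[k])            # linear interpolation
        worst = max(worst, abs(est - dose(d)))
    return worst

errors = {n: max_interp_error(n) for n in (5, 10, 20, 40)}
# errors falls monotonically as the thickness grid is refined.
```

In practice one stops refining the grid when the interpolation error drops below the other uncertainty contributions, such as the ray-tracing term.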
Dealing with uncertainties in environmental burden of disease assessment
2009-01-01
Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
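The DALY arithmetic underlying such estimates, shown here without age weighting or discounting, is simply years of life lost plus years lived with disability. All inputs below are hypothetical:

```python
# DALY = YLL + YLD: years of life lost to mortality plus years lived with
# disability, weighted by severity. No age weighting or discounting is
# applied in this sketch; all inputs are hypothetical.

def dalys(deaths, years_lost_per_death, cases, disability_weight, duration_yr):
    yll = deaths * years_lost_per_death            # mortality component
    yld = cases * disability_weight * duration_yr  # morbidity component
    return yll + yld

# e.g. a hypothetical disease burden in some population:
burden = dalys(deaths=120, years_lost_per_death=10.0,
               cases=5000, disability_weight=0.2, duration_yr=1.5)
# Environmental burden given a population attributable fraction (PAF):
attributable = 0.25 * burden
```

The uncertainties the review discusses enter through every input here: case counts, disability weights, durations, and the attributable fraction itself.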
NASA Astrophysics Data System (ADS)
Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo
2017-08-01
Surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When they are used, surrogate errors caused by surrogate-modeling uncertainty may lead to the generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted for comparison with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty; (2) the ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.
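The conservative strategy can be sketched as inflating the surrogate's predicted constraint value by a margin tied to its error, so that designs which look feasible stay feasible under surrogate error. The trivial linear surrogate, the residuals, and the limit below are hypothetical stand-ins for MGGP/KRG/SVR models and a real simulator:

```python
import statistics

# Conservative handling of surrogate-modeling uncertainty: add a safety
# margin, derived from cross-validation residuals, to the surrogate's
# predicted contaminant concentration before checking feasibility.
# The surrogate, residuals, and limit are all hypothetical.

def surrogate_conc(pump_rate):
    return 100.0 - 0.8 * pump_rate        # predicted contaminant (mg/L)

cv_residuals = [-3.1, 2.4, -1.7, 4.0, -2.2, 1.5]   # surrogate minus simulator
margin = 2.0 * statistics.stdev(cv_residuals)       # ~2-sigma cushion

LIMIT = 50.0                                        # remediation standard
def feasible(pump_rate, conservative=True):
    pred = surrogate_conc(pump_rate)
    if conservative:
        pred += margin                    # push the design into the feasible region
    return pred <= LIMIT

rates = [r for r in range(0, 121, 5) if feasible(r)]
min_rate = min(rates)                     # cheapest design that stays safe
```

The conservative design pumps slightly more than the nominal optimum; that extra cost is the price of guarding against surrogate error, which CCP prices probabilistically instead.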
Wong, Pauline; Liamputtong, Pranee; Koch, Susan; Rawson, Helen
2017-12-01
To discuss families' experiences of their interactions when a relative is admitted unexpectedly to an Australian intensive care unit. The overwhelming emotions associated with the unexpected admission of a relative to an intensive care unit are often due to the uncertainty surrounding the condition of their critically ill relative. There is limited in-depth understanding of the nature of uncertainty experienced by families in intensive care, and interventions perceived by families to minimise their uncertainty are not well documented. Furthermore, the interrelationships between factors, such as staff-family interactions and the intensive care unit environment, and their influence on families' uncertainty, particularly in the context of the Australian healthcare system, are not well delineated. A grounded theory methodology was adopted for the study. Data were collected between 2009 and 2013, using in-depth interviews with 25 family members of 21 critically ill patients admitted to a metropolitan, tertiary-level intensive care unit in Australia. This paper describes the families' experiences of heightened emotional vulnerability and uncertainty when a relative is admitted unexpectedly to the intensive care unit. Families' uncertainty is directly influenced by their emotional state, the foreign environment and perceptions of being 'kept in the dark', as well as the interrelationships between these factors. Staff are offered an improved understanding of the barriers to families' ability to regain control, guided by a grounded theory of family resilience in the intensive care unit. The findings reveal an in-depth understanding of families' uncertainty in intensive care. They suggest that intensive care unit staff need to focus clinical interventions on reducing factors that heighten families' uncertainty, while optimising strategies that help alleviate it. Families are thus helped to move beyond feelings of helplessness and loss of control, and to cope better with their situation.
© 2017 John Wiley & Sons Ltd.
Christie, Janice; Gray, Trish A; Dumville, Jo C; Cullum, Nicky A
2018-01-01
Complex wounds such as leg and foot ulcers are common, resource intensive and have negative impacts on patients' wellbeing. Evidence-based decision-making, substantiated by high quality evidence such as from systematic reviews, is widely advocated for improving patient care and healthcare efficiency. Consequently, we set out to classify and map the extent to which up-to-date systematic reviews containing robust evidence exist for wound care uncertainties prioritised by community-based healthcare professionals. We asked healthcare professionals to prioritise uncertainties based on complex wound care decisions, and then classified 28 uncertainties according to the type and level of decision. For each uncertainty, we searched for relevant systematic reviews. Two independent reviewers screened abstracts and full texts of reviews against the following criteria: meeting an a priori definition of a systematic review, sufficiently addressing the uncertainty, published during or after 2012, and identifying high quality research evidence. The most common uncertainty type was 'interventions' (24/28, 85%); the majority concerned wound level decisions (15/28, 53%); however, service delivery level decisions (10/28) were given the highest priority. Overall, we found 162 potentially relevant reviews, of which 57 (35%) were not systematic reviews. Of 106 systematic reviews, only 28 were relevant to an uncertainty and 18 of these were published within the preceding five years; none identified high quality research evidence. Despite the growing volume of published primary research, healthcare professionals delivering wound care have important clinical uncertainties which are not addressed by up-to-date systematic reviews containing high certainty evidence. These are high priority topics requiring new research and systematic reviews which are regularly updated.
To reduce clinical and research waste, we recommend systematic reviewers and researchers make greater efforts to ensure that research addresses important clinical uncertainties and is of sufficient rigour to inform practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Brennan T
2015-01-01
Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case, and published literature is largely silent on rigorous evaluation of uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of Acoustic transit time (ATT), Acoustic scintillation (AS), and Current meter (CM) methods in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on their findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper, given that CM is still the most fundamental and common type of measurement system. Typical sources of systematic and random errors associated with CM measurements are investigated, and the major sources of uncertainty, those associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters, are considered in the evaluation. Since velocity measurements in a short converging intake are associated with complex nonlinear and time-varying uncertainties (e.g., Reynolds stress in fluid dynamics), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance, while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (the random flow generation technique [8]), which was initially developed for the purpose of establishing upstream or initial conditions in Large-Eddy Simulation (LES) and Direct Numerical Simulation (DNS), is used to statistically determine uncertainties associated with turbulence and velocity fluctuations.
This technique is then combined with a bi-cubic spline interpolation method which converts point velocities into a continuous velocity distribution over the measurement domain. Subsequently, the number and placement of current meters are simulated to investigate the accuracy of the estimated flow rates using the numerical velocity-area integration method outlined in ISO 3354 [12]. The authors consider the statistics on generated flow rates, processed with bi-cubic interpolation and sensor simulations, to be combined uncertainties that already account for the effects of all three uncertainty sources. A preliminary analysis based on current meter data obtained through an upgrade acceptance test of a single unit located in a mainstem plant is presented.
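The velocity-area computation with Monte Carlo point perturbations can be sketched as follows. A plain trapezoidal rule stands in for the bi-cubic spline integration, Gaussian point noise stands in for the random flow generation technique, and the intake geometry, velocity profile, and noise level are all hypothetical:

```python
import random

# Velocity-area discharge from a grid of current-meter readings, with Monte
# Carlo turbulence perturbations to put a spread on the flow rate. Trapezoid
# integration is a simple stand-in for the paper's bi-cubic spline.

ys = [i * 1.0 for i in range(7)]      # vertical meter positions (m)
zs = [j * 1.25 for j in range(9)]     # horizontal meter positions (m)

def v_true(y, z):
    return 2.0 * (y / 6.0) ** (1 / 7)   # hypothetical power-law profile (m/s)

def trapz(vals, xs):
    return sum((xs[i + 1] - xs[i]) * (vals[i + 1] + vals[i]) / 2.0
               for i in range(len(xs) - 1))

def discharge(grid):
    rows = [trapz(row, zs) for row in grid]   # integrate across z at each y
    return trapz(rows, ys)                    # then across y -> m^3/s

nominal = [[v_true(y, z) for z in zs] for y in ys]
q_nominal = discharge(nominal)

rng = random.Random(7)
q_samples = []
for _ in range(200):                          # turbulent fluctuation at each meter
    noisy = [[v + rng.gauss(0.0, 0.05) for v in row] for row in nominal]
    q_samples.append(discharge(noisy))
mean_q = sum(q_samples) / len(q_samples)
q_sd = (sum((q - mean_q) ** 2 for q in q_samples) / len(q_samples)) ** 0.5
```

The spread `q_sd` plays the role of the turbulence-and-fluctuation contribution to the combined uncertainty; sensor-placement effects can be explored by thinning the grid and repeating.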
Perceived job insecurity and worker health in the United States
Burgard, Sarah A.; Brand, Jennie E; House, James S
2009-01-01
Economic recessions, the industrial shift from manufacturing toward service industries, and rising global competition have contributed to uncertainty about job security, with potential consequences for workers’ health. To address limitations of prior research on the health consequences of perceived job insecurity, we use longitudinal data from two nationally-representative samples of the United States population, and examine episodic and persistent perceived job insecurity over periods of about three years to almost a decade. Results show that persistent perceived job insecurity is a significant and substantively important predictor of poorer self-rated health in the Americans’ Changing Lives (ACL) and Midlife in the United States (MIDUS) samples, and of depressive symptoms among ACL respondents. Job losses or unemployment episodes are associated with perceived job insecurity, but do not account for its association with health. Results are robust to controls for sociodemographic and job characteristics, negative reporting style, and earlier health and health behaviors. PMID:19596166
Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2016-01-01
Information from complementary and redundant sensors are often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process. PMID:26821027
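The benchmarking idea of perturbing one sensor at a time to attribute orientation error to that sensor's uncertainty can be illustrated with a minimal one-dimensional complementary filter, a far simpler stand-in for the Kalman-based algorithms actually tested; every name, signal, and noise level below is an assumption for illustration:

```python
import math
import random

def fuse(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """1-D complementary filter: integrate the gyro, correct with accel tilt."""
    angle, out = 0.0, []
    for w, a in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + w * dt) + (1.0 - alpha) * a
        out.append(angle)
    return out

def rmse(est, truth):
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(est, truth)) / len(est))

def noisy(xs, sigma, rng):
    return [x + rng.gauss(0.0, sigma) for x in xs]

# Simulated ground truth: a slow sinusoidal tilt (0.3 rad amplitude, 0.5 Hz)
dt, n = 0.01, 2000
truth = [0.3 * math.sin(math.pi * i * dt) for i in range(n)]
true_rate = [0.3 * math.pi * math.cos(math.pi * i * dt) for i in range(n)]

rng = random.Random(0)
# Perturb one sensor at a time so any error increase is attributable to it
for label, g_sig, a_sig in [("no noise  ", 0.0, 0.0),
                            ("gyro only ", 0.05, 0.0),
                            ("accel only", 0.0, 0.05)]:
    est = fuse(noisy(true_rate, g_sig, rng), noisy(truth, a_sig, rng), dt)
    print(label, round(rmse(est, truth), 4))
```

Running the same loop with real measurements in place of the simulated signals mirrors the paper's simulated-versus-real comparison.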
Jacobson, Robert B.
2006-01-01
Extensive efforts are underway along the Lower Missouri River to rehabilitate ecosystem functions in the channel and flood plain. Considerable uncertainty inevitably accompanies ecosystem restoration efforts, indicating the benefits of an adaptive management approach in which management actions are treated as experiments, and results provide information to feed back into the management process. The Overton Bottoms North Unit of the Big Muddy National Fish and Wildlife Refuge is a part of the Missouri River Fish and Wildlife Habitat Mitigation Project. The dominant management action at the Overton Bottoms North Unit has been excavation of a side-channel chute to increase hydrologic connectivity and to enhance shallow, slow current-velocity habitat. The side-channel chute also promises to increase hydrologic gradients, and may serve to alter patterns of wetland inundation and vegetation community growth in undesired ways. The U.S. Geological Survey's Central Region Integrated Studies Program (CRISP) undertook interdisciplinary research at the Overton Bottoms North Unit in 2003 to address key areas of scientific uncertainty that were highly relevant to ongoing adaptive management of the site, and to the design of similar rehabilitation projects on the Lower Missouri River. This volume presents chapters documenting the surficial geologic, topographic, surface-water, and ground-water framework of the Overton Bottoms North Unit. Retrospective analysis of vegetation community trends over the last 10 years is used to evaluate vegetation responses to reconnection of the Overton Bottoms North Unit to the river channel. Quasi-experimental analysis of cottonwood growth rate variation along hydrologic gradients is used to evaluate sensitivity of terrestrial vegetation to development of aquatic habitats. The integrated, landscape-specific understanding derived from these studies illustrates the value of scientific information in design and management of rehabilitation projects.
Uncertainty in gridded CO2 emissions estimates
Hogue, Susannah; Marland, Eric; Andres, Robert J.; ...
2016-05-19
We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent, with uncertainty increasing as grid size decreases. In conclusion, uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.
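One standard way to combine independent uncertainty components such as these is in quadrature (root-sum-square). The sketch below assumes the components are independent and uses made-up magnitudes, not the paper's cell-level values:

```python
import math

def cell_uncertainty(pct_components):
    """Combine independent percentage uncertainties in quadrature
    (root-sum-square); independence of the components is an assumption."""
    return math.sqrt(sum(p * p for p in pct_components))

# Illustrative component magnitudes for one 1-degree cell (invented, in
# percent): national total, large point sources, proxy distribution, and
# population-density error.
components = [5.0, 30.0, 140.0, 40.0]
print(round(cell_uncertainty(components), 1))  # dominated by the proxy term
```

Because the terms add in quadrature, the largest component (here the proxy term) dominates the combined value, matching the paper's observation that proxy representativeness drives most cells' uncertainty.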
A review of uncertainty research in impact assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca
2015-01-15
This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.
- Highlights: • We identified three main themes of uncertainty research in 134 papers from the scholarly literature. • The majority of research has focused on better methods for managing uncertainty in predictions. • Uncertainty disclosure is demanded of practitioners, but there is little guidance on how to do so. • There is limited theoretical explanation as to why uncertainty is avoided or not disclosed. • Conceptual, practical and theoretical guidance are required for IA uncertainty consideration.
42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?
Code of Federal Regulations, 2010 CFR
2010-10-01
42 Public Health (2010-10-01): § 82.19 How will NIOSH address uncertainty about dose levels? PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES; OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES; METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER...
He, L; Huang, G H; Lu, H W
2010-04-15
Solving groundwater remediation optimization problems based on proxy simulators can yield optimal solutions that differ from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and an associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. Unlike previous modeling efforts, which focused on uncertainty in physical parameters (e.g., soil porosity), this one addresses uncertainty in the mathematical simulator itself, arising from model residuals. Compared to existing modeling approaches, in which only parameter uncertainty is considered, the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering a confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.
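The mean-variance analysis of contaminant concentrations can be illustrated by treating the proxy simulator's residuals as additive noise and sampling them. The Gaussian residual model, function names, and numbers below are assumptions for illustration, not the paper's formulation:

```python
import random

def concentration_stats(proxy_pred, residual_sigma, n=20000, seed=0):
    """Mean and variance of a proxy simulator's concentration output when
    its residuals are modeled as additive zero-mean Gaussian noise (an
    assumption of this sketch)."""
    rng = random.Random(seed)
    samples = [proxy_pred + rng.gauss(0.0, residual_sigma) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, var

# Illustrative proxy prediction of 12.0 mg/L with residual sigma 2.0 mg/L
m, v = concentration_stats(12.0, 2.0)
print(round(m, 2), round(v, 2))
```

A designer could then report a confidence level for a remediation strategy by checking how often sampled concentrations stay below a regulatory limit.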
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bechtel Nevada
2005-09-01
A new, revised three-dimensional (3-D) hydrostratigraphic framework model for Frenchman Flat was completed in 2004. The area of interest includes Frenchman Flat, a former nuclear testing area at the Nevada Test Site, and proximal areas. Internal and external reviews of an earlier (Phase I) Frenchman Flat model recommended additional data collection to address uncertainties. Subsequently, additional data were collected for this Phase II initiative, including five new drill holes and a 3-D seismic survey.
Severtson, Dolores J.
2015-01-01
Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings. PMID:26412960
Uncertainty after treatment for prostate cancer: definition, assessment, and management.
Yu Ko, Wellam F; Degner, Lesley F
2008-10-01
Prostate cancer is the second most common type of cancer in men living in the United States and the most common type of malignancy in Canadian men, accounting for 186,320 new cases in the United States and 24,700 in Canada in 2008. Uncertainty, a component of all illness experiences, influences how men perceive the processes of treatment and adaptation. The Reconceptualized Uncertainty in Illness Theory explains the chronic nature of uncertainty in cancer survivorship by describing a shift from an emergent acute phase of uncertainty in survivors to a new level of uncertainty that is no longer acute and becomes a part of daily life. Proper assessment of certainty and uncertainty may allow nurses to maximize the effectiveness of patient-provider communication, cognitive reframing, and problem-solving interventions to reduce uncertainty after cancer treatment.
Integrating Solar PV in Utility System Operations: Analytical Framework and Arizona Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jing; Botterud, Audun; Mills, Andrew
2015-06-01
A systematic framework is proposed to estimate the impact on operating costs due to uncertainty and variability in renewable resources. The framework quantifies the integration costs associated with subhourly variability and uncertainty as well as day-ahead forecasting errors in solar PV (photovoltaics) power. A case study illustrates how changes in system operations may affect these costs for a utility in the southwestern United States (Arizona Public Service Company). We conduct an extensive sensitivity analysis under different assumptions about balancing reserves, system flexibility, fuel prices, and forecasting errors. We find that high solar PV penetrations may lead to operational challenges, particularly during low-load and high solar periods. Increased system flexibility is essential for minimizing integration costs and maintaining reliability. In a set of sensitivity cases where such flexibility is provided, in part, by flexible operations of nuclear power plants, the estimated integration costs vary between $1.0 and $4.4/MWh-PV for a PV penetration level of 17%. The integration costs are primarily due to higher needs for hour-ahead balancing reserves to address the increased sub-hourly variability and uncertainty in the PV resource. (C) 2015 Elsevier Ltd. All rights reserved.
Assessment of Uncertainty-Infused Scientific Argumentation
ERIC Educational Resources Information Center
Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.
2014-01-01
Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…
NASA Astrophysics Data System (ADS)
Newman, A. J.; Clark, M. P.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Mizukami, N.; Longman, R. J.; Giambelluca, T. W.; Cherry, J.; Nowak, K.; Arnold, J.; Prein, A. F.
2016-12-01
Gridded precipitation and temperature products are inherently uncertain due to myriad factors. These include interpolation from a sparse observation network, measurement representativeness, and measurement errors. Despite this, uncertainty estimates are typically not included, or are dataset-specific additions with little general applicability across datasets. A lack of quantitative uncertainty estimates for hydrometeorological forcing fields limits their utility to support land surface and hydrologic modeling techniques such as data assimilation, probabilistic forecasting and verification. To address this gap, we have developed a first-of-its-kind gridded, observation-based ensemble of precipitation and temperature at a daily increment for the period 1980-2012 over the United States (including Alaska and Hawaii). A longer, higher resolution version (1970-present, 1/16th degree) has also been implemented to support real-time hydrologic monitoring and prediction in several regional US domains. We will present the development and evaluation of the dataset, along with initial applications of the dataset for ensemble data assimilation and probabilistic evaluation of high resolution regional climate model simulations. We will also present results on the new high resolution products for Alaska and Hawaii (2 km and 250 m respectively), to complete the first ensemble observation-based product suite for the entire 50 states. Finally, we will present plans to improve the ensemble dataset, focusing on efforts to improve the methods used for station interpolation and ensemble generation, as well as methods to fuse station data with numerical weather prediction model output.
Simic, Vladimir
2016-06-01
As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations. The proposed model can incorporate various uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. Particularly, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potentials and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem under uncertainty. The presented model has advantages in providing bases for determining long-term ELV management plans with desired compromises between economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
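The effect of the constraint-violation probability on a plan can be illustrated with a scenario-based chance constraint: choose the smallest capacity whose empirical violation frequency stays below the allowed level. The demand distribution, names, and numbers below are illustrative assumptions, not the paper's interval-parameter model:

```python
import random

def capacity_for(demands, p_violate):
    """Smallest sampled capacity whose empirical violation probability
    (fraction of scenarios exceeding it) is at most p_violate."""
    s = sorted(demands)
    k = int((1.0 - p_violate) * len(s))  # index of the empirical (1-p) quantile
    return s[min(k, len(s) - 1)]

rng = random.Random(2)
# Hypothetical ELV supply scenarios (units per period)
demands = [rng.gauss(100.0, 15.0) for _ in range(10000)]

# Tighter violation probabilities force more conservative (larger) capacity
for p in (0.10, 0.05, 0.01):
    print(p, round(capacity_for(demands, p), 1))
```

This mirrors the trade-off the abstract describes: lowering the probability of violating system constraints buys reliability at the cost of economic efficiency.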
Guo, Yang; Tian, Jinping; Chertow, Marian; Chen, Lujun
2016-10-03
Mitigating greenhouse gas (GHG) emissions in China's industrial sector is crucial for addressing climate change. We developed a vintage stock model to quantify the GHG mitigation potential and cost effectiveness in Chinese eco-industrial parks by targeting energy infrastructure with five key measures. The model, integrating energy efficiency assessments, GHG emission accounting, cost-effectiveness analyses, and scenario analyses, was applied to 548 units of energy infrastructure in 106 parks. The results indicate that two measures (shifting coal-fired boilers to natural gas-fired boilers and replacing coal-fired units with natural gas combined cycle units) present a substantial potential to mitigate GHGs (42%-46%) compared with the baseline scenario. The other three measures (installation of municipal solid waste-to-energy units, replacement of small-capacity coal-fired units with large units, and implementation of turbine retrofitting) present potential mitigation values of 6.7%, 0.3%, and 2.1%, respectively. In most cases, substantial economic benefits also can be achieved by GHG emission mitigation. An uncertainty analysis showed that enhancing the annual working time or serviceable lifetime levels could strengthen the GHG mitigation potential at a lower cost for all of the measures.
DeWeber, Jefferson T; Wagner, Tyler
2018-06-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions.
Our study demonstrates that even relatively small differences in the definitions of climate metrics can result in very different projections and reveal high uncertainty in predicted climate change effects. © 2018 John Wiley & Sons Ltd.
Climate change, extreme weather events, and US health impacts: what can we say?
Mills, David M
2009-01-01
This review addresses how climate change impacts on a group of extreme weather events could affect US public health. A literature review summarizes arguments for, and evidence of, a climate change signal in select extreme weather event categories, projections for future events, and potential trends in adaptive capacity and vulnerability in the United States. Western US wildfires already exhibit a climate change signal. The variability within hurricane and extreme precipitation/flood data complicates identifying a similar climate change signal. Health impacts of extreme events are not equally distributed and are very sensitive to a subset of exceptional extreme events. Cumulative uncertainty in forecasting the climate-change-driven characteristics of extreme events and adaptation prevents confidently projecting the future health impacts from hurricanes, wildfires, and extreme precipitation/floods in the United States attributable to climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, Mitchell T.
Although the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform Decontamination and Decommissioning activities, improving the ability of the Tokyo Electric Power Company Holdings, Incorporated (TEPCO Holdings) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document, which has been updated to include FY2017 information, summarizes results from U.S. efforts to use information obtained by TEPCO Holdings to enhance the safety of existing and future nuclear power plant designs. This effort, which was initiated in 2014 by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of U.S. experts in LWR safety and plant operations that have identified examination needs and are evaluating TEPCO Holdings information from Daiichi that address these needs.
Each year, annual reports include examples demonstrating that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident modeling progression, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities. Highlights in this FY2017 report include new insights with respect to the forces required to produce the observed Daiichi Unit 1 (1F1) shield plug end-state, the observed leakage from 1F1 components, and the amount of combustible gas generation required to produce the observed explosions in Daiichi Units 3 and 4 (1F3 and 1F4). This report contains an appendix with a list of examination needs that was updated after U.S. experts reviewed recently obtained information from examinations at Daiichi. Additional details for higher priority, near-term, examination activities are also provided. This report also includes an appendix with a description of an updated website that has been reformatted to better assist U.S. experts by providing information in an archived retrievable location, as well as an appendix summarizing U.S. Forensics activities to host the TMI-2 Knowledge Transfer and Relevance to Fukushima Meeting that was held in Idaho Falls, ID, on October 10-14, 2016.
Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing
NASA Astrophysics Data System (ADS)
Datta, D.
2010-10-01
Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of releases of radionuclides from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of the risk assessment. The paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing, which is adopted because of the lack of information on the parameters of the corresponding models. Soft computing in this domain primarily uses fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
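A triangular membership function and its α-cut propagation through a dispersion formula can be sketched as follows. The vertex method shown is valid here because the toy concentration model is monotone in each parameter; the model form and all numbers are illustrative assumptions, not the paper's:

```python
from itertools import product

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(f, tris, alphas):
    """Vertex-method propagation: evaluate f at every combination of
    alpha-cut endpoints (valid when f is monotone in each argument)."""
    out = []
    for al in alphas:
        cuts = [alpha_cut(t, al) for t in tris]
        vals = [f(*combo) for combo in product(*cuts)]
        out.append((al, min(vals), max(vals)))
    return out

def conc(Q, u, sigma):
    """Toy ground-level concentration model C = Q / (u * sigma)."""
    return Q / (u * sigma)

Q = (0.9, 1.0, 1.1)     # source term (illustrative)
u = (2.0, 3.0, 4.0)     # wind speed (illustrative)
s = (40.0, 50.0, 60.0)  # dispersion parameter (illustrative)
for al, lo, hi in propagate(conc, [Q, u, s], [0.0, 0.5, 1.0]):
    print(al, round(lo, 5), round(hi, 5))
```

At α = 1 every interval collapses to the modal value and the output is a single crisp concentration; at α = 0 the widest interval captures the full support of the epistemic uncertainty.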
Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model
USDA-ARS?s Scientific Manuscript database
Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...
To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.
Incorporating nurse absenteeism into staffing with demand uncertainty.
Maass, Kayse Lee; Liu, Boying; Daskin, Mark S; Duck, Mary; Wang, Zhehui; Mwenesi, Rama; Schapiro, Hannah
2017-03-01
Increased nurse-to-patient ratios are associated with increased costs, but also with improved patient care and reduced nurse burnout rates. Thus, it is critical from a cost, patient safety, and nurse satisfaction perspective that nurses be utilized efficiently and effectively. To address this, we propose a stochastic programming formulation for nurse staffing that accounts for variability in the patient census and nurse absenteeism, day-to-day correlations among the patient census levels, and costs associated with three different classes of nursing personnel: unit, pool, and temporary nurses. The decisions to be made include: how many unit nurses to employ, how large a pool of cross-trained nurses to maintain, how to allocate the pool nurses on a daily basis, and how many temporary nurses to utilize daily. A genetic algorithm is developed to solve the resulting model. Preliminary results using data from a large university hospital suggest that the proposed model can save a four-unit pool hundreds of thousands of dollars annually as opposed to the crude heuristics the hospital currently employs.
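The paper's full stochastic program and genetic algorithm are not reproduced here; the sketch below only illustrates the scenario-based expected-cost evaluation at the heart of such a model, with hypothetical cost coefficients, absenteeism rate, and nurse-to-patient ratio:

```python
import random

def expected_daily_cost(unit_nurses, pool_nurses, scenarios,
                        c_unit=1.0, c_pool=1.2, c_temp=2.0,
                        absence_rate=0.05, patients_per_nurse=4,
                        n_samples=2000, seed=0):
    """Monte Carlo estimate of expected daily staffing cost for one unit.
    The patient census and nurse absenteeism are random; any shortfall
    after unit and pool nurses is covered by temporary nurses. All cost
    coefficients and ratios here are illustrative, not the paper's."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        census = rng.choice(scenarios)                 # sampled patient census
        present = sum(rng.random() > absence_rate for _ in range(unit_nurses))
        needed = -(-census // patients_per_nurse)      # ceiling division
        pool_used = min(pool_nurses, max(0, needed - present))
        temps = max(0, needed - present - pool_used)
        total += c_unit * unit_nurses + c_pool * pool_used + c_temp * temps
    return total / n_samples
```

A solver (genetic or otherwise) would search over `unit_nurses` and `pool_nurses` to minimize this expected cost; expensive temporary nurses penalize understaffing, while idle unit nurses penalize overstaffing.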
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marutzky, Sam J.; Andrews, Robert
The peer review team commends the Navarro-Intera, LLC (N-I), team for its efforts in using limited data to model the fate of radionuclides in groundwater at Yucca Flat. Recognizing the key uncertainties and related recommendations discussed in Section 6.0 of this report, the peer review team has concluded that the U.S. Department of Energy (DOE) is ready for a transition to model evaluation studies in the corrective action decision document (CADD)/corrective action plan (CAP) stage. The DOE, National Nuclear Security Administration Nevada Field Office (NNSA/NFO) clarified the charge to the peer review team in a letter dated October 9, 2014, from Bill R. Wilborn, NNSA/NFO Underground Test Area (UGTA) Activity Lead, to Sam J. Marutzky, N-I UGTA Project Manager: “The model and supporting information should be sufficiently complete that the key uncertainties can be adequately identified such that they can be addressed by appropriate model evaluation studies. The model evaluation studies may include data collection and model refinements conducted during the CADD/CAP stage. One major input to identifying ‘key uncertainties’ is the detailed peer review provided by independent qualified peers.” The key uncertainties that the peer review team recognized and potential concerns associated with each are outlined in Section 6.0, along with recommendations corresponding to each uncertainty. The uncertainties, concerns, and recommendations are summarized in Table ES-1. The number associated with each concern refers to the section in this report where the concern is discussed in detail.
Sensitivity and uncertainty analysis for the annual phosphorus loss estimator model
USDA-ARS?s Scientific Manuscript database
Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...
A framework for modeling uncertainty in regional climate change
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...
Faris, A M; Wang, H-H; Tarone, A M; Grant, W E
2016-05-31
Estimates of insect age can be informative in death investigations and, when certain assumptions are met, can be useful for estimating the postmortem interval (PMI). Currently, the accuracy and precision of PMI estimates is unknown, as error can arise from sources of variation such as measurement error, environmental variation, or genetic variation. Ecological models are an abstract, mathematical representation of an ecological system that can make predictions about the dynamics of the real system. To quantify the variation associated with the pre-appearance interval (PAI), we developed an ecological model that simulates the colonization of vertebrate remains by Cochliomyia macellaria (Fabricius) (Diptera: Calliphoridae), a primary colonizer in the southern United States. The model is based on a development data set derived from a local population and represents the uncertainty in local temperature variability to address PMI estimates at local sites. After a PMI estimate is calculated for each individual, the model calculates the maximum, minimum, and mean PMI, as well as the range and standard deviation for stadia collected. The model framework presented here is one manner by which errors in PMI estimates can be addressed in court when no empirical data are available for the parameter of interest. We show that PAI is a potential important source of error and that an ecological model is one way to evaluate its impact. Such models can be re-parameterized with any development data set, PAI function, temperature regime, assumption of interest, etc., to estimate PMI and quantify uncertainty that arises from specific prediction systems. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
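The summary statistics the model reports for a set of per-individual PMI estimates can be computed directly; this small sketch (function and variable names are illustrative, not the authors') shows the calculation:

```python
from statistics import mean, stdev

def summarize_pmi(pmi_estimates):
    """Summary statistics over per-individual PMI estimates (e.g. hours),
    mirroring the outputs the model reports for the stadia collected:
    maximum, minimum, mean, range, and standard deviation."""
    return {
        "min": min(pmi_estimates),
        "max": max(pmi_estimates),
        "mean": mean(pmi_estimates),
        "range": max(pmi_estimates) - min(pmi_estimates),
        "stdev": stdev(pmi_estimates) if len(pmi_estimates) > 1 else 0.0,
    }
```

Reporting the range and standard deviation alongside the point estimate is what allows the uncertainty arising from the pre-appearance interval to be presented explicitly, rather than a single PMI value.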
Autonomous frequency domain identification: Theory and experiment
NASA Technical Reports Server (NTRS)
Yam, Yeung; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.; Scheid, R. E.
1989-01-01
The analysis, design, and on-orbit tuning of robust controllers require more information about the plant than simply a nominal estimate of the plant transfer function. Information is also required concerning the uncertainty in the nominal estimate, or more generally, the identification of a model set within which the true plant is known to lie. The identification methodology that was developed and experimentally demonstrated makes use of a simple but useful characterization of the model uncertainty based on the output error. This is a characterization of the additive uncertainty in the plant model, which has found considerable use in many robust control analysis and synthesis techniques. The identification process is initiated by a stochastic input u which is applied to the plant p, giving rise to the output y. The spectral estimate ĥ = P_uy/P_uu is used as an estimate of p, and the model order is estimated using the product moment matrix (PMM) method. A parametric model p̂ is then determined by curve fitting the spectral estimate to a rational transfer function. The additive uncertainty δ_m = p − p̂ is then estimated by the cross-spectral estimate δ̂ = P_ue/P_uu, where e = y − ŷ is the output error and ŷ = p̂u is the computed output of the parametric model subjected to the actual input u. The experimental results demonstrate that the curve-fitting algorithm produces the reduced-order plant model which minimizes the additive uncertainty. The nominal transfer function estimate p̂ and the estimate δ̂ of the additive uncertainty δ_m are subsequently available for use in optimizing robust controller performance and stability.
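A minimal sketch of the two spectral estimates described above, using SciPy's Welch and cross-spectral-density routines (the function name and defaults are illustrative assumptions, not the paper's code):

```python
import numpy as np
from scipy import signal

def estimate_plant_and_uncertainty(u, y, y_model, fs=1.0, nperseg=256):
    """Nonparametric estimates of the kind used in the identification
    scheme:
        h_hat     = P_uy / P_uu  (plant transfer-function estimate)
        delta_hat = P_ue / P_uu  (additive-uncertainty estimate),
    where e = y - y_model is the output error of a fitted parametric
    model driven by the same stochastic input u."""
    _, p_uu = signal.welch(u, fs=fs, nperseg=nperseg)   # input auto-spectrum
    _, p_uy = signal.csd(u, y, fs=fs, nperseg=nperseg)  # input/output cross-spectrum
    e = y - y_model                                     # output error
    _, p_ue = signal.csd(u, e, fs=fs, nperseg=nperseg)  # input/error cross-spectrum
    return p_uy / p_uu, p_ue / p_uu
```

For a plant that is a pure gain of 2 and a fitted model with gain 1.5, the two returned estimates are flat at roughly 2 and 0.5 across frequency, matching the additive-uncertainty definition δ_m = p − p̂.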
Measuring, Estimating, and Deciding under Uncertainty.
Michel, Rolf
2016-03-01
The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. Copyright © 2015 Elsevier Ltd. All rights reserved.
Carcioppolo, Nick; Yang, Fan; Yang, Qinghua
2016-09-01
Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.
NASA Technical Reports Server (NTRS)
Breininger, David; Duncan, Brean; Eaton, Mitchell; Johnson, Fred; Nichols, James
2014-01-01
Land cover modeling is used to inform land management, but most often via a two-step process in which science informs how management alternatives can influence resources, and then decision makers use this information to make decisions. A more efficient process is to directly integrate science and decision making, where science allows us to learn to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuels monitoring with decision making focused on the dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy, but habitat trajectories suggest tradeoffs. Knowledge about system responses to actions can be informed by applying competing management actions to different land units in the same system state and by ideas about fire behavior. Monitoring and management integration is important to optimize state-specific management decisions and increase knowledge about system responses. We believe this approach has broad utility for land cover modeling programs intended to inform decision making.
Fuzzy logic path planning system for collision avoidance by an autonomous rover vehicle
NASA Technical Reports Server (NTRS)
Murphy, Michael G.
1993-01-01
The Space Exploration Initiative of the United States will make great demands upon NASA and its limited resources. One aspect of great importance will be providing for autonomous (unmanned) operation of vehicles and/or subsystems in space flight and surface exploration. An additional, complicating factor is that much of the need for autonomy of operation will arise under conditions of great uncertainty or ambiguity. Issues are addressed in developing an autonomous collision avoidance subsystem within a path planning system for application in a remote, hostile environment that does not lend itself well to remote manipulation by Earth-based telecommunications. A good focus is unmanned surface exploration of Mars. The uncertainties involved indicate that robust approaches such as fuzzy logic control are particularly appropriate. Four major issues addressed are (1) avoidance of a fuzzy moving obstacle; (2) backoff from a dead end in a static obstacle environment; (3) fusion of sensor data to detect obstacles; and (4) options for adaptive learning in a path planning system. Examples of moving obstacles requiring collision avoidance by an autonomous rover vehicle on the surface of Mars would be wind-blown debris, surface flows or anomalies due to subsurface disturbances, another vehicle, etc. The other issues of backoff, sensor fusion, and adaptive learning are important in the overall path planning system.
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Schlamminger, Stephan; Stoudt, Sara; Pratt, Jon R.; Williams, Carl J.
2018-02-01
The Consultative Committee for Mass and Related Quantities (CCM) of the International Committee for Weights and Measures (CIPM) has recently declared the readiness of the community to support the redefinition of the International System of Units (SI) at the next meeting of the General Conference on Weights and Measures (CGPM), scheduled for November 2018. This redefinition will replace the international prototype of the kilogram (IPK), as the definition and sole primary realization of the unit of mass, with a definition involving the Planck constant, h. Redefinition in terms of a fundamental constant of nature will enable widespread primary realizations not only of the kilogram but also of its multiples and sub-multiples, to best address the full range of practical needs in the measurement of mass. We review and discuss the statistical models and statistical data reductions, uncertainty evaluations, and substantive arguments that support the verification of several technical preconditions for the redefinition that the CCM has established, and whose verification the CCM has affirmed. These conditions relate to the accuracy and mutual consistency of qualifying measurement results. We also review an issue that has surfaced only recently, concerning the convergence toward a stable value of the historical values that the Task Group on Fundamental Constants of the Committee on Data for Science and Technology (CODATA-TGFC) has recommended for h over the years, even though the CCM has not deemed this issue to be relevant. We conclude that no statistically significant trend can be substantiated for these recommended values, but note that cumulative consensus values that may be derived from the historical measurement results for h seem to have converged while continuing to exhibit fluctuations that are typical of a process in statistical control.
Finally, we argue that the most recent consensus value derived from the best measurements available for h, obtained using either a Kibble balance or the XRCD method, is reliable and has uncertainty no larger than the uncertainties surrounding the current primary and secondary realizations of the unit of mass, hence that no credible technical impediments stand in the way of the redefinition of the unit of mass in terms of a fixed value of h.
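A cumulative consensus value of the kind discussed above is often illustrated with an inverse-variance weighted mean; the sketch below shows that simplest estimator (actual CODATA adjustments are considerably more elaborate, so this is illustrative only):

```python
import math

def consensus_value(values, uncertainties):
    """Inverse-variance weighted mean of independent measurement results
    and its standard uncertainty. Each measurement is weighted by
    1/u_i^2, so more precise results dominate the consensus."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    return mean, math.sqrt(1.0 / wsum)
```

Recomputing this estimator as each new historical result is appended gives the kind of cumulative consensus series whose convergence, and residual in-control fluctuation, is discussed in the abstract.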
Optimal infrastructure maintenance scheduling problem under budget uncertainty.
DOT National Transportation Integrated Search
2010-05-01
This research addresses a general class of infrastructure asset management problems. Infrastructure agencies usually face budget uncertainties that will eventually lead to suboptimal planning if maintenance decisions are made without taking the u...
Isendahl, Nicola; Dewulf, Art; Pahl-Wostl, Claudia
2010-01-01
By now, the need for addressing uncertainty in the management of water resources is widely recognized, yet there is little expertise and experience in how to deal effectively with uncertainty in practice. Uncertainties in water management practice have so far mostly been dealt with intuitively or based on experience. That way decisions can be taken quickly, but analytic processes of deliberate reasoning are bypassed. To meet practitioners' desire for better guidance and tools for dealing with uncertainty, more practice-oriented systematic approaches are needed. For that purpose we consider it important to understand how practitioners frame uncertainties. In this paper we present an approach in which water managers developed criteria of relevance for understanding and addressing uncertainties. The empirical research took place in the Doñana region of the Guadalquivir estuary in southern Spain, making use of the method of card sorting. Through the card sorting exercise, a broad range of criteria for making sense of and describing uncertainties was produced by different subgroups, and these were then merged into a shared list of criteria. In that way framing differences were made explicit, and communication on uncertainty and on framing differences was enhanced. The present approach thus constitutes a first step towards enabling reframing and overcoming framing differences, which are important features on the way to robust decision-making. Moreover, the elaborated criteria provide a basis for the development of more structured approaches for dealing with uncertainties in water management practice. Copyright 2009 Elsevier Ltd. All rights reserved.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those...
Steering the measured uncertainty under decoherence through local PT-symmetric operations
NASA Astrophysics Data System (ADS)
Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu
2018-07-01
The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics, which intrinsically offers a lower bound with regard to the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we attempt to observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and put forward an effective strategy to manipulate the magnitude of the uncertainty of interest by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that there exist different evolution characteristics of the uncertainty in the channels considered here, i.e. monotonic behavior in the nonunital channels and non-monotonic behavior in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campos, E; Sisterson, Douglas
The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. As well, the ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks.
As a result, this study will address the first steps towards reporting ARM measurement uncertainty: 1) identifying how the uncertainty of individual ARM measurements is currently expressed, 2) identifying a consistent approach to measurement uncertainty, and then 3) reclassifying ARM instrument measurement uncertainties in a common framework.
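The proposed combination of instrument, field, and retrieval uncertainties is a standard root-sum-square; a minimal sketch, assuming the three components are independent (which RSS combination requires):

```python
import math

def total_measurement_uncertainty(u_instrument, u_field, u_retrieval=0.0):
    """Root-sum-square combination of the three uncertainty components
    proposed in the report: instrument (calibration factors), field
    (environmental factors), and retrieval (algorithm factors).
    All components must be expressed in the same units."""
    return math.sqrt(u_instrument ** 2 + u_field ** 2 + u_retrieval ** 2)
```

This is the same combination rule used in the helium leak detection study cited above, where bias and precision contributions are merged by the root-sum-square method.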
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Dogfish Monitoring Committee shall identify and review the relevant sources of management uncertainty to... management uncertainty that were considered, technical approaches to mitigating these sources of uncertainty..., DEPARTMENT OF COMMERCE FISHERIES OF THE NORTHEASTERN UNITED STATES Management Measures for the Spiny Dogfish...
Fear and Stigma: The Epidemic within the SARS Outbreak
Sy, Francisco; Holton, Kelly; Govert, Barbara; Liang, Arthur; Garza, Brenda; Gould, Deborah; Hickson, Meredith; McDonald, Marian; Meijer, Cecilia; Smith, Julia; Veto, Liza; Williams, Walter; Zauderer, Laura
2004-01-01
Because of their evolving nature and inherent scientific uncertainties, outbreaks of emerging infectious diseases can be associated with considerable fear in the general public or in specific communities, especially when illness and deaths are substantial. Mitigating fear and discrimination directed toward persons infected with, and affected by, infectious disease can be important in controlling transmission. Persons who are feared and stigmatized may delay seeking care and remain in the community undetected. This article outlines efforts to rapidly assess, monitor, and address fears associated with the 2003 severe acute respiratory syndrome (SARS) epidemic in the United States. Although fear, stigmatization, and discrimination were not widespread in the general public, Asian-American communities were particularly affected. PMID:15030713
The U.S. Geological Survey Monthly Water Balance Model Futures Portal
Bock, Andy
2017-03-16
Simulations of future climate suggest profiles of temperature and precipitation may differ significantly from those in the past. These changes in climate will likely lead to changes in the hydrologic cycle. As such, natural resource managers are in need of tools that can provide estimates of key components of the hydrologic cycle, uncertainty associated with the estimates, and limitations associated with the climate forcing data used to estimate these components. To help address this need, the U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) provides a user friendly interface to deliver hydrologic and meteorological variables for monthly historic and potential future climatic conditions across the continental United States.
Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Jennifer A. Szymanski,
2014-01-01
Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We assessed the variability in our results arising from these uncertainties. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models accurately depicting the effects of that uncertainty are useful for making management decisions, as these models are a coherent organization of the best available information.
Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty
Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon
2006-01-01
Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities (WG02). The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods, and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, and data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps, calculated for the scenario earthquake sources defined in the WG02 model. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as: what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty.
Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions adopted in the loss calculations. This is a sensitivity study aimed at future regional earthquake source modelers, so that they may be informed of the effects on loss introduced by modeling assumptions and epistemic uncertainty in the WG02 earthquake source model.
On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification
NASA Technical Reports Server (NTRS)
Lim, K. B.; Giesy, D. P.
1997-01-01
In previous work, the determination of uncertainty models via minimum norm model validation is based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this leads to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result leads to less conservative and more realistic uncertainty models for use in robust control.
NASA Technical Reports Server (NTRS)
Radespiel, Rolf; Hemsch, Michael J.
2007-01-01
The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem designs, makes the use of high-fidelity simulation a future alternative for design and development. The predictive ability of simulations such as computational fluid dynamics (CFD) and computational structural mechanics (CSM) has matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT 147 Symposium has been established to compile state-of-the-art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high-fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from either the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples of the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.
NASA Astrophysics Data System (ADS)
Griffin, Patrick; Rochman, Dimitri; Koning, Arjan
2017-09-01
A rigorous treatment of the uncertainty in the underlying nuclear data on silicon displacement damage metrics is presented. The uncertainties in the cross sections and recoil atom spectra are propagated into the energy-dependent uncertainty contribution in the silicon displacement kerma and damage energy using a Total Monte Carlo treatment. An energy-dependent covariance matrix is used to characterize the resulting uncertainty. A strong correlation between different reaction channels is observed in the high-energy neutron contributions to the displacement damage metrics, which supports the necessity of using a Monte Carlo based method to address the nonlinear nature of the uncertainty propagation.
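A Total Monte Carlo propagation of this kind can be sketched as follows, using a toy group-wise metric in place of real random nuclear-data files; the sampling model and group structure here are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_groups = 500, 8     # random nuclear-data samples, energy groups

# Toy stand-in for a library of random nuclear-data files: a nominal
# group-wise damage metric perturbed by a common-mode factor plus
# group-wise noise (this induces inter-group correlation).
nominal = np.linspace(1.0, 3.0, n_groups)
perturb = 1.0 + 0.05 * rng.standard_normal((n_samples, 1))   # common mode
perturb = perturb + 0.02 * rng.standard_normal((n_samples, n_groups))
samples = nominal * perturb      # (n_samples, n_groups) metric values

# Energy-dependent covariance and correlation of the propagated metric.
cov = np.cov(samples, rowvar=False)
std = np.sqrt(np.diag(cov))
corr = cov / np.outer(std, std)
rel_unc = std / samples.mean(axis=0)   # relative uncertainty per group
```

Because each sample carries a fully perturbed data set through the calculation, nonlinearities are propagated exactly rather than linearized, which is the point of the Total Monte Carlo approach.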
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radulescu, Georgeta; Gauld, Ian C; Ilas, Germina
2011-01-01
The expanded use of burnup credit in the United States (U.S.) for storage and transport casks, particularly in the acceptance of credit for fission products, has been constrained by the availability of experimental fission product data to support code validation. The U.S. Nuclear Regulatory Commission (NRC) staff has noted that the rationale for restricting the Interim Staff Guidance on burnup credit for storage and transportation casks (ISG-8) to actinide-only credit is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issues of burnup credit criticality validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach to representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the isotopic composition (depletion) validation approach and the resulting observations and recommendations. Validation of the criticality calculations is addressed in a companion paper at this conference. For isotopic composition validation, the approach is to determine the burnup-dependent bias and uncertainty in the effective neutron multiplication factor (keff) due to bias and uncertainty in isotopic predictions, via comparisons of calculated isotopic composition predictions with measured isotopic compositions from destructive radiochemical assay, utilizing as much assay data as is available and a best-estimate Monte Carlo based method.
This paper (1) provides a detailed description of the burnup credit isotopic validation approach and its technical bases, (2) describes the application of the approach to representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, and (3) provides reference bias and uncertainty results based on a quality-assurance-controlled prerelease version of the Scale 6.1 code package and the ENDF/B-VII nuclear cross section data.
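The calculated-versus-measured comparison at the heart of isotopic validation can be sketched with C/E (calculated-to-experimental) ratios. The numbers below are fictitious, and the simple mean/standard-deviation treatment is only a stand-in for the report's best-estimate Monte Carlo method:

```python
import numpy as np

# Hypothetical calculated (C) and measured (E) isotopic concentrations for
# one nuclide across several radiochemical assay samples (arbitrary units).
calculated = np.array([1.02, 0.97, 1.05, 0.99, 1.01])
measured   = np.array([1.00, 1.00, 1.00, 1.00, 1.00])

ce = calculated / measured          # C/E ratio for each assay sample
bias = ce.mean() - 1.0              # mean relative bias in the prediction
uncertainty = ce.std(ddof=1)        # sample standard deviation of C/E
```

In the actual approach, such nuclide-wise biases and uncertainties would then be propagated into a burnup-dependent bias and uncertainty in keff.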
Estimating uncertainties in complex joint inverse problems
NASA Astrophysics Data System (ADS)
Afonso, Juan Carlos
2016-04-01
Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, all uncertainty estimates are model dependent, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them.
Although the emphasis will be on sources of uncertainty related to the forward and statistical models, I will also address other uncertainties associated with data and uncertainty propagation.
State Tracking and Fault Diagnosis for Dynamic Systems Using Labeled Uncertainty Graph.
Zhou, Gan; Feng, Wenquan; Zhao, Qi; Zhao, Hongbo
2015-11-05
Cyber-physical systems such as autonomous spacecraft, power plants and automotive systems become more vulnerable to unanticipated failures as their complexity increases. Accurate tracking of system dynamics and fault diagnosis are essential. This paper presents an efficient state estimation method for dynamic systems modeled as concurrent probabilistic automata. First, the Labeled Uncertainty Graph (LUG) method in the planning domain is introduced to describe the state tracking and fault diagnosis processes. Because the system model is probabilistic, the Monte Carlo technique is employed to sample the probability distribution of belief states. In addition, to address the sample impoverishment problem, an innovative look-ahead technique is proposed to recursively generate most likely belief states without exhaustively checking all possible successor modes. The overall algorithms incorporate two major steps: a roll-forward process that estimates system state and identifies faults, and a roll-backward process that analyzes possible system trajectories once the faults have been detected. We demonstrate the effectiveness of this approach by applying it to a real world domain: the power supply control unit of a spacecraft.
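The Monte Carlo sampling of belief states over a probabilistic automaton can be sketched as a plain particle filter. The three-mode model, transition matrix, and observation likelihoods below are invented for illustration, and the paper's look-ahead refinement against sample impoverishment is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy probabilistic automaton: 3 modes (0=nominal, 1=degraded, 2=failed),
# a mode-transition matrix, and a per-mode likelihood of the current
# observation (hypothetical sensor evidence).
T = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
obs_likelihood = np.array([0.10, 0.60, 0.30])

n_particles = 1000
particles = np.zeros(n_particles, dtype=int)   # all start in nominal mode

# One roll-forward step: sample successor modes, then weight each particle
# by how well its mode explains the observation.
particles = np.array([rng.choice(3, p=T[m]) for m in particles])
weights = obs_likelihood[particles]
weights /= weights.sum()

# Resample to concentrate particles on likely modes (plain sequential
# importance resampling; the look-ahead technique would instead generate
# the most likely successor belief states directly).
particles = rng.choice(particles, size=n_particles, p=weights)
belief = np.bincount(particles, minlength=3) / n_particles
```

The resulting `belief` vector approximates the posterior probability of each mode, which is what the roll-forward fault-identification step tracks over time.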
Introducing Blended Learning: An Experience of Uncertainty for Students in the United Arab Emirates
ERIC Educational Resources Information Center
Kemp, Linzi J.
2013-01-01
The cultural dimension of Uncertainty Avoidance is analysed in this study of an introduction to blended learning for international students. Content analysis was conducted on the survey narratives collected from three cohorts of management undergraduates in the United Arab Emirates. Interpretation of certainty with blended learning was found in:…
Capturing Uncertainty in Fatigue Life Data
2014-09-18
Capturing Uncertainty in Fatigue Life Data. Thesis, Brent D. Russell, AFIT-ENS-T-14-S-15, Department of the Air Force. Distribution unlimited. The views expressed in this thesis are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense, or the United States Government.
Modeling uncertainty: quicksand for water temperature modeling
Bartholow, John M.
2003-01-01
Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.
Addressing location uncertainties in GPS-based activity monitoring: A methodological framework
Wan, Neng; Lin, Ge; Wilson, Gaines J.
2016-01-01
Location uncertainty has been a major barrier in information mining from location data. Although the development of electronic and telecommunication equipment has led to an increased amount and refined resolution of data about individuals’ spatio-temporal trajectories, the potential of such data, especially in the context of environmental health studies, has not been fully realized due to the lack of methodology that addresses location uncertainties. This article describes a methodological framework for deriving information about people’s continuous activities from individual-collected Global Positioning System (GPS) data, which is vital for a variety of environmental health studies. This framework is composed of two major methods that address critical issues at different stages of GPS data processing: (1) a fuzzy classification method for distinguishing activity patterns; and (2) a scale-adaptive method for refining activity locations and outdoor/indoor environments. Evaluation of this framework based on smartphone-collected GPS data indicates that it is robust to location errors and is able to generate useful information about individuals’ life trajectories. PMID:28943777
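The fuzzy classification step can be sketched with simple trapezoidal membership functions over GPS speed; the activity classes and thresholds below are illustrative assumptions, not the framework's calibrated values:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c],
    linear ramps in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def classify_speed(speed_ms):
    """Fuzzy memberships for a single GPS speed sample (m/s); the class
    boundaries are hypothetical, for illustration only."""
    return {
        "stationary": trapezoid(speed_ms, -1.0, -0.5, 0.3, 0.8),
        "walking":    trapezoid(speed_ms, 0.3, 0.8, 2.0, 3.0),
        "vehicle":    trapezoid(speed_ms, 2.0, 3.0, 40.0, 60.0),
    }

memberships = classify_speed(1.2)
```

The overlapping ramps are what let the method express location and speed uncertainty: a noisy 0.5 m/s fix is partly "stationary" and partly "walking" rather than being forced into one class.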
Identifying and assessing critical uncertainty thresholds in a forest pest risk model
Frank H. Koch; Denys Yemshanov
2015-01-01
Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model's numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...
Final Technical Report: Advanced Measurement and Analysis of PV Derate Factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, Bruce Hardison; Burton, Patrick D.; Hansen, Clifford
2015-12-01
The Advanced Measurement and Analysis of PV Derate Factors project focuses on improving the accuracy and reducing the uncertainty of PV performance model predictions by addressing a common element of all PV performance models referred to as “derates”. Widespread use of “rules of thumb”, combined with significant uncertainty regarding appropriate values for these factors, contributes to uncertainty in projected energy production.
Uncertainty in Early Occupational Aspirations: Role Exploration or Aimlessness?
ERIC Educational Resources Information Center
Staff, Jeremy; Harris, Angel; Sabates, Ricardo; Briddell, Laine
2010-01-01
Many youth in the United States lack clear occupational aspirations. This uncertainty in achievement ambitions may benefit socio-economic attainment if it signifies "role exploration," characterized by career development, continued education and enduring partnerships. By contrast, uncertainty may diminish attainment if it instead leads…
Climate change impacts on extreme events in the United States: an uncertainty analysis
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...
Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.
2015-01-01
This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108
Breininger, David; Duncan, Brean; Eaton, Mitchell J.; Johnson, Fred; Nichols, James
2014-01-01
Land cover modeling is used to inform land management, but most often via a two-step process, where science informs how management alternatives can influence resources, and then, decision makers can use this information to make decisions. A more efficient process is to directly integrate science and decision-making, where science allows us to learn in order to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by the specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuel monitoring with decision-making focused on the dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy; other conditions require tradeoffs between objectives. Knowledge about system responses to actions can be informed by developing hypotheses based on ideas about fire behavior and then applying competing management actions to different land units in the same system state. Monitoring and management integration is important to optimize state-specific management decisions and to increase knowledge about system responses. We believe this approach has broad utility and identifies a clear role for land cover modeling programs intended to inform decision-making.
Earth-Science Research for Addressing the Water-Energy Nexus
NASA Astrophysics Data System (ADS)
Healy, R. W.; Alley, W. M.; Engle, M.; McMahon, P. B.; Bales, J. D.
2013-12-01
In the coming decades, the United States will face two significant and sometimes competing challenges: preserving sustainable supplies of fresh water for humans and ecosystems, and ensuring available sources of energy. This presentation provides an overview of the earth-science data collection and research needed to address these challenges. Uncertainty limits our understanding of many aspects of the water-energy nexus. These aspects include availability of water, water requirements for energy development, energy requirements for treating and delivering fresh water, effects of emerging energy development technologies on water quality and quantity, and effects of future climates and land use on water and energy needs. Uncertainties can be reduced with an integrated approach that includes assessments of water availability and energy resources; monitoring of surface water and groundwater quantity and quality, water use, and energy use; research on impacts of energy waste streams, hydraulic fracturing, and other fuel-extraction processes on water quality; and research on the viability and environmental footprint of new technologies such as carbon capture and sequestration and conversion of cellulosic material to ethanol. Planning for water and energy development requires consideration of factors such as economics, population trends, human health, and societal values; however, sound resource management must be grounded on a clear understanding of the earth-science aspects of the water-energy nexus. Information gained from an earth-science data-collection and research program can improve our understanding of water and energy issues and lay the groundwork for informed resource management.
Olea, R.A.; Luppens, J.A.; Tewalt, S.J.
2011-01-01
A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.
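The point that distance alone is a poor proxy for uncertainty can be illustrated with a simple-kriging variance under an exponential covariance model. The layouts and parameters below are invented, but the two drill-hole configurations share the same nearest-hole distance while yielding different uncertainties:

```python
import numpy as np

def simple_kriging_variance(data_xy, target_xy, sill=1.0, corr_range=50.0):
    """Simple-kriging estimation variance with an exponential covariance
    C(h) = sill * exp(-h / corr_range). Parameter values are illustrative."""
    d = np.linalg.norm(data_xy[:, None, :] - data_xy[None, :, :], axis=-1)
    C = sill * np.exp(-d / corr_range)              # data-data covariance
    h = np.linalg.norm(data_xy - target_xy, axis=-1)
    c = sill * np.exp(-h / corr_range)              # data-target covariance
    w = np.linalg.solve(C, c)                       # kriging weights
    return sill - w @ c

target = np.array([0.0, 0.0])
# Two drill-hole layouts with the SAME nearest-hole distance (20 m):
clustered = np.array([[20.0, 0.0], [22.0, 2.0], [21.0, -2.0]])
spread    = np.array([[20.0, 0.0], [0.0, 21.0], [-20.0, -3.0]])

var_clustered = simple_kriging_variance(clustered, target)
var_spread = simple_kriging_variance(spread, target)
```

Because the clustered holes are mutually redundant, their kriging variance stays close to the single-hole value, while the spread layout delivers more independent information at the same nearest-hole distance, which is exactly why distance classes alone cannot quantify uncertainty.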
Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishma
2010-01-01
In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
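A multi-model ensemble treatment of structural uncertainty can be sketched as follows; the three toy "models" and their functional forms are illustrative assumptions, not the actual TOPS ecosystem models:

```python
import numpy as np

# Three toy "models" mapping the same inputs (temperature in deg C,
# radiation in W/m^2) to a gross-primary-production-like estimate;
# the functional forms are invented for illustration.
def model_a(t, r): return 0.02 * r * max(0.0, 1 - abs(t - 20) / 25)
def model_b(t, r): return 0.018 * r * np.exp(-((t - 22) / 15) ** 2)
def model_c(t, r): return 0.5 + 0.015 * r * (1 if 5 < t < 35 else 0)

temperature, radiation = 18.0, 600.0  # shared forcing data
ensemble = np.array([m(temperature, radiation)
                     for m in (model_a, model_b, model_c)])

ensemble_mean = ensemble.mean()
structural_spread = ensemble.std(ddof=1)  # crude structural-uncertainty proxy
```

As the abstract notes, the spread of a small ensemble only begins to capture structural uncertainty; it bounds neither the shared assumptions of the models nor the possibilities none of them represents.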
Egger, C; Maurer, M
2015-04-15
Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Kalinich, Donald A.
2014-02-01
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
Using high-throughput literature mining to support read-across predictions of toxicity (SOT)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
High-throughput literature mining to support read-across predictions of toxicity (ASCCT meeting)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
Field variability and vulnerability index to identify precision agriculture opportunity
USDA-ARS?s Scientific Manuscript database
Innovations in precision agriculture (PA) have created opportunities to achieve a greater understanding of within-field variability. However, PA adoption has been hindered due to uncertainty about field-specific performance and return on investment. Uncertainty could be better addressed by analyzing...
Quantifying uncertainty in read-across assessment – an algorithmic approach - (SOT)
Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...
Accurate Radiometry from Space: An Essential Tool for Climate Studies
NASA Technical Reports Server (NTRS)
Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma
2011-01-01
The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a primary standard and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a metrology laboratory in space. Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance
NIST Stars: Absolute Spectrophotometric Calibration of Vega and Sirius
NASA Astrophysics Data System (ADS)
Deustua, Susana; Woodward, John T.; Rice, Joseph P.; Brown, Steven W.; Maxwell, Stephen E.; Alberding, Brian G.; Lykke, Keith R.
2018-01-01
Absolute flux calibration of standard stars, traceable to SI (International System of Units) standards, is essential for 21st century astrophysics. Dark energy investigations that rely on observations of Type Ia supernovae and precise photometric redshifts of weakly lensed galaxies require a minimum accuracy of 0.5% in the absolute color calibration. Studies that aim to address fundamental stellar astrophysics also benefit. In the era of large telescopes and all-sky surveys, well-calibrated standard stars that do not saturate and that are available over the whole sky are needed. Significant effort has been expended to obtain absolute measurements of the fundamental standards Vega and Sirius (and other stars) in the visible and near infrared, achieving total uncertainties between 1% and 3%, depending on wavelength, which do not meet the needed accuracy. The NIST Stars program aims to determine the top-of-the-atmosphere absolute spectral irradiance of bright stars to an uncertainty of less than 1% from a ground-based observatory. NIST Stars has developed a novel, fully SI-traceable laboratory calibration strategy that will enable achieving the desired accuracy. This strategy has two key components. The first is the SI-traceable calibration of the entire instrument system, and the second is the repeated spectroscopic measurement of the target star throughout the night. We will describe our experimental strategy, present preliminary results for Vega and Sirius, and present an end-to-end uncertainty budget.
NASA Astrophysics Data System (ADS)
Feyen, Luc; Caers, Jef
2006-06-01
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Here we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty.
Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.
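The bootstrap assessment of parameter uncertainty can be sketched with an ordinary (non-spatial) bootstrap of the global statistics. Note that a true spatial bootstrap must resample in a way that honors spatial correlation, which this simplified sketch ignores, and the synthetic data below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
log_k = rng.normal(loc=-5.0, scale=1.0, size=60)  # synthetic ln(K) samples

# Resample the data with replacement and re-estimate the global statistics
# that feed the geostatistical model (mean and variance of ln(K)).
n_boot = 2000
means, variances = np.empty(n_boot), np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(log_k, size=log_k.size, replace=True)
    means[i] = resample.mean()
    variances[i] = resample.var(ddof=1)

# 95% bootstrap intervals for the uncertain input parameters.
mean_ci = np.percentile(means, [2.5, 97.5])
var_ci = np.percentile(variances, [2.5, 97.5])
```

Carrying these parameter intervals into the simulation stage, rather than fixing the statistics at their point estimates, is what widens the space of realizations toward a more honest representation of uncertainty.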
RADIONUCLIDE INVENTORY AND DISTRIBUTION: FOURMILE BRANCH, PEN BRANCH, AND STEEL CREEK IOUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiergesell, R.; Phifer, M.
2014-04-29
As a condition to the Department of Energy (DOE) Low Level Waste Disposal Federal Facility Review Group (LFRG) review team approving the Savannah River Site (SRS) Composite Analysis (CA), SRS agreed to follow up on a secondary issue, which consisted of the consolidation of several observations that the team concluded, when evaluated collectively, could potentially impact the integration of the CA results. This report addresses secondary issue observations 4 and 21, which identify the need to improve the CA sensitivity and uncertainty analysis, specifically by improving the CA inventory and the estimate of its uncertainty. The purpose of the work described herein was to be responsive to these secondary issue observations by re-examining the radionuclide inventories of the Integrator Operable Units (IOUs), as documented in ERD 2001 and Hiergesell, et al. 2008. The LFRG concern has already been partially addressed for the Lower Three Runs (LTR) IOU (Hiergesell and Phifer, 2012). The work described in this investigation is a continuation of the effort to address the LFRG concerns by re-examining the radionuclide inventories associated with the Fourmile Branch (FMB), Pen Branch (PB), and Steel Creek (SC) IOUs.
The overall approach to computing radionuclide inventories for each of the IOUs involved the following components: • Defining contaminated reaches of sediments along the IOU waterways • Identifying separate segments within each IOU waterway to evaluate individually • Computing the volume and mass of contaminated soil associated with each segment, or “compartment” • Obtaining the available and appropriate Sediment and Sediment/Soil analytical results associated with each IOU • Standardizing all radionuclide activity by decay-correcting all sample analytical results from sample date to the current point in time, • Computing representative concentrations for all radionuclides associated with each compartment in each of the IOUs • Computing the radionuclide inventory of each DOE-added radionuclide for the compartments of each IOU by applying the representative, central value concentration to the mass of contaminated soil • Totaling the inventory for all compartments associated with each of the IOUs Using this approach the 2013 radionuclide inventories for each sub-compartment associated with each of the three IOUs were computed, by radionuclide. The inventories from all IOU compartments were then rolled-up into a total inventory for each IOU. To put the computed estimate of radionuclide activities within FMB, PB, and SC IOUs into context, attention was drawn to Cs-137, which was the radionuclide with the largest contributor to the calculated dose to a member of the public at the perimeter of SRS within the 2010 SRS CA (SRNL 2010). The total Cs-137 activity in each of the IOUs was calculated to be 9.13, 1.5, and 17.4 Ci for FMB, PB, and SC IOUs, respectively. Another objective of this investigation was to address the degree of uncertainty associated with the estimated residual radionuclide activity that is calculated for the FMB, PB, and SC IOUs. Two primary contributing factors to overall uncertainty of inventory estimates were identified and evaluated. 
The first related to the computation of the mass of contaminated material in a particular IOU compartment, and the second to the uncertainty associated with analytical counting errors. The error ranges for the mass of contaminated material in each IOU compartment were all calculated to be approximately +/- 9.6%, or a nominal +/- 10%. This nominal value was added to the uncertainty associated with the analytical counting errors for each radionuclide individually. The total uncertainty was then used to calculate maximum and minimum estimated radionuclide inventories for each IOU.
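The arithmetic described above (mass times representative concentration, a nominal ±10% mass uncertainty added to the per-radionuclide counting uncertainty, and decay correction to a common reference date) can be sketched as follows. The function names and sample numbers are purely illustrative, not taken from the report:

```python
def decay_correct(activity_ci, years_elapsed, half_life_years):
    """Decay-correct a sampled activity to a common reference date."""
    return activity_ci * 0.5 ** (years_elapsed / half_life_years)

def compartment_inventory(soil_mass_kg, conc_ci_per_kg,
                          mass_unc_frac=0.10, counting_unc_frac=0.05):
    """Nominal inventory plus a min/max range; the mass and counting
    uncertainty fractions are added, following the report's approach."""
    nominal = soil_mass_kg * conc_ci_per_kg
    total_unc = mass_unc_frac + counting_unc_frac
    return nominal, nominal * (1 - total_unc), nominal * (1 + total_unc)

# Illustrative Cs-137 numbers (half-life ~30.17 y); not SRS data
conc_2013 = decay_correct(2.0e-9, 30.17, 30.17)   # Ci/kg after one half-life
nominal, low, high = compartment_inventory(5.0e9, conc_2013)
```

Totaling `nominal` over all compartments of an IOU would give the rolled-up inventory, with `low`/`high` bounding the estimate.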
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert
1989-01-01
In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
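To make the M-delta construction concrete, here is a minimal numeric sketch (assuming numpy; the one-parameter example is ours, not from the paper). A real gain variation k = k0(1 + w·d) is pulled out into a 1×1 real delta block, and closing the loop with an upper linear fractional transformation recovers the perturbed gain:

```python
import numpy as np

def lft_upper(M, delta, n):
    """Close the uncertainty loop: upper LFT F_u(M, Delta), where the
    first n inputs/outputs of the partitioned M are tied to Delta."""
    M11, M12 = M[:n, :n], M[:n, n:]
    M21, M22 = M[n:, :n], M[n:, n:]
    return M22 + M21 @ delta @ np.linalg.inv(np.eye(n) - M11 @ delta) @ M12

# One real parameter pulled out: gain k = k0*(1 + w*d), |d| <= 1
k0, w = 2.0, 0.1
M = np.array([[0.0, 1.0],
              [k0 * w, k0]])
for d in (-1.0, 0.0, 1.0):
    assert np.isclose(lft_upper(M, np.array([[d]]), 1)[0, 0], k0 * (1 + w * d))
```

A minimal M-delta model keeps the delta block (here 1×1) as small as possible, which is what makes the structured singular value computations cheaper.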
Stochastic Optimization For Water Resources Allocation
NASA Astrophysics Data System (ADS)
Yamout, G.; Hatfield, K.
2003-12-01
For more than 40 years, water resources allocation problems have been addressed using deterministic mathematical optimization. When data uncertainties exist, these methods can lead to solutions that are sub-optimal or even infeasible. While optimization models have been proposed for water resources decision-making under uncertainty, no attempts have been made to address the uncertainties in water allocation problems in an integrated approach. This paper presents an Integrated Dynamic, Multi-stage, Feedback-controlled, Linear, Stochastic, and Distributed parameter optimization approach to solve a problem of water resources allocation. It attempts to capture (1) the conflict caused by competing objectives, (2) the environmental degradation produced by resource consumption, and (3) the uncertainty and risk generated by the inherently random nature of the state and decision parameters involved in such a problem. A theoretical system is defined through its different elements. These elements, consisting mainly of water resource components and end-users, are described in terms of quantity, quality, and present and future associated risks and uncertainties. Models are identified, modified, and interfaced to constitute an integrated water allocation optimization framework. This effort is a novel approach to the water allocation optimization problem that accounts for uncertainties associated with all its elements, resulting in a solution that correctly reflects the physical problem at hand.
NASA Astrophysics Data System (ADS)
Blum, David Arthur
Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petrol diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital (IVC) decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best practice skills at IVC firms to predict uncertainty between early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.
Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...
Scaling Up Decision Theoretic Planning to Planetary Rover Problems
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Dearden, Richard; Washington, Rich
2004-01-01
Because of communication limits, planetary rovers must operate autonomously for extended periods. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are unable to address the challenges of future missions because of several apparent limits. Decision theory, on the other hand, provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure and concurrency, and of continuous state variables. We describe two techniques currently under development that specifically address these issues and allow scaling up decision-theoretic solution techniques to planetary rover planning problems involving a small number of goals.
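The decision-theoretic core the paper builds on can be illustrated with a tiny value-iteration solver for a discrete MDP. This is a generic sketch, not the authors' method (which targets structured, concurrent, continuous-variable problems well beyond this toy):

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Solve a small discrete MDP: P[a, s, t] = Pr(next=t | state=s,
    action=a), R[s, a] = immediate reward. Returns values and policy."""
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        # Q[s, a] = reward plus discounted expected value of next state
        Q = R + gamma * np.einsum("ast,t->sa", P, V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

# Hypothetical 2-state rover: state 0 = at a rock, state 1 = done
P = np.zeros((2, 2, 2))
P[0, 0, 0] = 1.0   # action 0 in state 0: idle, stay put
P[0, 1, 1] = 1.0   # action 0 in state 1: stay done
P[1, 0, 1] = 1.0   # action 1 in state 0: sample the rock, finish
P[1, 1, 1] = 1.0
R = np.array([[0.0, 1.0],    # sampling from state 0 yields reward 1
              [0.0, 0.0]])
V, policy = value_iteration(P, R)
```

The optimal policy samples the rock from state 0; the obstacles the paper addresses arise precisely because rover state spaces are too large and continuous for this tabular treatment.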
Aktar, Evin; Nikolić, Milica; Bögels, Susan M.
2017-01-01
Generalized anxiety disorder (GAD) runs in families. Building on recent theoretical approaches, this review focuses on potential environmental pathways for parent-to-child transmission of GAD. First, we address child acquisition of a generalized pattern of fearful/anxious and avoidant responding to potential threat from parents via verbal information and via modeling. Next, we address how parenting behaviors may contribute to maintenance of fearful/anxious and avoidant reactions in children. Finally, we consider intergenerational transmission of worries as a way of coping with experiential avoidance of strong negative emotions and with intolerance of uncertainty. We conclude that parents with GAD may bias their children's processing of potential threats in the environment by conveying the message that the world is not safe, that uncertainty is intolerable, that strong emotions should be avoided, and that worry helps to cope with uncertainty, thereby transmitting cognitive styles that characterize GAD. Our review highlights the need for research on specific pathways for parent-to-child transmission of GAD. PMID:28867938
Optimal Operation of Energy Storage in Power Transmission and Distribution
NASA Astrophysics Data System (ADS)
Akhavan Hejazi, Seyed Hossein
In this thesis, we investigate optimal operation of energy storage units in power transmission and distribution grids. At the transmission level, we investigate the problem where an investor-owned, independently operated energy storage system seeks to offer energy and ancillary services in the day-ahead and real-time markets. We specifically consider the case where a significant portion of the power generated in the grid is from renewable energy resources and there exists significant uncertainty in system operation. In this regard, we formulate a stochastic programming framework to choose optimal energy and reserve bids for the storage units that takes into account the fluctuating nature of the market prices due to the randomness in renewable power generation availability. At the distribution level, we develop a comprehensive data set to model various stochastic factors on power distribution networks, with a focus on networks that have high penetration of electric vehicle charging load and distributed renewable generation. Furthermore, we develop a data-driven stochastic model for energy storage operation at the distribution level, where the distributions of nodal voltage and line power flow are modelled as stochastic functions of the energy storage unit's charge and discharge schedules. In particular, we develop new closed-form stochastic models for such key operational parameters in the system. Our approach is analytical and allows formulating tractable optimization problems. Yet, it does not involve any restricting assumption on the distribution of random parameters; hence, it results in accurate modeling of uncertainties. By considering the specific characteristics of random variables, such as their statistical dependencies and often irregularly-shaped probability distributions, we propose a non-parametric chance-constrained optimization approach to operate and plan energy storage units in power distribution grids.
In the proposed stochastic optimization, we consider uncertainty from various elements, such as solar photovoltaics, electric vehicle chargers, and residential baseloads, in the form of discrete probability functions. In the last part of this thesis we address other resources and concepts for enhancing the operation of power distribution and transmission systems. In particular, we propose a new framework to determine the best sites, sizes, and optimal payment incentives under special contracts for committed-type DG projects to offset distribution network investment costs. In this framework, the aim is to allocate DGs such that the profit gained by the distribution company is maximized while each DG unit's individual profit is also taken into account to ensure that private DG investment remains economical.
Robust portfolio selection based on asymmetric measures of variability of stock returns
NASA Astrophysics Data System (ADS)
Chen, Wei; Tan, Shaohua
2009-10-01
This paper addresses a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply our interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.
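The downside and upside deviations the abstract refers to can be sketched as lower and upper semideviations about the mean. This is our own minimal illustration; the paper's interval random uncertainty sets are a richer construction built on such asymmetric measures:

```python
import numpy as np

def asymmetric_deviations(returns):
    """Split deviations from the mean into a downside part (losses
    relative to the mean) and an upside part (gains)."""
    r = np.asarray(returns, dtype=float)
    mu = r.mean()
    downside = np.sqrt(np.mean(np.minimum(r - mu, 0.0) ** 2))
    upside = np.sqrt(np.mean(np.maximum(r - mu, 0.0) ** 2))
    return downside, upside

# Hypothetical daily returns; one large loss skews the distribution
sample = [0.02, 0.01, -0.05, 0.03, -0.01]
d, u = asymmetric_deviations(sample)
```

For a symmetric distribution the two measures coincide; here the downside deviation exceeds the upside one, which is exactly the asymmetry a symmetric variance would hide.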
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seitz, R.
2011-03-02
It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussions of the current results of work on the task on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed, and some initial proposals for approaches that can be used to manage different types of uncertainties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Díez, C.J., E-mail: cj.diez@upm.es; Cabellos, O.; Instituto de Fusión Nuclear, Universidad Politécnica de Madrid, 28006 Madrid
Several approaches have been developed in last decades to tackle nuclear data uncertainty propagation problems of burn-up calculations. One approach proposed was the Hybrid Method, where uncertainties in nuclear data are propagated only on the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties on the isotopic composition. However, a comparison with using multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.
NASA Astrophysics Data System (ADS)
Díez, C. J.; Cabellos, O.; Martínez, J. S.
2015-01-01
Several approaches have been developed in last decades to tackle nuclear data uncertainty propagation problems of burn-up calculations. One approach proposed was the Hybrid Method, where uncertainties in nuclear data are propagated only on the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties on the isotopic composition. However, a comparison with using multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.
Verbeke, Wim; Marcu, Afrodita; Rutsaert, Pieter; Gaspar, Rui; Seibt, Beate; Fletcher, Dave; Barnett, Julie
2015-04-01
Cultured meat has evolved from an idea and concept into a reality with the August 2013 cultured hamburger tasting in London. Still, how consumers conceive cultured meat is largely an open question. This study addresses consumers' reactions and attitude formation towards cultured meat through analyzing focus group discussions and online deliberations with 179 meat consumers from Belgium, Portugal and the United Kingdom. Initial reactions when learning about cultured meat were underpinned by feelings of disgust and considerations of unnaturalness. Consumers saw few direct personal benefits but they were more open to perceiving global societal benefits relating to the environment and global food security. Both personal and societal risks were framed in terms of uncertainties about safety and health, and possible adverse societal consequences dealing with loss of farming and eating traditions and rural livelihoods. Further reflection pertained to skepticism about 'the inevitable' scientific progress, concern about risk governance and control, and need for regulation and proper labeling. Copyright © 2014 Elsevier Ltd. All rights reserved.
PARTNERING TO IMPROVE HUMAN EXPOSURE METHODS
Methods development research is an application-driven scientific area that addresses programmatic needs. The goals are to reduce measurement uncertainties, address data gaps, and improve existing analytical procedures for estimating human exposures. Partnerships have been develop...
NASA Astrophysics Data System (ADS)
Croke, B. F.
2008-12-01
The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in the observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If the model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends that work by addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
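A minimal form of such an uncertainty-aware objective weights each squared residual by the combined variance of the observed and modelled values, so uncertain points count less. This is a chi-squared-style sketch under our own naming, not the paper's exact formulation (which also handles serial correlation):

```python
import numpy as np

def weighted_sse(obs, sim, sigma_obs, sigma_sim):
    """Sum of squared residuals, each weighted by the combined
    observation-plus-model variance at that point."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    var = np.asarray(sigma_obs, float) ** 2 + np.asarray(sigma_sim, float) ** 2
    return float(np.sum((obs - sim) ** 2 / var))

# A residual of 0.5 with combined sigma 0.5 contributes exactly 1
J = weighted_sse([1.0, 2.0], [1.5, 2.0], [0.3, 0.3], [0.4, 0.4])
```

With constant sigmas this reduces to an ordinary (scaled) sum of squares; the benefit appears when the uncertainties are heteroscedastic.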
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Sequentially, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
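A one-dimensional sketch of the collocation idea follows, using numpy's probabilists' Hermite basis. The study's multivariate setup and hydrological model are replaced here by a toy function; everything else (collocation points, a Hermite Vandermonde matrix, least-squares coefficients) mirrors the generic procedure:

```python
import numpy as np
from numpy.polynomial import hermite_e as He

def fit_pce(model, order, n_colloc=50, seed=0):
    """Fit y = sum_k c_k He_k(xi), xi ~ N(0,1), by least-squares
    collocation on random collocation points (1-D sketch)."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_colloc)
    A = He.hermevander(xi, order)        # columns He_0..He_order at xi
    y = model(xi)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# A linear toy model is recovered exactly by a degree-2 expansion;
# c[0] is then the output mean, since E[He_k(xi)] = 0 for k >= 1
c = fit_pce(lambda x: 3.0 + 2.0 * x, order=2)
```

Once the coefficients are known, sampling the expansion is far cheaper than re-running the original model, which is the source of the computational advantage over MCMC reported above.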
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
Liu, Wu; Gong, Yaping; Liu, Jun
2014-05-01
Drawing upon the notion of managerial discretion from upper echelons theory, we theorize which external contingencies moderate the relationship between collective organizational citizenship behavior (COCB) and unit performance. Focusing on business unit (BU) management teams, we hypothesize that COCB of BU management teams enhances BU performance and that this impact depends on environmental uncertainty and BU management-team decision latitude, two determinants of managerial discretion. In particular, the positive effect of COCB is stronger when environmental uncertainty or the BU management-team decision latitude is greater. Time-lagged data from 109 BUs of a telecommunications company support the hypotheses. Additional exploratory analysis shows that the positive moderating effect of environmental uncertainty is further amplified at higher levels of BU management-team decision latitude. Overall, this study extends the internally focused view in the micro OCB literature by introducing external contingencies for the COCB-unit-performance relationship. (c) 2014 APA, all rights reserved.
Resolving dust emission responses to land cover change using an ecological land classification
USDA-ARS?s Scientific Manuscript database
Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of ...
Grammar and the English National Curriculum
ERIC Educational Resources Information Center
Paterson, Laura Louise
2010-01-01
In 1998 the regulatory body for the National Curriculum, the Qualifications and Curriculum Authority, acknowledged that there was "widespread uncertainty" over the grammar requirements of the English Curriculum. In this paper I argue that the QCA still has not addressed this uncertainty. I analyse the 1999 and 2011 Primary English…
An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...
DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT
An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1 992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2015-09-01
Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single-level PI, wind power forecast uncertainties are represented as a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. Then the wind power scenarios are incorporated into a stochastic security-constrained unit commitment (SCUC) model. The heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporated with interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs, and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and, thus, decreases the risk in system operations of smart grids.
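The quantile-to-scenario step can be sketched as inverse-transform sampling through a piecewise-linear ECDF fitted to the PI-derived quantile points. This is our simplification of the procedure described above; the probability levels and megawatt values are illustrative:

```python
import numpy as np

def scenarios_from_quantiles(levels, quantiles, n_scenarios, seed=0):
    """Inverse-transform sampling: draw uniform probabilities and map
    them through a piecewise-linear inverse ECDF defined by the
    (probability level, quantile) points."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(levels[0], levels[-1], n_scenarios)
    return np.interp(u, levels, quantiles)

levels = [0.05, 0.25, 0.50, 0.75, 0.95]   # levels implied by nested PIs
quants = [10.0, 18.0, 25.0, 33.0, 42.0]   # wind power quantiles (MW), one hour
scen = scenarios_from_quantiles(levels, quants, 1000)
```

Each generated scenario is a plausible wind-power realization for that hour; a set of such scenarios is what feeds the stochastic SCUC model.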
Disseminating the unit of mass from multiple primary realisations
NASA Astrophysics Data System (ADS)
Nielsen, Lars
2016-12-01
When a new definition of the kilogram is adopted, as expected, in 2018, the unit of mass will be realised by the watt balance method, the x-ray crystal density method, or perhaps other primary methods still to be developed. So far, the standard uncertainties associated with the available primary methods are at least one order of magnitude larger than the standard uncertainty associated with mass comparisons using mass comparators, so differences in primary realisations of the kilogram are easily detected, whereas many National Metrology Institutes would have to increase their calibration and measurement capabilities (CMCs) if they were traceable to a single primary realisation. This paper presents a scheme for obtaining traceability to multiple primary realisations of the kilogram using a small group of stainless steel 1 kg weights, which are allowed to change their masses over time in a way known to be realistic, and which are calibrated and stored in air. An analysis of the scheme shows that if the relative standard uncertainties of future primary realisations are equal to the relative standard uncertainties of the present methods used to measure the Planck constant, the unit of mass can be disseminated with a standard uncertainty less than 0.015 mg, which matches the smallest CMCs currently claimed for the calibration of 1 kg weights.
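In the simplest reading, traceability to multiple primary realisations amounts to an inverse-variance weighted mean, whose combined standard uncertainty is smaller than that of any single realisation. A sketch with hypothetical deviations-from-1-kg in mg (not the paper's data or its full time-dependent scheme):

```python
import numpy as np

def combine_realisations(values_mg, u_mg):
    """Inverse-variance weighted mean of several primary realisations,
    with the standard uncertainty of the combined value."""
    w = 1.0 / np.asarray(u_mg, float) ** 2
    mean = np.sum(w * np.asarray(values_mg, float)) / np.sum(w)
    u_comb = 1.0 / np.sqrt(np.sum(w))
    return mean, u_comb

# Three hypothetical realisations: deviation from 1 kg (mg), std. unc. (mg)
m, u = combine_realisations([0.010, -0.020, 0.005], [0.020, 0.030, 0.025])
```

This assumes uncorrelated realisations; the paper's analysis additionally models realistic mass drift of the steel weights between calibrations.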
How to Assess Vulnerabilities of Water Policies to Global Change?
NASA Astrophysics Data System (ADS)
Kumar, A.; Haasnoot, M.; Weijs, S.
2017-12-01
Water managers are confronted with uncertainties arising from hydrological, societal, economic and political drivers. To manage these uncertainties, two paradigms have been identified: top-down and bottom-up approaches. Top-down or prediction-based approaches use socio-economic scenarios together with a discrete set of GCM projections (often downscaled) to assess the expected impact of drivers and policies on the water resource system through various hydrological and social systems models. Adaptation strategies to alleviate these impacts are then identified and tested against the scenarios. To address GCM and downscaling uncertainties, these approaches put more focus on climate predictions than on the decision problem itself. Triggered by the wish to have a more scenario-neutral approach and to address downscaling uncertainties, recent analyses have shifted towards vulnerability-based (bottom-up or decision-centric) approaches. They begin at the local scale by addressing socio-economic responses to climate, often involving stakeholders' input; they identify vulnerabilities under a larger sample of plausible futures and evaluate the sensitivity and robustness of possible adaptation options. Several bottom-up approaches have emerged so far and are increasingly recommended. Fundamentally they share several core ideas; however, subtle differences exist in vulnerability assessment, visualization tools for exploring vulnerabilities, and computational methods used for identifying robust water policies. Through this study, we try to identify how these approaches are progressing, how the climate and non-climate uncertainties are being confronted, and how to integrate existing and new tools. We find that the choice of a method may depend on the number of vulnerability drivers identified and the type of threshold levels (environmental conditions or policy objectives) defined. Certain approaches are well suited for assessing adaptive capacities, tipping points and the sequencing of decisions.
However, visualization of the vulnerability domain is still challenging if multiple drivers are present. New emerging tools are focused on generating synthetic scenarios, addressing multiple objectives, linking decision-making frameworks to adaptation pathways and communicating risks to the stakeholders.
Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna
Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender's beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker's payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
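A probability box of the kind described can be sketched in a few lines; the Gaussian payoff model and the interval on its mean below are invented for illustration, not taken from the study:

```python
from math import erf, sqrt

# Hypothetical sketch of a probability box (p-box): the attacker's payoff is
# modeled as Gaussian (aleatory spread), but its mean is only known to lie in
# an interval (epistemic spread). The p-box is the envelope of candidate CDFs.

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def payoff_pbox(x, mu_lo=4.0, mu_hi=6.0, sigma=1.0):
    """Return (lower, upper) CDF bounds at x for a mean in [mu_lo, mu_hi]."""
    lower = normal_cdf(x, mu_hi, sigma)  # the largest mean gives the lowest CDF
    upper = normal_cdf(x, mu_lo, sigma)  # the smallest mean gives the highest CDF
    return lower, upper

lo, hi = payoff_pbox(5.0)
print(lo, hi)
```

At any payoff level the true (unknown) CDF lies between the two bounds; the gap between them is the epistemic component that a single distribution would hide.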
Toward Scientific Numerical Modeling
NASA Technical Reports Server (NTRS)
Kleb, Bil
2007-01-01
Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
Characteristics of Organizational Environments and Perceived Environmental Uncertainty
ERIC Educational Resources Information Center
Duncan, Robert B.
1972-01-01
Twenty-two decision groups in three manufacturing and three research and development organizations are studied to identify the characteristics of the environment that contribute to decision unit members experiencing uncertainty in decisionmaking. (Author)
NASA Astrophysics Data System (ADS)
Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus
2017-04-01
Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable across conditions and spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. Parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessments. For the overall uncertainty quantification we assessed the model prediction probability using sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem.
Furthermore, we were able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the comprehensibility of modelling results. We applied the methodology to a regional inventory to assess the overall modelling uncertainty of a regional N2O and NO emissions inventory for the state of Saxony, Germany.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
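The two quantities the essay connects, Bayesian post-test probability and entropy as a measure of residual diagnostic uncertainty, fit in a few lines; the sensitivity, specificity and pre-test probability below are illustrative values, not figures from the paper:

```python
from math import log2

def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' rule for a dichotomous test result."""
    if positive:
        num = sensitivity * pretest
        den = num + (1.0 - specificity) * (1.0 - pretest)
    else:
        num = (1.0 - sensitivity) * pretest
        den = num + specificity * (1.0 - pretest)
    return num / den

def diagnostic_entropy(p):
    """Binary Shannon entropy in bits: residual diagnostic uncertainty."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

pre = 0.5
post = post_test_probability(pre, sensitivity=0.90, specificity=0.80)
print(round(post, 3))                     # 0.818
print(round(diagnostic_entropy(pre), 3))  # 1.0 bit of uncertainty before the test
print(round(diagnostic_entropy(post), 3)) # less residual uncertainty afterwards
```

Note that an unexpected result (e.g. a positive test at a very low pre-test probability) can move the probability toward 0.5 and so *increase* the entropy, which is exactly the counterintuitive situation the information-theoretic framing makes explicit.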
A Comparative Study of Uncertainty Reduction Theory in High- and Low-Context Cultures.
ERIC Educational Resources Information Center
Kim, Myoung-Hye; Yoon, Tae-Jin
To test the cross-cultural validity of uncertainty reduction theory, a study was conducted using students from South Korea and the United States who were chosen to represent high- and low-context cultures respectively. Uncertainty reduction theory is based upon the assumption that the primary concern of strangers upon meeting is one of uncertainty…
Wu, Y.; Liu, S.
2012-01-01
Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration and sensitivity and uncertainty analysis capabilities, through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., for inverse modeling and statistical graphics).
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.
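The inverse-modelling (calibration) step that such a framework automates can be illustrated with a deliberately tiny stand-in model; plain Python replaces the R/Fortran coupling described in the paper, and the "model", forcing data and observations below are invented:

```python
# A minimal sketch of model calibration by minimizing an objective function,
# the core of the inverse-modelling step (FME offers far richer algorithms).

def toy_model(param, forcing):
    """Stand-in rainfall-runoff model: streamflow proportional to forcing."""
    return [param * f for f in forcing]

def sse(param, forcing, observed):
    """Sum of squared errors between simulated and observed streamflow."""
    sim = toy_model(param, forcing)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

forcing = [1.0, 2.0, 3.0]
observed = [0.9, 2.1, 2.9]   # synthetic "observations"

# Brute-force search over a parameter grid from 0.01 to 2.00.
best_err, best_param = min(
    (sse(p / 100.0, forcing, observed), p / 100.0) for p in range(1, 201)
)
print(best_param)   # 0.99
```

A real calibration would replace `toy_model` with a call into the wrapped simulator and the grid search with an optimizer, but the structure, parameters in, misfit out, minimize, is the same.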
Nonlinear adaptive control of grid-connected three-phase inverters for renewable energy applications
NASA Astrophysics Data System (ADS)
Mahdian-Dehkordi, N.; Namvar, M.; Karimi, H.; Piya, P.; Karimi-Ghartemani, M.
2017-01-01
Distributed generation (DG) units are often interfaced to the main grid using power electronic converters, including voltage-source converters (VSCs). A VSC offers dc/ac power conversion, high controllability, and fast dynamic response. Because of the nonlinearities, uncertainties, and changes in system parameters involved in the nature of a grid-connected renewable DG system, conventional linear control methods cannot completely and efficiently address all control objectives. In this paper, a nonlinear adaptive control scheme based on an adaptive backstepping strategy is presented to control the operation of a grid-connected renewable DG unit. Compared to the popular vector control technique, the proposed controller offers smoother transient responses and a lower level of current distortion. The Lyapunov approach is used to establish global asymptotic stability of the proposed control system. A linearisation technique is employed to develop guidelines for parameter tuning of the controller. Extensive time-domain digital simulations are performed and presented to verify the performance of the proposed controller when employed in a VSC to control the operation of a two-stage DG unit, and also that of a single-stage solar photovoltaic system. Desirable and superior performance of the proposed controller is observed.
NASA Astrophysics Data System (ADS)
Kearsey, Tim; Williams, John; Finlayson, Andrew; Williamson, Paul; Dobbs, Marcus; Kingdon, Andrew; Campbell, Diarmad
2014-05-01
Geological maps and 3D models usually depict lithostratigraphic units, which can comprise many different types of sediment (lithologies). The lithostratigraphic units shown on maps and 3D models of glacial and post-glacial deposits in Glasgow are substantially defined by the method of formation and age of the unit rather than its lithological composition. Therefore, a simple assumption that the dominant lithology is the most common constituent of any stratigraphic unit is erroneous and is only 58% predictive of the actual sediment types seen in a borehole. This is problematic for non-geologists such as planners, regulators and engineers attempting to use these models to inform their decisions, and can lead such users to view maps and models as of limited use in such decision making. We explore the extent to which stochastic modelling can help make geological models more predictive of lithology in heterolithic units. Stochastic modelling techniques are commonly used to model facies variations in oil field models. The techniques have been applied to an area containing >4000 coded boreholes to investigate the glacial and fluvial deposits in the centre of the city of Glasgow. We test the predictions from this method by deleting percentages of the control data and re-running the simulations to determine how predictability varies with data density. We also explore the best way of displaying such stochastic models and suggest that displaying the data as probability maps, rather than as a single definitive answer, better illustrates the uncertainties inherent in the input data. Finally, we address whether it is truly possible to predict lithology in such geological facies. The innovative Accessing Subsurface Knowledge (ASK) network was recently established in the Glasgow area by the British Geological Survey and Glasgow City Council to deliver and exchange subsurface data and knowledge.
This provides an ideal opportunity to communicate and test a range of models and to assess their usefulness and impact on a vibrant community of public and private sector partners and decision makers.
Introducing Risk Analysis and Calculation of Profitability under Uncertainty in Engineering Design
ERIC Educational Resources Information Center
Kosmopoulou, Georgia; Freeman, Margaret; Papavassiliou, Dimitrios V.
2011-01-01
A major challenge that chemical engineering graduates face at the modern workplace is the management and operation of plants under conditions of uncertainty. Developments in the fields of industrial organization and microeconomics offer tools to address this challenge with rather well developed concepts, such as decision theory and financial risk…
The National Center for Environmental Assessment (NCEA) has conducted and supported research addressing uncertainties in 2-stage clonal growth models for cancer as applied to formaldehyde. In this report, we summarized publications resulting from this research effort, discussed t...
Balancing Certainty and Uncertainty in Clinical Practice
ERIC Educational Resources Information Center
Kamhi, Alan G.
2011-01-01
Purpose: In this epilogue, I respond to each of the five commentaries, discussing in some depth a central issue raised in each commentary. In the final section, I discuss how my thinking about certainty and uncertainty in clinical practice has evolved since I wrote the initial article. Method: Topics addressed include the similarities/differences…
Biological Impact of the Chippewa Off-Reservation Treaty Harvest, 1983-1989.
ERIC Educational Resources Information Center
Busiahn, Thomas R.
1991-01-01
In Wisconsin, Chippewa tribal harvests have not had a negative impact on populations of lake trout, walleye, fishers, and white-tailed deer. The treaty rights controversy is fueled by uncertainties about the status of natural resources, uncertainties that could be addressed by cooperative state-tribal wildlife management programs. (SV)
Addressing uncertainty in atomistic machine learning.
Peterson, Andrew A; Christensen, Rune; Khorshidi, Alireza
2017-05-10
Machine-learning regression has been demonstrated to precisely emulate the potential energy and forces that are output from more expensive electronic-structure calculations. However, to predict new regions of the potential energy surface, an assessment must be made of the credibility of the predictions. In this perspective, we address the types of errors that might arise in atomistic machine learning, the unique aspects of atomistic simulations that make machine-learning challenging, and highlight how uncertainty analysis can be used to assess the validity of machine-learning predictions. We suggest this will allow researchers to more fully use machine learning for the routine acceleration of large, high-accuracy, or extended-time simulations. In our demonstrations, we use a bootstrap ensemble of neural network-based calculators, and show that the width of the ensemble can provide an estimate of the uncertainty when the width is comparable to that in the training data. Intriguingly, we also show that the uncertainty can be localized to specific atoms in the simulation, which may offer hints for the generation of training data to strategically improve the machine-learned representation.
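A minimal version of the bootstrap-ensemble idea can be shown with toy linear fits standing in for the paper's neural-network calculators; the training data below are invented, and the point is only that the ensemble spread widens away from the training data:

```python
import random

def fit_line(pts):
    """Ordinary least-squares fit y = b*x + a."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    b = sxy / sxx
    return b, my - b * mx

random.seed(0)
xs = [i * 0.25 for i in range(9)]                    # training inputs 0.0 .. 2.0
train = [(x, 2.0 * x + random.gauss(0.0, 0.1)) for x in xs]

models = []
for _ in range(100):
    boot = [random.choice(train) for _ in train]     # resample with replacement
    if len({x for x, _ in boot}) > 1:                # skip degenerate resamples
        models.append(fit_line(boot))

def ensemble_predict(x):
    """Ensemble mean and spread (standard deviation) at x."""
    preds = [b * x + a for b, a in models]
    mean = sum(preds) / len(preds)
    std = (sum((p - mean) ** 2 for p in preds) / len(preds)) ** 0.5
    return mean, std

m_in, s_in = ensemble_predict(1.0)    # inside the training range: narrow spread
m_out, s_out = ensemble_predict(5.0)  # extrapolation: wider spread
print(s_in, s_out)
```

As in the paper's argument, the spread is only a trustworthy uncertainty estimate where it is comparable to the noise in the training data; a confidently wrong ensemble stays narrow.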
Inexact Socio-Dynamic Modeling of Groundwater Contamination Management
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Zhang, X.
2015-12-01
Groundwater contamination may alter public behaviors, such as adaptation to a contamination event. Conversely, social behaviors may affect groundwater contamination and the associated risk levels, for example through changes in the amount of groundwater ingested because of the contamination. Decisions should consider not only the contamination itself, but also social attitudes towards such contamination events. Such decisions are inherently associated with uncertainty, for example in the subjective judgement of decision makers and their implicit knowledge when choosing whether to supply water, or to reduce the amount of supplied water, under a contamination scenario. A socio-dynamic model based on the theories of information gap and fuzzy sets is being developed to address the social behaviors surrounding groundwater contamination; it is applied to a synthetic problem designed after typical groundwater remediation sites, where the effects of social behaviors on decisions are investigated and analyzed. Different uncertainties, including deep uncertainty and vague/ambiguous uncertainty, are effectively and integrally addressed. The results can provide scientifically defensible decision support for groundwater management in the face of contamination.
Exploration of quantum-memory-assisted entropic uncertainty relations in a noninertial frame
NASA Astrophysics Data System (ADS)
Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Shi, Jia-Dong; Ye, Liu
2017-05-01
The uncertainty principle bounds the accuracy of simultaneous measurement outcomes for two incompatible observables. In this letter, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) when the particle to be measured resides in an open system and another particle is treated as quantum memory in a noninertial frame. In such a scenario, the collective influence of unital and nonunital noise environments, and of the relativistic motion of the system, on the QMA-EUR is examined. By numerical analysis, we conclude that, firstly, the noises and the Unruh effect can both increase the uncertainty, owing to the decoherence of the bipartite system induced by the noise or the Unruh effect; secondly, the uncertainty is more affected by the noises than by the Unruh effect of the acceleration; thirdly, unital noises can reduce the uncertainty in the long-time regime. We give a possible physical interpretation of these results: the information of interest is redistributed among the bipartite system, the noisy environment and the physically inaccessible region in the noninertial frame. We therefore claim that our observations provide an insight into the dynamics of the entropic uncertainty in a noninertial frame and might be important for quantum precision measurement under relativistic motion.
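For reference, the QMA-EUR investigated here is usually written in the form due to Berta et al., where B is the quantum memory, X and Z are the two incompatible observables measured on particle A, and c quantifies their complementarity:

```latex
S(X|B) + S(Z|B) \;\geq\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{x,z}\,\bigl|\langle \psi_x \vert \phi_z \rangle\bigr|^2
```

A negative conditional entropy S(A|B), signalling entanglement between A and the memory, tightens the bound; decoherence from noise or the Unruh effect pushes S(A|B) upward, which is consistent with the increased uncertainty reported above.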
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans
2015-04-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. This Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.
2015-01-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. As input the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
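The headline result, a spread of roughly a factor 2 to 5 between damage estimates, can be illustrated with a toy Monte Carlo of the same shape; the five-function depth-damage "library" below is invented and merely stands in for the 272-function library used in the paper:

```python
import random

random.seed(1)

# Hypothetical depth-damage functions: damage fraction grows with depth (m)
# at a rate that differs between "models" in the library.
damage_functions = [
    lambda depth, a=a: min(1.0, a * depth)
    for a in (0.10, 0.15, 0.25, 0.40, 0.55)
]

def mc_damage_range(depth, n=10_000):
    """Spread of damage estimates at a given water depth: each Monte Carlo
    draw picks a damage function at random and evaluates it."""
    draws = [random.choice(damage_functions)(depth) for _ in range(n)]
    return min(draws), max(draws)

low, high = mc_damage_range(0.5)
print(high / low)   # ratio between the extreme estimates at 0.5 m depth
```

The ratio shrinks at larger depths as the steeper functions saturate at total loss, mirroring the paper's finding that the uncertainty is largest for small water depths.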
Robust Unit Commitment Considering Uncertain Demand Response
Liu, Guodong; Tomsovic, Kevin
2014-09-28
Although price-responsive demand response has been widely accepted as playing an important role in the reliable and economic operation of power systems, the real response from the demand side can be highly uncertain due to limited understanding of consumers' response to pricing signals. To model the behavior of consumers, the price elasticity of demand has been explored and utilized in both research and real practice. However, the price elasticity of demand is not precisely known and may vary greatly with operating conditions and types of customers. To accommodate the uncertainty of demand response, alternative unit commitment methods robust to the uncertainty of the demand response require investigation. In this paper, a robust unit commitment model to minimize the generalized social cost is proposed for the optimal unit commitment decision, taking into account uncertainty in the price elasticity of demand. By optimizing the worst case under a proper robustness level, the unit commitment solution of the proposed model is robust against all possible realizations of the modeled uncertain demand response. Numerical simulations on the IEEE Reliability Test System show the effectiveness of the method. Finally, compared to unit commitment with deterministic price elasticity of demand, the proposed robust model can reduce the average Locational Marginal Prices (LMPs) as well as the price volatility.
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place, and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements, and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty, and reporting the information.
Wong, Alfred Ka-Shing; Ong, Shu Fen; Matchar, David Bruce; Lie, Desiree; Ng, Reuben; Yoon, Kirsten Eom; Wong, Chek Hooi
2017-10-01
Studies are needed to inform the preparation of community nurses to address patient behavioral and social factors contributing to unnecessary readmissions to hospital. This study uses nurses' input to understand challenges faced during home care and to derive a framework to address the challenges. Semistructured interviews were conducted to saturation with 16 community nurses in Singapore. Interviews were transcribed verbatim and transcripts independently coded for emergent themes. Themes were interpreted using grounded theory. Seven major themes emerged from the 16 interviews: strained social relationships, complex care decision-making processes within families, communication barriers, patient or caregiver neglect of health issues, building and maintaining trust, the trial-and-error nature of the work, and dealing with uncertainty. Community nurses identified uncertainty arising from complexities in social-relational, personal, and organizational factors as a central challenge. Nursing education should focus on navigating and managing uncertainty at the personal, patient, and family levels.
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Kaiser, Maria; Špačková, Olga; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2018-05-01
Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.
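The flavor of such robust planning under climate uncertainty can be sketched as a worst-case comparison of protection options; all costs, probabilities and climate multipliers below are invented for illustration, and the paper's actual framework is a sequential Bayesian analysis, not this one-shot comparison:

```python
# Hypothetical options: (annualised cost, residual annual failure probability).
options = {
    "status quo": (0.0, 0.020),
    "raise dike": (5.0, 0.005),
}
damage_if_failure = 400.0              # consequence, in the same cost units
climate_multipliers = [0.5, 1.0, 2.0]  # plausible shifts in flood frequency

def worst_case_cost(cost, p_fail):
    """Protection cost plus expected damage under the least favourable
    climate scenario."""
    return cost + max(m * p_fail for m in climate_multipliers) * damage_if_failure

best = min(options, key=lambda name: worst_case_cost(*options[name]))
print(best)   # the more conservative option wins under the worst case
```

Here the safety margin built into "raise dike" pays off once the worst climate scenario is priced in, echoing the study's finding that the recommendation is robust for the most conservative approach despite large uncertainties.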
Sammarco, Angela; Konecny, Lynda M
2010-01-01
To examine the differences between Latina and Caucasian breast cancer survivors in perceived social support, uncertainty, and quality of life (QOL), and the differences between the cohorts in selected demographic variables. Descriptive, comparative study. Selected private hospitals and American Cancer Society units in a metropolitan area of the northeastern United States. 182 Caucasian and 98 Latina breast cancer survivors. Participants completed a personal data sheet, the Social Support Questionnaire, the Mishel Uncertainty in Illness Scale-Community Form, and the Ferrans and Powers QOL Index-Cancer Version III at home and returned the questionnaires to the investigators via postage-paid envelope. Perceived social support, uncertainty, and QOL. Caucasians reported significantly higher levels of total perceived social support and QOL than Latinas. Psychiatric illness comorbidity and lower level of education in Latinas were factors in the disparity of QOL. Nurses should be mindful of the essential association of perceived social support, uncertainty, and QOL in Latina breast cancer survivors and how Latinas differ from Caucasian breast cancer survivors. Factors such as cultural values, comorbidities, and education level likely influence perceived social support, uncertainty, and QOL.
Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max
2015-10-01
Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, K T
2007-01-30
As reflected in the 2005 USEPA Guidelines for Cancer Risk Assessment, some chemical carcinogens may have a site-specific mode of action (MOA) that is dual, involving mutation in addition to cell-killing-induced hyperplasia. Although genotoxicity may contribute to increased risk at all doses, the Guidelines imply that for dual-MOA (DMOA) carcinogens, judgment be used to compare and assess results obtained using separate "linear" (genotoxic) vs. "nonlinear" (nongenotoxic) approaches to low-level risk extrapolation. However, the Guidelines allow the latter approach to be used only when evidence is sufficient to parameterize a biologically based model that reliably extrapolates risk to low levels of concern. The Guidelines thus effectively prevent MOA uncertainty from being characterized and addressed when data are insufficient to parameterize such a model, but otherwise clearly support a DMOA. A bounding-factor approach, similar to that used in reference dose procedures for classic toxicity endpoints, can address MOA uncertainty in a way that avoids explicit modeling of low-dose risk as a function of administered or internal dose. Even when a "nonlinear" toxicokinetic model cannot be fully validated, the implications of DMOA uncertainty for low-dose risk may be bounded with reasonable confidence when the target tumor types happen to be extremely rare. This concept was illustrated for the rodent carcinogen naphthalene. Bioassay data, supplemental toxicokinetic data, and related physiologically based pharmacokinetic and 2-stage stochastic carcinogenesis modeling results all clearly indicate that naphthalene is a DMOA carcinogen. Plausibility bounds on rat-tumor-type-specific DMOA-related uncertainty were obtained using a 2-stage model adapted to reflect the empirical link between the genotoxic and cytotoxic effects of the most potent identified genotoxic naphthalene metabolites, 1,2- and 1,4-naphthoquinone.
The resulting bounds each provided the basis for a corresponding "uncertainty" factor <1 appropriate to apply to estimates of naphthalene risk obtained by linear extrapolation under a default genotoxic MOA assumption. This procedure is proposed as a scientifically credible method to address MOA uncertainty for DMOA carcinogens.
Issues and recent advances in optimal experimental design for site investigation (Invited)
NASA Astrophysics Data System (ADS)
Nowak, W.
2013-12-01
This presentation provides an overview of issues and recent advances in model-based experimental design for site exploration. The addressed issues and advances are (1) how to provide an adequate envelope for prior uncertainty, (2) how to define the information needs in a task-oriented manner, (3) how to measure the expected impact of a data set that is not yet available but only planned to be collected, and (4) how to best perform the optimization of the data collection plan. Among other shortcomings of the state of the art, it is identified that there is a lack of demonstrator studies in which exploration schemes based on expert judgment are compared to exploration schemes obtained by optimal experimental design. Such studies will be necessary to address the often voiced concern that experimental design is an academic exercise with little improvement potential over the well-trained gut feeling of field experts. When addressing this concern, a specific focus has to be given to uncertainty in model structure, parameterizations and parameter values, and to related surprises that data often bring about in field studies, but never in synthetic-data based studies. The background of this concern is that, initially, conceptual uncertainty may be so large that surprises are the rule rather than the exception. In such situations, field experts have a large body of experience in handling the surprises, and expert judgment may be good enough compared to meticulous optimization based on a model that is about to be falsified by the incoming data. In order to meet surprises and adapt to them, there needs to be a sufficient representation of conceptual uncertainty within the models used. Also, it is useless to optimize an entire design under this initial range of uncertainty. Thus, the goal setting of the optimization should include the objective to reduce conceptual uncertainty.
A possible way out is to upgrade experimental design theory towards real-time interaction with the ongoing site investigation, such that surprises in the data are immediately accounted for to restrict the conceptual uncertainty and update the optimization of the plan.
Commentary: ambiguity and uncertainty: neglected elements of medical education curricula?
Luther, Vera P; Crandall, Sonia J
2011-07-01
Despite significant advances in scientific knowledge and technology, ambiguity and uncertainty are still intrinsic aspects of contemporary medicine. To practice confidently and competently, a physician must learn rational approaches to complex and ambiguous clinical scenarios and must possess a certain degree of tolerance of ambiguity. In this commentary, the authors discuss the role that ambiguity and uncertainty play in medicine and emphasize why openly addressing these topics in the formal medical education curriculum is critical. They discuss key points from original research by Wayne and colleagues and their implications for medical education. Finally, the authors offer recommendations for increasing medical student tolerance of ambiguity and uncertainty, including dedicating time to attend candidly to ambiguity and uncertainty as a formal part of every medical school curriculum.
NASA Astrophysics Data System (ADS)
Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.
2018-01-01
This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly sampled time series.
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
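Of the computerized strategies listed above, string search is the simplest to sketch. The following is a minimal illustration only: the hedging lexicon and the `uncertainty_profile` helper are hypothetical stand-ins for the kind of derived, weighted vocabulary a real report-analysis system would use.

```python
import re
from collections import Counter

# Hypothetical lexicon of hedging phrases; a real system would derive
# and weight these terms from annotated report corpora.
UNCERTAINTY_TERMS = ["possible", "probable", "suspicious for",
                     "cannot exclude", "likely", "consistent with"]

def uncertainty_profile(report_text):
    """Count occurrences of hedging phrases (a crude string-search baseline)."""
    text = report_text.lower()
    return Counter({term: len(re.findall(re.escape(term), text))
                    for term in UNCERTAINTY_TERMS})

profile = uncertainty_profile(
    "Findings are consistent with pneumonia; cannot exclude early abscess. "
    "A small effusion is likely."
)
```

A profile like this could then be aggregated per author or per report type and correlated with outcomes data, as the abstract suggests, before moving to richer NLP or machine learning approaches.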
Uncertainty in weather and climate prediction
Slingo, Julia; Palmer, Tim
2011-01-01
Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation. PMID:22042896
Michael J. Papaik; Andrew Fall; Brian Sturtevant; Daniel Kneeshaw; Christian Messier; Marie-Josee Fortin; Neal Simon
2010-01-01
Forest management practices conducted primarily at the stand scale result in simplified forests with regeneration problems and low structural and biological diversity. Landscape models have been used to help design management strategies to address these problems. However, there remains a great deal of uncertainty that the actual management practices result in the...
Adaptive Missile Flight Control for Complex Aerodynamic Phenomena
2017-08-09
at high maneuvering conditions motivate guidance approaches that can accommodate uncertainty. Flight control algorithms are one component...performance, but system uncertainty is not directly addressed. Linear, parameter-varying [37,38] approaches for munitions expand on optimal control by...post-canard stall. We propose to model these complex aerodynamic mechanisms and use these models in formulating flight controllers within the
Lesson Unplanning: Toward Transforming Routine Tasks into Non-Routine Problems
ERIC Educational Resources Information Center
Beghetto, Ronald A.
2017-01-01
How might teachers transform routine tasks into non-routine ones? The purpose of this article is to address this question. The article opens with a discussion of why non-routine problems require creative and original thought. Specifically, I discuss how non-routine problems require students to confront uncertainty and how uncertainty can serve as…
Poonam Khanijo Ahluwalia; Nema, Arvind K
2011-07-01
Selection of optimum locations for new facilities, and decisions regarding capacities at the proposed facilities, are major concerns for municipal authorities and managers. The decision as to whether a single facility is preferred over multiple facilities of smaller capacity would vary with the priorities assigned to cost and associated risks, such as environmental risk, health risk, or risk perceived by society. Currently, management of waste streams such as computer waste is done using rudimentary practices and is flourishing as an unorganized sector, mainly as backyard workshops, in many cities of developing nations such as India. Uncertainty in the quantification of computer waste generation is another major concern, due to the informal setup of the present computer waste management scenario. Hence, there is a need to simultaneously address uncertainty in waste generation quantities while analyzing the tradeoffs between cost and associated risks. The present study aimed to address the above-mentioned issues in a multi-time-step, multi-objective decision-support model that can address the multiple objectives of cost, environmental risk, socially perceived risk and health risk, while selecting the optimum configuration of existing and proposed facilities (location and capacities).
Is There Hope? Is She There? How Families and Clinicians Experience Severe Acute Brain Injury.
Schutz, Rachael E C; Coats, Heather L; Engelberg, Ruth A; Curtis, J Randall; Creutzfeldt, Claire J
2017-02-01
Patients with severe acute brain injury (SABI) raise important palliative care considerations associated with sudden devastating injury and uncertain prognosis. The goal of this study was to explore how family members, nurses, and physicians experience the palliative and supportive care needs of patients with SABI receiving care in the neuroscience intensive care unit (neuro-ICU). Semistructured interviews were audiotaped, transcribed, and analyzed using thematic analysis. Thirty-bed neuro-ICU in a regional comprehensive stroke and level-one trauma center in the United States. We completed 47 interviews regarding 15 patients with family members (n = 16), nurses (n = 15), and physicians (n = 16). Two themes were identified: (1) hope and (2) personhood. (1) Families linked prognostic uncertainty to a need for hope and expressed a desire for physicians to acknowledge this relationship. The language of hope varied depending on the participant: clinicians used hope as an object that can be given or taken away, generally in the process of conveying prognosis, while families expressed hope as an action that supported coping with their loved one's acute illness and its prognostic uncertainty. (2) Participants described the loss of personhood through brain injury, the need to recognize and treat the brain-injured patient as a person, and the importance of relatedness and connection, including personal support of families by clinicians. Support for hope and preservation of personhood challenge care in the neuro-ICU as identified by families and clinicians of patients with SABI. Specific practical approaches can address these challenges and improve the palliative care provided to patients and families in the neuro-ICU.
NASA Astrophysics Data System (ADS)
Salvi, Kaustubh; Villarini, Gabriele; Vecchi, Gabriel A.
2017-10-01
Unprecedented alterations in precipitation characteristics over the last century, and especially in the last two decades, have posed serious socio-economic problems in terms of hydro-meteorological extremes, in particular flooding and droughts. The origin of these alterations lies in changing climatic conditions; however, their threatening implications can only be dealt with through meticulous planning based on realistic and skillful decadal precipitation predictions (DPPs). Skillful DPPs represent a very challenging prospect because of the complexities associated with precipitation prediction. Because of their limited skill and coarse spatial resolution, the DPPs provided by General Circulation Models (GCMs) are not directly applicable for impact assessment. Here, we focus on nine GCMs and quantify the seasonally and regionally averaged skill in DPPs over the continental United States. We address the problems pertaining to the limited skill and resolution by applying linear and kernel regression-based statistical downscaling approaches. For both approaches, statistical relationships established over the calibration period (1961-1990) are applied to the retrospective and near-future decadal predictions by GCMs to obtain DPPs at ∼4 km resolution. The skill is quantified across different metrics that evaluate potential skill, biases, long-term statistical properties, and uncertainty. Both statistical approaches show improvements with respect to the raw GCM data, particularly in terms of the long-term statistical properties and uncertainty, irrespective of lead time. The outcome of the study is monthly DPPs from nine GCMs at 4-km spatial resolution, which can be used as a key input for impact assessments.
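The linear-regression flavor of statistical downscaling described above can be sketched for a single grid cell with synthetic data. All numbers and distributions here are illustrative assumptions, not the study's data; the real application works cell-by-cell at roughly 4 km resolution, typically with seasonal stratification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data: 30 years of monthly precipitation (mm/month)
# from a coarse GCM cell, and a fine-scale "observed" series related to it.
gcm_cal = rng.gamma(2.0, 40.0, 360)
obs_cal = 0.8 * gcm_cal + rng.normal(0.0, 15.0, 360)

# Calibrate a simple linear transfer function by least squares ...
slope, intercept = np.polyfit(gcm_cal, obs_cal, 1)

# ... and apply it to an out-of-sample decadal prediction (120 months).
gcm_future = rng.gamma(2.0, 42.0, 120)
downscaled = slope * gcm_future + intercept
```

The kernel-regression variant mentioned in the abstract would replace the global least-squares fit with a locally weighted estimate, at the cost of more computation per cell.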
Characterizing Drought Events from a Hydrological Model Ensemble
NASA Astrophysics Data System (ADS)
Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia
2017-04-01
Hydrological droughts are a slow onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale, and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital in London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK. 
The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event detection and characterization. This ensemble approach allows for uncertainty estimates and confidence intervals to be explored in simulations of drought event characteristics, such as duration and severity, which would not otherwise be available from a deterministic approach. The acquired understanding of uncertainty in drought events may then be applied to historic drought reconstructions, supplying evidence which could prove vital in decision making scenarios.
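The Latin Hypercube sampling used to generate the parameter ensemble described above can be sketched as follows. The parameter count and bounds are illustrative placeholders, not the GR4J/GR5J/GR6J calibration ranges used in the study.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin Hypercube sample: exactly one point per equal-probability
    stratum along each dimension, with strata randomly paired across dims."""
    rng = np.random.default_rng(rng)
    n_dims = len(bounds)
    # For each dimension, a random permutation of stratum indices 0..n-1,
    # jittered uniformly within each stratum, then rescaled to [0, 1).
    strata = rng.permuted(np.tile(np.arange(n_samples), (n_dims, 1)), axis=1).T
    u = (strata + rng.random((n_samples, n_dims))) / n_samples
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    return lo + u * (hi - lo)

# e.g. a four-parameter lumped model with hypothetical bounds
params = latin_hypercube(500, [(1, 2000), (-10, 10), (1, 500), (0.5, 10)])
```

Compared with plain random sampling, the stratification guarantees coverage of each parameter's full range even with a modest ensemble, which is why it is a common choice for large calibration experiments like the 500,000-member one described.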
NASA Astrophysics Data System (ADS)
Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.
2018-03-01
A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random variables in the finite element model to explore the effects of uncertainty on the quality of the model outputs, i.e. the natural frequencies. The accuracy of the output predictions from the model is compared with experimental results. To this end, non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that geometrical uncertainties have more influence on the natural frequencies than material parameters, even though the material uncertainties are about two times higher than the geometrical uncertainties. This gives valuable insights for improving the finite element model, given the various parameter ranges required in a modeling process involving uncertainty.
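The uncertainty-propagation idea can be illustrated with a one-formula stand-in for the paper's 3-D finite element model: the first bending frequency of a cantilever beam with randomly perturbed geometry and material parameters. All means and scatter below are assumed values for illustration, not the paper's measured distributions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Illustrative steel cantilever; parameter means and scatter are assumptions.
E   = rng.normal(210e9, 2e9, n)     # Young's modulus, Pa   (material)
rho = rng.normal(7850, 50, n)       # density, kg/m^3       (material)
L   = rng.normal(0.50, 0.002, n)    # length, m             (geometry)
b   = rng.normal(0.02, 2e-4, n)     # width, m              (geometry)
h   = rng.normal(0.005, 1e-4, n)    # thickness, m          (geometry)

I = b * h**3 / 12.0                 # second moment of area
A = b * h                           # cross-sectional area
# First natural frequency of an Euler-Bernoulli cantilever (beta*L = 1.875)
f1 = (1.875**2 / (2.0 * np.pi)) * np.sqrt(E * I / (rho * A * L**4))
```

The spread of `f1` quantifies how input scatter maps into frequency uncertainty; the relative influence of geometry versus material, as discussed in the abstract, can be separated by perturbing one parameter group at a time while holding the other at its mean.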
Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark
2011-01-01
Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Techniques for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
Uncertainty Assessment: What Good Does it Do? (Invited)
NASA Astrophysics Data System (ADS)
Oreskes, N.; Lewandowsky, S.
2013-12-01
The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates of colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left in place. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action.
We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Bass Monitoring Committee shall identify and review the relevant sources of management uncertainty to... sources of management uncertainty that were considered, technical approaches to mitigating these sources..., DEPARTMENT OF COMMERCE FISHERIES OF THE NORTHEASTERN UNITED STATES Management Measures for the Black Sea Bass...
Code of Federal Regulations, 2011 CFR
2011-10-01
... management uncertainty to recommend ACTs for the commercial and recreational fishing sectors as part of the... sources of management uncertainty that were considered, technical approaches to mitigating these sources..., DEPARTMENT OF COMMERCE FISHERIES OF THE NORTHEASTERN UNITED STATES Management Measures for the Scup Fishery...
NASA Astrophysics Data System (ADS)
Melious, J. O.
2012-12-01
In the northwestern corner of Washington state, a large landslide on Sumas Mountain deposits more than 100,000 cubic yards of soil containing asbestos fibers and heavy metals into Swift Creek every year. Engineers predict that asbestos-laden soils will slide into Swift Creek for at least the next 400 years. Swift Creek joins the Sumas River, which crosses the border into Canada, serving as an international delivery system for asbestos-laden soils. When the rivers flood, as happens regularly, they deliver asbestos into fields, yards, and basements. The tools available to address the Swift Creek situation are at odds with the scope and nature of the problem. Asbestos regulation primarily addresses occupational settings, where exposures can be estimated. Hazardous waste regulation primarily addresses liability for abandoned waste products from human activities. Health and environmental issues relating to naturally occurring asbestos (NOA) are fundamentally different from either regulatory scheme. Liability is not a logical lever for a naturally occurring substance, the existence of which is nobody's fault, and exposures to NOA in the environment do not necessarily resemble occupational exposures. The gaps and flaws in the legal regime compound the uncertainties in the science. Once it is assumed that no level of exposure is safe, legal requirements adopted in very different contexts foreclose the options for addressing the Swift Creek problem. This presentation will outline the applicable laws and how they intersect with issues of risk perception, uncertainty and politics in efforts to address the Swift Creek NOA site.
Venkatesh, Aranya; Jaramillo, Paulina; Griffin, W Michael; Matthews, H Scott
2011-10-01
Increasing concerns about greenhouse gas (GHG) emissions in the United States have spurred interest in alternate low carbon fuel sources, such as natural gas. Life cycle assessment (LCA) methods can be used to estimate potential emissions reductions through the use of such fuels. Some recent policies have used the results of LCAs to encourage the use of low carbon fuels to meet future energy demands in the U.S., without, however, acknowledging and addressing the uncertainty and variability prevalent in LCA. Natural gas is a particularly interesting fuel since it can be used to meet various energy demands, for example, as a transportation fuel or in power generation. Estimating the magnitudes and likelihoods of achieving emissions reductions from competing end-uses of natural gas using LCA offers one way to examine optimal strategies of natural gas resource allocation, given that its availability is likely to be limited in the future. In this study, the uncertainty in life cycle GHG emissions of natural gas (domestic and imported) consumed in the U.S. was estimated using probabilistic modeling methods. Monte Carlo simulations are performed to obtain sample distributions representing life cycle GHG emissions from the use of 1 MJ of domestic natural gas and imported LNG. Life cycle GHG emissions per energy unit of average natural gas consumed in the U.S. were found to range between -8% and +9% of the mean value of 66 g CO(2)e/MJ. The probabilities of achieving emissions reductions by using natural gas for transportation and power generation, as a substitute for incumbent fuels such as gasoline, diesel, and coal were estimated.
The use of natural gas for power generation instead of coal was found to have the highest and most likely emissions reductions (almost a 100% probability of achieving reductions of 60 g CO(2)e/MJ of natural gas used), while there is a 10-35% probability of the emissions from natural gas being higher than the incumbent if it were used as a transportation fuel. This likelihood of an increase in GHG emissions is indicative of the potential failure of a climate policy targeting reductions in GHG emissions.
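The probabilistic comparison described above can be sketched with a simple Monte Carlo experiment. The distributions below are illustrative assumptions that only roughly echo the reported figures (a mean of 66 g CO2e/MJ for gas and a reduction near 60 g CO2e/MJ versus coal); they are not the study's fitted distributions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed life cycle emission intensities, g CO2e per MJ of fuel
natural_gas = rng.triangular(61.0, 66.0, 72.0, n)  # ~ -8%/+9% around 66
coal_power  = rng.normal(126.0, 5.0, n)            # hypothetical incumbent

# Distribution of the per-MJ emissions reduction from fuel switching,
# and the probability that switching reduces emissions at all
reduction = coal_power - natural_gas
p_reduction = float(np.mean(reduction > 0.0))
```

Comparing the full `reduction` distribution, rather than a single point estimate, is what lets the study report probabilities such as the 10-35% chance that gas exceeds the incumbent transportation fuel.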
Research, Practice, Uncertainty and Responsibility
ERIC Educational Resources Information Center
Skovsmose, Ole
2006-01-01
Three issues concerning the relationship between research and practice are addressed. (1) A certain "prototype mathematics classroom" seems to dominate the research field, which in many cases seems selective with respect to what practices to address. I suggest challenging the dominance of the discourse created around the prototype mathematics…
Episodic acidification of small streams in the northeastern united states: episodic response project
Wigington, P.J.; Baker, J.P.; DeWalle, David R.; Kretser, W.A.; Murdoch, Peter S.; Simonin, H.A.; Van Sickle, J.; Mcdowell, M.K.; Peck, D.V.; Barchet, W.R.
1996-01-01
The Episodic Response Project (ERP) was an interdisciplinary study designed to address uncertainties about the occurrence, nature, and biological effects of episodic acidification of streams in the northeastern United States. The ERP research consisted of intensive studies of the chemistry and biological effects of episodes in 13 streams draining forested watersheds in the three study regions: the Northern Appalachian region of Pennsylvania and the Catskill and Adirondack Mountains of New York. Wet deposition was measured in each of the three study regions. Using automated instruments and samplers, discharge and chemistry of each stream was monitored intensively from fall 1988 through spring 1990. Biological studies focused on brook trout and native forage fish. Experimental approaches included in situ bioassays, radio transmitter studies of fish movement, and fish population studies. This paper provides an overview of the ERP, describes the methodology used in hydrologic and water chemistry components of the study, and summarizes the characteristics of the study sites, including the climatic and deposition conditions during the ERP and the general chemical characteristics of the study streams.
Synthesis and Control of Flexible Systems with Component-Level Uncertainties
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Lim, Kyong B.
2009-01-01
An efficient and computationally robust method for synthesis of component dynamics is developed. The method defines the interface forces/moments as feasible vectors in transformed coordinates to ensure that connectivity requirements of the combined structure are met. The synthesized system is then defined in a transformed set of feasible coordinates. The simplicity of form is exploited to effectively deal with modeling parametric and non-parametric uncertainties at the substructure level. Uncertainty models of reasonable size and complexity are synthesized for the combined structure from those in the substructure models. In particular, we address frequency and damping uncertainties at the component level. The approach first considers the robustness of synthesized flexible systems. It is then extended to deal with non-synthesized dynamic models with component-level uncertainties by projecting uncertainties to the system level. A numerical example is given to demonstrate the feasibility of the proposed approach.
[Ethics, empiricism and uncertainty].
Porz, R; Zimmermann, H; Exadaktylos, A K
2011-01-01
Accidents can lead to difficult boundary situations. Such situations often take place in emergency units. The medical team thus often and inevitably faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. This value need not be grounded in evidence-based arguments, especially as some singular situations of individual tragedy cannot be grasped in terms of evidence-based medicine. © Georg Thieme Verlag KG Stuttgart · New York.
Nolan, Bernard T.; Malone, Robert W.; Doherty, John E.; Barbash, Jack E.; Ma, Liwang; Shaner, Dale L.
2015-01-01
CONCLUSIONS: Although the observed data were sparse, they substantially reduced prediction uncertainty in unsampled regions of pesticide breakthrough curves. Nitrate evidently functioned as a surrogate for soil hydraulic data in well-drained loam soils conducive to conservative transport of nitrogen. Pesticide properties and macropore parameters could benefit most from improved characterization to further reduce model misfit and prediction uncertainty.
Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest
Denys Yemshanov; Frank H. Koch; Yakov Ben-Haim; William D. Smith
2010-01-01
In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads...
NASA Astrophysics Data System (ADS)
Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.
2018-01-01
Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.
Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.
Hogg, Michael A; Adelman, Janice R; Blagg, Robert D
2010-02-01
The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.
Water resources in the twenty-first century; a study of the implications of climate uncertainty
Moss, Marshall E.; Lins, Harry F.
1989-01-01
The interactions of the water resources on and within the surface of the Earth with the atmosphere that surrounds it are exceedingly complex. Increased uncertainty can be attached to the availability of water of usable quality in the 21st century, therefore, because of potential anthropogenic changes in the global climate system. For the U.S. Geological Survey to continue to fulfill its mission with respect to assessing the Nation's water resources, an expanded program to study the hydrologic implications of climate uncertainty will be required. The goal for this program is to develop knowledge and information concerning the potential water-resources implications for the United States of uncertainties in climate that may result from both anthropogenic and natural changes of the Earth's atmosphere. Like most past and current water-resources programs of the Geological Survey, the climate-uncertainty program should be composed of three elements: (1) research, (2) data collection, and (3) interpretive studies. However, unlike most other programs, the climate-uncertainty program necessarily will be dominated by its research component during its early years. Critical new concerns to be addressed by the research component are (1) areal estimates of evapotranspiration, (2) hydrologic resolution within atmospheric (climatic) models at the global scale and at mesoscales, (3) linkages between hydrology and climatology, and (4) methodology for the design of data networks that will help to track the impacts of climate change on water resources. Other ongoing activities in U.S. Geological Survey research programs will be enhanced to make them more compatible with climate-uncertainty research needs. The existing hydrologic data base of the Geological Survey serves as a key element in assessing hydrologic and climatologic change. 
However, this data base has evolved in response to other needs for hydrologic information and probably is not as sensitive to climate change as is desirable. Therefore, as measurement and network-design methodologies are improved to account for climate-change potential, new data-collection activities will be added to the existing programs. One particular area of data-collection concern pertains to the phenomenon of evapotranspiration. Interpretive studies of the hydrologic implications of climate uncertainty will be initiated by establishing several studies at the river-basin scale in diverse hydroclimatic and demographic settings. These studies will serve as tests of the existing methodologies for studying the impacts of climate change and also will help to define subsequent research priorities. A prototype for these studies was initiated in early 1988 in the Delaware River basin.
The Business Case for Investing in Physician Well-being.
Shanafelt, Tait; Goh, Joel; Sinsky, Christine
2017-12-01
Widespread burnout among physicians has been recognized for more than 2 decades. Extensive evidence indicates that physician burnout has important personal and professional consequences. A lack of awareness regarding the economic costs of physician burnout and uncertainty regarding what organizations can do to address the problem have been barriers to many organizations taking action. Although there is a strong moral and ethical case for organizations to address physician burnout, financial principles (eg, return on investment) can also be applied to determine the economic cost of burnout and guide appropriate investment to address the problem. The business case to address physician burnout is multifaceted and includes costs associated with turnover, lost revenue associated with decreased productivity, as well as financial risk and threats to the organization's long-term viability due to the relationship between burnout and lower quality of care, decreased patient satisfaction, and problems with patient safety. Nearly all US health care organizations have used similar evidence to justify their investments in safety and quality. Herein, we provide conservative formulas based on readily available organizational characteristics to determine the financial return on organizational investments to reduce physician burnout. A model outlining the steps of the typical organization's journey to address this issue is presented. Critical ingredients to making progress include prioritization by leadership, physician involvement, organizational science/learning, metrics, structured interventions, open communication, and promoting culture change at the work unit, leader, and organization level. Understanding the business case to reduce burnout and promote engagement as well as overcoming the misperception that nothing meaningful can be done are key steps for organizations to begin to take action. 
Evidence suggests that improvement is possible, investment is justified, and the return on investment is measurable. Addressing this issue is not only the organization's ethical responsibility, it is also the fiscally responsible one.
Impact from Magnitude-Rupture Length Uncertainty on Seismic Hazard and Risk
NASA Astrophysics Data System (ADS)
Apel, E. V.; Nyst, M.; Kane, D. L.
2015-12-01
In probabilistic seismic hazard and risk assessments, seismic sources are typically divided into two groups: fault sources (to model known faults) and background sources (to model unknown faults). In areas like the Central and Eastern United States and Hawaii, the hazard and risk are driven primarily by background sources. Background sources can be modeled as areas, points or pseudo-faults. When background sources are modeled as pseudo-faults, magnitude-length or magnitude-area scaling relationships are required to construct these pseudo-faults. However, the uncertainty associated with these relationships is often ignored or discarded in hazard and risk models, particularly when fault sources are the dominant contributor. Conversely, in areas modeled only with background sources these uncertainties are much more significant. In this study we test the impact of using various relationships and the resulting epistemic uncertainties on the seismic hazard and risk in the Central and Eastern United States and Hawaii. It is common to use only one magnitude-length relationship when calculating hazard. However, Stirling et al. (2013) showed that for a given suite of magnitude-rupture length relationships the variability can be quite large. The 2014 US National Seismic Hazard Maps (Petersen et al., 2014) used one magnitude-rupture length relationship (Somerville et al., 2001) in the Central and Eastern United States, and did not consider variability in the seismogenic rupture plane width. Here we use a suite of metrics to compare the USGS approach with these variable uncertainty models to assess 1) the impact on hazard and risk and 2) the epistemic uncertainty associated with choice of relationship. In areas where the seismic hazard is dominated by larger crustal faults (e.g. New Madrid) the choice of magnitude-rupture length relationship has little impact on the hazard or risk. 
However, away from these regions, the choice of relationship is more significant and may approach the size of the uncertainty associated with the ground motion prediction equation suite.
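The sensitivity to the choice of scaling relationship can be sketched numerically. Below is a minimal illustration of a generic log-linear magnitude-rupture length relationship, log10(L) = a + b * M; the two coefficient pairs are hypothetical stand-ins for alternative published relationships, not values taken from Somerville et al. (2001) or Stirling et al. (2013).

```python
def rupture_length_km(magnitude, a, b):
    """Generic log-linear scaling relationship: log10(L) = a + b * M."""
    return 10.0 ** (a + b * magnitude)

# Hypothetical coefficient pairs standing in for alternative published
# magnitude-rupture length relationships (illustrative values only).
relationships = {"relation_A": (-2.44, 0.59), "relation_B": (-3.22, 0.69)}

# Rupture lengths implied for a magnitude 7.0 background-source earthquake:
lengths = {name: rupture_length_km(7.0, a, b)
           for name, (a, b) in relationships.items()}

# Epistemic spread attributable solely to the choice of relationship:
spread_ratio = max(lengths.values()) / min(lengths.values())
```

In a hazard model, this spread in pseudo-fault length would propagate into ground-motion estimates, which is why the abstract argues the relationship choice matters where no dominant crustal fault controls the hazard.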
CosmoSIS: Modular cosmological parameter estimation
Zuntz, J.; Paterno, M.; Jennings, E.; ...
2015-06-09
Cosmological parameter estimation is entering a new era. Large collaborations need to coordinate high-stakes analyses using multiple methods; furthermore, such analyses have grown in complexity due to sophisticated models of cosmology and systematic uncertainties. In this paper we argue that modularity is the key to addressing these challenges: calculations should be broken up into interchangeable modular units with inputs and outputs clearly defined. Here we present a new framework for cosmological parameter estimation, CosmoSIS, designed to connect together, share, and advance development of inference tools across the community. We describe the modules already available in CosmoSIS, including CAMB, Planck, cosmic shear calculations, and a suite of samplers. Lastly, we illustrate it using demonstration code that you can run out-of-the-box with the installer available at http://bitbucket.org/joezuntz/cosmosis
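The modularity argument can be illustrated with a toy pipeline. The sketch below is not the actual CosmoSIS API; it only shows the pattern the abstract describes, interchangeable modules with declared inputs and outputs communicating through a shared data block. All module names and quantities are invented.

```python
class DataBlock(dict):
    """Shared key-value store that modules read from and write to."""

def background_module(block):
    # Toy 'theory' stage: derive a distance-like quantity from a sampled parameter.
    block["distance"] = 1.0 / block["h0"]

def likelihood_module(block):
    # Toy likelihood comparing the theory output to a fixed 'observation' of 14.0.
    block["like"] = -0.5 * (block["distance"] - 14.0) ** 2

# The pipeline is an ordered list of interchangeable units; swapping one module
# for another only requires that its declared inputs/outputs match.
pipeline = [background_module, likelihood_module]

block = DataBlock(h0=0.07)
for module in pipeline:
    module(block)
```

Because each stage touches only named keys on the block, a sampler can rerun the pipeline with different parameter values without any module knowing about the others, which is the interchangeability the paper advocates.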
Quantifying parametric uncertainty in the Rothermel model
S. Goodrick
2008-01-01
The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model (implemented in fire spread modeling software used in the United States). This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...
Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leggett, Richard Wayne; Eckerman, Keith F; Meck, Robert A.
2008-10-01
This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information of that form available to the dose analyst. 
(3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently studied radionuclides. (4) The biokinetics of a radionuclide in the human body typically represents the greatest source of uncertainty or variability in dose per unit intake. (5) Characterization of uncertainty in dose per unit exposure is generally a more straightforward problem for external exposure than for intake of a radionuclide. (6) For many radionuclides the most important outcome of a large-scale critical evaluation of databases and biokinetic models for radionuclides is expected to be the improvement of current models. Many of the current models do not fully or accurately reflect available radiobiological or physiological information, either because the models are outdated or because they were based on selective or uncritical use of data or inadequate model structures. In such cases the models should be replaced with physiologically realistic models that incorporate a wider spectrum of information.
NASA Astrophysics Data System (ADS)
Govindaraju, Parithi
Determining the optimal requirements for and design variable values of new systems, which operate along with existing systems to provide a set of overarching capabilities, as a single task is challenging due to the highly interconnected effects that setting requirements on a new system's design can have on how an operator uses this newly designed system. This task of determining the requirements and the design variable values becomes even more difficult because of the presence of uncertainties in the new system design and in the operational environment. This research proposed and investigated aspects of a framework that generates optimum design requirements of new, yet-to-be-designed systems that, when operating alongside other systems, will optimize fleet-level objectives while considering the effects of various uncertainties. Specifically, this research effort addresses the issues of uncertainty in the design of the new system through reliability-based design optimization methods, and uncertainty in the operations of the fleet through descriptive sampling methods and robust optimization formulations. In this context, fleet-level performance metrics result from using the new system alongside other systems to accomplish an overarching objective or mission. This approach treats the design requirements of a new system as decision variables in an optimization problem formulation that a user in the position of making an acquisition decision could solve. This solution would indicate the best new system requirements, and an associated description of the best possible design variable values for that new system, to optimize the fleet-level performance metric(s). 
Using a problem motivated by recorded operations of the United States Air Force Air Mobility Command for illustration, the approach is demonstrated first for a simplified problem that only considers demand uncertainties in the service network and the proposed methodology is used to identify the optimal design requirements and optimal aircraft sizing variables of new, yet-to-be-introduced aircraft. With this new aircraft serving alongside other existing aircraft, the fleet of aircraft satisfy the desired demand for cargo transportation, while maximizing fleet productivity and minimizing fuel consumption via a multi-objective problem formulation. The approach is then extended to handle uncertainties in both the design of the new system and in the operations of the fleet. The propagation of uncertainties associated with the conceptual design of the new aircraft to the uncertainties associated with the subsequent operations of the new and existing aircraft in the fleet presents some unique challenges. A computationally tractable hybrid robust counterpart formulation efficiently handles the confluence of the two types of domain-specific uncertainties. This hybrid formulation is tested on a larger route network problem to demonstrate the scalability of the approach. Following the presentation of the results obtained, a summary discussion indicates how decision-makers might use these results to set requirements for new aircraft that meet operational needs while balancing the environmental impact of the fleet with fleet-level performance. Comparing the solutions from the uncertainty-based and deterministic formulations via a posteriori analysis demonstrates the efficacy of the robust and reliability-based optimization formulations in addressing the different domain-specific uncertainties. 
Results suggest that the aircraft design requirements and design description determined through the hybrid robust counterpart formulation approach differ from solutions obtained from the simplistic deterministic approach, and leads to greater fleet-level fuel savings, when subjected to real-world uncertain scenarios (more robust to uncertainty). The research, though applied to a specific air cargo application, is technically agnostic in nature and can be applied to other facets of policy and acquisition management, to explore capability trade spaces for different vehicle systems, mitigate risks, define policy and potentially generate better returns on investment. Other domains relevant to policy and acquisition decisions could utilize the problem formulation and solution approach proposed in this dissertation provided that the problem can be split into a non-linear programming problem to describe the new system sizing and the fleet operations problem can be posed as a linear/integer programming problem.
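The deterministic-versus-robust contrast described above can be illustrated with a toy scenario-based sketch. This is a stand-in for the dissertation's hybrid robust counterpart formulation, not a reproduction of it; all capacities, demands, and costs are hypothetical.

```python
def cost(capacity, demand):
    # Unit cost of fleet capacity plus a penalty for unmet cargo demand
    # (hypothetical cost structure for illustration).
    shortfall_penalty = 5.0 * max(0.0, demand - capacity)
    return 1.0 * capacity + shortfall_penalty

candidates = range(80, 131, 10)      # candidate fleet capacities
scenarios = [90.0, 100.0, 120.0]     # uncertain demand realizations

# Deterministic plan: optimize against the nominal demand only.
deterministic = min(candidates, key=lambda c: cost(c, 100.0))

# Robust plan: optimize against the worst case over all demand scenarios.
robust = min(candidates, key=lambda c: max(cost(c, d) for d in scenarios))
```

The robust plan sizes the fleet larger than the deterministic one, trading nominal-case cost for protection against the high-demand scenario, which mirrors the abstract's finding that uncertainty-based solutions differ from, and outperform, the deterministic solution under real-world uncertain scenarios.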
Code of Federal Regulations, 2012 CFR
2012-04-01
... a minor's supervised account to a forwarding address left with the United States post office? 115... office forward mail regarding a minor's supervised account to a forwarding address left with the United... forwarded to an address left with the United States post office. The new address of record must be provided...
NASA Astrophysics Data System (ADS)
Islam, S.; Susskind, L.
2012-12-01
Most difficulties in water management are the product of rigid assumptions about how water ought to be allocated in the face of ever-increasing demand and growing uncertainty. When stakeholders face contending water claims, one of the biggest obstacles to reaching agreement is uncertainty. Specifically, there are three types of uncertainty that need to be addressed: uncertainty of information, uncertainty of action, and uncertainty of perception. All three shape water management decisions. Contrary to traditional approaches, we argue that management of uncertainty needs to include both risks and opportunities. When parties treat water as a flexible rather than a fixed resource, opportunities to create value can be invented. When they use the right processes and mechanisms to enhance trust, even parties in conflict can reach agreements that satisfy their competing water needs and interests simultaneously. Using examples from several boundary-crossing water cases, we show how this balance between risks and opportunities can be found to manage water resources for an uncertain future.
Grant, Evan H. Campbell; Muths, Erin L.; Katz, Rachel A.; Canessa, Stefano; Adams, Michael J.; Ballard, Jennifer R.; Berger, Lee; Briggs, Cheryl J.; Coleman, Jeremy; Gray, Matthew J.; Harris, M. Camille; Harris, Reid N.; Hossack, Blake R.; Huyvaert, Kathryn P.; Kolby, Jonathan E.; Lips, Karen R.; Lovich, Robert E.; McCallum, Hamish I.; Mendelson, Joseph R.; Nanjappa, Priya; Olson, Deanna H.; Powers, Jenny G.; Richgels, Katherine L. D.; Russell, Robin E.; Schmidt, Benedikt R.; Spitzen-van der Sluijs, Annemarieke; Watry, Mary Kay; Woodhams, Douglas C.; White, C. LeAnn
2016-01-20
The recently (2013) identified pathogenic chytrid fungus, Batrachochytrium salamandrivorans (Bsal), poses a severe threat to the distribution and abundance of salamanders within the United States and Europe. Development of a response strategy for the potential, and likely, invasion of Bsal into the United States is crucial to protect global salamander biodiversity. A formal working group, led by Amphibian Research and Monitoring Initiative (ARMI) scientists from the U.S. Geological Survey (USGS) Patuxent Wildlife Research Center, Fort Collins Science Center, and Forest and Rangeland Ecosystem Science Center, was held at the USGS Powell Center for Analysis and Synthesis in Fort Collins, Colorado, United States from June 23 to June 25, 2015, to identify crucial Bsal research and monitoring needs that could inform conservation and management strategies for salamanders in the United States. Key findings of the workshop included the following: (1) the introduction of Bsal into the United States is highly probable, if not inevitable, thus requiring development of immediate short-term and long-term intervention strategies to prevent Bsal establishment and biodiversity decline; (2) management actions targeted towards pathogen containment may be ineffective in reducing the long-term spread of Bsal throughout the United States; and (3) early detection of Bsal through surveillance at key amphibian import locations, among high-risk wild populations, and through analysis of archived samples is necessary for developing management responses. Top research priorities during the preinvasion stage included the following: (1) deployment of qualified diagnostic methods for Bsal and establishment of standardized laboratory practices, (2) assessment of susceptibility for amphibian hosts (including anurans), and (3) development and evaluation of short- and long-term pathogen intervention and management strategies. 
Several outcomes were achieved during the workshop, including development of an organizational structure with working groups for a Bsal Task Force, creation of an initial influence diagram to aid in identifying effective management actions in the face of uncertainty, and production of a list of potential management actions and key research uncertainties. Additional products under development include a Bsal Strategic Action plan, an emergency response plan, a monitoring and surveillance program, a standardized diagnostic approach, decision models for natural resource agencies, and a reporting database for salamander mortalities. This workshop was the first international meeting to address the threat of Bsal to salamander populations in the United States, with more than 30 participants from U.S. conservation and resource management agencies (U.S. Fish and Wildlife Service, U.S. Forest Service, U.S. Department of Defense, U.S. National Park Service, and Association of Fish and Wildlife Agencies) and academic research institutions in Australia, the Netherlands, Switzerland, the United Kingdom, and the United States.
Kazandjian, Vahé A; Lipitz-Snyderman, Allison
2011-12-01
To discuss the usefulness of health care information technology (HIT) in helping care providers minimize uncertainty while simultaneously increasing the efficiency of the care provided. An ongoing study of HIT, performance measurement (clinical and production efficiency), and their implications for care payment forms the design of this study. Since 2006, all Maryland hospitals have embarked on a multi-faceted study of performance measures and HIT adoption surveys, which will shape the health care payment model in Maryland, the last of the all-payor states, in 2011. This paper focuses on the HIT component of the Maryland care payment initiative. While the payment model is still under review and discussion, 'appropriateness' of care has been discussed as an important dimension of measurement. Within this dimension, the 'uncertainty' concept has been identified as associated with variation in care practices. Hence, the methods of this paper define how HIT can assist care providers in addressing the concept of uncertainty, and then present findings from the first HIT survey in Maryland to infer the readiness of Maryland hospitals to address uncertainty of care in part through the use of HIT. Maryland hospitals show noteworthy variation in their adoption and use of HIT. While computerized, electronic patient records are not commonly used among and across Maryland hospitals, many of the internal uses of HIT in each hospital could significantly assist in better communication about better practices to minimize uncertainty of care and enhance the efficiency of its production. © 2010 Blackwell Publishing Ltd.
IAEA activities on atomic, molecular and plasma-material interaction data for fusion
NASA Astrophysics Data System (ADS)
Braams, Bastiaan J.; Chung, Hyun-Kyung
2013-09-01
The IAEA Atomic and Molecular Data Unit (http://www-amdis.iaea.org/) aims to provide internationally evaluated and recommended data for atomic, molecular and plasma-material interaction (A+M+PMI) processes in fusion research. The Unit organizes technical meetings and coordinates an A+M Data Centre Network (DCN) and a Code Centre Network (CCN). In addition the Unit organizes Coordinated Research Projects (CRPs), for which the objectives are mixed between development of new data and evaluation and recommendation of existing data. In the area of A+M data we are placing new emphasis in our meeting schedule on data evaluation and especially on uncertainties in calculated cross section data and the propagation of uncertainties through structure data and fundamental cross sections to effective rate coefficients. Following a recent meeting of the CCN it is intended to use electron scattering on Be, Ne and N2 as exemplars for study of uncertainties and uncertainty propagation in calculated data; this will be discussed further at the presentation. Please see http://www-amdis.iaea.org/CRP/ for more on our active and planned CRPs, which are concerned with atomic processes in core and edge plasma and with plasma interaction with beryllium-based surfaces and with irradiated tungsten.
NASA Astrophysics Data System (ADS)
Lee, K. David; Colony, Mike
2011-06-01
Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including: confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.
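The role of confidence of source in this kind of uncertainty modeling can be illustrated with a single Bayes-rule update. This is a toy calculation in the spirit of the Bayesian network approach described above, not Army-PRIDE itself; all probabilities are invented.

```python
def posterior_threat(prior, p_report_given_threat, p_report_given_clear):
    """P(threat | report) by Bayes' rule for a binary threat hypothesis."""
    numerator = p_report_given_threat * prior
    denominator = numerator + p_report_given_clear * (1.0 - prior)
    return numerator / denominator

# Same prior belief and the same 'threat' report, but from sources of
# different reliability (confidence of source):
reliable_source = posterior_threat(0.10, 0.90, 0.05)    # high-confidence source
unreliable_source = posterior_threat(0.10, 0.60, 0.40)  # low-confidence source
```

A report from the high-confidence source moves the posterior far more than the same report from the low-confidence source, which is exactly the behavior a decision-process model needs in order to weigh battlefield information realistically.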
Smith, Allan W.; Lorentz, Steven R.; Stone, Thomas C.; Datla, Raju V.
2012-01-01
The need to understand and monitor climate change has led to proposed radiometric accuracy requirements for space-based remote sensing instruments that are very stringent and currently outside the capabilities of many Earth orbiting instruments. A major problem is quantifying changes in sensor performance that occur from launch and during the mission. To address this problem on-orbit calibrators and monitors have been developed, but they too can suffer changes from launch and the harsh space environment. One solution is to use the Moon as a calibration reference source. Already the Moon has been used to remove post-launch drift and to cross-calibrate different instruments, but further work is needed to develop a new model with low absolute uncertainties capable of climate-quality absolute calibration of Earth observing instruments on orbit. To this end, we are proposing an Earth-based instrument suite to measure the absolute lunar spectral irradiance to an uncertainty of 0.5 % (k=1) over the spectral range from 320 nm to 2500 nm with a spectral resolution of approximately 0.3 %. Absolute measurements of lunar radiance will also be acquired to facilitate calibration of high spatial resolution sensors. The instruments will be deployed at high elevation astronomical observatories and flown on high-altitude balloons in order to mitigate the effects of the Earth’s atmosphere on the lunar observations. Periodic calibrations using instrumentation and techniques available from NIST will ensure traceability to the International System of Units (SI) and low absolute radiometric uncertainties.
Development of a Digital Aquifer Permeability Map for the ...
Researchers at the U.S. Environmental Protection Agency’s Western Ecology Division have been developing hydrologic landscape maps for selected U.S. states in an effort to create a method to identify the intrinsic watershed attributes of landscapes in regions with little data. Each hydrologic landscape unit is assigned a categorical value from five key indices of macro-scale hydrologic behavior, including annual climate, climate seasonality, aquifer permeability, terrain, and soil permeability. The aquifer permeability index requires creation of a from-scratch dataset for each state. The permeability index for the Pacific Southwest (California, Nevada, and Arizona) expands and modifies the permeability index for the Pacific Northwest (Oregon, Washington, and Idaho), which preceded it. The permeability index was created by assigning geologic map units to one of 18 categories with presumed similar values of permeability to create a hydrolithologic map. The hydrolithologies were then further categorized into permeability index classifications of high, low, unknown, and surface water. Unconsolidated, carbonate, volcanic, and undifferentiated units are classified more conservatively to better address uncertainty in source data. High vs. low permeability classifications are assigned qualitatively but follow a threshold guideline of 8.5×10⁻² m/day hydraulic conductivity. Estimates of permeability from surface lithology are the current best practice for broad-scale...
Quadratic stabilisability of multi-agent systems under switching topologies
NASA Astrophysics Data System (ADS)
Guan, Yongqiang; Ji, Zhijian; Zhang, Lin; Wang, Long
2014-12-01
This paper addresses the stabilisability of multi-agent systems (MASs) under switching topologies. Necessary and/or sufficient conditions are presented in terms of graph topology. These conditions explicitly reveal how the intrinsic dynamics of the agents, the communication topology and the external control input jointly affect stabilisability. With the appropriate selection of some agents to which the external inputs are applied and the suitable design of neighbour-interaction rules via a switching topology, an MAS is proved to be stabilisable even if each of its uncertain subsystems is not. In addition, a method is proposed to constructively design a switching rule for MASs with norm-bounded time-varying uncertainties. The switching rules designed via this method do not rely on the uncertainties, and the switched MAS is quadratically stabilisable via decentralised external self-feedback for all uncertainties. With respect to applications of the stabilisability results, formation control and cooperative tracking control are addressed. Numerical simulations are presented to demonstrate the effectiveness of the proposed results.
NASA Technical Reports Server (NTRS)
Sepka, Steve; Vander Kam, Jeremy; McGuire, Kathy
2018-01-01
The Orion Thermal Protection System (TPS) margin process uses a root-sum-square approach with branches addressing trajectory, aerothermodynamics, and material response uncertainties in ablator thickness design. The material response branch applies a bond line temperature reduction between the Avcoat ablator and EA9394 adhesive by 60 C (108 F) from its peak allowed value of 260 C (500 F). This process is known as the Bond Line Temperature Material Margin (BTMM) and is intended to cover material property and performance uncertainties. The value of 60 C (108 F) is a constant, applied at any spacecraft body location and for any trajectory. By varying only material properties in a random (Monte Carlo) manner, the Perl-based script mcCHAR is used to investigate the confidence interval provided by the BTMM. In particular, this study will look at various locations on the Orion heat shield forebody for a guided and an abort (ballistic) trajectory.
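The root-sum-square margin process described above combines independent uncertainty branches in quadrature. A minimal sketch of that combination; the branch values below are hypothetical placeholders, not Orion program numbers:

```python
import math

def rss(components):
    """Root-sum-square (RSS) combination of independent
    uncertainty/margin branches."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical ablator-thickness uncertainty branches, cm
trajectory = 0.10
aerothermal = 0.25
material = 0.15
total_margin = rss([trajectory, aerothermal, material])
```

Adding in quadrature assumes the branches are independent; correlated error sources would require a covariance term instead of a simple RSS.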
Unique geologic insights from "non-unique" gravity and magnetic interpretation
Saltus, R.W.; Blakely, R.J.
2011-01-01
Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are always possible. The rigorous mathematical label of "nonuniqueness" can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this article is to present a practical perspective on the theoretical non-uniqueness of potential-field interpretation in geology. There are multiple ways to approach and constrain potential-field studies to produce significant, robust, and definitive results. The "non-uniqueness" of potential-field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
Climate change risk perception and communication: addressing a critical moment?
Pidgeon, Nick
2012-06-01
Climate change is an increasingly salient issue for societies and policy-makers worldwide. It now raises fundamental interdisciplinary issues of risk and uncertainty analysis and communication. The growing scientific consensus over the anthropogenic causes of climate change appears to sit at odds with the increasing use of risk discourses in policy: for example, to aid in climate adaptation decision making. All of this points to a need for a fundamental revision of our conceptualization of what it is to do climate risk communication. This Special Collection comprises seven papers stimulated by a workshop on "Climate Risk Perceptions and Communication" held at Cumberland Lodge Windsor in 2010. Topics addressed include climate uncertainties, images and the media, communication and public engagement, uncertainty transfer in climate communication, the role of emotions, localization of hazard impacts, and longitudinal analyses of climate perceptions. Climate change risk perceptions and communication work is critical for future climate policy and decisions. © 2012 Society for Risk Analysis.
On the formulation of a minimal uncertainty model for robust control with structured uncertainty
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert
1991-01-01
In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainty from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop.
The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix, delta, and constructing the state-space representation of P(s). Three examples are presented to illustrate the procedure.
Comparison of methods for measuring atmospheric deposition of arsenic, cadmium, nickel and lead.
Aas, Wenche; Alleman, Laurent Y; Bieber, Elke; Gladtke, Dieter; Houdret, Jean-Luc; Karlsson, Vuokko; Monies, Christian
2009-06-01
A comprehensive field intercomparison at four different types of European sites (two rural, one urban and one industrial) comparing three different collectors (wet only, bulk and Bergerhoff samplers) was conducted in the framework of the European Committee for Standardization (CEN) to create a European standard for the deposition of the four elements As, Cd, Ni and Pb. The purpose was to determine whether the proposed methods lead to results within the uncertainty required by the EU's daughter directive (70%). The main conclusion is that a different sampling strategy is needed for rural and industrial sites. Thus, the conclusions on uncertainties and sampling approach are presented separately for the different approaches. The wet only and bulk collector ("bulk bottle method") are comparable at wet rural sites where the total deposition arises mainly from precipitation; the expanded uncertainty when comparing these two types of sampler is below 45% for As, Cd and Pb, and 67% for Ni. At industrial sites and possibly very dry rural and urban sites it is necessary to use Bergerhoff samplers or a "bulk bottle+funnel method". It is not possible to address the total deposition estimation with these methods, but they will give the lowest estimate of the total deposition. The expanded uncertainties when comparing the Bergerhoff and the bulk bottle+funnel methods are below 50% for As and Cd, and 63% for Pb. The uncertainty for Ni was not addressed since the bulk bottle+funnel method did not include a full digestion procedure, which is necessary for sites with high loads of undissolved metals. The lowest estimate can, however, be calculated by comparing parallel Bergerhoff samplers, where the expanded uncertainty for Ni was 24%. The reproducibility is comparable to the between-sampler/method uncertainties. Sampling and sample preparation proved to be the main factors in the uncertainty budget of deposition measurements.
Calibration of Speed Enforcement Down-The-Road Radars
Jendzurski, John; Paulter, Nicholas G.
2009-01-01
We examine the measurement uncertainty associated with different methods of calibrating the ubiquitous down-the-road (DTR) radar used in speed enforcement. These calibration methods include the use of audio frequency sources, tuning forks, a fifth wheel attached to the rear of the vehicle with the radar unit, and the speedometer of the vehicle. We also provide an analysis showing the effect of calibration uncertainty on DTR-radar speed measurement uncertainty. PMID:27504217
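The calibration sources compared above (audio frequency sources, tuning forks, fifth wheel, speedometer) all exploit the same Doppler relation between target speed and the radar's audio-frequency output. A minimal sketch of that relation, assuming a K-band carrier of 24.150 GHz (a common DTR-radar frequency, used here purely as an illustrative assumption):

```python
C = 299_792_458.0   # speed of light, m/s
F0 = 24.150e9       # assumed K-band carrier frequency, Hz

def doppler_shift(speed_mps):
    """Audio-frequency Doppler shift (Hz) produced by a target
    moving radially at speed_mps: f_d = 2 * v * f0 / c."""
    return 2.0 * speed_mps * F0 / C

def speed_from_shift(f_d):
    """Inverse relation: target speed (m/s) inferred from a
    measured Doppler shift (Hz)."""
    return f_d * C / (2.0 * F0)
```

A tuning fork or audio source calibrates the radar by injecting a known f_d; any fractional error in F0 or in the injected frequency propagates directly into the displayed speed, which is why those sources appear in the uncertainty budget.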
Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald
2014-01-01
The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
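Parameter estimates and standard errors from the m imputed data sets are combined with the standard multiple-imputation pooling rules (Rubin's rules); the nested rules used in the paper extend these, but the basic single-level combination can be sketched as:

```python
import statistics

def pool_mi(estimates, variances):
    """Rubin's rules for combining results from m imputed data sets.

    estimates: point estimates from each imputed data set
    variances: squared standard errors from each imputed data set
    Returns (pooled estimate, total variance).
    """
    m = len(estimates)
    q_bar = statistics.fmean(estimates)    # pooled point estimate
    w_bar = statistics.fmean(variances)    # within-imputation variance
    b = statistics.variance(estimates)     # between-imputation variance
    t = w_bar + (1 + 1 / m) * b            # total variance
    return q_bar, t
```

The between-imputation term b is what carries imputation-model uncertainty into the standard error; drawing imputations from a distribution of models, as the paper proposes, widens b accordingly.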
Methods for exploring uncertainty in groundwater management predictions
Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew
2016-01-01
Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.
1993 Intercomparison of Photometric Units Maintained at NIST (USA) and PTB (Germany)
Ohno, Yoshihiro; Sauter, Georg
1995-01-01
A bilateral intercomparison of photometric units between NIST, USA and PTB, Germany has been conducted to update the knowledge of the relationship between the photometric units disseminated in each country. The luminous intensity unit (cd) and the luminous flux unit (lm) maintained at both laboratories are compared by circulating transfer standard lamps. Also, the photometric responsivity s_v is compared by circulating a V(λ)-corrected detector with a built-in current-to-voltage converter. The results show that the difference of the luminous intensity unit between NIST and PTB, (PTB-NIST)/NIST, is 0.2 % with a relative expanded uncertainty (coverage factor k = 2) of 0.24 %. The difference is reduced significantly from that at the 1985 CCPR intercomparison (0.9 %). The difference in the luminous flux unit, (PTB-NIST)/NIST, is found to be 1.5 % with a relative expanded uncertainty (coverage factor k = 2) of 0.15 %. The difference remained nearly the same as that at the 1985 intercomparison (1.6 %). These results agree with what is predicted from the history of maintaining the units at each laboratory. PMID:29151737
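The reporting convention used in this intercomparison, a relative difference between units plus a relative expanded uncertainty with coverage factor k = 2, can be sketched as follows (the inputs are illustrative, not the NIST/PTB values):

```python
import math

def relative_difference(lab_a, lab_b):
    """Relative difference (B - A) / A, e.g. (PTB - NIST) / NIST."""
    return (lab_b - lab_a) / lab_a

def expanded_uncertainty(standard_uncertainties, k=2):
    """Combine independent standard uncertainties in quadrature,
    then multiply by the coverage factor k."""
    return k * math.sqrt(sum(u * u for u in standard_uncertainties))

# Illustrative: two labs' realizations of a unit and their
# hypothetical standard uncertainty components
diff = relative_difference(100.0, 101.5)            # 1.5 %
u_exp = expanded_uncertainty([0.0006, 0.0008], k=2)  # combined, k = 2
```

With k = 2 the interval corresponds to roughly 95 % coverage under approximately normal conditions, which is the convention behind the "(coverage factor k = 2)" statements above.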
Stochastic Robust Mathematical Programming Model for Power System Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Cong; Changhyeok, Lee; Haoyong, Chen
2016-01-01
This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.
NASA Astrophysics Data System (ADS)
Parada, M.; Sbarbaro, D.; Borges, R. A.; Peres, P. L. D.
2017-01-01
The use of robust design techniques such as those based on H2 and H∞ norms for tuning proportional integral (PI) and proportional integral derivative (PID) controllers has been limited to a small set of processes. This work addresses the problem by considering a wide set of possible plants, both first- and second-order continuous-time systems with time delays and zeros, leading to PI and PID controllers. The use of structured uncertainties to handle neglected dynamics makes it possible to expand the range of processes to be considered. The proposed approach takes into account the robustness of the controller with respect to these structured uncertainties by using the small-gain theorem. In addition, improved performance is sought through the minimisation of an upper bound on the closed-loop system H∞ norm. A Lyapunov-Krasovskii-type functional is used to obtain delay-dependent design conditions. The controller design is accomplished by means of a convex optimisation procedure formulated using linear matrix inequalities. In order to illustrate the flexibility of the approach, several examples considering recycle compensation, reduced-order controller design and a practical implementation are addressed. Numerical experiments are provided in each case to highlight the main characteristics of the proposed design method.
Maas-Hebner, Kathleen G.; Schreck, Carl B.; Hughes, Robert M.; Yeakley, Alan; Molina, Nancy
2016-01-01
We discuss the importance of addressing diffuse threats to long-term species and habitat viability in fish conservation and recovery planning. In the Pacific Northwest, USA, salmonid management plans have typically focused on degraded freshwater habitat, dams, fish passage, harvest rates, and hatchery releases. However, such plans inadequately address threats related to human population and economic growth, intra- and interspecific competition, and changes in climate, ocean, and estuarine conditions. Based on reviews conducted on eight conservation and/or recovery plans, we found that though threats resulting from such changes are difficult to model and/or predict, they are especially important for wide-ranging diadromous species. Adaptive management is also a critical but often inadequately constructed component of those plans. Adaptive management should be designed to respond to evolving knowledge about the fish and their supporting ecosystems; if done properly, it should help improve conservation efforts by decreasing uncertainty regarding known and diffuse threats. We conclude with a general call for environmental managers and planners to reinvigorate the adaptive management process in future management plans, including more explicitly identifying critical uncertainties, implementing monitoring programs to reduce those uncertainties, and explicitly stating what management actions will occur when pre-identified trigger points are reached.
Addressing Risk in the Valuation of Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Hammerstrom, Donald J.; Woodward, James T.
2017-06-26
Valuation is a mechanism by which potential worth of a transaction between two or more parties can be evaluated. Examples include valuation of transactive energy systems such as electric power system and building energy systems. Uncertainties can manifest while exercising a valuation methodology in the form of lack of knowledge or be inherently embedded in the valuation process. Uncertainty could also exist in the temporal dimension while planning for long-term growth. This paper discusses risk considerations associated with valuation studies in support of decision-making in the presence of such uncertainties. It is often important to have foresight of uncertain entities that can impact real-world deployments, such as the comparison or ranking of two valuation studies to determine cost-benefit impacts to multiple stakeholders. The research proposes to address this challenge through simulation and sensitivity analyses to support ‘what-if’ analysis of well-defined future scenarios. This paper describes foundational value of diagrammatic representation techniques such as unified modeling language to understand the implications of not addressing some of the risk elements encountered during the valuation process. The paper includes examples from generation resource adequacy assessment studies (e.g. loss of load) to illustrate the principles of risk in valuation.
Patricia L. Winter; Heidi Bigler-Cole
2010-01-01
Making complex risk-related decisions involves a degree of uncertainty. How that uncertainty is addressed or presented in reports or data tables can be tailored to meet information users' needs and preferences. Involving the recipients of risk-related information in the design of information to be delivered (including the types of information delivered, format, and...
Paige F. B. Ferguson; Michael J. Conroy; John F. Chamblee; Jeffrey Hepinstall-Cymerman
2015-01-01
Parcelization and forest fragmentation are of concern for ecological, economic, and social reasons. Efforts to keep large, private forests intact may be supported by a decision-making process that incorporates landowners' objectives and uncertainty. We used structured decision making (SDM) with owners of large, private forests in Macon County, North Carolina....
Synthesizing spatiotemporally sparse smartphone sensor data for bridge modal identification
NASA Astrophysics Data System (ADS)
Ozer, Ekin; Feng, Maria Q.
2016-08-01
Smartphones as vibration measurement instruments form a large-scale, citizen-induced, and mobile wireless sensor network (WSN) for system identification and structural health monitoring (SHM) applications. Crowdsourcing-based SHM is possible with a decentralized system granting citizens operational responsibility and control. Yet, citizen initiatives introduce device mobility, drastically changing SHM results due to uncertainties in the time and space domains. This paper proposes a modal identification strategy that fuses spatiotemporally sparse SHM data collected by smartphone-based WSNs. Multichannel data sampled with time and space independence is used to compose the modal identification parameters such as frequencies and mode shapes. Structural response time histories can be gathered by smartphone accelerometers and converted into Fourier spectra by the processor units. Timestamp, data length, and energy-to-power conversion address temporal variation, whereas spatial uncertainties are reduced by geolocation services or by determining node identity via QR code labels. Then, parameters collected from each distributed network component can be extended to global behavior to deduce modal parameters without the need for a centralized and synchronous data acquisition system. The proposed method is tested on a pedestrian bridge and compared with a conventional reference monitoring system. The results show that the spatiotemporally sparse mobile WSN data can be used to infer modal parameters despite non-overlapping sensor operation schedules.
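The core of the frequency-domain step above, converting an accelerometer record to a Fourier amplitude spectrum and picking the dominant peak, can be sketched as follows. A plain O(n²) DFT keeps the example dependency-free (a real pipeline would use an FFT), and the signal parameters are hypothetical:

```python
import cmath
import math

def dominant_frequency(samples, fs):
    """Return the frequency (Hz) of the largest DFT amplitude peak,
    skipping the DC bin. samples: list of floats; fs: sampling rate."""
    n = len(samples)
    best_k, best_amp = 1, 0.0
    for k in range(1, n // 2):
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_amp:
            best_k, best_amp = k, abs(s)
    return best_k * fs / n

# Hypothetical accelerometer record: 2 s at 100 Hz of a 2.5 Hz mode
fs = 100.0
sig = [math.sin(2 * math.pi * 2.5 * t / fs) for t in range(200)]
peak = dominant_frequency(sig, fs)
```

In the crowdsourced setting, each phone would report its own spectrum (or peak) with a timestamp and location, and the per-device peaks are then fused into global modal frequencies and mode shapes.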
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
DOE Office of Scientific and Technical Information (OSTI.GOV)
BABA,T.; ISHIGURO,K.; ISHIHARA,Y.
1999-08-30
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference case and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.
A taxonomy of medical uncertainties in clinical genome sequencing.
Han, Paul K J; Umstead, Kendall L; Bernhardt, Barbara A; Green, Robert C; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B; Biesecker, Leslie G; Biesecker, Barbara B
2017-08-01
Clinical next-generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of an unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts and themes were extracted in order to expand on a previously published three-dimensional taxonomy of medical uncertainty. In parallel, we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. The proposed taxonomy divides uncertainty along three axes (source, issue, and locus) and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. Genet Med advance online publication, 19 January 2017.
NASA Technical Reports Server (NTRS)
Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael
2012-01-01
There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading: overly conservative or optimistic. 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact sparing assessment: (a) Aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failure (MTBF); (b) Epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.
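The two uncertainty categories can be separated in a simulation: an aleatory draw counts failures over a horizon given a known MTBF, while an epistemic draw samples the MTBF itself from a lack-of-knowledge distribution. A minimal sketch of that idea, not the PACT implementation; the lognormal error factor and horizon below are hypothetical:

```python
import math
import random

random.seed(0)

def failures_in(horizon_hours, mtbf_hours):
    """Aleatory draw: failure count over a horizon for a known MTBF
    (exponentially distributed times between failures)."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(1.0 / mtbf_hours)
        if t > horizon_hours:
            return n
        n += 1

def demand_samples(horizon_hours, mtbf_median, error_factor, trials):
    """Epistemic + aleatory: sample the MTBF itself from a lognormal
    distribution (median plus a 90th-percentile error factor), then
    simulate failures for each sampled MTBF."""
    sigma = math.log(error_factor) / 1.645   # 90 % bound -> lognormal sigma
    mu = math.log(mtbf_median)
    return [failures_in(horizon_hours, random.lognormvariate(mu, sigma))
            for _ in range(trials)]

# One year of operation for a hypothetical ORU with median MTBF 10,000 h
demand = demand_samples(8760.0, 10000.0, 3.0, 500)
```

The spread of `demand` reflects both categories at once; fixing the MTBF and rerunning would isolate the purely aleatory spread, which is the kind of trade-space comparison PACT is intended to support.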
The Social Construction of Uncertainty in Healthcare Delivery
NASA Astrophysics Data System (ADS)
Begun, James W.; Kaissi, Amer A.
We explore the following question: How would healthcare delivery be different if uncertainty were widely recognized, accurately diagnosed, and appropriately managed? Unlike most studies of uncertainty, we examine uncertainty at more than one level of analysis, considering uncertainty that arises at the patient-clinician interaction level and at the organizational level of healthcare delivery. We consider the effects of history, as the forces and systems that currently shape and manage uncertainty have emerged over a long time period. The purpose of this broad and speculative "thought exercise" is to generate greater sensemaking of the current state of healthcare delivery, particularly in the realm of organizational and public policy, and to generate new research questions about healthcare delivery. The discussion is largely based on experience in the United States, which may limit its generalizability.
Wu, Jianyong; Zhou, Ying; Gao, Yang; Fu, Joshua S.; Johnson, Brent A.; Huang, Cheng; Kim, Young-Min
2013-01-01
Background: Climate change is anticipated to influence heat-related mortality in the future. However, estimates of excess mortality attributable to future heat waves are subject to large uncertainties and have not been projected under the latest greenhouse gas emission scenarios. Objectives: We estimated future heat wave mortality in the eastern United States (approximately 1,700 counties) under two Representative Concentration Pathways (RCPs) and investigated sources of uncertainty. Methods: Using dynamically downscaled hourly temperature projections for 2057–2059, we projected heat wave days that were defined using four heat wave metrics and estimated the excess mortality attributable to them. We apportioned the sources of uncertainty in excess mortality estimates using a variance-decomposition method. Results: Estimates suggest that excess mortality attributable to heat waves in the eastern United States would result in 200–7,807 deaths/year (mean 2,379 deaths/year) in 2057–2059. Average excess mortality projections under RCP4.5 and RCP8.5 scenarios were 1,403 and 3,556 deaths/year, respectively. Excess mortality would be relatively high in the southern states and eastern coastal areas (excluding Maine). The major sources of uncertainty were the relative risk estimates for mortality on heat wave versus non–heat wave days, the RCP scenarios, and the heat wave definitions. Conclusions: Mortality risks from future heat waves may be an order of magnitude higher than the mortality risks reported in 2002–2004, with thousands of heat wave–related deaths per year in the study area projected under the RCP8.5 scenario. Substantial spatial variability in county-level heat mortality estimates suggests that effective mitigation and adaptation measures should be developed based on spatially resolved data. Citation: Wu J, Zhou Y, Gao Y, Fu JS, Johnson BA, Huang C, Kim YM, Liu Y. 2014. 
Estimation and uncertainty analysis of impacts of future heat waves on mortality in the eastern United States. Environ Health Perspect 122:10–16; http://dx.doi.org/10.1289/ehp.1306670 PMID:24192064
How Navigating Uncertainty Motivates Trust in Medicine.
Imber, Jonathan B
2017-04-01
Three significant factors in the shaping of modern medicine contribute to broad perceptions about trust in the patient-physician relationship: moral, professional, and epidemiological uncertainty. Trusting a physician depends first on trusting a person, then trusting a person's skills and training, and finally trusting the science that underwrites those skills. This essay, in part based on my book, Trusting Doctors: The Decline of Moral Authority in American Medicine (Princeton University Press, 2008), will address the forms of uncertainty that contribute to the nature of difficult encounters in the patient-physician relationship. © 2017 American Medical Association. All Rights Reserved.
ERIC Educational Resources Information Center
Yorks, Lyle; Nicolaides, Aliki
2012-01-01
This article addresses an important, yet often underattended to, aspect of the strategy development process: fostering the use of strategic learning practices in the simultaneous practice of developing strategy and cultivating strategic mindset awareness. The need for addressing this aspect of the strategy development process is increasingly…
The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental de...
Analyzing Uncertainty and Risk in the Management of Water Resources in the State Of Texas
NASA Astrophysics Data System (ADS)
Singh, A.; Hauffpauir, R.; Mishra, S.; Lavenue, M.
2010-12-01
The State of Texas updates its state water plan every five years to determine the water demand required to meet the needs of its growing population. The plan compiles forecasts of water deficits from state-wide regional water planning groups as well as the water supply strategies to address these deficits. To date, the plan has adopted a deterministic framework, where reference values (e.g., best estimates, worst-case scenario) are used for key factors such as population growth, demand for water, severity of drought, water availability, etc. These key factors can, however, be affected by multiple sources of uncertainty, such as the impact of climate on surface water and groundwater availability, uncertainty in population projections, changes in sectoral composition of the economy, variability in water usage, feasibility of the permitting process, cost of implementation, etc. The objective of this study was to develop a generalized and scalable methodology for addressing uncertainty and risk in water resources management both at the regional and the local water planning level. The study proposes a framework defining the elements of an end-to-end system model that captures the key components of demand, supply and planning modules along with their associated uncertainties. The framework preserves the fundamental elements of the well-established planning process in the State of Texas, promoting an incremental and stakeholder-driven approach to adding different levels of uncertainty (and risk) into the decision-making environment. The uncertainty in the water planning process is broken down into two primary categories: demand uncertainty and supply uncertainty. Uncertainty in demand is related to the uncertainty in population projections and the per-capita usage rates. Uncertainty in supply, in turn, is dominated by the uncertainty in future climate conditions.
Climate is represented in terms of time series of precipitation, temperature and/or surface evaporation flux for some future time period of interest, which can be obtained as outputs of global climate models (GCMs). These are then linked with hydrologic and water-availability models (WAMs) to estimate water availability for the worst drought conditions under each future climate scenario. Combining the demand scenarios with the water availability scenarios yields multiple scenarios for water shortage (or surplus). Given multiple shortage/surplus scenarios, various water management strategies can be assessed to evaluate the reliability of meeting projected deficits. These reliabilities are then used within a multi-criteria decision-framework to assess trade-offs between various water management objectives, thus helping to make more robust decisions while planning for the water needs of the future.
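The scenario-crossing logic above can be sketched minimally: cross demand and supply scenarios into shortage outcomes, then score a management strategy by the fraction of scenarios it covers. Function names, units, and numbers are hypothetical, not from the study.

```python
def shortage_scenarios(demand_scenarios, supply_scenarios):
    """Cross every demand scenario with every supply scenario;
    positive values are shortages, negative values are surpluses."""
    return [d - s for d in demand_scenarios for s in supply_scenarios]

def strategy_reliability(shortages, added_supply):
    """Fraction of scenarios in which a strategy's added supply
    covers the projected deficit."""
    return sum(1 for s in shortages if s <= added_supply) / len(shortages)

# Hypothetical demand and climate-driven supply scenarios (thousand acre-feet):
deficits = shortage_scenarios([100, 120, 140], [90, 110])
reliability = strategy_reliability(deficits, 30)
```

Reliabilities computed this way feed the multi-criteria trade-off step described in the abstract.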
Facing uncertainty in ecosystem services-based resource management.
Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter
2013-09-01
The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Ambiguity and Uncertainty in Probabilistic Inference.
1983-09-01
Fragments recovered from the scanned report (H. J. Einhorn et al., Center for Decision Research, University of Chicago; report AD-A133 418, September 1983): "...whether one was to judge the likelihood that the majority or minority position was true. In order to sample a wide range of values of n and p, 40..." and "...been demonstrated experimentally (Becker & Brownson, 1964; Yates & Zukowski, 1976). On the other hand, the process by which such second-order uncertainty..."
The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,
2005-01-01
The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability to enable modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation object-oriented reservoir and river-system modeling framework, RiverWare, to develop models to evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses.
Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.
Matching soil grid unit resolutions with polygon unit scales for DNDC modelling of regional SOC pool
NASA Astrophysics Data System (ADS)
Zhang, H. D.; Yu, D. S.; Ni, Y. L.; Zhang, L. M.; Shi, X. Z.
2015-03-01
Matching soil grid unit resolution with polygon unit map scale is important for minimizing the uncertainty of regional soil organic carbon (SOC) pool simulation, as both strongly influence that uncertainty. A series of soil grid units at varying cell sizes was derived from soil polygon units at six map scales: 1:50 000 (C5), 1:200 000 (D2), 1:500 000 (P5), 1:1 000 000 (N1), 1:4 000 000 (N4) and 1:14 000 000 (N14), in the Tai lake region of China. Soil units in both formats were used for regional SOC pool simulation with the process-based DeNitrification-DeComposition (DNDC) model, with runs spanning the period 1982 to 2000 at each of the six map scales. Four indices of surface paddy soils simulated with DNDC, namely soil type number (STN), area (AREA), average SOC density (ASOCD) and total SOC stock (SOCS), were derived from the soil polygon and grid units. Relative to the four index values (IV) from the parent polygon units, the variation of an index value (VIV, %) in the grid units was used to assess dataset accuracy and redundancy, which reflect uncertainty in the simulation of SOC. Optimal soil grid unit resolutions, matching the soil polygon unit map scales, were generated and suggested for DNDC simulation of the regional SOC pool. At the optimal raster resolution, the soil grid unit dataset holds the same accuracy as its parent polygon unit dataset without any redundancy, when VIV < 1% for all four indices is taken as the assessment criterion. A quadratic regression model, y = -8.0 × 10^-6 x^2 + 0.228x + 0.211 (R^2 = 0.9994, p < 0.05), describes the relationship between the optimal soil grid unit resolution (y, km) and the soil polygon unit map scale (1:x). This knowledge may serve grid partitioning of regions for the investigation and simulation of SOC pool dynamics at a given map scale.
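The VIV criterion is straightforward to state in code; a minimal sketch (function names are ours, not from the paper, and the index values are invented):

```python
def viv(grid_value, polygon_value):
    """Variation of an index value (VIV, %) of a grid-unit dataset
    relative to its parent polygon-unit dataset."""
    return abs(grid_value - polygon_value) / polygon_value * 100.0

def resolution_holds_accuracy(grid_indices, polygon_indices, threshold=1.0):
    """True when VIV < threshold (%) for every index (STN, AREA, ASOCD
    and SOCS in the study), i.e. the grid loses no accuracy."""
    return all(viv(g, p) < threshold
               for g, p in zip(grid_indices, polygon_indices))

# Hypothetical index values for one candidate grid resolution:
ok = resolution_holds_accuracy([100.4, 50.2, 7.03, 3.51],
                               [100.0, 50.0, 7.00, 3.50])
```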
Evaluating the uncertainty of input quantities in measurement models
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Elster, Clemens
2014-06-01
The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. 
We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
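As a concrete, much-simplified companion to the methods discussed, the following sketches Monte Carlo propagation of input-quantity distributions through a measurement model, in the spirit of GUM Supplement 1; the model and distribution values are illustrative only.

```python
import random
import statistics

def propagate(model, samplers, n=100_000, seed=1):
    """Draw joint samples of the input quantities, push them through the
    measurement model, and summarise the output distribution."""
    rng = random.Random(seed)
    ys = [model(*(draw(rng) for draw in samplers)) for _ in range(n)]
    return statistics.mean(ys), statistics.stdev(ys)

# Y = X1 * X2 with independent Gaussian input quantities:
y_hat, u_y = propagate(lambda x1, x2: x1 * x2,
                       [lambda r: r.gauss(10.0, 0.1),
                        lambda r: r.gauss(2.0, 0.05)])
```

For this model the first-order GUM formula gives u_y ≈ sqrt((2 × 0.1)² + (10 × 0.05)²) ≈ 0.54, which the sampled estimate should reproduce.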
Defining the measurand in radius of curvature measurements
NASA Astrophysics Data System (ADS)
Davies, Angela; Schmitz, Tony L.
2003-11-01
Traceable radius of curvature measurements are critical for precision optics manufacture. An optical bench measurement of radius is very repeatable and is the preferred method for low-uncertainty applications. On an optical bench, the displacement of the optic is measured as it is moved between the cat's eye and confocal positions, each identified using a figure measuring interferometer. Traceability requires connection to a basic unit (the meter, here) in addition to a defensible uncertainty analysis, and the identification and proper propagation of all uncertainty sources in this measurement is challenging. Recent work has focused on identifying all uncertainty contributions; measurement biases have been approximately taken into account and uncertainties combined in an RSS sense for a final measurement estimate and uncertainty. In this paper we report on a new mathematical definition of the radius measurand, which is a single function that depends on all uncertainty sources, such as error motions, alignment uncertainty, displacement gauge uncertainty, etc. The method is based on a homogeneous transformation matrix (HTM) formalism, and intrinsically defines an unbiased estimate for radius, providing a single mathematical expression for uncertainty propagation through a Taylor-series expansion.
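The single-function measurand idea can be sketched with numerical first-order (Taylor-series) propagation; the toy measurand below stands in for the full HTM-based expression, and its parameters and uncertainty values are illustrative assumptions.

```python
import math

def radius(displacement, tilt_rad, gauge_scale_error):
    """Toy measurand: cat's-eye-to-confocal displacement corrected for a
    tilt misalignment and a displacement-gauge scale error. (The paper's
    HTM formalism folds many more error motions into one such function.)"""
    return displacement * (1.0 + gauge_scale_error) * math.cos(tilt_rad)

def combined_standard_uncertainty(f, x, u, h=1e-6):
    """u_c = sqrt(sum_i (df/dx_i * u_i)^2); partials by central difference."""
    uc2 = 0.0
    for i, ui in enumerate(u):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        uc2 += (((f(*xp) - f(*xm)) / (2.0 * h)) * ui) ** 2
    return math.sqrt(uc2)

u_R = combined_standard_uncertainty(radius,
                                    [100.0, 0.0, 0.0],   # nominal values
                                    [1e-4, 1e-3, 1e-5])  # standard uncertainties
```

At the nominal values the tilt sensitivity vanishes (cosine error is second order), so the gauge scale error dominates u_R.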
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; de Moel, H.
2016-01-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage functions and maximum damages can have large effects on flood damage estimates. This explanation is then used to quantify the uncertainty in the damage estimates with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from seven different flood damage models. The paper shows that the resulting uncertainties in estimated damages are in the order of magnitude of a factor of 2 to 5. The uncertainty is typically larger for flood events with small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
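The Monte Carlo over a damage-function library can be sketched as follows; the two-function "library" and all values are invented for illustration (the paper's library holds 272 functions).

```python
import random

def damage_range(depths_m, damage_functions, max_damages, n=10_000, seed=7):
    """Repeatedly pick a depth-damage function and a maximum damage value
    at random, total the damages, and return the spread of estimates."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        frac = rng.choice(damage_functions)  # damaged fraction vs. depth
        dmax = rng.choice(max_damages)       # maximum damage per asset
        totals.append(sum(frac(d) * dmax for d in depths_m))
    return min(totals), max(totals)

library = [lambda d: min(1.0, 0.3 * d),   # shallow-sensitive curve
           lambda d: min(1.0, 0.5 * d)]   # steeper curve
low, high = damage_range([0.5, 1.0, 2.0], library, [100_000, 250_000])
```

For these invented inputs the high estimate exceeds the low one by roughly a factor of four, in line with the factor of 2 to 5 the paper reports.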
How to find what you don't know: Visualising variability in 3D geological models
NASA Astrophysics Data System (ADS)
Lindsay, Mark; Wellmann, Florian; Jessell, Mark; Ailleres, Laurent
2014-05-01
Uncertainties in input data can have compounding effects on the predictive reliability of three-dimensional (3D) geological models. Resource exploration, tectonic studies and environmental modelling can be compromised by using 3D models that misrepresent the target geology, and drilling campaigns that attempt to intersect particular geological units guided by 3D models are at risk of failure if the exploration geologist is unaware of inherent uncertainties. In addition, the visual inspection of 3D models is often the first contact decision makers have with the geology, thus visually communicating the presence and magnitude of uncertainties contained within geological 3D models is critical. Unless uncertainties are presented early in the relationship between decision maker and model, the model will be considered more truthful than the uncertainties allow with each subsequent viewing. We present a selection of visualisation techniques that provide the viewer with insight into the location and amount of uncertainty contained within a model, and the geological characteristics which are most affected. A model of the Gippsland Basin, southeastern Australia, is used as a case study to demonstrate the concepts of information entropy, stratigraphic variability and geodiversity. Central to the techniques shown here is the creation of a model suite, performed by creating similar (but not identical) versions of the original model through perturbation of the input data. Specifically, structural data in the form of strike and dip measurements are perturbed in the creation of the model suite. The visualisation techniques presented are: (i) information entropy; (ii) stratigraphic variability and (iii) geodiversity. Information entropy is used to analyse uncertainty in a spatial context, combining the empirical probability distributions of multiple outcomes into a single quantitative measure.
Stratigraphic variability displays the number of possible lithologies that may exist at a given point within the model volume. Geodiversity analyses various model characteristics (or 'geodiversity metrics'), including the depth and volume of a unit, the curvature of an interface, the geological complexity of a contact and the contact relationships units have with each other. Principal component analysis, a multivariate statistical technique, is used to simultaneously examine each of the geodiversity metrics to determine the boundaries of model space, and identify which metrics contribute most to model uncertainty. The combination of information entropy, stratigraphic variability and geodiversity analysis provides a descriptive and thorough representation of uncertainty with effective visualisation techniques that clearly communicate the geological uncertainty contained within the geological model.
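Per-voxel information entropy and stratigraphic variability over a perturbed model suite reduce to a few lines; this sketch uses hypothetical lithology labels and a hypothetical 16-member suite.

```python
import math
from collections import Counter

def voxel_entropy(lithologies):
    """Information entropy H = -sum p_i * log2(p_i) of the lithologies
    predicted at one voxel across the model suite."""
    n = len(lithologies)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(lithologies).values())

def stratigraphic_variability(lithologies):
    """Number of distinct lithologies possible at this voxel."""
    return len(set(lithologies))

# A voxel the suite disagrees on vs. one it agrees on:
contested = voxel_entropy(["sandstone"] * 8 + ["shale"] * 8)  # 1 bit
settled = voxel_entropy(["granite"] * 16)                     # 0 bits
```

High-entropy voxels mark where the perturbed input data most affect the model, which is what the visualisations aim to communicate.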
Assessment of aerodynamic performance of V/STOL and STOVL fighter aircraft
NASA Technical Reports Server (NTRS)
Nelms, W. P.
1984-01-01
The aerodynamic performance of V/STOL and STOVL fighter/attack aircraft was assessed. Aerodynamic and propulsion/airframe integration activities are described and small and large scale research programs are considered. Uncertainties affecting aerodynamic performance that are associated with special configuration features resulting from the V/STOL requirement are addressed. Example uncertainties relate to minimum drag, wave drag, high angle of attack characteristics, and power induced effects.
[The metrology of uncertainty: a study of vital statistics from Chile and Brazil].
Carvajal, Yuri; Kottow, Miguel
2012-11-01
This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
Polynomial chaos expansion with random and fuzzy variables
NASA Astrophysics Data System (ADS)
Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.
2016-06-01
A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.
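When the germ is uniform on [-1, 1] and the basis is Legendre, the response moments follow directly from the PCE coefficients via E[P_k²] = 1/(2k+1); a minimal sketch with made-up coefficients (this illustrates the moment post-processing, not the paper's full mixed random/fuzzy framework):

```python
def legendre(k, x):
    """Legendre polynomial P_k(x) via Bonnet's recursion."""
    p_prev, p = 1.0, x
    if k == 0:
        return p_prev
    for n in range(1, k):
        p_prev, p = p, ((2 * n + 1) * x * p - n * p_prev) / (n + 1)
    return p

def pce_moments(coeffs):
    """Mean and variance of a response expanded in Legendre chaos with a
    uniform germ on [-1, 1], using E[P_k^2] = 1/(2k+1)."""
    mean = coeffs[0]
    variance = sum(c * c / (2 * k + 1)
                   for k, c in enumerate(coeffs) if k > 0)
    return mean, variance

m, v = pce_moments([2.0, 1.0, 0.6])  # hypothetical PCE coefficients
```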
The Impact of Uncertain Physical Parameters on HVAC Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Elizondo, Marcelo A.; Lu, Shuai
HVAC units are currently one of the major resources providing demand response (DR) in residential buildings. Models of HVAC with DR function can improve understanding of its impact on power system operations and facilitate the deployment of DR technologies. This paper investigates the importance of various physical parameters and their distributions to the HVAC response to DR signals, which is a key step to the construction of HVAC models for a population of units with insufficient data. These parameters include the size of floors, insulation efficiency, the amount of solid mass in the house, and efficiency of the HVAC units. These parameters are usually assumed to follow Gaussian or Uniform distributions. We study the effect of uncertainty in the chosen parameter distributions on the aggregate HVAC response to DR signals, during transient phase and in steady state. We use a quasi-Monte Carlo sampling method with linear regression and Prony analysis to evaluate sensitivity of DR output to the uncertainty in the distribution parameters. The significance ranking on the uncertainty sources is given for future guidance in the modeling of HVAC demand response.
NASA Astrophysics Data System (ADS)
Lark, R. Murray
2014-05-01
Conventionally, the uncertainty of a soil map has been expressed in terms of the mean purity of its map units: the probability that the soil profile class examined at a site would be found to correspond to the eponymous class of the simple map unit that is delineated there (Burrough et al., 1971). This measure of uncertainty has an intuitive meaning and is used for quality control in soil survey contracts (Western, 1978). However, it may be of limited value to the manager or policy maker who wants to decide whether the map provides a basis for decision making, and whether the cost of producing a better map would be justified. In this study I extend a published analysis of the economic implications of uncertainty in a soil map (Giasson et al., 2000). A decision analysis was developed to assess the economic value of imperfect soil map information for agricultural land use planning. Random error matrices for the soil map units were then generated, subject to constraints which ensure consistency with fixed frequencies of the different soil classes. For each error matrix the mean map unit purity was computed, and the value of the implied imperfect soil information was computed by the decision analysis. An alternative measure of the uncertainty in a soil map was considered: the mean soil map information, which is the difference between the information content of a soil observation at a random location in the region and the information content of a soil observation given that the map unit is known. I examined the relationship between the value of imperfect soil information and the purity and information measures of map uncertainty. In both cases there was considerable variation in the economic value of possible maps with fixed values of the uncertainty measure.
However, the correlation was somewhat stronger with the information measure, and there was a clear upper bound on the value of an imperfect soil map when the mean information takes some particular value. This suggests that the information measure may be a useful one for general communication of the value of soil and similar thematic data. Burrough, P.A., Beckett, P.H.T., Jarvis, M.G., 1971. The relation between cost and utility in soil survey. J. Soil Sci. 22, 359-394. Giasson, E., van Es, C., van Wambeke, A., Bryant, R.B., 2000. Assessing the economic value of soil information using decision analysis techniques. Soil Science 165, 971-978. Western, S., 1978. Soil survey contracts and quality control. Oxford Univ. Press, Oxford.
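The information measure described here is the mutual information between soil class and map unit, I = H(soil) - H(soil | map unit); a sketch computing it from a joint probability table (unit and class labels and probabilities are hypothetical):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mean_map_information(joint):
    """I(soil; unit) = H(soil) - H(soil | unit), from a joint table
    joint[unit][soil_class] of probabilities summing to 1."""
    p_unit = {u: sum(row.values()) for u, row in joint.items()}
    p_soil = {}
    for row in joint.values():
        for s, p in row.items():
            p_soil[s] = p_soil.get(s, 0.0) + p
    h_soil = entropy(p_soil.values())
    h_cond = sum(pu * entropy([p / pu for p in joint[u].values()])
                 for u, pu in p_unit.items())
    return h_soil - h_cond

perfect = mean_map_information({"U1": {"A": 0.5}, "U2": {"B": 0.5}})
useless = mean_map_information({"U1": {"A": 0.25, "B": 0.25},
                                "U2": {"A": 0.25, "B": 0.25}})
```

A perfectly pure map attains I = H(soil), while a map whose units say nothing about soil class gives I = 0, consistent with using the measure as a bound on map value.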
Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes
NASA Astrophysics Data System (ADS)
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.
2017-12-01
Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N
This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.
Aeroservoelastic Uncertainty Model Identification from Flight Data
NASA Technical Reports Server (NTRS)
Brenner, Martin J.
2001-01-01
Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.
NASA Astrophysics Data System (ADS)
Hobbs, J.; Turmon, M.; David, C. H.; Reager, J. T., II; Famiglietti, J. S.
2017-12-01
NASA's Western States Water Mission (WSWM) combines remote sensing of the terrestrial water cycle with hydrological models to provide high-resolution state estimates for multiple variables. The effort includes both land surface and river routing models that are subject to several sources of uncertainty, including errors in the model forcing and model structural uncertainty. Computational and storage constraints prohibit extensive ensemble simulations, so this work outlines efficient but flexible approaches for estimating and reporting uncertainty. Calibrated by remote sensing and in situ data where available, we illustrate the application of these techniques in producing state estimates with associated uncertainties at kilometer-scale resolution for key variables such as soil moisture, groundwater, and streamflow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, K T
A relatively simple, quantitative approach is proposed to address a specific, important gap in the approach recommended by the USEPA Guidelines for Cancer Risk Assessment to address uncertainty in carcinogenic mode of action of certain chemicals when risk is extrapolated from bioassay data. These Guidelines recognize that some chemical carcinogens may have a site-specific mode of action (MOA) that is dual, involving mutation in addition to cell-killing induced hyperplasia. Although genotoxicity may contribute to increased risk at all doses, the Guidelines imply that for dual MOA (DMOA) carcinogens, judgment be used to compare and assess results obtained using separate 'linear' (genotoxic) vs. 'nonlinear' (nongenotoxic) approaches to low-level risk extrapolation. However, the Guidelines allow the latter approach to be used only when evidence is sufficient to parameterize a biologically based model that reliably extrapolates risk to low levels of concern. The Guidelines thus effectively prevent MOA uncertainty from being characterized and addressed when data are insufficient to parameterize such a model, but otherwise clearly support a DMOA. A bounding factor approach - similar to that used in reference dose procedures for classic toxicity endpoints - can address MOA uncertainty in a way that avoids explicit modeling of low-dose risk as a function of administered or internal dose. Even when a 'nonlinear' toxicokinetic model cannot be fully validated, implications of DMOA uncertainty on low-dose risk may be bounded with reasonable confidence when target tumor types happen to be extremely rare. This concept was illustrated for a likely DMOA rodent carcinogen naphthalene, specifically to the issue of risk extrapolation from bioassay data on naphthalene-induced nasal tumors in rats.
Bioassay data, supplemental toxicokinetic data, and related physiologically based pharmacokinetic and 2-stage stochastic carcinogenesis modeling results all clearly indicate that naphthalene is a DMOA carcinogen. Plausibility bounds on rat-tumor-type specific DMOA-related uncertainty were obtained using a 2-stage model adapted to reflect the empirical link between genotoxic and cytotoxic effects of the most potent identified genotoxic naphthalene metabolites, 1,2- and 1,4-naphthoquinone. Bound-specific 'adjustment' factors were then used to reduce naphthalene risk estimated by linear extrapolation (under the default genotoxic MOA assumption), to account for the DMOA exhibited by this compound.
Vaughan, Adam S; Kramer, Michael R; Waller, Lance A; Schieb, Linda J; Greer, Sophia; Casper, Michele
2015-05-01
To demonstrate the implications of choosing analytical methods for quantifying spatiotemporal trends, we compare the assumptions, implementation, and outcomes of popular methods using county-level heart disease mortality in the United States between 1973 and 2010. We applied four regression-based approaches (joinpoint regression, both aspatial and spatial generalized linear mixed models, and Bayesian space-time model) and compared resulting inferences for geographic patterns of local estimates of annual percent change and associated uncertainty. The average local percent change in heart disease mortality from each method was -4.5%, with the Bayesian model having the smallest range of values. The associated uncertainty in percent change differed markedly across the methods, with the Bayesian space-time model producing the narrowest range of variance (0.0-0.8). The geographic pattern of percent change was consistent across methods with smaller declines in the South Central United States and larger declines in the Northeast and Midwest. However, the geographic patterns of uncertainty differed markedly between methods. The similarity of results, including geographic patterns, for magnitude of percent change across these methods validates the underlying spatial pattern of declines in heart disease mortality. However, marked differences in degree of uncertainty indicate that Bayesian modeling offers substantially more precise estimates. Copyright © 2015 Elsevier Inc. All rights reserved.
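The annual percent change (APC) compared across the four methods above is conventionally defined from a log-linear trend in the mortality rate. As a minimal sketch of that definition (not the study's actual code, and assuming a single trend segment rather than a full joinpoint search), one can fit log(rate) = a + b·year by ordinary least squares and report APC = 100·(e^b − 1):

```python
from math import exp, log

def annual_percent_change(years, rates):
    """Fit log(rate) = a + b*year by ordinary least squares and
    return APC = 100 * (exp(b) - 1), the annual percent change
    implied by a log-linear trend. Single-segment sketch only;
    joinpoint models additionally search for trend breakpoints."""
    n = len(years)
    y = [log(r) for r in rates]
    xbar = sum(years) / n
    ybar = sum(y) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (yi - ybar) for x, yi in zip(years, y))
    b = sxy / sxx                     # slope on the log scale
    return 100.0 * (exp(b) - 1.0)
```

For a rate series declining by a constant 4.5% per year, the function recovers an APC of −4.5, matching the average decline the study reports.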
The Years of Uncertainty: Eighth Grade Family Life Education.
ERIC Educational Resources Information Center
Carson, Mary, Ed.; And Others
The family life sex education unit for eighth graders, "The Years of Uncertainty," consists of a series of daily lesson plans that span a 29-day period of one-hour class sessions. Topics covered are: problem solving, knowledge and attitudes, male and female reproductive systems, conception, pregnancy, birth, birth defects, venereal…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-03
... the status quo. The action is expected to maximize the profitability for the spiny dogfish fishery... possible commercial quotas by not making a deduction from the ACL accounting for management uncertainty...) in 2015; however, not accounting for management uncertainty would have increased the risk of...
The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary
ERIC Educational Resources Information Center
Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul
2012-01-01
Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.
Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progressions, as postulated by the limited plant data. This work focused on evaluating uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.
Accounting for biodiversity in the dairy industry.
Sizemore, Grant C
2015-05-15
Biodiversity is an essential part of properly functioning ecosystems, yet the loss of biodiversity currently occurs at rates unparalleled in the modern era. One of the major causes of this phenomenon is habitat loss and modification as a result of intensified agricultural practices. This paper provides a starting point for considering biodiversity within dairy production, and, although focusing primarily on the United States, findings are applicable broadly. Biodiversity definitions and assessments (e.g., indicators, tools) are proposed and reviewed. Although no single indicator or tool currently meets all the needs of comprehensive assessment, many sustainable practices are readily adoptable as ways to conserve and promote biodiversity. These practices, as well as potential funding opportunities are identified. Given the state of uncertainty in addressing the complex nature of biodiversity assessments, the adoption of generally sustainable environmental practices may be the best currently available option for protecting biodiversity on dairy lands. Copyright © 2015 Elsevier Ltd. All rights reserved.
Technology Assessment in Support of the Presidential Vision for Space Exploration
NASA Technical Reports Server (NTRS)
Weisbin, Charles R.; Lincoln, William; Mrozinski, Joe; Hua, Hook; Merida, Sofia; Shelton, Kacie; Adumitroaie, Virgil; Derleth, Jason; Silberg, Robert
2006-01-01
This paper discusses the process and results of technology assessment in support of the United States Vision for Space Exploration of the Moon, Mars and Beyond. The paper begins by reviewing the Presidential Vision: a major endeavor in building systems of systems. It discusses why we wish to return to the Moon, and the exploration architecture for getting there safely, sustaining a presence, and safely returning. Next, a methodology for optimal technology investment is proposed with discussion of inputs including a capability hierarchy, mission importance weightings, available resource profiles as a function of time, likelihoods of development success, and an objective function. A temporal optimization formulation is offered, and the investment recommendations presented along with sensitivity analyses. Key questions addressed are sensitivity of budget allocations to cost uncertainties, reduction in available budget levels, and shifting funding within constraints imposed by mission timeline.
Life cycle assessment of bioenergy systems: state of the art and future challenges.
Cherubini, Francesco; Strømman, Anders Hammer
2011-01-01
The use of different input data, functional units, allocation methods, reference systems and other assumptions complicates comparisons of LCA bioenergy studies. In addition, uncertainties and use of specific local factors for indirect effects (like land-use change and N-based soil emissions) may give rise to wide ranges of final results. In order to investigate how these key issues have been addressed so far, this work performs a review of the recent bioenergy LCA literature. The abundance of studies dealing with the different biomass resources, conversion technologies, products and environmental impact categories is summarized and discussed. Afterwards, a qualitative interpretation of the LCA results is depicted, focusing on energy balance, GHG balance and other impact categories. With the exception of a few studies, most LCAs found a significant net reduction in GHG emissions and fossil energy consumption when bioenergy replaces fossil energy. Copyright © 2010 Elsevier Ltd. All rights reserved.
Alaska Arctic marine fish ecology catalog
Thorsteinson, Lyman K.; Love, Milton S.
2016-08-08
The marine fishes in waters of the United States north of the Bering Strait have received new and increased scientific attention over the past decade (2005–15) in conjunction with frontier qualities of the region and societal concerns about the effects of Arctic climate change. Commercial fisheries are negligible in the Chukchi and Beaufort Seas, but many marine species have important traditional and cultural values to Alaska Native residents. Although baseline conditions are rapidly changing, effective decisions about research and monitoring investments must be based on reliable information and plausible future scenarios. For the first time, this synthesis presents a comprehensive evaluation of the marine fish fauna from both seas in a single reference. Although many unknowns and uncertainties remain in the scientific understanding, information presented here is foundational with respect to understanding marine ecosystems and addressing dual missions of the U.S. Department of the Interior for energy development and resource conservation.
Fuzzy set methods for object recognition in space applications
NASA Technical Reports Server (NTRS)
Keller, James M.
1991-01-01
During the reporting period, the development of the theory and application of methodologies for decision making under uncertainty was addressed. Two subreports are included; the first on properties of general hybrid operators, while the second considers some new research on generalized threshold logic units. In the first part, the properties of the additive gamma-model, where the intersection part is first considered to be the product of the input values and the union part is obtained by an extension of De Morgan's law to fuzzy sets, is explored. Then the Yager's class of union and intersection is used in the additive gamma-model. The inputs are weighted to some power that represents their importance and thus their contribution to the compensation process. In the second part, the extension of binary logic synthesis methods to multiple valued logic synthesis methods to enable the synthesis of decision networks when the input/output variables are not binary is discussed.
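The additive gamma-model described above blends a fuzzy intersection (the product of the inputs) with its De Morgan-dual union, weighting inputs by importance exponents. The sketch below follows the Zimmermann-Zysno style compensatory operator; the exact weighting scheme and function name are assumptions for illustration, not the report's formulation:

```python
def gamma_union_intersection(memberships, gamma, weights=None):
    """Additive gamma-model (hybrid union/intersection) sketch.

    intersection part: product of memberships (weighted by exponents)
    union part:        De Morgan dual, 1 - prod(1 - mu_i)
    output:            (1 - gamma) * intersection + gamma * union

    gamma in [0, 1] controls compensation: 0 = pure intersection,
    1 = pure union. Importance weights as exponents are an assumed
    convention, matching the report's description only loosely.
    """
    if weights is None:
        weights = [1.0] * len(memberships)
    inter = 1.0
    union_complement = 1.0
    for mu, w in zip(memberships, weights):
        inter *= mu ** w
        union_complement *= (1.0 - mu) ** w
    union = 1.0 - union_complement
    return (1.0 - gamma) * inter + gamma * union
```

With inputs 0.5 and 0.8, gamma = 0 yields the product 0.4, gamma = 1 yields the De Morgan union 0.9, and intermediate gamma interpolates between the two.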
Advance directives in the UK: legal, ethical, and practical considerations for doctors.
Kessel, A S; Meran, J
1998-05-01
In the United Kingdom (UK), advance directives have recently received considerable attention from professional and voluntary organizations as well as medical journals and the media. However, despite such exposure, many doctors remain uncertain of the importance or relevance of advance directives with regard to their own clinical practice. This paper addresses these uncertainties by first explaining what advance directives are and then describing the current legal status of such directives in the UK. Examination of the cases underpinning this status reveals several key elements: competence, information, anticipation, applicability, and freedom from duress. Each is discussed. Although this paper focuses on legal issues, it is important that medical law does not dominate medical ethics. Accordingly, the paper also discusses some important philosophical and sociological considerations that have remained largely unexplored in the medical press. Finally, the paper deals with practical matters, including how the general practitioner might be involved.
Marshfield Clinic, physician networks, and the exercise of monopoly power.
Greenberg, W
1998-01-01
OBJECTIVE: Antitrust enforcement can improve the performance of large, vertically integrated physician-hospital organizations (PHOs). This study examines the recent court decisions in the Blue Cross and Blue Shield United of Wisconsin v. Marshfield Clinic antitrust case to better understand the benefits and costs of vertical integration in healthcare. SUMMARY AND CONCLUSIONS: Vertical integration in the Marshfield Clinic may have had the benefits of reducing transactions and uncertainty costs while improving the coordination between ambulatory and inpatient visits, but at the cost of Marshfield Clinic's monopolizing of physician services and foreclosing of HMO entry in northwest Wisconsin. The denial of hospital staff privileges to non-Marshfield Clinic physicians combined with certificate-of-need regulations impeded physician entry and solidified Marshfield Clinic's monopoly position. Enforcement efforts of recent antitrust guidelines by the U.S. Department of Justice and the Federal Trade Commission will need to address carefully the benefits and costs of vertically integrated systems. PMID:9865229
Education, Training, and Mentorship of Caregivers of Canadians Experiencing a Life-Limiting Illness.
Williams, Allison M
2018-01-01
Research suggests that caregiver preparedness is essential to minimizing the negative impacts of caregiving. Not being prepared is associated with fear, anxiety, stress, and feelings of insufficiency/uncertainty specific to the caregiver role. To determine what resources are required to ensure adequate education, training, and mentorship for caregivers of Canadians experiencing a life-limiting illness. Informed by the Ipsos Reid survey, the methods for this article involved a rapid literature review that addressed caregiver experiences, needs and issues as they related to health, quality of life, and well-being. Given the burden of care, caregiver education, training, and mentorship are suggested to be best met through the palliative navigator model, wherein the patient-caregiver dyad is recognized as an integrated unit of care. The palliative navigator approach is a key role in the education, training, and mentorship of caregivers.
40 CFR 60.2991 - What incineration units must I address in my State plan?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false What incineration units must I address... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9, 2004 Applicability of State Plans § 60.2991 What incineration units must I address in my State...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies.
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
Extrapolation, uncertainty factors, and the precautionary principle.
Steel, Daniel
2011-09-01
This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. Copyright © 2011 Elsevier Ltd. All rights reserved.
A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty
Friedel, Michael J.
2011-01-01
This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
A multi-fidelity analysis selection method using a constrained discrete optimization formulation
NASA Astrophysics Data System (ADS)
Stults, Ian C.
The purpose of this research is to develop a method for selecting the fidelity of contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies. When it is considered, it is done so in only a limited fashion, and therefore brings the validity of selections made based on these results into question. Neglecting model uncertainty can potentially cause costly redesigns of concepts later in the design process or can even cause program cancellation. Rather than neglecting it, if one were to instead not only realize the model uncertainty in tools being used but also use this information to select the tools for a contributing analysis, studies could be conducted more efficiently and trust in results could be quantified. Methods for performing this are generally not rigorous or traceable, and in many cases the improvement and additional time spent performing enhanced calculations are washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method which will minimize the amount of time spent conducting computer simulations while meeting accuracy and concept resolution requirements for results. In many conceptual design programs, only limited data is available for quantifying model uncertainty. Because of this data sparsity, traditional probabilistic means for quantifying uncertainty should be reconsidered. This research proposes to instead quantify model uncertainty using an evidence theory formulation (also referred to as Dempster-Shafer theory) in lieu of the traditional probabilistic approach. Specific weaknesses in using evidence theory for quantifying model uncertainty are identified and addressed for the purposes of the Fidelity Selection Problem. A series of experiments was conducted to address these weaknesses using n-dimensional optimization test functions. 
These experiments found that model uncertainty present in analyses with 4 or fewer input variables could be effectively quantified using a strategic distribution creation method; if more than 4 input variables exist, a Frontier Finding Particle Swarm Optimization should instead be used. Once model uncertainty in contributing analysis code choices has been quantified, a selection method is required to determine which of these choices should be used in simulations. Because much of the selection done for engineering problems is driven by the physics of the problem, these are poor candidate problems for testing the true fitness of a candidate selection method. Specifically, moderate- and high-dimensional problems' variability can often be reduced to only a few dimensions, and scalability often cannot be easily addressed. For these reasons a simple academic function was created for the uncertainty quantification, and a canonical form of the Fidelity Selection Problem (FSP) was created. Fifteen best- and worst-case scenarios were identified in an effort to challenge the candidate selection methods both with respect to the characteristics of the tradeoff between time cost and model uncertainty and with respect to the stringency of the constraints and problem dimensionality. The results from this experiment show that a Genetic Algorithm (GA) was able to consistently find the correct answer, but under certain circumstances, a discrete form of Particle Swarm Optimization (PSO) was able to find the correct answer more quickly. To better illustrate how the uncertainty quantification and discrete optimization might be conducted for a "real world" problem, an illustrative example was conducted using gas turbine engines.
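The evidence theory (Dempster-Shafer) formulation proposed above represents model uncertainty as basic mass assignments over a frame of discernment and combines evidence sources with Dempster's rule. As a minimal, self-contained sketch of that machinery (not the thesis's actual implementation), focal elements can be represented as frozensets:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments
    over a shared frame of discernment. Keys are frozensets (focal
    elements); values are masses summing to 1 for each source."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb        # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - conflict              # renormalize away the conflict
    return {s: w / norm for s, w in combined.items()}

def belief(m, hypothesis):
    """Belief = total mass committed to subsets of the hypothesis."""
    return sum(w for s, w in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Plausibility = total mass not contradicting the hypothesis."""
    return sum(w for s, w in m.items() if s & hypothesis)
```

The gap between belief and plausibility for a given fidelity choice is what makes evidence theory attractive under data sparsity: unlike a single probability, it leaves the unassigned mass explicitly uncommitted.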
Hayes, Margaret M; Chatterjee, Souvik; Schwartzstein, Richard M
2017-04-01
Critical thinking, the capacity to be deliberate about thinking, is increasingly the focus of undergraduate medical education, but is not commonly addressed in graduate medical education. Without critical thinking, physicians, and particularly residents, are prone to cognitive errors, which can lead to diagnostic errors, especially in a high-stakes environment such as the intensive care unit. Although challenging, critical thinking skills can be taught. At this time, there is a paucity of data to support an educational gold standard for teaching critical thinking, but we believe that five strategies, rooted in cognitive theory and our personal teaching experiences, provide an effective framework to teach critical thinking in the intensive care unit. The five strategies are: make the thinking process explicit by helping learners understand that the brain uses two cognitive processes: type 1, an intuitive pattern-recognizing process, and type 2, an analytic process; discuss cognitive biases, such as premature closure, and teach residents to minimize biases by expressing uncertainty and keeping differentials broad; model and teach inductive reasoning by utilizing concept and mechanism maps and explicitly teach how this reasoning differs from the more commonly used hypothetico-deductive reasoning; use questions to stimulate critical thinking: "how" or "why" questions can be used to coach trainees and to uncover their thought processes; and assess and provide feedback on learner's critical thinking. We believe these five strategies provide practical approaches for teaching critical thinking in the intensive care unit.
Uncertainty in Climate Change Research: An Integrated Approach
NASA Astrophysics Data System (ADS)
Mearns, L.
2017-12-01
Uncertainty has been a major theme in research regarding climate change from virtually the very beginning. And appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was the future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But, of course, we know there are important uncertainties in all aspects of climate change research, not just that of the climate system and emissions. And as climate change research has become more important and of pragmatic concern as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued. This includes uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we likely will underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. And as part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy, but rather false certainty is the true danger.
MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Dyk, J; Palta, J; Bortfeld, T
2014-06-15
Medical physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice?” AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process, including uncertainty determination for each step and uncertainty propagation for the total process; (2) to consider aspects of robust optimization, which optimizes treatment plans while protecting them against uncertainties; and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.
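The uncertainty-propagation step for the total process mentioned above is commonly done by combining independent per-step standard uncertainties in quadrature (root-sum-square). A minimal sketch, with invented component values since the symposium abstract gives no numbers:

```python
import math

# Hypothetical standard uncertainties (percent) for steps of a treatment
# chain; the step names and values are illustrative, not from the abstract.
steps = {
    "dose calibration": 1.5,
    "treatment planning": 2.0,
    "patient setup": 2.5,
    "machine delivery": 1.0,
}

# Root-sum-square combination, valid for independent components
combined = math.sqrt(sum(u**2 for u in steps.values()))
# An expanded uncertainty would multiply `combined` by a coverage factor k
```

This is standard GUM-style propagation, not the symposium's specific method; correlated steps would require covariance terms.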
The Shape of Ecosystem Management to Come: Anticipating Risks and Fostering Resilience
Seidl, Rupert
2014-01-01
Global change is increasingly challenging the sustainable provisioning of ecosystem services to society. Addressing future uncertainty and risk has therefore become a central problem of ecosystem management. With risk management and resilience-based stewardship, two contrasting approaches have been proposed to address this issue. Whereas one is concentrated on anticipating and mitigating risks, the other is focused on fostering the ability to absorb perturbations and maintain desired properties. While they have hitherto been discussed largely separately in the literature, I here propose a unifying framework of anticipating risks and fostering resilience in ecosystem management. Anticipatory action is advocated when the predictability of risk is high and sufficient knowledge to address it is available. Conversely, in situations in which predictability and knowledge are limited, resilience-based measures are paramount. I conclude that, by adopting a purposeful combination of insights from risk and resilience research, we can make ecosystem services provisioning more robust to future uncertainty and change. PMID:25729079
NASA Astrophysics Data System (ADS)
Chou, Shuo-Ju
2011-12-01
In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. However, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements can only provide a qualitative and nondescript estimation of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed.
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties, and their subsequent impacts on capability, budget, and schedule requirements, concluded that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost would allow the probabilities of meeting specific requirement constraints to be established. These probability-of-requirements-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties.
In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring Carrier-based Suppression of Enemy Air Defenses (SEAD) using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current state and the desired state.
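The core idea of coupling Monte Carlo sampling with a parametric cost model to estimate a probability of requirements success can be sketched as follows. The cost model, distribution, and budget figure are invented placeholders, not values from the ENTERPRISE study:

```python
import random

def requirement_success_probability(sample_cost, budget, n_trials=10_000, seed=42):
    """Estimate the probability that a sampled program cost meets the budget.

    sample_cost: a function returning one Monte Carlo draw of total cost.
    budget: the cost constraint (a notional figure, for illustration only).
    """
    rng = random.Random(seed)
    successes = sum(1 for _ in range(n_trials) if sample_cost(rng) <= budget)
    return successes / n_trials

def sample_cost(rng):
    """Hypothetical technology-cost model: a baseline cost plus an uncertain
    maturation overrun drawn from a triangular distribution."""
    baseline = 100.0                           # notional cost units
    overrun = rng.triangular(0.0, 60.0, 20.0)  # (low, high, mode)
    return baseline + overrun

# Probability-of-requirements-success metric for a notional budget of 140
p = requirement_success_probability(sample_cost, budget=140.0)
```

The same pattern extends to schedule or capability requirements by swapping in the relevant parametric model and constraint.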
Al, Maiwenn J; Feenstra, Talitha L; Hout, Ben A van
2005-07-01
This paper addresses the problem of how to value health care programmes with different ratios of costs to effects, specifically when taking into account that these costs and effects are uncertain. First, the traditional framework of maximising health effects with a given health care budget is extended to a flexible budget using a value function over money and health effects. Second, uncertainty surrounding costs and effects is included in the model using expected utility. Other approaches to uncertainty that do not specify a utility function are discussed and it is argued that these also include implicit notions about risk attitude.
Introduction to the Special Issue on Climate Ethics: Uncertainty, Values and Policy.
Roeser, Sabine
2017-10-01
Climate change is a pressing phenomenon with huge potential ethical, legal and social policy implications. Climate change gives rise to intricate moral and policy issues as it involves contested science, uncertainty and risk. In order to come to scientifically and morally justified, as well as feasible, policies, targeting climate change requires an interdisciplinary approach. This special issue will identify the main challenges that climate change poses from social, economic, methodological and ethical perspectives by focusing on the complex interrelations between uncertainty, values and policy in this context. This special issue brings together scholars from economics, social sciences and philosophy in order to address these challenges.
NASA Technical Reports Server (NTRS)
Thome, K.
2016-01-01
Knowledge of uncertainties and errors is essential for comparisons of remote sensing data across time, space, and spectral domains. Vicarious radiometric calibration is used to demonstrate the need for uncertainty knowledge and to provide an example error budget. The sample error budget serves as an example of the questions and issues that need to be addressed by the calibration/validation community, as accuracy requirements for imaging spectroscopy data will continue to become more stringent in the future. Error budgets will also be critical to ensure consistency between the range of imaging spectrometers expected to be launched in the next five years.
Adaptive quaternion tracking with nonlinear extended state observer
NASA Astrophysics Data System (ADS)
Bai, Yu-liang; Wang, Xiao-gang; Xu, Jiang-tao; Cui, Nai-gang
2017-10-01
This paper addresses the problem of attitude tracking for spacecraft in the presence of uncertainties in moments of inertia and environmental disturbances. An adaptive quaternion tracking control is combined with a nonlinear extended state observer, and the disturbances are compensated for in each sampling period. The tracking controller is proved to asymptotically track a prescribed motion in the presence of these uncertainties. Simulations of a nano-spacecraft demonstrate a significant improvement in pointing accuracy and tracking error when compared to a conventional attitude controller. The proposed tracking control is completely deterministic, simple to implement, does not require knowledge of the uncertainties, and does not suffer from chattering.
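The extended-state-observer idea can be illustrated on a much simpler plant than the paper's quaternion attitude dynamics. This is a linear, scalar sketch (my own, not the authors' observer): a double integrator with an unknown constant disturbance d, where the observer's augmented state z3 converges to d, which a controller could then cancel each sampling period.

```python
# Linear ESO sketch for x1' = x2, x2' = u + d, measurement y = x1.
# The bandwidth parameterization (gains 3w, 3w^2, w^3) is a common choice;
# all numbers here are assumptions for illustration.
dt, omega = 0.001, 20.0
b1, b2, b3 = 3 * omega, 3 * omega**2, omega**3

x1 = x2 = 0.0          # true plant state
z1 = z2 = z3 = 0.0     # observer state (z3 estimates the disturbance)
d, u = 2.0, 0.0        # unknown constant disturbance, control input

for _ in range(5000):  # simulate 5 seconds
    e = x1 - z1        # innovation from the measurement y = x1
    # observer update (forward Euler)
    z1, z2, z3 = (z1 + dt * (z2 + b1 * e),
                  z2 + dt * (z3 + b2 * e + u),
                  z3 + dt * (b3 * e))
    # plant update (forward Euler)
    x1, x2 = x1 + dt * x2, x2 + dt * (u + d)
# z3 now tracks the lumped disturbance d
```

The observer error dynamics here have a triple eigenvalue at -omega, so after 5 s (100 time constants) z3 has effectively converged to d.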
Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL
NASA Astrophysics Data System (ADS)
Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy
2016-04-01
Terrestrial and marine ecosystems provide essential goods and services to human societies. In the last decades, however, anthropogenic pressures have caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project which focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and the emerging approaches in Macrosystem Ecology and Earth Critical Zone studies, addressing long-term and large-scale environmental and ecological challenges. The project started its activities in 2015 by defining a set of storylines which allow us to tackle some of the most crucial issues in the assessment of present conditions and the estimation of the future state of selected ecosystem services. In this contribution, we focus on some of the main storylines of the project and discuss the general approach, focusing on the interplay of data and models and on the estimation of projection uncertainties.
Robustness of Feedback Systems with Several Modelling Errors
1990-06-01
This report addresses the robustness of feedback systems with several sources of modelling uncertainty. We assume that each source of uncertainty is modelled as a stable unstructured...
NASA Astrophysics Data System (ADS)
Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen
2013-04-01
Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in (i) time and (ii) space, and furthermore time itself is a variable that needs to be reconstructed, which (iii) introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and an associated time series. Links between these nodes are assigned if the time series are significantly similar. The base of the paleoclimate network is therefore formed by linear and nonlinear estimators for Pearson correlation, mutual information and event synchronization, which quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.
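The link-assignment idea can be sketched as follows. The published framework uses similarity estimators built for irregularly sampled series plus significance testing; this simplified sketch assumes the series were first interpolated onto a common time axis and uses a fixed correlation threshold instead of a significance test:

```python
import math
from itertools import combinations

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def build_network(series, threshold=0.7):
    """Link two archive nodes when their co-sampled series are
    sufficiently correlated (threshold is an illustrative stand-in
    for a proper significance test)."""
    links = set()
    for i, j in combinations(series.keys(), 2):
        if abs(pearson(series[i], series[j])) >= threshold:
            links.add((i, j))
    return links

# Toy "archives": B tracks A closely, C is unrelated to both
records = {
    "A": [1.0, 2.0, 3.0, 4.0, 5.0],
    "B": [1.1, 2.1, 2.9, 4.2, 5.1],
    "C": [5.0, 1.0, 4.0, 2.0, 3.0],
}
net = build_network(records)
```

Age-model uncertainty would enter by repeating this construction over an ensemble of plausible chronologies and keeping link frequencies rather than a single network.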
Dettmer, Jan; Dosso, Stan E
2012-10-01
This paper develops a trans-dimensional approach to matched-field geoacoustic inversion, including interacting Markov chains to improve efficiency and an autoregressive model to account for correlated errors. The trans-dimensional approach and hierarchical seabed model allow inversion without assuming any particular parametrization by relaxing model specification to a range of plausible seabed models (e.g., in this case, the number of sediment layers is an unknown parameter). Data errors are addressed by sampling statistical error-distribution parameters, including correlated errors (covariance), by applying a hierarchical autoregressive error model. The well-known difficulty of low acceptance rates for trans-dimensional jumps is addressed with interacting Markov chains, resulting in a substantial increase in efficiency. The trans-dimensional seabed model and the hierarchical error model relax the degree of prior assumptions required in the inversion, resulting in substantially improved (more realistic) uncertainty estimates and a more automated algorithm. In particular, the approach gives seabed parameter uncertainty estimates that account for uncertainty due to prior model choice (layering and data error statistics). The approach is applied to data measured on a vertical array in the Mediterranean Sea.
Dawson, Gretchen; Madsen, Lydia T; Dains, Joyce E
2016-12-01
Fear of cancer recurrence (FCR) is one of the largest unmet needs in the breast cancer survivor population, and this review addresses that need. The purpose of this article is to better understand potential interventions to manage FCR when caring for breast cancer survivors. Databases used were PubMed, CINAHL®, Google Scholar, EMBASE, and Scopus. Included were articles published in English from 2009-2014 that involved female breast cancer survivors and interventions that address FCR as an endpoint or outcome measure or that objectively illustrate an improvement in FCR. One hundred ninety-eight articles were initially identified in this literature review search. Upon detailed review of content for relevance, seven articles met the criteria for inclusion. This literature review provides current evidence of published interventions to manage uncertainty in the female breast cancer survivor population, as well as future research recommendations. Interventions surrounding mindfulness, managing uncertainty, more effective patient-provider communication, and handling stress through counseling are options for managing FCR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denman, Matthew R.; Brooks, Dusty Marie
Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) of the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure), and in doing so assess the applicability of traditional sensitivity analysis techniques.
Wohlers, Anton E
2010-09-01
This paper examines whether national differences in political culture add an explanatory dimension to the formulation of policy in the area of biotechnology, especially with respect to genetically modified food. The analysis links the formulation of protective regulatory policies governing genetically modified food to both country and region-specific differences in uncertainty tolerance levels and risk perceptions in the United States, Canada, and European Union. Based on polling data and document analysis, the findings illustrate that these differences matter. Following a mostly opportunistic risk perception within an environment of high tolerance for uncertainty, policymakers in the United States and Canada modified existing regulatory frameworks that govern genetically modified food in their respective countries. In contrast, the mostly cautious perception of new food technologies and low tolerance for uncertainty among European Union member states has contributed to the creation of elaborate and stringent regulatory policies governing genetically modified food.
NASA Astrophysics Data System (ADS)
Dykema, John A.; Anderson, James G.
2006-06-01
A methodology to achieve spectral thermal radiance measurements from space with demonstrable on-orbit traceability to the International System of Units (SI) is described. This technique results in measurements of infrared spectral radiance R(ν̃), with spectral index ν̃ in cm^-1, with a relative combined uncertainty u_c[R(ν̃)] of 0.0015 (k = 1) for the average mid-infrared radiance emitted by the Earth. This combined uncertainty, expressed in brightness temperature units, is equivalent to ±0.1 K at 250 K at 750 cm^-1. This measurement goal is achieved by utilizing a new method for infrared scale realization combined with an instrument design optimized to minimize component uncertainties and admit tests of radiometric performance. The SI traceability of the instrument scale is established by evaluation against source-based and detector-based infrared scales in defined laboratory protocols before launch. A novel strategy is executed to ensure fidelity of on-orbit calibration to the pre-launch scale. This strategy for on-orbit validation relies on the overdetermination of instrument calibration. The pre-launch calibration against scales derived from physically independent paths to the base SI units provides the foundation for a critical analysis of the overdetermined on-orbit calibration to establish an SI-traceable estimate of the combined measurement uncertainty. Redundant calibration sources and built-in diagnostic tests to assess component measurement uncertainties verify the SI traceability of the instrument calibration over the mission lifetime. This measurement strategy can be realized by a practical instrument, a prototype Fourier-transform spectrometer under development for deployment on a small satellite. The measurement record resulting from the methodology described here meets the observational requirements for climate monitoring and climate model testing and improvement.
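The quoted equivalence between a 0.0015 relative radiance uncertainty and roughly ±0.1 K at 250 K and 750 cm^-1 can be checked from the Planck function. A sketch of that check (my own, not the authors' analysis):

```python
import math

C2 = 1.4387769  # second radiation constant, cm*K

def dT_from_relative_radiance(nu, T, rel_u):
    """Brightness-temperature equivalent of a relative radiance uncertainty,
    via the analytic derivative of the Planck function in wavenumber form.
    The leading Planck constant cancels in this relative-sensitivity ratio."""
    x = C2 * nu / T
    # (1/B) dB/dT = (C2*nu/T^2) * e^x / (e^x - 1)
    rel_sensitivity = (C2 * nu / T**2) * math.exp(x) / math.expm1(x)
    return rel_u / rel_sensitivity

# Relative combined uncertainty 0.0015 at 750 cm^-1 and 250 K
dT = dT_from_relative_radiance(nu=750.0, T=250.0, rel_u=0.0015)
```

This yields a little under 0.1 K, consistent with the abstract's statement.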
Final Technical Report: Distributed Controls for High Penetrations of Renewables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrne, Raymond H.; Neely, Jason C.; Rashkin, Lee J.
2015-12-01
The goal of this effort was to apply four potential control analysis/design approaches to the design of distributed grid control systems to address the impact of latency and communications uncertainty with high penetrations of photovoltaic (PV) generation. The four techniques considered were: optimal fixed structure control; the Nyquist stability criterion; vector Lyapunov analysis; and Hamiltonian design methods. A reduced order model of the Western Electricity Coordinating Council (WECC) developed for the Matlab Power Systems Toolbox (PST) was employed for the study, as well as representative smaller systems (e.g., a two-area, three-area, and four-area power system). Excellent results were obtained with the optimal fixed structure approach, and the methodology we developed was published in a journal article. This approach is promising because it offers a method for designing optimal control systems with the feedback signals available from Phasor Measurement Unit (PMU) data, as opposed to full state feedback or the design of an observer. The Nyquist approach inherently handles time delay and incorporates performance guarantees (e.g., gain and phase margin). We developed a technique that works for moderate sized systems, but the approach does not scale well to extremely large systems because of computational complexity. The vector Lyapunov approach was applied to a two-area model to demonstrate its utility for modeling communications uncertainty. Application to large power systems requires a method to automatically expand/contract the state space and partition the system so that communications uncertainty can be considered. The Hamiltonian Surface Shaping and Power Flow Control (HSSPFC) design methodology was selected to investigate grid systems for energy storage requirements to support high penetration of variable or stochastic generation (such as wind and PV) and loads. This method was applied to several small system models.
NASA Astrophysics Data System (ADS)
Ayub, R.; Obenour, D. R.; Keyworth, A. J.; Genereux, D. P.; Mahinthakumar, K.
2016-12-01
Groundwater contamination by nutrients (nitrogen and phosphorus) is a major concern in water table aquifers that underlie agricultural areas in the mid-Atlantic Coastal Plain of the United States. High nutrient concentrations leaching into shallow groundwater can lead to human health problems and eutrophication of receiving surface waters. Liquid manure from concentrated animal feeding operations (CAFOs) stored in open-air lagoons and applied to spray fields can be a significant source of nutrients to groundwater, along with septic waste. In this study, we developed a model-based methodology for source apportionment and vulnerability assessment using sparse groundwater quality sampling measurements for Duplin County, North Carolina (NC), obtained by the NC Department of Environmental Quality (NC DEQ). This model provides information relevant to management by estimating the nutrient transport through the aquifer from different sources and addressing the uncertainty of nutrient contaminant propagation. First, the zones of influence (dependent on nutrient pathways) for individual groundwater monitoring wells were identified using a two-dimensional vertically averaged groundwater flow and transport model incorporating geologic uncertainty for the surficial aquifer system. A multiple linear regression approach is then applied to estimate the contribution weights for different nutrient source types using the nutrient measurements from monitoring wells and the potential sources within each zone of influence. Using the source contribution weights and their uncertainty, a probabilistic vulnerability assessment of the study area due to nutrient contamination is performed. Knowledge of the contribution of different nutrient sources to contamination at receptor locations (e.g., private wells, municipal wells, stream beds etc.) will be helpful in planning and implementation of appropriate mitigation measures.
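The multiple-linear-regression step for estimating source contribution weights can be sketched with synthetic data; the source categories and all numbers below are invented for illustration, not Duplin County values:

```python
import numpy as np

# Each row: counts of potential nutrient sources (e.g., waste lagoons,
# spray fields, septic systems) inside one monitoring well's zone of
# influence; synthetic values for illustration only.
sources = np.array([
    [2, 1, 0],
    [0, 3, 1],
    [1, 0, 2],
    [3, 2, 1],
    [0, 1, 3],
], dtype=float)

# Synthetic "true" per-source contributions used to generate noise-free
# measurements, so the regression should recover them exactly.
true_weights = np.array([4.0, 2.0, 1.0])
measured = sources @ true_weights

# Least-squares estimate of the contribution weight of each source type
weights, residuals, rank, sv = np.linalg.lstsq(sources, measured, rcond=None)
```

With real, noisy data the weight uncertainties (e.g., from the residual covariance or a Bayesian fit) would then feed the probabilistic vulnerability assessment the abstract describes.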
NASA Astrophysics Data System (ADS)
Raseman, W. J.; Kasprzyk, J. R.; Rosario-Ortiz, F.; Summers, R. S.; Stewart, J.; Livneh, B.
2016-12-01
To promote public health, the United States Environmental Protection Agency (US EPA), and similar entities around the world enact strict laws to regulate drinking water quality. These laws, such as the Stage 1 and 2 Disinfectants and Disinfection Byproducts (D/DBP) Rules, come at a cost to water treatment plants (WTPs) which must alter their operations and designs to meet more stringent standards and the regulation of new contaminants of concern. Moreover, external factors such as changing influent water quality due to climate extremes and climate change, may force WTPs to adapt their treatment methods. To grapple with these issues, decision support systems (DSSs) have been developed to aid WTP operation and planning. However, there is a critical need to better address long-term decision making for WTPs. In this poster, we propose a DSS framework for WTPs for long-term planning, which improves upon the current treatment of deep uncertainties within the overall potable water system including the impact of climate on influent water quality and uncertainties in treatment process efficiencies. We present preliminary results exploring how a multi-objective evolutionary algorithm (MOEA) search can be coupled with models of WTP processes to identify high-performing plans for their design and operation. This coupled simulation-optimization technique uses Borg MOEA, an auto-adaptive algorithm, and the Water Treatment Plant Model, a simulation model developed by the US EPA to assist in creating the D/DBP Rules. Additionally, Monte Carlo sampling methods were used to study the impact of uncertainty of influent water quality on WTP decision-making and generate plans for robust WTP performance.
Musella, Vincenzo; Rinaldi, Laura; Lagazio, Corrado; Cringoli, Giuseppe; Biggeri, Annibale; Catelan, Dolores
2014-09-15
Model-based geostatistics and Bayesian approaches are appropriate in the context of veterinary epidemiology when point data have been collected by valid study designs. The aim is to predict a continuous infection risk surface. Little work has been done on the use of predictive infection probabilities at the farm unit level. In this paper we show how to use predictive infection probability and related uncertainty from a Bayesian kriging model to draw informative samples from the 8794 geo-referenced sheep farms of the Campania region (southern Italy). Parasitological data come from a first cross-sectional survey carried out to study the spatial distribution of selected helminths in sheep farms. A grid sampling was performed to select the farms for coprological examinations. Faecal samples were collected for 121 sheep farms and the presence of 21 different helminths was investigated using the FLOTAC technique. The 21 responses are very different in terms of geographical distribution and prevalence of infection. The observed prevalence ranges from 0.83% to 96.69%. The distributions of the posterior predictive probabilities for all the 21 parasites are very heterogeneous. We show how the results of the Bayesian kriging model can be used to plan a second wave survey. Several alternatives can be chosen depending on the purposes of the second survey: weight by the posterior predictive probabilities, by their uncertainty, or by a combination of both. The proposed Bayesian kriging model is simple, and the proposed sampling strategy represents a useful tool to target infection control treatments and surveillance campaigns. It is easily extendable to other fields of research. Copyright © 2014 Elsevier B.V. All rights reserved.
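The alternative weighting schemes for the second-wave survey can be sketched as follows; the farm identifiers, probabilities, and uncertainties are illustrative placeholders, not the paper's data:

```python
import random

def second_wave_sample(farms, probs, uncertainties, k, mode="prob", seed=1):
    """Draw a follow-up sample of k farms, weighted by predicted infection
    probability, by predictive uncertainty, or by their product."""
    if mode == "prob":
        w = list(probs)
    elif mode == "uncertainty":
        w = list(uncertainties)
    else:  # combine both criteria
        w = [p * u for p, u in zip(probs, uncertainties)]
    rng = random.Random(seed)
    # Weighted sampling without replacement via sequential weighted draws
    chosen, pool = [], list(farms)
    for _ in range(k):
        i = rng.choices(range(len(pool)), weights=w, k=1)[0]
        chosen.append(pool.pop(i))
        w.pop(i)
    return chosen

farms = ["F1", "F2", "F3", "F4"]
probs = [0.9, 0.1, 0.6, 0.2]        # posterior predictive infection probability
uncerts = [0.05, 0.30, 0.25, 0.10]  # e.g., posterior standard deviation
picked = second_wave_sample(farms, probs, uncerts, k=2, mode="combined")
```

Weighting by probability targets likely-infected farms for treatment; weighting by uncertainty targets farms where a new sample is most informative.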
USGS Polar Temperature Logging System, Description and Measurement Uncertainties
Clow, Gary D.
2008-01-01
This paper provides an updated technical description of the USGS Polar Temperature Logging System (PTLS) and a complete assessment of the measurement uncertainties. This measurement system is used to acquire subsurface temperature data for climate-change detection in the polar regions and for reconstructing past climate changes using the 'borehole paleothermometry' inverse method. Specifically designed for polar conditions, the PTLS can measure temperatures as low as -60 degrees Celsius with a sensitivity ranging from 0.02 to 0.19 millikelvin (mK). A modular design allows the PTLS to reach depths as great as 4.5 kilometers with a skid-mounted winch unit or 650 meters with a small helicopter-transportable unit. The standard uncertainty (uT) of the ITS-90 temperature measurements obtained with the current PTLS range from 3.0 mK at -60 degrees Celsius to 3.3 mK at 0 degrees Celsius. Relative temperature measurements used for borehole paleothermometry have a standard uncertainty (urT) whose upper limit ranges from 1.6 mK at -60 degrees Celsius to 2.0 mK at 0 degrees Celsius. The uncertainty of a temperature sensor's depth during a log depends on specific borehole conditions and the temperature near the winch and thus must be treated on a case-by-case basis. However, recent experience indicates that when logging conditions are favorable, the 4.5-kilometer system is capable of producing depths with a standard uncertainty (uZ) on the order of 200-250 parts per million.
Climate Science: An Empirical Example of Postnormal Science.
NASA Astrophysics Data System (ADS)
Bray, Dennis; von Storch, Hans
1999-03-01
This paper addresses the views regarding the certainty and uncertainty of climate science knowledge held by contemporary climate scientists. More precisely, it addresses the extension of this knowledge into the social and political realms as per the definition of postnormal science. The data for the analysis is drawn from a response rate of approximately 40% from a survey questionnaire mailed to 1000 scientists in Germany, the United States, and Canada, and from a series of in-depth interviews with leading scientists in each country. The international nature of the sample allows for cross-cultural comparisons. With respect to the relative scientific discourse, similar assessments of the current state of knowledge are held by the respondents of each country. Almost all scientists agreed that the skill of contemporary models is limited. Minor differences were notable. Scientists from the United States were less convinced of the skills of the models than their German counterparts and, as would be expected under such circumstances, North American scientists perceived the need for societal and political responses to be less urgent than their German counterparts. The international consensus was, however, apparent regarding the utility of the knowledge to date: climate science has provided enough knowledge so that the initiation of abatement measures is warranted. However, consensus also existed regarding the current inability to explicitly specify detrimental effects that might result from climate change. This incompatibility between the state of knowledge and the calls for action suggests that, to some degree at least, scientific advice is a product of both scientific knowledge and normative judgment, suggesting a socioscientific construction of the climate change issue.
The law (and politics) of safe injection facilities in the United States.
Beletsky, Leo; Davis, Corey S; Anderson, Evan; Burris, Scott
2008-02-01
Safe injection facilities (SIFs) have shown promise in reducing harms and social costs associated with injection drug use. Favorable evaluations elsewhere have raised the issue of their implementation in the United States. Recognizing that laws shape health interventions targeting drug users, we analyzed the legal environment for publicly authorized SIFs in the United States. Although states and some municipalities have the power to authorize SIFs under state law, federal authorities could still interfere with these facilities under the Controlled Substances Act. A state- or locally-authorized SIF could proceed free of legal uncertainty only if federal authorities explicitly authorized it or decided not to interfere. Given legal uncertainty, and the similar experience with syringe exchange programs, we recommend a process of sustained health research, strategic advocacy, and political deliberation.
Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.
2012-01-01
This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.
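The exceedance-frequency indicator described in this abstract can be sketched in a few lines. The realization values and the concentration threshold below are invented for illustration; they are not Yucca Flat data or SNESIM output.

```python
import random

# Hypothetical values: simulated contaminant concentrations at one grid
# cell, one value per stochastic geologic realization (20 realizations,
# matching the study's ensemble size).
random.seed(7)
concentrations = [random.lognormvariate(0.0, 1.0) for _ in range(20)]

threshold = 1.0  # hypothetical concentration threshold

# The frequency of exceeding the threshold across realizations is the
# indicator of uncertainty about the plume-boundary location at this cell.
exceedance_frequency = sum(c > threshold for c in concentrations) / len(concentrations)
```

Repeating this calculation at every grid cell yields a map of exceedance probability, whose gradation marks where the plume boundary is uncertain.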
TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don; Bowman, Stephen M
2009-01-01
This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and the need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
Li, Yongming; Tong, Shaocheng
2017-12-01
In this paper, an adaptive fuzzy output constrained control design approach is addressed for multi-input multi-output uncertain stochastic nonlinear systems in nonstrict-feedback form. The nonlinear systems addressed in this paper possess unstructured uncertainties, unknown gain functions and unknown stochastic disturbances. Fuzzy logic systems are utilized to tackle the problem of unknown nonlinear uncertainties. The barrier Lyapunov function technique is employed to solve the output constrained problem. In the framework of backstepping design, an adaptive fuzzy control design scheme is constructed. All the signals in the closed-loop system are proved to be bounded in probability and the system outputs are constrained in a given compact set. Finally, the applicability of the proposed controller is demonstrated by a simulation example.
UNCERTAINTY IN EARLY OCCUPATIONAL ASPIRATIONS: ROLE EXPLORATION OR AIMLESSNESS?
Staff, Jeremy; Harris, Angel; Sabates, Ricardo; Briddell, Laine
2014-01-01
Many youth in the United States lack clear occupational aspirations. This uncertainty in achievement ambitions may benefit socioeconomic attainment if it signifies “role exploration,” characterized by career development, continued education, and enduring partnerships. By contrast, uncertainty may diminish attainment if it instead leads to “aimlessness,” involving prolonged education without the acquisition of a degree, residential dependence, and frequent job changes. We use nationally representative data from the National Education Longitudinal Study (NELS) to examine how uncertainty in occupational aspirations in adolescence (age 16) affects wage attainments in young adulthood (age 26). Results suggest that youth with uncertain career ambitions earn significantly lower hourly wages in young adulthood than youth with professional and non-professional aspirations, supporting the view that uncertainty heightens the risk of labor-market problems. PMID:25540465
Jeffrey P. Prestemon; Geoffrey H. Donovan
2008-01-01
Making input decisions under climate uncertainty often involves two-stage methods that use expensive and opaque transfer functions. This article describes an alternative, single-stage approach to such decisions using forecasting methods. The example shown is for preseason fire suppression resource contracting decisions faced by the United States Forest Service. Two-...
Nicholas A. Fisichelli; Scott R. Abella; Matthew Peters; Frank J. Krist
2014-01-01
The US National Park Service (NPS) manages over 8900 km2 of forest area in the eastern United States where climate change and nonnative species are altering forest structure, composition, and processes. Understanding potential forest change in response to climate, differences in habitat projections among models (uncertainty), and nonnative biotic...
"I Don't Want to Be an Ostrich": Managing Mothers' Uncertainty during BRCA1/2 Genetic Counseling.
Fisher, Carla L; Roccotagliata, Thomas; Rising, Camella J; Kissane, David W; Glogowski, Emily A; Bylund, Carma L
2017-06-01
Families who face genetic disease risk must learn how to grapple with complicated uncertainties about their health and future on a long-term basis. Women who undergo BRCA 1/2 genetic testing describe uncertainty related to personal risk as well as their loved ones', particularly daughters', risk. The genetic counseling setting is a prime opportunity for practitioners to help mothers manage uncertainty in the moment but also once they leave a session. Uncertainty Management Theory (UMT) helps to illuminate the various types of uncertainty women encounter and the important role of communication in uncertainty management. Informed by UMT, we conducted a thematic analysis of 16 genetic counseling sessions between practitioners and mothers at risk for, or carriers of, a BRCA1/2 mutation. Five themes emerged that represent communication strategies used to manage uncertainty: 1) addresses myths, misunderstandings, or misconceptions; 2) introduces uncertainty related to science; 3) encourages information seeking or sharing about family medical history; 4) reaffirms or validates previous behavior or decisions; and 5) minimizes the probability of personal risk or family members' risk. Findings illustrate the critical role of genetic counseling for families in managing emotionally challenging risk-related uncertainty. The analysis may prove beneficial to not only genetic counseling practice but generations of families at high risk for cancer who must learn strategic approaches to managing a complex web of uncertainty that can challenge them for a lifetime.
Hoffmann, Sabine; Rage, Estelle; Laurier, Dominique; Laroche, Pierre; Guihenneuc, Chantal; Ancelet, Sophie
2017-02-01
Many occupational cohort studies on underground miners have demonstrated that radon exposure is associated with an increased risk of lung cancer mortality. However, despite the deleterious consequences of exposure measurement error on statistical inference, these analyses traditionally do not account for exposure uncertainty. This might be due to the challenging nature of measurement error resulting from imperfect surrogate measures of radon exposure. Indeed, we are typically faced with exposure uncertainty in a time-varying exposure variable where both the type and the magnitude of error may depend on period of exposure. To address the challenge of accounting for multiplicative and heteroscedastic measurement error that may be of Berkson or classical nature, depending on the year of exposure, we opted for a Bayesian structural approach, which is arguably the most flexible method to account for uncertainty in exposure assessment. We assessed the association between occupational radon exposure and lung cancer mortality in the French cohort of uranium miners and found the impact of uncorrelated multiplicative measurement error to be of marginal importance. However, our findings indicate that the retrospective nature of exposure assessment that occurred in the earliest years of mining of this cohort as well as many other cohorts of underground miners might lead to an attenuation of the exposure-risk relationship. More research is needed to address further uncertainties in the calculation of lung dose, since this step will likely introduce important sources of shared uncertainty.
Exploring uncertainty in the Earth Sciences - the potential field perspective
NASA Astrophysics Data System (ADS)
Saltus, R. W.; Blakely, R. J.
2013-12-01
Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
NASA Astrophysics Data System (ADS)
Hey, Anthony J. G.; Walters, Patrick
This book provides a descriptive, popular account of quantum physics. The basic topics addressed include: waves and particles, the Heisenberg uncertainty principle, the Schroedinger equation and matter waves, atoms and nuclei, quantum tunneling, the Pauli exclusion principle and the elements, quantum cooperation and superfluids, Feynman rules, weak photons, quarks, and gluons. The applications of quantum physics to astrophysics, nuclear technology, and modern electronics are addressed.
NASA Astrophysics Data System (ADS)
Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin
2015-04-01
The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allow them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary to the data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO 748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, which is the software used by the French NHS for managing streamgauging measurements.
A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level records to produce continuous discharge time series. The management of rating curves is also done using BAREME. The BaRatin method (Le Coz et al., 2014) was developed as a Bayesian approach to rating curve development and uncertainty analysis. Since BaRatin accounts for the individual uncertainties of the gauging data used to build the rating curve, it was coupled with BAREME. The BaRatin method is still undergoing development and research, in particular to address non-univocal or time-varying stage-discharge relations due to hysteresis, variable backwater, rating shifts, etc. A new interface including new options is under development. The next steps are now to propagate the uncertainties of water level records, through uncertain rating curves, up to discharge time series and derived variables (e.g. annual mean flow) and statistics (e.g. flood quantiles). Bayesian tools are already available for both tasks, but further validation and development are necessary for their integration in the operational data workflow of the French NHS. References: Le Coz, J., Camenen, B., Peyrard, X., Dramais, G., 2012. Uncertainty in open-channel discharges measured with the velocity-area method. Flow Measurement and Instrumentation 26, 18-29. Le Coz, J., Renard, B., Bonnifait, L., Branger, F., Le Boursicaud, R., 2014. Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach, Journal of Hydrology, 509, 573-587.
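The propagation of rating-curve uncertainty to discharge described in this abstract can be illustrated with a minimal Monte Carlo sketch. A single power-law hydraulic control Q = a*(h - b)^c is assumed, and the parameter distributions below are invented for illustration; this is not BaRatin or BAREME output.

```python
import random
import statistics

random.seed(1)

def discharge(h, a, b, c):
    """Power-law rating curve Q = a*(h - b)^c, clipped below cease-to-flow."""
    return a * max(h - b, 0.0) ** c

h_obs = 2.0  # observed stage in meters (hypothetical)

# Sample hypothetical posterior-like distributions of the rating-curve
# parameters and propagate each draw through the curve.
samples = []
for _ in range(5000):
    a = random.gauss(10.0, 0.5)    # coefficient
    b = random.gauss(0.2, 0.05)    # offset (cease-to-flow stage)
    c = random.gauss(1.6, 0.05)    # exponent
    samples.append(discharge(h_obs, a, b, c))

q_mean = statistics.mean(samples)
q_sd = statistics.stdev(samples)   # standard uncertainty of Q at this stage
```

Repeating this at every recorded stage value turns an uncertain rating curve into a discharge time series with pointwise uncertainty bands.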
Ro, Annie
2014-01-01
Researchers have become increasingly interested in the health patterns of immigrants with longer residence in the United States, as this reveals the health consequences of integration processes. The negative acculturation effect has been the dominant interpretation of duration patterns, despite empirical and theoretical uncertainties about this assumption. This theory assumes that immigrant health declines with longer residence in the United States because of poorer health behaviors and health risks that reflect Americanized lifestyles. This paper reviews the empirical support for the negative acculturation theory among Asian immigrants to determine if and when it is an appropriate interpretation for duration patterns. I conclude that empirical inconsistencies and methodological issues limit the negative acculturation theory as the primary interpretation for duration patterns. First, there is no consistent evidence that health behaviors decline with time. There is also substantial group heterogeneity in duration patterns as well as heterogeneity across health outcomes. The literature has not adequately addressed methodological shortcomings, such as confounding by cohort effects or non-linear duration patterns. Length of residence in the United States is still an important aspect of Asian immigrant health, but the mechanisms of this relationship are still understudied. I propose alternative frameworks between duration and health that consider environmental influences and end with future research directions to explore research gaps. PMID:25111874
1984-08-01
3.3.1 Hypothesis 4: Relationship Between Unit Technology and Information Source Requirements
3.3.2 Hypothesis 5: Relationship Between Environmental Uncertainty and Information Source Requirements
3.3.3 Hypothesis 6: Relationship Between Inter-Unit ... Sources
3.4.1 Hypothesis 1: Relationship Between Unit Structure and the Accessibility and Quality of Information Sources
Uncertainty aggregation and reduction in structure-material performance prediction
NASA Astrophysics Data System (ADS)
Hu, Zhen; Mahadevan, Sankaran; Ao, Dan
2018-02-01
An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
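The Bayesian updating step at the core of the framework above can be sketched with a conjugate-normal update of a scalar model discrepancy parameter, applied to one observation segment at a time. The prior and the observed residuals below are hypothetical, not values from the rotorcraft hub application.

```python
# Conjugate normal-normal update: prior N(prior_mean, prior_var) on a
# model bias, observations with known variance obs_var. This is the
# elementary update the adaptive algorithm would run only on segments
# where the model prediction has been validated as reliable.
prior_mean, prior_var = 0.0, 4.0       # vague prior on model bias
obs = [1.2, 0.9, 1.1, 1.4, 0.8]        # hypothetical observed residuals
obs_var = 0.25                          # known observation variance

n = len(obs)
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + sum(obs) / obs_var)
# The posterior concentrates near the data mean; its variance is
# smaller than the prior's, i.e. uncertainty has been reduced.
```

In the paper's framework this update is only trusted where model validation passes, which is what prevents model-form error from contaminating the posterior.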
Wildfire Decision Making Under Uncertainty
NASA Astrophysics Data System (ADS)
Thompson, M.
2013-12-01
Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reda, I.
2011-07-01
The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use International Guidelines of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.
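The GUM-style calculation this report applies can be sketched as a root-sum-square combination of uncorrelated standard uncertainty components, followed by an expanded uncertainty with a coverage factor. The component names and values below are illustrative placeholders, not figures from the report.

```python
import math

# Illustrative standard uncertainty components (percent of reading) for
# a pyranometer irradiance measurement; values are invented.
components = {
    "calibration": 1.2,
    "zenith_response": 0.8,
    "temperature": 0.5,
    "data_logger": 0.2,
}

# Combined standard uncertainty: root-sum-square of uncorrelated
# components (GUM law of propagation with unit sensitivity coefficients).
u_c = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty at roughly 95% confidence (coverage factor k = 2).
U = 2.0 * u_c
```

The root-sum-square makes the largest component dominate: halving the calibration term here reduces U far more than eliminating the data-logger term, which is why the report's corrections target the dominant sources first.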
Error Analysis of CM Data Products: Sources of Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.
Uncertainty in Bohr's response to the Heisenberg microscope
NASA Astrophysics Data System (ADS)
Tanona, Scott
2004-09-01
In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.
Jakeman, Anthony J.; Jakeman, John Davis
2018-03-14
Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations from one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.
Application of fuzzy system theory in addressing the presence of uncertainties
NASA Astrophysics Data System (ADS)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.
2015-02-01
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing uncertainties is necessary to prevent material failure in engineering. There are three types of uncertainties: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. Mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented; defuzzification converts the fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
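The fuzzification, mapping, and defuzzification pipeline described above can be sketched with alpha-cuts of a triangular fuzzy number and interval arithmetic, which is a common concrete realization of the extension principle. The triangular fuzzy number and the mapped function below are illustrative choices, not the paper's finite element model.

```python
def alpha_cut(low, peak, high, alpha):
    """Interval of a triangular fuzzy number (low, peak, high) at
    membership level alpha in [0, 1]."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

def map_interval(interval, f):
    """Image of an interval under a monotone increasing function f
    (the mapping step of the extension principle)."""
    lo, hi = interval
    return (f(lo), f(hi))

f = lambda x: x ** 2          # illustrative response; monotone on [1, 3]
levels = [i / 10 for i in range(11)]

# Fuzzification + mapping: propagate each alpha-cut of the fuzzy input
# (1, 2, 3) through f to get alpha-cuts of the fuzzy output.
cuts = [map_interval(alpha_cut(1.0, 2.0, 3.0, a), f) for a in levels]

# Defuzzification: membership-weighted average of interval midpoints,
# a simple centroid-style reduction of the fuzzy output to a crisp value.
num = sum(a * (lo + hi) / 2 for a, (lo, hi) in zip(levels, cuts))
den = sum(levels)
crisp_output = num / den
```

Because every alpha-cut carries the full spread of the input interval through the model, the resulting crisp output is pulled toward the wider (lower-membership) cuts, which is the source of the conservatism the abstract reports.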
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em
2017-02-01
Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 years (range 43-95), and 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness through affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.
Uncertainty loops in travel-time tomography from nonlinear wave physics.
Galetti, Erica; Curtis, Andrew; Meles, Giovanni Angelo; Baptie, Brian
2015-04-10
Estimating image uncertainty is fundamental to guiding the interpretation of geoscientific tomographic maps. We reveal novel uncertainty topologies (loops) which indicate that while the speeds of both low- and high-velocity anomalies may be well constrained, their locations tend to remain uncertain. The effect is widespread: loops dominate around a third of United Kingdom Love wave tomographic uncertainties, changing the nature of interpretation of the observed anomalies. Loops exist due to second- and higher-order aspects of wave physics; hence, although such structures must exist in many tomographic studies in the physical sciences and medicine, they are unobservable using standard linearized methods. Higher-order methods might fruitfully be adopted.
Negotiating uncertainty: the transitional process of adapting to life with HIV.
Perrett, Stephanie E; Biley, Francis C
2013-01-01
Glaser's (1978) grounded-theory method was used to investigate the transitional process of adapting to life with HIV. Semistructured interviews took place with 8 male HIV-infected participants recruited from a clinic in South Wales, United Kingdom. Data analysis used open, substantive, and theoretical coding. Adapting to a life with HIV infection emerged as a process of adapting to uncertainty with "negotiating uncertainty" as a core concept. Seven subcategories represented movements between bipolar opposites labeled "anticipating hopelessness" and "regaining optimism." This work progresses the theoretical concepts of transitions, uncertainty, and adaptation in relation to the HIV experience. Copyright © 2013 Association of Nurses in AIDS Care. Published by Elsevier Inc. All rights reserved.
Einstein, Danielle A
2014-09-01
This study reviews research on the construct of intolerance of uncertainty (IU). A recent factor analysis (Journal of Anxiety Disorders, 25, 2012, p. 533) has been used to extend the transdiagnostic model articulated by Mansell (2005, p. 141) to focus on the role of IU as a facet of the model that is important to address in treatment. Research suggests that individual differences in IU may compromise resilience and that individuals high in IU are susceptible to increased negative affect. The model extension provides a guide for the treatment of clients presenting with uncertainty in the context of either a single disorder or several comorbid disorders. By applying the extension, the clinician is assisted to explore two facets of IU, "Need for Predictability" and "Uncertainty Arousal."
NASA Astrophysics Data System (ADS)
Jia, Chaoqing; Hu, Jun; Chen, Dongyan; Liu, Yurong; Alsaadi, Fuad E.
2018-07-01
In this paper, we discuss the event-triggered resilient filtering problem for a class of time-varying systems subject to stochastic uncertainties and successive packet dropouts. The event-triggered mechanism is employed in the hope of reducing the communication burden and saving network resources. The stochastic uncertainties are considered to describe the modelling errors, and the phenomenon of successive packet dropouts is characterized by a random variable obeying the Bernoulli distribution. The aim of the paper is to provide a resilient event-based filtering approach for the addressed time-varying systems such that, for all stochastic uncertainties, successive packet dropouts and filter gain perturbations, an optimized upper bound of the filtering error covariance is obtained by designing the filter gain. Finally, simulations are provided to demonstrate the effectiveness of the proposed robust optimal filtering strategy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sung, Yixing; Adams, Brian M.; Witkowski, Walter R.
2011-04-01
The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA, and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on sensitivity and uncertainty of the mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).
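A sampling-based sensitivity study of the kind described can be sketched as: draw samples of the uncertain inputs, evaluate the model, and rank inputs by correlation with the output. The stand-in model, input names, and uncertainty values below are hypothetical illustrations, not VIPRE-W or the DAKOTA study itself.

```python
import numpy as np

# Hypothetical uncertain inputs for a thermal-hydraulics-like response.
rng = np.random.default_rng(3)
n = 5000
inlet_temp = rng.normal(560.0, 5.0, n)     # K, assumed 1-sigma uncertainty
mass_flux = rng.normal(3000.0, 100.0, n)   # kg/m^2/s, assumed 1-sigma uncertainty

# Stand-in response: "evaporation rate" rises with temperature and falls
# slightly with mass flux, plus model noise. Purely illustrative.
evap = (0.04 * (inlet_temp - 550.0)
        - 0.0001 * (mass_flux - 3000.0)
        + rng.normal(0.0, 0.05, n))

# Simple sensitivity ranking by Pearson correlation with the output.
r_temp = np.corrcoef(inlet_temp, evap)[0, 1]
r_flux = np.corrcoef(mass_flux, evap)[0, 1]
print("inlet_temp", round(r_temp, 2))
print("mass_flux", round(r_flux, 2))
```

In this sketch, temperature dominates the output variance, so its correlation is near 1 while the mass-flux correlation is small; variance-based or rank-correlation measures would be used for nonlinear models.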
Steady-state bumpless transfer under controller uncertainty using the state/output feedback topology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, K.; Lee, A.H.; Bentsman, J.
2006-01-15
Linear quadratic (LQ) bumpless transfer design introduced recently by Turner and Walker gives a very convenient and straightforward computational procedure for the steady-state bumpless transfer operator synthesis. It is, however, found to be incapable of providing convergence of the output of the offline controller to that of the online controller in several industrial applications, producing bumps in the plant output in the wake of controller transfer. An examination of this phenomenon reveals that the applications in question are characterized by a significant mismatch, further referred to as controller uncertainty, between the dynamics of the implemented controllers and their models used in the transfer operator computation. To address this problem, while retaining the convenience of the Turner and Walker design, a novel state/output feedback bumpless transfer topology is introduced that employs the nominal state of the offline controller and, through the use of an additional controller/model mismatch compensator, also the offline controller output. A corresponding steady-state bumpless transfer design procedure along with the supporting theory is developed for a large class of systems. Due to these features, it is demonstrated to solve a long-standing problem of high-quality steady-state bumpless transfer from the industry standard low-order nonlinear multiloop PID-based controllers to the modern multi-input, multi-output (MIMO) robust controllers in the megawatt/throttle pressure control of a typical coal-fired boiler/turbine unit.
A probabilistic tornado wind hazard model for the continental United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hossain, Q; Kimball, J; Mensing, R
A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an aerial probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
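The Poisson temporal-occurrence assumption above can be illustrated with a minimal simulation; the rate value is hypothetical and not taken from the TORNADO model.

```python
import numpy as np

# If tornado touchdowns in a region follow a homogeneous Poisson process
# with rate lam (events/year), the count in any one year is Poisson(lam).
rng = np.random.default_rng(0)
lam = 2.5            # assumed regional rate, events per year (illustrative)
years = 10_000       # number of simulated years
counts = rng.poisson(lam, size=years)

# Empirical annual probability of at least one event vs. the exact value.
p_hat = (counts >= 1).mean()
p_exact = 1.0 - np.exp(-lam)
print(round(p_hat, 3), round(p_exact, 3))
```

The empirical exceedance probability converges to 1 - exp(-lam) as the number of simulated years grows, which is the building block for annual hazard exceedance curves.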
Waddimba, Anthony C; Scribani, Melissa; Krupa, Nicole; May, John J; Jenkins, Paul
2016-10-22
Widespread dissatisfaction among United States (U.S.) clinicians could endanger ongoing reforms. Practitioners in rural/underserved areas withstand stressors that are unique to or accentuated in those settings. Medical professionals employed by integrating delivery systems are often distressed by the cacophony of organizational change(s) that such consolidation portends. We investigated the factors associated with dis/satisfaction with rural practice among doctors/non-physician practitioners employed by an integrated healthcare delivery network serving 9 counties of upstate New York, during a time of organizational transition. We linked administrative data about practice units with cross-sectional data from a self-administered multi-dimensional questionnaire that contained practitioner demographics plus valid scales assessing autonomy/relatedness needs, risk aversion, tolerance for uncertainty/ambiguity, meaningfulness of patient care, and workload. We targeted medical professionals on the institutional payroll for inclusion. We excluded those who retired, resigned or were fired during the study launch, plus members of the advisory board and research team. Fixed-effects beta regressions were performed to test univariate associations between each factor and the percent of time a provider was dis/satisfied. Factors that manifested significant fixed effects were entered into multivariate, inflated beta regression models of the proportion of time that practitioners were dis/satisfied, incorporating clustering by practice unit as a random effect. Of the 473 eligible participants, 308 (65.1%) completed the questionnaire; 59.1% of respondents were doctoral-level and 40.9% were mid-level practitioners. Practitioners with heavier workloads and/or greater uncertainty intolerance were less likely to enjoy top-quintile satisfaction; those deriving greater meaning from practice were more likely.
Higher meaningfulness and gratified relational needs increased one's likelihood of being in the lowest quintile of dissatisfaction; heavier workload and greater intolerance of uncertainty reduced that likelihood. Practitioner demographics and most practice unit characteristics did not manifest any independent effect. Mutable factors, such as workload, work meaningfulness, relational needs, uncertainty/ambiguity tolerance, and risk-taking attitudes displayed the strongest association with practitioner satisfaction/dissatisfaction, independent of demographics and practice unit characteristics. Organizational efforts should be dedicated to a redesign of group-employment models, including more equitable division of clinical labor, building supportive peer networks, and uncertainty/risk tolerance coaching, to improve the quality of work life among rural practitioners.
NASA Astrophysics Data System (ADS)
Barnawi, Abdulwasa Bakr
Hybrid power generation systems and distributed generation technology are attracting more investment due to the growing demand for energy and the increasing awareness regarding emissions and their environmental impacts, such as global warming and pollution. The price fluctuation of crude oil is an additional reason for the leading oil-producing countries to consider renewable resources as an alternative. Saudi Arabia, as the top oil-exporting country in the world, announced the "Saudi Arabia Vision 2030," which targets generating 9.5 GW of electricity from renewable resources. Two of the most promising renewable technologies are wind turbines (WT) and photovoltaic cells (PV). The integration or hybridization of photovoltaics and wind turbines with battery storage leads to higher adequacy and redundancy for both autonomous and grid-connected systems. This study presents a method for optimal generation unit planning by installing a proper number of solar cells, wind turbines, and batteries in such a way that the net present value (NPV) is minimized while the overall system redundancy and adequacy are maximized. A new renewable fraction technique (RFT) is used to perform the generation unit planning. RFT was tested and validated with particle swarm optimization and HOMER Pro under the same conditions and environment. Renewable resource and load randomness and uncertainties are considered. Both autonomous and grid-connected system designs were adopted in the optimal generation unit planning process. An uncertainty factor was designed and incorporated in both autonomous and grid-connected system designs. In the autonomous hybrid system design model, a strategy including an additional amount of operating reserve as a percent of the hourly load was considered to deal with resource uncertainty, since the battery storage system is the only backup.
While in the grid-connected hybrid system design model, demand response was incorporated to overcome the impact of uncertainty and perform energy trading between the hybrid grid utility and main grid utility in addition to the designed uncertainty factor. After the generation unit planning was carried out and component sizing was determined, adequacy evaluation was conducted by calculating the loss of load expectation adequacy index for different contingency criteria considering probability of equipment failure. Finally, a microgrid planning was conducted by finding the proper size and location to install distributed generation units in a radial distribution network.
NASA Astrophysics Data System (ADS)
Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir
2017-06-01
We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
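Copula parameter inference of the kind MvCAT performs can be sketched in a much-simplified form. MvCAT itself uses a Bayesian MCMC framework over many copula families; the sketch below instead fits a single Clayton copula parameter (one family such toolboxes include) by grid-search maximum likelihood on synthetic data, which is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 2.0
n = 2000

# Sample (u, v) from a Clayton copula via conditional inversion.
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = ((w ** (-theta_true / (1 + theta_true)) - 1) * u ** (-theta_true) + 1) ** (-1 / theta_true)

def clayton_loglik(theta, u, v):
    # Log-density of the Clayton copula, valid for theta > 0.
    s = u ** (-theta) + v ** (-theta) - 1
    return np.sum(np.log1p(theta)
                  - (1 + theta) * (np.log(u) + np.log(v))
                  - (2 + 1 / theta) * np.log(s))

# Grid-search maximum likelihood for the dependence parameter.
grid = np.linspace(0.2, 5.0, 97)
theta_hat = grid[np.argmax([clayton_loglik(t, u, v) for t in grid])]
print(theta_hat)
```

A grid search avoids the local-minimum trapping the abstract notes for local optimizers, at the cost of resolution; a Bayesian treatment would additionally yield the posterior spread of theta rather than a point estimate.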
Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.
Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn
2012-08-01
There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding the processes involved in recognizing uncertainty, strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.
Toward an inventory of nitrogen input to the United States
Accurate accounting of nitrogen inputs is increasingly necessary for policy decisions related to aquatic nutrient pollution. Here we synthesize available data to provide the first integrated estimates of the amount and uncertainty of nitrogen inputs to the United States. Abou...
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. 
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
Chen, Xiang; Kwan, Mei-Po
2015-09-01
We examined the uncertainty of the contextual influences on food access through an analytic framework of the uncertain geographic context problem (UGCoP). We first examined the compounding effects of two kinds of spatiotemporal uncertainties on people's everyday efforts to procure food and then outlined three key dimensions (food access in real time, temporality of the food environment, and perceived nutrition environment) in which research on food access must improve to better represent the contributing environmental influences that operate at the individual level. Guidelines to address the UGCoP in future food access research are provided to account for the multidimensional influences of the food environment on dietary behaviors.
Improving uncertainty estimates: Inter-annual variability in Ireland
NASA Astrophysics Data System (ADS)
Pullinger, D.; Zhang, M.; Hill, N.; Crutchley, T.
2017-11-01
This paper addresses the uncertainty associated with inter-annual variability used within wind resource assessments for Ireland in order to more accurately represent the uncertainties within wind resource and energy yield assessments. The study was undertaken using a total of 16 ground stations (Met Eireann) and corresponding reanalysis datasets to provide an update to previous work on this topic undertaken nearly 20 years ago. The results of the work demonstrate that the previously reported 5.4% wind speed inter-annual variability is considered to be appropriate, and guidance is given on how to provide a robust assessment of IAV using available sources of data including ground stations, MERRA-2 and ERA-Interim.
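Inter-annual variability (IAV) of the kind discussed above is commonly summarized as the coefficient of variation of annual mean wind speeds. A sketch with hypothetical station data (not the Met Eireann records):

```python
import numpy as np

# Hypothetical annual mean wind speeds (m/s) at one station. The 5.4%
# figure in the study is the typical standard deviation of annual means
# relative to the long-term mean, computed in exactly this way.
annual_means = np.array([7.1, 7.4, 6.9, 7.6, 7.2, 6.8, 7.5, 7.0, 7.3, 6.7])
iav = annual_means.std(ddof=1) / annual_means.mean()
print(f"inter-annual variability: {iav:.1%}")
```

In an energy yield assessment, this percentage propagates into the uncertainty of the long-term mean wind speed roughly as IAV divided by the square root of the number of years of record.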
Uncertainty quantification in Rothermel's Model using an efficient sampling method
Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick
2007-01-01
The purpose of the present work is to quantify parametric uncertainty in Rothermel's wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
Analysis of the sources of uncertainty for EDR2 film‐based IMRT quality assurance
Shi, Chengyu; Papanikolaou, Nikos; Yan, Yulong; Weng, Xuejun; Jiang, gyu
2006-01-01
In our institution, patient‐specific quality assurance (QA) for intensity‐modulated radiation therapy (IMRT) is usually performed by measuring the dose to a point using an ion chamber and by measuring the dose to a plane using film. In order to perform absolute dose comparison measurements using film, an accurate calibration curve should be used. In this paper, we investigate the film response curve uncertainty factors, including film batch differences, film processor temperature effect, film digitization, and treatment unit. In addition, we reviewed 50 patient‐specific IMRT QA procedures performed in our institution in order to quantify the sources of error in film‐based dosimetry. Our study showed that the EDR2 film dosimetry can be done with less than 3% uncertainty. The EDR2 film response was not affected by the choice of treatment unit provided the nominal energy was the same. This investigation of the different sources of uncertainties in the film calibration procedure can provide a better understanding of the film‐based dosimetry and can improve quality control for IMRT QA. PACS numbers: 87.86.Cd, 87.53.Xd, 87.57.Nk PMID:17533329
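Combining independent uncertainty components (film batch, processor temperature, digitization, treatment unit) into a total is conventionally done by root-sum-square. The component values below are hypothetical placeholders, chosen only so that the total is consistent with the under-3% figure the abstract reports.

```python
import math

# Hypothetical component uncertainties (%, 1-sigma) for film-based dosimetry.
# Independent components combine in quadrature (root-sum-square).
components = {
    "film batch": 1.5,
    "processor temperature": 1.2,
    "digitization": 0.8,
    "treatment unit": 0.5,
}
total = math.sqrt(sum(v ** 2 for v in components.values()))
print(f"combined uncertainty: {total:.2f}%")
```

Quadrature addition means the largest component dominates: halving the smallest term barely changes the total, so effort is best spent on the biggest contributor.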
NASA Astrophysics Data System (ADS)
Harthy, M. A.; Gifford, J.
2017-12-01
The Hartselle sandstone is an excellent example of an oil sand, a resource rich in bitumen. The unit is a light-colored, thick-bedded to massive quartzose sandstone that is widespread across an area from Georgia in the east to Mississippi in the west, and south from Alabama to Kentucky as a northern border. Formation thickness ranges from 0 to more than 150 feet. The unit has been stratigraphically dated to the Middle-Upper Mississippian age. One hypothesis suggests that the sandstone unit formed from the geological remains of barrier islands located in the ocean between Gondwana and Laurentia. The Hartselle is thought to have formed by the movement of waves and currents along the shoreline, which carried sand and concentrated it into a set of northwest to southeast trending barrier islands. Transgression-regression events shifted the islands back and forth in relation to the position of the shoreline, leading to the large areal extent of the unit. However, the current data are not enough to explain the geographical position of the Hartselle sandstone unit, as it does not run parallel to the ancient shoreline. Another mystery is the source of the sand, with some believing the sand was sourced from the south (Gondwana) and others arguing for erosion from the north (Laurentia). Detrital zircon provenance analysis will address the uncertainty in sediment source. We will compare zircon U-Pb age spectra to possible Laurentian and Gondwanan source areas to discriminate between these possibilities. In addition, the age of the youngest detrital zircon population will provide additional constraints on the maximum age of deposition for the unit. These detrital ages will also help us to understand the tectonic setting at the time of Hartselle deposition. Lastly, we aim to explain the widespread nature of the unit and the processes involved in the formation of the Hartselle sandstone.
When taken together, these interpretations will illuminate the age, depositional and tectonic setting of a potential petroleum resource.
Characterizing spatial uncertainty when integrating social data in conservation planning.
Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C
2014-12-01
Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.
Chen, Wei-Yu; Lin, Hsing-Chieh
2018-05-01
Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larvae growth, and larvae mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification, EC50(SW), was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH, EC50(PBC), were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions for seawater and PBC pH levels in 2100. The level of unsuccessful fertilization had 50 and 90% probability risks of 5.07-24.51 (95% CI) and 0-6.95%, respectively. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.
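EC50 estimation of the kind reported can be sketched as fitting a log-logistic (Hill) effect curve to response-versus-pH-reduction data. The data points, noise level, and slope below are synthetic assumptions; only the 0.55 pH-unit EC50 is borrowed from the abstract.

```python
import numpy as np

def hill(delta_ph, ec50, slope=3.0):
    # Two-parameter log-logistic (Hill) effect curve; the slope is fixed
    # at an assumed value to keep this sketch one-dimensional.
    return 1.0 / (1.0 + (ec50 / delta_ph) ** slope)

# Synthetic "fraction of unsuccessful fertilization" vs. pH reduction.
rng = np.random.default_rng(2)
delta_ph = np.array([0.1, 0.2, 0.35, 0.5, 0.7, 0.9, 1.2])
observed = np.clip(hill(delta_ph, 0.55) + rng.normal(0, 0.03, delta_ph.size), 0, 1)

# Least-squares grid search for the EC50.
grid = np.linspace(0.2, 1.2, 201)
sse = [np.sum((hill(delta_ph, e) - observed) ** 2) for e in grid]
ec50_hat = grid[int(np.argmin(sse))]
print(round(ec50_hat, 2))
```

A probabilistic assessment like the study's would go further, propagating the standard error of the EC50 and the projected 2100 pH distribution into a risk probability rather than reporting a point estimate.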
Hinton, Denise; Kirk, Susan
2017-06-01
Background: There is growing recognition that multiple sclerosis is a possible, albeit uncommon, diagnosis in childhood. However, very little is known about the experiences of families living with childhood multiple sclerosis and this is the first study to explore this in depth. Objective: Our objective was to explore the experiences of parents of children with multiple sclerosis. Methods: Qualitative in-depth interviews with 31 parents using a grounded theory approach were conducted. Parents were sampled and recruited via health service and voluntary sector organisations in the United Kingdom. Results: Parents' accounts of life with childhood multiple sclerosis were dominated by feelings of uncertainty associated with four sources: diagnostic uncertainty, daily uncertainty, interaction uncertainty and future uncertainty. Parents attempted to manage these uncertainties using specific strategies, which could in turn create further uncertainties about their child's illness. However, over time, ongoing uncertainty appeared to give parents hope for their child's future with multiple sclerosis. Conclusion: Illness-related uncertainties appear to play a role in generating hope among parents of a child with multiple sclerosis. However, this may lead parents to avoid sources of information and support that threaten their fragile optimism. Professionals need to be sensitive to the role hope plays in supporting parental coping with childhood multiple sclerosis.
NASA Astrophysics Data System (ADS)
Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu
2018-01-01
Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise under an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior—that is, the amount of the uncertainty will first inflate, and subsequently decrease, with the growth of decoherence strengths in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
Minimizing Significant Figure Fuzziness.
ERIC Educational Resources Information Center
Fields, Lawrence D.; Hawkes, Stephen J.
1986-01-01
Addresses the principles and problems associated with the use of significant figures. Explains uncertainty, the meaning of significant figures, the Simple Rule, the Three Rule, and the 1-5 Rule. Also provides examples of the Rules. (ML)
Forward Compton scattering with weak neutral current: Constraints from sum rules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorchtein, Mikhail; Zhang, Xilin
2015-06-09
We generalize the forward real Compton amplitude to the case of the interference of the electromagnetic and weak neutral currents, formulate a low-energy theorem, relate the new amplitudes to the interference structure functions, and obtain a new set of sum rules. Furthermore, we address a possible new sum rule that relates the product of the axial charge and magnetic moment of the nucleon to the 0th moment of the structure function g5(ν, 0). For the dispersive γZ-box correction to the proton's weak charge, the application of the GDH sum rule allows us to reduce the uncertainty due to resonance contributions by a factor of two. Finally, the finite energy sum rule helps address the uncertainty in that calculation due to possible duality violations.
Sterba, Sonya K; Rights, Jason D
2016-01-01
Item parceling remains widely used under conditions that can lead to parcel-allocation variability in results. Hence, researchers may be interested in quantifying and accounting for parcel-allocation variability within sample. To do so in practice, three key issues need to be addressed. First, how can we combine sources of uncertainty arising from sampling variability and parcel-allocation variability when drawing inferences about parameters in structural equation models? Second, on what basis can we choose the number of repeated item-to-parcel allocations within sample? Third, how can we diagnose and report proportions of total variability per estimate arising due to parcel-allocation variability versus sampling variability? This article addresses these three methodological issues. Developments are illustrated using simulated and empirical examples, and software for implementing them is provided.
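A common way to combine sampling and parcel-allocation variability (the article's first question) is a multiple-imputation-style pooling rule: total variance equals the mean within-allocation sampling variance plus an inflated between-allocation variance. The estimates, standard errors, and the pooling rule itself are illustrative assumptions here, not the article's exact developments.

```python
import numpy as np

# Hypothetical: one SEM parameter re-estimated under M random item-to-parcel allocations.
estimates = np.array([0.42, 0.45, 0.40, 0.44, 0.43])            # point estimate per allocation
sampling_vars = np.array([0.0030, 0.0028, 0.0031, 0.0029, 0.0030])  # squared SEs per allocation

M = len(estimates)
pooled_est = estimates.mean()
within = sampling_vars.mean()                  # sampling variability
between = estimates.var(ddof=1)                # parcel-allocation variability
total = within + (1 + 1 / M) * between         # Rubin's-rules-style pooling

# Diagnostic: proportion of total variability due to parcel allocation
prop_allocation = (1 + 1 / M) * between / total
print(pooled_est, total, prop_allocation)
```

The `prop_allocation` diagnostic addresses the article's third question (reporting proportions of total variability per estimate); increasing M until this proportion stabilizes is one heuristic for choosing the number of allocations.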
Study of aerodynamic technology for single-cruise-engine VSTOL fighter/attack aircraft, phase 1
NASA Technical Reports Server (NTRS)
Foley, W. H.; Sheridan, A. E.; Smith, C. W.
1982-01-01
A conceptual design and analysis of a single-engine VSTOL fighter/attack aircraft was completed. The aircraft combines a NASA/deHavilland ejector with vectored thrust and is capable of accomplishing the mission and point performance of Type Specification 169, and a flight demonstrator could be built with an existing F101/DFE engine. The aerodynamic, aero/propulsive, and propulsive uncertainties are identified, and a wind tunnel program is proposed to address those uncertainties associated with wing-borne flight.
Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem
Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...
2015-01-01
In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
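The nested aleatory-epistemic sampling described above is often implemented as a double-loop Monte Carlo: an outer loop draws epistemic parameters, an inner loop propagates aleatory variability, and each outer draw yields one value of the aleatory statistic of interest. The toy model, distributions, and sample sizes below are assumptions for illustration, not the challenge-problem setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(theta_e, x_a):
    """Toy quantity of interest driven by an epistemic parameter and an aleatory input."""
    return theta_e * x_a + x_a ** 2

N_epistemic, N_aleatory = 200, 500
qoi_quantiles = []
for _ in range(N_epistemic):                       # outer loop: epistemic uncertainty
    theta_e = rng.uniform(0.8, 1.2)                # interval-bounded epistemic parameter
    x_a = rng.normal(0.0, 1.0, N_aleatory)         # inner loop: aleatory variability
    qoi = model(theta_e, x_a)
    qoi_quantiles.append(np.quantile(qoi, 0.95))   # one aleatory statistic per epistemic draw

# Epistemic uncertainty appears as a spread in the aleatory 95th percentile
lo, hi = np.min(qoi_quantiles), np.max(qoi_quantiles)
print(f"95th-percentile QoI ranges over [{lo:.2f}, {hi:.2f}] across epistemic draws")
```

Bayesian calibration of the epistemic parameters would narrow the outer-loop distribution, shrinking the reported range.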
2014-01-01
characteristics of the boundary layer ... The studies by Braun and Tao (2000) and Smith and Thomsen (2010) have elevated awareness of an important problem ... estimates also of forecast uncertainty which follow from the uncertainty in not knowing the optimum boundary-layer scheme to use. In an effort to address this ... An analysis of the observed low-level ... Quarterly Journal of the Royal Meteorological Society, Q. J. R. Meteorol. Soc. (2014), DOI: 10.1002/qj.2283
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
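The SEP likelihood used by BAIPU handles skewness and kurtosis that the sketch below omits. As a simplified illustration, a Gaussian likelihood with the same lag-1 autocorrelation and heteroscedasticity structure can be written as follows; the function name, parameter values, and synthetic data are assumptions, not the study's formulation.

```python
import numpy as np

def log_likelihood(residuals, sim, phi, sigma0, sigma1):
    """Gaussian stand-in for an SEP-style likelihood: lag-1 autocorrelated (phi)
    errors whose standard deviation grows with the simulated value."""
    sigma = sigma0 + sigma1 * np.abs(sim)        # heteroscedastic error sd
    eta = residuals / sigma                      # standardized errors
    innov = eta[1:] - phi * eta[:-1]             # whitened lag-1 innovations
    var = 1.0 - phi ** 2                         # stationary innovation variance
    ll = -0.5 * innov.size * np.log(2 * np.pi * var) - 0.5 * np.sum(innov ** 2) / var
    return ll - np.sum(np.log(sigma[1:]))        # Jacobian of the standardization

# Synthetic check: errors generated with phi = 0.6 should score higher
# under phi = 0.6 than under an independence assumption (phi = 0).
rng = np.random.default_rng(0)
sim = np.full(2000, 5.0)
sigma_true = 0.1 + 0.05 * np.abs(sim)
eta = np.zeros(2000)
for t in range(1, 2000):
    eta[t] = 0.6 * eta[t - 1] + np.sqrt(1 - 0.6 ** 2) * rng.normal()
residuals = eta * sigma_true
ll_true = log_likelihood(residuals, sim, 0.6, 0.1, 0.05)
ll_iid = log_likelihood(residuals, sim, 0.0, 0.1, 0.05)
print(ll_true > ll_iid)
```

In an MCMC setting such as DREAM(ZS), this function would be evaluated once per proposed parameter set, with phi, sigma0, and sigma1 inferred jointly with the watershed model parameters.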
Direct Aerosol Forcing Uncertainty
Mccomiskey, Allison
2008-01-15
Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
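The uncertainty combination described above (per-property contribution = sensitivity times measurement uncertainty, combined by root-sum-square) can be sketched directly; the sensitivity and uncertainty values below are invented placeholders, not the study's figures.

```python
import numpy as np

# Hypothetical sensitivities (change in DRF, W m^-2, per unit change in each property)
# and typical measurement uncertainties for those properties.
properties = ["aerosol optical depth", "single scattering albedo",
              "asymmetry parameter", "surface albedo"]
sensitivity = np.array([-25.0, 60.0, 10.0, 15.0])   # dDRF/dx, illustrative only
meas_unc = np.array([0.01, 0.03, 0.02, 0.02])       # 1-sigma uncertainty in x

contrib = sensitivity * meas_unc                    # per-property DRF uncertainty
total = np.sqrt(np.sum(contrib ** 2))               # root-sum-square combination
dominant = properties[np.argmax(np.abs(contrib))]
print(f"total DRF uncertainty = {total:.2f} W m^-2, dominated by {dominant}")
```

With these placeholder numbers the single-scattering-albedo term dominates, consistent with the abstract's finding, though real sensitivities vary with site, solar geometry, and aerosol type.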
USDA-ARS?s Scientific Manuscript database
A number of recent soil biota studies have deviated from the standard experimental approach of generating a distinct data value for each experimental unit (e.g. Yang et al., 2013; Gundale et al., 2014). Instead, these studies have mixed together soils from multiple experimental units (i.e. sites wi...
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. 
Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
De Lara, M; Martinet, V
2009-02-01
Managing natural resources in a sustainable way is a hard task, due to uncertainties, dynamics and conflicting objectives (ecological, social, and economical). We propose a stochastic viability approach to address such problems. We consider a discrete-time control dynamical model with uncertainties, representing a bioeconomic system. The sustainability of this system is described by a set of constraints, defined in practice by indicators - namely, state, control and uncertainty functions - together with thresholds. This approach aims at identifying decision rules such that a set of constraints, representing various objectives, is respected with maximal probability. Under appropriate monotonicity properties of dynamics and constraints, having economic and biological content, we characterize an optimal feedback. The connection is made between this approach and the so-called Management Strategy Evaluation for fisheries. A numerical application to sustainable management of Bay of Biscay nephrops-hakes mixed fishery is given.
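The stochastic viability idea (score a candidate feedback rule by the probability that all constraint indicators stay within their thresholds along simulated trajectories) can be sketched with a toy one-stock model. The dynamics, decision rule, thresholds, and shock distribution below are all invented for illustration, not the Bay of Biscay application.

```python
import numpy as np

rng = np.random.default_rng(2)

def step(biomass, harvest, shock):
    """Toy stock dynamics: logistic growth minus harvest, multiplicative shock."""
    growth = 0.8 * biomass * (1 - biomass / 100.0)
    return max((biomass + growth - harvest) * shock, 0.0)

def feedback_rule(biomass):
    """Candidate decision rule: harvest a fixed fraction of stock above a buffer."""
    return 0.3 * max(biomass - 20.0, 0.0)

def viability_probability(rule, b0=60.0, horizon=20, n_runs=5000, b_min=30.0, h_min=5.0):
    """Probability that biological (b >= b_min) and economic (h >= h_min)
    constraints both hold at every step of the horizon."""
    ok = 0
    for _ in range(n_runs):
        b, viable = b0, True
        for _ in range(horizon):
            h = rule(b)
            b = step(b, h, rng.lognormal(0.0, 0.1))
            if b < b_min or h < h_min:
                viable = False
                break
        ok += viable
    return ok / n_runs

p = viability_probability(feedback_rule)
print(f"viability probability: {p:.2f}")
```

Maximizing this probability over a family of rules is the viability analogue of the optimal-feedback characterization mentioned in the abstract.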
Effective UV radiation from model calculations and measurements
NASA Technical Reports Server (NTRS)
Feister, Uwe; Grewe, Rolf
1994-01-01
Model calculations have been made to simulate the effect of atmospheric ozone and geographical as well as meteorological parameters on solar UV radiation reaching the ground. Total ozone values as measured by Dobson spectrophotometer and Brewer spectrometer, as well as turbidity, were used as input to the model calculation. The performance of the model was tested by spectroradiometric measurements of solar global UV radiation at Potsdam. There are small differences that can be explained by the uncertainty of the measurements, by the uncertainty of input data to the model, and by the uncertainty of the radiative transfer algorithms of the model itself. Some effects of solar radiation on the biosphere and on air chemistry are discussed. Model calculations and spectroradiometric measurements can be used to study variations of the effective radiation in space and time. The comparability of action spectra and their uncertainties are also addressed.
Uncertainty in quantum mechanics: faith or fantasy?
Penrose, Roger
2011-12-13
The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.
Decision analysis of shoreline protection under climate change uncertainty
NASA Astrophysics Data System (ADS)
Chao, Philip T.; Hobbs, Benjamin F.
1997-04-01
If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties? (2) What is the economic loss if climate change uncertainty is ignored? (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, that model actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker’s payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
Optimization and resilience in natural resources management
Williams, Byron K.; Johnson, Fred A.
2015-01-01
We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
Abstract The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm, MaxInfo, is available as an open-source implementation. PMID:28911101
NASA Technical Reports Server (NTRS)
Head, J. W.; Ivanov, M. A.
1995-01-01
On Venus, global topography shows the presence of highs and lows, including regional, highly deformed plateaus (tesserae), broad rifted volcanic rises, linear lows flanking uplands, and more equidimensional lowlands (e.g. Lavinia and Atalanta planitiae). Each of these terrain types on Venus has relatively distinctive characteristics, but origins are uncertain in terms of mode of formation, time of formation, and potential evolutionary links. There is a high level of uncertainty about the formation and evolution of lowlands on Venus. We have undertaken the mapping of a specific lowlands region of Venus to address several of these major questions. Using geologic mapping we have tried to establish: What is the sequence of events in the formation and evolution of large-scale equidimensional basins on Venus? When do the compressional features typical of basin interiors occur? What is the total volume of lava that occurs in the basins, and is this similar to other non-basin areas? How much subsidence and downwarping has occurred after the last major plains units? We have undertaken an analysis of the geology of the V55 Lavinia Planitia quadrangle in order to address many of these issues, and we report on the results here.
Model calibration criteria for estimating ecological flow characteristics
Vis, Marc; Knight, Rodney; Poole, Sandra; Wolfe, William J.; Seibert, Jan; Breuer, Lutz; Kraft, Philipp
2016-01-01
Quantification of streamflow characteristics in ungauged catchments remains a challenge. Hydrological modeling is often used to derive flow time series and to calculate streamflow characteristics for subsequent applications that may differ from those envisioned by the modelers. While the estimation of model parameters for ungauged catchments is a challenging research task in itself, it is important to evaluate whether simulated time series preserve critical aspects of the streamflow hydrograph. To address this question, seven calibration objective functions were evaluated for their ability to preserve ecologically relevant streamflow characteristics of the average annual hydrograph using a runoff model, HBV-light, at 27 catchments in the southeastern United States. Calibration trials were repeated 100 times to reduce parameter uncertainty effects on the results, and 12 ecological flow characteristics were computed for comparison. Our results showed that the most suitable calibration strategy varied according to streamflow characteristic. Combined objective functions generally gave the best results, though a clear underprediction bias was observed. The occurrence of low prediction errors for certain combinations of objective function and flow characteristic suggests that (1) incorporating multiple ecological flow characteristics into a single objective function would increase model accuracy, potentially benefitting decision-making processes; and (2) there may be a need to have different objective functions available to address specific applications of the predicted time series.
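The abstract does not name its seven objective functions; the Nash-Sutcliffe efficiency (NSE) and a log-transformed variant that emphasises low flows are standard calibration criteria in this literature and serve here only as a hedged illustration of what such functions compute.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean-flow benchmark."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def log_nse(observed, simulated, eps=1e-6):
    """NSE on log-transformed flows, weighting low-flow (ecologically relevant) periods."""
    return nse(np.log(np.asarray(observed, float) + eps),
               np.log(np.asarray(simulated, float) + eps))

obs = [1.0, 2.0, 8.0, 3.0, 1.5]   # observed daily flows (arbitrary units)
sim = [1.1, 1.8, 7.5, 3.2, 1.4]   # simulated flows from a runoff model
print(nse(obs, sim), log_nse(obs, sim))
```

A combined objective function of the kind the study found most effective would be a weighted sum of such criteria, each targeting a different part of the hydrograph.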
Carrazana González, J; Fernández, I M; Capote Ferrera, E; Rodríguez Castro, G
2008-11-01
Information about how the laboratory of Centro de Protección e Higiene de las Radiaciones (CPHR), Cuba, establishes its traceability to the International System of Units for the measurement of radionuclides in environmental test items is presented. A comparison among different methodologies of uncertainty calculation, including an analysis of the feasibility of using the Kragten-spreadsheet approach, is shown. In the specific case of the gamma spectrometric assay, the influence of each parameter, and the identification of the major contributor, in the relative difference between the methods of uncertainty calculation (Kragten and partial derivative) is described. The reliability of the uncertainty calculation results reported by the commercial software Gamma 2000 from Silena is analyzed.
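The Kragten approach mentioned above approximates uncertainty propagation by shifting one input at a time by its standard uncertainty, avoiding analytic partial derivatives. The sketch below compares it with the partial-derivative (GUM) result for a toy gamma-spectrometry activity model; the model form and all input values are illustrative assumptions, not CPHR's actual procedure.

```python
import numpy as np

def activity(params):
    """Toy gamma-spectrometry model: A = counts / (efficiency * time * gamma yield)."""
    counts, eff, t, p_gamma = params
    return counts / (eff * t * p_gamma)

values = np.array([12000.0, 0.05, 3600.0, 0.85])   # measured inputs (made up)
uncs = np.array([110.0, 0.002, 1.0, 0.004])        # standard uncertainties (made up)

# Kragten approach: shift one input at a time by its uncertainty and use the
# change in the result as that input's contribution.
base = activity(values)
contribs = []
for i in range(len(values)):
    shifted = values.copy()
    shifted[i] += uncs[i]
    contribs.append(activity(shifted) - base)
u_kragten = np.sqrt(np.sum(np.square(contribs)))

# Partial-derivative (GUM) approach via relative uncertainties, valid here
# because this toy model is purely multiplicative.
rel = uncs / values
u_gum = base * np.sqrt(np.sum(rel ** 2))
print(base, u_kragten, u_gum)
```

The small discrepancy between the two results reflects the finite-difference nature of the Kragten method, which is exactly the kind of relative difference the paper analyses parameter by parameter.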
Hawkins, Robert C; Badrick, Tony
2015-08-01
In this study we aimed to compare the reporting unit size used by Australian laboratories for routine chemistry and haematology tests to the unit size used by learned authorities and in standard laboratory textbooks, and to the justified unit size based on measurement uncertainty (MU) estimates from quality assurance program data. MU was determined from Royal College of Pathologists of Australasia (RCPA) - Australasian Association of Clinical Biochemists (AACB) and RCPA Haematology Quality Assurance Program survey reports. The reporting unit size implicitly suggested in authoritative textbooks, the RCPA Manual, and the General Serum Chemistry program itself was noted. We also used published data on Australian laboratory practices. The best performing laboratories could justify their chemistry unit size for 55% of analytes, while comparable figures for the 50% and 90% laboratories were 14% and 8%, respectively. Reporting unit size was justifiable for all laboratories for red cell count, >50% for haemoglobin, but only the top 10% for haematocrit. Few, if any, could justify their mean cell volume (MCV) and mean cell haemoglobin concentration (MCHC) reporting unit sizes. The reporting unit size used by many laboratories is not justified by present analytical performance. Using MU estimates to determine the reporting interval for quantitative laboratory results ensures reporting practices match local analytical performance and recognises the inherent error of the measurement process.
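Deriving a justified reporting interval from MU can be sketched as follows; the specific rule (reporting step no finer than half the expanded uncertainty) and the sodium example are hypothetical stand-ins, not the criterion used in the study.

```python
import math

def justified_reporting_step(std_uncertainty, coverage_k=2):
    """Smallest 1/2/5 x 10^n reporting step that is not finer than the expanded
    uncertainty warrants (hypothetical rule: step >= U / 2)."""
    u_expanded = coverage_k * std_uncertainty   # expanded uncertainty U = k * u
    target = u_expanded / 2
    exp = math.floor(math.log10(target))
    for mult in (1, 2, 5, 10):
        step = mult * 10 ** exp
        if step >= target:
            return step

def report(value, step):
    """Round a result to the justified reporting step."""
    return round(value / step) * step

# Hypothetical sodium result: 140.37 mmol/L with a 0.7 mmol/L standard uncertainty
step = justified_reporting_step(0.7)
print(step, report(140.37, step))
```

Under this rule the sodium result would be reported to the nearest whole mmol/L; reporting 140.4 or 140.37 would imply precision the measurement does not possess.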
Managing Climate Change Refugia for Biodiversity ...
Climate change threatens to create fundamental shifts in the distributions and abundances of species. Given projected losses, increased emphasis on management for ecosystem resilience to help buffer fish and wildlife populations against climate change is emerging. Such efforts stake a claim for an adaptive, anticipatory planning response to the climate change threat. To be effective, approaches will need to address critical uncertainties in both the physical basis for projected landscape changes, as well as the biological responses of organisms. Recent efforts define future potential climate refugia based on air temperatures and associated microclimatic changes. These efforts reflect the relatively strong conceptual foundation for linkages between regional climate change and local responses and thermal dynamics. Yet important questions remain. Drawing on case studies, we illustrate some key uncertainties in the responses of species and their habitats to altered hydro-climatic regimes currently not well addressed by physical or ecological models. These uncertainties need not delay anticipatory planning, but rather highlight the need for identification and communication of actions with high probabilities of success, and targeted research within an adaptive management framework. In this workshop, we will showcase the latest science on climate refugia and participants will interact through small group discussions, relevant examples, and facilitated dialogue to i
Kim, Steven B; Kodell, Ralph L; Moon, Hojin
2014-03-01
In chemical and microbial risk assessments, risk assessors fit dose-response models to high-dose data and extrapolate downward to risk levels in the range of 1-10%. Although multiple dose-response models may be able to fit the data adequately in the experimental range, the estimated effective dose (ED) corresponding to an extremely small risk can be substantially different from model to model. In this respect, model averaging (MA) provides more robustness than a single dose-response model in the point and interval estimation of an ED. In MA, accounting for both data uncertainty and model uncertainty is crucial, but addressing model uncertainty is not achieved simply by increasing the number of models in a model space. A plausible set of models for MA can be characterized by goodness of fit and diversity surrounding the truth. We propose a diversity index (DI) to balance between these two characteristics in model space selection. It addresses a collective property of a model space rather than individual performance of each model. Tuning parameters in the DI control the size of the model space for MA. © 2013 Society for Risk Analysis.
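Generic model averaging of an effective-dose estimate can be illustrated with Akaike weights; these weights and the toy numbers below are standard textbook choices for illustration, not the article's diversity-index method for selecting the model space.

```python
import numpy as np

# Hypothetical dose-response fits: each model's ED01 estimate and its AIC.
models = {
    "logistic":   {"ed01": 0.012, "aic": 152.3},
    "log-probit": {"ed01": 0.019, "aic": 153.1},
    "weibull":    {"ed01": 0.008, "aic": 154.0},
}

aics = np.array([m["aic"] for m in models.values()])
ed01 = np.array([m["ed01"] for m in models.values()])

# Akaike weights: a standard (not the article's) way to weight models by fit.
delta = aics - aics.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

ed01_ma = np.sum(w * ed01)            # model-averaged point estimate
spread = ed01.max() - ed01.min()      # model-to-model diversity in the ED01
print(ed01_ma, spread)
```

The `spread` value hints at the article's point: a model space can fit equally well yet disagree substantially at low risk levels, which is why a diversity criterion over the model space, rather than simply adding more models, matters.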
Exploring dust emission responses to land cover change using an ecological land classification
NASA Astrophysics Data System (ADS)
Galloza, Magda S.; Webb, Nicholas P.; Bleiweiss, Max P.; Winters, Craig; Herrick, Jeffrey E.; Ayers, Eldon
2018-06-01
Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of land cover change on wind erosion. We apply a dust emission model over a rangeland study area in the northern Chihuahuan Desert, New Mexico, USA, and evaluate spatiotemporal patterns of modelled horizontal sediment mass flux and dust emission in the context of ecological sites and their vegetation states; representing a diversity of land cover types. Our results demonstrate how the impacts of land cover change on dust emission can be quantified, compared across land cover classes, and interpreted in the context of an ecological model that encapsulates land management intensity and change. Results also reveal the importance of established weaknesses in the dust model soil characterisation and drag partition scheme, which appeared generally insensitive to the impacts of land cover change. New models that address these weaknesses, coupled with ecological site concepts and field measurements across land cover types, could significantly reduce assessment uncertainties and provide opportunities for identifying land management options.
2012-06-01
will not involve an element of high risk or uncertainty on the human environment, and its effects on the quality of the human environment are not...
NASA Astrophysics Data System (ADS)
Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.
2016-12-01
It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system for addressing the deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of all three. The focus of this study is to assess the model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were employed in the Rapid Update Cycle (RUC) land surface model and Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective. 
Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support, allowing for a more extensive set of tests over multiple seasons, consequently leading to more robust results. Through the use of these stochastic innovations and powerful supercomputing at NCAR, further insights and advancements in ensemble forecasting at convection-permitting scales will be possible.
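The SPP idea described above, giving each ensemble member its own randomly perturbed value of an uncertain physics parameter, can be sketched in a few lines. This is a minimal illustration, not the HRRR implementation; the nominal value, spread, and member count are hypothetical.

```python
import random

def perturb_parameter(nominal, n_members, spread=0.3, seed=0):
    """SPP sketch: give each ensemble member its own value of an
    uncertain physics parameter, scaled by a random factor drawn
    uniformly from [1 - spread, 1 + spread]."""
    rng = random.Random(seed)
    return [nominal * rng.uniform(1.0 - spread, 1.0 + spread)
            for _ in range(n_members)]

# e.g. a hypothetical roughness-length-like parameter for a 9-member ensemble
members = perturb_parameter(nominal=0.05, n_members=9)
```

Operational schemes typically draw spatially and temporally correlated random fields rather than a single scalar factor, but the member-to-member diversity mechanism is the same.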
Reimbursement of licensed cell and gene therapies across the major European healthcare markets
Jørgensen, Jesper; Kefalas, Panos
2015-01-01
Objective: The aim of this research is to identify the pricing, reimbursement, and market access (P&R&MA) considerations most relevant to advanced therapy medicinal products (ATMPs) in the Big5EU, and to inform their manufacturers about the key drivers for securing adoption at a commercially viable reimbursed price. Methodology: The research was structured in three main steps: 1) identifying the market access pathways relevant to ATMPs through secondary research; 2) validating the secondary research findings and addressing any data gaps in primary research, by qualitative interviews with national, regional, and local-level payers and their clinical and economic advisors; 3) collating primary and secondary findings to compare results across countries. Results: The incremental clinical benefit forms the basis for all P&R&MA processes. Budget impact is a key consideration, regardless of geography. Cost-effectiveness analyses are increasingly applied; however, only the United Kingdom has a defined threshold that links the cost per quality-adjusted life year (QALY) specifically and methodologically to the reimbursed price. Funding mechanisms to enable adoption of new and more expensive therapies exist in all countries, albeit to varying extents. Willingness to pay is typically higher in smaller patient populations, especially in populations with high disease burden. Outcomes modelling and risk-sharing agreements (RSAs) provide strategies to address the data gap and uncertainties often associated with trials in niche populations. Conclusions: The high cost of ATMPs, coupled with the uncertainty at launch around their long-term claims, presents challenges for their adoption at a commercially viable reimbursed price. Targeting populations of high disease burden and unmet needs may be advantageous, as the potential for improvement in clinical benefit is greater, as well as the potential for capitalising on healthcare cost offsets. 
Targeting small populations can also help reduce both payers’ budget impact concerns and the risk of reimbursement restrictions being imposed. PMID:27123175
A Panel Study on the Effects of Task Uncertainty, Interdependence, and Size on Unit Decision Making
ERIC Educational Resources Information Center
Van De Ven, Andrew H.
1977-01-01
This panel study examined the determinants of supervisory, employee, and group decision-making in departments or units within a complex organization. Available from: Comparative Administration Research Institute, Kent State University Press, Kent State University, Kent, OH 44242. (Author)
Tian, Zhen; Yuan, Jingqi; Xu, Liang; Zhang, Xiang; Wang, Jingcheng
2018-05-25
As requirements for load regulation and efficiency enhancement grow more demanding, the control performance of boiler-turbine systems has become much more important. In this paper, a novel robust control approach is proposed to improve the coordinated control performance of subcritical boiler-turbine units. To capture the key features of the boiler-turbine system, a nonlinear control-oriented model is established and validated against historical operation data from a 300 MW unit. To achieve system linearization and decoupling, an adaptive feedback linearization strategy is proposed, which can asymptotically eliminate the linearization error caused by model uncertainties. Based on the linearized boiler-turbine system, a second-order sliding mode controller is designed with the super-twisting algorithm. Moreover, the closed-loop system is proved robustly stable with respect to uncertainties and disturbances. Simulation results are presented to illustrate the effectiveness of the proposed control scheme, which achieves excellent tracking performance, strong robustness and chattering reduction. Copyright © 2018. Published by Elsevier Ltd.
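The super-twisting law named above admits a compact sketch. The following toy loop applies the standard super-twisting control, u = -k1·sqrt(|s|)·sign(s) + v with dv/dt = -k2·sign(s), to a first-order sliding variable under a bounded disturbance; the gains, disturbance, and time step are illustrative and are not taken from the paper.

```python
import math

def super_twisting_step(s, v, k1, k2, dt):
    """One explicit Euler step of the super-twisting law:
    u = -k1*sqrt(|s|)*sign(s) + v,   dv/dt = -k2*sign(s)."""
    sgn = (s > 0) - (s < 0)
    u = -k1 * math.sqrt(abs(s)) * sgn + v
    return u, v - k2 * sgn * dt

# Drive a disturbed sliding variable ds/dt = u + d toward zero.
s, v, dt = 1.0, 0.0, 1e-3
for i in range(20000):                      # 20 s of simulated time
    u, v = super_twisting_step(s, v, k1=1.5, k2=1.1, dt=dt)
    d = 0.2 * math.sin(0.001 * i)           # bounded matched disturbance
    s += (u + d) * dt
```

Because the discontinuous sign term is hidden inside the integrator for v, the applied control u stays continuous, which is the chattering-reduction property the abstract refers to.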
Error and Uncertainty Analysis for Ecological Modeling and Simulation
2001-12-01
management (LRAM) accounting for environmental, training, and economic factors. In the ELVS methodology, soil erosion status is used as a quantitative...Monte-Carlo approach. The optimization is realized through economic functions or on decision constraints, such as, unit sample cost, number of samples... nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software) Gertner, G., G
Sources and implications of bias and uncertainty in a century of US wildfire activity data
Karen C. Short
2015-01-01
Analyses to identify and relate trends in wildfire activity to factors such as climate, population, land use or land cover and wildland fire policy are increasingly popular in the United States. There is a wealth of US wildfire activity data available for such analyses, but users must be aware of inherent reporting biases, inconsistencies and uncertainty in the data in...
NASA Astrophysics Data System (ADS)
Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.
2016-12-01
Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. 
In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.
Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning
NASA Astrophysics Data System (ADS)
Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.
2016-12-01
Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. 
We will categorize the two systems and propose appropriate state-of-the-art methods for decision making under uncertainty. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.
Modality-Driven Classification and Visualization of Ensemble Variance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bensema, Kevin; Gosink, Luke; Obermaier, Harald
Paper for the IEEE Visualization Conference. Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
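The stochastic-sampling side of these methods (as in XSUSA) can be illustrated with a toy one-group model: perturb the cross sections with Gaussian relative errors, rerun the (here trivially cheap) core model for each sample, and read off the uncertainty in the multiplication factor. The nominal cross-section values and the 2% relative uncertainty below are hypothetical, not benchmark data.

```python
import random
import statistics

def k_inf(nu_sigma_f, sigma_a):
    """Toy one-group, infinite-medium multiplication factor."""
    return nu_sigma_f / sigma_a

def sample_k(n, rel_std=0.02, seed=1):
    """Stochastic-sampling sketch (XSUSA-style): perturb each cross
    section with an independent Gaussian relative error and rerun the
    core model for every sample."""
    rng = random.Random(seed)
    ks = []
    for _ in range(n):
        nsf = 0.0065 * (1.0 + rng.gauss(0.0, rel_std))
        sa = 0.0060 * (1.0 + rng.gauss(0.0, rel_std))
        ks.append(k_inf(nsf, sa))
    return statistics.mean(ks), statistics.stdev(ks)

mean_k, std_k = sample_k(5000)
```

In a real study each "sample" is a full core-simulator run and the perturbations are drawn from an energy-dependent covariance library, which is exactly why the sampling approach is computationally expensive.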
Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness
NASA Astrophysics Data System (ADS)
Irias, X.; Cicala, D.
2013-12-01
Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. 
The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is entirely dependent on four potentially fragile water transmission mains for its day-to-day water supply. Using info-gap analysis, EBMUD is evaluating competing strategies for providing water supply to the island, for example submarine pipelines versus tunnels. The analysis considers not only the likely or 'average' results for each strategy, but also the worst-case performance of each strategy under varying levels of uncertainty. This analysis is improving the quality of the planning process, since it can identify strategies that ensure minimal disruption of water supply following a major earthquake, even if the earthquake and resulting damage fail to conform to our expectations. Results to date are presented, including a discussion of how info-gap analysis complements existing tools for comparing alternative strategies, and how info-gap improves our ability to quantify our tolerance for uncertainty.
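Info-gap robustness, the central quantity in the analysis above, can be sketched for a toy supply-versus-demand decision: under a fractional-error uncertainty model, a strategy's robustness is the largest uncertainty horizon at which its worst case still meets the requirement. The capacities and demand estimate are hypothetical; real info-gap studies use far richer system and uncertainty models.

```python
def robustness(capacity, demand_estimate):
    """Info-gap robustness sketch. Uncertainty model:
    U(h) = {d : |d - demand_estimate| <= h * demand_estimate}.
    Robustness is the largest horizon h for which the worst-case
    demand, demand_estimate * (1 + h), is still met by the capacity."""
    if capacity <= demand_estimate:
        return 0.0
    return capacity / demand_estimate - 1.0

# Two hypothetical strategies serving an estimated demand of 100 units:
pipeline = robustness(capacity=130, demand_estimate=100)   # h = 0.30
tunnel = robustness(capacity=150, demand_estimate=100)     # h = 0.50
```

Comparing strategies by robustness rather than by best-estimate performance is what lets the method favor solutions that "work reasonably well for all plausible conditions."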
Application of fuzzy system theory in addressing the presence of uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing uncertainty is necessary to prevent material failure in engineering. There are three types of uncertainty: stochastic, epistemic, and error. In this paper, epistemic uncertainty is considered; it arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of steps, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main step, known as mapping, i.e., establishing the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is applied; this is an important step that converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
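The alpha-cut realization of the extension principle referred to above can be sketched as interval propagation: slice a fuzzy input at several membership levels and map each interval through the response function. A triangular fuzzy input and a monotone increasing linear response are assumed here for simplicity; the numbers are hypothetical.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def propagate(tri, f, levels=5):
    """Extension-principle sketch for a monotone increasing f: map the
    endpoints of each alpha-cut interval through f."""
    cuts = {}
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(tri, alpha)
        cuts[alpha] = (f(lo), f(hi))
    return cuts

# Hypothetical fuzzy load (kN) through a linear response f(x) = 2x:
out = propagate((8.0, 10.0, 12.0), lambda x: 2.0 * x)
```

For non-monotone models each alpha-cut requires a min/max search over the interval rather than endpoint evaluation, which is where the coupling with finite element solvers becomes costly.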
Cued uncertainty modulates later recognition of emotional pictures: An ERP study.
Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua
2017-06-01
Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and this modulation is affected by whether there are cues (i.e., cued uncertainty) or not (i.e., uncued uncertainty) prior to the encoding of the uncertain event. Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how the ERP effects of recognition are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events in relation to feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.
Robust Control Design for Uncertain Nonlinear Dynamic Systems
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.
2012-01-01
Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.
NASA Astrophysics Data System (ADS)
Vergara, H. J.; Kirstetter, P.; Gourley, J. J.; Flamig, Z.; Hong, Y.
2015-12-01
The macro scale patterns of simulated streamflow errors are studied in order to characterize uncertainty in a hydrologic modeling system forced with the Multi-Radar/Multi-Sensor (MRMS; http://mrms.ou.edu) quantitative precipitation estimates for flood forecasting over the Conterminous United States (CONUS). The hydrologic model is the centerpiece of the Flooded Locations And Simulated Hydrograph (FLASH; http://flash.ou.edu) real-time system and is implemented at 1-km/5-min resolution to generate estimates of streamflow. Data from the CONUS-wide stream gauge network of the United States Geological Survey (USGS) were used as a reference to evaluate the discrepancies with the hydrological model predictions. Streamflow errors were studied at the event scale with particular focus on the peak flow magnitude and timing. A total of 2,680 catchments over CONUS and 75,496 events from a 10-year period are used for the simulation diagnostic analysis. Associations between streamflow errors and geophysical factors were explored and modeled. It is found that hydro-climatic factors and radar coverage could explain significant underestimation of peak flow in regions of complex terrain. Furthermore, the statistical modeling of peak flow errors shows that other geophysical factors such as basin geomorphometry, pedology, and land cover/use could also provide explanatory information. Results from this research demonstrate the utility of uncertainty characterization in providing guidance to improve model adequacy, parameter estimates, and input quality control. Likewise, the characterization of uncertainty enables probabilistic flood forecasting that can be extended to ungauged locations.
Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.
Djulbegovic, Benjamin
2011-10-01
In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just 1 measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) is at the intersection between epistemology, decision making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision making depends both on analytical, deliberative processes embodied in scientific method (system II), and good human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.
Uncertainty prediction for PUB
NASA Astrophysics Data System (ADS)
Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.
2003-04-01
The IAHS initiative on Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models should be grasped using practical gauging strategies like the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of the Potiribu Project, an NCE layout in representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables, with changing likelihood surfaces of experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) processes. In this way, STS addresses the uncertainty bounds of model parameters in an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways of uncertainty prediction under a PUB perspective in representative basins of world biomes.
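The Green-Ampt model mentioned above has a one-line infiltration-capacity law, f = K(1 + ψΔθ/F), which can be time-stepped to obtain cumulative infiltration F(t). The parameter values below are illustrative silt-loam-like numbers (cm and hours), not Potiribu data, and a small nonzero F0 is assumed to avoid the singular rate at F = 0.

```python
def green_ampt_rate(F, K, psi, dtheta):
    """Green-Ampt infiltration capacity f = K * (1 + psi * dtheta / F)."""
    return K * (1.0 + psi * dtheta / F)

def infiltrate(K, psi, dtheta, t_end, dt=0.001, F0=0.05):
    """Explicit Euler stepping of cumulative infiltration dF/dt = f(F)."""
    F, t = F0, 0.0
    while t < t_end:
        F += green_ampt_rate(F, K, psi, dtheta) * dt
        t += dt
    return F

# Hypothetical parameters: K in cm/h, psi (wetting-front suction) in cm,
# dtheta (moisture deficit) dimensionless; infiltration after one hour.
F_1h = infiltrate(K=0.65, psi=16.7, dtheta=0.34, t_end=1.0)
```

Likelihood surfaces of the kind the abstract mentions arise when runs like this are repeated over many (K, ψ, Δθ) combinations and scored against plot observations.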
NASA Astrophysics Data System (ADS)
Bradford, Michael J.
2017-10-01
Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty, but those ratios have generally been derived from theoretical or ad hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate that multipliers in the range of 1.5:1 to 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.
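The risk-based multiplier idea can be sketched with a toy Monte Carlo: if predicted offset gains carry multiplicative lognormal prediction error, the multiplier needed to cover the loss with a chosen probability follows from a quantile of that error distribution. The log-standard deviation and target probability below are hypothetical, not the paper's fitted values, though for this choice the answer lands near the lower end of the 1.5:1 to 2.5:1 range reported above.

```python
import math
import random

def required_multiplier(sigma, target=0.8, n=20000, seed=3):
    """Risk-based offset multiplier sketch: predicted gains carry
    multiplicative lognormal prediction error exp(N(0, sigma)).
    The smallest ratio r with P(r * error >= 1) >= target is
    r = 1 / (the (1 - target) quantile of the error distribution)."""
    rng = random.Random(seed)
    errors = sorted(math.exp(rng.gauss(0.0, sigma)) for _ in range(n))
    q = errors[int((1.0 - target) * n)]   # empirical (1-target) quantile
    return 1.0 / q

mult = required_multiplier(sigma=0.5)
```

The sketch makes the qualitative point explicit: the multiplier grows with prediction uncertainty and with the assurance level demanded of the offset.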
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Uncertainty Calculations in the First Introductory Physics Laboratory
NASA Astrophysics Data System (ADS)
Rahman, Shafiqur
2005-03-01
Uncertainty in a measured quantity is an integral part of reporting any experimental data. Consequently, Introductory Physics laboratories at many institutions require that students report the values of the quantities being measured as well as their uncertainties. Unfortunately, given that there are three main ways of calculating uncertainty, each suitable for particular situations (which is usually not explained in the lab manual), this is also an area that students feel highly confused about. It frequently generates a large number of complaints in the end-of-semester course evaluations. Students at some institutions are not asked to calculate uncertainty at all, which gives them a false sense of the nature of experimental data. Taking advantage of the increased sophistication in the use of computers and spreadsheets that students bring to college, we have completely restructured our first Introductory Physics Lab to address this problem. Always in the context of a typical lab, we now systematically and sequentially introduce the various ways of calculating uncertainty, including a theoretical understanding as opposed to a cookbook approach, all within the context of six three-hour labs. Complaints about the lab in student evaluations have dropped by 80%. *Supported by a grant from the A. V. Davis Foundation.
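One of the standard methods such a lab sequence covers is root-sum-square propagation of independent uncertainties through a product or quotient, where relative uncertainties add in quadrature. A minimal sketch, with hypothetical measurement values:

```python
import math

def rss_relative(value, *components):
    """Root-sum-square propagation for a quantity built from products
    and quotients: components are (measured value, absolute uncertainty)
    pairs, combined via their relative uncertainties."""
    rel = math.sqrt(sum((u / x) ** 2 for x, u in components))
    return value, value * rel

# Density from hypothetical mass and volume measurements:
m, um = 25.4, 0.2       # g
V, uV = 10.0, 0.1       # cm^3
rho, urho = rss_relative(m / V, (m, um), (V, uV))
```

Spreadsheet versions of exactly this calculation are one natural way to exploit the computer skills the abstract says students now arrive with.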
Lombardi, A M
2017-09-18
Stochastic models provide quantitative evaluations of earthquake occurrence. A basic component of such models is the uncertainty in defining the main features of an intrinsically random process. Even if, at a very basic level, any attempt to distinguish between types of uncertainty is questionable, a usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term earthquake modeling and, specifically, of ETAS modeling. By means of an application of a specific version of the ETAS model to the seismicity of Central Italy, recently struck by a sequence with a main event of Mw6.5, the aleatory and epistemic (parametric) uncertainties are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model adopted here is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has a good chance of setting up an ETAS-type model, but may still describe and forecast earthquake occurrences with only limited precision and accuracy.
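For context, the ETAS conditional intensity in its standard form is shown below; the abstract's "specific version" may differ in parameterization:

```latex
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{i \,:\, t_i < t} K \, e^{\alpha (M_i - M_c)} \, (t - t_i + c)^{-p}
```

Here $\mu$ is the background seismicity rate, the sum runs over past events of magnitude $M_i$ at times $t_i$, $M_c$ is the completeness magnitude, and $K$, $\alpha$, $c$, $p$ are the model parameters whose estimation carries the parametric (epistemic) uncertainty discussed above.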
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Addressing uncertainty in adaptation planning for agriculture.
Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R
2013-05-21
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
Bradford, Michael J
2017-10-01
Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.
Addressing uncertainty in adaptation planning for agriculture
Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.
2013-01-01
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...
2018-02-09
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley
NASA Technical Reports Server (NTRS)
Rathsam, Jonathan; Ely, Jeffry W.
2012-01-01
A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
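The combined standard uncertainty described above follows the usual GUM recipe: weight each budget component by its sensitivity coefficient and combine by root-sum-square. A minimal sketch (the function and the example budget values are illustrative, not Langley's actual budget):

```python
import math

def combined_standard_uncertainty(budget):
    """GUM-style root-sum-square combination of an uncertainty budget.

    budget: list of (sensitivity_coefficient, standard_uncertainty) pairs,
    one per component (e.g. exterior loudspeakers, interior rattle
    loudspeakers, door-induced pressure fluctuations)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in budget))
```

For instance, two unit-sensitivity components of 3 and 4 (in whatever consistent units the budget uses) combine to 5, not 7: independent errors partially cancel.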
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.
NASA Technical Reports Server (NTRS)
Monford, L. G., Jr. (Inventor)
1974-01-01
A digital communication system is reported for parallel operation of 16 or more transceiver units with the use of only four interconnecting wires. A remote synchronization circuit produces unit address control words sequentially in data frames of 16 words. Means are provided in each transceiver unit to decode calling signals and to transmit calling and data signals. The transceivers communicate with each other over one data line. The synchronization unit communicates the address control information to the transceiver units over an address line and further provides the timing information over a clock line. A reference voltage level or ground line completes the interconnecting four-wire hookup.
Batey, D Scott; Whitfield, Samantha; Mulla, Mazheruddin; Stringer, Kristi L; Durojaiye, Modupeoluwa; McCormick, Lisa; Turan, Bulent; Nyblade, Laura; Kempf, Mirjam-Colette; Turan, Janet M
2016-11-01
HIV-related stigma has been shown to have profound effects on people living with HIV (PLWH). When stigma is experienced in a healthcare setting, negative health outcomes are exacerbated. We sought to assess the feasibility and acceptability of a healthcare setting stigma-reduction intervention, the Finding Respect and Ending Stigma around HIV (FRESH) Workshop, in the United States. This intervention, adapted from a similar strategy implemented in Africa, brought together healthcare workers (HW) and PLWH to address HIV-related stigma. Two pilot workshops were conducted in Alabama and included 17 HW and 19 PLWH. Participants completed questionnaire measures pre- and post-workshop, including open-ended feedback items. Analytical methods included assessment of measures reliability, pre-post-test comparisons using paired t-tests, and qualitative content analysis. Overall satisfaction with the workshop experience was high, with 87% PLWH and 89% HW rating the workshop "excellent" and the majority agreeing that others like themselves would be interested in participating. Content analysis of open-ended items revealed that participants considered the workshop informative, interactive, well-organized, understandable, fun, and inclusive, while addressing real and prevalent issues. Most pre- and post-test measures had good-excellent internal consistency reliability (Cronbach's alphas ranging from 0.70 to 0.96) and, although sample sizes were small, positive trends were observed, reaching statistical significance for increased awareness of stigma in the health facility among HW (p = 0.047) and decreased uncertainty about HIV treatment among PLWH (p = 0.017). The FRESH intervention appears to be feasible and highly acceptable to HW and PLWH participants and shows great promise as a healthcare setting stigma-reduction intervention for US contexts.
Whitfield, Samantha; Mulla, Mazheruddin; Stringer, Kristi L.; Durojaiye, Modupeoluwa; McCormick, Lisa; Turan, Bulent; Nyblade, Laura; Kempf, Mirjam-Colette; Turan, Janet M.
2016-01-01
HIV-related stigma has been shown to have profound effects on people living with HIV (PLWH). When stigma is experienced in a healthcare setting, negative health outcomes are exacerbated. We sought to assess the feasibility and acceptability of a healthcare setting stigma-reduction intervention, the Finding Respect and Ending Stigma around HIV (FRESH) Workshop, in the United States. This intervention, adapted from a similar strategy implemented in Africa, brought together healthcare workers (HW) and PLWH to address HIV-related stigma. Two pilot workshops were conducted in Alabama and included 17 HW and 19 PLWH. Participants completed questionnaire measures pre- and post-workshop, including open-ended feedback items. Analytical methods included assessment of measures reliability, pre–post-test comparisons using paired t-tests, and qualitative content analysis. Overall satisfaction with the workshop experience was high, with 87% PLWH and 89% HW rating the workshop “excellent” and the majority agreeing that others like themselves would be interested in participating. Content analysis of open-ended items revealed that participants considered the workshop informative, interactive, well-organized, understandable, fun, and inclusive, while addressing real and prevalent issues. Most pre- and post-test measures had good–excellent internal consistency reliability (Cronbach's alphas ranging from 0.70 to 0.96) and, although sample sizes were small, positive trends were observed, reaching statistical significance for increased awareness of stigma in the health facility among HW (p = 0.047) and decreased uncertainty about HIV treatment among PLWH (p = 0.017). The FRESH intervention appears to be feasible and highly acceptable to HW and PLWH participants and shows great promise as a healthcare setting stigma-reduction intervention for US contexts. PMID:27849373
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Fasanella, Edwin L.; Melis, Matthew; Carney, Kelly; Gabrys, Jonathan
2004-01-01
The Space Shuttle Columbia Accident Investigation Board (CAIB) made several recommendations for improving the NASA Space Shuttle Program. An extensive experimental and analytical program has been developed to address two recommendations related to structural impact analysis. The objective of the present work is to demonstrate the application of probabilistic analysis to assess the effect of uncertainties on debris impacts on Space Shuttle Reinforced Carbon-Carbon (RCC) panels. The probabilistic analysis is used to identify the material modeling parameters controlling the uncertainty. A comparison of the finite element results with limited experimental data provided confidence that the simulations were adequately representing the global response of the material. Five input parameters were identified as significantly controlling the response.
Li, Zhaoying; Zhou, Wenjie; Liu, Hao
2016-09-01
This paper addresses the nonlinear robust tracking controller design problem for hypersonic vehicles. This problem is challenging due to strong coupling between the aerodynamics and the propulsion system, and the uncertainties involved in the vehicle dynamics including parametric uncertainties, unmodeled model uncertainties, and external disturbances. By utilizing the feedback linearization technique, a linear tracking error system is established with prescribed references. For the linear model, a robust controller is proposed based on the signal compensation theory to guarantee that the tracking error dynamics is robustly stable. Numerical simulation results are given to show the advantages of the proposed nonlinear robust control method, compared to the robust loop-shaping control approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
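A probability box can be pictured as the envelope of CDFs over an epistemic set of candidate payoff distributions: the defender does not know which distribution governs the payoff, so only bounds on any probability statement are available. A toy sketch under the assumption of Gaussian candidates (the paper's "generalizations of probability boxes" go further than this):

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a Gaussian via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_bounds(x, epistemic_set):
    """Lower/upper bounds on P(payoff <= x) over an epistemic set of
    candidate (mu, sigma) payoff distributions: the CDF envelope."""
    cdfs = [normal_cdf(x, mu, sigma) for mu, sigma in epistemic_set]
    return min(cdfs), max(cdfs)
```

With two candidate payoff models N(0, 1) and N(1, 1), the probability that the payoff is below 0.5 is only bounded to roughly [0.31, 0.69]; a single point estimate would hide that epistemic spread.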
Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.
Peters, Achim; McEwen, Bruce S; Friston, Karl
2017-09-01
The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: Reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual by 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Axdahl, E. L.
2017-01-01
Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
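The "black box" character of a nonintrusive approach means only solver inputs and outputs are touched; no flow-solver source code changes are needed. A schematic Monte Carlo wrapper illustrates the interface (illustrative only; the solver, sampler, and sample count here are placeholders, and the paper's actual method is more structured than plain Monte Carlo):

```python
import random
import statistics

def nonintrusive_uq(solver, sample_input, n_samples=2000):
    """Nonintrusive Monte Carlo UQ sketch: the solver is invoked as a
    black box on sampled uncertain inputs; only its outputs are used
    to build output statistics."""
    outputs = [solver(sample_input()) for _ in range(n_samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical stand-in for a flow solver and an uncertain input:
random.seed(0)
mean_qoi, std_qoi = nonintrusive_uq(
    solver=lambda x: x * x,                    # placeholder quantity of interest
    sample_input=lambda: random.gauss(1.0, 0.1)  # placeholder uncertain parameter
)
```

Because the wrapper never inspects the solver internals, the same loop can drive any code that maps inputs to a scalar quantity of interest, which is what makes the assessment easy to automate into a design workflow.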
Uncertainty information in climate data records from Earth observation
NASA Astrophysics Data System (ADS)
Merchant, C. J.
2017-12-01
How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher level products for a wide range of essential climate variables.
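One reason per-datum uncertainties do not settle the large-scale question is error correlation: for n data, each with standard uncertainty u and a common pairwise error correlation rho, the uncertainty of the mean is u*sqrt((1 + (n-1)*rho)/n), which tends to u*sqrt(rho) rather than zero as n grows. A one-line sketch of this standard result (our illustration, not a FIDUCEO algorithm):

```python
def uncertainty_of_mean(u, n, rho):
    """Standard uncertainty of the mean of n data, each with standard
    uncertainty u and a common pairwise error correlation rho:
        u_mean = u * sqrt((1 + (n - 1) * rho) / n)
    rho = 0 recovers the familiar u / sqrt(n); rho = 1 gives u."""
    return u * ((1.0 + (n - 1) * rho) / n) ** 0.5
```

Even a correlation of 0.01 across 10,000 pixels leaves the mean's uncertainty near 0.1u instead of the 0.01u that uncorrelated errors would give, which is why error effects negligible per pixel can dominate large-scale, long-term averages.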
Hazard Response Modeling Uncertainty (A Quantitative Method)
1988-10-01
was conducted by the National Maritime Institute under contract to the United Kingdom Health and Safety Executive. Instantaneous releases of 2000...the National Maritime Institute under contract to the United Kingdom Health and Safety Executive with the sponsorship of numerous international...
Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001
L. S. Heath; R. A. Birdsey; D. W. Williams
2002-01-01
The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...
Can and Should the United States Preserve A Military Capability for Revolutionary Conflict
1972-01-01
unexpected interventions (Korea, Cambodia) and the inherent uncertainties of international relations render dubious any firm prediction that no president in...superpowers or pose a direct military threat. The United States has a strong interest in forestalling any transformation of internal revolts into... internal security assistance (training and advice) draw the United States relentlessly into direct and costly military intervention (deployment of
Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)
NASA Astrophysics Data System (ADS)
Manning, M. R.; Swart, R.
2009-12-01
Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. 
This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004], which was further refined by lengthy discussions among many IPCC Lead Authors, over more than a year, resulting in a short summary of a standard approach to be followed for that assessment [IPCC, 2005]. This paper extends a review of the treatment of uncertainty in the IPCC assessments by Swart et al. [2009]. It is shown that progress towards consistency has been made but that there also appears to be a need for continued use of several complementary approaches in order to cover the wide range of circumstances across different disciplines involved in climate change. While this reflects the situation in the science community, it also raises the level of complexity for policymakers and other users of the assessments who would prefer one common consensus approach. References: IPCC (2005), Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties, IPCC, Geneva. Manning, M., et al. (2004), IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options, IPCC. Moss, R., and S. Schneider (2000), Uncertainties, in Guidance Papers on the Cross Cutting Issues of the Third Assessment Report of the IPCC, edited by R. Pachauri, et al., Intergovernmental Panel on Climate Change (IPCC), Geneva. Swart, R., et al. (2009), Agreeing to disagree: uncertainty management in assessing climate change, impacts and responses by the IPCC, Climatic Change, 92(1-2), 1-29.
NASA Technical Reports Server (NTRS)
Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael
2012-01-01
This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. There are several key areas under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis currently compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where the functional hierarchies' availability does not meet subsystem confidence targets, the current sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis. The existing confidence targets may therefore be conservative. This paper will also discuss how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU. The paper concludes with the strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
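The spares-versus-demand comparison described above is commonly modeled with a Poisson demand process: the sparing confidence is the probability that cumulative demand does not exceed the spares on hand. A minimal sketch of that standard model (the parameters are illustrative, not ISS data, and this is not the PACT method itself):

```python
import math

def sparing_confidence(failure_rate_per_year, years, installed_units, spares):
    """Probability that Poisson-distributed demand over the period does not
    exceed the number of spares available (a standard sparing model)."""
    lam = failure_rate_per_year * years * installed_units  # expected demand
    # Poisson CDF evaluated at `spares`
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(spares + 1))
```

For example, with a hypothetical 0.1 failures/year rate, 2 installed units, and a 5-year horizon (expected demand 1.0), two spares give roughly 92% confidence; adding spares raises the confidence toward 1.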
Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2
NASA Technical Reports Server (NTRS)
Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.;
2016-01-01
Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperatures and with high temperatures in combination with elevated atmospheric [CO2]. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.
A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders
NASA Astrophysics Data System (ADS)
Malik, Mashkoor; Lurton, Xavier; Mayer, Larry
2018-06-01
Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally, which hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation and full compensation) for seafloor insonified area, transmission losses and random fluctuations were modeled to estimate their uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing, while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, averaging at least 20 samples is recommended when computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
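The 5.5 dB single-sample figure and the 20-sample averaging recommendation can be checked with a small Monte Carlo sketch (not from the paper; it assumes Rayleigh-amplitude speckle, i.e. exponentially distributed intensity with unit mean):

```python
import math
import random

def backscatter_db_std(n_samples, n_trials=20000, seed=1):
    """Standard deviation (in dB) of the backscatter level obtained by
    averaging n_samples Rayleigh-fluctuating (exponential) intensity samples."""
    rng = random.Random(seed)
    levels = []
    for _ in range(n_trials):
        mean_intensity = sum(rng.expovariate(1.0)
                             for _ in range(n_samples)) / n_samples
        levels.append(10.0 * math.log10(mean_intensity))
    mu = sum(levels) / n_trials
    return math.sqrt(sum((x - mu) ** 2 for x in levels) / (n_trials - 1))
```

A single sample gives a spread of about 5.6 dB, consistent with the 5.5 dB quoted above, while averaging 20 intensity samples before converting to dB brings the spread down to roughly 1 dB.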
Proceedings of the NASA Conference on Space Telerobotics, volume 1
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)
1989-01-01
The theme of the Conference was man-machine collaboration in space. Topics addressed include: redundant manipulators; man-machine systems; telerobot architecture; remote sensing and planning; navigation; neural networks; fundamental AI research; and reasoning under uncertainty.
The δ2H and δ18O of tap water from 349 sites in the United States and selected territories
Coplen, Tyler B.; Landwehr, Jurate M.; Qi, Haiping; Lorenz, Jennifer M.
2013-01-01
The stable isotopic compositions of hydrogen (δ2H) and oxygen (δ18O) of animal (including human) tissues, such as hair, nail, and urine, reflect the δ2H and δ18O of the water and food ingested, and the δ2H and δ18O of environmental waters vary geographically. Accordingly, δ2H and δ18O values of tap water samples collected in 2007-2008 from 349 sites in the United States and three selected U.S. territories were measured in support of forensic science applications, creating one of the largest databases of tap water δ2H and δ18O values to date. The results of replicate isotopic measurements for these tap water samples confirm that the expanded uncertainties (U = 2uc) obtained over a period of years by the Reston Stable Isotope Laboratory from δ2H and δ18O dual-inlet mass spectrometric measurements are conservative, at ±2 ‰ and ±0.2 ‰, respectively. These uncertainties are important because U.S. Geological Survey data may be needed for forensic science applications, including providing evidence in court cases. Halfway through the investigation, an isotope-laser spectrometer was acquired, enabling comparison of dual-inlet isotope-ratio mass spectrometric results with isotope-laser spectrometric results. The uncertainty of the laser-based δ2H measurement results for these tap water samples is comparable to the uncertainty of the mass spectrometric method, with the laser-based method having a slightly lower uncertainty. However, the δ18O uncertainty of the laser-based method is more than a factor of ten higher than that of the dual-inlet isotope-ratio mass spectrometric method.
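The expanded uncertainty U = 2uc quoted above follows the standard GUM convention: independent standard uncertainty components are combined by root-sum-square into uc and then multiplied by a coverage factor k = 2 for approximately 95% coverage. A minimal sketch (the component values in the example are hypothetical, not the laboratory's actual budget):

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties (GUM)."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2):
    """Expanded uncertainty U = k * u_c; k = 2 gives ~95% coverage."""
    return k * combined_standard_uncertainty(components)
```

For instance, hypothetical δ18O components of 0.08 ‰ and 0.06 ‰ combine to uc = 0.1 ‰, giving U = 0.2 ‰, the same order as the expanded uncertainty quoted above.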
Kim, Young-Min; Zhou, Ying; Gao, Yang; ...
2014-11-16
We report that the spatial pattern of the uncertainty in air pollution-related health impacts due to climate change has rarely been studied due to the lack of high-resolution model simulations, especially under the Representative Concentration Pathways (RCPs), the latest greenhouse gas emission pathways. We estimated future tropospheric ozone (O3) and related excess mortality and evaluated the associated uncertainties in the continental United States under RCPs. Based on dynamically downscaled climate model simulations, we calculated changes in O3 level at 12 km resolution between the future (2057 and 2059) and base years (2001-2004) under a low-to-medium emission scenario (RCP4.5) and a fossil fuel intensive emission scenario (RCP8.5). We then estimated the excess mortality attributable to changes in O3. Finally, we analyzed the sensitivity of the excess mortality estimates to the input variables and the uncertainty in the excess mortality estimation using Monte Carlo simulations. O3-related premature deaths in the continental U.S. were estimated to be 1312 deaths/year under RCP8.5 (95% confidence interval (CI): 427 to 2198) and -2118 deaths/year under RCP4.5 (95% CI: -3021 to -1216), when allowing for climate change and emissions reduction. The uncertainty of O3-related excess mortality estimates was mainly caused by RCP emissions pathways. Finally, excess mortality estimates attributable to the combined effect of climate and emission changes on O3, as well as the associated uncertainties, vary substantially in space, and so do the most influential input variables. Spatially resolved data is crucial to develop effective community level mitigation and adaptation policy.
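The Monte Carlo step described above can be sketched as follows. The log-linear concentration-response function is the standard form in air-pollution health impact assessment, but the function and all parameter values here are illustrative assumptions, not the study's actual model or data:

```python
import math
import random

def excess_mortality_ci(baseline_deaths, delta_o3_ppb, beta_mean, beta_se,
                        n_draws=10000, seed=7):
    """Monte Carlo propagation of concentration-response uncertainty through
    a log-linear health impact function: dM = M0 * (1 - exp(-beta * dC)).
    Returns the (2.5th percentile, median, 97.5th percentile) of excess deaths."""
    rng = random.Random(seed)
    draws = sorted(
        baseline_deaths * (1.0 - math.exp(-rng.gauss(beta_mean, beta_se)
                                          * delta_o3_ppb))
        for _ in range(n_draws)
    )
    return (draws[int(0.025 * n_draws)],
            draws[n_draws // 2],
            draws[int(0.975 * n_draws)])
```

Sampling the coefficient beta from its estimated distribution on each draw and reading off percentiles of the resulting excess-death distribution yields confidence intervals of the kind reported above (e.g. a central estimate with a 95% CI).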
Direct heuristic dynamic programming for damping oscillations in a large power system.
Lu, Chao; Si, Jennie; Xie, Xiaorong
2008-08-01
This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation with the presence of nonlinearity, uncertainty, and coupling effect among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static var compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case aims at addressing a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.
Rodriguez, Brian D.; Sweetkind, Don; Burton, Bethany L.
2010-01-01
The U.S. Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office (NSO) are addressing groundwater contamination resulting from historical underground nuclear testing through the Environmental Management program and, in particular, the Underground Test Area (UGTA) project. From 1951 to 1992, 828 underground nuclear tests were conducted at the Nevada Test Site (NTS) northwest of Las Vegas (DOE UGTA, 2003). Most of these tests were conducted hundreds of feet above the groundwater table; however, more than 200 of the tests were near, or within, the water table. This underground testing was limited to specific areas of the NTS, including Pahute Mesa, Rainier Mesa/Shoshone Mountain, Frenchman Flat, and Yucca Flat. Volcanic composite units make up much of the area within the Pahute Mesa Corrective Action Unit (CAU) at the NTS, Nevada. The extent of many of these volcanic composite units extends throughout and south of the primary areas of past underground testing at Pahute and Rainier Mesas. As situated, these units likely influence the rate and direction of groundwater flow and radionuclide transport. Currently, these units are poorly resolved in terms of their hydrologic properties, introducing large uncertainties into current CAU-scale flow and transport models. In 2007, the U.S. Geological Survey (USGS), in cooperation with DOE and NNSA-NSO, acquired three-dimensional (3-D) tensor magnetotelluric data at the NTS in Area 20 of the Pahute Mesa CAU. A total of 20 magnetotelluric recording stations were established at about 600-m spacing on a 3-D array and were tied to well ER20-6 and other nearby well control (fig. 1). 
The purpose of this survey was to determine if closely spaced 3-D resistivity measurements can be used to characterize the distribution of shallow (600- to 1,500-m-depth range) devitrified rhyolite lava-flow aquifers (LFA) and zeolitic tuff confining units (TCU) in areas of limited drill hole control on Pahute Mesa within the Calico Hills zeolitic volcanic composite unit (VCU), an important hydrostratigraphic unit in Area 20. The resistivity response was evaluated and compared with existing well data and hydrogeologic unit tops from the current Pahute Mesa framework model. In 2008, the USGS processed and inverted the magnetotelluric data into a 3-D resistivity model. We interpreted nine depth slices and four west-east profile cross sections of the 3-D resistivity inversion model. This report documents the geologic interpretation of the 3-D resistivity model. Expectations are that spatial variations in the electrical properties of the Calico Hills zeolitic VCU can be detected and mapped with 3-D resistivity, and that these changes correlate to differences in rock permeability. With regard to LFA and TCU, electrical resistivity and permeability are typically related. Tuff confining units will typically have low electrical resistivity and low permeability, whereas LFA will have higher electrical resistivity and zones of higher fracture-related permeability. If expectations are shown to be correct, the method can be utilized by the UGTA scientists to refine the hydrostratigraphic unit (HSU) framework in an effort to more accurately predict radionuclide transport away from test areas on Pahute and Rainier Mesas.
A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty
NASA Astrophysics Data System (ADS)
Vezer, M. A.
2009-12-01
The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominantly to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4’s Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4’s Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. 
The investigation begins (§2) with an explanation of background considerations relevant to the IPCC and its use of uncertainty expressions. It then (§3) outlines some general philosophical worries regarding vague expressions and (§4) relates those worries to the AR4 and its method of dealing with them, which is a subjective Bayesian probability analysis. The next phase of the paper (§5) examines the notions of ‘objective’ and ‘subjective’ probability interpretations and compares the IPCC’s subjective Bayesian strategy with a frequentist approach. It then (§6) addresses objections to that methodology, and concludes (§7) that those objections are wrongheaded.
Controlling quantum memory-assisted entropic uncertainty in non-Markovian environments
NASA Astrophysics Data System (ADS)
Zhang, Yanliang; Fang, Maofa; Kang, Guodong; Zhou, Qingping
2018-03-01
The quantum memory-assisted entropic uncertainty relation (QMA EUR) shows that the lower bound of Maassen and Uffink's entropic uncertainty relation (derived without quantum memory) can be beaten. In this paper, we investigated the dynamical features of the QMA EUR in Markovian and non-Markovian dissipative environments. We find that the dynamics of the QMA EUR are oscillatory in a non-Markovian environment, and that strong interaction is favorable for suppressing the amount of entropic uncertainty. Furthermore, we present two schemes, based on prior weak measurement and posterior weak measurement reversal, to control the amount of entropic uncertainty of Pauli observables in dissipative environments. The numerical results show that the prior weak measurement can effectively reduce the peak values of the QMA EUR dynamics in a non-Markovian environment for long periods of time, but has no effect on the minima of the dynamics; the posterior weak measurement reversal has the opposite effect. Moreover, the success probability depends entirely on the quantum measurement strength. We hope that our proposal could be verified experimentally and might possibly have future applications in quantum information processing.
NASA Astrophysics Data System (ADS)
Mulholland, Jonathan; NBL3 Collaboration
2014-09-01
The decay of the free neutron is the prototypical charged current semi-leptonic weak process. A precise value for the neutron lifetime is required for consistency tests of the Standard Model and is needed to predict the primordial He4 abundance from the theory of Big Bang Nucleosynthesis. Plans are being made for an in-beam measurement of the neutron lifetime with an anticipated uncertainty of 0.3 s or better. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Advances in neutron fluence measurement, used to provide the best existing in-beam determination of the neutron lifetime, as well as new silicon detector technology, in use now at LANSCE, address the two largest contributors to the uncertainty of in-beam measurements: the statistical uncertainty associated with proton counting and the systematic uncertainty in the neutron fluence measurement. The experimental design and projected uncertainties for the 0.3 s measurement will be discussed.
Robust gaze-steering of an active vision system against errors in the estimated parameters
NASA Astrophysics Data System (ADS)
Han, Youngmo
2015-01-01
Gaze-steering is often used to broaden the viewing range of an active vision system. Gaze-steering procedures are usually based on estimated parameters such as image position, image velocity, depth and camera calibration parameters. However, there may be uncertainties in these estimated parameters because of measurement noise and estimation errors. In this case, robust gaze-steering cannot be guaranteed. To compensate for such problems, this paper proposes a gaze-steering method based on a linear matrix inequality (LMI). In this method, we first propose a proportional derivative (PD) control scheme on the unit sphere that does not use depth parameters. This proposed PD control scheme can avoid uncertainties in the estimated depth and camera calibration parameters, as well as inconveniences in their estimation process, including the use of auxiliary feature points and highly non-linear computation. Furthermore, the control gain of the proposed PD control scheme on the unit sphere is designed using LMI such that the designed control is robust in the presence of uncertainties in the other estimated parameters, such as image position and velocity. Simulation results demonstrate that the proposed method provides a better compensation for uncertainties in the estimated parameters than the contemporary linear method and steers the gaze of the camera more steadily over time than the contemporary non-linear method.
Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A
2015-09-01
Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.
Belote, R Travis; Carroll, Carlos; Martinuzzi, Sebastián; Michalak, Julia; Williams, John W; Williamson, Matthew A; Aplet, Gregory H
2018-06-21
Addressing uncertainties in climate vulnerability remains a challenge for conservation planning. We evaluate how confidence in conservation recommendations may change with agreement among alternative climate projections and metrics of climate exposure. We assessed agreement among three multivariate estimates of climate exposure (forward velocity, backward velocity, and climate dissimilarity) using 18 alternative climate projections for the contiguous United States. For each metric, we classified maps into quartiles for each alternative climate projection, and calculated the frequency of quartiles assigned for each gridded location (high quartile frequency = more agreement among climate projections). We evaluated recommendations using a recent climate adaptation heuristic framework that recommends emphasizing various conservation strategies for land based on current conservation value and expected climate exposure. We found that areas where conservation strategies would be confidently assigned based on high agreement among climate projections varied substantially across regions. In general, there was more agreement in forward and backward velocity estimates among alternative projections than agreement in estimates of local dissimilarity. Consensus of climate projections resulted in the same conservation recommendation assignments in a few areas, but patterns varied by climate exposure metric. This work demonstrates an approach for explicitly evaluating alternative predictions in geographic patterns of climate change.
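The quartile-agreement calculation described above can be sketched in a few lines; the function names and thresholds here are illustrative, not the study's code:

```python
from collections import Counter

def quartile_bin(value, thresholds):
    """Assign a value to quartile 1-4 given the three quartile
    thresholds (q25, q50, q75) of the exposure metric."""
    return 1 + sum(value > t for t in thresholds)

def quartile_agreement(quartile_assignments):
    """Fraction of alternative projections agreeing on the modal
    quartile at one gridded location (1.0 = full agreement)."""
    counts = Counter(quartile_assignments)
    return counts.most_common(1)[0][1] / len(quartile_assignments)
```

For one location, each of the 18 projections yields an exposure value that is binned into a quartile; the frequency of the most common quartile then measures how confidently a conservation strategy could be assigned there.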
Rapid evolution in lekking grouse: Implications for taxonomic definitions
Oyler-McCance, Sara J.; St. John, Judy; Quinn, Thomas W.
2010-01-01
Species and subspecies delineations were traditionally defined by morphological and behavioral traits, as well as by plumage characteristics. Molecular genetic data have more recently been used to assess these classifications and, in many cases, to redefine them. The recent practice of utilizing molecular genetic data to examine taxonomic questions has led some to suggest that molecular genetic methods are more appropriate than traditional methods for addressing taxonomic uncertainty and management units. We compared the North American Tetraoninae—which have been defined using plumage, morphology, and behavior—and considered the effects of redefinition using only neutral molecular genetic data (mitochondrial control region and cytochrome oxidase subunit 1). Using the criterion of reciprocal monophyly, we failed to recognize the five species whose mating system is highly polygynous, with males displaying on leks. In lek-breeding species, sexual selection can act to influence morphological and behavioral traits at a rate much faster than can be tracked genetically. Thus, we suggest that at least for lek-breeding species, it is important to recognize the possibility that morphological and behavioral changes may occur at an accelerated rate compared with the processes that led to reciprocal monophyly of putatively neutral genetic markers. Therefore, it is particularly important to consider the possible disconnect between such lines of evidence when making taxonomic revisions and definitions of management units.
Scale-Free Networks and Commercial Air Carrier Transportation in the United States
NASA Technical Reports Server (NTRS)
Conway, Sheila R.
2004-01-01
Network science, or the art of describing system structure, may be useful for the analysis and control of large, complex systems. For example, networks exhibiting scale-free structure have been found to be particularly well suited to deal with environmental uncertainty and large demand growth. The National Airspace System may be, at least in part, a scalable network. In fact, the hub-and-spoke structure of the commercial segment of the NAS is an often-cited example of an existing scale-free network. After reviewing the nature and attributes of scale-free networks, this assertion is put to the test: is commercial air carrier transportation in the United States well explained by this model? If so, are the positive attributes of these networks, e.g. those of efficiency, flexibility and robustness, fully realized, or could we effect substantial improvement? This paper first outlines attributes of various network types, then looks more closely at the common carrier air transportation network from the perspectives of the traveler, the airlines, and Air Traffic Control (ATC). Network models are applied within each paradigm, including discussion of the implied strengths and weaknesses of each model. Finally, known limitations of scalable networks are discussed. With an eye towards NAS operations, utilizing the strengths and avoiding the weaknesses of scale-free networks are addressed.
Homayoon, D; Dahlhoff, P; Augustin, M
2017-12-15
Uncertainty regarding the suitable amount of prescribed ointment and its application by patients may cause insufficient or uneconomical health care provision. To address this issue, standardized methods and expert knowledge on the suitable amount of topicals, together with coherent patient guidance on their application, are needed. We present current data from routine care and scientific evidence on the prescribed amount of topical agents and their application by patients in dermatological care. A literature review was conducted via PubMed using the following keywords as individual and pooled search terms: "local therapy", "topical treatment", "prescription", "amount of ointment needed", "involved area", "BSA", "finger-tip-unit", "Rule of Hand", "calculated dosage" and "rule of nines". We included original studies by manually screening titles and abstracts for relevance. The search strategy identified 19 clinical trials. The fingertip unit (FTU) is the most frequently used measure for accurate application of external agents. The appropriate prescribed amount is calculated from the amount of topical agent required per unit of involved surface area. It remains unclear to what extent the optimal amount of ointment is prescribed, and what advice on its application is given, in routine care. The FTU combined with the "Rule of Hand" is an adequate measure for guiding patients in self-application.
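The "amount per involved surface area" arithmetic can be sketched as follows. The constants are standard textbook values, not taken from this review: 1 FTU is commonly taken as about 0.5 g of ointment, covering roughly two adult hand areas, i.e. about 2% of body surface area (BSA).

```python
def ointment_grams_needed(bsa_percent_involved, applications_per_day, days,
                          grams_per_ftu=0.5, bsa_percent_per_ftu=2.0):
    """Estimate total grams of topical agent to prescribe, using fingertip
    units (FTU). Defaults encode the common rule of thumb: 1 FTU ~ 0.5 g,
    covering ~2% BSA (two adult hand areas)."""
    ftus_per_application = bsa_percent_involved / bsa_percent_per_ftu
    return ftus_per_application * grams_per_ftu * applications_per_day * days
```

For example, a hypothetical patient with 10% BSA involved, applying twice daily for 14 days, would need 5 FTU (2.5 g) per application, or about 70 g in total, which is the kind of calculation that could standardize prescribed amounts in routine care.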
Spooner, Kiara K; Salemi, Jason L; Salihu, Hamisu M; Zoorob, Roger J
2016-05-01
This study aimed to describe disparities and temporal trends in the level of perceived patient-provider communication quality (PPPCQ) in the United States, and to identify sociodemographic and health-related factors associated with elements of PPPCQ. A cross-sectional analysis was conducted using nationally-representative data from the 2011-2013 iterations of the Health Information National Trends Survey (HINTS). Descriptive statistics, multivariable linear and logistic regression analyses were conducted to examine associations. PPPCQ scores, the composite measure of patients' ratings of communication quality, were positive overall (82.8; 95% CI: 82.1-83.5). However, less than half (42-46%) of respondents perceived that providers always addressed their feelings, spent enough time with them, or helped with feelings of uncertainty about their health. Older adults and those with a regular provider consistently had higher PPPCQ scores, while those with poorer perceived general health were consistently less likely to have positive perceptions of their providers' communication behaviors. Disparities in PPPCQ can be attributed to patients' age, race/ethnicity, educational attainment, employment status, income, healthcare access and general health. These findings may inform educational and policy efforts which aim to improve patient-provider communication, enhance the quality of care, and reduce health disparities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Allen, Peg; Jacob, Rebekah R; Lakshman, Meenakshi; Best, Leslie A; Bass, Kathryn; Brownson, Ross C
2018-03-02
Evidence-based public health (EBPH) practice, also called evidence-informed public health, can improve population health and reduce disease burden in populations. Organizational structures and processes can facilitate capacity-building for EBPH in public health agencies. This study involved 51 structured interviews with leaders and program managers in 12 state health department chronic disease prevention units to identify factors that facilitate the implementation of EBPH. Verbatim transcripts of the de-identified interviews were consensus coded in NVIVO qualitative software. Content analyses of coded texts were used to identify themes and illustrative quotes. Facilitator themes included leadership support within the chronic disease prevention unit and division, unit processes to enhance information sharing across program areas and recruitment and retention of qualified personnel, training and technical assistance to build skills, and the ability to provide support to external partners. Chronic disease prevention leaders' role modeling of EBPH processes and expectations for staff to justify proposed plans and approaches were key aspects of leadership support. Leaders protected staff time in order to identify and digest evidence to address the common barrier of lack of time for EBPH. Funding uncertainties or budget cuts, lack of political will for EBPH, and staff turnover remained challenges. In conclusion, leadership support is a key facilitator of EBPH capacity building and practice. Section and division leaders in public health agencies with authority and skills can institute management practices to help staff learn and apply EBPH processes and spread EBPH with partners.
Autonomy gone awry: a cross-cultural study of parents' experiences in neonatal intensive care units.
Orfali, Kristina; Gordon, Elisa J
2004-01-01
This paper examines parents' experiences of medical decision-making and coping with having a critically ill baby in the Neonatal Intensive Care Unit (NICU) from a cross-cultural perspective (France vs. U.S.A.). Though parents' experiences in the NICU were very similar despite cultural and institutional differences, each system addresses their needs in a different way. Interviews with parents show that French parents expressed overall higher satisfaction with the care of their babies and were better able to cope with the loss of their child than American parents. Central to the French parents' perception of autonomy and their sense of satisfaction were the strong doctor-patient relationship, the emphasis on medical certainty in prognosis versus uncertainty in the American context, and the "sentimental work" provided by the team. The American setting, characterized by respect for parental autonomy, did not necessarily translate into full parental involvement in decision-making, and it limited the rapport between doctors and parents to the extent of parental isolation. This empirical comparative approach fosters a much-needed critique of philosophical principles by underscoring, from the parents' perspective, the lack of "emotional work" involved in the practice of autonomy in the American unit compared to the paternalistic European context. Beyond theoretical and ethical arguments, we must reconsider the practice of autonomy in particularly stressful situations by providing more specific means to cope, translating the impersonal language of "rights" and decision-making into trusting, caring relationships, and sharing the responsibility for making tragic choices.
NASA Astrophysics Data System (ADS)
Reyes, J.; Vizuete, W.; Serre, M. L.; Xu, Y.
2015-12-01
The EPA employs a vast monitoring network to measure ambient PM2.5 concentrations across the United States, with one of its goals being to quantify exposure within the population. However, several areas of the country have sparse monitoring both spatially and temporally. One means to fill in these monitoring gaps is to use PM2.5 estimates from Chemical Transport Models (CTMs), specifically the Community Multi-scale Air Quality (CMAQ) model. CMAQ provides complete spatial coverage but is subject to systematic and random error due to model uncertainty. Because CMAQ is deterministic, these uncertainties are often not quantified. Much effort has gone into quantifying the efficacy of these models through different metrics of model performance, but evaluation is currently restricted to locations with observed data. Multiyear studies across the United States are challenging because the error and model performance of CMAQ are not uniform over such large space/time domains: error changes regionally and temporally, and because of the complex mix of species that constitute PM2.5, CMAQ error is also a function of PM2.5 concentration. To address this issue we introduce a model performance evaluation for PM2.5 CMAQ that is regionalized and non-linear, leading to error quantification for each CMAQ grid cell so that areas and time periods of error are better characterized. The regionalized error correction approach is non-linear and therefore more flexible at characterizing model performance than approaches that rely on linearity assumptions and assume homoscedasticity of CMAQ prediction errors. Corrected CMAQ data are then incorporated into the modern geostatistical framework of Bayesian Maximum Entropy (BME). Cross validation shows that incorporating error-corrected CMAQ data leads to more accurate estimates than using observed data alone.
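A toy version of a concentration-dependent (non-linear) error correction can be sketched as follows; the synthetic "observed" and "modeled" values and the binned-bias scheme are illustrative assumptions for one region, not the paper's actual method:

```python
import random
import statistics

def binned_bias_correction(obs, modeled, n_bins=4):
    """Estimate the mean model bias within concentration bins, so that the
    correction varies non-linearly with the PM2.5 level."""
    lo, hi = min(modeled), max(modeled)
    width = (hi - lo) / n_bins
    bias = []
    for b in range(n_bins):
        in_bin = [o - m for o, m in zip(obs, modeled)
                  if b * width <= m - lo < (b + 1) * width
                  or (b == n_bins - 1 and m == hi)]
        bias.append(statistics.mean(in_bin))
    def correct(m):
        b = min(int((m - lo) / width), n_bins - 1)
        return m + bias[b]
    return correct

# Synthetic paired data at monitors: model bias grows with concentration
random.seed(0)
modeled = [random.uniform(5, 40) for _ in range(300)]   # ug/m^3
obs = [m + 0.02 * (m - 10) ** 2 + random.gauss(0, 1.5) for m in modeled]

correct = binned_bias_correction(obs, modeled)
raw_mae = statistics.mean(abs(o - m) for o, m in zip(obs, modeled))
corrected_mae = statistics.mean(abs(o - correct(m)) for o, m in zip(obs, modeled))
print(corrected_mae < raw_mae)  # the correction should shrink the typical error
```

The corrected values, rather than the raw CMAQ output, would then feed a downstream geostatistical estimator such as BME.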
Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models
NASA Astrophysics Data System (ADS)
Wellmann, J. Florian; Regenauer-Lieb, Klaus
2012-03-01
Analyzing, visualizing and communicating uncertainties are important issues, as geological models can never be fully determined. To date, no general approach exists to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 1950s and assigns to every location in the model a scalar value that quantifies predictability. We show that this method not only provides quantitative insight into model uncertainties but, owing to the underlying concept of information entropy, can also be related to questions of data integration (i.e. how model quality is interconnected with the input data used) and model evolution (i.e. whether new data, or a changed geological hypothesis, optimize the model). In other words, information entropy is a powerful measure for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; and (c) a mean entropy for the whole model, important for tracking model changes with one overall measure. These results cannot easily be obtained with existing standard methods. They suggest that information entropy is a powerful method to visualize uncertainties in geological models and to quantitatively classify the indefiniteness of single units and the mean entropy of a model. Because this measure relates to the missing information, we expect the method to have great potential in many types of geoscientific data assimilation problems, beyond pure visualization.
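As a sketch of how the entropy measure works, the per-cell computation takes only a few lines; the unit-membership probabilities below are invented for illustration, not taken from the paper's models:

```python
from math import log

def cell_entropy(probs):
    """Information entropy (in bits) of one model cell, given the
    probabilities that the cell belongs to each geological unit."""
    s = sum(p * log(p, 2) for p in probs if p > 0)
    return -s if s < 0 else 0.0

# Hypothetical column of cells with three candidate units
column = [
    [1.0, 0.0, 0.0],    # fully determined cell
    [0.5, 0.5, 0.0],    # two equally likely units
    [1/3, 1/3, 1/3],    # maximally uncertain cell
]
entropies = [cell_entropy(p) for p in column]
mean_entropy = sum(entropies) / len(entropies)  # whole-model summary
print([round(h, 3) for h in entropies])  # -> [0.0, 1.0, 1.585]
```

Mapping the per-cell values highlights uncertain regions, while the mean entropy gives the single overall measure mentioned in the abstract.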
Peters, Clark M; Sherraden, Margaret; Kuchinski, Ann Marie
2016-10-01
The study reported in this article explores the role child welfare workers play in elevating the financial capability (FC) of foster youths transitioning to adulthood. It draws on an examination of Opportunity Passport, a component of the Jim Casey Youth Opportunities Initiative, which operates across the United States. The authors held in-depth, structured interviews with eight staff and 38 current and former foster youths age 18 years and older in four sites across three states. Findings indicate that (a) program participants require professional financial assistance that is beyond the role of the traditional child welfare caseworker; (b) caseworkers who address FC in young adults face uncertainty in their roles; and (c) broader policies relevant to young adults transitioning to adulthood exhibit tension, if not conflict, regarding enhancing FC. The authors highlight the importance of expanding the role of caseworkers to incorporate elements of FC in serving the needs of foster youths.
NASA Technical Reports Server (NTRS)
Weber, William J., III; Gray, Valerie W.; Jackson, Byron; Steele, Laura C.
1991-01-01
This paper discusses the systems approach taken by NASA and the Jet Propulsion Laboratory in the commercialization of land-mobile satellite services (LMSS) in the United States. As the lead center for NASA's Mobile Satellite Program, JPL was involved in identifying and addressing many of the key barriers to commercialization of mobile satellite communications, including technical, economic, regulatory and institutional risks, or uncertainties. The systems engineering approach described here was used to mitigate these risks. The result was the development and implementation of the JPL Mobile Satellite Experiment Project. This Project included not only technology development, but also studies to support NASA in the definition of the regulatory, market, and investment environments within which LMSS would evolve and eventually operate, as well as initiatives to mitigate their associated commercialization risks. The end result of these government-led endeavors was the acceleration of the introduction of commercial mobile satellite services, both nationally and internationally.
Oppenheimer, Gerald M.; Benrubi, I. Daniel
2014-01-01
For decades, public health advocates have confronted industry over dietary policy, their debates focusing on how to address evidentiary uncertainty. In 1977, enough consensus existed among epidemiologists that the Senate Select Committee on Nutrition and Human Need used the diet–heart association to perform an extraordinary act: advocate dietary goals for a healthier diet. During its hearings, the meat industry tested that consensus. In one year, the committee produced two editions of its Dietary Goals for the United States, the second containing a conciliatory statement about coronary heart disease and meat consumption. Critics have characterized the revision as a surrender to special interests. But the senators faced issues for which they were professionally unprepared: conflicts within science over the interpretation of data and notions of proof. Ultimately, it was lack of scientific consensus on these factors, not simply political acquiescence, that allowed special interests to secure changes in the guidelines. PMID:24228658
Noise in pressure transducer readings produced by variations in solar radiation
Cain, S. F.; Davis, G.A.; Loheide, Steven P.; Butler, J.J.
2004-01-01
Variations in solar radiation can produce noise in readings from gauge pressure transducers when the transducer cable is exposed to direct sunlight. This noise is a result of insolation-induced heating and cooling of the air column in the vent tube of the transducer cable. A controlled experiment was performed to assess the impact of variations in solar radiation on transducer readings. This experiment demonstrated that insolation-induced fluctuations in apparent pressure head can be as large as 0.03 m. The magnitude of these fluctuations is dependent on cable color, the diameter of the vent tube, and the length of the transducer cable. The most effective means of minimizing insolation-induced noise is to use integrated transducer-data logger units that fit within a well. Failure to address this source of noise can introduce considerable uncertainty into analyses of hydraulic tests when the head change is relatively small, as is often the case for tests in highly permeable aquifers or for tests using distant observation wells.
Hell and High Water: Practice-Relevant Adaptation Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moss, Richard H.; Meehl, G.; Lemos, Maria Carmen
2013-11-08
Recent extreme weather such as Hurricane Sandy and the 2012 drought demonstrate the vulnerability of the United States to climate extremes in the present and point to the potential for increased future damages under a changing climate. They also provide lessons for reducing harm and realizing any potential benefits. Preparedness measures – also referred to as adaptation – can cost-effectively increase resilience today and in the future. The upfront costs will be more than offset by reductions in property damage, lives and livelihoods lost, and expensive post-disaster recovery processes. While others have addressed use of science for adaptation in specific sectors including biodiversity (Heller and Zavaleta, 2009) and freshwater ecosystem management (Wilby et al., 2010), or by simply taking a more pragmatic approach to adaptation under uncertainty (Hallegatte, 2009), here the authors make the case that a new, comprehensive approach is needed to create and use science to inform adaptations with applicable and sound knowledge (Kerr et al., 2011).
Entering the New Millennium: Dilemmas in Arms Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
BROWN,JAMES
The end of the Cold War finds the international community no longer divided into two opposing blocks. The concerns that the community now faces are becoming more fluid, less focused, and, in many ways, much less predictable. Issues of religion, ethnicity, and nationalism; the possible proliferation of Weapons of Mass Destruction; and the diffusion of technology and information processing throughout the world community have greatly changed the international security landscape in the last decade. Although our challenges appear formidable, the United Nations, State Parties, nongovernmental organizations, and the arms control community are moving to address and lessen these concerns through both formal and informal efforts. Many of the multilateral agreements (e.g., NPT, BWC, CWC, CTBT, MTCR), as well as the bilateral efforts that are taking place between Washington and Moscow employ confidence-building and transparency measures. These measures along with on-site inspection and other verification procedures lessen suspicion and distrust and reduce uncertainty, thus enhancing stability, confidence, and cooperation.
Lee, Duncan; Mukhopadhyay, Sabyasachi; Rushworth, Alastair; Sahu, Sujit K
2017-04-01
In the United Kingdom, air pollution is linked to around 40000 premature deaths each year, but estimating its health effects is challenging in a spatio-temporal study. The challenges include spatial misalignment between the pollution and disease data; uncertainty in the estimated pollution surface; and complex residual spatio-temporal autocorrelation in the disease data. This article develops a two-stage model that addresses these issues. The first stage is a spatio-temporal fusion model linking modeled and measured pollution data, while the second stage links these predictions to the disease data. The methodology is motivated by a new five-year study investigating the effects of multiple pollutants on respiratory hospitalizations in England between 2007 and 2011, using pollution and disease data relating to local and unitary authorities on a monthly time scale. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Life cycle evaluation of emerging lignocellulosic ethanol conversion technologies.
Spatari, Sabrina; Bagley, David M; MacLean, Heather L
2010-01-01
Lignocellulosic ethanol holds promise for addressing the climate change and energy security issues associated with personal transportation by lowering the fuel mix's carbon intensity and petroleum demand. We compare the technological features and life cycle environmental impacts of near- and mid-term ethanol bioconversion technologies in the United States. Key uncertainties in the major processes (pre-treatment, hydrolysis, and fermentation) are evaluated. The potential to reduce fossil energy use and greenhouse gas (GHG) emissions varies among bioconversion processes, although all options studied are considerably more attractive than gasoline. Anticipated future performance is found to be considerably more attractive than that reported in the literature as achieved to date. Electricity co-product credits are important in characterizing the GHG impacts of different ethanol production pathways; however, in the absence of near-term liquid transportation fuel alternatives to gasoline, optimizing ethanol facilities to produce ethanol (as opposed to co-products) is important for reducing the carbon intensity of the road transportation sector and for energy security.
Bayesian operational modal analysis with asynchronous data, Part II: Posterior uncertainty
NASA Astrophysics Data System (ADS)
Zhu, Yi-Chen; Au, Siu-Kui
2018-01-01
A Bayesian modal identification method has been proposed in the companion paper that allows the most probable values of modal parameters to be determined using asynchronous ambient vibration data. This paper investigates the identification uncertainty of modal parameters in terms of their posterior covariance matrix. Computational issues are addressed. Analytical expressions are derived to allow the posterior covariance matrix to be evaluated accurately and efficiently. Synthetic, laboratory and field data examples are presented to verify the consistency, investigate potential modelling error and demonstrate practical applications.
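The idea of recovering posterior uncertainty from the curvature of the log-posterior at the most probable value can be sketched numerically; the quadratic negative log-posterior below is a made-up one-parameter stand-in for the paper's modal-parameter posterior, not its actual analytical expressions:

```python
def neg_log_posterior(theta):
    # Hypothetical negative log-posterior with its minimum (the most
    # probable value) at theta = 2.0 and a known curvature of 1/0.5**2
    return 0.5 * ((theta - 2.0) / 0.5) ** 2

def posterior_variance(f, theta_mpv, h=1e-4):
    """Second-difference estimate of the Hessian at the most probable
    value; its inverse approximates the posterior variance."""
    hess = (f(theta_mpv + h) - 2 * f(theta_mpv) + f(theta_mpv - h)) / h ** 2
    return 1.0 / hess

var = posterior_variance(neg_log_posterior, 2.0)
print(round(var, 4))  # ~0.25, i.e. a posterior standard deviation of ~0.5
```

The paper's contribution is precisely to replace such brute-force differencing with analytical expressions that evaluate the covariance matrix accurately and efficiently for many parameters.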
Evaluation of the BioVapor Model
The BioVapor model addresses transport and biodegradation of petroleum vapors in the subsurface. This presentation describes basic background on the nature and scientific basis of environmental transport models. It then describes a series of parameter uncertainty runs of the Bi...
Probabilistic population projections with migration uncertainty
Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
Landscape change in the southern Piedmont: challenges, solutions, and uncertainty across scales
Conroy, M.J.; Allen, Craig R.; Peterson, J.T.; Pritchard, L.J.; Moore, C.T.
2003-01-01
The southern Piedmont of the southeastern United States epitomizes the complex and seemingly intractable problems and hard decisions that result from uncontrolled urban and suburban sprawl. Here we consider three recurrent themes in complicated problems involving complex systems: (1) scale dependencies and cross-scale, often nonlinear relationships; (2) resilience, in particular the potential for complex systems to move to alternate stable states with decreased ecological and/or economic value; and (3) uncertainty in the ability to understand and predict outcomes, perhaps particularly those that occur as a result of human impacts. We consider these issues in the context of landscape-level decision making, using as an example water resources and lotic systems in the Piedmont region of the southeastern United States.
USDA-ARS?s Scientific Manuscript database
Few studies have attempted to quantify mass balances of both pesticides and degradates in multiple agricultural settings of the United States. We used inverse modeling to calibrate the Root Zone Water Quality Model (RZWQM) for predicting the unsaturated-zone transport and fate of metolachlor, metola...
Assessment of grassland ecosystem conditions in the Southwestern United States. Vol. 1
Deborah M. Finch
2004-01-01
This report is volume 1 of a two-volume ecological assessment of grassland ecosystems in the Southwestern United States. Broadscale assessments are syntheses of current scientific knowledge, including a description of uncertainties and assumptions, to provide a characterization and comprehensive description of ecological, social, and economic components within an...
Leadership Development in Governments of the United Arab Emirates: Re-Framing a Wicked Problem
ERIC Educational Resources Information Center
Mathias, Megan
2017-01-01
Developing the next generation of leaders in government is seen as a strategic challenge of national importance in the United Arab Emirates (UAE). This article examines the wicked nature of the UAE's leadership development challenge, identifying patterns of complexity, uncertainty, and divergence in the strategic intentions underlying current…
Effects of Phasor Measurement Uncertainty on Power Line Outage Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chen; Wang, Jianhui; Zhu, Hao
2014-12-01
Phasor measurement unit (PMU) technology provides an effective tool to enhance the wide-area monitoring systems (WAMSs) in power grids. Although extensive studies have been conducted to develop several PMU applications in power systems (e.g., state estimation, oscillation detection and control, voltage stability analysis, and line outage detection), the uncertainty aspects of PMUs have not been adequately investigated. This paper focuses on quantifying the impact of PMU uncertainty on power line outage detection and identification, in which a limited number of PMUs installed at a subset of buses are utilized to detect and identify the line outage events. Specifically, the line outage detection problem is formulated as a multi-hypothesis test, and a general Bayesian criterion is used for the detection procedure, in which the PMU uncertainty is analytically characterized. We further apply the minimum detection error criterion for the multi-hypothesis test and derive the expected detection error probability in terms of PMU uncertainty. The framework proposed provides fundamental guidance for quantifying the effects of PMU uncertainty on power line outage detection. Case studies are provided to validate our analysis and show how PMU uncertainty influences power line outage detection.
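A stripped-down multi-hypothesis test of this kind might look as follows; the angle "signatures", the noise level, and the uniform priors are invented for illustration and stand in for the paper's analytically characterized PMU uncertainty:

```python
from math import exp

def detect_outage(measured, signatures, sigma, priors):
    """Pick the most probable outage hypothesis from noisy PMU angle
    measurements, using a Gaussian likelihood for each hypothesis."""
    posts = []
    for sig, prior in zip(signatures, priors):
        # Squared error of the measurement against this hypothesis' signature
        se = sum((m - s) ** 2 for m, s in zip(measured, sig))
        posts.append(prior * exp(-se / (2 * sigma ** 2)))
    z = sum(posts)
    posts = [p / z for p in posts]          # normalize to a posterior
    return max(range(len(posts)), key=posts.__getitem__), posts

# Hypothetical angle signatures (radians) at two PMU buses for three
# hypotheses: no outage, line 1 out, line 2 out
signatures = [[0.00, 0.00], [0.12, -0.05], [-0.08, 0.10]]
measured = [0.11, -0.04]                    # noisy observation
best, posts = detect_outage(measured, signatures, sigma=0.03, priors=[1/3] * 3)
print(best)  # -> 1 (line 1 outage is most probable)
```

Growing the measurement noise sigma flattens the posterior, which is exactly how PMU uncertainty raises the expected detection error probability.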
Knotts, Thomas A.
2017-01-01
Molecular simulation has the ability to predict various physical properties that are difficult to obtain experimentally. For example, we implement molecular simulation to predict the critical constants (i.e., critical temperature, critical density, critical pressure, and critical compressibility factor) for large n-alkanes that thermally decompose experimentally (as large as C48). Historically, molecular simulation has been viewed as a tool that is limited to providing qualitative insight. One key reason for this perceived weakness in molecular simulation is the difficulty to quantify the uncertainty in the results. This is because molecular simulations have many sources of uncertainty that propagate and are difficult to quantify. We investigate one of the most important sources of uncertainty, namely, the intermolecular force field parameters. Specifically, we quantify the uncertainty in the Lennard-Jones (LJ) 12-6 parameters for the CH4, CH3, and CH2 united-atom interaction sites. We then demonstrate how the uncertainties in the parameters lead to uncertainties in the saturated liquid density and critical constant values obtained from Gibbs Ensemble Monte Carlo simulation. Our results suggest that the uncertainties attributed to the LJ 12-6 parameters are small enough that quantitatively useful estimates of the saturated liquid density and the critical constants can be obtained from molecular simulation. PMID:28527455
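Propagating uncertainty in the LJ 12-6 parameters to a derived quantity can be sketched with a simple Monte Carlo draw; the parameter estimates and standard uncertainties below are illustrative numbers for a CH2-like united-atom site, not the fitted values from the study:

```python
import random
import statistics

def lj(r, eps, sigma):
    """Lennard-Jones 12-6 potential energy at separation r."""
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 ** 2 - sr6)

# Illustrative parameter estimates and standard uncertainties
random.seed(1)
eps0, u_eps = 0.38, 0.01      # well depth, kJ/mol
sig0, u_sig = 0.395, 0.002    # size parameter, nm

# Sample the parameters and propagate to the energy at r = 0.45 nm
samples = [lj(0.45, random.gauss(eps0, u_eps), random.gauss(sig0, u_sig))
           for _ in range(20000)]
mean_u = statistics.mean(samples)
std_u = statistics.stdev(samples)   # propagated uncertainty in the energy
print(round(mean_u, 3), round(std_u, 3))
```

In the study the propagated quantities are instead the saturated liquid density and critical constants from Gibbs Ensemble Monte Carlo, but the parameter-sampling step is the same in spirit.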
The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty
NASA Astrophysics Data System (ADS)
Possolo, Antonio
2016-11-01
NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than that of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example in molecular biology, greenhouse gas and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable for reducing data from interlaboratory studies, combining measurement results for the same measurand obtained by different methods, and characterizing the uncertainty of the calibration and analysis functions used in the measurement of force, temperature, or the composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.
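The Monte Carlo style of uncertainty evaluation that tools like the NIST Uncertainty Machine automate can be sketched by hand for a simple measurement model; the Ohm's-law model and the input estimates and uncertainties here are a generic textbook example, not taken from the Guide:

```python
import random
import statistics

# Measurement model: resistance from Ohm's law, R = V / I, with
# hypothetical input estimates and standard uncertainties
random.seed(42)
V0, u_V = 10.0, 0.05    # volts
I0, u_I = 2.0, 0.02     # amperes

# Draw inputs from their assumed distributions and propagate through the model
draws = sorted(random.gauss(V0, u_V) / random.gauss(I0, u_I)
               for _ in range(100000))
R = statistics.mean(draws)
u_R = statistics.stdev(draws)          # standard uncertainty of R
lo, hi = draws[2500], draws[97500]     # approximate 95% coverage interval
print(round(R, 3), round(u_R, 3))
```

The same recipe extends to measurement models for which the GUM's linearized propagation is inadequate, which is one motivation for the Monte Carlo approach.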
Uncertainty and psychological adjustment in patients with lung cancer
Kurita, Keiko; Garon, Edward B.; Stanton, Annette L.; Meyerowitz, Beth E.
2014-01-01
Background: For many patients with lung cancer, disease progression occurs without notice or with vague symptoms, and unfortunately, most treatments are not curative. Given this unpredictability, we hypothesized the following: (1) poorer psychological adjustment (specifically, more depressive symptoms, higher perceptions of stress, and poorer emotional well-being) would be associated with higher intolerance for uncertainty, higher perceived illness-related ambiguity, and their interaction; and (2) greater avoidance would mediate associations between higher intolerance of uncertainty and poorer psychological adjustment. Methods: Participants (N = 49) diagnosed with lung cancer at least 6 months prior to enrollment completed the Center for Epidemiologic Studies – Depression Scale, the Functional Assessment of Cancer Therapy – Lung Emotional Well-being subscale, the Perceived Stress scale, the Intolerance of Uncertainty scale, the Mishel Uncertainty in Illness Scale Ambiguity subscale, the Impact of Event – Revised Avoidance subscale, and the Short-scale Eysenck Personality Questionnaire – Revised Neuroticism subscale. Mean age was 64.2 years (standard deviation [SD] = 11.0), mean years of education was 15.6 (SD = 3.1), and 71.4% were female. Hypotheses were tested with regression analyses, adjusted for neuroticism. Results: Higher perceptions of stress and poorer emotional well-being were associated with higher levels of intolerance of uncertainty and higher perceived illness-related ambiguity. Non-somatic depressive symptoms were associated with higher levels of intolerance of uncertainty. Avoidance was found to mediate relations of intolerance of uncertainty with non-somatic depressive symptoms and emotional well-being only. Conclusions: Findings suggest that interventions to address avoidance and intolerance of uncertainty in individuals with lung cancer may help improve psychological adjustment. PMID:22887017
NASA Astrophysics Data System (ADS)
Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.
2018-03-01
In many cases, the determination of the measurement uncertainty of complex nanosystems presents unexpected challenges. This is particularly true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems, including the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is shown in detail for two example experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach to uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of the concentration of molecules using surface-enhanced Raman scattering, is briefly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.
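Capturing correlations between multivariate output quantities comes down to propagating a full covariance matrix through the (linearized) measurement model, U_y = J U_x J^T; a minimal sketch with invented numbers, not values from the paper's experiments, is:

```python
# Multivariate uncertainty propagation U_y = J U_x J^T for a
# linearized measurement model with two inputs and two outputs
def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

# Hypothetical Jacobian: output 1 depends on both inputs, so the
# outputs become correlated even though the inputs are not
J = [[1.0, 0.5],
     [0.0, 2.0]]
# Input covariance: u(x1) = 0.1, u(x2) = 0.2, uncorrelated inputs
Ux = [[0.01, 0.0],
      [0.0, 0.04]]

Uy = mat_mul(mat_mul(J, Ux), transpose(J))
print(Uy)  # [[0.02, 0.04], [0.04, 0.16]]
```

The non-zero off-diagonal element of U_y is the output correlation that a scalar, quantity-by-quantity uncertainty budget would miss.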
Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.
Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C
2015-01-01
The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
NASA Astrophysics Data System (ADS)
Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.
2013-05-01
Building urban growth models typically involves a process of historic calibration based on historic time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to, and their combined impact on, the uncertainty in the derived land-use maps. The approach was applied to the central part of the Flanders region (Belgium), using a time series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty in the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.
Lopiano, Kenneth K; Young, Linda J; Gotway, Carol A
2014-09-01
Spatially referenced datasets arising from multiple sources are routinely combined to assess relationships among various outcomes and covariates. The geographical units associated with the data, such as the geographical coordinates or areal-level administrative units, are often spatially misaligned, that is, observed at different locations or aggregated over different geographical units. As a result, the covariate is often predicted at the locations where the response is observed. The method used to align disparate datasets must be accounted for when subsequently modeling the aligned data. Here we consider the case where kriging is used to align datasets in point-to-point and point-to-areal misalignment problems when the response variable is non-normally distributed. If the relationship is modeled using generalized linear models, the additional uncertainty induced from using the kriging mean as a covariate introduces a Berkson error structure. In this article, we develop a pseudo-penalized quasi-likelihood algorithm to account for the additional uncertainty when estimating regression parameters and associated measures of uncertainty. The method is applied to a point-to-point example assessing the relationship between low birth weights and PM2.5 levels after the onset of the largest wildfire in Florida history, the Bugaboo scrub fire. A point-to-areal misalignment problem is presented where the relationship between asthma events in Florida's counties and PM2.5 levels after the onset of the fire is assessed. Finally, the method is evaluated using a simulation study. Our results indicate that the method performs well in terms of coverage for 95% confidence intervals, and that naive methods that ignore the additional uncertainty tend to underestimate the variability associated with parameter estimates. The underestimation is most profound in Poisson regression models. © 2014, The International Biometric Society.
On the uncertainty of interdisciplinarity measurements due to incomplete bibliographic data.
Calatrava Moreno, María Del Carmen; Auzinger, Thomas; Werthner, Hannes
The accuracy of interdisciplinarity measurements is directly related to the quality of the underlying bibliographic data. Existing indicators of interdisciplinarity are not capable of reflecting the inaccuracies introduced by incorrect and incomplete records because correct and complete bibliographic data can rarely be obtained. This is the case for the Rao-Stirling index, which cannot handle references that are not categorized into disciplinary fields. We introduce a method that addresses this problem. It extends the Rao-Stirling index to acknowledge missing data by calculating its interval of uncertainty using computational optimization. The evaluation of our method indicates that the uncertainty interval is not only useful for estimating the inaccuracy of interdisciplinarity measurements, but it also delivers slightly more accurate aggregated interdisciplinarity measurements than the Rao-Stirling index.
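The interval-of-uncertainty idea can be sketched concretely: treat each uncategorized reference as free to belong to any disciplinary field, and take the minimum and maximum of the Rao-Stirling index over all possible assignments. The brute-force enumeration below, and the toy category counts and distance matrix in the example, are illustrative assumptions; the paper itself uses computational optimization rather than enumeration.

```python
from itertools import product

def rao_stirling(counts, dist):
    """Rao-Stirling diversity: sum of d_ij * p_i * p_j over pairs of fields."""
    total = sum(counts)
    p = [c / total for c in counts]
    k = len(p)
    return sum(dist[i][j] * p[i] * p[j]
               for i in range(k) for j in range(k) if i != j)

def rs_uncertainty_interval(counts, n_missing, dist):
    """Bound the index over every way of assigning the uncategorized references."""
    k = len(counts)
    values = []
    for assignment in product(range(k), repeat=n_missing):
        c = list(counts)
        for field in assignment:
            c[field] += 1
        values.append(rao_stirling(c, dist))
    return min(values), max(values)
```

With two equally distant fields holding five categorized references each and two uncategorized ones, the upper end of the interval is reached when the missing references split evenly across the fields.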
A review of the generalized uncertainty principle.
Tawfik, Abdel Nasser; Diab, Abdel Magied
2015-12-01
Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.
Robust guaranteed cost tracking control of quadrotor UAV with uncertainties.
Xu, Zhiwei; Nian, Xiaohong; Wang, Haibo; Chen, Yinsheng
2017-07-01
In this paper, a robust guaranteed cost controller (RGCC) is proposed for a quadrotor UAV system with uncertainties to address the set-point tracking problem. A sufficient condition for the existence of the RGCC is derived by the Lyapunov stability theorem. The designed RGCC not only guarantees that the whole closed-loop system is asymptotically stable but also ensures that the quadratic performance level built for the closed-loop system has an upper bound irrespective of all admissible parameter uncertainties. Then, an optimal robust guaranteed cost controller is developed to minimize the upper bound of the performance level. Simulation results verify that the presented control algorithms possess small overshoot and short settling time, with which the quadrotor is able to perform set-point tracking tasks well. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ingale, S. V.; Datta, D.
2010-10-01
Consequence of the accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the basic dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. The governing parameters of the ingestion dose assessment model being imprecise, we have applied evidence theory to compute bounds on the risk. The uncertainty is addressed via the belief and plausibility fuzzy measures.
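A minimal sketch of how belief and plausibility bound such a risk estimate, assuming the imprecise dose-model parameters have already been propagated into interval-valued focal elements with basic probability masses (the intervals and masses used in the example are illustrative, not values from the paper):

```python
# Each focal element is an interval of possible risk values paired with a mass.
def belief(focal, target):
    """Lower probability bound: mass of focal elements wholly inside the target interval."""
    lo, hi = target
    return sum(m for (a, b), m in focal if a >= lo and b <= hi)

def plausibility(focal, target):
    """Upper probability bound: mass of focal elements that intersect the target interval."""
    lo, hi = target
    return sum(m for (a, b), m in focal if b >= lo and a <= hi)
```

Belief never exceeds plausibility, so the pair [Bel, Pl] brackets the probability that the risk lies in the target set, which is the bound the paper refers to.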
DOE Office of Scientific and Technical Information (OSTI.GOV)
SA Edgerton; LR Roeder
The Earth’s surface temperature is determined by the balance between incoming solar radiation and thermal (or infrared) radiation emitted by the Earth back to space. Changes in atmospheric composition, including greenhouse gases, clouds, and aerosols can alter this balance and produce significant climate change. Global climate models (GCMs) are the primary tool for quantifying future climate change; however, there remain significant uncertainties in the GCM treatment of clouds, aerosols, and their effects on the Earth’s energy balance. The 2007 assessment (AR4) by the Intergovernmental Panel on Climate Change (IPCC) reports a substantial range among GCMs in climate sensitivity to greenhouse gas emissions. The largest contributor to this range lies in how different models handle changes in the way clouds absorb or reflect radiative energy in a changing climate (Solomon et al. 2007). In 1989, the U.S. Department of Energy (DOE) Office of Science created the Atmospheric Radiation Measurement (ARM) Program within the Office of Biological and Environmental Research (BER) to address scientific uncertainties related to global climate change, with a specific focus on the crucial role of clouds and their influence on the transfer of radiation in the atmosphere. To address this problem, BER has adopted a unique two-pronged approach: * The ARM Climate Research Facility (ACRF), a scientific user facility for obtaining long-term measurements of radiative fluxes, cloud and aerosol properties, and related atmospheric characteristics in diverse climate regimes. * The ARM Science Program, focused on the analysis of ACRF data to address climate science issues associated with clouds, aerosols, and radiation, and to improve GCMs. This report describes accomplishments of the BER ARM Program toward addressing the primary uncertainties related to climate change prediction as identified by the IPCC.
Environmental trade-offs of tunnels vs cut-and-cover subways
Walton, M.
1978-01-01
Heavy construction projects in cities entail two kinds of cost - internal cost, which can be defined in terms of payments from one set of parties to another, and external cost, which is the cost borne by the community at large as the result of disutilities entailed in construction and operation. Environmental trade-offs involve external costs, which are commonly difficult to measure. Cut-and-cover subway construction probably entails higher external and internal cost than deep tunnel construction in many urban geological environments, but uncertainty concerning the costs and environmental trade-offs of tunneling leads to limited and timid use of tunneling by American designers. Thus uncertainty becomes a major trade-off which works against tunneling. The reverse is true in Sweden after nearly 30 years of subway construction. Econometric methods for measuring external costs exist in principle, but are limited in application. Economic theory based on market pressure does not address the real problem of urban environmental trade-offs. Nevertheless, the problem of uncertainty can be addressed by comparative studies of estimated and as-built costs of cut-and-cover vs tunnel projects and a review of environmental issues associated with such construction. Such a study would benefit the underground construction industry and the design of transportation systems. It would also help solve an aspect of the urban problem. © 1978.
NASA Astrophysics Data System (ADS)
Zheng, Yingying
The growing energy demands and the need to reduce carbon emissions draw increasing attention to the development of renewable energy technologies and management strategies. Microgrids have been developed around the world as a means to accommodate high penetration levels of renewable generation and reduce greenhouse gas emissions while attempting to address supply-demand balancing at a more local level. This dissertation presents a model developed to optimize the design of a biomass-integrated renewable energy microgrid employing combined heat and power with energy storage. A receding horizon optimization with Monte Carlo simulation was used to evaluate optimal microgrid design and dispatch under uncertainties in the renewable energy and utility grid energy supplies, the energy demands, and the economic assumptions, so as to generate a probability density function for the cost of energy. Case studies were examined for a conceptual utility grid-connected microgrid application in Davis, California. The results provide the most cost-effective design based on the assumed energy load profile, local climate data, utility tariff structure, and technical and financial performance of the various components of the microgrid. Sensitivity and uncertainty analyses are carried out to illuminate the key parameters that influence the energy costs. The model application provides a means to determine major risk factors associated with alternative design integration and operating strategies.
Conditional load and store in a shared memory
Blumrich, Matthias A; Ohmacht, Martin
2015-02-03
A method, system and computer program product for implementing load-reserve and store-conditional instructions in a multi-processor computing system. The computing system includes a multitude of processor units and a shared memory cache, and each of the processor units has access to the memory cache. In one embodiment, the method comprises providing the memory cache with a series of reservation registers, and storing in these registers addresses reserved in the memory cache for the processor units as a result of issuing load-reserve requests. In this embodiment, when one of the processor units makes a request to store data in the memory cache using a store-conditional request, the reservation registers are checked to determine if an address in the memory cache is reserved for that processor unit. If an address in the memory cache is reserved for that processor, the data are stored at this address.
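The reservation-register mechanism can be modeled in a few lines of code. This is a toy software sketch with invented names, not the patented hardware design: load-reserve records a reserved address per processor unit, and store-conditional succeeds only if that processor's reservation is still intact, invalidating competing reservations on the same address when it commits.

```python
class SharedCache:
    """Toy model of a shared memory cache with per-processor reservation registers."""

    def __init__(self):
        self.mem = {}          # address -> stored value
        self.reservation = {}  # processor id -> reserved address

    def load_reserve(self, pid, addr):
        """Record a reservation for this processor and return the current value."""
        self.reservation[pid] = addr
        return self.mem.get(addr, 0)

    def store_conditional(self, pid, addr, value):
        """Store only if this processor still holds a reservation on the address."""
        if self.reservation.get(pid) != addr:
            return False
        self.mem[addr] = value
        # A successful store invalidates every reservation on this address,
        # so competing store-conditionals from other processors will fail.
        for p, a in list(self.reservation.items()):
            if a == addr:
                del self.reservation[p]
        return True
```

In this model, two processors that both load-reserve the same address race to store-conditional; the first commit wins and the second fails, which is the behavior that makes the pair usable for lock-free read-modify-write sequences.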
Foresight for commanders: a methodology to assist planning for effects-based operations
NASA Astrophysics Data System (ADS)
Davis, Paul K.; Kahan, James P.
2006-05-01
Looking at the battlespace as a system of systems is a cornerstone of Effects-Based Operations and a key element in the planning of such operations, and in developing the Commander's Predictive Environment. Instead of a physical battleground to be approached with weapons of force, the battlespace is an interrelated super-system of political, military, economic, social, information and infrastructure systems to be approached with diplomatic, informational, military and economic actions. A concept that has proved useful in policy arenas other than defense, such as research and development for information technology, addressing cybercrime, and providing appropriate and cost-effective health care, is foresight. In this paper, we provide an overview of how the foresight approach addresses the inherent uncertainties in planning courses of action, present a set of steps in the conduct of foresight, and then illustrate the application of foresight to a commander's decision problem. We conclude that foresight approach that we describe is consistent with current doctrinal intelligence preparation of the battlespace and operational planning, but represents an advance in that it explicitly addresses the uncertainties in the environment and planning in a way that identifies strategies that are robust over different possible ground truths. It should supplement other planning methods.
Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information
NASA Technical Reports Server (NTRS)
Brall, Aron
2013-01-01
This paper addresses the problem of estimating the reliability of a critical system function, as well as its impact on the system reliability, when limited information is available. The approach addresses the basic function reliability, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resultant impact on total reliability generated substantial interest in presenting the results, due to the relative insensitivity of overall performance to basic function reliability and moderate degradation given sufficient attempts to accomplish the required goal. The application of the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and insulates the integrity of the design from those characteristics that cannot be estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
Measuring Overcast Colors with All-Sky Imaging
2008-04-01
Report documentation page (fragment). Performing organization: United States Naval Academy (USNA), Mathematics & Science Department, Annapolis, MD, 21402. Abstract fragment: "...are vestigial (29 November 2006 curve). A few overcasts are bluest near the horizon, and this causes particularly large colorimetric excursions."
Managing the Risks of Climate Change and Terrorism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosa, Eugene; Dietz, Tom; Moss, Richard H.
2012-04-07
The article describes challenges to comparative risk assessment, a key approach for managing uncertainty in decision making, across diverse threats such as terrorism and climate change, and argues that new approaches will be particularly important in addressing decisions related to sustainability.
BioVapor Model Evaluation (St. Louis, MO)
The BioVapor model addresses transport and biodegradation of petroleum vapors in the subsurface. This presentation describes basic background on the nature and scientific basis of environmental transport models. It then describes a series of parameter uncertainty runs of the Bi...
Everglades Collaborative Adaptive Management Program Progress
When the Comprehensive Everglades Restoration Plan (CERP) was authorized in 2000, adaptive management (AM) was recognized as a necessary tool to address uncertainty in achieving the broad goals and objectives for restoring a highly managed system. The Everglades covers 18,000 squ...
Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops
NASA Astrophysics Data System (ADS)
Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said
2017-11-01
The installation of solar panels on Australian rooftops has been on the rise in recent years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, which is factorized with a probability distribution, with special attention paid to Australian conditions through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the quasi-Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the actual data variation and the model.
Hu, X H; Li, Y P; Huang, G H; Zhuang, X W; Ding, X W
2016-05-01
In this study, a Bayesian-based two-stage inexact optimization (BTIO) method is developed for supporting water quality management by coupling Bayesian analysis with interval two-stage stochastic programming (ITSP). The BTIO method is capable of addressing uncertainties caused by insufficient inputs to the water quality model, as well as uncertainties expressed as probabilistic distributions and interval numbers. The BTIO method is applied to a real case of water quality management for the Xiangxi River basin in the Three Gorges Reservoir region to seek optimal water quality management schemes under various uncertainties. Interval solutions for production patterns under a range of probabilistic water quality constraints have been generated. The results demonstrate compromises between the system benefit and the system failure risk due to inherent uncertainties that exist in various system components. Moreover, information about pollutant emissions is obtained, which would help managers adjust production patterns of regional industry and local policies in light of the interactions among water quality requirements, economic benefit, and industry structure.
Risk intelligence: making profit from uncertainty in data processing system.
Zheng, Si; Liao, Xiangke; Liu, Xiaodong
2014-01-01
In extreme-scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes a norm rather than an exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address the issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions, taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities. With a careful design, we can benefit from such uncertainties. We implemented the idea in Hadoop 0.21.0 systems, and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput.
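The risk-aware trade-off (launch a speculative backup only when the uncertain residual runtime of a straggler is expected to exceed the cost of a fresh attempt) can be sketched from a profile of historical runtimes. This is a toy decision rule under assumed profiles, not the actual RiskI algorithm:

```python
def expected_remaining(profile, elapsed):
    """Expected remaining runtime of a task that has already run `elapsed`,
    estimated from a profile of historical runtimes of similar tasks."""
    survivors = [t for t in profile if t > elapsed]
    if not survivors:  # the task has outlived every historical run
        return 0.0
    return sum(t - elapsed for t in survivors) / len(survivors)

def launch_backup(profile, elapsed, backup_cost):
    """Speculate only when the expected saving beats the backup's startup cost."""
    fresh = sum(profile) / len(profile)  # expected runtime of a fresh copy
    return expected_remaining(profile, elapsed) > fresh + backup_cost
```

With a heavy-tailed profile, a task that has already run past the typical runtimes is likely stuck in the tail, so the rule speculates; early in the run it does not, which is the kind of uncertainty-aware behavior the abstract contrasts with LATE's heuristic.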
Experimental Concepts for Testing Seismic Hazard Models
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Jordan, T. H.
2015-12-01
Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.
Optimal test selection for prediction uncertainty reduction
Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel
2016-12-02
Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
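The constrained discrete optimization can be sketched as an exhaustive search over test counts under a budget, given some model of how each test type shrinks the prediction variance. The costs and the variance model below are invented for illustration; the paper's formulation is tied to calibration and validation at specific input conditions.

```python
def select_tests(pred_var, cost_cal, cost_val, budget):
    """Pick (n_cal, n_val) minimizing predicted prediction variance within budget."""
    best = None
    for n_cal in range(budget // cost_cal + 1):
        for n_val in range((budget - n_cal * cost_cal) // cost_val + 1):
            v = pred_var(n_cal, n_val)
            if best is None or v < best[0]:
                best = (v, n_cal, n_val)
    return best

# Illustrative variance model: calibration tests shrink parameter uncertainty,
# validation tests shrink model-form uncertainty, with diminishing returns.
toy_var = lambda n_cal, n_val: 1.0 / (1 + n_cal) + 0.5 / (1 + n_val)
```

With calibration tests costing 2 units, validation tests costing 3, and a budget of 10, the search balances the two test types rather than spending everything on the cheaper one, which mirrors the paper's cost-versus-importance trade-off.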
NASA Astrophysics Data System (ADS)
de Barros, Felipe P. J.; Ezzedine, Souheil; Rubin, Yoram
2012-02-01
The significance of conditioning predictions of environmental performance metrics (EPMs) on hydrogeological data in heterogeneous porous media is addressed. Conditioning EPMs on available data reduces uncertainty and increases the reliability of model predictions. We present a rational and concise approach to investigate the impact of conditioning EPMs on data as a function of the location of the environmentally sensitive target receptor, data types and spacing between measurements. We illustrate how the concept of comparative information yield curves introduced in de Barros et al. [de Barros FPJ, Rubin Y, Maxwell R. The concept of comparative information yield curves and its application to risk-based site characterization. Water Resour Res 2009;45:W06401. doi:10.1029/2008WR007324] could be used to assess site characterization needs as a function of flow and transport dimensionality and EPMs. For a given EPM, we show how alternative uncertainty reduction metrics yield distinct gains of information from a variety of sampling schemes. Our results show that uncertainty reduction is EPM dependent (e.g., travel times) and does not necessarily indicate uncertainty reduction in an alternative EPM (e.g., human health risk). The results show how the position of the environmental target, flow dimensionality and the choice of the uncertainty reduction metric can be used to assist in field sampling campaigns.
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei
2005-01-01
This document addresses calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPEs). PDFs are used to test the effectiveness of potential radiation shielding approaches. Monte-Carlo techniques are used to propagate uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. The cancer risk uncertainty is about four-fold for lunar and Mars mission risk projections. For short-stay lunar missions (<180 d), SPEs present the most significant risk, but one effectively mitigated by shielding. For long-duration (>180 d) lunar or Mars missions, GCR risks may exceed radiation risk limits. While shielding materials are marginally effective in reducing GCR cancer risks because of the penetrating nature of GCR and secondary radiation produced in tissue by relativistic particles, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding. Therefore, improving our knowledge of space radiobiology to narrow uncertainties that lead to wide PDFs is the best approach to ensure radiation protection goals are met for space exploration.
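The Monte-Carlo propagation described above can be sketched generically: sample each uncertain factor (risk coefficient, dose and dose-rate reduction factor, quality factor) from an assumed distribution, multiply into a point estimate, and summarize the resulting PDF by the ratio of its 97.5th to 2.5th percentile, a "fold" uncertainty. The lognormal parameters here are illustrative assumptions, not NASA's values.

```python
import random

def risk_samples(point_risk, n=20000, seed=42):
    """Monte-Carlo samples of projected risk with multiplicative uncertain factors."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        coeff = rng.lognormvariate(0.0, 0.3)   # epidemiology risk coefficient
        ddref = rng.lognormvariate(0.0, 0.25)  # dose and dose-rate reduction factor
        qual = rng.lognormvariate(0.0, 0.4)    # radiation quality factor
        out.append(point_risk * coeff * qual / ddref)
    return out

def fold_uncertainty(samples):
    """Ratio of the 97.5th to the 2.5th percentile of the sampled PDF."""
    s = sorted(samples)
    return s[int(0.975 * len(s))] / s[int(0.025 * len(s))]
```

Widening any one factor's distribution widens the fold uncertainty, which is why the abstract argues that narrowing radiobiology uncertainties beats adding shielding mass.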
New Assignment of Mass Values and Uncertainties to NIST Working Standards
Davis, Richard S.
1990-01-01
For some time it had been suspected that values assigned to NIST working standards of mass were some 0.17 mg/kg larger than mass values based on artifacts representing mass in the International System of Units (SI). This relatively small offset, now confirmed, has had minimal scientific or technological significance. The discrepancy was removed on January 1, 1990. We document the history of the discrepancy, the studies which allow its removal, and the methods in place to limit its effect and prevent its recurrence. For routine calibrations, we believe that our working standards now have a long-term stability of 0.033 mg/kg (3σ) with respect to the national prototype kilograms of the United States. We provisionally admit an additional uncertainty of 0.09 mg/kg (3σ), systematic to all NIST mass measurements, which represents the possible offset of our primary standards from standards maintained by the Bureau International des Poids et Mesures (BIPM). This systematic uncertainty may be significantly reduced after analysis of results from the 3rd verification of national prototype kilograms, which is now underway. PMID:28179759
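If the stated long-term stability and the provisional systematic component are treated as independent and combined in quadrature (an assumption; the abstract does not state the combination rule), the total 3σ uncertainty of a routine calibration would be roughly 0.096 mg/kg:

```python
import math

def combined_3sigma(stability, systematic):
    """Root-sum-square of independent 3-sigma components, both in mg/kg (assumed rule)."""
    return math.sqrt(stability ** 2 + systematic ** 2)

total = combined_3sigma(0.033, 0.09)  # stated stability and provisional systematic terms
```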