Sample records for "considerable uncertainties exist"

  1. Simulating the Stability of Colloidal Amorphous Iron Oxide in Natural Water

    EPA Science Inventory

    Considerable uncertainty exists as to whether existing thermodynamic equilibrium solid/water partitioning paradigms can be used to assess the mobility of insoluble manufactured nanomaterials in the aquatic environment. In this work, the traditional Derjaguin–Landau–Verwey–Overbee...

  2. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty which can exist for pesticide leaching assessments, due to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts because the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that the regulation of agricultural chemicals in the future should include consideration for changing land use.

  3. Forecasting eruption size: what we know, what we don't know

    NASA Astrophysics Data System (ADS)

    Papale, Paolo

    2017-04-01

    Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated with given size classes. Such evaluation is mostly based on the previous volcanic history at the specific volcano, or it refers to a broader class of volcanoes constituting "analogues" of the one under consideration. In any case, use of knowledge from past eruptions implies considering the completeness of the reference catalogue and, most importantly, the existence of systematic biases in the catalogue that may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with major reference to the catalogue from the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scales: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. That situation strictly requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, <0.1% incidence of eruptions with VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected to derive from application to individual volcanoes or classes of analogue volcanoes, suggesting large to very large uncertainties associated with volcanic hazard forecasts at virtually any individual volcano worldwide.
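
    The sampling component of the frequency uncertainty discussed above can be made concrete: given eruption counts per VEI class, a Dirichlet posterior yields credible intervals on the individual VEI frequencies (it does not, of course, address the systematic under-recording the abstract emphasizes). A minimal sketch; the counts below are invented for illustration, not taken from the Smithsonian GVP catalogue.

```python
import numpy as np

# Hypothetical eruption counts per VEI class (0..7+) for an illustrative
# catalogue; a real analysis would use counts from e.g. the Smithsonian GVP.
counts = np.array([300, 900, 1200, 600, 150, 40, 8, 2])

rng = np.random.default_rng(0)
# Dirichlet posterior over class probabilities (uniform prior, alpha = counts + 1)
# captures sampling uncertainty in the individual VEI frequencies.
post = rng.dirichlet(counts + 1, size=20_000)

p_vei7 = post[:, 7]   # probability of VEI 7 or larger
print(f"P(VEI >= 7): median {np.median(p_vei7):.4%}, "
      f"95% CI [{np.percentile(p_vei7, 2.5):.4%}, {np.percentile(p_vei7, 97.5):.4%}]")
```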

  4. PCDD/F EMISSIONS FROM UNCONTROLLED, DOMESTIC WASTE BURNING

    EPA Science Inventory

    Considerable uncertainty exists in the inventory of polychlorinated dibenzodioxin and dibenzofuran (PCDD/F) emissions from uncontrolled combustion sources such as backyard burning of domestic waste. The contribution from these sources to the worldwide PCDD/F balance may be signific...

  5. Challenges of Sustaining the International Space Station through 2020 and Beyond: Including Epistemic Uncertainty in Reassessing Confidence Targets

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. There are several key areas under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis currently compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the current sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis. The existing confidence targets may therefore be conservative. This paper will also discuss how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU. The paper concludes with strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
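
    A minimal sketch of the kind of calculation described: Poisson spare demand for one ORU, with epistemic uncertainty in the failure rate represented by a Gamma distribution. All numbers (spares on hand, MTBF, fleet hours, evidence behind the estimate) are hypothetical, and this is a generic illustration, not the PACT methodology itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative ORU; every number here is hypothetical, not actual ISS data.
spares = 3                    # spares currently on hand
unit_hours = 8 * 87_600       # 8 installed units x 10 years of operation
mtbf_est = 250_000.0          # point estimate of MTBF (hours)

# Epistemic uncertainty: failure rate ~ Gamma, as if the MTBF estimate rested
# on n observed failures over n * MTBF hours of fleet experience (assumed).
n_failures = 4
lam = rng.gamma(shape=n_failures, scale=1.0 / (n_failures * mtbf_est), size=50_000)

# Aleatory demand given a rate is Poisson; sufficiency = P(demand <= spares).
p_sufficient = stats.poisson.cdf(spares, lam * unit_hours)

print(f"Point-estimate confidence: {stats.poisson.cdf(spares, unit_hours / mtbf_est):.3f}")
print(f"Mean confidence with epistemic uncertainty: {p_sufficient.mean():.3f}")
print(f"5th percentile: {np.percentile(p_sufficient, 5):.3f}")
```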

  6. High voltage system: Plasma interaction summary

    NASA Technical Reports Server (NTRS)

    Stevens, N. John

    1986-01-01

    The possible interactions that could exist between a high voltage system and the space plasma environment are reviewed. A solar array is used as an example of such a system. The emphasis in this review is on the discrepancies that exist in this technology in both flight and ground experiment data. It has been found that, in ground testing, there are facility effects, cell size effects and area scaling uncertainties. For space applications there are area scaling and discharge concerns for an array as well as the influence of the large space structures on the collection process. There are still considerable uncertainties in the high voltage-space plasma interaction technology even after several years of effort.

  7. Effect of spatial image support in detecting long-term vegetation change from satellite time-series

    USDA-ARS's Scientific Manuscript database

    Context: Arid rangelands have been severely degraded over the past century. Multi-temporal remote sensing techniques are ideally suited to detect significant changes in ecosystem state; however, considerable uncertainty exists regarding the effects of changing image resolution on their ability to de...

  8. Deriving persistence indicators from regulatory water-sediment studies – opportunities and limitations in OECD 308 data.

    PubMed

    Honti, Mark; Fenner, Kathrin

    2015-05-19

    The OECD guideline 308 describes a laboratory test method to assess aerobic and anaerobic transformation of organic chemicals in aquatic sediment systems and is an integral part of tiered testing strategies in different legislative frameworks for the environmental risk assessment of chemicals. The results from experiments carried out according to OECD 308 are generally used to derive persistence indicators for hazard assessment or half-lives for exposure assessment. We used Bayesian parameter estimation and system representations of various complexities to systematically assess opportunities and limitations for estimating these indicators from existing data generated according to OECD 308 for 23 pesticides and pharmaceuticals. We found that there is a disparity between the uncertainty and the conceptual robustness of persistence indicators. Disappearance half-lives are directly extractable with limited uncertainty, but they lump degradation and phase transfer information and are not robust against changes in system geometry. Transformation half-lives are less system-specific but require inverse modeling to extract, resulting in considerable uncertainty. Available data were thus insufficient to derive indicators that had both acceptable robustness and uncertainty, which further supports previously voiced concerns about the usability and efficiency of these costly experiments. Despite the limitations of existing data, we suggest the time until 50% of the parent compound has been transformed in the entire system (DegT(50,system)) could still be a useful indicator of persistence in the upper, partially aerobic sediment layer in the context of PBT assessment. This should, however, be accompanied by a mandatory reporting or full standardization of the geometry of the experimental system. We recommend transformation half-lives determined by inverse modeling to be used as input parameters into fate models for exposure assessment, if due consideration is given to their uncertainty.
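
    The distinction drawn above between disappearance (DT50) and transformation (DegT50,system) half-lives can be illustrated with a linear two-compartment water-sediment model; the rate constants below are invented, and the authors' actual inverse-modeling setup is more elaborate than this sketch.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import brentq

# Illustrative rate constants (1/day), not fitted to any real OECD 308 study:
k_deg_w, k_deg_s = 0.05, 0.01   # transformation in water / in sediment
k_ws, k_sw = 0.30, 0.02         # phase transfer water->sediment / sediment->water

# Linear system m' = A m, with m = [parent in water, parent in sediment]
A = np.array([[-(k_deg_w + k_ws), k_sw],
              [k_ws, -(k_deg_s + k_sw)]])
m0 = np.array([1.0, 0.0])       # all parent applied to the water phase

def parent(t):
    return expm(A * t) @ m0

# DT50 in water lumps degradation and phase transfer; DegT50,system reflects
# transformation only, because transfer conserves the parent compound.
dt50_water = brentq(lambda t: parent(t)[0] - 0.5, 1e-6, 500)
degt50_sys = brentq(lambda t: parent(t).sum() - 0.5, 1e-6, 500)
print(f"DT50,water    = {dt50_water:.1f} d (disappearance)")
print(f"DegT50,system = {degt50_sys:.1f} d (transformation)")
```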

  9. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs) or 'jökulhlaups' necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results derived from the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those from a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
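
    A toy version of the Monte Carlo least-cost-path idea (the real MC-LCP is distributed in the paper's supplementary information): perturb the DEM with its assumed vertical error, recompute the least-cost route each time, and accumulate visit frequencies into an uncertainty continuum. Uncorrelated Gaussian noise is a simplification, since real DEM error is spatially correlated; the scikit-image routine and all numbers are this sketch's choices, not the authors'.

```python
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(2)

# Toy DEM (elevations in m); a real application would load SRTM/ASTER/ALOS data.
ny, nx = 60, 80
y, x = np.mgrid[0:ny, 0:nx]
dem = 50 - 0.4 * x + 3 * np.sin(y / 6.0)        # gentle valley sloping east

sigma_dem = 4.0                  # assumed vertical RMSE of the DEM (m)
start, end = (30, 0), (30, nx - 1)
visits = np.zeros((ny, nx))

for _ in range(200):             # Monte Carlo realisations of DEM error
    noisy = dem + rng.normal(0.0, sigma_dem, dem.shape)
    cost = noisy - noisy.min() + 1.0             # least-cost path prefers low ground
    path, _ = route_through_array(cost, start, end, fully_connected=True)
    rows, cols = zip(*path)
    visits[rows, cols] += 1

# 'visits / 200' is an uncertainty continuum: the relative likelihood that a
# cell lies on the routing path across DEM error realisations.
print((visits / 200).max())
```

    A natural upgrade would be spatially correlated error fields (e.g. sequential Gaussian simulation) in place of the independent noise used here.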

  10. Human Health Risk Assessment of Pharmaceuticals in Water: Issues and Challenges Ahead

    PubMed Central

    Kumar, Arun; Chang, Biao; Xagoraraki, Irene

    2010-01-01

    This study identified existing issues related to quantitative pharmaceutical risk assessment (QPhRA, hereafter) for pharmaceuticals in water and proposed possible solutions by analyzing methodologies and findings of different published QPhRA studies. Retrospective site-specific QPhRA studies from different parts of the world (U.S.A., United Kingdom, Europe, India, etc.) were reviewed in a structured manner to understand the different assumptions, outcomes and issues identified/addressed/raised by the different QPhRA studies. To date, most of the published studies have concluded that there is no appreciable risk to human health from environmental exposures to pharmaceuticals; however, attention is still required to the following identified issues: (1) use of measured versus predicted pharmaceutical concentrations, (2) identification of pharmaceuticals-of-concern and compounds needing special consideration, (3) use of source water versus finished drinking water-related exposure scenarios, (4) selection of representative exposure routes, (5) valuation of uncertainty factors, and (6) risk assessment for mixtures of chemicals. To close the existing data and methodology gaps, this study proposed possible ways to address and/or incorporate these considerations within the QPhRA framework; however, more research is still required to address issues such as the incorporation of short-term to long-term extrapolation and mixture effects in the QPhRA framework. Specifically, this study proposed the development of a new “mixture effects-related uncertainty factor” for mixtures of chemicals (i.e., mixUFcomposite), similar to the uncertainty factor for a single chemical, within the QPhRA framework. In addition to all five traditionally used uncertainty factors, this factor is also proposed to account for concentration effects due to the presence of different ranges of concentration levels of pharmaceuticals in a mixture. However, further work is required to determine values of all six uncertainty factors and to incorporate them during estimation of point-of-departure values within the QPhRA framework. PMID:21139869

  11. Effects of aerodynamic heating and TPS thermal performance uncertainties on the Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Goodrich, W. D.; Derry, S. M.; Maraia, R. J.

    1980-01-01

    A procedure for estimating uncertainties in the aerodynamic-heating and thermal protection system (TPS) thermal-performance methodologies developed for the Shuttle Orbiter is presented. This procedure is used in predicting uncertainty bands around expected or nominal TPS thermal responses for the Orbiter during entry. Individual flowfield and TPS parameters that make major contributions to these uncertainty bands are identified and, by statistical considerations, combined in a manner suitable for making engineering estimates of the TPS thermal confidence intervals and temperature margins relative to design limits. Thus, for a fixed TPS design, entry trajectories for future Orbiter missions can be shaped subject to both the thermal-margin and confidence-interval requirements. This procedure is illustrated by assessing the thermal margins offered by selected areas of the existing Orbiter TPS design for an entry trajectory typifying early flight test missions.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.

    Severe accident management can be defined as the use of existing and/or alternative resources, systems and actors to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates such criteria as feasibility, effectiveness, and adverse effects, for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization and PWR feed and bleed.

  13. Using experimental data to reduce the single-building sigma of fragility curves: case study of the BRD tower in Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Perrault, Matthieu; Gueguen, Philippe; Aldea, Alexandru; Demetriu, Sorin

    2013-12-01

    The lack of knowledge concerning the modelling of existing buildings leads to significant variability in fragility curves for single or grouped existing buildings. This study aims to investigate the uncertainties of fragility curves, with special consideration of the single-building sigma. Experimental data and simplified models are applied to the BRD tower in Bucharest, Romania, an RC building with permanent instrumentation. A three-step methodology is applied: (1) adjustment of a linear MDOF model to experimental modal analysis using a Timoshenko beam model and based on Anderson's criteria, (2) computation of the structure's response to a large set of accelerograms simulated by the SIMQKE software, considering twelve ground motion parameters as intensity measures (IM), and (3) construction of the fragility curves by comparing numerical interstory drift with the threshold criteria provided by the Hazus methodology for the slight damage state. By introducing experimental data into the model, uncertainty is reduced to 0.02 considering Sd(f1) as the seismic intensity IM, and uncertainty related to the model is assessed at 0.03. These values must be compared with the total uncertainty value of around 0.7 provided by the Hazus methodology.
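
    For readers unfamiliar with the single-building sigma, a lognormal fragility curve makes the role of the dispersion parameter concrete. A minimal sketch: the median capacity and IM values are invented, and only the contrast between a generic Hazus-like sigma and a small, instrumentation-informed sigma mirrors the abstract.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state exceeded | IM = im) for a lognormal fragility curve."""
    return norm.cdf(np.log(im / theta) / beta)

im = np.linspace(0.01, 0.5, 5)    # e.g. Sd(f1) in m (illustrative values)
theta = 0.12                      # median capacity (assumed)

# Generic sigma ~0.7 (Hazus-type) versus a much smaller, data-informed sigma:
for beta in (0.7, 0.05):
    print(f"beta = {beta}: {np.round(fragility(im, theta, beta), 3)}")
```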

  14. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advance has been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
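
    A minimal sketch of the PCA summarization step described above, run on synthetic stand-ins for a library of plausible effective-area curves (real ones would come from the Chandra calibration database):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for n_cal plausible effective-area curves over n_e energy bins.
n_cal, n_e = 200, 300
energy = np.linspace(0.3, 8.0, n_e)
base = 600 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)
areas = base * (1 + 0.05 * rng.normal(size=(n_cal, 1))
                + 0.02 * rng.normal(size=(n_cal, 1)) * (energy - 4) / 4)

mean = areas.mean(axis=0)
# SVD of the mean-subtracted sample = principal components of calibration scatter.
U, s, Vt = np.linalg.svd(areas - mean, full_matrices=False)

k = 3                                     # keep the dominant components
def draw_area(rng):
    """Generate a new plausible effective-area curve from the PCA summary."""
    coeff = rng.normal(size=k) * s[:k] / np.sqrt(n_cal - 1)
    return mean + coeff @ Vt[:k]

print(draw_area(rng)[:5])                 # one perturbed curve, first few bins
```

    The payoff is compression: a handful of components replaces the full library of calibration files when sampling plausible calibrations inside an MCMC fit.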

  15. Cantilever spring constant calibration using laser Doppler vibrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohler, Benjamin

    2007-06-15

    Uncertainty in cantilever spring constants is a critical issue in atomic force microscopy (AFM) force measurements. Though numerous methods exist for calibrating cantilever spring constants, the accuracy of these methods can be limited by both the physical models themselves as well as uncertainties in their experimental implementation. Here we report the results from two of the most common calibration methods, the thermal tune method and the Sader method. These were implemented on a standard AFM system as well as using laser Doppler vibrometry (LDV). Using LDV eliminates some uncertainties associated with optical lever detection on an AFM. It also offers considerably higher signal-to-noise deflection measurements. We find that AFM and LDV result in similar uncertainty in the calibrated spring constants, about 5%, using either the thermal tune or Sader methods provided that certain limitations of the methods and instrumentation are observed.
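
    The thermal tune method mentioned above rests on equipartition; a minimal sketch, with the mode-shape and detection-sensitivity correction factors used in practice deliberately omitted:

```python
import numpy as np

kB = 1.380649e-23          # Boltzmann constant, J/K

def thermal_tune_k(deflection_m, T=295.0):
    """Equipartition estimate: k = kB * T / <d^2>.

    deflection_m: calibrated cantilever deflection time series (metres),
    e.g. from LDV. Mode-shape and detection-sensitivity corrections
    (a few percent for the first mode) are omitted in this sketch.
    """
    return kB * T / np.var(deflection_m)

# Synthetic thermal noise of a ~0.05 N/m lever at room temperature:
rng = np.random.default_rng(4)
d = rng.normal(0.0, np.sqrt(kB * 295 / 0.05), size=100_000)
print(f"k = {thermal_tune_k(d):.4f} N/m")
```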

  16. A review of uncertainty research in impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We identified three main themes of uncertainty research in 134 papers from the scholarly literature. • The majority of research has focused on better methods for managing uncertainty in predictions. • Uncertainty disclosure is demanded of practitioners, but there is little guidance on how to do so. • There is limited theoretical explanation as to why uncertainty is avoided or not disclosed. • Conceptual, practical and theoretical guidance are required for IA uncertainty consideration.

  17. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
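
    A sketch of the BMA combination step: given posterior BMD samples under each dose-response model and posterior model weights (all synthetic below, not the paper's TCDD results), the mixture yields the BMA BMD estimate and a BMDL as a lower percentile.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-ins for MCMC posterior samples of the BMD (mg/kg-day) under two
# dose-response models, plus hypothetical posterior model probabilities.
bmd_logistic = rng.lognormal(mean=np.log(1.2), sigma=0.35, size=40_000)
bmd_qlinear = rng.lognormal(mean=np.log(0.8), sigma=0.50, size=40_000)
weights = np.array([0.6, 0.4])            # P(model | data), assumed

# BMA posterior = mixture of the model-specific posteriors.
pick = rng.choice(2, size=40_000, p=weights)
bma = np.where(pick == 0, bmd_logistic, bmd_qlinear)

bmdl = np.percentile(bma, 5)              # lower 5th percentile as the BMDL
width = np.percentile(bma, 97.5) - np.percentile(bma, 2.5)
print(f"BMD = {np.median(bma):.2f}, BMDL = {bmdl:.2f}, "
      f"95% interval width = {width:.2f}")
```

    In the paper's backcasting step, an informative extra dose group narrows this interval, which raises the BMDL; the sketch shows the quantities being compared.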

  18. Uncertainties in the Forecasted Performance of Sediment Diversions Associated with Differences Between "Optimized" Diversion Design Criteria and the Natural Crevasse-Splay Sub-Delta Life-Cycle

    NASA Astrophysics Data System (ADS)

    Brown, G.

    2017-12-01

    Sediment diversions have been proposed as a crucial component of the restoration of Coastal Louisiana. They are generally characterized as a means of creating land by mimicking natural crevasse-splay sub-delta processes. However, the criteria that are often promoted to optimize the performance of these diversions (i.e. large, sand-rich diversions into existing, degraded wetlands) are at odds with the natural processes that govern the development of crevasse-splay sub-deltas (typically sand-lean or sand-neutral diversions into open water). This is due in large part to the fact that these optimization criteria have been developed in the absence of consideration for the natural constraints associated with fundamental hydraulics: specifically, the conservation of mechanical energy. Although the implementation of the aforementioned optimization criteria have the potential to greatly increase the land-building capacity of a given diversion, the concomitant widespread inundation of the existing wetlands (an unavoidable consequence of diverting into a shallow, vegetated embayment), and the resultant stresses on existing wetland vegetation, have the potential to dramatically accelerate the loss of these existing wetlands. Hence, there are inherent uncertainties in the forecasted performance of sediment diversions that are designed according to the criteria mentioned above. This talk details the reasons for these uncertainties, using analytic and numerical model results, together with evidence from field observations and experiments. The likelihood that, in the foreseeable future, these uncertainties can be reduced, or even rationally bounded, is discussed.

  19. Constraining past seawater δ18O and temperature records developed from foraminiferal geochemistry

    NASA Astrophysics Data System (ADS)

    Quinn, T. M.; Thirumalai, K.; Marino, G.

    2016-12-01

    Paired measurements of magnesium-to-calcium ratios (Mg/Ca) and the stable oxygen isotopic composition (δ18O) in foraminifera have significantly advanced our knowledge of the climate system by providing information on past temperature and seawater δ18O (δ18Osw, a proxy for salinity and ice volume). However, multiple sources of uncertainty exist in transferring these downcore geochemical data into quantitative paleoclimate reconstructions. Here, we develop a computational toolkit entitled Paleo-Seawater Uncertainty Solver (PSU Solver) that performs bootstrap Monte Carlo simulations to constrain these various sources of uncertainty. PSU Solver calculates temperature and δ18Osw, and their respective confidence intervals using an iterative approach with user-defined errors, calibrations, and sea-level curves. Our probabilistic approach yields reduced uncertainty constraints compared to theoretical considerations and commonly used propagation exercises. We demonstrate the applicability of PSU Solver for published records covering three timescales: the late Holocene, the last deglaciation, and the last glacial period. We show that the influence of salinity on Mg/Ca can considerably alter the structure and amplitude of change in the resulting reconstruction and can impact the interpretation of paleoceanographic time series. We also highlight the sensitivity of the records to various inputs of sea-level curves, transfer functions, and uncertainty constraints. PSU Solver offers an expeditious yet rigorous approach to test the robustness of past climate variability inferred from paired Mg/Ca-δ18O measurements.
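
    A stripped-down version of the bootstrap Monte Carlo idea behind a PSU-Solver-type calculation, using a commonly cited exponential Mg/Ca calibration and low-light paleotemperature equation. All constants, error magnitudes and input values below are illustrative assumptions, not the toolkit's defaults.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 10_000

# One downcore sample; analytical errors are illustrative.
mgca = rng.normal(3.2, 0.10, N)      # Mg/Ca (mmol/mol), +/- 1 sd
d18Oc = rng.normal(-1.5, 0.08, N)    # foram calcite d18O (per mil, VPDB)

# Exponential calibration Mg/Ca = B * exp(A * T), with calibration
# uncertainty propagated through A and B (values assumed here).
A = rng.normal(0.09, 0.005, N)
B = rng.normal(0.38, 0.02, N)
T = np.log(mgca / B) / A             # temperature (deg C)

# Low-light paleotemperature equation rearranged for seawater d18O;
# +0.27 converts the VPDB calcite scale to VSMOW (assumed form).
d18Osw = d18Oc + (T - 16.5) / 4.8 + 0.27

for name, v in (("T", T), ("d18Osw", d18Osw)):
    print(f"{name}: {np.median(v):.2f} "
          f"[{np.percentile(v, 2.5):.2f}, {np.percentile(v, 97.5):.2f}]")
```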

  20. Damage assessment of composite plate structures with material and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  21. Model uncertainty of various settlement estimation methods in shallow tunnels excavation; case study: Qom subway tunnel

    NASA Astrophysics Data System (ADS)

    Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb

    2017-10-01

    In addition to the numerous planning and executive challenges, underground excavation in urban areas is always followed by certain destructive effects, especially on the ground surface. Ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values of the models were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched reality, and the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
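
    For reference, the empirical (Peck) approach quoted above amounts to a Gaussian settlement trough. A short sketch with an illustrative trough-width parameter (the Qom-specific depth and soil inputs are not given in the record); only the 1.86 cm maximum settlement comes from the abstract.

```python
import numpy as np

def peck_settlement(x, s_max, i):
    """Peck's empirical Gaussian trough: S(x) = S_max * exp(-x^2 / (2 i^2))."""
    return s_max * np.exp(-x**2 / (2 * i**2))

# Illustrative values: trough-width parameter i ~ K * z0, with K ~ 0.5 for
# clays (assumed) and z0 the tunnel axis depth (assumed).
z0, K = 16.0, 0.5
i = K * z0
s_max = 1.86e-2            # 1.86 cm, the empirical estimate quoted above (m)

x = np.array([0.0, 4.0, 8.0, 16.0])      # offset from tunnel axis (m)
print(np.round(peck_settlement(x, s_max, i) * 100, 2), "cm")
```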

  22. Retrospective estimation of the electric and magnetic field exposure conditions in in vitro experimental reports reveal considerable potential for uncertainty.

    PubMed

    Portelli, Lucas A; Falldorf, Karsten; Thuróczy, György; Cuppen, Jan

    2018-04-01

    Experiments on cell cultures exposed to extremely low frequency (ELF, 3-300 Hz) magnetic fields are often subject to multiple sources of uncertainty associated with specific electric and magnetic field exposure conditions. Here we systematically quantify these uncertainties based on exposure conditions described in a group of bioelectromagnetic experimental reports for a representative sampling of the existing literature. The resulting uncertainties, stemming from insufficient, ambiguous, or erroneous description, design, implementation, or validation of the experimental methods and systems, were often substantial enough to potentially make any successful reproduction of the original experimental conditions difficult or impossible. Without making any assumption about the true biological relevance of ELF electric and magnetic fields, these findings suggest another contributing factor which may add to the overall variability and irreproducibility traditionally associated with experimental results of in vitro exposures to low-level ELF magnetic fields. Bioelectromagnetics. 39:231-243, 2018. © 2017 Wiley Periodicals, Inc.

  23. Modeling for waste management associated with environmental-impact abatement under uncertainty.

    PubMed

    Li, P; Li, Y P; Huang, G H; Zhang, J L

    2015-04-01

    Municipal solid waste (MSW) treatment can generate significant amounts of pollutants and thus pose risks to human health. Besides, in MSW management, various uncertainties exist in the related costs, impact factors, and objectives, which can affect the optimization processes and the decision schemes generated. In this study, a life cycle assessment-based interval-parameter programming (LCA-IPP) method is developed for MSW management associated with environmental-impact abatement under uncertainty. The LCA-IPP can effectively examine the environmental consequences based on a number of environmental impact categories (i.e., greenhouse gas equivalent, acid gas emissions, and respiratory inorganics), through analyzing each life cycle stage and/or major contributing process related to various MSW management activities. It can also tackle uncertainties that exist in the related costs, impact factors, and objectives and are expressed as interval numbers. The LCA-IPP method is then applied to MSW management for the City of Beijing, the capital of China, where energy consumption and six environmental parameters [i.e., CO2, CO, CH4, NOX, SO2, and inhalable particles (PM10)] are used as a systematic tool to quantify environmental releases across the entire life cycle of waste collection, transportation, treatment, and disposal. Results associated with system cost, environmental impact, and the related policy implications are generated and analyzed. The results can help identify desired alternatives for managing MSW flows, which has advantages in providing compromised schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty.

  24. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    PubMed

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concerns have been raised in the past decades since a large amount of pollutant emissions from municipal solid waste (MSW) disposal processes pose risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning a MSW management system, where the mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis for decisions of timing, sizing and siting in terms of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results obtained indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on the economic-oriented planning approach. The results obtained can help identify desired alternatives for managing MSW, which has advantages in providing compromised schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.

  25. Uncertainty, robustness, and the value of information in managing a population of northern bobwhites

    USGS Publications Warehouse

    Johnson, Fred A.; Hagan, Greg; Palmer, William E.; Kemmerer, Michael

    2014-01-01

    The abundance of northern bobwhites (Colinus virginianus) has decreased throughout their range. Managers often respond by considering improvements in harvest and habitat management practices, but this can be challenging if substantial uncertainty exists concerning the cause(s) of the decline. We were interested in how application of decision science could be used to help managers on a large, public management area in southwestern Florida where the bobwhite is a featured species and where abundance has severely declined. We conducted a workshop with managers and scientists to elicit management objectives, alternative hypotheses concerning population limitation in bobwhites, potential management actions, and predicted management outcomes. Using standard and robust approaches to decision making, we determined that improved water management and perhaps some changes in hunting practices would be expected to produce the best management outcomes in the face of uncertainty about what is limiting bobwhite abundance. We used a criterion called the expected value of perfect information to determine that a robust management strategy may perform nearly as well as an optimal management strategy (i.e., a strategy that is expected to perform best, given the relative importance of different management objectives) with all uncertainty resolved. We used the expected value of partial information to determine that management performance could be increased most by eliminating uncertainty over excessive-harvest and human-disturbance hypotheses. Beyond learning about the factors limiting bobwhites, adoption of a dynamic management strategy, which recognizes temporal changes in resource and environmental conditions, might produce the greatest management benefit. Our research demonstrates that robust approaches to decision making, combined with estimates of the value of information, can offer considerable insight into preferred management approaches when great uncertainty exists about system dynamics and the effects of management.
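
    The expected value of perfect information criterion used above has a compact definition: the gap between the expected payoff when the limiting hypothesis is known before acting and the best single action under the prior. A toy sketch with invented payoffs and priors, not the study's elicited values:

```python
import numpy as np

# Rows: management actions; columns: hypotheses limiting bobwhite abundance.
# Payoffs (e.g. expected abundance gain) and priors are purely illustrative.
payoff = np.array([
    [8.0, 2.0, 1.0],   # improve water management
    [1.0, 7.0, 2.0],   # restrict harvest
    [2.0, 1.0, 6.0],   # reduce human disturbance
])
prior = np.array([0.5, 0.3, 0.2])        # P(hypothesis)

best_under_uncertainty = (payoff @ prior).max()        # best single action now
best_with_perfect_info = payoff.max(axis=0) @ prior    # best action per true state

evpi = best_with_perfect_info - best_under_uncertainty
print(f"EVPI = {evpi:.2f} (maximum worth of fully resolving the uncertainty)")
```

    The expected value of partial information mentioned in the abstract is the analogous gap when only a subset of hypotheses is resolved.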

  26. Information Fusion Issues in the UK Environmental Science Community

    NASA Astrophysics Data System (ADS)

    Giles, J. R.

    2010-12-01

    The Earth is a complex, interacting system which cannot be neatly divided by discipline boundaries. To gain an holistic understanding of even a component of an Earth System requires researchers to draw information from multiple disciplines and integrate it to develop a broader understanding. But the barriers to achieving this are formidable. Research funders attempting to encourage the integration of information across disciplines need to take into account cultural issues, the impact of intrusion of projects on existing information systems, ontologies and semantics, scale issues, heterogeneity and the uncertainties associated with combining information from diverse sources. Culture - There is a cultural dualism in the environmental sciences where information sharing is both rewarded and discouraged. Researchers who share information both gain new opportunities and risk reducing their chances of being first author in a high-impact journal. The culture of the environmental science community has to be managed to ensure that information fusion activities are encouraged. Intrusion - Existing information systems have an inertia of their own because of the intellectual and financial capital invested within them. Information fusion activities must recognise and seek to minimise the potential impact of their projects on existing systems. Low-intrusion information fusion systems such as OGC web services and the OpenMI Standard are to be preferred to wholesale replacement of existing systems. Ontology and Semantics - Linking information across disciplines requires a clear understanding of the concepts deployed in the vocabulary used to describe them. Such work is a critical first step to creating routine information fusion. It is essential that national bodies, such as geological survey organisations, document and publish their ontologies, semantics, etc. Scale - Environmental processes operate at scales ranging from microns to the scale of the Solar System and potentially beyond. The many different scales involved provide serious challenges to information fusion which need to be researched. Heterogeneity - Natural systems are heterogeneous; that is, they consist of multiple components, each of which may have considerable internal variation. Modelling Earth Systems requires recognition of this inherent complexity. Uncertainty - Understanding the uncertainties within a single information source can be difficult. Understanding the uncertainties across a system of linked models, each drawn from multiple information resources, represents a considerable challenge that must be addressed. The challenges to overcome appear insurmountable to individual research groups; but the potential rewards, in terms of a fuller scientific understanding of Earth Systems, are significant. A major international effort must be mounted to tackle these barriers and enable routine information fusion.

  27. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping of potential management activities over the most important factors or processes, to influence the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).

  28. BAUM: improving genome assembly by adaptive unique mapping and local overlap-layout-consensus approach.

    PubMed

    Wang, Anqi; Wang, Zhanyu; Li, Zheng; Li, Lei M

    2018-06-15

    It is highly desirable to assemble genomes of high continuity and consistency at low cost. The current bottleneck in draft genome continuity using second generation sequencing (SGS) reads is primarily caused by uncertainty among repetitive sequences. Even though single-molecule real-time sequencing technology is very promising for overcoming the uncertainty issue, its relatively high cost and error rate add burdens on budget and computation. Many long-read assemblers take the overlap-layout-consensus (OLC) paradigm, which is less sensitive to sequencing errors, heterozygosity and variability of coverage. However, current assemblers of SGS data do not sufficiently take advantage of the OLC approach. Aiming at minimizing uncertainty, the proposed method, BAUM, breaks the whole genome into regions by adaptive unique mapping; then local OLC is used to assemble each region in parallel. BAUM can (i) perform reference-assisted assembly based on the genome of a close species or (ii) improve the results of existing assemblies that are obtained based on short or long sequencing reads. Tests on two eukaryote genomes, a wild rice Oryza longistaminata and a parrot Melopsittacus undulatus, show that BAUM achieved substantial improvement in genome size and continuity. Besides, BAUM reconstructed a considerable amount of repetitive regions that had failed to be assembled by existing short read assemblers. We also propose statistical approaches to control the uncertainty in different steps of BAUM. http://www.zhanyuwang.xin/wordpress/index.php/2017/07/21/baum. Supplementary data are available at Bioinformatics online.

  29. Unnecessary surgery.

    PubMed Central

    Leape, L L

    1989-01-01

    The extent of unnecessary surgery has been the object of considerable speculation and occasional wild accusation in recent years. Most evidence of the existence of unnecessary surgery, such as information from studies of geographic variations and the results of second surgical opinion programs, is circumstantial. However, results from the few studies that have measured unnecessary surgery directly indicate that for some highly controversial operations the fraction that are unwarranted could be as high as 30 percent. Most unnecessary surgery results from physician uncertainty about the effectiveness of an operation. Elimination of this uncertainty requires more efficient production and dissemination of scientific information about clinical effectiveness. In the absence of adequate data from scientific studies, the use of a consensus of expert opinion, disseminated by means of comprehensive practice guidelines, offers the best opportunity to identify and eliminate unnecessary surgery. PMID:2668237

  30. The Influence of Boundary Layer Parameters on Interior Noise

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Rocha, Joana

    2012-01-01

    Predictions of the wall pressure in the turbulent boundary layer of an aerospace vehicle can differ substantially from measurement due to phenomena that are not well understood. Characterizing the phenomena will require additional testing at considerable cost. Before expending scarce resources, it is desired to quantify the effect of the uncertainty in wall pressure predictions and measurements on structural response and acoustic radiation. A sensitivity analysis is performed on four parameters of the Corcos cross spectrum model: power spectrum, streamwise and cross-stream coherence lengths, and Mach number. It is found that at lower frequencies, where high power levels and long coherence lengths exist, the radiated sound power prediction has up to 7 dB of uncertainty due to power spectrum levels, with streamwise and cross-stream coherence lengths contributing equally to the total.
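
    The Corcos model named above has a standard analytic form, which makes the sensitivity of coherence length to its decay constants easy to see. A sketch; the decay constants and flow numbers below are typical textbook values, not the study's.

```python
import numpy as np

def corcos_cross_spectrum(xi, eta, omega, phi, Uc, alpha_x=0.1, alpha_y=0.7):
    """Corcos wall-pressure cross spectrum (textbook form; alpha_x, alpha_y
    are typical literature values, not mission-specific)."""
    decay = np.exp(-alpha_x * np.abs(omega * xi) / Uc
                   - alpha_y * np.abs(omega * eta) / Uc)
    return phi * decay * np.exp(1j * omega * xi / Uc)   # convective phase

omega = 2 * np.pi * 500.0        # 500 Hz
Uc = 0.7 * 240.0                 # convection velocity ~0.7 * flow speed (assumed)

g = corcos_cross_spectrum(0.05, 0.0, omega, 1.0, Uc)
print(f"|Gamma| at 5 cm streamwise separation: {abs(g):.3f}")

# Streamwise coherence length scales as Uc / (alpha_x * omega), so a +/-50%
# change in alpha_x maps one-to-one onto coherence-length uncertainty.
for ax in (0.05, 0.10, 0.15):
    print(f"alpha_x = {ax:.2f}: L_x = {100 * Uc / (ax * omega):.1f} cm")
```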

  31. The development and application of multi-criteria decision-making tool with consideration of uncertainty: the selection of a management strategy for the bio-degradable fraction in the municipal solid waste.

    PubMed

    El Hanandeh, Ali; El-Zein, Abbas

    2010-01-01

    A modified version of the multi-criteria decision aid ELECTRE III has been developed to account for uncertainty in criteria weightings and threshold values. The new procedure, called ELECTRE-SS, modifies the exploitation phase in ELECTRE III through a new definition of the pre-order and the introduction of a ranking index (RI). The new approach accommodates cases where incomplete or uncertain preference data are present. The method is applied to a case of selecting a management strategy for the bio-degradable fraction in the municipal solid waste of Sydney. Ten alternatives are compared against 11 criteria. The results show that anaerobic digestion (AD) and composting of paper are less environmentally sound options than recycling. AD is likely to out-perform incineration where a market for heating does not exist. Moreover, landfilling can be a sound alternative when considering overall performance and conditions of uncertainty.

  32. Health and productivity gains from better indoor environments and their relationship with building energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, William J.

    2000-04-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of communicable respiratory illness, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in the estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the U.S., the estimated potential annual savings and productivity gains are $6 to $14 billion from reduced respiratory disease, $2 to $4 billion from reduced allergies and asthma, $10 to $30 billion from reduced sick building syndrome symptoms, and $20 to $160 billion from direct improvements in worker performance that are unrelated to health. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  33. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    DTIC Science & Technology

    2017-11-27

    ARL-TR-8218, November 2017. US Army Research Laboratory technical report: Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey.

  34. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    NASA Astrophysics Data System (ADS)

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications such as the development of new fabrics and construction materials, in the semiconductor industry, packaging, vacuum techniques, etc. The demand for this kind of measurement has grown considerably, and thus many different methods for measuring the WVPR have been developed and standardized within numerous national and international standards. However, comparison of existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for development of a corresponding reference measurement standard. This paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. The paper also presents a physical model, which tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC guide to the expression of uncertainty in measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0 % for a WVPR of 6.71 mg·h−1 (corresponding to a permeance of 30.4 mg·m−2·day−1·hPa−1).
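
    The GUM evaluation cited above combines component uncertainties through sensitivity coefficients. A sketch with an assumed pressure-rise measurement equation and an invented uncertainty budget, not the authors' actual model:

```python
import numpy as np

# Assumed model: WVPR inferred from the rate of vapour-pressure rise dp/dt
# in a known volume V at temperature T (ideal-gas conversion to mass flow).
def wvpr(dpdt, V, T):
    Mw, R = 0.018015, 8.314462        # kg/mol, J/(mol K)
    return dpdt * V * Mw / (R * T)    # kg/s of permeated water vapour

x = np.array([2.9e-3, 1.1e-3, 296.15])   # dp/dt (Pa/s), V (m^3), T (K)
u = np.array([2.0e-5, 5.0e-6, 0.05])     # standard uncertainties (assumed)

# Sensitivity coefficients by central finite differences.
c = np.empty(3)
for i in range(3):
    dx = np.zeros(3)
    dx[i] = 1e-6 * x[i]
    c[i] = (wvpr(*(x + dx)) - wvpr(*(x - dx))) / (2 * dx[i])

uc = np.sqrt(np.sum((c * u) ** 2))   # combined standard uncertainty
U = 2 * uc                           # expanded uncertainty, k = 2
y = wvpr(*x)
print(f"WVPR = {y * 3.6e9:.2f} mg/h, relative U(k=2) = {100 * U / y:.1f} %")
```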

  35. Apparatus and methodology for fire gas characterization by means of animal exposure

    NASA Technical Reports Server (NTRS)

    Marcussen, W. H.; Hilado, C. J.; Furst, A.; Leon, H. A.; Kourtides, D. A.; Parker, J. A.; Butte, J. C.; Cummins, J. M.

    1976-01-01

    While there is a great deal of information available from small-scale laboratory experiments and for relatively simple mixtures of gases, considerable uncertainty exists regarding appropriate bioassay techniques for the complex mixture of gases generated in full-scale fires. Apparatus and methodology have been developed, based on the current state of the art, for determining the effects on laboratory animals of fire gases in the critical first 10 minutes of a full-scale fire. This information is presented for its potential value and use while further improvements are being made.

  36. Choice of generic antihypertensive drugs for the primary prevention of cardiovascular disease--a cost-effectiveness analysis.

    PubMed

    Wisløff, Torbjørn; Selmer, Randi M; Halvorsen, Sigrun; Fretheim, Atle; Norheim, Ole F; Kristiansen, Ivar Sønbø

    2012-04-04

    Hypertension is one of the leading causes of cardiovascular disease (CVD). A range of antihypertensive drugs exists, and their prices vary widely, mainly due to patent rights. The objective of this study was to explore the cost-effectiveness of different generic antihypertensive drugs as first, second and third choices for primary prevention of cardiovascular disease. We used the Norwegian Cardiovascular Disease model (NorCaD) to simulate the cardiovascular life of patients from hypertension without symptoms until they were all dead or 100 years old. The risk of CVD events and costs were based on recent Norwegian sources. In single-drug treatment, all antihypertensives are cost-effective compared to no drug treatment. In the base-case analysis, the first, second and third choices of antihypertensive were a calcium channel blocker, a thiazide and an angiotensin-converting enzyme inhibitor. However, the sensitivity and scenario analyses indicated considerable uncertainty, in that angiotensin receptor blockers, as well as angiotensin-converting enzyme inhibitors, beta blockers and thiazides, could be the most cost-effective antihypertensive drugs. Generic antihypertensives are cost-effective in a wide range of risk groups. There is considerable uncertainty, however, regarding which drug is the most cost-effective.
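
    The comparison described above reduces to incremental cost-effectiveness ratios plus probabilistic sensitivity analysis. A sketch with invented costs, QALYs and threshold (not NorCaD outputs):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000
wtp = 30_000.0      # willingness-to-pay per QALY (assumed threshold)

# Hypothetical lifetime discounted costs and QALYs for two generic drugs,
# with Monte Carlo draws standing in for parameter uncertainty (PSA).
cost = {"CCB": rng.normal(2400, 300, N), "thiazide": rng.normal(2100, 250, N)}
qaly = {"CCB": rng.normal(14.30, 0.12, N), "thiazide": rng.normal(14.26, 0.12, N)}

dC = cost["CCB"].mean() - cost["thiazide"].mean()
dE = qaly["CCB"].mean() - qaly["thiazide"].mean()
print(f"ICER = {dC / dE:,.0f} per QALY gained")

# Probability the CCB is cost-effective at the threshold (one CEAC point):
nb = {d: wtp * qaly[d] - cost[d] for d in cost}      # net monetary benefit
print(f"P(CCB cost-effective) = {(nb['CCB'] > nb['thiazide']).mean():.2f}")
```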

  37. Assessing uncertain human exposure to ambient air pollution using environmental models in the Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Pebesma, E.; Denby, B.

    2012-04-01

    Ambient air quality can have significant impacts on human health by causing respiratory and cardio-vascular diseases. The pollutant concentration a person is exposed to can differ considerably between individuals, depending on their daily routine and movement patterns. Using a straightforward approach, this exposure can be estimated by integration of individual space-time paths and spatio-temporally resolved ambient air quality data. To allow a realistic exposure assessment, it is furthermore important to consider uncertainties due to input and model errors. In this work, we present a generic, web-based approach for estimating individual exposure by integration of uncertain position and air quality information, implemented as a web service. Following the Model Web initiative, envisioning an infrastructure for deploying, executing and chaining environmental models as services, existing models and data sources, e.g. for air quality, can be used to assess exposure. The service therefore needs to deal with different formats, resolutions and uncertainty representations provided by model or data services. Potential mismatch can be accounted for by transformation of uncertainties and (dis-)aggregation of data under consideration of changes in the uncertainties, using components developed in the UncertWeb project. In UncertWeb, the Model Web vision is extended to an uncertainty-enabled Model Web, where services can process and communicate uncertainties in the data and models. The propagation of uncertainty to the exposure results is quantified using Monte Carlo simulation, combining different realisations of positions and ambient concentrations. Two case studies were used to evaluate the developed exposure assessment service. In the first study, GPS tracks with a positional uncertainty of a few meters, collected in the urban area of Münster, Germany, were used to assess exposure to PM10 (particulate matter smaller than 10 µm). Air quality data were provided by an uncertainty-enabled air quality model system which provided realisations of concentrations per hour on a 250 m x 250 m grid over Münster. The second case study uses modelled human trajectories in Rotterdam, The Netherlands. The trajectories were provided as realisations at 15 min resolution per 4-digit postal code from an activity model. Air quality estimates were provided for different pollutants as ensembles by a coupled meteorology and air quality model system on a 1 km x 1 km grid with hourly resolution. Both case studies show the successful application of the service to different resolutions and uncertainty representations.
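
    The exposure integration described above is a sum of concentration along the space-time path. A sketch with synthetic trajectory and concentration realisations standing in for the web services' outputs; the grid sizes, error models and pollutant levels are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
T, R = 24, 100                     # hours in the day, Monte Carlo realisations

# Hypothetical inputs: an hourly grid-cell index along one person's space-time
# path, and R realisations of the hourly PM10 field (ug/m^3) over 50 cells.
cell_of_hour = rng.integers(0, 50, size=T)
conc = rng.lognormal(mean=np.log(25), sigma=0.3, size=(R, T, 50))

# Positional uncertainty: each realisation jitters the visited cell by +/-1.
jitter = rng.integers(-1, 2, size=(R, T))
cells = np.clip(cell_of_hour + jitter, 0, 49)

# Time-weighted exposure per realisation: sum_t C_r[t, cell_r(t)] * dt, dt = 1 h.
exposure = conc[np.arange(R)[:, None], np.arange(T)[None, :], cells].sum(axis=1)
print(f"daily PM10 exposure: {exposure.mean():.0f} "
      f"[{np.percentile(exposure, 2.5):.0f}, "
      f"{np.percentile(exposure, 97.5):.0f}] ug*h/m^3")
```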

  18. Management of "dual diagnosis" patients : consensus, controversies and considerations.

    PubMed

    Basu, D; Gupta, N

    2000-01-01

    The term 'dual diagnosis' denotes the coexistence of substance use disorder(s) and other, non-substance-use, psychiatric disorder(s). The last two decades, and especially the 1990s, have witnessed tremendous research and clinical interest in this previously neglected area. India, however, lags behind, despite indications that the problem exists here too. The current approach to managing such patients is the 'integrated treatment model', in which the same clinician (or team of clinicians) provides treatment for both disorders at the same time, treating both with equal understanding and importance. Both pharmacotherapy and psychosocial therapies are specifically designed keeping in mind the 'integrated' philosophy of treatment. The specific principles and components are described. Areas of difficulty, uncertainty, and future considerations are highlighted, with a note on the Indian setting.

  19. Reliability considerations for the total strain range version of strainrange partitioning

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Wu, Y. T.

    1984-01-01

    A proposed total strainrange version of strainrange partitioning (SRP), intended to enhance the manner in which SRP is applied to life prediction, is considered, with emphasis on how advanced reliability technology can be applied to perform risk analysis and to derive safety check expressions. Uncertainties existing in the design factors associated with life prediction of a component which experiences the combined effects of creep and fatigue can be identified. Examples illustrate how reliability analyses of such a component can be performed when all design factors in the SRP model are random variables reflecting these uncertainties. The Rackwitz-Fiessler and Wu algorithms are used, and estimates of the safety index and the probability of failure are demonstrated for an SRP problem. Methods of analysis of creep-fatigue data, with emphasis on procedures for producing synoptic statistics, are presented. The importance of the contribution of the uncertainties associated with small sample sizes (fatigue data) to risk estimates is demonstrated. The procedure for deriving a safety check expression for possible use in a design criteria document is presented.
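
    The report's reliability computations use the Rackwitz-Fiessler and Wu algorithms; as a rough stand-in for the same question (what is the probability that predicted life falls short of the applied cycles?), a brute-force Monte Carlo estimate of the failure probability and the equivalent safety index can look like this, with invented lognormal distributions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Monte Carlo stand-in for a first-order reliability analysis:
# failure occurs when predicted life N is less than the applied cycles n.
# The lognormal parameters below are invented for illustration only.
n_samples = 1_000_000
N = rng.lognormal(mean=np.log(1e4), sigma=0.5, size=n_samples)  # cycles to failure
n = rng.lognormal(mean=np.log(2e3), sigma=0.3, size=n_samples)  # applied cycles

pf = np.mean(N < n)                 # probability of failure
beta = -norm.ppf(pf)                # equivalent safety index
print(f"P_f = {pf:.2e}, safety index beta = {beta:.2f}")
```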

  20. Legal considerations in infectious diseases and dentistry.

    PubMed

    Burris, S

    1996-04-01

    Dentists, similar to other professionals subject to legal regulation, often have an overly simple view of the legal system. Communicable diseases present questions on the cutting edge of the law, and, as the previous discussion makes perhaps painfully clear, there is considerable uncertainty on many important legal points. Legal uncertainty is often a reflection of social or scientific uncertainty. Clear answers emerge less from the words of lawyers and judges than from the actions of professionals themselves, who ultimately set the standard of care. In any area of legal uncertainty, the dentist is best advised to adhere to the best scientific information available and to meet the ethical standards of the profession.

  1. Harmonization of reimbursement and regulatory approval processes: a systematic review of international experiences.

    PubMed

    Tsoi, Bernice; Masucci, Lisa; Campbell, Kaitryn; Drummond, Michael; O'Reilly, Daria; Goeree, Ron

    2013-08-01

    A considerable degree of overlap exists between reimbursement and regulatory approval of health technologies, and harmonization of certain aspects is both possible and feasible. Various models of harmonization have been suggested, drawing on a number of practical attempts. Based on a review of the literature, approaches can be categorized into those focused on reducing uncertainty and developing economies of scale in the evidentiary requirements, and/or those aligning timeframes and logistical aspects of the review process. These strategies can further be classified based on the expected level of structural and organizational change required to implement them within existing processes. Passive processes require less modification, whereas active processes are associated with greater restructuring. Attempts at harmonization so far have raised numerous legal and practical issues, and these must be considered when introducing a more harmonized framework into the existing regulatory and reimbursement arrangements.

  2. Communicating the Signal of Climate Change in The Presence of Non-Random Noise

    NASA Astrophysics Data System (ADS)

    Mann, M. E.

    2015-12-01

    The late Stephen Schneider spoke eloquently of the double ethical bind that we face: we must strive to communicate effectively but honestly. This is no simple task given the considerable "noise" generated in our public discourse by vested interests working to misinform the public. To do so, we must convey what is known in plainspoken, jargon-free language, while acknowledging the real uncertainties that exist. Further, we must explain the implications of those uncertainties, which in many cases imply the possibility of greater, not lesser, risk. Finally, we must not be averse to discussing the policy implications of the science, lest we fail to provide our audience with critical information that can help them make informed choices about their own actions as citizens. I will use examples from my current collaboration with Washington Post editorial cartoonist Tom Toles.

  3. Potential Nationwide Improvements in Productivity and Health from Better Indoor Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, W.J.; Rosenfeld, A.H.

    1998-05-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of respiratory disease, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in our estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the U.S., we estimate potential annual savings and productivity gains of $6 to $19 billion from reduced respiratory disease, $1 to $4 billion from reduced allergies and asthma, $10 to $20 billion from reduced sick building syndrome symptoms, and $12 to $125 billion from direct improvements in worker performance that are unrelated to health. In two example calculations, the potential financial benefits of improving indoor environments exceed costs by factors of 8 and 14. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  4. Potential nationwide improvements in productivity and health from better indoor environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, W.J.; Rosenfeld, A.H.

    1998-07-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of respiratory disease, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in their estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the US, the authors estimate potential annual savings and productivity gains of $6 to $19 billion from reduced respiratory disease, $1 to $4 billion from reduced allergies and asthma, $10 to $20 billion from reduced sick building syndrome symptoms, and $12 to $125 billion from direct improvements in worker performance that are unrelated to health. In two example calculations, the potential financial benefits of improving indoor environments exceed costs by factors of 8 and 14. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  5. Fitting Formulae and Constraints for the Existence of S-type and P-type Habitable Zones in Binary Systems

    NASA Astrophysics Data System (ADS)

    Wang, Zhaopeng; Cuntz, Manfred

    2017-10-01

    We derive fitting formulae for the quick determination of the existence of S-type and P-type habitable zones (HZs) in binary systems. Based on previous work, we consider the limits of the climatological HZ in binary systems (which sensitively depend on the system parameters) based on a joint constraint encompassing planetary orbital stability and a habitable region for a possible system planet. Additionally, we employ updated results on planetary climate models obtained by Kopparapu and collaborators. Our results are applied to four P-type systems (Kepler-34, Kepler-35, Kepler-413, and Kepler-1647) and two S-type systems (TrES-2 and KOI-1257). Our method allows us to gauge the existence of climatological HZs for these systems in a straightforward manner with detailed consideration of the observational uncertainties. Further applications may include studies of other existing systems as well as systems to be identified through future observational campaigns.

  6. Fitting Formulae and Constraints for the Existence of S-type and P-type Habitable Zones in Binary Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Zhaopeng; Cuntz, Manfred, E-mail: zhaopeng.wang@mavs.uta.edu, E-mail: cuntz@uta.edu

    We derive fitting formulae for the quick determination of the existence of S-type and P-type habitable zones (HZs) in binary systems. Based on previous work, we consider the limits of the climatological HZ in binary systems (which sensitively depend on the system parameters) based on a joint constraint encompassing planetary orbital stability and a habitable region for a possible system planet. Additionally, we employ updated results on planetary climate models obtained by Kopparapu and collaborators. Our results are applied to four P-type systems (Kepler-34, Kepler-35, Kepler-413, and Kepler-1647) and two S-type systems (TrES-2 and KOI-1257). Our method allows us to gauge the existence of climatological HZs for these systems in a straightforward manner with detailed consideration of the observational uncertainties. Further applications may include studies of other existing systems as well as systems to be identified through future observational campaigns.

  7. An Approximation Solution to Refinery Crude Oil Scheduling Problem with Demand Uncertainty Using Joint Constrained Programming

    PubMed Central

    Duan, Qianqian; Yang, Genke; Xu, Guanglin; Pan, Changchun

    2014-01-01

    This paper is devoted to developing an approximation method for scheduling refinery crude oil operations that takes demand uncertainty into consideration. In the stochastic model, the demand uncertainty is modeled as random variables which follow a joint multivariate distribution with a specific correlation structure. Compared to the deterministic models in existing works, the stochastic model can be more practical for optimizing crude oil operations. Using joint chance constraints, the demand uncertainty is treated by specifying a proximity level on the satisfaction of product demands. However, the joint chance constraints are usually strongly nonlinear and consequently hard to handle directly. In this paper, an approximation method combining a relax-and-tight technique is used to approximately transform the joint chance constraints into a series of parameterized linear constraints so that the complicated problem can be attacked iteratively. The basic idea behind this approach is to approximate, as much as possible, the nonlinear constraints by many easily handled linear constraints, which leads to a good balance between problem complexity and tractability. Case studies are conducted to demonstrate the proposed methods. Results show that the operation cost can be reduced effectively compared with the case that does not consider the demand correlation. PMID:24757433

  8. An approximation solution to refinery crude oil scheduling problem with demand uncertainty using joint constrained programming.

    PubMed

    Duan, Qianqian; Yang, Genke; Xu, Guanglin; Pan, Changchun

    2014-01-01

    This paper is devoted to developing an approximation method for scheduling refinery crude oil operations that takes demand uncertainty into consideration. In the stochastic model, the demand uncertainty is modeled as random variables which follow a joint multivariate distribution with a specific correlation structure. Compared to the deterministic models in existing works, the stochastic model can be more practical for optimizing crude oil operations. Using joint chance constraints, the demand uncertainty is treated by specifying a proximity level on the satisfaction of product demands. However, the joint chance constraints are usually strongly nonlinear and consequently hard to handle directly. In this paper, an approximation method combining a relax-and-tight technique is used to approximately transform the joint chance constraints into a series of parameterized linear constraints so that the complicated problem can be attacked iteratively. The basic idea behind this approach is to approximate, as much as possible, the nonlinear constraints by many easily handled linear constraints, which leads to a good balance between problem complexity and tractability. Case studies are conducted to demonstrate the proposed methods. Results show that the operation cost can be reduced effectively compared with the case that does not consider the demand correlation.
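
    To make the joint chance constraint concrete: it requires that all product demands be met simultaneously with probability at least some level alpha. A minimal Monte Carlo check of a candidate schedule against correlated demands (synthetic numbers, not the paper's relax-and-tight approximation) is sketched below:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustration of a joint chance constraint (not the paper's algorithm):
# require Prob( supply_i >= demand_i for all products i ) >= alpha,
# with demands following a correlated multivariate normal (synthetic numbers).
mean = np.array([100.0, 80.0, 60.0])
corr = np.array([[1.0, 0.5, 0.2],
                 [0.5, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])
std = np.array([10.0, 8.0, 6.0])
cov = corr * np.outer(std, std)

demand = rng.multivariate_normal(mean, cov, size=100_000)
supply = np.array([115.0, 92.0, 70.0])       # a candidate schedule's output

joint_ok = np.mean(np.all(demand <= supply, axis=1))
print(f"joint satisfaction probability: {joint_ok:.3f}")  # compare to alpha
```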

  9. A kriging metamodel-assisted robust optimization method based on a reverse model

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimal and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures, where a large amount of computational effort is required because the robustness of each candidate solution delivered from the outer level must be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. By ignoring the interpolation uncertainties from kriging, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of the kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner-level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed by the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
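
    The interpolation uncertainty that IK-RMRO accounts for is the kriging predictive variance. A minimal zero-mean (simple kriging) sketch with an invented squared-exponential kernel shows how both the mean prediction and its variance are obtained:

```python
import numpy as np

# Minimal simple-kriging (zero-mean Gaussian process) sketch, illustrating
# the interpolation uncertainty that IK-RMRO accounts for. The kernel and
# hyperparameters are invented for illustration.
def kernel(a, b, length=0.3, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

x_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = np.sin(2 * np.pi * x_train)            # toy true function
x_new = np.linspace(0.0, 1.0, 9)

K = kernel(x_train, x_train) + 1e-10 * np.eye(len(x_train))
k_star = kernel(x_new, x_train)
alpha = np.linalg.solve(K, y_train)

mu = k_star @ alpha                              # kriging mean prediction
var = kernel(x_new, x_new).diagonal() - np.einsum(
    "ij,ji->i", k_star, np.linalg.solve(K, k_star.T))  # predictive variance

for x, m, v in zip(x_new, mu, var):
    print(f"x={x:.2f}  mean={m:+.3f}  std={np.sqrt(max(v, 0)):.3f}")
```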

  10. Considering Governance for Patient Access to E-Medical Records.

    PubMed

    Day, Karen; Wells, Susan

    2015-01-01

    Giving people access to their medical records could have a transformative effect on healthcare delivery and use. Our research aimed to explore the concerns and attitudes around giving people electronic access to their medical records through patient portals. We conducted 28 semi-structured interviews with 30 people, asking questions about portal design, organisational implications and governance. We report the findings of the governance considerations raised during the interviews. These revealed that (1) there is uncertainty about the possible design and extent of giving people access to their medical records to view/use, (2) existing policies about patient authentication, proxy, and privacy require modification, and (3) existing governance structures and functions require further examination and adjustment. Future research should include more input from patients and health informaticians.

  11. Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, C.; Gupta, P.C.

    1995-05-01

    Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals and the limitations of the forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.

  12. Sources of uncertainty in annual forest inventory estimates

    Treesearch

    Ronald E. McRoberts

    2000-01-01

    Although design and estimation aspects of annual forest inventories have begun to receive considerable attention within the forestry and natural resources communities, little attention has been devoted to identifying the sources of uncertainty inherent in these systems or to assessing the impact of those uncertainties on the total uncertainties of inventory estimates....

  13. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  14. On the Meaning of Feedback Parameter, Transient Climate Response, and the Greenhouse Effect: Basic Considerations and the Discussion of Uncertainties

    NASA Astrophysics Data System (ADS)

    Kramm, Gerhard

    2010-07-01

    In this paper we discuss the meaning of feedback parameter, greenhouse effect and transient climate response, usually related to the globally averaged energy balance model of Schneider and Mass. After scrutinizing this model and the corresponding planetary radiation balance, we state that (a) this globally averaged energy balance model is flawed by unsuitable physical considerations, (b) the planetary radiation balance for an Earth in the absence of an atmosphere is fraught with the inappropriate assumption of a uniform surface temperature, the so-called radiative equilibrium temperature of about 255 K, and (c) the effect of the radiative anthropogenic forcing, considered as a perturbation to the natural system, is much smaller than the uncertainty involved in the solution of the model of Schneider and Mass. This uncertainty is mainly related to the empirical constants suggested by various authors and used for predicting the emission of infrared radiation by the Earth's skin. Furthermore, after inserting the absorption of solar radiation by atmospheric constituents and the exchange of sensible and latent heat between the Earth and the atmosphere into the model of Schneider and Mass, the surface temperatures become appreciably lower than the radiative equilibrium temperature. Moreover, neither the model of Schneider and Mass nor the Dines-type two-layer energy balance model for the Earth-atmosphere system, both of which contain the planetary radiation balance for an Earth in the absence of an atmosphere as an asymptotic solution, provides evidence for the existence of the so-called atmospheric greenhouse effect if realistic empirical data are used.

  15. Uncertainties in selected river water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2007-02-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of river water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out, including additional experimental data of the Elbe river. All data on compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments (500-3000 km2), reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments, none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.
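
    The effect of sampling frequency on load estimates can be illustrated directly: subsample a synthetic daily concentration and discharge series at biweekly intervals and compare the resulting annual load estimates across all possible sampling offsets. The series below is invented, not Elbe data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily series (not Elbe data): discharge Q (m3/s) and
# concentration C (mg/L) with event-driven spikes, over one year.
days = 365
Q = (200 + 80 * np.sin(np.arange(days) * 2 * np.pi / days)
     + rng.gamma(2.0, 20.0, days))
C = 5 + 0.02 * Q + rng.gamma(1.5, 2.0, days)       # conc. loosely tracks flow

true_load = np.sum(C * Q) * 86.4                   # kg/yr (mg/L * m3/s * 86.4)

# Biweekly sampling: estimate the load from every 14th day, for all offsets.
estimates = []
for offset in range(14):
    idx = np.arange(offset, days, 14)
    # flow-weighted ratio estimator: sampled flux scaled by the full flow
    flux = np.mean(C[idx] * Q[idx]) / np.mean(Q[idx])
    estimates.append(flux * np.sum(Q) * 86.4)
estimates = np.array(estimates)

print(f"true load: {true_load:.3e} kg/yr")
print(f"biweekly estimates: {estimates.mean():.3e} "
      f"+/- {estimates.std():.3e} kg/yr")
```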

  16. Uncertainties in selected surface water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2006-09-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of surface water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected surface water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2006). A literature review was carried out, including additional experimental data of the Elbe river. All data on compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of surface water quality data. Temporal autocorrelation of surface water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments, reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments, none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  17. Assessing climate change and socio-economic uncertainties in long term management of water resources

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and identify adaptation options that are, as far as possible, robust. We have developed and coupled a system of models that includes a weather generator and simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied to the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further. Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.

  18. Cost-effective conservation of an endangered frog under uncertainty.

    PubMed

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and the cost-effectiveness ranking changed depending on whether uncertainty was considered. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effectiveness evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost-effective conservation under uncertainty. © 2015 Society for Conservation Biology.

  19. Geologic uncertainty in a regulatory environment: An example from the potential Yucca Mountain nuclear waste repository site

    NASA Astrophysics Data System (ADS)

    Rautman, C. A.; Treadway, A. H.

    1991-11-01

    Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion introduces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
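
    The failure-counting logic described above can be sketched as follows: generate equiprobable realisations of a rock property, compute a performance measure for each, and count how many fall short of the criterion. All numbers are invented for illustration and bear no relation to Yucca Mountain data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch of the failure-counting logic (numbers invented): each realisation
# is an equiprobable 1-D permeability profile; a toy proxy travel time is
# computed per realisation and compared with a performance criterion.
n_real, n_cells = 500, 100
thickness = 5.0                               # metres per cell
log10_k = rng.normal(-7.0, 0.8, (n_real, n_cells))   # log10 permeability
velocity = 10.0 ** (log10_k + 6.7)            # toy proxy velocity, m/yr

travel_time = np.sum(thickness / velocity, axis=1)   # years, per realisation
criterion = 1000.0                            # regulatory minimum, years

failures = np.sum(travel_time < criterion)
print(f"{failures} of {n_real} realisations fail the {criterion:.0f}-yr test")
```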

  20. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.

  1. The costs of future polio risk management policies.

    PubMed

    Tebbens, Radboud J Duintjer; Sangrujee, Nalinee; Thompson, Kimberly M

    2006-12-01

    Decision makers need information about the anticipated future costs of maintaining polio eradication as a function of the policy options under consideration. Given the large portfolio of options, we reviewed and synthesized the existing cost data relevant to current policies to provide context for future policies. We model the expected future costs of different strategies for continued vaccination, surveillance, and other activities that represent significant potential resource commitments. We estimate the costs of different potential policy portfolios for low-, middle-, and high-income countries to demonstrate the variability in these costs. We estimate that a global transition from routine immunization with oral poliovirus vaccine (OPV) to inactivated poliovirus vaccine (IPV) would increase the costs of managing polio globally, although routine IPV use remains less costly than routine OPV use with supplemental immunization activities. The costs of surveillance and a stockpile, while small compared to routine vaccination costs, represent important expenditures to ensure adequate response to potential outbreaks. The uncertainty and sensitivity analyses highlight important uncertainty in the aggregated costs and demonstrate that the discount rate and uncertainty in the price and administration cost of IPV drive the expected incremental cost of routine IPV vs. OPV immunization.

  2. A bi-objective model for robust yard allocation scheduling for outbound containers

    NASA Astrophysics Data System (ADS)

    Liu, Changchun; Zhang, Canrong; Zheng, Li

    2017-01-01

    This article examines the yard allocation problem for outbound containers, with consideration of uncertainty factors, mainly the arrival and operation times of calling vessels. Based on a time-buffer-inserting method, a bi-objective model is constructed to minimize the total operational cost and to maximize robustness against the uncertainty. Due to the NP-hardness of the constructed model, a two-stage heuristic is developed to solve the problem. In the first stage, initial solutions are obtained by a greedy algorithm that looks n steps ahead, with the uncertainty factors set to their respective expected values; in the second stage, based on the solutions obtained in the first stage and with consideration of the uncertainty factors, a neighbourhood search heuristic is employed to generate robust solutions that better withstand fluctuations in the uncertainty factors. Finally, extensive numerical experiments are conducted to test the performance of the proposed method.

  3. Medical retirement from sport after concussions

    PubMed Central

    Davis-Hayes, Cecilia; Baker, David R.; Bottiglieri, Thomas S.; Levine, William N.; Desai, Natasha; Gossett, James D.

    2018-01-01

    Purpose of review: In patients with a considerable history of sports-related concussion, the decision of when to discontinue participation in sports due to medical concerns, including neurologic disorders, has potentially life-altering consequences, especially for young athletes, and merits a comprehensive evaluation involving nuanced discussion. Few resources exist to aid the sports medicine provider. Recent findings: In this narrative review, we describe 10 prototypical vignettes based upon the authors' collective experience in concussion management and propose an algorithm to help clinicians navigate retirement discussions. Issues for consideration include absolute and relative contraindications to return to sport, ranging from clinical or radiographic evidence of lasting neurologic injury to prolonged concussion recovery periods or reduced injury threshold, to patient-centered factors including personal identity through sport, financial motivations, and navigating uncertainty in the context of long-term risks. Summary: The authors propose a novel treatment algorithm based on real patient cases to guide medical retirement decisions after concussion in sport. PMID:29517059

  4. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  5. Methodological challenges for the evaluation of clinical effectiveness in the context of accelerated regulatory approval: an overview.

    PubMed

    Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert

    2017-10-01

    Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Some key issues associated with a limited evidence base include using data from nonrandomized studies, from small single-arm trials, or from single-center trials, and using surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness, based on the modeling of such data, are likewise subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required, and schemes to protect payers from the cost of uncertainty should be implemented. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  6. Methodological uncertainty in quantitative prediction of human hepatic clearance from in vitro experimental systems.

    PubMed

    Hallifax, D; Houston, J B

    2009-03-01

    Mechanistic prediction of unbound drug clearance from human hepatic microsomes and hepatocytes correlates with in vivo clearance but is both systematically low (10-20% of in vivo clearance) and highly variable, based on detailed assessments of published studies. Metabolic capacity (Vmax) of commercially available human hepatic microsomes and cryopreserved hepatocytes is log-normally distributed within wide (30-150-fold) ranges; Km is also log-normally distributed and effectively independent of Vmax, implying considerable variability in intrinsic clearance. Despite wide overlap, average capacity is 2-20-fold (depending on the P450 enzyme) greater in microsomes than hepatocytes, when both are normalised (scaled to whole liver). The in vitro ranges contrast with the relatively narrow ranges of clearance among clinical studies. The high in vitro variation probably reflects unresolved phenotypical variability among liver donors and practicalities in the processing of human liver into in vitro systems. A significant contribution from the latter is supported by evidence of low reproducibility (severalfold) of activity in cryopreserved hepatocytes, and in microsomes prepared from the same cells, between separate occasions of thawing of cells from the same liver. The large uncertainty which exists in human hepatic in vitro systems appears to dominate the overall uncertainty of in vitro-in vivo extrapolation, including uncertainties within scaling, modelling and drug-dependent effects. As such, any notion of quantitative prediction of clearance appears severely challenged.
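
    Since the abstract reports log-normally distributed and effectively independent Vmax and Km, the implied variability in intrinsic clearance (CLint = Vmax/Km) is straightforward to propagate by Monte Carlo. The medians and spreads below are invented for illustration, not measured microsome values:

```python
import numpy as np

rng = np.random.default_rng(11)

# Propagate log-normal Vmax and Km (independent, as the abstract suggests)
# into intrinsic clearance CLint = Vmax / Km. Medians and sigmas are
# invented for illustration only.
n = 100_000
vmax = rng.lognormal(mean=np.log(500.0), sigma=0.9, size=n)  # pmol/min/mg
km = rng.lognormal(mean=np.log(20.0), sigma=0.7, size=n)     # uM

clint = vmax / km                                # uL/min/mg
lo, med, hi = np.percentile(clint, [2.5, 50, 97.5])
print(f"CLint median {med:.1f}, 95% interval [{lo:.1f}, {hi:.1f}] uL/min/mg")
print(f"fold-range (97.5th/2.5th percentile): {hi / lo:.0f}")
```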

  7. Comparison of model propeller tests with airfoil theory

    NASA Technical Reports Server (NTRS)

    Durand, William F; Lesley, E P

    1925-01-01

    The purpose of the investigation covered by this report was the examination of the degree of approach which may be anticipated between laboratory tests on model airplane propellers and results computed by the airfoil theory, based on tests of airfoils representative of successive blade sections. It is known that the corrections of angles of attack and for aspect ratio, speed, and interference rest either on experimental data or on somewhat uncertain theoretical assumptions. The general situation as regards these four sets of corrections is far from satisfactory, and while it is recognized that occasion exists for the consideration of such corrections, their determination in any given case is a matter of considerable uncertainty. There exists at the present time no theory generally accepted and sufficiently comprehensive to indicate the amount of such corrections, and the application to individual cases of the experimental data available is, at best, uncertain. While the results of this first phase of the investigation are less positive than had been hoped might be the case, the establishment of the general degree of approach between the two sets of results which might be anticipated on the basis of this simpler mode of application seems to have been desirable.

  8. Section summary: Uncertainty and design considerations

    Treesearch

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  9. Remediation of Groundwater Contaminated by Nuclear Waste

    NASA Astrophysics Data System (ADS)

    Parker, Jack; Palumbo, Anthony

    2008-07-01

    A Workshop on Accelerating Development of Practical Field-Scale Bioremediation Models; An Online Meeting, 23 January to 20 February 2008; A Web-based workshop sponsored by the U.S. Department of Energy Environmental Remediation Sciences Program (DOE/ERSP) was organized in early 2008 to assess the state of the science and knowledge gaps associated with the use of computer models to facilitate remediation of groundwater contaminated by wastes from Cold War era nuclear weapons development and production. Microbially mediated biological reactions offer a potentially efficient means to treat these sites, but considerable uncertainty exists in the coupled biological, chemical, and physical processes and their mathematical representation.

  10. Diffusion of cancer education into schools.

    PubMed

    Anderson, D M; Portnoy, B

    1989-05-01

    Though implementing health education in schools has numerous advantages, many barriers impede adoption of effective curricula. Diffusion theory provides a framework for understanding why some school systems do not implement comprehensive, sequential, and behaviorally oriented curricula. Reasons for failure to provide optimal health education include competition for time, unawareness of resources and research results, political considerations, long-range planning difficulties, lack of teacher training, lack of testing, other academic priorities, uncertainty about responsibility for health education, and costs. Health educators can overcome such difficulties by working within existing curricular agendas, political interests, and budgets, and by organizing interventions through school health councils, while recognizing local programs, conditions, and resources.

  11. Decorrelated jet substructure tagging using adversarial neural networks

    NASA Astrophysics Data System (ADS)

    Shimmin, Chase; Sadowski, Peter; Baldi, Pierre; Weik, Edison; Whiteson, Daniel; Goul, Edward; Søgaard, Andreas

    2017-10-01

    We describe a strategy for constructing a neural network jet substructure tagger which powerfully discriminates boosted decay signals while remaining largely uncorrelated with the jet mass. This reduces the impact of systematic uncertainties in background modeling while enhancing signal purity, resulting in improved discovery significance relative to existing taggers. The network is trained using an adversarial strategy, resulting in a tagger that learns to balance classification accuracy with decorrelation. As a benchmark scenario, we consider the case where large-radius jets originating from a boosted resonance decay are discriminated from a background of nonresonant quark and gluon jets. We show that in the presence of systematic uncertainties on the background rate, our adversarially trained, decorrelated tagger considerably outperforms a conventionally trained neural network, despite having a slightly worse signal-background separation power. We generalize the adversarial training technique to include a parametric dependence on the signal hypothesis, training a single network that provides optimized, interpolatable decorrelated jet tagging across a continuous range of hypothetical resonance masses, after training on discrete choices of the signal mass.
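
    The adversarial training strategy can be sketched in a few lines: a classifier is trained to separate signal from background while an adversary is trained to recover the jet mass from the classifier's output on background jets, and the classifier is penalised whenever the adversary succeeds. The sketch below uses synthetic data and toy networks, not the paper's architecture or samples:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Minimal sketch of adversarial decorrelation (synthetic data, toy networks):
# the classifier separates signal from background; the adversary tries to
# recover the jet mass from the classifier score on background jets.
n = 4096
mass = torch.rand(n, 1)                      # stand-in jet mass, [0, 1)
label = (torch.rand(n, 1) < 0.5).float()     # 1 = signal, 0 = background
x = torch.cat([mass + 0.3 * label + 0.1 * torch.randn(n, 1),
               torch.randn(n, 1) + label], dim=1)   # two toy features

clf = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
adv = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt_c = torch.optim.Adam(clf.parameters(), lr=1e-3)
opt_a = torch.optim.Adam(adv.parameters(), lr=1e-3)
bce, mse, lam = nn.BCEWithLogitsLoss(), nn.MSELoss(), 1.0
bkg = label.squeeze(1) == 0

for step in range(2000):
    # 1) adversary update: predict mass from the (frozen) classifier score
    score = torch.sigmoid(clf(x)).detach()
    loss_a = mse(adv(score[bkg]), mass[bkg])
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()

    # 2) classifier update: classify well AND defeat the adversary
    score = torch.sigmoid(clf(x))
    loss_c = bce(clf(x), label) - lam * mse(adv(score[bkg]), mass[bkg])
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
```

    The weight lam controls the trade-off the abstract describes: larger values buy more mass decorrelation at the price of slightly worse signal-background separation.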

  12. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Dyk, J; Palta, J; Bortfeld, T

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice?” AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  13. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections. To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on the understanding and modelling of both large-scale processes and their teleconnections with Middle East climate and localized processes involved in orographic precipitation.

  14. Predicting Response to Reassurances and Uncertainties in Bioterrorism Communications for Urban Populations in New York and California

    PubMed Central

    Vaughan, Elaine; Tinker, Tim L.; Truman, Benedict I.; Edelson, Paul; Morse, Stephen S.

    2015-01-01

    Recent national plans for recovery from bioterrorism acts perpetrated in densely populated urban areas acknowledge the formidable technical and social challenges of consequence management. Effective risk and crisis communication is one priority to strengthen the U.S.’s response and resilience. However, several notable risk events since September 11, 2001, have revealed vulnerabilities in risk/crisis communication strategies and infrastructure of agencies responsible for protecting civilian populations. During recovery from a significant biocontamination event, 2 goals are essential: (1) effective communication of changing risk circumstances and uncertainties related to cleanup, restoration, and reoccupancy; and (2) adequate responsiveness to emerging information needs and priorities of diverse populations in high-threat, vulnerable locations. This telephone survey study explored predictors of public reactions to uncertainty communications and reassurances from leaders related to the remediation stage of an urban-based bioterrorism incident. African American and Hispanic adults (N = 320) were randomly sampled from 2 ethnically and socioeconomically diverse geographic areas in New York and California assessed as high threat, high vulnerability for terrorism and other public health emergencies. Results suggest that considerable heterogeneity exists in risk perspectives and information needs within certain sociodemographic groups; that success of risk/crisis communication during recovery is likely to be uneven; that common assumptions about public responsiveness to particular risk communications need further consideration; and that communication effectiveness depends partly on preexisting values and risk perceptions and prior trust in leaders. Needed improvements in communication strategies are possible with recognition of where individuals start as a reference point for reasoning about risk information, and comprehension of how this influences subsequent interpretation of agencies’ actions and communications. PMID:22582813

  15. Predicting response to reassurances and uncertainties in bioterrorism communications for urban populations in New York and California.

    PubMed

    Vaughan, Elaine; Tinker, Tim L; Truman, Benedict I; Edelson, Paul; Morse, Stephen S

    2012-06-01

    Recent national plans for recovery from bioterrorism acts perpetrated in densely populated urban areas acknowledge the formidable technical and social challenges of consequence management. Effective risk and crisis communication is one priority to strengthen the U.S.'s response and resilience. However, several notable risk events since September 11, 2001, have revealed vulnerabilities in risk/crisis communication strategies and infrastructure of agencies responsible for protecting civilian populations. During recovery from a significant biocontamination event, 2 goals are essential: (1) effective communication of changing risk circumstances and uncertainties related to cleanup, restoration, and reoccupancy; and (2) adequate responsiveness to emerging information needs and priorities of diverse populations in high-threat, vulnerable locations. This telephone survey study explored predictors of public reactions to uncertainty communications and reassurances from leaders related to the remediation stage of an urban-based bioterrorism incident. African American and Hispanic adults (N=320) were randomly sampled from 2 ethnically and socioeconomically diverse geographic areas in New York and California assessed as high threat, high vulnerability for terrorism and other public health emergencies. Results suggest that considerable heterogeneity exists in risk perspectives and information needs within certain sociodemographic groups; that success of risk/crisis communication during recovery is likely to be uneven; that common assumptions about public responsiveness to particular risk communications need further consideration; and that communication effectiveness depends partly on preexisting values and risk perceptions and prior trust in leaders. Needed improvements in communication strategies are possible with recognition of where individuals start as a reference point for reasoning about risk information, and comprehension of how this influences subsequent interpretation of agencies' actions and communications.

  16. Evaluating release alternatives for a long-lived bird species under uncertainty about long-term demographic rates

    USGS Publications Warehouse

    Moore, Clinton T.; Converse, Sarah J.; Folk, Martin J.; Runge, Michael C.; Nesbitt, Stephen A.

    2012-01-01

    The release of animals to reestablish an extirpated population is a decision problem that is often attended by considerable uncertainty about the probability of success. Annual releases of captive-reared juvenile Whooping Cranes (Grus americana) were begun in 1993 in central Florida, USA, to establish a breeding, non-migratory population. Over a 12-year period, 286 birds were released, but by 2004, the introduced flock had produced only four wild-fledged birds. Consequently, releases were halted over managers' concerns about the performance of the released flock and uncertainty about the efficacy of further releases. We used data on marked, released birds to develop predictive models for addressing whether releases should be resumed, and if so, under what schedule. To examine the outcome of different release scenarios, we simulated the survival and productivity of individual female birds under a baseline model that recognized age and breeding-class structure and which incorporated empirically estimated stochastic elements. As data on wild-fledged birds from captive-reared parents were sparse, a key uncertainty that confronts release decision-making is whether captive-reared birds and their offspring share the same vital rates. Therefore, we used data on the only population of wild Whooping Cranes in existence to construct two alternatives to the baseline model. The probability of population persistence was highly sensitive to the choice of these three models. Under the baseline model, extirpation of the population was nearly certain under any scenario of resumed releases. In contrast, the model based on estimates from wild birds projected a high probability of persistence under any release scenario, including cessation of releases. Therefore, belief in either of these models suggests that further releases are an ineffective use of resources. In the third model, which simulated a population Allee effect, population persistence was sensitive to the release decision: high persistence probability was achieved only through the release of more birds, whereas extirpation was highly probable with cessation of releases. Despite substantial investment of time and effort in the release program, evidence collected to date does not favor one model over another; therefore, any decision about further releases must be made under considerable biological uncertainty. However, given an assignment of credibility weight to each model, a best, informed decision about releases can be made under uncertainty. Furthermore, if managers can periodically revisit the release decision and collect monitoring data to further inform the models, then managers have a basis for confronting uncertainty and adaptively managing releases through time.
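
    The decision framing in this abstract (competing demographic models, credibility weights, persistence probability) is easy to sketch generically. Below is a minimal Monte Carlo illustration in Python; the three stand-in models, their vital rates, the credibility weights, and all population numbers are hypothetical placeholders, not the study's estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative stand-ins for three competing demographic models
    # (baseline, wild-rate, Allee). All rates are hypothetical.
    MODELS = {
        "baseline":  dict(survival=0.80, recruits_per_female=0.05),
        "wild_rate": dict(survival=0.92, recruits_per_female=0.30),
        "allee":     dict(survival=0.88, recruits_per_female=0.15, allee_threshold=15),
    }

    def persistence(model, n0=30, releases_per_year=0, years=50, reps=1000):
        """Fraction of Monte Carlo replicates with females remaining after `years`."""
        persisted = 0
        for _ in range(reps):
            n = n0
            for _ in range(years):
                n += releases_per_year
                f = model["recruits_per_female"]
                # Allee effect: productivity collapses below a threshold flock size
                if "allee_threshold" in model and n < model["allee_threshold"]:
                    f = 0.0
                survivors = rng.binomial(n, model["survival"])
                n = survivors + rng.poisson(survivors * f)
                if n == 0:
                    break
            persisted += n > 0
        return persisted / reps

    # Equal credibility weights over the models (an assumption, for illustration)
    weights = {name: 1 / 3 for name in MODELS}
    for releases in (0, 10):
        p = sum(w * persistence(MODELS[m], releases_per_year=releases)
                for m, w in weights.items())
        print(f"releases/yr={releases}: weighted P(persistence) ~ {p:.2f}")
    ```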

  17. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts.

    PubMed

    Vilhelmsen, Troels N; Ferré, Ty P A

    2018-05-01

    Hydrological models are often set up to provide specific forecasts of interest. Owing to the inherent uncertainty in data used to derive model structure and to constrain parameter variations, the model forecasts will be uncertain. Additional data collection is often performed to minimize this forecast uncertainty. Given our common financial restrictions, it is critical that we identify data with maximal information content with respect to the forecasts of interest. In practice, this often devolves to qualitative decisions based on expert opinion. However, there is no assurance that this will lead to optimal design, especially for complex hydrogeological problems. Specifically, these complexities include considerations of multiple forecasts, shared information among potential observations, information content of existing data, and the assumptions and simplifications underlying model construction. In the present study, we extend previous data worth analyses to include: simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when selecting future measurement sets. © 2017, National Ground Water Association.
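
    A linear-Gaussian toy problem makes the sequential-versus-simultaneous point concrete. The sketch below is an assumption-laden stand-in, not the authors' method: sensitivities are random, there is a single forecast, and observation errors are uncorrelated. It computes the posterior forecast variance for candidate measurement subsets and compares a greedy one-at-a-time selection against the best simultaneous pair.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    n_par, n_obs = 6, 8                          # parameters, candidate observations
    C = np.diag(rng.uniform(0.5, 2.0, n_par))    # prior parameter covariance
    H = rng.normal(size=(n_obs, n_par))          # observation sensitivities (Jacobian)
    s = rng.normal(size=n_par)                   # forecast sensitivity vector
    r = 0.1                                      # observation error variance

    def forecast_var(subset):
        """Posterior forecast variance after collecting the observations in `subset`."""
        subset = list(subset)
        if not subset:
            return s @ C @ s
        Hs = H[subset]
        gain = C @ Hs.T @ np.linalg.inv(Hs @ C @ Hs.T + r * np.eye(len(subset)))
        return s @ (C - gain @ Hs @ C) @ s

    # Greedy (sequential) selection of two observations
    chosen = []
    for _ in range(2):
        best = min((i for i in range(n_obs) if i not in chosen),
                   key=lambda i: forecast_var(chosen + [i]))
        chosen.append(best)

    # Exhaustive (simultaneous) selection of the best pair
    best_pair = min(itertools.combinations(range(n_obs), 2), key=forecast_var)

    print("greedy pair:", chosen, "-> var", round(forecast_var(chosen), 4))
    print("best pair:  ", list(best_pair), "-> var", round(forecast_var(best_pair), 4))
    ```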

  18. Reliable design of a closed loop supply chain network under uncertainty: An interval fuzzy possibilistic chance-constrained model

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman

    2013-06-01

    This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainties. The facilities located by the proposed model concurrently satisfy both traditional objective functions and reliability considerations in CLSC network design. To address this problem, a novel mathematical model is developed that integrates the network design decisions in both forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. To make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network, leading to high complexity. Since the collection centres play an important role in this network, the reliability of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constrained mixed integer linear programming (BOIFPCCMILP) approach. Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.

  19. Enhancing future resilience in urban drainage system: Green versus grey infrastructure.

    PubMed

    Dong, Xin; Guo, Hao; Zeng, Siyu

    2017-11-01

    In recent years, the conceptual transition from fail-safe to safe-to-fail has made resilience analysis popular in urban drainage systems (UDSs), with various implications and quantifications. However, most existing definitions of UDS resilience are confined to the severity of flooding, while uncertainties from climate change and urbanization are not considered. In this research, we take into account the functional variety, topological complexity, and disturbance randomness of UDSs and define a new formula of resilience based on three parts of system severity, i.e. social severity from urban flooding, environmental severity from sewer overflow, and technological severity reflecting the safe operation of downstream facilities. A case study in Kunming, China is designed to compare the effect of green and grey infrastructure strategies on the enhancement of system resilience, together with their costs. Different system configurations with green roofs, permeable pavement and storage tanks are compared by scenario analysis with full consideration of future uncertainties induced by urbanization and climate change. The research contributes to the development of sustainability assessment of urban drainage systems with consideration of the resilience of green and grey infrastructure under future change. Finding response measures that remain effective across a variety of future scenarios is crucial to establishing sustainable urban drainage systems in the long term. Copyright © 2017. Published by Elsevier Ltd.

  20. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.

  1. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakeman, Anthony J.; Jakeman, John Davis

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.

  2. Addressing Risk in the Valuation of Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Hammerstrom, Donald J.; Woodward, James T.

    2017-06-26

    Valuation is a mechanism by which the potential worth of a transaction between two or more parties can be evaluated. Examples include valuation of transactive energy systems such as electric power systems and building energy systems. Uncertainties can manifest while exercising a valuation methodology in the form of lack of knowledge or be inherently embedded in the valuation process. Uncertainty could also exist in the temporal dimension while planning for long-term growth. This paper discusses risk considerations associated with valuation studies in support of decision-making in the presence of such uncertainties. It is often important to have foresight of uncertain entities that can impact real-world deployments, such as the comparison or ranking of two valuation studies to determine cost-benefit impacts to multiple stakeholders. The research proposes to address this challenge through simulation and sensitivity analyses to support ‘what-if’ analysis of well-defined future scenarios. This paper describes the foundational value of diagrammatic representation techniques such as the unified modeling language in understanding the implications of not addressing some of the risk elements encountered during the valuation process. The paper includes examples from generation resource adequacy assessment studies (e.g. loss of load) to illustrate the principles of risk in valuation.

  3. Density estimates of monarch butterflies overwintering in central Mexico

    PubMed Central

    Diffendorfer, Jay E.; López-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice X.; Semmens, Darius; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha⁻¹ (95% CI [2.4–80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations. PMID:28462031
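
    The mixture construction itself is only a few lines of code. In the sketch below, the six (mean, sd) pairs are invented placeholders, not the six published estimates the study synthesizes; the point is just the mechanics of sampling an equal-weight mixture and summarizing it by mean, median, and interval.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical stand-ins for six published density estimates (mean, sd),
    # in millions of butterflies per hectare -- not the actual published values.
    estimates = [(6.9, 2.0), (10.3, 3.0), (21.1, 5.0),
                 (28.1, 8.0), (50.7, 12.0), (60.9, 15.0)]
    means = np.array([m for m, _ in estimates])
    sds = np.array([sd for _, sd in estimates])

    # Equal-weight mixture: pick an estimate at random, then draw from its
    # sampling distribution; truncate at zero since density cannot be negative.
    idx = rng.integers(len(estimates), size=100_000)
    draws = rng.normal(means[idx], sds[idx])
    draws = draws[draws > 0]

    lo, hi = np.percentile(draws, [2.5, 97.5])
    print(f"mean {draws.mean():.1f}, median {np.median(draws):.1f}, "
          f"95% interval [{lo:.1f}, {hi:.1f}] million per ha")
    ```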

  4. Density estimates of monarch butterflies overwintering in central Mexico

    USGS Publications Warehouse

    Thogmartin, Wayne E.; Diffendorfer, James E.; Lopez-Hoffman, Laura; Oberhauser, Karen; Pleasants, John M.; Semmens, Brice X.; Semmens, Darius J.; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha⁻¹ (95% CI [2.4–80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.

  5. Estimates of annual survival, growth, and recruitment of a white-tailed ptarmigan population in Colorado over 43 years

    USGS Publications Warehouse

    Wann, Greg; Aldridge, Cameron L.; Braun, Clait E.

    2014-01-01

    Long-term datasets for high-elevation species are rare, and considerable uncertainty exists in understanding how high-elevation populations have responded to recent climate warming. We present estimates of demographic vital rates from a 43-year population study of white-tailed ptarmigan (Lagopus leucura), a species endemic to alpine habitats in western North America. We used capture-recapture models to estimate annual rates of apparent survival, population growth, and recruitment for breeding-age ptarmigan, and we fit winter weather covariates to models in an attempt to explain annual variation. There were no trends in survival over the study period but there was strong support for age and sex effects. The average rate of annual growth suggests a relatively stable breeding-age population (λ̄ = 1.036), but there was considerable variation between years for both population growth and recruitment rates. Winter weather covariates only explained a small amount of variation in female survival and were not an important predictor of male survival. Cumulative winter precipitation was found to have a quadratic effect on female survival, with survival being highest during years of average precipitation. Cumulative winter precipitation was positively correlated with population growth and recruitment rates, although this covariate only explained a small amount of annual variation in these rates and there was considerable uncertainty among the models tested. Our results provide evidence for an alpine-endemic population that has not experienced extirpation or drastic declines. However, more information is needed to understand risks and vulnerabilities of warming effects on juveniles as our analysis was confined to determination of vital rates for breeding-age birds.

  6. Is implementation of the 2013 Australian treatment guidelines for posttraumatic stress disorder cost-effective compared to current practice? A cost-utility analysis using QALYs and DALYs.

    PubMed

    Mihalopoulos, Cathrine; Magnus, Anne; Lal, Anita; Dell, Lisa; Forbes, David; Phelps, Andrea

    2015-04-01

    To assess, from a health sector perspective, the incremental cost-effectiveness of three treatment recommendations in the most recent Australian Clinical Practice Guidelines for posttraumatic stress disorder (PTSD). The interventions assessed are trauma-focused cognitive behavioural therapy (TF-CBT) and selective serotonin reuptake inhibitors (SSRIs) for the treatment of PTSD in adults and TF-CBT in children, compared to current practice in Australia. Economic modelling, using existing databases and published information, was used to assess cost-effectiveness. A cost-utility framework using both quality-adjusted life-years (QALYs) gained and disability-adjusted life-years (DALYs) averted was used. Costs were tracked for the duration of the respective interventions and applied to the estimated 12-month prevalent cases of PTSD in the Australian population of 2012. Simulation modelling was used to provide 95% uncertainty intervals around the incremental cost-effectiveness ratios. Consideration was also given to factors not included in the quantitative analysis but which could determine the likely uptake of the proposed intervention guidelines. TF-CBT is highly cost-effective compared to current practice at $19,000/QALY, $16,000/DALY in adults and $8900/QALY, $8000/DALY in children. In adults, 100% of uncertainty iterations fell beneath the $50,000/QALY or DALY value-for-money threshold. Using SSRIs in people already on medications is cost-effective at $200/QALY, but has considerable uncertainty around the costs and benefits. While there is a 13% chance of health loss, there is a 27% chance of the intervention dominating current practice by both saving dollars and improving health in adults. The three Guideline-recommended interventions evaluated in this study are likely to have a positive impact on the economic efficiency of the treatment of PTSD if adopted in full. While there are gaps in the evidence base, policy-makers can have considerable confidence that the recommendations assessed in the current study are likely to improve the efficiency of the mental health care sector. © The Royal Australian and New Zealand College of Psychiatrists 2014.
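
    The cost-utility logic lends itself to a compact probabilistic sensitivity analysis. All distributions and numbers in the sketch below are illustrative assumptions rather than the study's model inputs; it only shows how an ICER and a probability of cost-effectiveness against a $50,000/QALY threshold might be computed.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000  # probabilistic sensitivity analysis iterations

    # Hypothetical incremental cost and QALY gain per treated person for an
    # intervention vs. current practice -- illustrative, not the study's inputs.
    d_cost = rng.normal(1900.0, 600.0, n)   # incremental cost ($)
    d_qaly = rng.gamma(4.0, 0.025, n)       # incremental QALYs (kept positive)

    icer = d_cost / d_qaly
    threshold = 50_000.0                    # value-for-money threshold, $/QALY

    # Net monetary benefit handles negative increments more cleanly than the ICER
    nmb = threshold * d_qaly - d_cost
    print(f"median ICER: ${np.median(icer):,.0f}/QALY")
    print(f"P(cost-effective at ${threshold:,.0f}/QALY): {(nmb > 0).mean():.0%}")
    ```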

  7. [Ethics and prevention: environmental and individual disparities].

    PubMed

    Ricciardi, Claudio

    2004-01-01

    The complex interactions which exist between environmental variability, the genetic susceptibility of population subgroups, and high individual variability in age, sex, gender, ethnicity and general health status acquire an ever-increasing bioethical significance. Different risk conditions caused by toxic environmental agents and environmental inequities and inequalities are increasingly evident. "Social determinants" of health increase the probability of health effects, and effective prediction and prevention of environmental pathologies are needed. The debate on environmental inequalities caused by cultural, social and economic factors and the uncertainty about possible prevention emphasize the limits of the "bio-medical model". Ethics, with its further anthropological and philosophical considerations, may strongly help in understanding the relationship between environmental pollution and health.

  8. Cyberspace Security Econometrics System (CSES)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-07-27

    Information security continues to evolve in response to disruptive changes, with a persistent focus on information-centric controls and a healthy debate about balancing endpoint and network protection, with a goal of improved enterprise/business risk management. Economic uncertainty, intensively collaborative styles of work, virtualization, increased outsourcing and ongoing compliance pressures require careful consideration and adaptation. The CSES provides a measure (i.e. a quantitative indication) of reliability, performance, and/or safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. For a given stakeholder, CSES accounts for the variance that may exist among the stakes one attaches to meeting each requirement.

  9. Challenges and regulatory considerations in the acoustic measurement of high-frequency (>20 MHz) ultrasound.

    PubMed

    Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M

    2013-11-01

    This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is done on a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the corresponding levels in the Food and Drug Administration guidance document, 30% and 15%, respectively, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations made for minimizing the measurement uncertainty include implementing a mechanical positioning system that has sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.

  10. Adjoint-Based Implicit Uncertainty Analysis for Figures of Merit in a Laser Inertial Fusion Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, J E; Fratoni, M; Kramer, K J

    A primary purpose of computational models is to inform design decisions and, in order to make those decisions reliably, the confidence in the results of such models must be estimated. Monte Carlo neutron transport models are common tools for reactor designers. These types of models contain several sources of uncertainty that propagate onto the model predictions. Two uncertainties worthy of note are (1) experimental and evaluation uncertainties of nuclear data that inform all neutron transport models and (2) statistical counting precision, which all results of Monte Carlo codes contain. Adjoint-based implicit uncertainty analyses allow for the consideration of any number of uncertain input quantities and their effects upon the confidence of figures of merit with only a handful of forward and adjoint transport calculations. When considering a rich set of uncertain inputs, adjoint-based methods remain hundreds of times more computationally efficient than direct Monte Carlo methods. The LIFE (Laser Inertial Fusion Energy) engine is a concept being developed at Lawrence Livermore National Laboratory. Various options exist for the LIFE blanket, depending on the mission of the design. The depleted uranium hybrid LIFE blanket design strives to close the fission fuel cycle without enrichment or reprocessing, while simultaneously achieving high discharge burnups with reduced proliferation concerns. Neutron transport results that are central to the operation of the design are tritium production for fusion fuel, fission of fissile isotopes for energy multiplication, and production of fissile isotopes for sustained power. In previous work, explicit cross-sectional uncertainty analyses were performed for reaction rates related to the figures of merit for the depleted uranium hybrid LIFE blanket. Counting precision was also quantified for both the figures of merit themselves and the cross-sectional uncertainty estimates to gauge the validity of the analysis. All cross-sectional uncertainties were small (0.1-0.8%), bounded counting uncertainties, and were precise with regard to counting precision. Adjoint/importance distributions were generated for the same reaction rates. The current work leverages those adjoint distributions to transition from explicit sensitivities, in which the neutron flux is constrained, to implicit sensitivities, in which the neutron flux responds to input perturbations. This treatment vastly expands the set of data that contribute to uncertainties to produce larger, more physically accurate uncertainty estimates.

  11. A Case for Revisiting the Safety of Pesticides: A Closer Look at Neurodevelopment

    PubMed Central

    Colborn, Theo

    2006-01-01

    The quality and quantity of the data about the risk posed to humans by individual pesticides vary considerably. Unlike obvious birth defects, most developmental effects cannot be seen at birth or even later in life. Instead, brain and nervous system disturbances are expressed in terms of how an individual behaves and functions, which can vary considerably from birth through adulthood. In this article I challenge the protective value of current pesticide risk assessment strategies in light of the vast numbers of pesticides on the market and the vast number of possible target tissues and end points that often differ depending upon timing of exposure. Using the insecticide chlorpyrifos as a model, I reinforce the need for a new approach to determine the safety of all pesticide classes. Because of the uncertainty that will continue to exist about the safety of pesticides, it is apparent that a new regulatory approach to protect human health is needed. PMID:16393651

  12. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
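
    The shift from a single Pc to a Pc distribution can be sketched by nested Monte Carlo: hold the encounter geometry fixed, perturb the covariance by an uncertain scale factor, and collect the resulting Pc values. The geometry, the lognormal scale-factor distribution, and the sample sizes below are all illustrative assumptions, not the CARA algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def pc_2d(miss, sigma, hbr, n=200_000):
        """Monte Carlo 2-D encounter-plane Pc: probability that the relative
        position falls within the combined hard-body radius `hbr`."""
        xy = rng.normal(loc=miss, scale=sigma, size=(n, 2))
        return np.mean(np.hypot(xy[:, 0], xy[:, 1]) < hbr)

    # Nominal conjunction geometry (illustrative numbers, in meters)
    miss = np.array([300.0, 150.0])    # miss vector in the encounter plane
    sigma = np.array([200.0, 120.0])   # 1-sigma combined position uncertainty
    hbr = 20.0                         # combined hard-body radius

    # Propagate uncertainty in the covariance itself: scale sigma by a lognormal
    # factor and collect the resulting distribution of Pc values.
    scales = rng.lognormal(mean=0.0, sigma=0.3, size=200)
    pcs = np.array([pc_2d(miss, k * sigma, hbr, n=50_000) for k in scales])

    print(f"nominal Pc:       {pc_2d(miss, sigma, hbr):.2e}")
    print(f"Pc 5th-95th pctl: {np.percentile(pcs, 5):.2e} - {np.percentile(pcs, 95):.2e}")
    ```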

  13. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination includes the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than the document it superseded, ISO/IEC Guide 25. Under either the 1999 or the 2005 version of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach to establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in the "Bottom-Up" approach we adopted are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at the 99% one-tailed confidence level by multiplying the standard uncertainty by an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by more than the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, are presented.
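
    The arithmetic of the final steps is compact enough to sketch, under the simplifying assumption of independent relative uncertainty components: screen components by the 30% rule, combine in quadrature, expand with a one-tailed 99% coverage factor, and compare against the threshold. The component names and magnitudes below are hypothetical.

    ```python
    import math

    # Hypothetical relative standard-uncertainty components at the threshold
    # concentration -- names and magnitudes are illustrative only.
    components = {
        "calibration_standard": 0.020,
        "precision":            0.035,
        "recovery_correction":  0.025,
        "matrix_effect":        0.015,
    }

    # Keep only components larger than 30% of the biggest one, as described above
    u_max = max(components.values())
    kept = {k: u for k, u in components.items() if u >= 0.3 * u_max}

    u_combined = math.sqrt(sum(u * u for u in kept.values()))  # quadrature
    k = 2.33                     # coverage factor for ~99% one-tailed (normal)
    U = k * u_combined           # expanded uncertainty (relative)

    threshold = 100.0            # threshold concentration (arbitrary units)
    measured = 112.0             # determined concentration

    print(f"expanded uncertainty: {U:.1%} of the threshold")
    print("positive" if measured > threshold * (1 + U) else "not positive")
    ```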

  14. Decentralized finite-time attitude synchronization for multiple rigid spacecraft via a novel disturbance observer.

    PubMed

    Zong, Qun; Shao, Shikai

    2016-11-01

    This paper investigates decentralized finite-time attitude synchronization for a group of rigid spacecraft using quaternions, with consideration of environmental disturbances, inertia uncertainties and actuator saturation. Nonsingular terminal sliding mode (TSM) is used for controller design. First, a theorem is proven that there always exists a kind of TSM that converges faster than fast terminal sliding mode (FTSM) for quaternion-described attitude control systems. A controller with this kind of TSM converges faster and requires less computation than an FTSM controller. Then, combined with an adaptive parameter estimation strategy, a novel terminal sliding mode disturbance observer is proposed. The proposed disturbance observer needs no upper-bound information on the lumped uncertainties or their derivatives. On the basis of an undirected topology and the disturbance observer, decentralized attitude synchronization control laws are designed and all attitude errors are ensured to converge to small regions in finite time. As for the actuator saturation problem, an auxiliary variable is introduced and accommodated by the disturbance observer. Finally, simulation results are given and the effectiveness of the proposed control scheme is verified. Copyright © 2016. Published by Elsevier Ltd.

  15. An approach toward incorporation of global warming effects into Intensity-Duration-Frequency values

    NASA Astrophysics Data System (ADS)

    Kunkel, K.; Easterling, D. R.

    2017-12-01

    Rising global temperatures from increasing greenhouse gas concentrations will increase overall atmospheric water vapor concentrations. There is a high level of scientific confidence that this will increase the future intensity and frequency of extreme precipitation events, even in regions where overall precipitation may decrease. For control of runoff from extreme rainfall, infrastructure engineering utilizes design values of rainfall known as Intensity-Duration-Frequency (IDF) values. Use of the existing IDF values, which are based solely on historical climate records, is likely to lead to under-design of runoff control structures, and associated increased flood damages. However, future changes in IDF values are uncertain and probably regionally variable. Our paradigm is that changes in IDF values will result from changes in atmospheric capacity (water vapor concentrations) and opportunity (the number and intensity of heavy precipitation-producing storm systems). Relevant storm systems being investigated include extratropical cyclones and their associated fronts, tropical cyclones, and the North American Monsoon system. The overall approach involves developing IDF adjustment factors for changes in these components of the climate system. The adjustment factors have associated uncertainties, primarily from (1) uncertainties in the future pathway of greenhouse gas emissions and (2) variations among climate models in the sensitivity of the climate system to greenhouse gas concentration changes. In addition to meteorological considerations, the lifetime of projects designed using IDF values is an essential consideration because the IDF values may change substantially during that time. The initial results of this project will be discussed.

  16. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, and focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336

  17. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
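
    Of the three procedures named, DerSimonian-Laird is compact enough to sketch directly. The implementation below is generic (it is not the NIST Consensus Builder code), and the four laboratory values are invented for illustration.

    ```python
    import numpy as np

    def dersimonian_laird(x, u):
        """Consensus value and dark-uncertainty estimate tau for measured values
        `x` with stated standard uncertainties `u`."""
        x, u = np.asarray(x, float), np.asarray(u, float)
        w = 1.0 / u**2
        xbar = np.sum(w * x) / np.sum(w)              # fixed-effect mean
        q = np.sum(w * (x - xbar) ** 2)               # Cochran's Q statistic
        tau2 = max(0.0, (q - (len(x) - 1)) /
                   (np.sum(w) - np.sum(w**2) / np.sum(w)))
        wstar = 1.0 / (u**2 + tau2)                   # weights inflated by tau^2
        mu = np.sum(wstar * x) / np.sum(wstar)
        return mu, np.sqrt(1.0 / np.sum(wstar)), np.sqrt(tau2)

    # Invented, deliberately over-dispersed results from four laboratories
    mu, u_mu, tau = dersimonian_laird([10.1, 9.6, 10.8, 9.2],
                                      [0.15, 0.20, 0.15, 0.25])
    print(f"consensus {mu:.2f} +/- {u_mu:.2f}, dark uncertainty tau = {tau:.2f}")
    ```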

  18. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
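
    The α-cut discretization can be illustrated with a deliberately tiny stability function. In the sketch below, the triangular fuzzy numbers, the response model, and the failure limit are all invented; interval bounds are taken at corner combinations, which is adequate here only because the toy response is monotone in both inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def alpha_cut(tri, alpha):
        """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
        a, m, b = tri
        return a + alpha * (m - a), b - alpha * (b - m)

    def instability_prob(mu, k_mean, n=20_000):
        """Toy stability function: P(response > limit) with random stiffness."""
        k = rng.normal(k_mean, 0.05 * k_mean, n)
        return np.mean(mu * k > 1.0)

    # Fuzzy friction coefficient and fuzzy mean stiffness (both invented)
    fuzzy_mu, fuzzy_km = (0.3, 0.4, 0.5), (2.2, 2.5, 2.8)
    for alpha in (0.0, 0.5, 1.0):
        mu_lo, mu_hi = alpha_cut(fuzzy_mu, alpha)
        km_lo, km_hi = alpha_cut(fuzzy_km, alpha)
        probs = [instability_prob(mu, km)
                 for mu in (mu_lo, mu_hi) for km in (km_lo, km_hi)]
        print(f"alpha={alpha:.1f}: P(instability) in [{min(probs):.3f}, {max(probs):.3f}]")
    ```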

  19. An alternative expression to the Sackur-Tetrode entropy formula for an ideal gas

    NASA Astrophysics Data System (ADS)

    Nagata, Shoichi

    2018-03-01

    An expression for the entropy of a monoatomic classical ideal gas is known as the Sackur-Tetrode equation. This pioneering investigation of about 100 years ago incorporated quantum considerations. The purpose of this paper is to provide an alternative expression for the entropy in terms of the Heisenberg uncertainty relation. The analysis is made on the basis of fluctuation theory, for a canonical system in thermal equilibrium at temperature T. This new formula indicates manifestly that the entropy of the macroscopic world can be recognized as a measure of uncertainty in the microscopic quantum world. The entropy in the Sackur-Tetrode equation can thus be re-interpreted from a different viewpoint. The emphasis is on the connection between entropy and the uncertainty relation in quantum mechanics.
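
    For reference, the standard Sackur-Tetrode form that the paper departs from can be written (for N atoms of mass m, internal energy U, volume V, with Boltzmann and Planck constants k_B and h):

    ```latex
    S = N k_B \left[ \ln\!\left( \frac{V}{N}\left( \frac{4\pi m U}{3 N h^{2}} \right)^{3/2} \right) + \frac{5}{2} \right]
    ```

    The abstract does not give the alternative expression itself; it states only that the entropy is recast in terms of the Heisenberg uncertainty relation.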

  20. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  1. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  2. Radiation Effects and Protection for Moon and Mars Missions

    NASA Technical Reports Server (NTRS)

    Parnell, Thomas A.; Watts, John W., Jr.; Armstrong, Tony W.

    1998-01-01

    Manned and robotic missions to the Earth's moon and Mars are exposed to a continuous flux of Galactic Cosmic Rays (GCR) and occasional, but intense, fluxes of Solar Energetic Particles (SEP). These natural radiations pose hazards to manned exploration and also impose constraints on the design of robotic missions. The hazards to interplanetary flight crews and their uncertainties have been studied recently by a National Research Council committee (Space Studies Board 1996). Considering the present uncertainty estimates, thick spacecraft shielding would be needed for manned missions, some of which could be provided by onboard equipment and expendables. For manned and robotic missions, the effects of radiation on electronics, sensors, and controls require special consideration in spacecraft design. This paper describes the GCR and SEP particle fluxes, secondary particles behind shielding, uncertainties in radiobiological effects and their impact on manned spacecraft design, as well as the major effects on spacecraft equipment. The principal calculational tools and considerations to mitigate the radiation effects are discussed, and work in progress to reduce uncertainties is included.

  3. Parameter Uncertainties for a 10-Meter Ground-Based Optical Reception Station

    NASA Technical Reports Server (NTRS)

    Shaik, K.

    1990-01-01

    Performance uncertainties for a 10-m optical reception station may arise from the nature of the communications channel or from a specific technology choice. Both types of uncertainties are described in this article to develop an understanding of the limitations imposed by them and to provide a rational basis for making technical decisions. The performance at night will be considerably higher than for daytime reception.

  4. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  5. Towards an Australian ensemble streamflow forecasting system for flood prediction and water management

    NASA Astrophysics Data System (ADS)

    Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.

    2016-12-01

    Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the need for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
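
    The limb-dependent residual idea can be sketched independently of the Bayesian post-processor. Below, a deterministic hydrograph is dressed with residuals drawn in log space, with separate parameters for rising and falling limbs; the transformation, the parameter values, and the synthetic hydrograph are stand-ins, not the operational system's 4-stage error model.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def make_ensemble(q_fc, params, n_members=100, eps=1.0):
        """Dress a deterministic hydrograph `q_fc` with limb-dependent residuals.
        `params` maps 'rising'/'falling' to (mean, sd) of log-space residuals."""
        z = np.log(q_fc + eps)                 # simple variance-stabilizing transform
        rising = np.gradient(q_fc) >= 0
        members = np.empty((n_members, len(q_fc)))
        for i in range(n_members):
            resid = np.where(rising,
                             rng.normal(*params["rising"], size=len(q_fc)),
                             rng.normal(*params["falling"], size=len(q_fc)))
            members[i] = np.exp(z + resid) - eps
        return members

    # Synthetic storm hydrograph and invented residual parameters
    t = np.arange(72)
    q_fc = 5 + 40 * np.exp(-0.5 * ((t - 24) / 6.0) ** 2)   # peak at t = 24 h
    members = make_ensemble(q_fc, {"rising": (0.0, 0.15), "falling": (0.0, 0.25)})
    print("90% band at the peak:", np.percentile(members[:, 24], [5, 95]).round(1))
    ```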

  6. Development of risk management strategies for state DOTs to effectively deal with volatile prices of transportation construction materials.

    DOT National Transportation Integrated Search

    2014-06-01

    Volatility in price of critical materials used in transportation projects, such as asphalt cement, leads to : considerable uncertainty about project cost. This uncertainty may lead to price speculation and inflated : bid prices submitted by highway c...

  7. The Impact of Uncertainty and Irreversibility on Investments in Online Learning

    ERIC Educational Resources Information Center

    Oslington, Paul

    2004-01-01

    Uncertainty and irreversibility are central to online learning projects, but have been neglected in the existing educational cost-benefit analysis literature. This paper builds some simple illustrative models of the impact of irreversibility and uncertainty, and shows how different types of cost and demand uncertainty can have substantial impacts…

  8. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A flood induced by dam failure provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can be developed, with consideration of the variability of outputs, using the commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation maps are expected to provide new physical insight in support of evaluating areas of concern for reservoir-induced flooding.

  9. The 3-D structure of the Somma-Vesuvius volcanic complex (Italy) inferred from new and historic gravimetric data.

    PubMed

    Linde, Niklas; Ricci, Tullio; Baron, Ludovic; Shakas, Alexis; Berrino, Giovanna

    2017-08-16

    Existing 3-D density models of the Somma-Vesuvius volcanic complex (SVVC), Italy, largely disagree. Despite the scientific and socioeconomic importance of Vesuvius, there is no reliable 3-D density model of the SVVC. Considerable uncertainty prevails concerning the presence (or absence) of a dense body underlying the Vesuvius crater (1944 eruption) that is implied by extensive seismic investigations. We have acquired relative gravity measurements at 297 stations, including measurements in difficult-to-access areas (e.g., the first-ever measurements in the crater). In agreement with seismic investigations, the simultaneous inversion of these and historic data resolves a high-density body that extends from the surface of the Vesuvius crater down to depths that exceed 2 km. A 1.5-km-radius horseshoe-shaped dense feature (open in the southwestern sector) reinforces the existing model of groundwater circulation within the SVVC. Based on its volcano-tectonic evolution, we interpret volcanic structures that have never been imaged before.

  10. Independent data monitoring committees: Preparing a path for the future

    PubMed Central

    Hess, Connie N.; Roe, Matthew T.; Gibson, C. Michael; Temple, Robert J.; Pencina, Michael J.; Zarin, Deborah A.; Anstrom, Kevin J.; Alexander, John H.; Sherman, Rachel E.; Fiedorek, Fred T.; Mahaffey, Kenneth W.; Lee, Kerry L.; Chow, Shein-Chung; Armstrong, Paul W.; Califf, Robert M.

    2014-01-01

    Independent data monitoring committees (IDMCs) were introduced to monitor patient safety and study conduct in randomized clinical trials (RCTs), but certain challenges regarding the utilization of IDMCs have developed. First, the roles and responsibilities of IDMCs are expanding, perhaps due to increasing trial complexity and heterogeneity regarding medical, ethical, legal, regulatory, and financial issues. Second, no standard for IDMC operating procedures exists, and there is uncertainty about who should determine standards and whether standards should vary with trial size and design. Third, considerable variability in communication pathways exists across IDMC interfaces with regulatory agencies, academic coordinating centers, and sponsors. Finally, there has been a substantial increase in the number of RCTs using IDMCs, yet there is no set of qualifications to help guide the training and development of the next generation of IDMC members. Recently, an expert panel of representatives from government, industry, and academia assembled at the Duke Clinical Research Institute to address these challenges and to develop recommendations for the future utilization of IDMCs in RCTs. PMID:25066551

  11. Uncertainty considerations in calibration and validation of hydrologic and water quality models

    USDA-ARS?s Scientific Manuscript database

    Hydrologic and water quality models (HWQMs) are increasingly used to support decisions on the state of various environmental issues and policy directions on present and future scenarios, at scales varying from watershed to continental levels. Uncertainty associated with such models may impact the ca...

  12. Patterns of zone management uncertainty in cotton using tarnished plant bug distributions, NDVI, soil EC, yield and thermal imagery

    USDA-ARS?s Scientific Manuscript database

    Management zones for various crops have been delineated using NDVI (Normalized Difference Vegetation Index), apparent bulk soil electrical conductivity (ECa - Veris), and yield data; however, estimations of uncertainty for these data layers are equally important considerations. The objective of this...

  13. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. These developments have built higher levels of complexity into hydrologic models, which in turn makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on formal likelihood functions based on statistical assumptions, and moreover, Bayesian inference built on MCMC samplers requires a very large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study explores a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed are (i) Pareto optimality, which quantifies parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare these methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, evaluated using multiple comparative measures for both calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model called HYMOD.
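
    Of the informal methods named above, GLUE is the simplest to illustrate. A minimal sketch, assuming a toy one-parameter runoff model, an informal Nash-Sutcliffe likelihood, and a subjective behavioral threshold; none of this corresponds to the HYMOD setup used in the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy forcing and synthetic "observations" from a hidden true parameter.
    rain = rng.gamma(2.0, 5.0, size=100)

    def model(k):
        # Toy runoff model: runoff is a linear scaling of rainfall.
        return k * rain

    obs = model(0.6) + rng.normal(0.0, 1.0, size=100)

    # 1) Monte Carlo sampling of the parameter from its prior range.
    k_samples = rng.uniform(0.0, 1.0, size=5000)
    sims = np.array([model(k) for k in k_samples])

    # 2) Informal likelihood: Nash-Sutcliffe efficiency of each simulation.
    nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

    # 3) Keep "behavioral" parameter sets above a subjective threshold and
    #    rescale their likelihoods into weights.
    behavioral = nse > 0.7
    weights = nse[behavioral] / nse[behavioral].sum()

    # 4) Likelihood-weighted mean prediction, plus simple 90% prediction bounds
    #    (a full GLUE analysis would use likelihood-weighted quantiles).
    pred = weights @ sims[behavioral]
    lower, upper = np.percentile(sims[behavioral], [5, 95], axis=0)
    print(f"{behavioral.sum()} behavioral sets; mean prediction avg = "
          f"{pred.mean():.2f}; mean 90% band width = {np.mean(upper - lower):.2f}")
    ```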

  14. Covariance propagation in spectral indices

    DOE PAGES

    Griffin, P. J.

    2015-01-09

    The dosimetry community has a history of using spectral indices (SIs) to support neutron spectrum characterization and cross section validation efforts. An important aspect of this type of analysis is the proper consideration of the contribution of the spectrum uncertainty to the total uncertainty in calculated SIs. This study identifies deficiencies in the traditional treatment of SI uncertainty, provides simple bounds on the spectral component of SI uncertainty estimates, verifies that these bounds are reflected in actual applications, details a methodology that rigorously captures the spectral contribution to the uncertainty in the SI, and provides quantified examples that demonstrate the importance of properly treating the spectral contribution to that uncertainty.
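
    The rigorous methodology itself is not reproduced in the abstract, but the first-order ("sandwich") propagation of a spectrum covariance into a spectral index, which such an analysis builds on, can be sketched with hypothetical group data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20  # number of energy groups (toy size)

    # Hypothetical group cross sections for two reactions and a group spectrum.
    sig_a = rng.uniform(0.5, 2.0, n)
    sig_b = rng.uniform(0.5, 2.0, n)
    phi = rng.uniform(0.0, 1.0, n)
    phi /= phi.sum()

    # Toy spectrum covariance: 10% group uncertainty, short-range correlation.
    corr = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 3.0)
    std = 0.10 * phi
    cov_phi = corr * np.outer(std, std)

    # Spectral index: ratio of spectrum-averaged reaction rates.
    ra, rb = sig_a @ phi, sig_b @ phi
    si = ra / rb

    # Sandwich rule: var(SI) ~ g^T C g, with g the gradient of SI w.r.t. phi.
    g = (sig_a * rb - sig_b * ra) / rb**2
    var_si = g @ cov_phi @ g
    print(f"SI = {si:.4f} +/- {np.sqrt(var_si):.4f} (spectral contribution only)")
    ```

    Note that the ratio structure matters here: spectrum errors that are fully correlated across groups partially cancel between numerator and denominator, which is one reason naive group-wise error sums can overstate the spectral component.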

  15. Uncertainty loops in travel-time tomography from nonlinear wave physics.

    PubMed

    Galetti, Erica; Curtis, Andrew; Meles, Giovanni Angelo; Baptie, Brian

    2015-04-10

    Estimating image uncertainty is fundamental to guiding the interpretation of geoscientific tomographic maps. We reveal novel uncertainty topologies (loops), which indicate that while the speeds of both low- and high-velocity anomalies may be well constrained, their locations tend to remain uncertain. The effect is widespread: loops dominate around a third of United Kingdom Love wave tomographic uncertainties, changing the nature of interpretation of the observed anomalies. Loops exist due to second- and higher-order aspects of wave physics; hence, although such structures must exist in many tomographic studies in the physical sciences and medicine, they are unobservable using standard linearized methods. Higher-order methods might fruitfully be adopted.

  16. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    NASA Astrophysics Data System (ADS)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
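
    As a rough illustration of treating input errors as additional unknowns, the sketch below runs a random-walk Metropolis sampler over a toy model whose input series carries an unknown systematic bias. The model, priors, and step sizes are all assumptions for illustration, not the SRH-1D setup of the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy setting: the output depends on an input series that is only observed
    # with a systematic error (a constant bias, standing in for input error).
    x_true = np.linspace(1.0, 10.0, 50)
    x_meas = x_true + 0.5                     # biased "measured" input
    y_obs = 2.0 * x_true + rng.normal(0.0, 0.3, 50)

    def log_post(theta, bias, sigma=0.3):
        """Log-posterior with flat priors; the input bias is a free parameter."""
        resid = y_obs - theta * (x_meas - bias)
        return -0.5 * np.sum((resid / sigma) ** 2)

    # Random-walk Metropolis over (theta, bias).
    chain = np.empty((20000, 2))
    cur = np.array([1.0, 0.0])
    cur_lp = log_post(*cur)
    for i in range(chain.shape[0]):
        prop = cur + rng.normal(0.0, [0.02, 0.05])
        prop_lp = log_post(*prop)
        if np.log(rng.uniform()) < prop_lp - cur_lp:  # accept/reject step
            cur, cur_lp = prop, prop_lp
        chain[i] = cur

    theta_hat, bias_hat = chain[5000:].mean(axis=0)   # discard burn-in
    print(f"posterior means: theta ~ {theta_hat:.2f}, input bias ~ {bias_hat:.2f}")
    ```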

  17. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted to determine tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and the location. First, the methodology generates consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a stochastic reduced-order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis with the tsunami records for the 2014 Chilean earthquake. Results show that leading-wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
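
    A minimal sketch of the K-L sampling step, assuming a one-dimensional chain of subfaults, an exponential slip correlation, and a lognormal translation to keep slip positive; the geometry, correlation length, and marginal are illustrative, not those of the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n = 50                                    # number of subfaults
    pos = np.linspace(0.0, 100.0, n)          # subfault positions (km)

    # Exponential correlation of slip between subfaults (assumed 20 km length).
    cov = np.exp(-np.abs(np.subtract.outer(pos, pos)) / 20.0)

    # K-L expansion: eigenpairs of the covariance, truncated to the leading
    # modes that capture 95% of the variance.
    lam, phi = np.linalg.eigh(cov)
    lam, phi = lam[::-1], phi[:, ::-1]        # sort descending
    m = np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95) + 1

    # Gaussian K-L field, then a translation (marginal transform) to lognormal
    # slip; the 5 m mean slip is an assumed value.
    z = rng.standard_normal((m, 1000))
    gauss = phi[:, :m] @ (np.sqrt(lam[:m])[:, None] * z)
    slip = 5.0 * np.exp(0.5 * gauss - 0.125)  # lognormal with unit-mean factor

    print(f"{m} K-L modes retained; sample slip mean = {slip.mean():.2f} m")
    ```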

  18. Management of California Oak Woodlands: Uncertainties and Modeling

    Treesearch

    Jay E. Noel; Richard P. Thompson

    1995-01-01

    A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...

  19. The Influence of Weight-of-Evidence Messages on (Vaccine) Attitudes: A Sequential Mediation Model.

    PubMed

    Clarke, Christopher E; Weberling McKeever, Brooke; Holton, Avery; Dixon, Graham N

    2015-01-01

    Media coverage of contentious risk issues often features competing claims about whether a risk exists and what scientific evidence shows, and journalists often cover these issues by presenting both sides. However, for topics defined by scientific agreement, balanced coverage erroneously heightens uncertainty about scientific information and the issue itself. In this article, we extend research on combating so-called information and issue uncertainty using weight of evidence, drawing on the discredited autism-vaccine link as a case study. We examine whether people's perceptions of issue uncertainty (about whether a link exists) change before and after they encounter a news message with weight-of-evidence information. We also explore whether message exposure is associated with broader issue judgments, specifically vaccine attitudes. Participants (n = 181) read news articles that included or omitted weight-of-evidence content stating that scientific studies have found no link and that scientists agree that none exists. Postexposure issue uncertainty decreased from preexposure levels across all conditions; in other words, issue certainty increased. Moreover, weight-of-evidence messages were associated with positive vaccine attitudes indirectly via reduced information uncertainty (i.e., one's belief that scientific opinion and evidence concerning a potential link is unclear) as well as reduced issue uncertainty. We discuss implications for risk communication.

  20. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis with error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD) and total suspended solids (TSS) concentrations, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage, and laboratory analysis. The results showed that the uncertainties due to sample collection, storage, and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48%, and 12.28%, respectively. Meanwhile, the flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC, and COD event loads were quantified as 7.03%, 10.26%, and 18.47%, respectively.
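
    For independent error sources, the relative uncertainties of a single measured value combine in quadrature under the law of propagation of uncertainty. The sketch below applies this rule to the COD component figures quoted above; it is a generic illustration and will not reproduce the event-scale numbers, which involve averaging over many samples within an event:

    ```python
    from math import sqrt

    # Relative uncertainties (%) of a single COD value, from the figures above.
    components = {"sample collection": 13.99, "storage": 19.48, "lab analysis": 12.28}

    # Root-sum-square combination, assuming the sources are independent.
    combined = sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined relative uncertainty of one COD value: {combined:.2f}%")
    ```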

  1. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    NASA Astrophysics Data System (ADS)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. The direct variance method was adopted to analyze the manner in which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the predominant uncertainty source in the projections of extreme high flow, and contributes a considerable share of the uncertainty in monthly streamflow projections for July-September; its effects in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and that a noticeable increase in drought risk in the Xijiang River basin would be expected. Effective water-saving techniques and adaptive water resources management strategies will therefore be needed for drought disaster mitigation.
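
    A main-effect (ANOVA-style) decomposition over a full factorial ensemble conveys the flavor of this kind of variance attribution. The synthetic effects below are placeholders for the study's ES/CM/SD/HM chain, with the CM effect deliberately made dominant:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Synthetic projections over a 3 ES x 4 CM x 4 SD x 4 HM factorial chain,
    # built as additive factor effects (illustrative magnitudes).
    proj = (rng.normal(0.0, 1.0, (3, 1, 1, 1)) * 0.5    # ES effect (small)
            + rng.normal(0.0, 1.0, (1, 4, 1, 1)) * 2.0  # CM effect (dominant)
            + rng.normal(0.0, 1.0, (1, 1, 4, 1)) * 1.0  # SD effect
            + rng.normal(0.0, 1.0, (1, 1, 1, 4)) * 0.8) # HM effect

    total = proj.var()
    for name, axis in [("ES", 0), ("CM", 1), ("SD", 2), ("HM", 3)]:
        other = tuple(a for a in range(4) if a != axis)
        main = proj.mean(axis=other).var()    # variance of this factor's means
        print(f"{name}: {100 * main / total:5.1f}% of total variance")
    # Shares do not sum exactly to 100% because of sampling noise and the
    # residual left out of a pure main-effect decomposition.
    ```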

  2. Uncertainty in estimates of the number of extraterrestrial civilizations

    NASA Technical Reports Server (NTRS)

    Sturrock, P. A.

    1980-01-01

    An estimation of the number N of communicative civilizations is made by means of Drake's formula, which combines several quantities, each of which is to some extent uncertain. It is shown that the uncertainty in any quantity may be represented by a probability distribution function, even if that quantity is itself a probability. The uncertainty of current estimates of N derives principally from uncertainty in estimates of the lifetime of advanced civilizations. It is argued that this is due primarily to uncertainty concerning the existence of a Galactic Federation, which is in turn contingent upon uncertainty about whether the limitations of present-day physics are absolute or (in the event that there exists a yet undiscovered hyperphysics) transient. It is further argued that it is advantageous to consider these underlying assumptions explicitly in order to compare the probable numbers of civilizations operating radio beacons, permitting radio leakage, dispatching probes for radio surveillance, or dispatching vehicles for manned surveillance.
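
    Representing each factor by a probability distribution lends itself to Monte Carlo propagation. A sketch with assumed, deliberately wide ranges for the Drake factors; these are illustrative choices, not Sturrock's distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1961)
    n = 100_000

    # Assumed distributions for each Drake factor (illustrative only).
    R_star = rng.uniform(1.0, 10.0, n)        # star formation rate (per year)
    f_p    = rng.uniform(0.2, 1.0, n)         # fraction of stars with planets
    n_e    = rng.uniform(0.5, 3.0, n)         # habitable planets per system
    f_l    = 10 ** rng.uniform(-3, 0, n)      # life arises (log-uniform)
    f_i    = 10 ** rng.uniform(-3, 0, n)      # intelligence (log-uniform)
    f_c    = rng.uniform(0.1, 1.0, n)         # becomes communicative
    L      = 10 ** rng.uniform(2, 6, n)       # civilization lifetime (years)

    # N inherits a distribution from the factor distributions.
    N = R_star * f_p * n_e * f_l * f_i * f_c * L
    lo, med, hi = np.percentile(N, [5, 50, 95])
    print(f"N: median {med:.2g}, 90% interval [{lo:.2g}, {hi:.2g}]")
    ```

    As the abstract argues, the spread of N is dominated by the most uncertain factor; here the log-uniform lifetime L alone spans four orders of magnitude.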

  3. Music and literature: are there shared empathy and predictive mechanisms underlying their affective impact?

    PubMed Central

    Omigie, Diana

    2015-01-01

    It has been suggested that music and language had a shared evolutionary precursor before becoming mainly responsible for the communication of emotive and referential meaning respectively. However, emphasis on potential differences between music and language may discourage a consideration of the commonalities that music and literature share. Indeed, one possibility is that common mechanisms underlie their affective impact, and the current paper carefully reviews relevant neuroscientific findings to examine such a prospect. First and foremost, it will be demonstrated that considerable evidence of a common role of empathy and predictive processes now exists for the two domains. However, it will also be noted that an important open question remains: namely, whether the mechanisms underlying the subjective experience of uncertainty differ between the two domains with respect to recruitment of phylogenetically ancient emotion areas. It will be concluded that a comparative approach may not only help to reveal general mechanisms underlying our responses to music and literature, but may also help us better understand any idiosyncrasies in their capacity for affective impact. PMID:26379583

  4. The complexity of child protection recurrence: The case for a systems approach.

    PubMed

    Jenkins, Brian Q; Tilbury, Clare; Mazerolle, Paul; Hayes, Hennessey

    2017-01-01

    Research on child protection recurrence has found consistent child, family, and case characteristics associated with repeated involvement with the child protection system. Despite the considerable body of empirical research, knowledge about why recurrence occurs, and what can be done to reduce it, is limited. This paper reviews the empirical literature and analyses the approaches of prior recurrence research. Four related conceptual challenges are identified: (1) a tendency to conflate child protection recurrence with repeated child maltreatment; (2) uncertainty about how best to operationalize and measure child protection recurrence in research; (3) inconsistency between prevailing explanations for the most frequently observed patterns of recurrence; and (4) difficulty in developing coherent strategies to address child protection recurrence based on research. Addressing these challenges requires a greater consideration of the effects of decision-making in the child protection system on recurrence. This paper proposes a methodology based in systems theory and drawing on existing administrative data to examine the characteristics of the child protection system that may also produce recurrence. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Estimating the Health Effects of Greenhouse Gas Mitigation Strategies: Addressing Parametric, Model, and Valuation Challenges

    PubMed Central

    Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid

    2014-01-01

    Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270

  6. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties; either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  7. Projecting future air pollution-related mortality under a changing climate: progress, uncertainties and research needs.

    PubMed

    Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu

    2015-02-01

    Climate change may affect mortality associated with air pollutants, especially fine particulate matter (PM2.5) and ozone (O3). Such projection studies involve complicated modelling approaches with attendant uncertainties. We conducted a systematic review of research and methods for projecting future PM2.5-/O3-related mortality to identify the uncertainties and optimal approaches for handling them. A literature search was conducted in October 2013 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase in climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate those caused by O3, yet projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches of scenario-based projection studies, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainty is and which uncertainties could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage the potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  8. Model-Based Fatigue Prognosis of Fiber-Reinforced Laminates Exhibiting Concurrent Damage Mechanisms

    NASA Technical Reports Server (NTRS)

    Corbetta, M.; Sbarufatti, C.; Saxena, A.; Giglio, M.; Goebel, K.

    2016-01-01

    Prognostics of large composite structures is a topic of increasing interest in the field of structural health monitoring for aerospace, civil, and mechanical systems. Along with recent advancements in real-time structural health data acquisition and processing for damage detection and characterization, model-based stochastic methods for life prediction are showing promising results in the literature. Among various model-based approaches, particle-filtering algorithms are particularly capable of coping with the uncertainties associated with the process. These include uncertainties about information on the damage extent and the inherent uncertainties of the damage propagation process. Some efforts have shown successful applications of particle-filtering-based frameworks for predicting matrix crack evolution and the structural stiffness degradation caused by repetitive fatigue loads. Effects of other damage modes such as delamination, however, are not incorporated in these works. It is well established that delamination and matrix cracks not only co-exist in most laminate structures during the fatigue degradation process but also affect each other's progression. Furthermore, delamination significantly alters the stress state in the laminates and accelerates the material degradation leading to catastrophic failure. Therefore, the work presented herein proposes a particle-filtering-based framework for predicting a structure's remaining useful life with consideration of multiple co-existing damage mechanisms. The framework uses an energy-based model from the composite modeling literature. The multiple damage-mode model has been shown to suitably estimate the energy release rate of cross-ply laminates as affected by matrix cracks and delamination modes. The model is also able to estimate the reduction in stiffness of the damaged laminate, and this information is used in the algorithms for life prediction. First, a brief summary of the energy-based damage model is provided. Then, the paper describes how the model is embedded within the prognostic framework and how prognostic performance is assessed using observations from run-to-failure experiments.
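
    A particle filter for prognosis can be sketched on a far simpler damage law than the multi-mechanism energy-based model used in the paper; the exponential growth model, noise levels, and failure threshold below are all hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    n_p = 2000                                # number of particles
    C_true, a0, a_fail, meas_sd = 0.02, 1.0, 10.0, 0.2

    # Each particle carries a damage state a and an uncertain growth rate C.
    a = np.full(n_p, a0)
    C = rng.lognormal(np.log(0.02), 0.5, n_p)

    truth = a0
    for k in range(60):                       # 60 inspection cycles
        truth *= 1.0 + C_true
        z = truth + rng.normal(0.0, meas_sd)  # noisy damage measurement
        a *= 1.0 + C                          # propagate each particle
        w = np.exp(-0.5 * ((z - a) / meas_sd) ** 2)
        w /= w.sum()
        idx = rng.choice(n_p, n_p, p=w)       # multinomial resampling
        a, C = a[idx], C[idx]

    # Remaining useful life: cycles for each particle to reach the threshold,
    # giving an RUL distribution rather than a point estimate.
    rul = np.log(a_fail / a) / np.log1p(C)
    print(f"RUL median {np.median(rul):.1f} cycles, 90% interval "
          f"[{np.percentile(rul, 5):.1f}, {np.percentile(rul, 95):.1f}]")
    ```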

  9. Bayesian assessment of the expected data impact on prediction confidence in optimal sampling design

    NASA Astrophysics Data System (ADS)

    Leube, P. C.; Geiges, A.; Nowak, W.

    2012-02-01

    Incorporating hydro(geo)logical data, such as head and tracer data, into stochastic models of (subsurface) flow and transport helps to reduce prediction uncertainty. Because of financial limitations on investigation campaigns, information needs toward modeling or prediction goals should be satisfied efficiently and rationally. Optimal design techniques find the best one among a set of investigation strategies. They optimize the expected impact of data on prediction confidence or related objectives prior to data collection. We introduce a new optimal design method, called PreDIA(gnosis) (Preposterior Data Impact Assessor). PreDIA derives the relevant probability distributions and measures of data utility within a fully Bayesian, generalized, flexible, and accurate framework. It extends the bootstrap filter (BF) and related frameworks to optimal design by marginalizing utility measures over the yet unknown data values. PreDIA is a strictly formal information-processing scheme free of linearizations. It works with arbitrary simulation tools, provides full flexibility concerning measurement types (linear, nonlinear, direct, indirect), allows for any desired task-driven formulations, and can account for various sources of uncertainty (e.g., heterogeneity, geostatistical assumptions, boundary conditions, measurement values, model structure uncertainty, a large class of model errors) via Bayesian geostatistics and model averaging. Existing methods fail to simultaneously provide these crucial advantages, which our method buys at relatively higher computational cost. We demonstrate the applicability and advantages of PreDIA over conventional linearized methods in a synthetic example of subsurface transport. In the example, we show that informative data are often invisible to linearized methods, which confuse zero correlation with statistical independence. Hence, PreDIA will often lead to substantially better sampling designs. Finally, we extend our example to specifically highlight the consideration of conceptual model uncertainty.

  10. Reducing Multisensor Satellite Monthly Mean Aerosol Optical Depth Uncertainty: 1. Objective Assessment of Current AERONET Locations

    NASA Technical Reports Server (NTRS)

    Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki

    2016-01-01

    Various space-based sensors have been designed and corresponding algorithms developed to retrieve aerosol optical depth (AOD), the most basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., have high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from the Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are treated as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (that has been constrained by ground-based measurements) with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and can exceed 50% over certain places. The uncertainty reduction also shows distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for the Northern Hemisphere and biomass burning in the fall for the Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites represent moderate spatial information, and the representativeness of urban sites is relatively localized. A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty reduction, indicating its representativeness level.
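
    The uncertainty-reduction metric compares analysis and background error variances. A scalar, perturbed-observation ensemble Kalman update at a single grid cell illustrates the computation; the ensemble and observation values below are synthetic, so the printed reduction will not match the study's 27% average:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    ens = rng.normal(0.25, 0.06, 605)         # 605-member multisensor AOD ensemble
    obs, obs_var = 0.22, 0.02 ** 2            # ground AOD value and error variance

    bg_var = ens.var(ddof=1)
    gain = bg_var / (bg_var + obs_var)        # Kalman gain for a direct observation

    # Perturbed-observation EnKF update of each ensemble member.
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_var), ens.size)
    analysis = ens + gain * (obs_pert - ens)

    an_var = analysis.var(ddof=1)
    print(f"uncertainty reduction: {100 * (1 - an_var / bg_var):.1f}%")
    ```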

  11. Shale Gas Development and Brook Trout: Scaling Best Management Practices to Anticipate Cumulative Effects

    USGS Publications Warehouse

    Smith, David; Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.; Faulkner, Stephen P.

    2012-01-01

    Shale gas development may involve trade-offs between energy development and benefits provided by natural ecosystems. However, current best management practices (BMPs) focus on mitigating localized ecological degradation. We review evidence for cumulative effects of natural gas development on brook trout (Salvelinus fontinalis) and conclude that BMPs should account for potential watershed-scale effects in addition to localized influences. The challenge is to develop BMPs in the face of uncertainty in the predicted response of brook trout to landscape-scale disturbance caused by gas extraction. We propose a decision-analysis approach to formulating BMPs in the specific case of relatively undisturbed watersheds where there is consensus to maintain brook trout populations during gas development. The decision analysis was informed by existing empirical models that describe brook trout occupancy responses to landscape disturbance and set bounds on the uncertainty in the predicted responses to shale gas development. The decision analysis showed that a high efficiency of gas development (e.g., 1 well pad per square mile and 7 acres per pad) was critical to achieving a win-win solution characterized by maintaining brook trout and maximizing extraction of available gas. This finding was invariant to uncertainty in predicted response of brook trout to watershed-level disturbance. However, as the efficiency of gas development decreased, the optimal BMP depended on the predicted response, and there was considerable potential value in discriminating among predictive models through adaptive management or research. The proposed decision-analysis framework provides an opportunity to anticipate the cumulative effects of shale gas development, account for uncertainty, and inform management decisions at the appropriate spatial scales.

  12. Modeling the uncertainty of estimating forest carbon stocks in China

    NASA Astrophysics Data System (ADS)

    Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.

    2015-12-01

    Earth surface systems are controlled by a combination of global and local factors, and cannot be understood without accounting for both components; the system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory can accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with the required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks: satellite remote sensing can supply spatially continuous information about forest carbon stocks, which is impossible from ground-based investigations, but this description carries considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the Kriging method for spatial interpolation of ground sample plots, and a satellite-observation-based approach, as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. The data fusion approach, which used an existing method for high-accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA), had the lowest uncertainty. The estimates produced with HASM-SOA were 26.1 and 28.4% more accurate than the satellite-based approach and the spatial interpolation of the sample plots, respectively. Using the preferred HASM-SOA method, forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008.

  13. A long and winding road; evolution of antimicrobial drug development - crisis management.

    PubMed

    Echols, Roger M

    2012-11-01

    The development of antimicrobial drugs has evolved from observational case reports to complex randomized prospective clinical trials in specific treatment indications. Beginning around the year 2000, the US FDA evolved its approach to study design and other study characteristics, which has made the conduct of these studies more difficult and the outcomes for sponsors more risky. This has contributed to the decline in the discovery and development of new antimicrobials, which are needed to address the increasing problem of bacterial resistance to existing marketed products. This study reviews the historical basis for the current regulatory climate, including the various crises that have led to considerable political pressure on the agency. Recent efforts to resolve development uncertainties and to provide economic incentives for future antimicrobial drug development are presented.

  14. Review of health and productivity gains from better IEQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, William J.

    2000-08-01

    The available scientific data suggest that existing technologies and procedures can improve indoor environmental quality (IEQ) in a manner that significantly increases productivity and health. While there is considerable uncertainty in the estimates of the magnitudes of productivity gains that may be obtained, the projected gains are very large. For the U.S., the estimated potential annual savings and productivity gains are $6 to $14 billion from reduced respiratory disease, $2 to $4 billion from reduced allergies and asthma, $10 to $30 billion from reduced sick building syndrome symptoms, and $20 to $160 billion from direct improvements in worker performance that are unrelated to health. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  15. The promise of complementarity: Using the methods of foresight for health workforce planning.

    PubMed

    Rees, Gareth H; Crampton, Peter; Gauld, Robin; MacDonell, Stephen

    2018-05-01

    Health workforce planning aims to meet a health system's needs with a sustainable and fit-for-purpose workforce, although its efficacy is reduced in conditions of uncertainty. This PhD breakthrough article offers foresight as a means of addressing this uncertainty and models its complementarity in the context of the health workforce planning problem. The article summarises the findings of a two-case, multi-phase, mixed-method study that incorporates actor analysis, scenario development and policy Delphi. This reveals a few dominant actors of considerable influence who are in conflict over a few critical workforce issues. Using these to augment normative scenarios developed from existing, clinically derived model-of-care visions, a number of exploratory alternative descriptions of future workforce situations are produced for each case. Their analysis reveals that these scenarios are a reasonable facsimile of plausible futures, though some are favoured over others. Policy directions to support these favoured aspects can also be identified. This novel approach offers workforce planners and policy makers some guidance on the use of complementary data, methods to overcome the limitations of conventional workforce forecasting, and a framework for exploring the complexities and ambiguities of a health workforce's evolution.

  16. Validation plays the role of a "bridge" in connecting remote sensing research and applications

    NASA Astrophysics Data System (ADS)

    Wang, Zhiqiang; Deng, Ying; Fan, Yida

    2018-07-01

    Remote sensing products contribute to improving earth observations over space and time. Uncertainties exist in products of different levels; thus, validation of these products before and during their applications is critical. This study discusses the meaning of validation in depth and proposes a new definition of reliability for use with such products. In this context, validation should include three aspects: a description of the relevant uncertainties, quantitative measurement results and a qualitative judgment that considers the needs of users. A literature overview is then presented evidencing improvements in the concepts associated with validation. It shows that the root mean squared error (RMSE) is widely used to express accuracy; increasing numbers of remote sensing products have been validated; research institutes contribute most validation efforts; and sufficient validation studies encourage the application of remote sensing products. Validation plays a connecting role in the distribution and application of remote sensing products. Validation connects simple remote sensing subjects with other disciplines, and it connects primary research with practical applications. Based on the above findings, it is suggested that validation efforts that include wider cooperation among research institutes and full consideration of the needs of users should be promoted.

  17. Challenges of Sustaining the International Space Station Through 2020 and Beyond: Reassessing Confidence Targets for System Availability

    NASA Technical Reports Server (NTRS)

    Lutomski, Michael G.; Carter-Journet, Katrina; Anderson, Leif; Box, Neil; Harrington, Sean; Jackson, David; DiFilippo, Denise

    2012-01-01

    The International Space Station (ISS) was originally designed to operate until 2015, with a plan for deorbiting the ISS in 2016. Currently, the international partnership has agreed to extend operations until 2020, and discussions are underway to extend the life even further, to 2028. Each partner is responsible for the sustaining engineering, sparing, and maintenance of its own segments. National Aeronautics and Space Administration's (NASA's) challenge is to purchase the needed number of spares to maintain the functional availability of the ISS systems necessary for the United States On-Orbit Segment's contribution. This presentation introduces an analytical approach to assessing uncertainty in the ISS hardware necessary to extend the life of the vehicle. Some key areas for consideration are: establishing what confidence targets are required to ensure science can be continuously carried out on the ISS, defining what confidence targets are reasonable to ensure vehicle survivability, determining whether the confidence targets are too high, and determining whether a sufficient number of spares has been purchased. The results of the analysis provide a methodological basis for reassessing vehicle subsystem confidence targets. The analysis compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability depends upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty, which must be factored into the development and execution of sparing risk postures. In addition, uncertainty in the assessment also arises from disconnects between modeled functions and actual subsystem operations. Perhaps most importantly, it is acknowledged that conservative confidence targets per subsystem are currently accepted. This presentation also discusses how subsystem confidence targets may be relaxed based on calculating the level of uncertainty for each corresponding ORU function, and concludes with the strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
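
    One simple way to frame the spares-versus-demand comparison is a Poisson demand model: if ORU failures arrive as a Poisson process, the probability that the on-hand spares cover demand over the extension period is the Poisson CDF evaluated at the spare count. The failure rate, horizon, and spare count below are hypothetical, and the actual ISS analysis is considerably more elaborate:

    ```python
    from math import exp, factorial

    rate_per_year = 0.4   # expected ORU failures per year (assumed)
    years = 8             # extension horizon, e.g., 2012 through 2020
    spares = 6            # spares on hand (assumed)

    # P(demand <= spares) for Poisson demand with mean rate * horizon.
    lam = rate_per_year * years
    p_ok = sum(exp(-lam) * lam ** k / factorial(k) for k in range(spares + 1))
    print(f"P(existing spares cover demand) = {p_ok:.3f}")
    ```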

  18. The devil that we know: lead (Pb) replacement policies under conditions of scientific uncertainty

    NASA Technical Reports Server (NTRS)

    Ogunseitan, Dele; Schoenung, Julie; Saphores, Jean-Daniel; Shapiro, Andrew; Bhuie, Amrit; Kang, Hai-Yong; Nixon, Hilary; Stein, Antionette

    2003-01-01

    Engineering and economic considerations are typical driving forces behind the selection of specific chemicals used in the manufacture of consumer products. Only recently has post-consumer environmental impact become part of the major considerations during the initial phases of product design. Therefore, reactive rather than proactive strategies have dominated the consideration of environmental and health issues in product design.

  19. Incorporating anthropogenic influences into fire probability models: Effects of development and climate change on fire activity in California

    NASA Astrophysics Data System (ADS)

    Mann, M.; Moritz, M.; Batllori, E.; Waller, E.; Krawchuk, M.; Berck, P.

    2014-12-01

    The costly interactions between humans and natural fire regimes throughout California demonstrate the need to understand the uncertainties surrounding wildfire, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires. Models estimate an increase in fire occurrence between nine and fifty-three percent by the end of the century. Our goal is to assess the role of uncertainty in climate and anthropogenic influences on the state's fire regime from 2000-2050. We develop an empirical model that integrates novel information about the distribution and characteristics of future plant communities without assuming a particular distribution, and improve on previous efforts by integrating dynamic estimates of population density at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of the total fire count, and that further housing development will incite or suppress additional fires according to their intensity. We also find that the total area burned is likely to increase but at a slower than historical rate. Previous findings of substantially increased numbers of fires may be tied to the assumption of static fuel loadings, and the use of proxy variables not relevant to plant community distributions. We also find considerable agreement between GFDL and PCM model A2 runs, with decreasing fire counts expected only in areas of coastal influence below San Francisco and above Los Angeles. Due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid deserts of the inland south. The broad shifts of wildfire between California's climatic regions forecast in this study point to dramatic shifts in the pressures plant and human communities will face by midcentury. The information provided by this study reduces the level of uncertainty surrounding the influence that natural and anthropogenic systems have on wildfire.

  20. Improving Deterministic Reserve Requirements for Security Constrained Unit Commitment and Scheduling Problems in Power Systems

    NASA Astrophysics Data System (ADS)

    Wang, Fengyu

    Traditional deterministic reserve requirements rely on ad hoc, rule-of-thumb methods to determine adequate reserve in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserve. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict transfer capabilities and network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserve by explicitly modelling uncertainties, there are still scalability and pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impacts. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones. The three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserve while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables; 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved; and 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.

  1. Considerations for interpreting probabilistic estimates of uncertainty of forest carbon

    Treesearch

    James E. Smith; Linda S. Heath

    2000-01-01

    Quantitative estimates of carbon inventories are needed as part of nationwide attempts to reduce the net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the...

  2. Consider the Alternative: The Effects of Causal Knowledge on Representing and Using Alternative Hypotheses in Judgments under Uncertainty

    ERIC Educational Resources Information Center

    Hayes, Brett K.; Hawkins, Guy E.; Newell, Ben R.

    2016-01-01

    Four experiments examined the locus of impact of causal knowledge on consideration of alternative hypotheses in judgments under uncertainty. Two possible loci were examined: overcoming neglect of the alternative when developing a representation of a judgment problem, and improving utilization of statistics associated with the alternative…

  3. Introducing Decision Making under Uncertainty and Strategic Considerations in Engineering Design

    ERIC Educational Resources Information Center

    Kosmopoulou, Georgia; Jog, Chintamani; Freeman, Margaret; Papavassiliou, Dimitrios V.

    2010-01-01

    Chemical Engineering graduates will face challenges at the workplace that even their peers who graduated a few years ago were not expected to face. One such major challenge is the management and operation of companies and plants under conditions of uncertainty and the need to make decisions in competitive situations. Modern developments in…

  4. Influential input classification in probabilistic multimedia models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.

    1999-05-01

    Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions, one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site-specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
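
    A common first screen for influential inputs in such a Monte Carlo run is to rank them by rank correlation with the outcome. A sketch on a toy multimedia-style model with one dominant, one moderate, and one nearly fixed input; the model and distributions are illustrative:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(13)
    n = 5000

    # Toy input distributions (hypothetical names and ranges).
    X = np.column_stack([
        rng.lognormal(0.0, 1.0, n),   # partition coefficient (dominant)
        rng.uniform(0.5, 1.5, n),     # emission rate (moderate)
        rng.normal(10.0, 0.1, n),     # temperature (nearly fixed)
    ])
    y = X[:, 0] * X[:, 1] + 0.01 * X[:, 2]    # toy fate-model outcome

    for name, j in [("partition coeff", 0), ("emission rate", 1), ("temperature", 2)]:
        rho, _ = spearmanr(X[:, j], y)        # rank correlation with outcome
        print(f"{name}: |rho| = {abs(rho):.2f}")
    ```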

  5. Multisectoral climate impact hotspots in a warming world.

    PubMed

    Piontek, Franziska; Müller, Christoph; Pugh, Thomas A M; Clark, Douglas B; Deryng, Delphine; Elliott, Joshua; Colón González, Felipe de Jesus; Flörke, Martina; Folberth, Christian; Franssen, Wietse; Frieler, Katja; Friend, Andrew D; Gosling, Simon N; Hemming, Deborah; Khabarov, Nikolay; Kim, Hyungjun; Lomas, Mark R; Masaki, Yoshimitsu; Mengel, Matthias; Morse, Andrew; Neumann, Kathleen; Nishina, Kazuya; Ostberg, Sebastian; Pavlick, Ryan; Ruane, Alex C; Schewe, Jacob; Schmid, Erwin; Stacke, Tobias; Tang, Qiuhong; Tessler, Zachary D; Tompkins, Adrian M; Warszawski, Lila; Wisser, Dominik; Schellnhuber, Hans Joachim

    2014-03-04

    The impacts of global climate change on different aspects of humanity's diverse life-support systems are complex and often difficult to predict. To facilitate policy decisions on mitigation and adaptation strategies, it is necessary to understand, quantify, and synthesize these climate-change impacts, taking into account their uncertainties. Crucial to these decisions is an understanding of how impacts in different sectors overlap, as overlapping impacts increase exposure, lead to interactions of impacts, and are likely to raise adaptation pressure. As a first step we develop herein a framework to study coinciding impacts and identify regional exposure hotspots. This framework can then be used as a starting point for regional case studies on vulnerability and multifaceted adaptation strategies. We consider impacts related to water, agriculture, ecosystems, and malaria at different levels of global warming. Multisectoral overlap starts to be seen robustly at a mean global warming of 3 °C above the 1980-2010 mean, with 11% of the world population subject to severe impacts in at least two of the four impact sectors at 4 °C. Despite these general conclusions, we find that uncertainty arising from the impact models is considerable, and larger than that from the climate models. In a low probability-high impact worst-case assessment, almost the whole inhabited world is at risk for multisectoral pressures. Hence, there is a pressing need for an increased research effort to develop a more comprehensive understanding of impacts, as well as for the development of policy measures under existing uncertainty.

  6. Multisectoral Climate Impact Hotspots in a Warming World

    NASA Technical Reports Server (NTRS)

    Piontek, Franziska; Mueller, Christoph; Pugh, Thomas A. M.; Clark, Douglas B.; Deryng, Delphine; Elliott, Joshua; deJesusColonGonzalez, Felipe; Floerke, Martina; Folberth, Christian; Franssen, Wietse

    2014-01-01

    The impacts of global climate change on different aspects of humanity's diverse life-support systems are complex and often difficult to predict. To facilitate policy decisions on mitigation and adaptation strategies, it is necessary to understand, quantify, and synthesize these climate-change impacts, taking into account their uncertainties. Crucial to these decisions is an understanding of how impacts in different sectors overlap, as overlapping impacts increase exposure, lead to interactions of impacts, and are likely to raise adaptation pressure. As a first step we develop herein a framework to study coinciding impacts and identify regional exposure hotspots. This framework can then be used as a starting point for regional case studies on vulnerability and multifaceted adaptation strategies. We consider impacts related to water, agriculture, ecosystems, and malaria at different levels of global warming. Multisectoral overlap starts to be seen robustly at a mean global warming of 3 °C above the 1980-2010 mean, with 11% of the world population subject to severe impacts in at least two of the four impact sectors at 4 °C. Despite these general conclusions, we find that uncertainty arising from the impact models is considerable, and larger than that from the climate models. In a low probability-high impact worst-case assessment, almost the whole inhabited world is at risk for multisectoral pressures. Hence, there is a pressing need for an increased research effort to develop a more comprehensive understanding of impacts, as well as for the development of policy measures under existing uncertainty.

  7. Adaptive Flood Risk Management Under Climate Change Uncertainty Using Real Options and Optimization.

    PubMed

    Woodward, Michelle; Kapelan, Zoran; Gouldby, Ben

    2014-01-01

    It is well recognized that adaptive and flexible flood risk strategies are required to account for future uncertainties. Development of such strategies is, however, a challenge. Climate change alone is a significant complication, but, in addition, complexities exist in trying to identify the most appropriate set of mitigation measures, or interventions. There are a range of economic and environmental performance measures that require consideration, and the spatial and temporal aspects of evaluating the performance of these are complex. All these elements pose severe difficulties to decision makers. This article describes a decision support methodology that has the capability to assess the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. The flood risk strategies have been explicitly designed to allow for flexible adaptive measures by capturing the concepts of real options and multiobjective optimization to evaluate potential flood risk management opportunities. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy over future points in time, and a multiobjective genetic algorithm is utilized to search for the optimal adaptive strategies. The modeling system has been applied to a reach on the Thames Estuary (London, England), and initial results show that the inclusion of flexibility is advantageous, while the outputs provide decision makers with supplementary knowledge that previously has not been considered. © 2013 HR Wallingford Ltd.

  8. Uncertainties in 63Ni and 55Fe determinations using liquid scintillation counting methods.

    PubMed

    Herranz, M; Idoeta, R; Abelairas, A; Legarda, F

    2012-09-01

    The implementation of 63Ni and 55Fe determination methods in an environmental laboratory implies their validation. In this process, the uncertainties related to these methods should be analysed. In this work, the expression of the uncertainty of the results obtained using separation methods followed by liquid scintillation counting is presented. This analysis includes the consideration of uncertainties coming from the different alternatives which these methods use, as well as those which are specific to the individual laboratory and the competency of its operators in applying the standard ORISE (Oak Ridge Institute for Science and Education) methods. Copyright © 2012 Elsevier Ltd. All rights reserved.
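
    For orientation, budgets of this kind are usually combined in quadrature following the GUM. A minimal sketch, assuming the activity A is proportional to the net count rate N and inversely proportional to the counting efficiency ε, the radiochemical recovery R and the sample mass m (an illustrative budget, not the paper's actual one):

    ```latex
    \frac{u_c(A)}{A}
      = \sqrt{\left(\frac{u(N)}{N}\right)^{2}
            + \left(\frac{u(\varepsilon)}{\varepsilon}\right)^{2}
            + \left(\frac{u(R)}{R}\right)^{2}
            + \left(\frac{u(m)}{m}\right)^{2}}
    ```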

  9. Bayesian characterization of uncertainty in species interaction strengths.

    PubMed

    Wolf, Christopher; Novak, Mark; Gitelman, Alix I

    2017-06-01

    Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.
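
    A minimal conjugate sketch of the kind of Bayesian formulation described, assuming a Poisson feeding count with an aggregate exposure term and a Gamma prior; the paper's observational model is richer, and all names and numbers here are illustrative:

    ```python
    from scipy.stats import gamma

    def attack_rate_interval(k, exposure, shape0=0.5, rate0=1e-6, cred=0.95):
        """Posterior mean and credible interval for a per capita attack rate.

        k        -- feeding observations on the focal prey (Poisson count)
        exposure -- aggregate survey exposure (hypothetical stand-in for the
                    predator-count, prey-density and handling-time terms)
        Prior: Gamma(shape0, rate0); likelihood: k ~ Poisson(rate * exposure),
        so the posterior is Gamma(shape0 + k, rate0 + exposure) by conjugacy.
        """
        post = gamma(a=shape0 + k, scale=1.0 / (rate0 + exposure))
        tail = (1.0 - cred) / 2.0
        return post.mean(), post.ppf([tail, 1.0 - tail])

    mean_rate, (lo, hi) = attack_rate_interval(k=3, exposure=120.0)
    # With k >= 1 the lower bound is strictly positive, unlike a bootstrap
    # interval that can collapse to zero for sparse feeding data.
    ```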

  10. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  11. Uncertainty in structural interpretation: Lessons to be learnt

    NASA Astrophysics Data System (ADS)

    Bond, Clare E.

    2015-05-01

    Uncertainty in the interpretation of geological data is an inherent element of geology. Datasets from different sources (remotely sensed seismic imagery, field data and borehole data) are often combined and interpreted to create a geological model of the sub-surface. The data have limited resolution and spatial distribution, which results in uncertainty in the interpretation of the data and in the subsequent geological model(s) created. Methods to determine the extent of interpretational uncertainty of a dataset, to capture and express that uncertainty, and to consider uncertainties in terms of risk have been investigated. Here I review the work that has taken place and discuss best practice in accounting for uncertainties in structural interpretation workflows. Barriers to best practice, including the use of software packages for interpretation, are also considered. Experimental evidence suggests that minimising interpretation error through the use of geological reasoning and rules can help decrease interpretation uncertainty, through the identification of inadmissible interpretations and the highlighting of areas of uncertainty. Understanding expert thought processes and reasoning, including the use of visuospatial skills, during interpretation may aid in the identification of uncertainties and in the education of new geoscientists.

  12. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
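
    A minimal sketch of the coupling pattern described (a global Particle Swarm search polished by Levenberg-Marquardt), not of MADS internals; the residual model, bounds and swarm constants are assumptions:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def residuals(p, t, obs):
        # hypothetical two-parameter exponential source model
        return obs - p[0] * np.exp(-p[1] * t)

    def pso_then_lm(t, obs, lo, hi, n_particles=30, n_iter=100, seed=0):
        """Global PSO search followed by a Levenberg-Marquardt polish."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        x = rng.uniform(lo, hi, (n_particles, lo.size))
        v = np.zeros_like(x)
        cost = lambda p: float(np.sum(residuals(p, t, obs) ** 2))
        pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
        gbest = pbest[np.argmin(pbest_f)].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([cost(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)].copy()
        # Levenberg-Marquardt refinement of the swarm's best candidate
        return least_squares(residuals, gbest, args=(t, obs), method="lm")

    t = np.linspace(0.0, 5.0, 50)
    obs = 2.0 * np.exp(-1.3 * t) + 0.02 * np.random.default_rng(1).normal(size=t.size)
    fit = pso_then_lm(t, obs, lo=[0.0, 0.0], hi=[10.0, 10.0])
    ```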

  13. Higher Flux from the Young Sun as an Explanation for Warm Temperatures for Early Earth and Mars

    NASA Technical Reports Server (NTRS)

    Sackmann, I.-Juliana

    2001-01-01

    Observations indicate that the Earth was at least warm enough for liquid water to exist as far back as 4 Gyr ago, namely, as early as half a billion years after the formation of the Earth; in fact, there is evidence suggesting that Earth may have been even warmer then than it is now. These relatively warm temperatures required on early Earth are in apparent contradiction to the dimness of the early Sun predicted by the standard solar models. This problem has generally been explained by assuming that Earth's early atmosphere contained huge amounts of carbon dioxide (CO2), resulting in a large enough greenhouse effect to counteract the effect of a dimmer Sun. However, recent work places an upper limit of 0.04 bar on the partial pressure of CO2 in the period from 2.75 to 2.2 Gyr ago, based on the absence of siderite in paleosols; this casts doubt on the viability of a strong CO2 greenhouse effect on early Earth. The existence of liquid water on early Mars has been even more of a puzzle; even the maximum possible CO2 greenhouse effect cannot yield warm enough Martian surface temperatures. These problems can be resolved simultaneously for both Earth and Mars, if the early Sun was brighter than predicted by the standard solar models. This could be accomplished if the early Sun was slightly more massive than it is now, i.e., if the solar wind was considerably stronger in the past than at present. A slightly more massive young Sun would have left fingerprints on the internal structure of the present Sun. Today, helioseismic observations exist that can measure the internal structure of the Sun with very high precision. The task undertaken here was to compute solar models with the highest precision possible at this time, starting with slightly greater initial masses. These were evolved to the present solar age, where comparisons with the helioseismic observations could be made. Our computations also yielded the time evolution of the solar flux at the planets - a key input to the climates of early Earth and Mars. Early solar mass loss is not the only influence that can alter the internal structure of the present Sun. There are minor uncertainties in the physics of the solar models and in the key observed solar parameters that also affect the present Sun's internal structure. It was therefore imperative to obtain an understanding of the effects of these other uncertainties, in order to disentangle them from the fingerprints that might be left by early solar mass loss. From these considerations, our work was divided into two parts: (1) We first computed the evolution of standard solar models with input parameters varied within their uncertainties, to determine their effect on the observable helioseismic quantities; (2) We then computed non-standard solar models with higher initial masses to test against the helioseismological observations.

  14. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established EOBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large as or even larger than model uncertainty. For precipitation, the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty, especially for the evaluation of the wet day frequency, the spatial correlation, and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When employing a simple and illustrative model ranking scheme on these results, it is found that RCM ranking in many cases depends on the reference dataset employed.

  15. Hydro power flexibility for power systems with variable renewable energy sources: an IEA Task 25 collaboration: Hydro power flexibility for power systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huertas-Hernando, Daniel; Farahmand, Hossein; Holttinen, Hannele

    2016-06-20

    Hydro power is one of the most flexible sources of electricity production. Power systems with considerable amounts of flexible hydro power potentially offer easier integration of variable generation, e.g., wind and solar. However, there exist operational constraints to ensure mid-/long-term security of supply while keeping river flows and reservoir levels within permitted limits. In order to properly assess the effective available hydro power flexibility and its value for storage, a detailed assessment of hydro power is essential. Due to the inherent uncertainty of the weather-dependent hydrological cycle, regulation constraints on the hydro system, and uncertainty of internal load as well as variable generation (wind and solar), this assessment is complex. Hence, it requires proper modeling of all the underlying interactions between hydro power and the power system, with a large share of other variable renewables. A summary of existing experience of wind integration in hydro-dominated power systems clearly points to the need for rigorous simulation methodologies. Recommendations include requirements for techno-economic models to correctly assess strategies for hydro power and pumped storage dispatch. These models are based not only on seasonal water inflow variations but also on variable generation, and all these are in time horizons from very short term up to multiple years, depending on the studied system. Another important recommendation is to include a geographically detailed description of hydro power systems, rivers' flows, and reservoirs as well as grid topology and congestion.

  16. A drag-free Lo-Lo satellite system for improved gravity field measurements

    NASA Technical Reports Server (NTRS)

    Fischell, R. E.; Pisacane, V. L.

    1978-01-01

    At very low altitudes, the effect of atmospheric drag results in drastically reduced orbit lifetimes and considerable uncertainty in satellite motions. The concept suggested herein employs a DISturbance COmpensation System (DISCOS) on each of a pair of satellites at very low altitudes to provide refined measurements of the earth's gravitational field. The DISCOS maintains the satellites in orbit and essentially eliminates motion uncertainties due mostly to drag and, to a lesser extent, solar radiation pressure. By a closed-loop measurement of the relative range-rate between the two low satellites, one can determine the earth's gravitational field with considerably greater accuracy than could be obtained by tracking a single satellite.

  17. Multimode squeezing, biphotons and uncertainty relations in polarization quantum optics

    NASA Technical Reports Server (NTRS)

    Karassiov, V. P.

    1994-01-01

    The concepts of squeezing and uncertainty relations are discussed for multimode quantum light with polarization taken into consideration. Using the polarization gauge SU(2) invariance of free electromagnetic fields, we separate the polarization and biphoton degrees of freedom from the other ones and consider uncertainty relations characterizing polarization and biphoton observables. As a consequence, we obtain a new classification of states of unpolarized (and partially polarized) light within quantum optics. We also briefly discuss some interrelations of our analysis with experiments connected with solving some fundamental problems of physics.

  18. GPS (Global Positioning System) Error Budgets, Accuracy and Applications Considerations for Test and Training Ranges.

    DTIC Science & Technology

    1982-12-01

    [Garbled extraction of figure titles: relationships of PDOP and HDOP with a priori altitude uncertainty in 3-dimensional and 2-dimensional navigation, for various satellite configurations (AZEL).]

  19. Economic policy uncertainty, equity premium and dependence between their quantiles: Evidence from quantile-on-quantile approach

    NASA Astrophysics Data System (ADS)

    Raza, Syed Ali; Zaighum, Isma; Shah, Nida

    2018-02-01

    This paper examines the relationship between economic policy uncertainty and the equity premium in G7 countries over a period of monthly data from January 1989 to December 2015, using a novel technique, namely quantile-on-quantile (QQ) regression, proposed by Sim and Zhou (2015). Based on the QQ approach, we estimate how the quantiles of economic policy uncertainty affect the quantiles of the equity premium. Thus, it provides a comprehensive insight into the overall dependence structure between the equity premium and economic policy uncertainty as compared to traditional techniques like OLS or quantile regression. Overall, our empirical evidence suggests the existence of a negative association between the equity premium and EPU, predominantly in all G7 countries, and especially in the extreme low and extreme high tails. However, differences exist among countries and across different quantiles of EPU and the equity premium within each country. The existence of this heterogeneity among countries is due to differences in terms of dependency on economic policy, other stock markets, and the linkages with other countries' equity markets.
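
    A compact sketch of the local, kernel-weighted quantile regression that underlies a QQ estimate; the bandwidth, the Gaussian kernel and the simulated data are illustrative assumptions, not the specification of Sim and Zhou (2015):

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm, rankdata

    def qq_estimate(x, y, theta, tau, h=0.05):
        """Local linear quantile regression of y on x at y-quantile tau,
        kernel-weighted around the theta-quantile of x."""
        Fx = rankdata(x) / len(x)                    # empirical CDF of x (EPU)
        w = norm.pdf((Fx - theta) / h)               # kernel weights near theta
        x0 = np.quantile(x, theta)
        def loss(beta):
            u = y - beta[0] - beta[1] * (x - x0)
            return np.sum(w * u * (tau - (u < 0)))   # weighted pinball loss
        return minimize(loss, np.zeros(2), method="Nelder-Mead").x

    rng = np.random.default_rng(0)
    epu = rng.gamma(2.0, 50.0, size=324)             # 324 monthly observations
    prem = -0.002 * epu + rng.normal(0.0, 3.0, size=epu.size)
    b0, b1 = qq_estimate(epu, prem, theta=0.9, tau=0.1)  # high-EPU, low-premium cell
    ```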

  20. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if not removed. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, and thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  1. Reusable launch vehicle model uncertainties impact analysis

    NASA Astrophysics Data System (ADS)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) have the typical characteristics of complex aerodynamic shape and propulsion system coupling, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Consequently, studying the influence of the uncertainties on the stability of the control system is of great significance for controller design. In order to improve the performance of RLVs, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. Then the different factors that introduce uncertainties into the model are analyzed and summed up. After that, the model uncertainties are expressed according to the additive uncertainty model, choosing the maximum singular values of the uncertainty matrix as the boundary model and selecting the norm of the uncertainty matrix to show how much influence the uncertainty factors have on the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (like RLVs).
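
    For reference, the additive uncertainty description named here, in standard robust-control notation; the bound and the small-gain test below are textbook forms, not quantities reported in the paper:

    ```latex
    % Additive uncertainty model: the true plant is the nominal model G_0
    % plus a perturbation Delta bounded in maximum singular value.
    G_{\mathrm{true}}(s) = G_{0}(s) + \Delta(s),
    \qquad
    \bar{\sigma}\bigl(\Delta(j\omega)\bigr) \le \ell(\omega)\quad\forall\,\omega.
    % Robust stability for all admissible Delta, with controller K, then
    % requires the standard small-gain condition:
    \bar{\sigma}\Bigl(K(j\omega)\,\bigl[I + G_{0}(j\omega)K(j\omega)\bigr]^{-1}\Bigr)
      < \frac{1}{\ell(\omega)}\quad\forall\,\omega.
    ```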

  2. Modelling uncertainties and possible future trends of precipitation and temperature for 10 sub-basins in Columbia River Basin (CRB)

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Rana, A.; Qin, Y.; Moradkhani, H.

    2014-12-01

    Trends and changes in future climatic parameters, such as precipitation and temperature, have been a central part of climate change studies. In the present work, we have analyzed the seasonal and yearly trends, and the uncertainties of prediction, in all 10 sub-basins of the Columbia River Basin (CRB) for the future period 2010-2099. The work is carried out using two different sets of statistically downscaled Global Climate Model (GCM) projection datasets, i.e., Bias Correction and Statistical Downscaling (BCSD) generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) generated at the University of Idaho. The analysis is done with 10 downscaled GCM products each from the CMIP5 daily dataset, totaling 40 different downscaled products for robust analysis. Summer, winter, and yearly trend analysis is performed for all 10 sub-basins using linear regression (significance tested by Student's t test) and the Mann-Kendall test (0.05 significance level) for precipitation (P), maximum temperature (Tmax), and minimum temperature (Tmin). Thereafter, all the parameters are modelled for uncertainty, across all models, in all 10 sub-basins and across the CRB for future scenario periods. Results indicate varying degrees of trends for all the sub-basins, mostly pointing towards a significant increase in all three climatic parameters, for all seasonal and yearly considerations. Uncertainty analysis has revealed very high variability in all the parameters across models and sub-basins under consideration. Basin-wide uncertainty analysis is performed to corroborate results from the smaller, sub-basin scale. Similar trends and uncertainties are reported on the larger scale as well. Interestingly, both trends and uncertainties are higher during the winter period than during summer, contributing to a large part of the yearly change.
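
    A minimal sketch of the Mann-Kendall test applied here (two-sided, and without the tie correction, which would matter for real precipitation records):

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(series, alpha=0.05):
        """Two-sided Mann-Kendall trend test (no tie correction, for brevity)."""
        x = np.asarray(series, float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S under H0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        return z, p, p < alpha                          # trend if p < alpha

    z, p, significant = mann_kendall(
        np.arange(30) + np.random.default_rng(2).normal(size=30))
    ```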

  3. Sources of uncertainty as a basis to fill the information gap in a response to flood

    NASA Astrophysics Data System (ADS)

    Kekez, Toni; Knezic, Snjezana

    2016-04-01

    Taking into account uncertainties in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when a flood occurs. Parameter and structural uncertainties, which include model and observation errors as well as lack of knowledge about system characteristics, are the main considerations. Real-time flood risk assessment methods are predominantly based on measured water level values and on the vulnerability, as well as other relevant characteristics, of the flood-affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and the relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to increase the available response time through more effective risk assessment that includes an uncertainty propagation model. The response phase can be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the emergency operability level in order to enhance the safety of citizens and reduce possible damage. The results of the EPISECC (EU-funded FP7) project are used to validate the potential benefits of this research in order to improve flood risk management and response methods. EPISECC aims at developing a concept of a common European Information Space for disaster response which, among other disasters, considers floods.

  4. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    NASA Astrophysics Data System (ADS)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of the evidence that mostly exists in a pineapple disease identification system, a reasoning model based on evidence credibility factors was established. The uncertainty reasoning method is discussed, including the uncertain representation of knowledge, of rules, and of multiple pieces of evidence, and the updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.
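
    The abstract does not spell out the credibility-factor calculus; the classic MYCIN certainty-factor scheme is a common reference point for uncertain rule-based reasoning of this kind. A sketch under that assumption:

    ```python
    def combine_cf(cf1, cf2):
        """MYCIN-style combination of two certainty factors in [-1, 1]."""
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 <= 0 and cf2 <= 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    def conclude(evidence_cf, rule_cf):
        """Credibility of a disease hypothesis from one uncertain rule:
        CF(H) = max(CF(E), 0) * CF(rule)."""
        return max(evidence_cf, 0.0) * rule_cf

    # two symptoms pointing at the same disease with different credibilities
    cf = combine_cf(conclude(0.8, 0.7), conclude(0.6, 0.9))
    ```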

  5. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and the uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  6. How multiple causes combine: independence constraints on causal inference.

    PubMed

    Liljeholm, Mimi

    2015-01-01

    According to the causal power view, two core constraints, namely that causes occur independently (i.e., no confounding) and that they influence their effects independently, serve as boundary conditions for causal induction. This study investigated how violations of these constraints modulate uncertainty about the existence and strength of a causal relationship. Participants were presented with pairs of candidate causes that were either confounded or not, and that either interacted or exerted their influences independently. Consistent with the causal power view, uncertainty about the existence and strength of causal relationships was greater when causes were confounded or interacted than when they were unconfounded and acting independently. An elemental Bayesian causal model captured differences in uncertainty due to confounding, but not those due to an interaction. Implications of distinct sources of uncertainty for the selection of contingency information and for causal generalization are discussed.
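
    For reference, the quantity at stake in standard notation: Cheng's causal power of a generative cause, which is identifiable only under the two independence constraints, together with the noisy-OR rule for independently acting causes:

    ```latex
    % Causal power p_c of a generative cause c (Cheng, 1997):
    p_c = \frac{P(e \mid c) - P(e \mid \neg c)}{1 - P(e \mid \neg c)}
    % When two causes act independently, their influences combine as noisy-OR:
    P(e \mid c_1, c_2) = 1 - (1 - p_{c_1})(1 - p_{c_2})
    ```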

  7. Integrating climate change considerations into forest management tools and training

    Treesearch

    Linda M. Nagel; Christopher W. Swanston; Maria K. Janowiak

    2010-01-01

    Silviculturists are currently facing the challenge of developing management strategies that meet broad ecological and social considerations in spite of a high degree of uncertainty in future climatic conditions. Forest managers need state-of-the-art knowledge about climate change and potential impacts to facilitate development of silvicultural objectives and...

  8. The future of hydropower planning modeling

    NASA Astrophysics Data System (ADS)

    Haas, J.; Zuñiga, D.; Nowak, W.; Olivares, M. A.; Castelletti, A.; Thilmant, A.

    2017-12-01

    Planning the investment and operation of hydropower plants with optimization tools dates back to the 1970s. The focus used to be solely on the provision of energy. However, advances in computational capacity and solving algorithms, dynamic markets, expansion of renewable sources, and a better understanding of hydropower environmental impacts have recently led to the development of novel planning approaches. In this work, we provide a review, systematization, and trend analysis of these approaches. Further, through interviews with experts, we outline the future of hydropower planning modeling and identify the gaps that remain. We classified the models found along environmental, economic, multipurpose, and technical criteria. Environmental interactions include hydropeaking mitigation, water quality protection, and limiting greenhouse gas emissions from reservoirs. Economic and regulatory criteria consider uncertainties of fossil fuel prices and the relicensing of water rights and power purchase agreements. Multipurpose considerations account for irrigation, tourism, flood protection, and drinking water. Recently included technical details account for sedimentation in reservoirs and variable efficiencies of turbines. Additional operational considerations relate to hydrological aspects such as dynamic reservoir inflows, water losses, and climate change. Although many of the above criteria have been addressed in detail on a project-to-project basis, models remain overly simplistic for planning large power fleets. Future hydropower planning tools are expected to improve the representation of the water-energy nexus, including environmental and multipurpose criteria. Further, they will concentrate on identifying new sources of operational flexibility (e.g. through installing additional turbines and pumps) for integrating renewable energy. The operational detail will increase, potentially emphasizing variable efficiencies, storage capacity losses due to sedimentation, and the timing of inflows (which are becoming more variable under climate change). Finally, the relicensing of existing operations and the planning of new installations are subject to deep uncertainties that need to be captured.

  9. 21st century climate change in the European Alps--a review.

    PubMed

    Gobiet, Andreas; Kotlarski, Sven; Beniston, Martin; Heinrich, Georg; Rajczak, Jan; Stoffel, Markus

    2014-09-15

    Reliable estimates of future climate change in the Alps are relevant for large parts of European society. At the same time, the complex Alpine region poses considerable challenges to climate models, which translate into uncertainties in the climate projections. Against this background, the present study reviews the state of knowledge about 21st century climate change in the Alps based on existing literature and additional analyses. In particular, it explicitly considers the reliability and uncertainty of climate projections. Results show that besides Alpine temperatures, precipitation, global radiation, relative humidity, and closely related impacts like floods, droughts, snow cover, and natural hazards will also be affected by global warming. Under the A1B emission scenario, about 0.25 °C warming per decade until the middle of the 21st century, and accelerated 0.36 °C warming per decade in the second half of the century, is expected. Warming will probably be associated with changes in the seasonality of precipitation, global radiation, and relative humidity, and with more intense precipitation extremes and flooding potential in the colder part of the year. Conditions corresponding to currently record-breaking warm winters or hot summers may become normal at the end of the 21st century, and there are indications that droughts will become more severe in the future. Snow cover is expected to decrease drastically below 1500-2000 m, and natural hazards related to glacier and permafrost retreat are expected to become more frequent. Such changes in climatic parameters and related quantities will have considerable impact on ecosystems and society and will challenge their adaptive capabilities. © 2013. Published by Elsevier B.V. All rights reserved.

  10. Mapping total suspended matter from geostationary satellites: a feasibility study with SEVIRI in the Southern North Sea.

    PubMed

    Neukermans, Griet; Ruddick, Kevin; Bernard, Emilien; Ramon, Didier; Nechad, Bouchra; Deschamps, Pierre-Yves

    2009-08-03

    Geostationary ocean colour sensors have not yet been launched into space, but are under consideration by a number of space agencies. This study provides a proof of concept for the mapping of Total Suspended Matter (TSM) in turbid coastal waters from geostationary platforms with the existing SEVIRI (Spinning Enhanced Visible and InfraRed Imager) meteorological sensor on the METEOSAT Second Generation platform. Data are available in near real time every 15 minutes. SEVIRI lacks sufficient bands for chlorophyll remote sensing, but its spectral resolution is sufficient for quantification of TSM in turbid waters, using a single broad red band combined with a suitable near-infrared band. A test data set for mapping TSM in the Southern North Sea was obtained covering 35 consecutive days, from June 28 until July 31, 2006. Atmospheric correction of SEVIRI images includes corrections for Rayleigh and aerosol scattering, absorption by atmospheric gases, and atmospheric transmittances. The aerosol correction uses assumptions on the ratio of marine reflectances and aerosol reflectances in the red and near-infrared bands. A single-band TSM retrieval algorithm, calibrated by non-linear regression of seaborne measurements of TSM and marine reflectance, was applied. The effect of the above assumptions on the uncertainty of the marine reflectance and TSM products was analysed. Results show that (1) mapping of TSM in the Southern North Sea is feasible with SEVIRI for turbid waters, though with considerable uncertainties in clearer waters, (2) TSM maps are well correlated with TSM maps obtained from MODIS AQUA, and (3) during cloud-free days, high-frequency dynamics of TSM are detected.
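
    A sketch of calibrating a single-band retrieval by non-linear regression; the saturating functional form and the match-up numbers are assumptions for illustration, not the algorithm actually fitted in the paper:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def tsm_model(rho, a, c):
        # saturating single-band form TSM = a*rho / (1 - rho/c); the form
        # and all numbers below are assumptions for illustration
        return a * rho / (1.0 - rho / c)

    rho_obs = np.array([0.01, 0.03, 0.05, 0.08, 0.12])   # red-band reflectance
    tsm_obs = np.array([5.0, 16.0, 30.0, 58.0, 115.0])   # seaborne TSM, g m^-3

    params, cov = curve_fit(tsm_model, rho_obs, tsm_obs, p0=(400.0, 0.3))
    a_fit, c_fit = params
    perr = np.sqrt(np.diag(cov))    # 1-sigma uncertainties of the coefficients
    ```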

  11. Characterization of a neutron sensitive MCP/Timepix detector for quantitative image analysis at a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.

    2017-07-01

    The uncertainties and the stability of a neutron sensitive MCP/Timepix detector operating in event timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant component of the uncertainty arises from counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible, based on error propagation considerations, even if the pixel occupation probability is more than 50%. We have additionally taken the multiple counting effect into account in the treatment of the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing effects of the current microchannel plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.

  12. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound for predicting the outcomes of measurements of a pair of incompatible observables. In this work, we study the dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with the quantum memory B. It has been found that larger coupling strengths J of the ferromagnetic ( J < 0 ) and antiferromagnetic ( J > 0 ) chains can effectively reduce the measurement uncertainty. Besides, it turns out that a higher temperature induces inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field | B |, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is dramatically associated with the uncertainty. Remarkably, we put forward a possible physical interpretation of the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on entropic uncertainty in the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state-based quantum information processing.
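
    For reference, the memory-assisted entropic uncertainty relation of Berta et al. that the optimized bounds discussed here refine, in standard notation:

    ```latex
    % Q and R are the incompatible observables with eigenstates |psi_i>,
    % |phi_j>; B is the quantum memory correlated with system A.
    S(Q \mid B) + S(R \mid B) \;\ge\; \log_{2}\frac{1}{c} + S(A \mid B),
    \qquad
    c = \max_{i,j}\,\bigl|\langle \psi_i \mid \phi_j \rangle\bigr|^{2}
    ```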

  13. Breastfeeding considerations of opioid dependent mothers and infants.

    PubMed

    Hilton, Tara C

    2012-01-01

    The American Academy of Pediatrics (AAP) has a long-standing recommendation against breastfeeding if the maternal methadone dose is above 20 mg/day. In 2001, the AAP lifted the dose restriction of maternal methadone allowing methadone-maintained mothers to breastfeed. The allowance of breastfeeding among mothers taking methadone has been met with opposition due to the uncertainty that exists related to methadone exposure of the suckling infant. Methadone-maintained mothers are at higher risk for abuse, concomitant psychiatric disorders, limited access to healthcare, and financial hardship. Breastfeeding rates among methadone-maintained women tend to be low compared to the national average. This manuscript will discuss the implications for healthcare practitioners caring for methadone-maintained mothers and infants and associated risks and benefits of breastfeeding. This population of mothers and infants stands to obtain particular benefits from the various well-known advantages of breastfeeding.

  14. Planning and setting objectives in field studies: Chapter 2

    USGS Publications Warehouse

    Fisher, Robert N.; Dodd, C. Kenneth

    2016-01-01

    This chapter enumerates the steps required in designing and planning field studies on the ecology and conservation of reptiles, as these involve a high level of uncertainty and risk. To this end, the chapter differentiates between goals (descriptions of what one intends to accomplish) and objectives (the measurable steps required to achieve the established goals). Thus, meeting a specific goal may require many objectives. It may not be possible to define some of them until certain experiments have been conducted; often, evaluations of sampling protocols are needed to increase certainty in the biological results. If sampling locations are fixed and sampling events are repeated over time, then both study-specific covariates and sampling-specific covariates should exist. Additionally, other critical design considerations for field studies include obtaining permits, as well as researching ethics and biosecurity issues.

  15. Impact of Pitot tube calibration on the uncertainty of water flow rate measurement

    NASA Astrophysics Data System (ADS)

    de Oliveira Buscarini, Icaro; Costa Barsaglini, Andre; Saiz Jabardo, Paulo Jose; Massami Taira, Nilson; Nader, Gilder

    2015-10-01

    Water utility companies often use Cole-type Pitot tubes to map velocity profiles and thus measure flow rate. Frequent monitoring and measurement of flow rate is an important step in identifying leaks and other types of losses. In Brazil, losses as high as 42% are common, and in some places even higher values are found. When using Cole-type Pitot tubes to measure the flow rate, the uncertainty of the calibration coefficient (Cd) is a major component of the overall flow rate measurement uncertainty. A common practice is to employ the usual value Cd = 0.869, in use since Cole proposed his Pitot tube in 1896. Analysis of 414 calibrations of Cole-type Pitot tubes shows that Cd varies considerably, and expanded uncertainty values as high as 0.020 are common. Combined with other uncertainty sources, the overall velocity measurement uncertainty is 0.02, increasing flow rate measurement uncertainty by 1.5%, which, for the Sao Paulo metropolitan area (Brazil), corresponds to 3.5 × 10⁷ m³/year.
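
    A first-order (GUM-style) propagation sketch showing how the Cd term enters the velocity budget; only Cd = 0.869 and the expanded U(Cd) = 0.020 come from the abstract, and the pressure figures are invented:

    ```python
    import numpy as np

    def pitot_velocity_rel_uncertainty(cd=0.869, u_cd=0.010, dp=500.0, u_dp=5.0):
        """Relative standard uncertainty of v = Cd*sqrt(2*dp/rho).

        u_cd here is the expanded value 0.020 divided by k = 2 (an assumed
        coverage factor); dp and u_dp are illustrative pressure figures.
        Density cancels out of the relative budget.
        """
        rel_cd = u_cd / cd          # Cd enters linearly
        rel_dp = 0.5 * u_dp / dp    # the square root halves the dp contribution
        return float(np.hypot(rel_cd, rel_dp))   # combine in quadrature

    print(pitot_velocity_rel_uncertainty())      # ~0.013, i.e. about 1.3 %
    ```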

  16. Uncertainty in the delayed neutron fraction in fuel assembly depletion calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Kodeli, Ivan A.; Velkov, Kiril; Zwermann, Winfried

    2017-09-01

    This study presents uncertainty and sensitivity analyses of the delayed neutron fraction of light water reactor and sodium-cooled fast reactor fuel assemblies. For these analyses, the sampling-based XSUSA methodology is used to propagate cross section uncertainties in neutron transport and depletion calculations. Cross section data is varied according to the SCALE 6.1 covariance library. Since this library includes nu-bar uncertainties only for the total values, it has been supplemented by delayed nu-bar uncertainties from the covariance data of the JENDL-4.0 nuclear data library. The neutron transport and depletion calculations are performed with the TRITON/NEWT sequence of the SCALE 6.1 package. The evolution of the delayed neutron fraction uncertainty over burn-up is analysed without and with the consideration of delayed nu-bar uncertainties. Moreover, the main contributors to the result uncertainty are determined. In all cases, the delayed nu-bar uncertainties increase the delayed neutron fraction uncertainty. Depending on the fuel composition, the delayed nu-bar values of uranium and plutonium in fact give the main contributions to the delayed neutron fraction uncertainty for the LWR fuel assemblies. For the SFR case, the uncertainty of the scattering cross section of U-238 is the main contributor.
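
    A minimal sketch of the sampling-based propagation pattern (XSUSA-style); the data vector, covariance matrix and the model stand-in are hypothetical, whereas the real chain runs each sampled library through a TRITON/NEWT transport and depletion calculation:

    ```python
    import numpy as np

    def sample_and_propagate(mean, cov, model, n=300, seed=0):
        """Monte Carlo propagation of nuclear-data uncertainties.

        mean/cov -- hypothetical multigroup data (e.g. delayed nu-bar values)
                    and their covariance matrix
        model    -- maps one sampled data set to an output such as beta_eff
        """
        rng = np.random.default_rng(seed)
        samples = rng.multivariate_normal(mean, cov, size=n)
        out = np.array([model(s) for s in samples])
        return out.mean(), out.std(ddof=1)   # output mean and 1-sigma spread

    # toy stand-in: beta_eff as a weighted sum of two delayed nu-bar parameters
    mean = np.array([0.0065, 0.0045])
    cov = np.diag([2.0e-8, 1.5e-8])
    beta_mean, beta_std = sample_and_propagate(
        mean, cov, lambda s: 0.6 * s[0] + 0.4 * s[1])
    ```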

  17. Stress Corrosion of Ceramic Materials

    DTIC Science & Technology

    1981-10-01

    stresses are liable to fail after an indeterminate period of time, leading to a considerable uncertainty in the safe design stress. One of the objectives...of modern ceramics technology is to reduce the uncertainty associated with structural design, and hence, to improve our capabilities of designing...processes that occur during stress corrosion cracking. Recent advances in this area of structural design with ceramic materials have led to several

  18. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase in flood damages over the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, expected damage assessment represents crucial information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that is able to operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessment of flood consequences in terms of direct economic damage and loss of human lives. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. Through the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study exhibit high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because there are no depth-damage functions specifically developed for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In the light of the results obtained, the need to produce and disseminate (open) data to develop micro-scale vulnerability curves is evident. Moreover, there is an urgent need to push forward research into the implementation of methods and models for the assimilation of uncertainties in decision-making processes.

  19. Modeling Cervical Cancer Prevention in Developed Countries

    PubMed Central

    Kim, Jane J.; Brisson, Marc; Edmunds, W. John; Goldie, Sue J.

    2009-01-01

    Cytology-based screening has reduced cervical cancer mortality in countries able to implement, sustain and financially support organized programs that achieve broad coverage. These ongoing secondary prevention efforts considerably complicate the question of whether vaccination against Human Papillomavirus (HPV) types 16 and 18 should be introduced. Policy questions focus primarily on the target ages of vaccination, appropriate ages for a temporary “catch-up” program, possible revisions in screening policies to optimize synergies with vaccination, including the increased use of HPV DNA testing, and the inclusion of boys in the vaccination program. Decision-analytic models are increasingly being developed to simulate disease burden and interventions in different settings in order to evaluate the benefits and cost-effectiveness of primary and secondary interventions for informed decision-making. This article is a focused review of existing mathematical models that have been used to evaluate HPV vaccination in the context of developed countries with existing screening programs. Despite variations in model assumptions and uncertainty in existing data, pre-adolescent vaccination of girls is consistently found to be attractive in the context of current screening practices, provided there is complete and lifelong vaccine protection and widespread vaccination coverage. Questions related to catch-up vaccination programs, potential benefits for other non-cervical cancer outcomes, and inclusion of boys are subject to far more uncertainty, and results from these analyses have reached conflicting conclusions. Most analyses find that some catch-up vaccination is warranted but becomes increasingly unattractive as the catch-up age is extended, and that vaccination of boys is unlikely to be cost-effective if reasonable levels of coverage are achieved in girls or coverage among girls can be improved. The objective of the review is to highlight points of consensus and qualitative themes, to discuss areas of divergent findings, and to provide insight into critical decisions related to cervical cancer prevention. PMID:18847560

  20. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification: the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  1. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and of different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC's Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines, raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004], which was further refined by lengthy discussions among many IPCC Lead Authors, for over a year, resulting in a short summary of a standard approach to be followed for that assessment [IPCC, 2005]. This paper extends a review of the treatment of uncertainty in the IPCC assessments by Swart et al. [2009]. It is shown that progress towards consistency has been made but that there also appears to be a need for continued use of several complementary approaches in order to cover the wide range of circumstances across different disciplines involved in climate change. While this reflects the situation in the science community, it also raises the level of complexity for policymakers and other users of the assessments, who would prefer one common consensus approach.

    References:
    IPCC (2005), Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties, IPCC, Geneva.
    Manning, M., et al. (2004), IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options, IPCC.
    Moss, R., and S. Schneider (2000), Uncertainties, in Guidance Papers on the Cross Cutting Issues of the Third Assessment Report of the IPCC, edited by R. Pachauri et al., Intergovernmental Panel on Climate Change (IPCC), Geneva.
    Swart, R., et al. (2009), Agreeing to disagree: uncertainty management in assessing climate change, impacts and responses by the IPCC, Climatic Change, 92(1-2), 1-29.

  2. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
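    The Monte Carlo flavour of uncertainty propagation described above can be sketched in a few lines; the discontinuous weight function below is a made-up stand-in for the aircraft analysis code, and all names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small input uncertainties on two configuration design variables
x1 = rng.normal(15.0, 0.5, 100_000)
x2 = rng.normal(14.0, 0.5, 100_000)

# Stand-in for the design code: a response with a jump, mimicking the
# discontinuous design spaces noted in the abstract.
weight = 5000.0 + 40.0 * x1 + 25.0 * x2 + np.where(x1 + x2 > 30.0, 500.0, 0.0)

print(f"mean weight {weight.mean():.1f}, std {weight.std():.1f}")
# Probability that a weight constraint is satisfied at this input uncertainty
print(f"P(weight <= 6000) = {(weight <= 6000).mean():.3f}")
```

    Unlike the method of moments, sampling makes no smoothness assumption, which is why it tolerates the jump in this response.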

  3. Projecting Heat-Related Mortality Impacts Under a Changing Climate in the New York City Region

    PubMed Central

    Knowlton, Kim; Lynn, Barry; Goldberg, Richard A.; Rosenzweig, Cynthia; Hogrefe, Christian; Rosenthal, Joyce Klein; Kinney, Patrick L.

    2007-01-01

    Objectives. We sought to project future impacts of climate change on summer heat-related premature deaths in the New York City metropolitan region. Methods. Current and future climates were simulated over the northeastern United States with a global-to-regional climate modeling system. Summer heat-related premature deaths in the 1990s and 2050s were estimated by using a range of scenarios and approaches to modeling acclimatization (e.g., increased use of air conditioning, gradual physiological adaptation). Results. Projected regional increases in heat-related premature mortality by the 2050s ranged from 47% to 95%, with a mean 70% increase compared with the 1990s. Acclimatization effects reduced regional increases in summer heat-related premature mortality by about 25%. Local impacts varied considerably across the region, with urban counties showing greater numbers of deaths and smaller percentage increases than less-urbanized counties. Conclusions. Although considerable uncertainty exists in climate forecasts and future health vulnerability, the range of projections we developed suggests that by midcentury, acclimatization may not completely mitigate the effects of climate change in the New York City metropolitan region, which would result in an overall net increase in heat-related premature mortality. PMID:17901433

  4. The atmospheric effects of stratospheric aircraft: A third program report

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S. (Editor); Wesoky, Howard L. (Editor)

    1993-01-01

    A third report from the Atmospheric Effects of Stratospheric Aircraft (AESA) component of NASA's High-Speed Research Program (HSRP) is presented. Market and technology considerations continue to provide an impetus for high-speed civil transport research. A recent United Nations Environment Program scientific assessment showed that considerable uncertainty still exists about the possible impact of aircraft on the atmosphere. The AESA was designed to develop the body of scientific knowledge necessary for evaluating the impact of stratospheric aircraft on the atmosphere. The first Program report presented the basic objectives and plans for AESA. This third report marks the midpoint of the program and presents the status of the ongoing research on the impact of stratospheric aircraft on the atmosphere, as reported at the third annual AESA Program meeting in June 1993. The focus of the program is on predicted atmospheric changes resulting from projected HSCT emissions. Topics reported include how high-speed civil transports (HSCTs) might affect stratospheric ozone; emissions scenarios and databases for assessing potential atmospheric effects from HSCTs; calculated results from 2-D zonal-mean models using the emissions data; engine trace-constituent measurements; and exhaust plume/aircraft wake vortex interactions.

  5. The atmospheric effects of stratospheric aircraft

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S. (Editor); Wesoky, Howard L. (Editor)

    1993-01-01

    This document presents a second report from the Atmospheric Effects of Stratospheric Aircraft (AESA) component of NASA's High-Speed Research Program (HSRP). Market and technology considerations continue to provide an impetus for high-speed civil transport research. A recent United Nations Environment Program scientific assessment has shown that considerable uncertainty still exists about the possible impact of aircraft on the atmosphere. The AESA was designed to develop the body of scientific knowledge necessary for evaluating the impact of stratospheric aircraft on the atmosphere. The first Program report presented the basic objectives and plans for AESA. This second report presents the status of the ongoing research as reported by the principal investigators at the second annual AESA Program meeting in May 1992. Laboratory studies are probing the mechanisms responsible for many of the heterogeneous reactions that occur on stratospheric particles. Understanding how the atmosphere redistributes aircraft exhaust is critical to knowing where the perturbed air will go and how long it will remain in the stratosphere. The assessment of fleet effects depends on the ability to develop scenarios which correctly simulate fleet operations.

  6. Active learning for clinical text classification: is it better than random sampling?

    PubMed

    Figueroa, Rosa L; Zeng-Treitler, Qing; Ngo, Long H; Goryachev, Sergey; Wiechmann, Eduardo P

    2012-01-01

    This study explores active learning algorithms as a way to reduce the requirements for large training sets in medical text classification tasks. Three existing active learning algorithms (distance-based (DIST), diversity-based (DIV), and a combination of both (CMB)) were used to classify text from five datasets. The performance of these algorithms was compared to that of passive learning on the five datasets. We then conducted a novel investigation of the interaction between dataset characteristics and the performance results. Classification accuracy and area under receiver operating characteristics (ROC) curves for each algorithm at different sample sizes were generated. The performance of active learning algorithms was compared with that of passive learning using a weighted mean of paired differences. To determine why the performance varies on different datasets, we measured the diversity and uncertainty of each dataset using relative entropy and correlated the results with the performance differences. The DIST and CMB algorithms performed better than passive learning. With a statistical significance level set at 0.05, DIST outperformed passive learning in all five datasets, while CMB was found to be better than passive learning in four datasets. We found strong correlations between the dataset diversity and the DIV performance, as well as the dataset uncertainty and the performance of the DIST algorithm. For medical text classification, appropriate active learning algorithms can yield performance comparable to that of passive learning with considerably smaller training sets. In particular, our results suggest that DIV performs better on data with higher diversity and DIST on data with lower uncertainty.
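    A minimal sketch of distance-based query selection in the spirit of DIST follows, using scikit-learn; the synthetic dataset, query budget, and margin criterion are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Pool-based setup: a small labeled seed set and a large unlabeled pool.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
labeled = list(range(10))
pool = list(range(10, 500))

clf = LogisticRegression(max_iter=1000)
for _ in range(40):  # query budget
    clf.fit(X[labeled], y[labeled])
    # Distance-based criterion: query the pool point closest to the
    # decision boundary (smallest absolute decision-function margin).
    margins = np.abs(clf.decision_function(X[pool]))
    pick = pool.pop(int(np.argmin(margins)))
    labeled.append(pick)  # in practice an oracle would provide y[pick]
```

    The passive-learning baseline in the study corresponds to replacing the argmin selection with a uniformly random draw from the pool.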

  7. Adaptive fuzzy-neural-network control for maglev transportation system.

    PubMed

    Wai, Rong-Jong; Lee, Jeng-Dao

    2008-01-01

    A magnetic-levitation (maglev) transportation system including levitation and propulsion control is a subject of considerable scientific interest because of its highly nonlinear and unstable behavior. In this paper, the dynamic model of a maglev transportation system including levitated electromagnets and a propulsive linear induction motor (LIM), based on the concepts of mechanical geometry and motion dynamics, is developed first. Then, a model-based sliding-mode control (SMC) strategy is introduced. In order to alleviate the chattering phenomena caused by an inappropriate selection of the uncertainty bound, a simple bound estimation algorithm is embedded in the SMC strategy to form an adaptive sliding-mode control (ASMC) scheme. However, this estimated bound can only increase, so tracking errors introduced by any uncertainty will drive it upward, potentially without limit, over time. Therefore, an adaptive fuzzy-neural-network control (AFNNC) scheme is further designed, imitating the SMC strategy, for the maglev transportation system. In the model-free AFNNC, online learning algorithms are designed to cope with the chattering phenomena caused by the sign action in the SMC design, and to ensure the stability of the controlled system without requiring auxiliary compensating controllers despite the existence of uncertainties. The outputs of the AFNNC scheme can be supplied directly to the electromagnets and LIM without complicated control transformations, relaxing the strict constraints of conventional model-based control methodologies. The effectiveness of the proposed control schemes for the maglev transportation system is verified by numerical simulations, and the superiority of the AFNNC scheme is indicated in comparison with the SMC and ASMC strategies.
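    To make the adaptive-bound drawback concrete, here is a minimal scalar simulation of sliding-mode control with an online bound estimate; the plant, gains, and disturbance are invented for illustration and are far simpler than the maglev dynamics in the paper.

```python
import numpy as np

# Scalar plant x' = u + d(t) with an unknown disturbance |d| <= D.
# Classic SMC uses u = -k*sign(s) with a conservative fixed bound k;
# the adaptive variant below grows its estimate k_hat online instead.
dt, T = 1e-3, 5.0
x, x_ref = 1.0, 0.0
k_hat, gamma = 0.0, 5.0             # adaptive bound and adaptation gain

for step in range(int(T / dt)):
    d = 0.8 * np.sin(2.0 * step * dt)   # unknown disturbance (for the demo)
    s = x - x_ref                       # sliding surface
    u = -k_hat * np.sign(s)             # adaptive sliding-mode law
    # The bound estimate only ever grows -- the drawback noted in the
    # abstract that motivates the AFNNC scheme.
    k_hat += gamma * abs(s) * dt
    x += (u + d) * dt

print(f"final error {x - x_ref:.4f}, final bound estimate {k_hat:.2f}")
```

    The AFNNC scheme described above effectively replaces the discontinuous sign action with a learned smooth mapping, removing both the chattering and the monotone bound growth.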

  8. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    PubMed

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to tackle this important environmental challenge more successfully. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various types of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem in which uncertainty is pervasive. The model has the advantage of providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
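    The abstract does not give the model's equations, but the core device of chance-constraint programming can be sketched as follows: a probabilistic constraint with a Normal right-hand side is replaced by its deterministic equivalent and the problem is re-solved at several violation-probability levels. All coefficients below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Maximize profit c@x subject to P(a@x <= b) >= alpha, where the
# resource limit b ~ Normal(mu_b, sigma_b).  The deterministic
# equivalent tightens the limit: a@x <= mu_b + sigma_b * Phi^{-1}(1 - alpha).
c = np.array([-120.0, -90.0])    # linprog minimizes, so profit is negated
a = np.array([3.0, 2.0])
mu_b, sigma_b = 100.0, 8.0

for alpha in (0.6, 0.8, 0.95):
    b_eq = mu_b + sigma_b * norm.ppf(1.0 - alpha)
    res = linprog(c, A_ub=[a], b_ub=[b_eq], bounds=[(0, None)] * 2)
    print(f"alpha={alpha}: x={res.x}, profit={-res.fun:.1f}")
```

    As the required satisfaction probability alpha rises, the effective limit shrinks and the solution becomes more conservative, mirroring the trade-off between economic efficiency and system reliability noted above.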

  9. Active learning for clinical text classification: is it better than random sampling?

    PubMed Central

    Figueroa, Rosa L; Ngo, Long H; Goryachev, Sergey; Wiechmann, Eduardo P

    2012-01-01

    Objective This study explores active learning algorithms as a way to reduce the requirements for large training sets in medical text classification tasks. Design Three existing active learning algorithms (distance-based (DIST), diversity-based (DIV), and a combination of both (CMB)) were used to classify text from five datasets. The performance of these algorithms was compared to that of passive learning on the five datasets. We then conducted a novel investigation of the interaction between dataset characteristics and the performance results. Measurements Classification accuracy and area under receiver operating characteristics (ROC) curves for each algorithm at different sample sizes were generated. The performance of active learning algorithms was compared with that of passive learning using a weighted mean of paired differences. To determine why the performance varies on different datasets, we measured the diversity and uncertainty of each dataset using relative entropy and correlated the results with the performance differences. Results The DIST and CMB algorithms performed better than passive learning. With a statistical significance level set at 0.05, DIST outperformed passive learning in all five datasets, while CMB was found to be better than passive learning in four datasets. We found strong correlations between the dataset diversity and the DIV performance, as well as the dataset uncertainty and the performance of the DIST algorithm. Conclusion For medical text classification, appropriate active learning algorithms can yield performance comparable to that of passive learning with considerably smaller training sets. In particular, our results suggest that DIV performs better on data with higher diversity and DIST on data with lower uncertainty. PMID:22707743

  10. pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    White, J.; Brakefield, L. K.

    2015-12-01

    The null-space Monte Carlo technique is a nonlinear uncertainty analysis technique that is well suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files, and even a text editor. pyNSMC is an open-source Python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a Python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in Python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease of use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
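    pyNSMC's own API is not shown in this record, so the sketch below illustrates the underlying null-space Monte Carlo idea directly with NumPy: random parameter deviations are projected onto the null space of the Jacobian so that each realization honours the calibration data to first order. Dimensions and scales are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inverse problem: m observations, n parameters, n >> m, so a large
# null space exists.  J is the Jacobian at the calibrated parameter set.
m, n = 20, 200
J = rng.normal(size=(m, n))
p_cal = rng.normal(size=n)                 # calibrated parameters

# SVD separates the solution space (first r vectors) from the null space.
U, s, Vt = np.linalg.svd(J, full_matrices=True)
r = np.sum(s > 1e-10 * s[0])
V_null = Vt[r:].T                          # basis for the null space

# Draw random parameter realizations and project their deviation from
# the calibrated set onto the null space: each realization then fits
# the observations (to first order) as well as the calibrated model.
ensemble = []
for _ in range(100):
    dp = rng.normal(scale=0.5, size=n)
    ensemble.append(p_cal + V_null @ (V_null.T @ dp))
ensemble = np.array(ensemble)
```

    In practice each realization is then re-run through the forward model, and lightly recalibrated where nonlinearity matters; automating that file handling and execution is exactly the tedium pyNSMC removes.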

  11. Investigation of Multi-Input Multi-Output Robust Control Methods to Handle Parametric Uncertainties in Autopilot Design.

    PubMed

    Kasnakoğlu, Coşku

    2016-01-01

    Some level of uncertainty is unavoidable in acquiring the mass, geometry parameters and stability derivatives of an aerial vehicle. In certain instances, tiny perturbations of these can cause considerable variations in flight characteristics. This research considers the impact of varying these parameters altogether, generalizing studies in the existing literature that examine the effects of particular parameters on selected modes. Conventional autopilot designs commonly assume that each flight channel is independent and develop single-input single-output (SISO) controllers for each, which are then run in parallel in actual flight. It is demonstrated that an attitude controller built this way can function flawlessly in separate nominal cases, yet become unstable under a perturbation of no more than 2%. Two robust multi-input multi-output (MIMO) design strategies, namely loop-shaping and μ-synthesis, are outlined as potential substitutes and are observed to handle large parametric changes of 30% while preserving good performance. Duplicating the loop-shaping procedure for the outer loop, a complete flight control system is formed. It is confirmed through software-in-the-loop (SIL) verifications utilizing blade element theory (BET) that the autopilot is capable of navigation and landing when exposed to high parametric variations and strong winds.
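    A toy version of the fragility experiment described above can be run by Monte Carlo sampling of joint parametric perturbations of a closed-loop state matrix and checking its eigenvalues; the matrix below is invented and far simpler than the aircraft model, so it demonstrates only the procedure, not the paper's 2% result.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stable closed-loop matrix standing in for one attitude channel.
A_cl = np.array([[-0.5,  1.0,  0.0],
                 [-2.0, -0.8,  0.5],
                 [ 0.0, -1.5, -0.3]])

def is_unstable(A):
    # Continuous-time stability check: any eigenvalue in the right half-plane?
    return np.max(np.linalg.eigvals(A).real) > 0.0

# Perturb all entries together, as in the paper's joint-variation study.
for level in (0.02, 0.30):
    hits = sum(
        is_unstable(A_cl * (1.0 + level * rng.uniform(-1, 1, A_cl.shape)))
        for _ in range(5_000)
    )
    print(f"+/-{level:.0%} perturbation: unstable in {hits}/5000 samples")
```

    On a fragile SISO-by-channel design the same sweep exposes instability at small perturbation levels; robust MIMO syntheses such as loop-shaping are built precisely to keep this count at zero over a stated uncertainty set.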

  12. Investigation of Multi-Input Multi-Output Robust Control Methods to Handle Parametric Uncertainties in Autopilot Design

    PubMed Central

    Kasnakoğlu, Coşku

    2016-01-01

    Some level of uncertainty is unavoidable in acquiring the mass, geometry parameters and stability derivatives of an aerial vehicle. In certain instances, tiny perturbations of these can cause considerable variations in flight characteristics. This research considers the impact of varying these parameters altogether, generalizing studies in the existing literature that examine the effects of particular parameters on selected modes. Conventional autopilot designs commonly assume that each flight channel is independent and develop single-input single-output (SISO) controllers for each, which are then run in parallel in actual flight. It is demonstrated that an attitude controller built this way can function flawlessly in separate nominal cases, yet become unstable under a perturbation of no more than 2%. Two robust multi-input multi-output (MIMO) design strategies, namely loop-shaping and μ-synthesis, are outlined as potential substitutes and are observed to handle large parametric changes of 30% while preserving good performance. Duplicating the loop-shaping procedure for the outer loop, a complete flight control system is formed. It is confirmed through software-in-the-loop (SIL) verifications utilizing blade element theory (BET) that the autopilot is capable of navigation and landing when exposed to high parametric variations and strong winds. PMID:27783706

  13. Role of future scenarios in understanding deep uncertainty in ...

    EPA Pesticide Factsheets

    The environment and its interactions with human systems, whether economic, social or political, are complex. Relevant drivers may disrupt system dynamics in unforeseen ways, making it difficult to predict future conditions. This kind of deep uncertainty presents a challenge to organizations faced with making decisions about the future, including those involved in air quality management. Scenario Planning is a structured process that involves the development of narratives describing alternative future states of the world, designed to differ with respect to the most critical and uncertain drivers. The resulting scenarios are then used to understand the consequences of those futures and to prepare for them with robust management strategies. We demonstrate a novel air quality management application of Scenario Planning. Through a series of workshops, important air quality drivers were identified. The most critical and uncertain drivers were found to be “technological development” and “change in societal paradigms.” These drivers were used as a basis to develop four distinct scenario storylines. The energy and emission implications of each storyline were then modeled using the MARKAL energy system model. NOX and SO2 emissions were found to decrease for all scenarios, largely a response to existing air quality regulations. Future-year emissions differed considerably from one scenario to another, however, with key differentiating factors being transition

  14. Applying Aggregate Exposure Pathway and Adverse Outcome ...

    EPA Pesticide Factsheets

    Hazard assessment for nanomaterials often involves applying in vitro dose-response data to estimate potential health risks that arise from exposure to products that contain nanomaterials. However, much uncertainty is inherent in relating bioactivities observed in an in vitro system to the perturbations of biological mechanisms that lead to apical adverse health outcomes in living organisms. The Adverse Outcome Pathway (AOP) framework addresses this uncertainty by acting as a scaffold onto which in vitro toxicity testing and other data can be arranged to aid in the interpretation of these results in terms of biologically-relevant responses, as an AOP connects an upstream molecular initiating event (MIE) to a downstream adverse outcome. In addition to hazard assessment, risk estimation also requires reconciling in vitro concentrations sufficient to produce bioactivity with in vivo concentrations that can trigger a MIE at the relevant biological target. Such target site exposures (TSEs) can be estimated by integrating pharmacokinetic considerations with environmental and exposure factors. Environmental and exposure data have been historically scattered in various resources, such as monitoring data for air pollutants or exposure models for specific chemicals. The Aggregate Exposure Pathway (AEP) framework is introduced to organize existing knowledge concerning biologically, chemically, and physically plausible, as well as empirically supported, links between the i

  15. The time-delay signature of quark-gluon plasma formation in relativistic nuclear collisions

    NASA Astrophysics Data System (ADS)

    Rischke, Dirk H.; Gyulassy, Miklos

    1996-02-01

    The hydrodynamic expansion of quark-gluon plasmas with spherical and longitudinally boost-invariant geometries is studied as a function of the initial energy density. The sensitivity of the collective flow pattern to uncertainties in the nuclear matter equation of state is explored. We concentrate on the effect of a possible finite width, ΔT ∼ 0.1 Tc, of the transition region between the quark-gluon plasma and hadronic phases. Although slow deflagration solutions that act to stall the expansion do not exist for ΔT > 0.08 Tc, we find, nevertheless, that the equation of state remains sufficiently soft in the transition region to delay the propagation of ordinary rarefaction waves for a considerable time. We compute the dependence of the pion-interferometry correlation function on ΔT, since this is the most promising observable for time-delayed expansion. The signature of time delay, proposed by Pratt and Bertsch, is an enhancement of the ratio of the inverse width of the pion correlation function in the out-direction to that in the side-direction. One of our main results is that this generic signature of quark-gluon plasma formation is rather robust to the uncertainties in the width of the transition region. Furthermore, for longitudinally boost-invariant geometries, the signal is likely to be maximized around RHIC energies.

  16. An Enhanced Adaptive Management Approach for Remediation of Legacy Mercury in the South River

    PubMed Central

    Foran, Christy M.; Baker, Kelsie M.; Grosso, Nancy R.; Linkov, Igor

    2015-01-01

    Uncertainties about future conditions and the effects of chosen actions, as well as increasing resource scarcity, have been driving forces in the utilization of adaptive management strategies. However, many applications of adaptive management have been criticized for a number of shortcomings, including a limited ability to learn from actions and a lack of consideration of stakeholder objectives. To address these criticisms, we supplement existing adaptive management approaches with a decision-analytical approach that first informs the initial selection of management alternatives and then allows for periodic re-evaluation or phased implementation of management alternatives based on monitoring information and incorporation of stakeholder values. We describe the application of this enhanced adaptive management (EAM) framework to compare remedial alternatives for mercury in the South River, based on an understanding of the loading and behavior of mercury in the South River near Waynesboro, VA. The outcomes show that the ranking of remedial alternatives is influenced by uncertainty in the mercury loading model, by the relative importance placed on different criteria, and by cost estimates. The process itself demonstrates that a decision model can link project performance criteria, decision-maker preferences, environmental models, and short- and long-term monitoring information with management choices to help shape a remediation approach that provides useful information for adaptive, incremental implementation. PMID:25665032
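    As a sketch of how a decision-analytical layer can re-rank remedial alternatives under uncertain stakeholder preferences, the snippet below perturbs criteria weights and tallies how often each alternative ranks first; the score matrix, criteria, and weights are hypothetical, not the South River values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Scores of four hypothetical remedial alternatives against three
# criteria (e.g. mercury-load reduction, cost, habitat disturbance),
# each scaled to [0, 1] with larger = better.  Values are illustrative.
scores = np.array([[0.8, 0.3, 0.6],
                   [0.6, 0.7, 0.5],
                   [0.4, 0.9, 0.7],
                   [0.7, 0.5, 0.4]])
weights = np.array([0.5, 0.3, 0.2])        # nominal stakeholder importance

rank_counts = np.zeros(len(scores))
for _ in range(5_000):
    # Perturb the weights to mimic uncertainty in stakeholder values
    w = rng.dirichlet(50 * weights)
    rank_counts[np.argmax(scores @ w)] += 1

print("P(alternative ranked first):", rank_counts / rank_counts.sum())
```

    Re-running this tally as monitoring narrows the score uncertainties is one simple way to operationalize the periodic re-evaluation step of the EAM framework.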

  17. Pore fluids and the LGM ocean salinity-Reconsidered

    NASA Astrophysics Data System (ADS)

    Wunsch, Carl

    2016-03-01

    Pore fluid chlorinity/salinity data from deep-sea cores related to the salinity maximum of the last glacial maximum (LGM) are analyzed using estimation methods deriving from linear control theory. With conventional diffusion coefficient values and no vertical advection, results show a very strong dependence upon initial conditions at -100 ky. Earlier inferences that the abyssal Southern Ocean was strongly salt-stratified in the LGM with a relatively fresh North Atlantic Ocean are found to be consistent within uncertainties of the salinity determination, which remain of order ±1 g/kg. However, an LGM Southern Ocean abyss with an important relative excess of salt is an assumption, one not required by existing core data. None of the present results show statistically significant abyssal salinity values above the global average, and results remain consistent, apart from a general increase owing to diminished sea level, with a more conventional salinity distribution having deep values lower than the global mean. The Southern Ocean core does show a higher salinity than the North Atlantic one on the Bermuda Rise at different water depths. Although much more sophisticated models of the pore-fluid salinity can be used, they will only increase the resulting uncertainties, unless considerably more data can be obtained. Results are consistent with complex regional variations in abyssal salinity during deglaciation, but none are statistically significant.

  18. A Review of Flood Loss Models as Basis for Harmonization and Benchmarking

    PubMed Central

    Kreibich, Heidi; Franco, Guillermo; Marechal, David

    2016-01-01

    Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning of their quality. This requires an assessment of the validity and robustness of loss models, as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Because detailed and reliable flood loss data are lacking, first-order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. It is checked whether the models are informed by existing data and knowledge, and whether the assumptions made in the models align with existing knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before such benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous, and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. As an example, this paper presents an approach for a quantitative comparison of disparate models via reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges that exist in model harmonization and in the application of the inventory in a benchmarking framework are discussed. PMID:27454604

  19. A Review of Flood Loss Models as Basis for Harmonization and Benchmarking.

    PubMed

    Gerl, Tina; Kreibich, Heidi; Franco, Guillermo; Marechal, David; Schröter, Kai

    2016-01-01

    Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning of their quality. This requires an assessment of the validity and robustness of loss models, as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Because detailed and reliable flood loss data are lacking, first-order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. It is checked whether the models are informed by existing data and knowledge, and whether the assumptions made in the models align with existing knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before such benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous, and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. As an example, this paper presents an approach for a quantitative comparison of disparate models via reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges that exist in model harmonization and in the application of the inventory in a benchmarking framework are discussed.

  20. Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System

    DTIC Science & Technology

    2010-09-13

    model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of

  1. Multidecadal Scale Detection Time for Potentially Increasing Atlantic Storm Surges in a Warming Climate

    NASA Astrophysics Data System (ADS)

    Lee, Benjamin Seiyon; Haran, Murali; Keller, Klaus

    2017-10-01

    Storm surges are key drivers of coastal flooding, which generate considerable risks. Strategies to manage these risks can hinge on the ability to (i) project the return periods of extreme storm surges and (ii) detect potential changes in their statistical properties. There are several lines of evidence linking rising global average temperatures and increasingly frequent extreme storm surges. This conclusion is, however, subject to considerable structural uncertainty. This leads to two main questions: What are projections under various plausible statistical models? How long would it take to distinguish among these plausible statistical models? We address these questions by analyzing observed and simulated storm surge data. We find that (1) there is a positive correlation between global mean temperature rise and increasing frequencies of extreme storm surges; (2) there is considerable uncertainty underlying the strength of this relationship; and (3) if the frequency of storm surges is increasing, this increase can be detected within a multidecadal timescale (≈20 years from now).

  2. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies difference in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  3. Estimation statistique de données manquantes en inventaire du cycle de vie

    NASA Astrophysics Data System (ADS)

    Moreau, Vincent

    The main objective of this research work is to improve the quality of life cycle inventory data by developing a method to estimate missing data and the corresponding uncertainties. In contrast to process-based models of mass and energy balance, this approach consists of statistical estimators which model processes from relatively small samples of usually high variability. The research hypothesis is as follows: the so-called kriging estimator allows the combined estimation of missing data and their uncertainties in ways that are more reliable than other linear estimators. Borrowed from spatial statistics, kriging is an estimator with several advantages, including the flexibility associated with the choice of model function and the exact-estimator property. In other words, kriging shows no statistical error when estimating observed values; no data are averaged out. An interpretation of the kriging parameters specific to the problem of data uncertainty offers further advantages. One parameter of the covariance function accounts for small-scale variations of the data and is taken as a proxy for uncertainty. Whether it be the variety of data sources, the scarcity of the data itself or both, each source adds to data variability and uncertainty. The kriging system of equations is therefore modified so as to integrate a factor of uncertainty specific to each observation. Comparisons between the modified and conventional forms of kriging can then be drawn. The procedure is based on the relationship between technical specifications, which are more readily available independent variables, and the dependent material and energy flows of the processes under consideration. Such material and energy requirements, as well as emissions, are estimated over the entire life cycle of the products and processes. The need for additional data is relatively low compared to other approaches, namely extended input-output analysis. For many products, processes and services, electricity generation and consumption account for a sizable share of the impacts. Hydroelectricity in particular is poorly represented within existing inventory data, since production facilities vary considerably from one location to another. In other words, generic hydropower plants do not exist. Contrary to inventory flows, the technical specifications or characteristic variables of hydropower plants, such as the installed capacity, annual production or surface area of adjacent reservoirs, are usually publicly available. The kriging model is first tested on a data set representing windmills of varying power capacity before it is applied to hydroelectricity. The experiments are divided according to data availability, beginning with the energy and materials required during construction, operation and maintenance of hydropower plants. The results show that the estimation of inventory data can be improved thanks to kriging. When comparing different forms of kriging and linear regression, the kriging estimates are not only more precise, but their standard deviations also cover the data more accurately. Where the observed data are incomplete, that is, where inventory flows are missing for part of the observations, the estimation errors are lower for kriging than for linear regression. Moreover, univariate kriging of inventory flows based on two characteristic variables shows lower errors than its multivariate kin, cokriging.
On average, the statistical errors calculated from cross-validation are lower for kriging than for linear regression, whether the observed data are complete or not. The use of several characteristic variables improves the quality of the estimates when they are positively correlated. In addition, the modified form of kriging, which accounts for degrees of uncertainty specific to each observation, results in a reduction in the variation of the estimated inventory data. That is, data variability is incorporated directly in the model. Estimates closer to more reliable observations are shown to be less uncertain, and vice versa. For each of the data sets, different relationships between dependent and independent variables are tested, for example the linear, exponential, spherical and cubic covariance functions, as well as a range of parameter values. For the analysis of electricity generation technologies, these results imply better estimates for data that are difficult to sample, and therefore a simplified data collection process. In the case of site-specific or variable processes such as hydroelectricity, the estimation of inventory data with kriging that accounts for such data variability proves more representative of the geographical or technological context. The quality of the inventory data is consequently higher. Even if kriging has several advantages and its estimation errors are lower on average, some limitations to its application exist. (Abstract shortened by UMI.)
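    A minimal ordinary-kriging estimator of the kind discussed in this thesis can be written in a few lines of NumPy; the covariance model, the nugget term standing in for observation-specific uncertainty, and the capacity/material example values are illustrative assumptions.

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_len=1.0):
    # Exponential covariance model: C(h) = sill * exp(-h / corr_len)
    return sill * np.exp(-h / corr_len)

def ordinary_kriging(x_obs, y_obs, x_new, sill=1.0, corr_len=1.0, nugget=0.0):
    n = len(x_obs)
    # Covariance among observations; the nugget on the diagonal plays the
    # role of an observation-specific uncertainty term, loosely analogous
    # to the modified kriging discussed above.
    d = np.abs(x_obs[:, None] - x_obs[None, :])
    C = exp_cov(d, sill, corr_len) + nugget * np.eye(n)
    # Ordinary-kriging system with the unbiasedness (Lagrange) constraint
    A = np.block([[C, np.ones((n, 1))], [np.ones((1, n)), np.zeros((1, 1))]])
    c0 = exp_cov(np.abs(x_obs - x_new), sill, corr_len)
    w = np.linalg.solve(A, np.append(c0, 1.0))
    estimate = w[:n] @ y_obs
    variance = sill - w[:n] @ c0 - w[n]   # kriging (estimation) variance
    return estimate, variance

# Hypothetical use: predict an inventory flow from a characteristic variable.
x_obs = np.array([10.0, 50.0, 120.0, 300.0])   # e.g. installed capacity (MW)
y_obs = np.array([2.1, 7.9, 19.5, 47.0])       # e.g. kt of concrete per plant
print(ordinary_kriging(x_obs, y_obs, x_new=80.0, corr_len=100.0))
```

    The exact-estimator property mentioned above follows directly: at an observed location the system returns the observation itself with zero kriging variance (when the nugget is zero).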

  4. Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishma

    2010-01-01

    In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
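    A cartoon of the structural-uncertainty ensemble idea: several stand-in "models" of the same flux are run on identical inputs, and the spread of their outputs is taken as a proxy for structural uncertainty. The functions below are invented toys, not TOPS components.

```python
import numpy as np

# Three stand-in "models" predicting the same flux from identical
# forcing: same inputs, different structures -- a cartoon of the
# structural uncertainty discussed above.
def model_a(par, t): return 0.020 * par * (1 - np.exp(-0.5 * t))
def model_b(par, t): return 0.018 * par * t / (t + 1.0)
def model_c(par, t): return 0.021 * par * np.tanh(0.4 * t)

par = np.linspace(100, 1000, 50)           # forcing grid (arbitrary units)
t = 3.0
ens = np.stack([m(par, t) for m in (model_a, model_b, model_c)])

mean = ens.mean(axis=0)                    # ensemble central estimate
spread = ens.std(axis=0)                   # structural-uncertainty proxy
```

    As the abstract notes, a small ensemble like this only begins to capture the range of possible outputs; initialization, parameter, and input uncertainties would each add further members or perturbations.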

  5. Uncertainty representation of grey numbers and grey sets.

    PubMed

    Yang, Yingjie; Liu, Sifeng; John, Robert

    2014-09-01

    In the literature, there is a presumption that a grey set and an interval-valued fuzzy set are equivalent. This presumption ignores the existence of discrete components in a grey number. In this paper, new measurements of uncertainties of grey numbers and grey sets, consisting of both absolute and relative uncertainties, are defined to give a comprehensive representation of uncertainties in a grey number and a grey set. Some simple examples are provided to illustrate that the proposed uncertainty measurement can give an effective representation of both absolute and relative uncertainties in a grey number and a grey set. The relationships between grey sets and interval-valued fuzzy sets are also analyzed from the point of view of the proposed uncertainty representation. The analysis demonstrates that grey sets and interval-valued fuzzy sets provide different but overlapping models for uncertainty representation in sets.
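    The paper's exact measurements are not reproduced in this record, so the sketch below only illustrates the underlying idea under stated assumptions: a grey number is represented as a union of intervals (degenerate intervals being the discrete components that distinguish it from a plain interval-valued quantity), with an absolute uncertainty given by the covered length and a relative uncertainty normalized by the universe of discourse. These measure definitions are illustrative, not the authors'.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GreyNumber:
    # A grey number as a union of disjoint intervals; degenerate
    # intervals (a == b) stand for the discrete components noted above.
    components: List[Tuple[float, float]]

    def absolute_uncertainty(self) -> float:
        # Total length covered by the continuous components (assumed measure).
        return sum(b - a for a, b in self.components)

    def relative_uncertainty(self, universe: Tuple[float, float]) -> float:
        # Fraction of the universe of discourse left undetermined.
        lo, hi = universe
        return self.absolute_uncertainty() / (hi - lo)

g = GreyNumber([(0.0, 1.0), (2.0, 2.0), (3.0, 4.5)])  # mixed continuous/discrete
print(g.absolute_uncertainty(), g.relative_uncertainty((0.0, 10.0)))
```

    An interval-valued fuzzy set cannot carry the isolated point (2.0, 2.0) in this example, which is the distinction the paper draws between the two representations.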

  6. Tracking the global generation and exports of e-waste. Do existing estimates add up?

    PubMed

    Breivik, Knut; Armitage, James M; Wania, Frank; Jones, Kevin C

    2014-01-01

    The transport of discarded electronic and electrical appliances (e-waste) to developing regions has received considerable attention, but it is difficult to assess the significance of this issue without a quantitative understanding of the amounts involved. The main objective of this study is to track the global transport of e-wastes by compiling and constraining existing estimates of the amount of e-waste generated domestically in each country (MGEN), exported from countries belonging to the Organization for Economic Cooperation and Development (OECD) (MEXP), and imported by countries outside of the OECD (MIMP). The reference year is 2005 and all estimates are given with an uncertainty range. Estimates of MGEN obtained by apportioning a global total of ∼ 35,000 kt (range 20,000-50,000 kt) based on a nation's gross domestic product agree well with independent estimates of MGEN for individual countries. Import estimates MIMP for the countries believed to be the major recipients of e-waste exports from the OECD globally (China, India, and five West African countries) suggest that ∼ 5,000 kt (3,600 kt-7,300 kt) may have been imported annually to these non-OECD countries alone, which represents ∼ 23% (17%-34%) of the amount of e-waste generated domestically within the OECD. MEXP for each OECD country is then estimated by applying this fraction of 23% to its MGEN. By allocating each country's MGEN, MIMP, MEXP and MNET = MGEN + MIMP - MEXP, we can map the global generation and flows of e-waste from OECD to non-OECD countries. While significant uncertainties remain, we note that estimated imports into seven non-OECD countries alone are often at the higher end of estimates of exports from OECD countries.
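    The bookkeeping identity in the abstract is simple enough to sketch directly; only the 23% export fraction and the relation MNET = MGEN + MIMP - MEXP come from the study, while the per-country rows below are hypothetical.

```python
# Reproducing the study's bookkeeping with round illustrative numbers
# (kt/yr); only EXPORT_FRACTION and the M_NET identity are from the study.
EXPORT_FRACTION = 0.23          # share of OECD-generated e-waste exported

countries = {
    # name: (M_GEN, M_IMP, is_OECD)
    "OECD-A":    (8000.0,    0.0, True),
    "OECD-B":    (5000.0,    0.0, True),
    "nonOECD-C": (3000.0, 2500.0, False),
}

for name, (m_gen, m_imp, oecd) in countries.items():
    m_exp = EXPORT_FRACTION * m_gen if oecd else 0.0
    m_net = m_gen + m_imp - m_exp   # e-waste requiring domestic treatment
    print(f"{name}: M_NET = {m_net:.0f} kt/yr")
```

    Propagating the study's stated ranges (e.g. 17%-34% for the export fraction) through the same identity is what produces the uncertainty bounds attached to each flow.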

  7. Accuracy assessment for a multi-parameter optical calliper in on line automotive applications

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2017-08-01

    In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications, as measurement requirements become more stringent. In particular, with reference to the industrial production and control strategies of high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed and compared. The models are based on an integrated approach between measurement methods and production best practices, to emphasize their mutual coherence. The paper shows the possible advantages deriving from measurement uncertainty modelling in keeping the uncertainty propagation under control for all the indirect measurements used in statistical production control, on which further improvements can be based.

  8. Dynamics of entanglement and uncertainty relation in coupled harmonic oscillator system: exact results

    NASA Astrophysics Data System (ADS)

    Park, DaeKil

    2018-06-01

    The dynamics of entanglement and uncertainty relation is explored by solving the time-dependent Schrödinger equation for coupled harmonic oscillator system analytically when the angular frequencies and coupling constant are arbitrarily time dependent. We derive the spectral and Schmidt decompositions for vacuum solution. Using the decompositions, we derive the analytical expressions for von Neumann and Rényi entropies. Making use of Wigner distribution function defined in phase space, we derive the time dependence of position-momentum uncertainty relations. To show the dynamics of entanglement and uncertainty relation graphically, we introduce two toy models and one realistic quenched model. While the dynamics can be conjectured by simple consideration in the toy models, the dynamics in the realistic quenched model is somewhat different from that in the toy models. In particular, the dynamics of entanglement exhibits similar pattern to dynamics of uncertainty parameter in the realistic quenched model.

  9. Nonlocal quantum macroscopic superposition in a high-thermal low-purity state

    PubMed Central

    Brezinski, Mark E.; Liu, Bin

    2013-01-01

    Quantum state exchange between light and matter is an important ingredient for future quantum information networks as well as other applications. Photons are the fastest and simplest carriers of information for transmission but, in general, it is difficult to localize and store photons, so usually one prefers choosing matter as quantum memory elements. Macroscopic superposition and nonlocal quantum interactions have received considerable interest for this purpose over recent years in fields ranging from quantum computers to cryptography, in addition to providing major insights into physical laws. However, these experiments are generally performed either with equipment or under conditions that are unrealistic for practical applications. Ideally, the two can be combined using conventional equipment and conditions to generate a “quantum teleportation”-like state, particularly with a very small amount of purity existing in an overall highly mixed thermal state (relatively low decoherence at high temperatures). In this study we used an experimental design to demonstrate these principles. We performed optical coherence tomography (OCT) at room temperature, using a thermal source, on a specifically designed target in the sample arm. Here, position uncertainty (i.e., dispersion) was induced in the reference arm. In the sample arm (target) we placed two glass plates separated by a different medium while altering position uncertainty in the reference arm. This resulted in a chirped signal between the glass plate reflective surfaces in the combined interferogram. The chirping frequency, as measured by the fast Fourier transform (FFT), varies with the medium between the plates, which is a nonclassical phenomenon. These results are statistically significant and arise from a superposition between the glass surface and the medium with increasing position uncertainty, a true quantum-mechanical phenomenon produced by photon pressure from two-photon interference. The differences in chirping frequency with medium disappear when second-order correlations are removed by dual balanced detection, confirming the proposed mechanism. We demonstrated that increasing position uncertainty at one site leads to position uncertainty (quantum position probability amplitude) nonlocally via second-order correlations (two-photon probability amplitude) from a low-coherence thermal source (low purity, high local entropy). The implications, first, are that the phenomenon cannot be explained through classical mechanisms but can be explained within the context of quantum mechanics, particularly relevant to the second-order correlations where controversy exists. More specifically, we provide the theoretical framework that these results indicate a nonlocal macroscopic superposition occurring through a two-photon probability amplitude-induced increase in the target position probability amplitude uncertainty. In addition, as the experiments were performed with a classical source at room temperature, they support both the quantum-mechanical nature of second-order correlations and the attainability of macroscopic superposition in a target not in a single coherent state (a mixed state). Future work will focus on generalizing the observations outside the current experimental design and creating embodiments that allow practical application of the phenomenon. PMID:24204102

  10. Nonlocal quantum macroscopic superposition in a high-thermal low-purity state.

    PubMed

    Brezinski, Mark E; Liu, Bin

    2008-12-16

    Quantum state exchange between light and matter is an important ingredient for future quantum information networks as well as other applications. Photons are the fastest and simplest carriers of information for transmission but, in general, it is difficult to localize and store photons, so usually one prefers choosing matter as quantum memory elements. Macroscopic superposition and nonlocal quantum interactions have received considerable interest for this purpose over recent years in fields ranging from quantum computers to cryptography, in addition to providing major insights into physical laws. However, these experiments are generally performed either with equipment or under conditions that are unrealistic for practical applications. Ideally, the two can be combined using conventional equipment and conditions to generate a "quantum teleportation"-like state, particularly with a very small amount of purity existing in an overall highly mixed thermal state (relatively low decoherence at high temperatures). In this study we used an experimental design to demonstrate these principles. We performed optical coherence tomography (OCT) at room temperature, using a thermal source, on a specifically designed target in the sample arm. Here, position uncertainty (i.e., dispersion) was induced in the reference arm. In the sample arm (target) we placed two glass plates separated by a different medium while altering position uncertainty in the reference arm. This resulted in a chirped signal between the glass plate reflective surfaces in the combined interferogram. The chirping frequency, as measured by the fast Fourier transform (FFT), varies with the medium between the plates, which is a nonclassical phenomenon. These results are statistically significant and arise from a superposition between the glass surface and the medium with increasing position uncertainty, a true quantum-mechanical phenomenon produced by photon pressure from two-photon interference. The differences in chirping frequency with medium disappear when second-order correlations are removed by dual balanced detection, confirming the proposed mechanism. We demonstrated that increasing position uncertainty at one site leads to position uncertainty (quantum position probability amplitude) nonlocally via second-order correlations (two-photon probability amplitude) from a low-coherence thermal source (low purity, high local entropy). The implications, first, are that the phenomenon cannot be explained through classical mechanisms but can be explained within the context of quantum mechanics, particularly relevant to the second-order correlations where controversy exists. More specifically, we provide the theoretical framework that these results indicate a nonlocal macroscopic superposition occurring through a two-photon probability amplitude-induced increase in the target position probability amplitude uncertainty. In addition, as the experiments were performed with a classical source at room temperature, they support both the quantum-mechanical nature of second-order correlations and the attainability of macroscopic superposition in a target not in a single coherent state (a mixed state). Future work will focus on generalizing the observations outside the current experimental design and creating embodiments that allow practical application of the phenomenon.

  11. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence–absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence–absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.
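    A toy rendering of the information-gap idea: degrade the occurrence probabilities by a growing horizon of uncertainty and report the largest horizon at which a candidate reserve still meets a performance demand. The probabilities, uncertainty weights, and demand level below are all invented.

```python
import numpy as np

# Occurrence probabilities p[i, j] of species j in site i (best guess)
# and per-entry uncertainty weights u[i, j]; both hypothetical.
p = np.array([[0.9, 0.1], [0.4, 0.8], [0.7, 0.6], [0.2, 0.9]])
u = np.array([[0.2, 0.1], [0.3, 0.2], [0.2, 0.3], [0.1, 0.2]])

def worst_case_value(sites, alpha):
    # Info-gap style: degrade each probability by alpha * u and score
    # the reserve by its worst-case expected number of species covered.
    p_lo = np.clip(p[sites] - alpha * u[sites], 0.0, 1.0)
    return np.sum(1.0 - np.prod(1.0 - p_lo, axis=0))

def robustness(sites, demand, alphas=np.linspace(0, 3, 301)):
    # Largest horizon of uncertainty alpha at which the candidate
    # reserve still meets the performance demand.
    ok = [a for a in alphas if worst_case_value(sites, a) >= demand]
    return max(ok) if ok else 0.0

# Compare two equally sized reserve designs by robustness rather than
# by nominal (alpha = 0) value alone.
print(robustness([0, 1], demand=1.2), robustness([2, 3], demand=1.2))
```

    This is the sense in which two options of equal nominal value can differ: the preferred design is the one whose performance degrades more slowly as the uncertainty horizon grows.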

  12. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on a set of STSIS realizations, we assessed various types of mapping uncertainty, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
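
    The uncertainty-mapping step lends itself to a short sketch: given an ensemble of simulated realizations (synthetic stand-ins here, not actual STSIS output), per-cell exceedance probabilities and interval widths summarize the mapping uncertainty.

```python
# Turn an ensemble of simulated concentration fields into an uncertainty map:
# the probability that PM2.5 exceeds a threshold at each grid cell.
import numpy as np

rng = np.random.default_rng(2)
n_real, ny, nx = 200, 40, 60
realizations = rng.gamma(shape=4.0, scale=20.0, size=(n_real, ny, nx))  # ug/m3

threshold = 75.0                                     # e.g., a daily standard
p_exceed = (realizations > threshold).mean(axis=0)   # per-cell probability

# Single-location uncertainty: spread of simulated values at one cell.
iy, ix = 20, 30
lo, hi = np.percentile(realizations[:, iy, ix], [5, 95])
print(f"P(exceed) at cell: {p_exceed[iy, ix]:.2f}; 90% interval: {lo:.0f}-{hi:.0f}")
```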

  13. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on a set of STSIS realizations, we assessed various types of mapping uncertainty, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  14. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty, that which is inherent in the analytical tools utilized during the design process. Yet, few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state of the art in design, the effects of this unquantifiable uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method, a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian statistics to quantify this source of uncertainty, and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in the assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. Application to a single-stage-to-orbit (SSTO) reusable launch vehicle concept, developed by the NASA Langley Research Center under the Space Launch Initiative, provides the validation case for this work, with the focus placed on economics, aerothermodynamics, propulsion, and structures metrics. (Abstract shortened by UMI.)
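
    A toy sketch of the quantification step, under an assumed conjugate normal model: discrepancies between tool predictions and physical data for existing systems yield a posterior on the tool's bias, which is then propagated onto a new design's metric. All numbers are hypothetical.

```python
# Infer an analysis tool's bias from tool-vs-measurement discrepancies on
# existing systems, then propagate that uncertainty to a new design's metric.
import numpy as np

tool_pred = np.array([1.02, 0.97, 1.10, 1.05])   # tool output, existing systems
measured = np.array([1.00, 1.01, 1.04, 1.00])    # physical data, same systems
disc = measured - tool_pred                      # observed discrepancies

# Conjugate update for the mean bias (known-variance normal model, weak prior)
sigma = disc.std(ddof=1)                 # plug-in discrepancy scale
mu0, tau0 = 0.0, 1.0                     # prior: no bias, weak confidence
n = disc.size
tau_post = 1.0 / (1.0 / tau0**2 + n / sigma**2)     # posterior variance
mu_post = tau_post * (mu0 / tau0**2 + disc.sum() / sigma**2)

# Posterior predictive for the new vehicle's discipline-level metric
rng = np.random.default_rng(3)
new_tool_value = 1.08
samples = new_tool_value + rng.normal(mu_post, np.sqrt(tau_post + sigma**2), 10000)
print(f"corrected metric: {samples.mean():.3f} +/- {samples.std():.3f}")
```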

  15. Trends in total column ozone measurements

    NASA Technical Reports Server (NTRS)

    Rowland, F. S.; Angell, J.; Attmannspacher, W.; Bloomfield, P.; Bojkov, R. D.; Harris, N.; Komhyr, W.; Mcfarland, M.; Mcpeters, R.; Stolarski, R. S.

    1989-01-01

    It is important to ensure that the best available data are used in any determination of possible trends in total ozone, in order to obtain the most accurate estimates of the trends and their associated uncertainties. Accordingly, the existing total ozone records were examined in considerable detail. Once the best data set has been produced, the statistical analysis must examine the data for any effects that might indicate changes in the behavior of global total ozone. The changes at any individual measuring station could be local in nature; herein, particular attention was paid to the seasonal and latitudinal variations of total ozone, because two-dimensional photochemical models indicate that any changes in total ozone would be most pronounced at high latitudes during the winter months. The conclusions derived from this detailed examination of the available total ozone record can be split into two categories, one concerning the quality and the other the statistical analysis of the total ozone record.

  16. Towards the certification of the purity of calibrant reference materials for thyroid hormones: a chicken and egg dilemma.

    PubMed

    Toussaint, B; Schimmel, H; Klein, C L; Wiergowski, M; Emons, H

    2007-07-13

    The certification of the purity of CRMs intended for calibration, where no other certified material already exists for comparison, raises fundamental questions about how to determine the purity of a "first" calibrant in the calibration hierarchy. We developed and certified two calibration CRMs for purity in thyroid hormones, taking into consideration inorganic residues, residual solvents and organic impurities detectable by HPLC-UV and HPLC-MS. IRMM-468 was certified for a thyroxine (T4) mass fraction of 98.6 ± 0.7% and IRMM-469 was certified for a 3,3',5-triiodothyronine (T3) mass fraction of 97.1 ± 0.7%. The approach we used aims to determine the purity of these two CRMs to the best of our knowledge, taking all scientific aspects properly into account in the estimation of the uncertainty of the stated purity.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marnay, Chris; Siddiqui, Afzal

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.
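
    For concreteness, a sketch of a canonical real-options trigger (the McDonald-Siegel threshold) follows; the record's cost-threshold setting is analogous but not identical, and all parameter values are invented.

```python
# Invest when project value V, following geometric Brownian motion, first
# exceeds V* = beta/(beta-1) * I, where beta is the positive root of
# 0.5*sigma^2*b*(b-1) + (r - delta)*b - r = 0.
import math

r, delta, sigma = 0.05, 0.03, 0.25   # discount rate, payout rate, volatility
I = 1.0e6                            # sunk investment cost for the DG unit ($)

a = 0.5 * sigma**2
b = (r - delta) - a
beta = (-b + math.sqrt(b * b + 4 * a * r)) / (2 * a)

v_star = beta / (beta - 1.0) * I
print(f"beta = {beta:.3f}, invest when project value exceeds ${v_star:,.0f}")
# Note the option premium: the trigger exceeds the NPV break-even value I,
# which is why volatility delays investment while raising the option's value.
```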

  18. Methodology for Evaluating Security Controls Based on Key Performance Indicators and Stakeholder Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, Frederick T; Abercrombie, Robert K; Mili, Ali

    2009-01-01

    Information security continues to evolve in response to disruptive changes with a persistent focus on information-centric controls and a healthy debate about balancing endpoint and network protection, with a goal of improved enterprise/business risk management. Economic uncertainty, intensively collaborative styles of work, virtualization, increased outsourcing and ongoing compliance pressures require careful consideration and adaptation. This paper proposes a Cyberspace Security Econometrics System (CSES) that provides a measure (i.e., a quantitative indication) of reliability, performance and/or safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. For a given stakeholder, CSES reflects the variance that may exist among the stakes she/he attaches to meeting each requirement. This paper introduces the basis, objectives and capabilities for the CSES, including inputs/outputs as well as the structural and mathematical underpinnings.
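
    A hypothetical sketch of a stake-weighted expected-loss computation in the spirit of CSES (the actual system is richer): stakes times requirement-failure probabilities give each stakeholder's expected loss. The matrices are illustrative placeholders.

```python
# Each stakeholder's "mean failure cost" is the sum over requirements of the
# stake attached to a requirement times the probability the system fails it.
import numpy as np

stakeholders = ["operator", "customer", "regulator"]
requirements = ["confidentiality", "integrity", "availability"]

# stakes[i, j]: cost ($/day) to stakeholder i if requirement j is not met
stakes = np.array([[800.0, 1500.0, 3000.0],
                   [2000.0,  900.0,  400.0],
                   [5000.0, 2500.0,  100.0]])

p_fail = np.array([0.02, 0.01, 0.05])    # P(requirement j violated) per day

mean_failure_cost = stakes @ p_fail      # expected loss per stakeholder
for name, mfc in zip(stakeholders, mean_failure_cost):
    print(f"{name:>9}: ${mfc:7.2f}/day expected loss")
```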

  19. Synopsis of Evaluating Security Controls Based on Key Performance Indicators and Stakeholder Mission Value

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2008-01-01

    Information security continues to evolve in response to disruptive changes with a persistent focus on information-centric controls and a healthy debate about balancing endpoint and network protection, with the goal of improved enterprise and business risk management. Economic uncertainty, intensively collaborative work styles, virtualization, increased outsourcing and ongoing compliance pressures require careful consideration and adaptation of a balanced approach. The Cyberspace Security Econometrics System (CSES) provides a measure of reliability, security and safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. For a given stakeholder, CSES reflects the variance that may exist among the stakes one attaches to meeting each requirement. This paper summarizes the basis, objectives and capabilities for the CSES, including inputs/outputs as well as the structural underpinnings.

  20. Sensitivity Analysis of earth and environmental models: a systematic review to guide scientific advancement

    NASA Astrophysics Data System (ADS)

    Wagener, Thorsten; Pianosi, Francesca

    2016-04-01

    Sensitivity Analysis (SA) investigates how the variation in the output of a numerical model can be attributed to variations of its input factors. SA is increasingly being used in earth and environmental modelling for a variety of purposes, including uncertainty assessment, model calibration and diagnostic evaluation, dominant control analysis and robust decision-making. Here we provide some practical advice regarding best practice in SA and discuss important open questions based on a detailed recent review of the existing body of work in SA. Open questions relate to the consideration of input factor interactions, methods for factor mapping and the formal inclusion of discrete factors in SA (for example for model structure comparison). We will analyse these questions using relevant examples and discuss possible ways forward. We aim to stimulate discussion within the community of SA developers and users regarding the setting of good practices and the definition of priorities for future research.
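
    As a concrete instance of variance-based SA, the sketch below computes first-order Sobol indices for the Ishigami test function with a standard pick-freeze (Saltelli-style) estimator; it is illustrative and not tied to any particular earth-system model.

```python
# First-order Sobol indices via the pick-freeze estimator of Saltelli (2010):
# S_i = mean(f(B) * (f(AB_i) - f(A))) / Var(f).
import numpy as np

def model(x):   # Ishigami function: a standard SA benchmark with interactions
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(4)
n, k = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, k))
B = rng.uniform(-np.pi, np.pi, (n, k))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]             # "pick" column i from B, "freeze" the rest
    S1 = np.mean(fB * (model(ABi) - fA)) / var_y
    print(f"S1[x{i+1}] = {S1:.3f}")   # expect roughly 0.31, 0.44, 0.00
```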

  1. Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks

    NASA Astrophysics Data System (ADS)

    Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.

    2017-12-01

    The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt this shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we applied an image analysis tool, based on a two-dimensional continuous wavelet analysis, developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability to unravel the physical processes underlying catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.
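
    The record's tool is a 2D continuous wavelet analysis; as a rough, admittedly different stand-in for scale-dependent orientation analysis, the sketch below tracks the dominant gradient orientation of a synthetic image with a multiscale structure tensor.

```python
# Track the dominant orientation of image gradients as the analysis scale
# grows; an abrupt change with scale flags a distributed-to-localized
# transition. Synthetic image with a diagonal "shear band" for illustration.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
img = rng.normal(size=(256, 256))
xx, yy = np.meshgrid(np.arange(256), np.arange(256))
img += 5.0 * np.exp(-((yy - xx) ** 2) / 50.0)   # localized diagonal band

for sigma in [1, 2, 4, 8, 16]:                  # analysis scales (pixels)
    gx = ndimage.gaussian_filter(img, sigma, order=(0, 1))
    gy = ndimage.gaussian_filter(img, sigma, order=(1, 0))
    # Structure tensor components, smoothed at the same scale
    Jxx = ndimage.gaussian_filter(gx * gx, sigma)
    Jyy = ndimage.gaussian_filter(gy * gy, sigma)
    Jxy = ndimage.gaussian_filter(gx * gy, sigma)
    # Dominant orientation from the averaged tensor (degrees)
    theta = 0.5 * np.degrees(np.arctan2(2 * Jxy.mean(), Jxx.mean() - Jyy.mean()))
    print(f"sigma = {sigma:2d} px -> dominant orientation {theta:6.1f} deg")
```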

  2. Determining hospital performance based on rank ordering: is it appropriate?

    PubMed

    Anderson, Judy; Hackman, Mark; Burnich, Jeff; Gurgiolo, Thomas R

    2007-01-01

    An increasing number of "pay for performance" initiatives for hospitals and physicians assess performance by ranking hospitals or physicians on quality of care measures. Payment is subsequently based on where a hospital or physician ranks among peers. This study examines the variability of ranking hospitals on quality of care measures and its impact on comparing hospital performance. Variability in the ranks of 3 quality of care measures was examined: discharge instruction for congestive heart failure, use of beta-blockers at discharge for heart attack, and timing of initial antibiotic therapy within 4 hours of admission to the hospital for pneumonia. The data are available on the Centers for Medicare and Medicaid Services Web site as part of the Hospital Quality Alliance project. We found that considerable uncertainty exists in the ranking of hospitals on these measures, which calls into question the use of rank ordering as a determinant of performance.
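
    A short sketch of how rank instability can be exposed: bootstrap resampling of hospital proportions on a single measure, with invented denominators and rates.

```python
# Bootstrap the ranks of 5 hospitals on one proportion-type quality measure;
# wide rank intervals signal that rank ordering is an unstable basis for pay.
import numpy as np

rng = np.random.default_rng(6)
n_patients = np.array([120, 340, 95, 210, 500])        # denominator per hospital
true_rate = np.array([0.82, 0.84, 0.80, 0.85, 0.83])   # underlying performance
successes = rng.binomial(n_patients, true_rate)

ranks = []
for _ in range(5000):
    boot = rng.binomial(n_patients, successes / n_patients) / n_patients
    ranks.append(np.argsort(np.argsort(-boot)) + 1)    # rank 1 = best
ranks = np.array(ranks)

for h in range(5):
    lo, hi = np.percentile(ranks[:, h], [2.5, 97.5])
    print(f"hospital {h+1}: median rank {np.median(ranks[:, h]):.0f}, "
          f"95% interval [{lo:.0f}, {hi:.0f}]")
```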

  3. The Mathematics of Dispatchability, Revisited

    NASA Technical Reports Server (NTRS)

    Morris, Paul

    2016-01-01

    Dispatchability is an important property for the efficient execution of temporal plans where the temporal constraints are represented as a Simple Temporal Network (STN). It has been shown that every STN may be reformulated as a dispatchable STN, and dispatchability ensures that the temporal constraints need only be satisfied locally during execution. Recently, it has also been shown that Simple Temporal Networks with Uncertainty, augmented with wait edges, are Dynamically Controllable provided every projection is dispatchable. Thus, dispatchability has considerable theoretical as well as practical significance. One thing that hampers further work in this area is the underdeveloped theory. Moreover, the existing foundation is inadequate in certain respects. In this paper, we develop a new mathematical theory of dispatchability and its relationship to execution. We also provide several characterizations of dispatchability, including characterizations in terms of the structural properties of the STN graph. This facilitates the potential application of the theory to other areas.

  4. Beyond pairwise strategy updating in the prisoner's dilemma game

    NASA Astrophysics Data System (ADS)

    Wang, Xiaofeng; Perc, Matjaž; Liu, Yongkui; Chen, Xiaojie; Wang, Long

    2012-10-01

    In spatial games players typically alter their strategy by imitating the most successful or one randomly selected neighbor. Since a single neighbor is taken as reference, the information stemming from other neighbors is neglected, which begets the consideration of alternative, possibly more realistic approaches. Here we show that strategy changes inspired not only by the performance of individual neighbors but rather by entire neighborhoods introduce qualitatively different evolutionary dynamics that are able to support the stable existence of very small cooperative clusters. This leads to phase diagrams that differ significantly from those obtained by means of pairwise strategy updating. In particular, the survivability of cooperators is possible even at high temptations to defect and over a much wider uncertainty range. We support the simulation results by means of pair approximations and analysis of spatial patterns, which jointly highlight the importance of local information for the resolution of social dilemmas.

  5. Cyberspace Security Econometrics System (CSES) - U.S. Copyright TXu 1-901-039

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security continues to evolve in response to disruptive changes with a persistent focus on information-centric controls and a healthy debate about balancing endpoint and network protection, with a goal of improved enterprise/business risk management. Economic uncertainty, intensively collaborative styles of work, virtualization, increased outsourcing and ongoing compliance pressures require careful consideration and adaptation. The Cyberspace Security Econometrics System (CSES) provides a measure (i.e., a quantitative indication) of reliability, performance, and/or safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. For a given stakeholder, CSES accounts for the variance that may exist among the stakes one attaches to meeting each requirement. The basis, objectives and capabilities for the CSES, including inputs/outputs as well as the structural and mathematical underpinnings, are contained in this copyright.

  6. On the status of IAEA delta-13C stable isotope reference materials.

    NASA Astrophysics Data System (ADS)

    Assonov, Sergey; Groening, Manfred; Fajgelj, Ales

    2016-04-01

    For practical reasons, all isotope measurements are performed on relative scales realized through the use of international, scale-defining primary standards. In fact these standards were materials (artefacts, similar to the prototypes of the metre and the kilogram) selected on the basis of their properties. The VPDB delta-13C scale is realised via two highest-level reference materials, NBS19 and LSVEC, the first defining the scale and the second intended to normalise lab-to-lab calibrations. These two reference materials (RMs) have been maintained and distributed by IAEA and NIST. The priority task is to maintain these primary RMs at the required uncertainty level, thus ensuring long-term scale consistency. The second task is to introduce replacements when needed (currently for the exhausted NBS19; work in progress). The next is to produce a family of lower-level RMs (secondary, tertiary) addressing the needs of various applications (with different delta values, in different physical-chemical forms) and their uncertainty requirements; these RMs should be traceable to the highest-level RMs. Presently there is a need for a range of RMs addressing existing and newly emerging analytical techniques (e.g. optical isotopic analysers) in the form of calibrated CO2 gases with different delta-13C values. All this implies creating a family of delta-13C stable isotope reference materials. Presently IAEA works on a replacement for NBS19 and is planning new RMs. In addition, we found that LSVEC (introduced as the second anchor for the VPDB scale in 2006) demonstrates considerable scatter in its delta-13C value, which implies a potential bias of the property value and an increased value uncertainty that may conflict with the uncertainty requirements of atmospheric monitoring. That is not compatible with the status of LSVEC, and it should therefore be replaced as soon as possible. The presentation will give an overview of the current status, the strategic plan of developments and the near-future steps.

  7. An objective Bayesian analysis of a crossover design via model selection and model averaging.

    PubMed

    Li, Dandan; Sivaganesan, Siva

    2016-11-10

    Inference about the treatment effect in a crossover design has received much attention over time owing to the uncertainty in the existence of the carryover effect and its impact on the estimation of the treatment effect. Adding to this uncertainty is that the existence of the carryover effect and its size may depend on the presence of the treatment effect and its size. We consider estimation and testing hypotheses about the treatment effect in a two-period crossover design, assuming a normally distributed response variable, and use an objective Bayesian approach to test the hypothesis about the treatment effect and to estimate its size when it exists, while accounting for the uncertainty about the presence of the carryover effect as well as the treatment and period effects. We evaluate and compare the performance of the proposed approach with a standard frequentist approach using simulated and real data. Copyright © 2016 John Wiley & Sons, Ltd.

  8. An Optimization-Based Approach to Determine System Requirements Under Multiple-Domain Specific Uncertainties

    DTIC Science & Technology

    2016-04-30

    determining the optimal design requirements of a new system, which will operate along with other existing systems to provide a set of overarching...passenger airline transportation (Mane et al., 2007; Govindaraju et al., 2015). Uncertainty in Fleet Operations The uncertainty associated with the...demand can provide the basis for a commercial passenger airline problem. The operations of the commercial air travel industry differ from military

  9. Rating curve uncertainty: A comparison of estimation methods

    USGS Publications Warehouse

    Mason, Jr., Robert R.; Kiang, Julie E.; Cohn, Timothy A.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The USGS is engaged in both internal development and collaborative efforts to evaluate existing methods for characterizing the uncertainty of streamflow measurements (gaugings), stage-discharge relations (ratings), and, ultimately, the streamflow records derived from them. This paper provides a brief overview of two candidate methods that may be used to characterize the uncertainty of ratings, and illustrates the results of their application to the ratings of two USGS streamgages.

  10. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake-related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From the analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and in integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.
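
    One way to make the correlation point concrete: a Gaussian-copula Monte Carlo that samples spatially correlated ground-motion residuals via a Cholesky factor and aggregates portfolio loss. The coordinates, correlation length, and vulnerability curve are all invented.

```python
# Correlated portfolio loss for one scenario event: spatially correlated
# ground-motion residuals drive a toy vulnerability curve at each site.
import numpy as np

rng = np.random.default_rng(7)
n_sites = 30
coords = rng.uniform(0, 50, (n_sites, 2))            # site locations (km)
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
corr = np.exp(-d / 10.0)                             # 10 km correlation length
L = np.linalg.cholesky(corr + 1e-10 * np.eye(n_sites))

ln_median_im = np.log(0.3)                           # median intensity (g)
sigma_ln = 0.6                                       # ground-motion variability
values = rng.uniform(0.5e6, 5e6, n_sites)            # exposed value per site ($)

losses = []
for _ in range(10_000):
    im = np.exp(ln_median_im + sigma_ln * (L @ rng.standard_normal(n_sites)))
    damage_ratio = np.clip(im / 1.0, 0, 1) ** 1.5    # toy vulnerability curve
    losses.append(np.sum(damage_ratio * values))
losses = np.array(losses)
print(f"mean loss ${losses.mean():,.0f}; "
      f"99th percentile ${np.percentile(losses, 99):,.0f}")
```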

  11. Quantifying Groundwater Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E.; Foglia, L.

    2007-12-01

    Groundwater models are characterized by the (a) processes simulated, (b) boundary conditions, (c) initial conditions, (d) method of solving the equation, (e) parameterization, and (f) parameter values. Models are related to the system of concern using data, some of which form the basis of observations used most directly, through objective functions, to estimate parameter values. Here we consider situations in which parameter values are determined by minimizing an objective function. Other methods of model development are not considered because their ad hoc nature generally prohibits clear quantification of uncertainty. Quantifying prediction uncertainty ideally includes contributions from (a) to (f). The parameter values of (f) tend to be continuous with respect to both the simulated equivalents of the observations and the predictions, while many aspects of (a) through (e) are discrete. This fundamental difference means that there are options for evaluating the uncertainty related to parameter values that generally do not exist for other aspects of a model. While the methods available for (a) to (e) can be used for the parameter values (f), the inferential methods uniquely available for (f) generally are less computationally intensive and often can be used to considerable advantage. However, inferential approaches require calculation of sensitivities. Whether the numerical accuracy and stability of the model solution required for accurate sensitivities is more broadly important to other model uses is an issue that needs to be addressed. Alternative global methods can require 100 or even 1,000 times the number of runs needed by inferential methods, though methods of reducing the number of needed runs are being developed and tested. Here we present three approaches for quantifying model uncertainty and investigate their strengths and weaknesses. (1) Represent more aspects as parameters so that the computationally efficient methods can be broadly applied. This approach is attainable through universal model analysis software such as UCODE-2005, PEST, and joint use of these programs, which allow many aspects of a model to be defined as parameters. (2) Use highly parameterized models to quantify aspects of (e). While promising, this approach implicitly includes parameterizations that may be considered unreasonable if investigated explicitly, so that resulting measures of uncertainty may be too large. (3) Use a combination of inferential and global methods that can be facilitated using the new software MMA (Multi-Model Analysis), which is constructed using the JUPITER API. Here we consider issues related to the model discrimination criteria calculated by MMA.

  12. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    NASA Astrophysics Data System (ADS)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has the flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
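
    A minimal sketch of the constraint machinery, assuming a single power-law process-rate term and synthetic observations (BOSS itself is far more general): a Metropolis sampler recovers the rate parameters.

```python
# Constrain the parameters of a power-law process rate, rate = a * q**b,
# against noisy synthetic observations with a simple Metropolis sampler.
import numpy as np

rng = np.random.default_rng(8)
q = np.linspace(0.1, 2.0, 25)                      # cloud water proxy
true_a, true_b = 1.3e-3, 2.47                      # hidden rate parameters
obs = true_a * q**true_b * np.exp(0.1 * rng.standard_normal(q.size))

def log_post(theta):
    log_a, b = theta
    if not (-10 < log_a < 0 and 0 < b < 5):        # flat prior bounds
        return -np.inf
    resid = np.log(obs) - (log_a + b * np.log(q))  # lognormal error model
    return -0.5 * np.sum(resid**2) / 0.1**2

theta = np.array([np.log(1e-3), 2.0])
lp = log_post(theta)
chain = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.05, 0.05])     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])                     # discard burn-in
print("posterior a:", np.exp(chain[:, 0].mean()), " b:", chain[:, 1].mean())
```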

  13. Host Model Uncertainty in Aerosol Radiative Effects: the AeroCom Prescribed Experiment and Beyond

    NASA Astrophysics Data System (ADS)

    Stier, Philip; Schutgens, Nick; Bian, Huisheng; Boucher, Olivier; Chin, Mian; Ghan, Steven; Huneeus, Nicolas; Kinne, Stefan; Lin, Guangxing; Myhre, Gunnar; Penner, Joyce; Randles, Cynthia; Samset, Bjorn; Schulz, Michael; Yu, Hongbin; Zhou, Cheng; Bellouin, Nicolas; Ma, Xiaoyan; Yu, Fangqun; Takemura, Toshihiko

    2013-04-01

    Anthropogenic and natural aerosol radiative effects are recognized to affect global and regional climate. Multi-model "diversity" in estimates of the aerosol radiative effect is often perceived as a measure of the uncertainty in modelling aerosol itself. However, current aerosol models vary considerably in model components relevant for the calculation of aerosol radiative forcings and feedbacks and the associated "host-model uncertainties" are generally convoluted with the actual uncertainty in aerosol modelling. In the AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties on aerosol forcing experiments through prescription of identical aerosol radiative properties in eleven participating models. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention. However, uncertainties in aerosol radiative effects also include short-term and long-term feedback processes that will be systematically explored in future intercomparison studies. Here we will present an overview of the proposals for discussion and results from early scoping studies.

  14. Known unknowns: building an ethics of uncertainty into genomic medicine.

    PubMed

    Newson, Ainsley J; Leonard, Samantha J; Hall, Alison; Gaff, Clara L

    2016-09-01

    Genomic testing has reached the point where, technically at least, it can be cheaper to undertake panel-, exome- or whole genome testing than it is to sequence a single gene. An attribute of these approaches is that information gleaned will often have uncertain significance. In addition to the challenges this presents for pre-test counseling and informed consent, a further consideration emerges over how - ethically - we should conceive of and respond to this uncertainty. To date, the ethical aspects of uncertainty in genomics have remained under-explored. In this paper, we draft a conceptual and ethical response to the question of how to conceive of and respond to uncertainty in genomic medicine. After introducing the problem, we articulate a concept of 'genomic uncertainty'. Drawing on this, together with exemplar clinical cases and related empirical literature, we then critique the presumption that uncertainty is always problematic and something to be avoided, or eradicated. We conclude by outlining an 'ethics of genomic uncertainty'; describing how we might handle uncertainty in genomic medicine. This involves fostering resilience, welfare, autonomy and solidarity. Uncertainty will be an inherent aspect of clinical practice in genomics for some time to come. Genomic testing should not be offered with the explicit aim to reduce uncertainty. Rather, uncertainty should be appraised, adapted to and communicated about as part of the process of offering and providing genomic information.

  15. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle location, which in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
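
    The propagation step reduces, at first order, to adding independent contributions in quadrature through their sensitivities; a toy sketch with assumed uncertainty magnitudes follows.

```python
# First-order uncertainty propagation for one displacement component:
# independent terms add in quadrature through their sensitivities.
import numpy as np

# Illustrative standard uncertainties (voxels); not from the paper
u_calibration = 0.08     # triangulated-position / mapping-function term
u_correlation = 0.12     # 3D cross-correlation term
sens_cal = 1.0           # sensitivity of displacement to each term (assumed)
sens_corr = 1.0

u_total = np.sqrt((sens_cal * u_calibration) ** 2 +
                  (sens_corr * u_correlation) ** 2)
print(f"combined standard uncertainty: {u_total:.3f} voxels")
# Coverage check idea: over many vectors, ~68% of errors should fall within
# +/- u_total if the uncertainty estimates are well calibrated.
```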

  16. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models that explicitly account for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises from a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
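
    A Monte Carlo stand-in for the errors-in-variables treatment (simpler than the paper's regression models): propagate both anchor-value and measurement uncertainties through a two-point normalization. The values are loosely patterned on VPDB anchors but invented.

```python
# Two-point normalization of a raw delta value, propagating the assigned-value
# and measurement uncertainties of both anchors by Monte Carlo.
import numpy as np

rng = np.random.default_rng(9)
ref_true = np.array([1.95, -46.6])      # assigned anchor values (permil)
ref_u = np.array([0.03, 0.2])           # their standard uncertainties
ref_meas = np.array([2.10, -46.1])      # measured deltas of the anchors
ref_meas_u = np.array([0.05, 0.05])     # measurement repeatability
sample_meas, sample_meas_u = -25.30, 0.05

deltas = []
for _ in range(50_000):
    x = ref_meas + ref_meas_u * rng.standard_normal(2)   # measured anchors
    y = ref_true + ref_u * rng.standard_normal(2)        # assigned values
    slope = (y[1] - y[0]) / (x[1] - x[0])                # two-point line
    x_s = sample_meas + sample_meas_u * rng.standard_normal()
    deltas.append(y[0] + slope * (x_s - x[0]))
deltas = np.array(deltas)
print(f"normalized delta13C = {deltas.mean():.2f} +/- {deltas.std():.2f} permil")
```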

  17. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    NASA Astrophysics Data System (ADS)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied with a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea: walleye pollock, Pacific cod, and arrowtooth flounder using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.
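
    A toy sketch of model averaging for a reference point: information-criterion weights combine three models' estimates, and pooled draws capture between-model spread. All estimates and criterion values are invented; the paper's Bayesian weighting is more formal.

```python
# Average a biological reference point across three assessment models using
# Akaike-style weights; pooled draws include between-model uncertainty.
import numpy as np

rng = np.random.default_rng(10)
models = ["single-species", "temp. weight-at-age", "multi-species"]
b40 = np.array([1.90, 1.65, 1.35])      # reference biomass estimates (Mt)
se = np.array([0.15, 0.20, 0.30])       # per-model standard errors
aic = np.array([412.0, 405.5, 407.8])   # model-fit criteria

w = np.exp(-0.5 * (aic - aic.min()))
w /= w.sum()                            # model weights

draws = np.concatenate([rng.normal(b, s, int(200_000 * wi))
                        for b, s, wi in zip(b40, se, w)])
print(dict(zip(models, np.round(w, 3))))
print(f"averaged B40 = {draws.mean():.2f} Mt, 90% CI "
      f"[{np.percentile(draws, 5):.2f}, {np.percentile(draws, 95):.2f}]")
```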

  18. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of a UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and to guide next steps in model development prior to rigorous history matching using the PEST++ parameter estimation code.
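
    A minimal FOSM sketch: with a Jacobian of prediction sensitivities J and a parameter covariance C_p, the predictive covariance is J C_p J^T. The matrices below are small placeholders, not CLAS values.

```python
# First-Order Second-Moment (FOSM) predictive uncertainty from a sensitivity
# Jacobian and a prior parameter covariance.
import numpy as np

# Prior covariance of 3 parameters (e.g., log-K of two zones, recharge)
C_p = np.diag([0.25, 0.40, 0.10])

# Sensitivities of 2 predictions (water level, base flow) to the parameters,
# typically obtained from perturbation runs of the groundwater model.
J = np.array([[1.2, -0.4, 2.0],
              [0.3,  0.9, 1.1]])

C_pred = J @ C_p @ J.T                  # FOSM predictive covariance
for name, var in zip(["water level", "base flow"], np.diag(C_pred)):
    print(f"{name}: prior predictive std = {np.sqrt(var):.2f}")
# Data worth follows by recomputing with the posterior parameter covariance
# after notionally adding an observation (Schur-complement update).
```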

  19. Estimating winter wheat phenological parameters: Implications for crop modeling

    USDA-ARS?s Scientific Manuscript database

    Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...

  20. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.

  1. Re-evaluating alkenone based CO2 estimates

    NASA Astrophysics Data System (ADS)

    Pagani, M.

    2013-05-01

    Multi-million year patterns of ocean temperatures and ice accumulation are relatively consistent with reconstructed CO2 records. Existing records allow for broad statements regarding climate sensitivity, but uncertainties in reconstructions can lead to considerable error. For example, alkenone-based CO2 reconstructions assume that diffusion of CO2aq is the dominant source of inorganic carbon for photosynthesis. However, the concentration of CO2aq is the lowest of all dissolved carbon species, constituting <1% of the total inorganic aqueous pool. This poses a problem for sustaining reasonable algal growth rates because the half-saturation constant for the enzyme Rubisco, the primary carboxylase involved in algal photosynthesis, is commonly higher than the average concentration of seawater CO2aq. That is, the concentration of CO2aq in the modern ocean is too low to maintain adequate reaction rates for Rubisco, and thus algal growth. In order to maintain algal growth rates, most modern algae have strategies to increase intracellular CO2 concentrations. However, if such strategies were prevalent for alkenone-producing algae in the past, CO2 reconstructions could be compromised. This presentation will assess time periods when carbon-concentrating strategies were potentially in play and the consequences for existing CO2 records.

  2. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    NASA Astrophysics Data System (ADS)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
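
    The minimum-detectable-change argument can be sketched directly: for a two-sample comparison under a normal approximation, the detectable relative change scales as the spatial coefficient of variation times sqrt(2/n). The CV value below is illustrative of a highly variable site.

```python
# Minimum detectable change (MDC) in mean sediment mass flux for a two-sample
# comparison, given sampler-to-sampler spatial variability.
import numpy as np
from scipy import stats

cv = 1.5          # spatial coefficient of variation of horizontal mass flux
alpha, power = 0.05, 0.8
z = stats.norm.ppf(1 - alpha / 2) + stats.norm.ppf(power)

for n in [3, 9, 27]:                     # samplers per site/period
    # Detectable relative difference between two means (normal approximation)
    mdc = z * cv * np.sqrt(2.0 / n)
    print(f"n = {n:2d}: detectable change ~ {100 * mdc:.0f}% of the mean")
# Small n at a highly variable site yields detectable changes of hundreds of
# percent, consistent with the 200%-800% range quoted in the record.
```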

  3. Robust design optimization method for centrifugal impellers under surface roughness uncertainties due to blade fouling

    NASA Astrophysics Data System (ADS)

    Ju, Yaping; Zhang, Chuhua

    2016-03-01

    Blade fouling has been proved to be a great threat to compressor performance in the operating stage. Current research on fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to the robust design optimization of centrifugal compressor impellers with consideration of blade fouling. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, the support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel proves to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is found to be more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors with consideration of blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
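
    A compact sketch of the SVR-plus-MCS pattern, with an analytic stand-in for the CFD solver and invented roughness statistics: fit the surrogate on a few samples, then Monte Carlo over the roughness uncertainty through the cheap surrogate.

```python
# Surrogate-accelerated uncertainty analysis: SVR metamodel + Monte Carlo.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(11)

def cfd_efficiency(ks):                 # placeholder for an expensive solver
    return 0.90 - 0.25 * ks - 0.5 * ks**2

ks_train = np.linspace(0.0, 0.3, 12)[:, None]      # sand-grain roughness (mm)
eta_train = cfd_efficiency(ks_train).ravel()

surrogate = SVR(kernel="rbf", C=100.0, epsilon=1e-4).fit(ks_train, eta_train)

# Monte Carlo over fouling-induced roughness uncertainty
ks_samples = np.clip(rng.normal(0.12, 0.04, 20_000), 0, None)[:, None]
eta = surrogate.predict(ks_samples)
print(f"efficiency: mean {eta.mean():.4f}, std {eta.std():.4f}")
# A robust optimizer would trade off mean(eta) against std(eta) across designs.
```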

  4. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender's beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker's payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
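
    The interval (probability-box) treatment admits a very small sketch: with known scenario probabilities but only bounded payoffs, the expected payoff is itself an interval. All numbers are invented.

```python
# Interval-valued expected payoff: aleatory scenario probabilities combined
# with epistemic (bounded-only) payoffs per scenario.
import numpy as np

p = np.array([0.5, 0.3, 0.2])            # P(scenario), known
payoff_lo = np.array([10.0, 40.0, 60.0])  # lower payoff bounds, epistemic
payoff_hi = np.array([30.0, 70.0, 90.0])  # upper payoff bounds, epistemic

e_lo = float(p @ payoff_lo)       # lower bound on expected payoff
e_hi = float(p @ payoff_hi)       # upper bound on expected payoff
print(f"expected attacker payoff in [{e_lo:.1f}, {e_hi:.1f}]")
# A defender ranking mitigations against e_hi is robust to the epistemic gap.
```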

  5. NOy and O3 in the Asian Monsoon Anticyclone: Uncertainties associated with the Convection and Lightning in a Global Model

    NASA Astrophysics Data System (ADS)

    Pozzer, A.; Ojha, N.; Tost, H.; Joeckel, P.; Fischer, H.; Ziereis, H.; Zahn, A.; Tomsche, L.; Lelieveld, J.

    2017-12-01

    The impacts of the Asian monsoon on tropospheric chemistry are difficult to simulate in numerical models because of the lack of accurate emission inventories over the Asian region and the strong influence of parameterized processes such as convection and lightning. Further, the lack of observational data over the region during the monsoon period drastically reduces our ability to evaluate numerical models. Here, we combine simulations using the global EMAC (ECHAM5/MESSy2 Atmospheric Chemistry) model with the observational dataset from the OMO campaign (July-August 2015) to study the tropospheric composition of the Asian monsoon anticyclone. The simulations capture the C-shape of the CO vertical profiles typically observed during the summer monsoon. The observed spatio-temporal variations in O3, CO, and NOy are reproduced by EMAC, with a better correlation in the upper troposphere (UT). However, the model overestimates NOy and O3 mixing ratios in the anticyclone by 25% and 35%, respectively. A series of numerical experiments identified the model's strong lightning emissions as the source of this overestimation, with the anthropogenic NOx sources (in Asia) and global soil emissions having a lower impact in the UT. A reduction of the lightning NOx emission by 50% leads to better agreement between the model and the OMO observations of NOy and O3. The uncertainties in the lightning emissions are found to considerably influence the OH distribution in the UT over India and downwind. The study reveals existing uncertainties in estimates of the monsoon's impact on tropospheric composition, and highlights the need to constrain numerical simulations with state-of-the-art observations for deriving the budget of trace species of climatic relevance.

  6. Uncertainties in slip-rate estimates for the Mission Creek strand of the southern San Andreas fault at Biskra Palms Oasis, southern California

    USGS Publications Warehouse

    Behr, W.M.; Rood, D.H.; Fletcher, K.E.; Guzman, N.; Finkel, R.; Hanks, T.C.; Hudnut, K.W.; Kendrick, K.J.; Platt, J.P.; Sharp, W.D.; Weldon, R.J.; Yule, J.D.

    2010-01-01

    This study focuses on uncertainties in estimates of the geologic slip rate along the Mission Creek strand of the southern San Andreas fault where it offsets an alluvial fan (T2) at Biskra Palms Oasis in southern California. We provide new estimates of the amount of fault offset of the T2 fan based on trench excavations and new cosmogenic 10Be age determinations from the tops of 12 boulders on the fan surface. We present three alternative fan offset models: a minimum, a maximum, and a preferred offset of 660 m, 980 m, and 770 m, respectively. We assign an age of between 45 and 54 ka to the T2 fan from the 10Be data, which is significantly older than previously reported but is consistent with both the degree of soil development associated with this surface, and with ages from U-series geochronology on pedogenic carbonate from T2, described in a companion paper by Fletcher et al. (this volume). These new constraints suggest a range of slip rates between ~12 and 22 mm/yr with a preferred estimate of ~14-17 mm/yr for the Mission Creek strand of the southern San Andreas fault. Previous studies suggested that the geologic and geodetic slip-rate estimates at Biskra Palms differed. We find, however, that considerable uncertainty affects both the geologic and geodetic slip-rate estimates, such that if a real discrepancy between these rates exists for the southern San Andreas fault at Biskra Palms, it cannot be demonstrated with available data. © 2010 Geological Society of America.

  7. Persistent oscillations and backward bifurcation in a malaria model with varying human and mosquito populations: implications for control.

    PubMed

    Ngonghala, Calistus N; Teboh-Ewungkem, Miranda I; Ngwa, Gideon A

    2015-06-01

    We derive and study a deterministic compartmental model for malaria transmission with varying human and mosquito populations. Our model considers disease-related deaths, asymptomatic immune humans who are also infectious, as well as mosquito demography, reproduction and feeding habits. Analysis of the model reveals the existence of a backward bifurcation and persistent limit cycles whose period and size are determined by two threshold parameters: the vectorial basic reproduction number Rm, and the disease basic reproduction number R0, whose size can be reduced by reducing Rm. We conclude that malaria dynamics are indeed oscillatory when the methodology of explicitly incorporating the mosquito's demography, feeding and reproductive patterns is considered in modeling the mosquito population dynamics. A sensitivity analysis reveals important control parameters that can affect the magnitudes of Rm and R0, threshold quantities to be taken into consideration when designing control strategies. Both Rm and the intrinsic period of oscillation are shown to be highly sensitive to the mosquito's birth constant λm and the mosquito's feeding success probability pw. Control of λm can be achieved by spraying, eliminating breeding sites or moving them away from human habitats, while pw can be controlled via the use of mosquito repellent and insecticide-treated bed-nets. The disease threshold parameter R0 is shown to be highly sensitive to pw, and the intrinsic period of oscillation is also sensitive to the rate at which reproducing mosquitoes return to breeding sites. A global sensitivity and uncertainty analysis reveals that the ability of the mosquito to reproduce and uncertainties in the estimates of the rates at which exposed humans become infectious and infectious humans recover from malaria are critical in generating uncertainties in the disease classes.

  8. Worldwide data sets constrain the water vapor uptake coefficient in cloud formation

    PubMed Central

    Raatikainen, Tomi; Nenes, Athanasios; Seinfeld, John H.; Morales, Ricardo; Moore, Richard H.; Lathem, Terry L.; Lance, Sara; Padró, Luz T.; Lin, Jack J.; Cerully, Kate M.; Bougiatioti, Aikaterini; Cozic, Julie; Ruehl, Christopher R.; Chuang, Patrick Y.; Anderson, Bruce E.; Flagan, Richard C.; Jonsson, Haflidi; Mihalopoulos, Nikos; Smith, James N.

    2013-01-01

    Cloud droplet formation depends on the condensation of water vapor on ambient aerosols, the rate of which is strongly affected by the kinetics of water uptake as expressed by the condensation (or mass accommodation) coefficient, αc. Estimates of αc for droplet growth from activation of ambient particles vary considerably and represent a critical source of uncertainty in estimates of global cloud droplet distributions and the aerosol indirect forcing of climate. We present an analysis of 10 globally relevant data sets of cloud condensation nuclei to constrain the value of αc for ambient aerosol. We find that rapid activation kinetics (αc > 0.1) is uniformly prevalent. This finding resolves a long-standing issue in cloud physics, as the uncertainty in water vapor accommodation on droplets is considerably less than previously thought. PMID:23431189

  9. Worldwide data sets constrain the water vapor uptake coefficient in cloud formation.

    PubMed

    Raatikainen, Tomi; Nenes, Athanasios; Seinfeld, John H; Morales, Ricardo; Moore, Richard H; Lathem, Terry L; Lance, Sara; Padró, Luz T; Lin, Jack J; Cerully, Kate M; Bougiatioti, Aikaterini; Cozic, Julie; Ruehl, Christopher R; Chuang, Patrick Y; Anderson, Bruce E; Flagan, Richard C; Jonsson, Haflidi; Mihalopoulos, Nikos; Smith, James N

    2013-03-05

    Cloud droplet formation depends on the condensation of water vapor on ambient aerosols, the rate of which is strongly affected by the kinetics of water uptake as expressed by the condensation (or mass accommodation) coefficient, αc. Estimates of αc for droplet growth from activation of ambient particles vary considerably and represent a critical source of uncertainty in estimates of global cloud droplet distributions and the aerosol indirect forcing of climate. We present an analysis of 10 globally relevant data sets of cloud condensation nuclei to constrain the value of αc for ambient aerosol. We find that rapid activation kinetics (αc > 0.1) is uniformly prevalent. This finding resolves a long-standing issue in cloud physics, as the uncertainty in water vapor accommodation on droplets is considerably less than previously thought.

  10. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following the past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of the existing buildings. Many historic buildings in the old urban centers in Eastern Canada such as Old Quebec City are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry building with systematic treatment of uncertainties throughout the modelling process is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
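
    Damage-state fragility functions of the kind described above are commonly expressed as lognormal cumulative distributions of the demand parameter. A minimal sketch with hypothetical medians and dispersions (not the study's calibrated values):

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(sd, sd_median, beta):
        """P(damage >= state | spectral displacement sd), lognormal form."""
        return norm.cdf(np.log(sd / sd_median) / beta)

    # Hypothetical damage-state medians (cm) and dispersions, for illustration.
    states = {"slight": (0.5, 0.7), "moderate": (1.2, 0.7),
              "extensive": (2.5, 0.8), "complete": (4.0, 0.9)}

    sd = 1.5  # spectral displacement demand, cm
    for name, (median, beta) in states.items():
        print(f"P(>= {name:9s} | Sd = {sd} cm) = {fragility(sd, median, beta):.2f}")
    ```

    The dispersion beta is where the combined capacity and demand uncertainty enters, which is why the probabilistic demand analysis above matters.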

  11. Uncertainties in derived temperature-height profiles

    NASA Technical Reports Server (NTRS)

    Minzner, R. A.

    1974-01-01

    Nomographs were developed for relating uncertainty in temperature T to uncertainty in the observed height profiles of both pressure p and density ρ. The relative uncertainty δT/T is seen to depend not only upon the relative uncertainties δp/p or δρ/ρ, and to a small extent upon the value of T or H, but primarily upon the sampling-height increment Δh, the height increment between successive observations of p or ρ. For a fixed value of δp/p, the value of δT/T varies inversely with Δh. No limit exists in the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
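
    The inverse dependence of δT/T on Δh for pressure-derived temperatures can be checked numerically: hydrostatic differentiation of a noisy pressure profile amplifies the noise as the sampling increment shrinks. A toy sketch assuming an isothermal atmosphere and an illustrative 0.1% pressure noise:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    g, M, R, H = 9.81, 0.029, 8.314, 7000.0   # scale height H in metres
    T_true = g * M * H / R                     # isothermal truth, ~240 K

    def temp_scatter(dh, rel_noise=1e-3, n=2000):
        """Derive T from noisy pressure profiles sampled every dh metres
        and return the relative scatter of the result."""
        h = np.arange(0.0, 30_000.0, dh)
        p = np.exp(-h / H) * (1 + rel_noise * rng.standard_normal((n, h.size)))
        rho = -(np.diff(p) / dh) / g           # hydrostatic: rho = -(1/g) dp/dh
        T = p[:, :-1] * M / (rho * R)          # ideal gas: T = p M / (rho R)
        return T.std() / T_true

    for dh in (250.0, 500.0, 1000.0, 2000.0):
        print(f"dh = {dh:6.0f} m  ->  dT/T ~ {temp_scatter(dh):.3f}")
    ```

    Halving Δh roughly doubles δT/T, as the nomographs indicate; density-derived temperatures avoid this because they involve integration rather than differentiation of the noisy profile.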

  12. From Rupture to Resonance: Uncertainty and Scholarship in Fine Art Research Degrees

    ERIC Educational Resources Information Center

    Simmons, Beverley; Holbrook, Allyson

    2013-01-01

    This article focuses on the phenomenon of "rupture" identified in student narratives of uncertainty and scholarship experienced during the course of Fine Art research degrees in two Australian universities. Rupture captures the phenomenon of severe disruption or discontinuity in existing knowledge and typically signifies epistemological…

  13. Residual Seminal Vesicle Displacement in Marker-Based Image-Guided Radiotherapy for Prostate Cancer and the Impact on Margin Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smitsmans, Monique H.P.; Bois, Josien de; Sonke, Jan-Jakob

    Purpose: The objectives of this study were to quantify residual interfraction displacement of seminal vesicles (SV) and investigate the efficacy of rotation correction on SV displacement in marker-based prostate image-guided radiotherapy (IGRT). We also determined the effect of marker registration on the measured SV displacement and its impact on margin design. Methods and Materials: SV displacement was determined relative to marker registration by using 296 cone beam computed tomography scans of 13 prostate cancer patients with implanted markers. SV were individually registered in the transverse plane, based on gray-value information. The target registration error (TRE) for the SV due to marker registration inaccuracies was estimated. Correlations between prostate gland rotations and SV displacement and between individual SV displacements were determined. Results: The SV registration success rate was 99%. Displacement amounts of both SVs were comparable. Systematic and random residual SV displacements were 1.6 mm and 2.0 mm in the left-right direction, respectively, and 2.8 mm and 3.1 mm in the anteroposterior (AP) direction, respectively. Rotation correction did not reduce residual SV displacement. Prostate gland rotation around the left-right axis correlated with SV AP displacement (R² = 42%); a correlation existed between both SVs for AP displacement (R² = 62%); considerable correlation existed between random errors of SV displacement and TRE (R² = 34%). Conclusions: Considerable residual SV displacement exists in marker-based IGRT. Rotation correction barely reduced SV displacement; rather, a larger SV displacement was shown relative to the prostate gland that was not captured by the marker position. Marker registration error partly explains SV displacement when correcting for rotations. Correcting for rotations, therefore, is not advisable when SV are part of the target volume. Margin design for SVs should take these uncertainties into account.

  14. Life-cycle assessment of municipal solid waste management alternatives with consideration of uncertainty: SIWMS development and application.

    PubMed

    Hanandeh, Ali El; El-Zein, Abbas

    2010-05-01

    This paper describes the development and application of the Stochastic Integrated Waste Management Simulator (SIWMS) model. SIWMS provides a detailed view of the environmental impacts and associated costs of municipal solid waste (MSW) management alternatives under conditions of uncertainty. The model follows a life-cycle inventory approach extended with compensatory systems to provide more equitable bases for comparing different alternatives. Economic performance is measured by the net present value. The model is verified against four publicly available models under deterministic conditions and then used to study the impact of uncertainty on Sydney's MSW management 'best practices'. Uncertainty has a significant effect on all impact categories. The greatest effect is observed in the global warming category where a reversal of impact direction is predicted. The reliability of the system is most sensitive to uncertainties in the waste processing and disposal. The results highlight the importance of incorporating uncertainty at all stages to better understand the behaviour of the MSW system. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
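
    The kind of impact-direction reversal reported for the global warming category is easy to illustrate: when two alternatives have overlapping burden distributions, the deterministic ranking can flip in a non-trivial fraction of Monte Carlo draws. A sketch with purely illustrative numbers (not SIWMS inputs or outputs):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 50_000

    # Hypothetical life-cycle global-warming burdens (kt CO2e/yr) for two MSW
    # alternatives; the means and spreads are illustrative only.
    gwp_a = rng.normal(0.9, 0.5, n)    # e.g. landfill with gas recovery
    gwp_b = rng.normal(0.4, 0.6, n)    # e.g. treatment with an energy offset

    diff = gwp_a - gwp_b
    print(f"mean difference: {diff.mean():.2f} kt CO2e/yr in favour of B")
    print(f"P(B actually worse than A) = {(diff < 0).mean():.2f}")
    ```

    With these assumed spreads, the option that wins on expectation still loses in roughly a quarter of realizations, which is the behaviour the abstract describes.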

  15. Internal Variability-Generated Uncertainty in East Asian Climate Projections Estimated with 40 CCSM3 Ensembles.

    PubMed

    Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang

    2016-01-01

    Regional climate projections are challenging because of large uncertainty, particularly that stemming from unpredictable, internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, than the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than the mid-level circulation. Based on k-means cluster analysis, we demonstrate that a substantial portion of the internally induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in affecting regional climate projections on multi-decadal timescales.
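
    As a sketch of the clustering step, internal variability can be emulated by adding noise around a common forced trend and grouping ensemble members with k-means; all numbers below are synthetic stand-ins, not CCSM3 output:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)

    # Stand-in for 40 ensemble members: each row is a flattened map of
    # 50-yr precipitation trends (common forced signal + internal noise).
    n_members, n_gridcells = 40, 500
    forced = rng.normal(0.0, 0.2, n_gridcells)
    members = forced + rng.normal(0.0, 1.0, (n_members, n_gridcells))

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(members)
    for k in range(3):
        print(f"cluster {k}: {np.sum(km.labels_ == k)} members")
    ```

    Distinct clusters emerging despite identical forcing is the signature of internally generated circulation regimes.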

  16. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
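
    A minimal sketch of the approach: evaluate each candidate schedule's expected delay and a heuristic intervention count by Monte Carlo, then keep the non-dominated set. The cost models and parameters below are illustrative assumptions, not those of the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical candidate schedules: a nominal delay plus a safety buffer.
    candidates = [(d, b) for d in (1, 2, 3, 4) for b in (0, 1, 2)]

    def expected_costs(delay, buffer, n=10_000):
        """Monte Carlo estimate of mean total delay and a heuristic
        controller-intervention count under uncertain transit times."""
        jitter = rng.normal(0.0, 1.0, n)
        total_delay = delay + buffer + 0.1 * jitter**2
        interventions = 10.0 * (jitter > 1.5 + buffer).mean()
        return total_delay.mean(), interventions

    costs = np.array([expected_costs(d, b) for d, b in candidates])

    # Pareto front: keep candidates not dominated in both objectives.
    front = [i for i, c in enumerate(costs)
             if not np.any((costs[:, 0] <= c[0]) & (costs[:, 1] <= c[1]) &
                           ((costs[:, 0] < c[0]) | (costs[:, 1] < c[1])))]
    print("Pareto-optimal (delay, buffer) candidates:",
          [candidates[i] for i in front])
    ```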

  17. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
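
    A sketch of the workflow under stated assumptions: a toy reaction-rate model with two alternative temperature reduction functions (standing in for model uncertainty) evaluated under two temperature scenarios, with a crude binning estimator of the first-order variance-based index that is then averaged over models and scenarios:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000

    def first_order_index(x, y, bins=40):
        """Crude estimate of S1 = Var(E[Y|X]) / Var(Y) by binning on X."""
        edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
        idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
        cond_mean = np.array([y[idx == b].mean() for b in range(bins)])
        counts = np.bincount(idx, minlength=bins)
        return np.average((cond_mean - y.mean())**2, weights=counts) / y.var()

    # Toy reaction-rate model: rate = k * f(T) * sqrt(W). Two reduction
    # functions stand in for model choice, two climates for scenarios.
    k = rng.uniform(0.5, 1.5, n)
    W = rng.uniform(0.2, 0.9, n)
    scenarios = {"cool": 10.0, "warm": 25.0}
    models = {"linear": lambda T: T / 30.0,
              "optimum": lambda T: np.exp(-((T - 20.0) / 8.0) ** 2)}

    results = {"k": [], "W": [], "T": []}
    for T0 in scenarios.values():
        T = rng.normal(T0, 2.0, n)        # within-scenario temperature spread
        for f in models.values():
            y = k * f(T) * np.sqrt(W)
            for pname, x in (("k", k), ("W", W), ("T", T)):
                results[pname].append(first_order_index(x, y))

    for pname, vals in results.items():
        print(f"{pname}: per-case S1 = {np.round(vals, 2)}, "
              f"model/scenario-averaged S1 = {np.mean(vals):.2f}")
    ```

    The per-case indices differ markedly between the two reduction functions, which is precisely the hazard of ranking parameters under a single model; the averaged index is the paper's remedy.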

  18. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
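
    As an example of propagating measurement uncertainty through these formulae, the ISO 7730 PPD index, PPD = 100 - 95*exp(-0.03353*PMV^4 - 0.2179*PMV^2), can be evaluated by Monte Carlo in the spirit of GUM Supplement 1; the assumed PMV value and standard uncertainty below are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def ppd(pmv):
        """ISO 7730 predicted percentage dissatisfied as a function of PMV."""
        return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

    # Propagate an uncertain PMV through the PPD formula. The PMV standard
    # uncertainty of 0.25 is an assumed value for illustration only.
    pmv = rng.normal(0.5, 0.25, 100_000)
    samples = ppd(pmv)
    print(f"PPD = {samples.mean():.1f} % +/- {samples.std():.1f} % "
          f"(from PMV = 0.50 +/- 0.25)")
    ```

    Because PPD is nonlinear in PMV, the Monte Carlo mean differs from simply evaluating the formula at the mean PMV, which is one reason sampling-based propagation is preferred here.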

  19. Harnessing the uncertainty monster: Putting quantitative constraints on the intergenerational social discount rate

    NASA Astrophysics Data System (ADS)

    Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.

    2017-09-01

    There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the costs of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end, and the spot (horizon) discount rate drops below 2% by 2100, falling below previous estimates by 2070.
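
    The declining-rate behaviour follows from averaging discount factors rather than rates: with a persistent but uncertain rate r, the certainty-equivalent spot rate R(t) = -ln E[exp(-r t)] / t falls toward the lowest plausible rate as the horizon grows. A sketch with illustrative rate scenarios:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Three equally likely, persistent views of the discount rate
    # (values illustrative, not the paper's calibrated distribution).
    r = rng.choice([0.01, 0.03, 0.05], size=100_000)

    for t in (10, 50, 100, 200):
        R = -np.log(np.mean(np.exp(-r * t))) / t
        print(f"t = {t:3d} yr  ->  certainty-equivalent spot rate = {100*R:.2f} %")
    ```

    At long horizons the low-rate scenario dominates the expectation, so the certainty-equivalent rate declines even though each individual view is constant in time.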

  20. Interpolation Method Needed for Numerical Uncertainty

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Schallhorn, Paul A.

    2014-01-01

    Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. The errors in CFD can be approximated via Richardson's extrapolation, a method based on progressive grid refinement. To estimate the errors, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson's extrapolation or another uncertainty method to approximate errors.
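
    For reference, a minimal sketch of the grid-refinement bookkeeping that such an interpolation scheme must feed: observed order of accuracy, Richardson-extrapolated value, and a Roache-style grid convergence index, with hypothetical drag coefficients:

    ```python
    import numpy as np

    def richardson(f1, f2, f3, r):
        """Observed order and extrapolated estimate from three solutions on
        systematically refined grids (f1 finest), refinement ratio r."""
        p = np.log((f3 - f2) / (f2 - f1)) / np.log(r)
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)
        return p, f_exact

    # Hypothetical drag coefficients from fine/medium/coarse grids.
    f1, f2, f3, r = 0.3215, 0.3241, 0.3290, 2.0
    p, f_exact = richardson(f1, f2, f3, r)
    gci_fine = 1.25 * abs((f2 - f1) / f1) / (r**p - 1.0)

    print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.4f}")
    print(f"fine-grid GCI = {100 * gci_fine:.2f} %")
    ```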

  1. Nuclear Data Uncertainties for Typical LWR Fuel Assemblies and a Simple Reactor Core

    NASA Astrophysics Data System (ADS)

    Rochman, D.; Leray, O.; Hursin, M.; Ferroukhi, H.; Vasiliev, A.; Aures, A.; Bostelmann, F.; Zwermann, W.; Cabellos, O.; Diez, C. J.; Dyrda, J.; Garcia-Herranz, N.; Castro, E.; van der Marck, S.; Sjöstrand, H.; Hernandez, A.; Fleming, M.; Sublet, J.-Ch.; Fiorito, L.

    2017-01-01

    The impact of the covariances in current nuclear data libraries such as ENDF/B-VII.1, JEFF-3.2, JENDL-4.0, SCALE and TENDL on relevant current reactors is presented in this work. The uncertainties due to nuclear data are calculated for existing PWR and BWR fuel assemblies (with burn-up up to 40 GWd/tHM, followed by 10 years of cooling time) and for a simplified PWR full core model (without burn-up) for quantities such as k∞, macroscopic cross sections, pin power and isotope inventory. In this work, the method of propagation of uncertainties is based on random sampling of nuclear data, either from covariance files or directly from basic parameters. Additionally, possible biases in calculated quantities, such as those arising from the self-shielding treatment, are investigated. Different calculation schemes are used, based on CASMO, SCALE, DRAGON, MCNP or FISPACT-II, thus simulating real-life assignments for technical-support organizations. The outcome of such a study is a comparison of uncertainties with two consequences. First, although this study is not expected to yield identical results across the calculation schemes involved, it provides insight into what can happen when calculating uncertainties and gives some perspective on the range of validity of these uncertainties. Second, it allows a picture to be drawn of the current state of knowledge, using existing nuclear data library covariances and current methods.
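
    A toy illustration of the random-sampling approach (not an actual lattice calculation): draw correlated cross-section perturbations from an assumed covariance matrix and observe the induced spread in a one-group k-infinity:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Sample nu*Sigma_f and Sigma_a from an assumed covariance (values are
    # illustrative, not evaluated nuclear data) and look at the spread in
    # k-infinity = nu*Sigma_f / Sigma_a.
    mean = np.array([0.0070, 0.0060])            # nu*Sigma_f, Sigma_a (1/cm)
    rel_cov = np.array([[0.0004, 0.0001],        # 2% and 1.5% rel. std. dev.,
                        [0.0001, 0.000225]])     # weakly correlated
    cov = rel_cov * np.outer(mean, mean)

    samples = rng.multivariate_normal(mean, cov, size=50_000)
    k_inf = samples[:, 0] / samples[:, 1]
    print(f"k-inf = {k_inf.mean():.4f} "
          f"+/- {100 * k_inf.std() / k_inf.mean():.2f} %")
    ```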

  2. Exemplifying the Effects of Parameterization Shortcomings in the Numerical Simulation of Geological Energy and Mass Storage

    NASA Astrophysics Data System (ADS)

    Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk

    2016-04-01

    Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC-) processes in the subsurface have been conducted for decades. Often, such simulations are commenced by applying a parameter set that is as realistic as possible. Then, a base scenario is calibrated on field observations. Finally, scenario simulations can be performed, for instance to forecast the system behavior after varying input data. In the context of subsurface energy and mass storage, however, these model calibrations based on field data are often not available, as such storage operations have not yet been carried out. Consequently, the numerical models merely rely on the parameter set initially selected, and uncertainties arising from a lack of parameter values or of process understanding may not be perceivable, let alone quantifiable. Therefore, conducting THMC simulations in the context of energy and mass storage deserves a particular review of the model parameterization with its input data, and such a review so far hardly exists to the required extent. Variability or aleatory uncertainty exists for geoscientific parameter values in general, and parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically, thereby exhibiting statistical uncertainty. In this case, sensitivity analyses can be conducted to quantify the uncertainty in the simulation resulting from varying this parameter. For other parameters, the lack of data quantity and quality implies a fundamental change in the ongoing processes when such a parameter value is varied in numerical scenario simulations. As an example of such scenario uncertainty, varying the capillary entry pressure, one of the multiphase flow parameters, can either allow or completely inhibit the penetration of an aquitard by gas. As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded as recognized ignorance by the authors of this study, as no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties describing the degree of understanding of processes such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist, or recognized ignorance must even be attested to a parameter or process in question, the outcomes of simulations mainly depend on the modeler's decisions in choosing parameter values or in judging which processes occur. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, amended by a compilation of available geoscientific data to parameterize such simulations, will be presented in this study.

  3. The impact of shale gas on the cost and feasibility of meeting climate targets—A global energy system model analysis and an exploration of uncertainties

    DOE PAGES

    Few, Sheridan; Gambhir, Ajay; Napp, Tamaryn; ...

    2017-01-27

    There exists considerable uncertainty over both shale and conventional gas resource availability and extraction costs, as well as the fugitive methane emissions associated with shale gas extraction and its possible role in mitigating climate change. This study uses a multi-region energy system model, TIAM (TIMES integrated assessment model), to consider the impact of a range of conventional and shale gas cost and availability assessments on mitigation scenarios aimed at achieving a limit to global warming of below 2 °C in 2100, with a 50% likelihood. When adding shale gas to the global energy mix, the reduction to the global energy system cost is relatively small (up to 0.4%), and the mitigation cost increases by 1%–3% under all cost assumptions. The impact of a “dash for shale gas”, of unavailability of carbon capture and storage, of increased barriers to investment in low carbon technologies, and of higher than expected leakage rates, are also considered; and are each found to have the potential to increase the cost and reduce feasibility of meeting global temperature goals. Finally, we conclude that the extraction of shale gas is not likely to significantly reduce the effort required to mitigate climate change under globally coordinated action, but could increase required mitigation effort if not handled sufficiently carefully.

  4. Analysis of typical WWER-1000 severe accident scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorokin, Yu.S.; Shchekoldin, V.V.; Borisov, L.N.

    2004-07-01

    EDO 'Gidropress' has accumulated considerable experience in analyzing severe accidents of WWER reactor plants using both domestic and foreign codes. Important data have also been obtained from calculational modeling of integrated experiments involving the melting of fuel assemblies containing real fuel. Systematizing these data and taking them into account in code development and validation is extremely important, given the large uncertainty that still exists in understanding and adequately describing the phenomenology of severe accidents. This report compares severe accident analysis results for a WWER-1000 reactor plant for two typical scenarios, obtained with the American MELCOR code and the Russian RATEG/SVECHA/HEFEST code. Calculations with both codes are also compared against data from the FPT1 experiment, in which a fuel assembly containing real fuel was melted at the Phebus facility (France). The results are considered from the viewpoint of: the adequacy of the calculated modeling of individual phenomena during severe accidents of WWER reactor plants with these codes; the influence of uncertainties (level of detail of the calculation models, choice of model parameters, etc.); the choice of setup variables (options) in the codes used; and the need for detailed modeling of processes and phenomena as applied to the design justification of WWER reactor plant safety.

  5. A place for genetic uncertainty: parents valuing an unknown in the meaning of disease.

    PubMed

    Whitmarsh, Ian; Davis, Arlene M; Skinner, Debra; Bailey, Donald B

    2007-09-01

    Klinefelter, Turner, and fragile X syndromes are conditions defined by a genetic or chromosomal variant. The timing of diagnosis, tests employed, specialists involved, symptoms evident, and prognoses available vary considerably within and across these syndromes, but all three share in common a diagnosis verified through a molecular or cytogenetic test. The genetic or chromosomal variant identified designates a syndrome, even when symptoms associated with the particular syndrome are absent. This article analyzes interviews conducted with parents and grandparents of children with these syndromes from across the USA to explore how they interpret a confirmed genetic diagnosis that is associated with a range of possible symptoms that may never be exhibited. Parents' responses indicate that they see the genetic aspects of the syndrome as stable, permanent, and authoritative. But they allow, and even embrace, uncertainty about the condition by focusing on variation between diagnosed siblings, the individuality of their diagnosed child, his or her accomplishments, and other positive aspects that go beyond the genetic diagnosis. Some families counter the genetic diagnosis by arguing that in the absence of symptoms, the syndrome does not exist. They use their own expertise to question the perceived certainty of the genetic diagnosis and to employ the diagnosis strategically. These multiple and often conflicting evaluations of the diagnostic label reveal the rich ways families make meaning of the authority attributed to genetic diagnosis.

  6. The Value of Heterogeneity for Cost-Effectiveness Subgroup Analysis

    PubMed Central

    Manca, Andrea; Claxton, Karl; Sculpher, Mark J.

    2014-01-01

    This article develops a general framework to guide the use of subgroup cost-effectiveness analysis for decision making in a collectively funded health system. In doing so, it addresses 2 key policy questions, namely, the identification and selection of subgroups, while distinguishing 2 sources of potential value associated with heterogeneity. These are 1) the value of revealing the factors associated with heterogeneity in costs and outcomes using existing evidence (static value) and 2) the value of acquiring further subgroup-related evidence to resolve the uncertainty given the current understanding of heterogeneity (dynamic value). Consideration of these 2 sources of value can guide subgroup-specific treatment decisions and inform whether further research should be conducted to resolve uncertainty to explain variability in costs and outcomes. We apply the proposed methods to a cost-effectiveness analysis for the management of patients with acute coronary syndrome. This study presents the expected net benefits under current and perfect information when subgroups are defined based on the use and combination of 6 binary covariates. The results of the case study confirm the theoretical expectations. As more subgroups are considered, the marginal net benefit gains obtained under the current information show diminishing marginal returns, and the expected value of perfect information shows a decreasing trend. We present a suggested algorithm that synthesizes the results to guide policy. PMID:24944196
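
    The dynamic value described here is typically quantified as the expected value of perfect information, EVPI = E[max_d NB(d, θ)] - max_d E[NB(d, θ)]. A minimal sketch with illustrative net-benefit distributions for two strategies:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n = 100_000

    # Net monetary benefit of two treatments under parameter uncertainty
    # (distributions illustrative only). EVPI is the expected gain from
    # resolving all uncertainty before choosing.
    nb = np.column_stack([rng.normal(10_000, 3_000, n),   # treatment A
                          rng.normal(10_500, 4_000, n)])  # treatment B

    ev_current_info = nb.mean(axis=0).max()   # choose best on expectation
    ev_perfect_info = nb.max(axis=1).mean()   # choose best per realization
    print(f"EVPI per patient = {ev_perfect_info - ev_current_info:,.0f}")
    ```

    Computed within subgroups, the same quantity drives the diminishing-returns pattern the case study reports: finer subgroups add static value, but each adds less residual EVPI.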

  7. The sequestration switch: removing industrial CO2 by direct ocean absorption.

    PubMed

    Ametistova, Lioudmila; Twidell, John; Briden, James

    2002-04-22

    This review paper considers direct injection of industrial CO2 emissions into the mid-water oceanic column below 500 m depth. Such a process is a potential candidate for switching atmospheric carbon emissions directly to long term sequestration, thereby relieving the intermediate atmospheric burden. Given sufficient research justification, the argument is that harmful impact in both the Atmosphere and the biologically rich upper marine layer could be reduced. The paper aims to estimate the role that active intervention, through direct ocean CO2 storage, could play and to outline further research and assessment for the strategy to be a viable option for climate change mitigation. The attractiveness of direct ocean injection lies in its bypassing of the Atmosphere and upper marine region, its relative permanence, its practicability using existing technologies and its quantification. The difficulties relate to the uncertainty of some fundamental scientific issues, such as plume dynamics, lowered pH of the exposed waters and associated ecological impact, the significant energy penalty associated with the necessary engineering plant and the uncertain costs. Moreover, there are considerable uncertainties regarding related international marine law. Development of the process would require acceptance of the evidence for climate change, strict requirements for large industrial consumers of fossil fuel to reduce CO2 emissions into the Atmosphere and scientific evidence for the overall beneficial impact of ocean sequestration.

  8. The impact of shale gas on the cost and feasibility of meeting climate targets—A global energy system model analysis and an exploration of uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Few, Sheridan; Gambhir, Ajay; Napp, Tamaryn

    There exists considerable uncertainty over both shale and conventional gas resource availability and extraction costs, as well as the fugitive methane emissions associated with shale gas extraction and its possible role in mitigating climate change. This study uses a multi-region energy system model, TIAM (TIMES integrated assessment model), to consider the impact of a range of conventional and shale gas cost and availability assessments on mitigation scenarios aimed at achieving a limit to global warming of below 2 °C in 2100, with a 50% likelihood. When adding shale gas to the global energy mix, the reduction to the global energy system cost is relatively small (up to 0.4%), and the mitigation cost increases by 1%–3% under all cost assumptions. The impact of a “dash for shale gas”, of unavailability of carbon capture and storage, of increased barriers to investment in low carbon technologies, and of higher than expected leakage rates, are also considered; and are each found to have the potential to increase the cost and reduce feasibility of meeting global temperature goals. Finally, we conclude that the extraction of shale gas is not likely to significantly reduce the effort required to mitigate climate change under globally coordinated action, but could increase required mitigation effort if not handled sufficiently carefully.

  9. The value of heterogeneity for cost-effectiveness subgroup analysis: conceptual framework and application.

    PubMed

    Espinoza, Manuel A; Manca, Andrea; Claxton, Karl; Sculpher, Mark J

    2014-11-01

    This article develops a general framework to guide the use of subgroup cost-effectiveness analysis for decision making in a collectively funded health system. In doing so, it addresses 2 key policy questions, namely, the identification and selection of subgroups, while distinguishing 2 sources of potential value associated with heterogeneity. These are 1) the value of revealing the factors associated with heterogeneity in costs and outcomes using existing evidence (static value) and 2) the value of acquiring further subgroup-related evidence to resolve the uncertainty given the current understanding of heterogeneity (dynamic value). Consideration of these 2 sources of value can guide subgroup-specific treatment decisions and inform whether further research should be conducted to resolve uncertainty to explain variability in costs and outcomes. We apply the proposed methods to a cost-effectiveness analysis for the management of patients with acute coronary syndrome. This study presents the expected net benefits under current and perfect information when subgroups are defined based on the use and combination of 6 binary covariates. The results of the case study confirm the theoretical expectations. As more subgroups are considered, the marginal net benefit gains obtained under the current information show diminishing marginal returns, and the expected value of perfect information shows a decreasing trend. We present a suggested algorithm that synthesizes the results to guide policy. © The Author(s) 2014.

  10. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to compare existing usability data to ideal goals or to data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments during the evaluation process. This paper presents a universal method of usability evaluation combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here derives an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, using AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines vague judgments from multiple stages of a product evaluation process.
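
    A compact sketch of the two stages, with entirely illustrative judgments: AHP weights from the principal eigenvector of a pairwise comparison matrix, combined with a fuzzy membership matrix over appraisal grades:

    ```python
    import numpy as np

    # AHP: weights of effectiveness / efficiency / satisfaction from a
    # pairwise comparison matrix (judgments are illustrative).
    A = np.array([[1.0, 2.0, 3.0],
                  [1/2, 1.0, 2.0],
                  [1/3, 1/2, 1.0]])
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = w / w.sum()                      # principal-eigenvector weights

    # Fuzzy evaluation: membership of each component in the grades
    # {poor, fair, good, excellent}, e.g. elicited from an expert panel.
    R = np.array([[0.0, 0.2, 0.5, 0.3],  # effectiveness
                  [0.1, 0.3, 0.4, 0.2],  # efficiency
                  [0.0, 0.1, 0.6, 0.3]]) # satisfaction

    B = w @ R                            # composite fuzzy appraisal
    print("weights:", np.round(w, 3))
    print("grade memberships:", np.round(B / B.sum(), 3))
    ```

    The weighted-average composition operator used here is one common choice; max-min composition is another, and which is appropriate depends on how compensatory the evaluation should be.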

  11. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  12. [Practical experiences in legal counseling of foreign workers].

    PubMed

    Pestalozzi-Seger, G

    1992-09-01

    When foreign workers ask for legal advice, their questions very often concern insurance rights in the event of disability. Most uncertainties concern specific clauses in the legislation on disability insurance and the measurement of disability. Disputes arise primarily from controversy over claims made to the state disability insurance. The legislation on disability insurance establishes strict requirements for foreigners claiming disability insurance benefits. However, because the social security agreements that have been signed with over 20 nations are more tolerant in terms of disability insurance, the strict Swiss legislation applies only to a minority of foreigners. That is why the system of legislation has become so complex. Two major points must be strictly observed: first, reintegration measures can start only if the prescribed minimum duration of contributions is met; second, proceedings for disability pensions can be initiated only after the currently valid waiting period. In both cases, it is considerably important that the patient has a domicile in Switzerland or a valid residence permit. Numerous disagreements can arise during the evaluation of the degree of disability, as certain factors that are not directly linked to the disability, such as language problems, lack of education, or the labour market situation, are not taken into consideration.

  13. Characterisation of a reference site for quantifying uncertainties related to soil sampling.

    PubMed

    Barbizzi, Sabrina; de Zorzi, Paolo; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter

    2004-01-01

    The paper reports a methodology adopted to address problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the "fit-for-purpose" method; and (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.

  14. Detectability limit and uncertainty considerations for laser induced fluorescence spectroscopy in flames

    NASA Technical Reports Server (NTRS)

    Daily, J. W.

    1978-01-01

    Laser induced fluorescence spectroscopy of flames is discussed, and derived uncertainty relations are used to calculate detectability limits due to statistical errors. Interferences due to Rayleigh scattering from molecules as well as Mie scattering and incandescence from particles have been examined for their effect on detectability limits. Fluorescence trapping is studied, and some methods for reducing the effect are considered. Fluorescence trapping places an upper limit on the number density of the fluorescing species that can be measured without signal loss.

  15. Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector

    NASA Astrophysics Data System (ADS)

    Lenel, U. R.; Davies, D. G. S.; Moore, M. A.

    An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.

  16. Picosecond timing resolution detection of γ-photons utilizing microchannel-plate detectors: experimental tests of quantum nonlocality and photon localization

    NASA Astrophysics Data System (ADS)

    Irby, Victor D.

    2004-09-01

    The concept and subsequent experimental verification of the proportionality between pulse amplitude and detector transit time for microchannel-plate detectors is presented. This discovery has led to considerable improvement in the overall timing resolution for detection of high-energy γ-photons. Utilizing a 22Na positron source, a full width half maximum (FWHM) timing resolution of 138 ps has been achieved. This FWHM includes detector transit-time spread for both chevron-stack-type detectors, timing spread due to uncertainties in annihilation location, all electronic uncertainty and any remaining quantum mechanical uncertainty. The first measurement of the minimum quantum uncertainty in the time interval between detection of the two annihilation photons is reported. The experimental results give strong evidence against instantaneous spatial localization of γ-photons due to measurement-induced nonlocal quantum wavefunction collapse. The experimental results are also the first that imply momentum is conserved only after the quantum uncertainty in time has elapsed (Yukawa H 1935 Proc. Phys. Math. Soc. Japan 17 48).

  17. The effects of geometric uncertainties on computational modelling of knee biomechanics

    NASA Astrophysics Data System (ADS)

    Meng, Qingen; Fisher, John; Wilcox, Ruth

    2017-08-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be made very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models.

  18. Planning Under Continuous Time and Resource Uncertainty: A Challenge for AI

    NASA Technical Reports Server (NTRS)

    Bresina, John; Dearden, Richard; Meuleau, Nicolas; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)

    2002-01-01

    There has been considerable work in AI on decision-theoretic planning and planning under uncertainty. Unfortunately, all of this work suffers from one or more of the following limitations: 1) it relies on very simple models of actions and time, 2) it assumes that uncertainty is manifested in discrete action outcomes, and 3) it is only practical for very small problems. For many real world problems, these assumptions fail to hold. A case in point is planning the activities for a Mars rover. For this domain none of the above assumptions are valid: 1) actions can be concurrent and have differing durations, 2) there is uncertainty concerning action durations and consumption of continuous resources like power, and 3) typical daily plans involve on the order of a hundred actions. We describe the rover problem, discuss previous work on planning under uncertainty, and present a detailed, but very small, example illustrating some of the difficulties of finding good plans.

  19. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  20. Behavior and design of large structural concrete bridge pier overhangs.

    DOT National Transportation Integrated Search

    1997-02-01

    In designing large cantilever bent caps for use on recent projects under current AASHTO design specifications, designers were faced with considerable uncertainties. Questions arose when designers attempted to satisfy both serviceability and stren...

  1. Simplified methods for real-time prediction of storm surge uncertainty: The city of Venice case study

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi

    2014-09-01

    Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it has also become increasingly important to estimate the uncertainty associated with a storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach entails a considerable, if not prohibitive, computational cost for real-time application. In this paper we describe two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this leads to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances extracted directly from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast for the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances closely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
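
    The second approach can be sketched as a regression problem: calibrate the weights of a linear combination of meteorological variances against the forecast variance of a full ensemble, then reuse the weights for cheap real-time estimates. All data below are synthetic placeholders:

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    n = 2000

    # Synthetic stand-ins: per-event meteorological variances (wind stress,
    # pressure) and the "reference" surge forecast variance from a full
    # hydrodynamic ensemble, used once to calibrate the weights.
    var_wind = rng.gamma(2.0, 1.0, n)
    var_pres = rng.gamma(2.0, 0.5, n)
    var_surge = 0.8 * var_wind + 1.5 * var_pres + rng.normal(0.0, 0.2, n)

    X = np.column_stack([var_wind, var_pres])
    a, *_ = np.linalg.lstsq(X, var_surge, rcond=None)
    print(f"calibrated weights: {a.round(2)}")
    print("real-time estimate: sigma2_surge ~= "
          f"{a[0]:.2f}*var_wind + {a[1]:.2f}*var_pres")
    ```

    Once calibrated, the estimate requires only the meteorological fields already in hand, which is where the computational saving comes from.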

  2. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2014-09-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties in the end results. We estimate uncertainties in economic data, multi-pollutant emission statistics and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. The economic data have a relatively small impact on uncertainty at the global and national level, while much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production based emissions, since the largest uncertainties are due to metric and emissions which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±9-±27% using the global temperature potential with a 50 year time horizon, with metric uncertainties dominating. National level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9-±25%, with metric and emissions uncertainties contributing similarly. The Absolute global temperature potential with a 50 year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.

  3. The impact of lake and reservoir parameterization on global streamflow simulation.

    PubMed

    Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke

    2017-05-01

    Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
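
    For reference, the two skill metrics used in the study can be computed as follows (NSE per Nash and Sutcliffe; KGE in its 2009 formulation); the streamflow series here is synthetic:

    ```python
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency."""
        return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

    def kge(sim, obs):
        """Kling-Gupta efficiency (2009 formulation)."""
        r = np.corrcoef(sim, obs)[0, 1]
        alpha = sim.std() / obs.std()
        beta = sim.mean() / obs.mean()
        return 1.0 - np.sqrt((r - 1)**2 + (alpha - 1)**2 + (beta - 1)**2)

    # Synthetic daily streamflow for illustration only.
    rng = np.random.default_rng(11)
    obs = np.exp(rng.normal(2.0, 0.8, 365))
    sim = obs * rng.normal(1.05, 0.15, 365)
    print(f"NSE = {nse(sim, obs):.2f}, KGE = {kge(sim, obs):.2f}")
    ```

    KGE decomposes skill into correlation, variability, and bias terms, which is why the two metrics can rank the same lake or reservoir parameterization differently, as seen in the reported percentages.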

  4. Local setup errors in image-guided radiotherapy for head and neck cancer patients immobilized with a custom-made device.

    PubMed

    Giske, Kristina; Stoiber, Eva M; Schwarz, Michael; Stoll, Armin; Muenter, Marc W; Timke, Carmen; Roeder, Falk; Debus, Juergen; Huber, Peter E; Thieke, Christian; Bendl, Rolf

    2011-06-01

    To evaluate the local positioning uncertainties during fractionated radiotherapy of head-and-neck cancer patients immobilized using a custom-made fixation device and discuss the effect of possible patient correction strategies for these uncertainties. A total of 45 head-and-neck patients underwent regular control computed tomography scanning using an in-room computed tomography scanner. The local and global positioning variations of all patients were evaluated by applying a rigid registration algorithm. One bounding box around the complete target volume and nine local registration boxes containing relevant anatomic structures were introduced. The resulting uncertainties for a stereotactic setup and the deformations referenced to one anatomic local registration box were determined. Local deformations of the patients immobilized using our custom-made device were compared with previously published results. Several patient positioning correction strategies were simulated, and the residual local uncertainties were calculated. The patient anatomy in the stereotactic setup showed local systematic positioning deviations of 1-4 mm. The deformations referenced to a particular anatomic local registration box were similar to the reported deformations assessed from patients immobilized with commercially available Aquaplast masks. A global correction, including the rotational error compensation, decreased the remaining local translational errors. Depending on the chosen patient positioning strategy, the remaining local uncertainties varied considerably. Local deformations in head-and-neck patients occur even if an elaborate, custom-made patient fixation method is used. A rotational error correction decreased the required margins considerably. None of the considered correction strategies achieved perfect alignment. Therefore, weighting of anatomic subregions to obtain the optimal correction vector should be investigated in the future. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Conditional uncertainty principle

    NASA Astrophysics Data System (ADS)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We then compare the obtained relations with their existing entropic counterparts and find that the new relations are independent of the existing ones.
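    The relation at the heart of the paper generalizes majorization; for orientation, here is a small sketch of the standard (unconditional) majorization test between two probability vectors, which is the partial order that conditional majorization extends (illustrative only, not the authors' formalism):

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """True if p majorizes q: the descending partial sums of p dominate
    those of q (both vectors assumed to sum to the same total)."""
    p = np.sort(np.asarray(p, float))[::-1]
    q = np.sort(np.asarray(q, float))[::-1]
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

# The uniform distribution is majorized by every distribution of the same size:
print(majorizes([0.7, 0.2, 0.1], [1/3, 1/3, 1/3]))  # True
```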

  6. Cost-Effectiveness of Orthogeriatric and Fracture Liaison Service Models of Care for Hip Fracture Patients: A Population-Based Study.

    PubMed

    Leal, Jose; Gray, Alastair M; Hawley, Samuel; Prieto-Alhambra, Daniel; Delmestri, Antonella; Arden, Nigel K; Cooper, Cyrus; Javaid, M Kassim; Judge, Andrew

    2017-02-01

    Fracture liaison services are recommended as a model of best practice for organizing patient care and secondary fracture prevention for hip fracture patients, although variation exists in how such services are structured. There is considerable uncertainty as to which model is most cost-effective and should therefore be mandated. This study evaluated the cost-effectiveness of orthogeriatric (OG)- and nurse-led fracture liaison service (FLS) models of post-hip fracture care compared with usual care. Analyses were conducted from a health care and personal social services payer perspective, using a Markov model to estimate the lifetime impact of the models of care. The base-case population consisted of men and women aged 83 years with a hip fracture. The risk and costs of hip and non-hip fractures were derived from large primary and hospital care data sets in the UK. Utilities were informed by a meta-regression of 32 studies. In the base-case analysis, the orthogeriatric-led service was the most effective and cost-effective model of care at a threshold of £30,000 per quality-adjusted life year (QALY) gained. For women aged 83 years, the OG-led service was the most cost-effective at £22,709/QALY. If only health care costs are considered, the OG-led service was cost-effective at £12,860/QALY and £14,525/QALY for women and men aged 83 years, respectively. Irrespective of how patients were stratified in terms of their age, sex, and Charlson comorbidity score at index hip fracture, our results suggest that introducing an orthogeriatrician-led or a nurse-led FLS is cost-effective when compared with usual care. Although considerable uncertainty remains concerning which of the models of care should be preferred, introducing an orthogeriatrician-led service seems to be the most cost-effective service to pursue. © 2016 American Society for Bone and Mineral Research.
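    The decision rule behind statements such as "cost-effective at £22,709/QALY" reduces to comparing an incremental cost-effectiveness ratio (ICER) with a willingness-to-pay threshold; a minimal sketch with invented numbers:

```python
# An intervention is deemed cost-effective when its incremental cost per
# incremental QALY falls below the willingness-to-pay threshold.
WTP = 30_000  # pounds per QALY gained (the threshold cited in the abstract)

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost per incremental QALY versus the comparator."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical lifetime costs and QALYs for a new service vs. usual care:
ratio = icer(cost_new=15_000, qaly_new=5.2, cost_ref=12_000, qaly_ref=5.0)
print(ratio, ratio <= WTP)  # 15000.0 True -> cost-effective at the threshold
```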

  7. Reimbursement of licensed cell and gene therapies across the major European healthcare markets

    PubMed Central

    Jørgensen, Jesper; Kefalas, Panos

    2015-01-01

    Objective The aim of this research is to identify the pricing, reimbursement, and market access (P&R&MA) considerations most relevant to advanced therapy medicinal products (ATMPs) in the Big5EU, and to inform their manufacturers about the key drivers for securing adoption at a commercially viable reimbursed price. Methodology The research was structured following three main steps: 1) Identifying the market access pathways relevant to ATMPs through secondary research; 2) Validating the secondary research findings and addressing any data gaps in primary research, by qualitative interviews with national, regional, and local-level payers and their clinical and economic advisors; 3) Collating primary and secondary findings to compare results across countries. Results The incremental clinical benefit forms the basis for all P&R&MA processes. Budget impact is a key consideration, regardless of geography. Cost-effectiveness analyses are increasingly applied; however, only the United Kingdom has a defined threshold that links the cost per quality-adjusted life year (QALY) specifically and methodologically to the reimbursed price. Funding mechanisms to enable adoption of new and more expensive therapies exist in all countries, albeit to varying extents. Willingness to pay is typically higher in smaller patient populations, especially in populations with high disease burden. Outcomes modelling and risk-sharing agreements (RSAs) provide strategies to address the data gaps and uncertainties often associated with trials in niche populations. Conclusions The high cost of ATMPs, coupled with the uncertainty at launch around their long-term claims, presents challenges for their adoption at a commercially viable reimbursed price. Targeting populations with high disease burden and unmet needs may be advantageous, as the potential for improvement in clinical benefit is greater, as is the potential for capitalising on healthcare cost offsets. Targeting small populations can also help reduce both payers' budget impact concerns and the risk of reimbursement restrictions being imposed. PMID:27123175

  8. Uncertainties in climate data sets

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1992-01-01

    Climate diagnostics are constructed from either analyzed fields or from observational data sets. Those that have been commonly used are normally considered ground truth. However, in most of these collections there are errors and uncertainties that are generally ignored because of consistent usage over time. Examples of uncertainties and errors are described in NMC and ECMWF analyses and in satellite observational data sets: OLR, TOVS, and SMMR. It is suggested that these errors can be large, systematic, and not negligible in climate analysis.

  9. Inter-comparison of interpolated background nitrogen dioxide concentrations across Greater Manchester, UK

    NASA Astrophysics Data System (ADS)

    Lindley, S. J.; Walsh, T.

    There are many modelling methods dedicated to the estimation of spatial patterns in pollutant concentrations, each with their distinctive advantages and disadvantages. The derivation of a surface of air quality values from monitoring data alone requires the conversion of point-based data from a limited number of monitoring stations to a continuous surface using interpolation. Since interpolation techniques estimate data at unsampled points from calculated relationships between data measured at a number of known sample points, they are subject to some uncertainty, both in terms of the values estimated and their spatial distribution. These uncertainties, which are incorporated into many empirical and semi-empirical mapping methodologies, should be recognised in any further usage of the data and also in the assessment of the extent of an exceedence of an air quality standard and the degree of exposure this may represent. There is a wide range of available interpolation techniques, and the differences in their characteristics result in variations in the output surfaces estimated from the same set of input points. The work presented in this paper examines these uncertainties through the application of a number of interpolation techniques available in standard GIS packages to a case study nitrogen dioxide data set for the Greater Manchester conurbation in northern England. The implications of the use of different techniques are discussed through application to hourly concentrations during an air quality episode and annual average concentrations in 2001. Patterns of concentrations demonstrate considerable differences in the estimated spatial pattern of maxima, reflecting the combined effects of chemical processes, topography and meteorology. In the case of air quality episodes, the considerable spatial variability of concentrations results in large uncertainties in the surfaces produced, but these uncertainties vary widely from area to area. In view of the uncertainties associated with classical techniques, research is ongoing to develop alternative methods which should in time help improve the suite of tools available to air quality managers.
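    To make the dependence on technique concrete, here is a toy inverse-distance-weighting interpolator of the kind available in standard GIS packages; changing the power parameter, or switching to kriging or splines, yields visibly different surfaces from the same stations (coordinates and concentrations are invented):

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse-distance-weighted estimates at unsampled points."""
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)        # avoid division by zero at station points
    w = d ** -power                 # nearer stations get larger weights
    return (w @ z_obs) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # invented NO2 sites
no2 = np.array([38.0, 55.0, 47.0])                         # ug m-3, invented
grid = np.array([[0.5, 0.5], [0.9, 0.1]])                  # unsampled locations
print(idw(stations, no2, grid))
```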

  10. Traceable Calibration, Performance Metrics, and Uncertainty Estimates of Minirhizotron Digital Imagery for Fine-Root Measurements

    PubMed Central

    Roberti, Joshua A.; SanClements, Michael D.; Loescher, Henry W.; Ayres, Edward

    2014-01-01

    Even though fine-root turnover is a highly studied topic, it is often poorly understood as a result of uncertainties inherent in its sampling, e.g., quantifying spatial and temporal variability. While many methods exist to quantify fine-root turnover, use of minirhizotrons has increased over the last two decades, making sensor errors another source of uncertainty. Currently, no standardized methodology exists to test and compare minirhizotron camera capability, imagery, and performance. This paper presents a reproducible, laboratory-based method by which minirhizotron cameras can be tested and validated in a traceable manner. The performance of camera characteristics was identified and test criteria were developed: we quantified the precision of camera location for successive images, estimated the trueness and precision of each camera's ability to quantify root diameter and root color, and also assessed the influence of heat dissipation introduced by the minirhizotron cameras and electrical components. We report detailed and defensible metrology analyses that examine the performance of two commercially available minirhizotron cameras. These cameras performed differently with regard to the various test criteria and uncertainty analyses. We recommend a defensible metrology approach to quantify the performance of minirhizotron camera characteristics and determine sensor-related measurement uncertainties prior to field use. This approach is also extensible to other digital imagery technologies. In turn, these approaches facilitate a greater understanding of measurement uncertainties (signal-to-noise ratio) inherent in the camera performance and allow such uncertainties to be quantified and mitigated so that estimates of fine-root turnover can be more confidently quantified. PMID:25391023
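    One generic way to read the paper's test criteria in code is to treat trueness as the mean deviation of repeated readings from a reference target and precision as their spread (a sketch with invented numbers, not the authors' protocol):

```python
import numpy as np

# Repeated root-diameter readings (mm) of one reference target of known size;
# values are invented for illustration.
reference_mm = 1.00
readings_mm = np.array([1.04, 0.98, 1.01, 1.05, 0.97, 1.02])

trueness = readings_mm.mean() - reference_mm   # systematic offset (bias)
precision = readings_mm.std(ddof=1)            # repeatability of the camera
print(f"trueness = {trueness:+.3f} mm, precision = {precision:.3f} mm")
```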

  11. New Capitalism, Risk, and Subjectification in an Early Childhood Classroom

    ERIC Educational Resources Information Center

    Bialostok, Steve; Kamberelis, George

    2010-01-01

    "New capitalism" has been characterized as an economic period in which insecurity, flux, and uncertainty exist in the workplace. Capitalism attempts to tame that uncertainty through risk taking. Taking risks has become what one must do with risk. Economic discourses of embracing risk--thoroughly grounded in the ideologies of neoliberalism--are…

  12. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 15: Technical uncertainty and project complexity as correlates of information use by US industry-affiliated aerospace engineers and scientists: Results of an exploratory investigation

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.

    1993-01-01

    An exploratory study was conducted that investigated the influence of technical uncertainty and project complexity on information use by U.S. industry-affiliated aerospace engineers and scientists. The study utilized survey research in the form of a self-administered mail questionnaire. U.S. aerospace engineers and scientists on the Society of Automotive Engineers (SAE) mailing list served as the study population. The adjusted response rate was 67 percent. The survey instrument is appendix C to this report. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and information use. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and the use of federally funded aerospace R&D. The results of this investigation are relevant to researchers investigating information-seeking behavior of aerospace engineers. They are also relevant to R&D managers and policy planners concerned with transferring the results of federally funded aerospace R&D to the U.S. aerospace industry.

  13. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  14. Error regions in quantum state tomography: computational complexity caused by geometry of quantum states

    NASA Astrophysics Data System (ADS)

    Suess, Daniel; Rudnicki, Łukasz; Maciel, Thiago O.; Gross, David

    2017-09-01

    The outcomes of quantum mechanical measurements are inherently random. It is therefore necessary to develop stringent methods for quantifying the degree of statistical uncertainty about the results of quantum experiments. For the particularly relevant task of quantum state tomography, it has been shown that a significant reduction in uncertainty can be achieved by taking the positivity of quantum states into account. However—the large number of partial results and heuristics notwithstanding—no efficient general algorithm is known that produces an optimal uncertainty region from experimental data, while making use of the prior constraint of positivity. Here, we provide a precise formulation of this problem and show that the general case is NP-hard. Our result leaves room for the existence of efficient approximate solutions, and therefore does not in itself imply that the practical task of quantum uncertainty quantification is intractable. However, it does show that there exists a non-trivial trade-off between optimality and computational efficiency for error regions. We prove two versions of the result: one for frequentist and one for Bayesian statistics.

  15. Estimation and impact assessment of input and parameter uncertainty in predicting groundwater flow with a fully distributed model

    NASA Astrophysics Data System (ADS)

    Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke

    2017-04-01

    Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input is represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology was applied to an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty has a considerable effect on the model predictions and parameter distributions. Additionally, our approach provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. We conclude that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
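    The multiplier idea can be sketched compactly: a single factor scales an uncertain input field, and its posterior is sampled by MCMC. Below, a bare Metropolis sampler stands in for DREAM, and a one-line function stands in for a MODFLOW run; all values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
obs_head = 12.0                      # observed head (m), invented

def model(recharge_multiplier):      # stand-in for a full MODFLOW simulation
    return 10.0 + 2.0 * recharge_multiplier

def log_post(m):                     # flat prior on [0.5, 1.5] + Gaussian likelihood
    if not 0.5 <= m <= 1.5:
        return -np.inf
    return -0.5 * ((model(m) - obs_head) / 0.3) ** 2

chain, m = [], 1.0
for _ in range(5000):                # Metropolis random walk on the multiplier
    prop = m + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(m):
        m = prop
    chain.append(m)
print(np.mean(chain[1000:]), np.std(chain[1000:]))  # posterior after burn-in
```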

  16. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision achievable for a pair of non-commuting observables, and is therefore highly nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are connected by thermal entanglement. We show that temperature and magnetic-field effects can inflate the measurement uncertainty, owing to the reduction of the system's quantum correlation. Notably, we reveal that, first, the uncertainty is not fully determined by the observed quantum correlation of the system and, second, the dynamical behaviors of the measurement uncertainty are distinct for ferromagnetic and antiferromagnetic chains. Meanwhile, we find that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Our work may therefore shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
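    For reference, the memory-assisted entropic uncertainty relation that underlies such studies (the bound of Berta et al.) can be written as

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2\frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j}\,\bigl|\langle x_i | z_j \rangle\bigr|^2 ,
```

    where S(.|B) denotes the conditional von Neumann entropy given the quantum memory B, |x_i> and |z_j> are eigenstates of the two incompatible observables, and the overlap c quantifies their complementarity. A negative S(A|B), signaling entanglement such as the thermal entanglement above, tightens the bound.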

  17. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2011-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars entry vehicles. A survey was conducted of existing experimental heat-transfer and shock-shape data for high enthalpy, reacting-gas CO2 flows and five relevant test series were selected for comparison to predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared to these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  18. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2013-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars-entry vehicles. A survey was conducted of existing experimental heat transfer and shock-shape data for high-enthalpy reacting-gas CO2 flows, and five relevant test series were selected for comparison with predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared with these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  19. Decision analysis of shoreline protection under climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

    If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties? (2) What is the economic loss if climate change uncertainty is ignored? (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.
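    The backward-recursion logic of such an SDP can be sketched in a few lines; the states, transition probabilities, and costs below are invented stand-ins for the paper's far richer model:

```python
import numpy as np

# Two climate states (0 = no change, 1 = changed); decision: build now or wait.
build_cost, horizon = 100.0, 20
loss = np.array([2.0, 8.0])          # per-period flood loss if unprotected
p_change = 0.03                      # annual probability of entering state 1
discount = 0.96

V = np.zeros(2)                      # terminal value
for t in range(horizon):             # backward induction over the horizon
    build = np.full(2, -build_cost)  # building now removes all future losses
    trans = np.array([[1 - p_change, p_change],
                      [0.0, 1.0]])   # climate change is absorbing
    wait = -loss + discount * (trans @ V)
    V = np.maximum(build, wait)      # Bellman step: take the better action
print(V)  # value of the optimal policy in each climate state at time 0
```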

  20. Optimization and resilience in natural resources management

    USGS Publications Warehouse

    Williams, Byron K.; Johnson, Fred A.

    2015-01-01

    We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.

  1. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for environment requires consideration of several indexes of environmental impact including ozone depletion and global warming potentials, human and aquatic toxicity, and photochemical oxidation, and acid rain potentials. Current methodologies like t...

  2. The EEOC's New Equal Pay Act Guidelines.

    ERIC Educational Resources Information Center

    Greenlaw, Paul S.; Kohl, John P.

    1982-01-01

    Analyzes the new guidelines for enforcement of the Equal Pay Act and their implications for personnel management. Argues that there are key problem areas in the new regulations arising from considerable ambiguity and uncertainty about their interpretation. (SK)

  3. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
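    In the spirit of the tutorial (whose examples use MATLAB and R), the same embarrassingly parallel pattern in Python's standard library looks as follows; the per-replication risk model is a placeholder:

```python
import multiprocessing as mp
import random

def run_simulation(seed):
    """One independent replication of a (placeholder) risk simulation."""
    rng = random.Random(seed)
    # e.g., annual loss = sum of random event losses; purely illustrative
    return sum(rng.expovariate(1.0) for _ in range(1000))

if __name__ == "__main__":
    seeds = range(10_000)            # independent replications, one per seed
    with mp.Pool() as pool:          # uses all available cores by default
        losses = pool.map(run_simulation, seeds, chunksize=100)
    losses.sort()
    print("mean:", sum(losses) / len(losses))
    print("95th percentile:", losses[int(0.95 * len(losses))])
```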

  4. Human subjects concerns in ground based ECLSS testing - Managing uncertainty in closely recycled systems

    NASA Technical Reports Server (NTRS)

    Crump, William J.; Janik, Daniel S.; Thomas, L. Dale

    1990-01-01

    U.S. space missions have to this point used water either made on board or carried from Earth and discarded after use. For Space Station Freedom, long duration life support will include air and water recycling using a series of physical-chemical subsystems. The Environmental Control and Life Support System (ECLSS) designed for this application must be tested extensively at all stages of hardware maturity. Human test subjects are required to conduct some of these tests, and the risks associated with the use of development hardware must be addressed. Federal guidelines for protection of human subjects require careful consideration of risks and potential benefits by an Institutional Review Board (IRB) before and during testing. This paper reviews the ethical principles guiding this consideration, details the problems and uncertainties inherent in current hardware testing, and presents an incremental approach to risk assessment for ECLSS testing.

  5. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
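    The SROM idea admits a compact illustration: replace a large Monte Carlo sample of an uncertain input by a handful of support points whose probabilities are optimized to reproduce the target's moments and CDF. A rough sketch under invented data (the paper's framework and solver are more general):

```python
import numpy as np
from scipy.optimize import nnls

# Target: Monte Carlo samples of an uncertain input (invented lognormal).
rng = np.random.default_rng(0)
samples = rng.lognormal(mean=0.0, sigma=0.3, size=20_000)

m = 5                                               # SROM size, small by design
x = np.quantile(samples, np.linspace(0.1, 0.9, m))  # fixed support points

# Choose probabilities p >= 0 to match moments and a few CDF values.
rows = [np.ones(m), x, x**2]                 # sum(p)=1, mean, second moment
targets = [1.0, samples.mean(), (samples**2).mean()]
for q in np.linspace(0.2, 0.8, 4):           # a few CDF matching conditions
    t = np.quantile(samples, q)
    rows.append((x <= t).astype(float))
    targets.append(q)
p, _ = nnls(np.array(rows), np.array(targets))
p /= p.sum()                                 # renormalize to a distribution
print(dict(zip(np.round(x, 3), np.round(p, 3))))
```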

  6. Optimal control problems of epidemic systems with parameter uncertainties: application to a malaria two-age-classes transmission model with asymptomatic carriers.

    PubMed

    Mwanga, Gasper G; Haario, Heikki; Capasso, Vincenzo

    2015-03-01

    The main scope of this paper is to study optimal control practices for malaria, by discussing the implementation of a catalog of optimal control strategies in the presence of parameter uncertainties, which are typical of infectious disease data. In this study we focus on a deterministic mathematical model for the transmission of malaria, including in particular asymptomatic carriers and two age classes in the human population. A partial qualitative analysis of the relevant ODE system has been carried out, leading to a realistic threshold parameter. For the deterministic model under consideration, four possible control strategies have been analyzed: the use of long-lasting treated mosquito nets, indoor residual spraying, screening, and treatment of symptomatic and asymptomatic individuals. The numerical results show that using optimal control the disease can be brought to a stable disease-free equilibrium when all four controls are used. The Incremental Cost-Effectiveness Ratio (ICER) for all possible combinations of the disease-control measures is determined. The numerical simulations of the optimal control in the presence of parameter uncertainty demonstrate the robustness of the optimal control: the main conclusions remain unchanged, even if inevitable variability remains in the control profiles. The results provide a promising framework for the design of cost-effective strategies for disease control with multiple interventions, even under considerable uncertainty in model parameters. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Provider Recommendations in the Face of Scientific Uncertainty: An Analysis of Audio-Recorded Discussions about Vitamin D.

    PubMed

    Tarn, Derjung M; Paterniti, Debora A; Wenger, Neil S

    2016-08-01

    Little is known about how providers communicate recommendations when scientific uncertainty exists. To compare provider recommendations to those in the scientific literature, with a focus on whether uncertainty was communicated. Qualitative (inductive systematic content analysis) and quantitative analysis of previously collected audio-recorded provider-patient office visits. Sixty-one providers and a socio-economically diverse convenience sample of 603 of their patients from outpatient community- and academic-based primary care, integrative medicine, and complementary and alternative medicine provider offices in Southern California. Comparison of provider information-giving about vitamin D to professional guidelines and scientific information for which conflicting recommendations or insufficient scientific evidence exists; certainty with which information was conveyed. Ninety-two (15.3 %) of 603 visit discussions touched upon issues related to vitamin D testing, management and benefits. Vitamin D deficiency screening was discussed with 23 (25 %) patients, the definition of vitamin D deficiency with 21 (22.8 %), the optimal range for vitamin D levels with 26 (28.3 %), vitamin D supplementation dosing with 50 (54.3 %), and benefits of supplementation with 46 (50 %). For each of the professional guidelines/scientific information examined, providers conveyed information that deviated from professional guidelines and the existing scientific evidence. Of 166 statements made about vitamin D in this study, providers conveyed 160 (96.4 %) with certainty, without mention of any equivocal or contradictory evidence in the scientific literature. No uncertainty was mentioned when vitamin D dosing was discussed, even when recommended dosing was higher than guideline recommendations. Providers convey the vast majority of information and recommendations about vitamin D with certainty, even though the scientific literature contains inconsistent recommendations and declarations of inadequate evidence. Not communicating uncertainty blurs the contrast between evidence-based recommendations and those without evidence. Providers should explore best practices for involving patients in decision-making by acknowledging the uncertainty behind their recommendations.

  8. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    PubMed

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Transfer of Satellite Rainfall Uncertainty from Gauged to Ungauged Regions at Regional and Seasonal Timescales

    NASA Technical Reports Server (NTRS)

    Tang, Ling; Hossain, Faisal; Huffman, George J.

    2010-01-01

    Hydrologists and other users need to know the uncertainty of the satellite rainfall data sets across the range of time/space scales over the whole domain of the data set. Here, "uncertainty" refers to the general concept of the "deviation" of an estimate from the reference (or ground truth), where the deviation may be defined in multiple ways. This uncertainty information can provide insight to the user on the realistic limits of utility, such as hydrologic predictability, that can be achieved with these satellite rainfall data sets. However, satellite rainfall uncertainty estimation requires ground validation (GV) precipitation data. On the other hand, satellite data will be most useful over regions that lack GV data, for example developing countries. This paper addresses the open issues in developing an appropriate uncertainty transfer scheme that can routinely estimate various uncertainty metrics across the globe by leveraging a combination of spatially dense GV data and temporally sparse surrogate (or proxy) GV data, such as the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar and the Global Precipitation Measurement (GPM) mission Dual-Frequency Precipitation Radar. The TRMM Multi-satellite Precipitation Analysis (TMPA) products over the US spanning a record of 6 years are used as a representative example of satellite rainfall. It is shown that there exists a quantifiable spatial structure in the uncertainty of satellite data for spatial interpolation. Probabilistic analysis of the sampling offered by the existing constellation of passive microwave sensors indicates that transfer of uncertainty for hydrologic applications may be effective at daily time scales or longer during the GPM era. Finally, a commonly used spatial interpolation technique (kriging), which leverages the spatial correlation of estimation uncertainty, is assessed at climatologic, seasonal, monthly and weekly timescales. It is found that the effectiveness of kriging is sensitive to the type of uncertainty metric, the time scale of transfer and the density of GV data within the transfer domain. Transfer accuracy is lowest at weekly timescales, with the error doubling from monthly to weekly. However, at very low GV data density (<20% of the domain), the transfer accuracy is too low to show any distinction as a function of the timescale of transfer.

  10. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th degree spatial resolution, from two different downscaling procedures, are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of the GCMs and observations suggests that BMA represents the observations more accurately than the individual models. Furthermore, the BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate a significant increase in annual precipitation and temperature, with a varied degree of change across the sub-basins of the CRB. We then characterize the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the main source of uncertainty among those considered. However, downscaling uncertainty contributes considerably to the total uncertainty of future projections, especially in summer. By contrast, downscaling uncertainty appears to be higher than scenario uncertainty for precipitation.
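    The combination step can be sketched generically: BMA forms a weighted mixture of model projections, with weights reflecting each model's historical skill. The sketch below uses a simple Gaussian-likelihood proxy for the weights and invented data; operational BMA implementations (e.g., EM-based) are more involved:

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(10.0, 1.0, size=30)             # invented historical series
models = [obs + rng.normal(b, 1.0, size=30) for b in (0.2, -0.5, 1.5)]

# Weight each model by a Gaussian likelihood of its historical errors.
sigma = 1.0
loglik = np.array([-0.5 * np.sum(((m - obs) / sigma) ** 2) for m in models])
w = np.exp(loglik - loglik.max())
w /= w.sum()                                     # BMA weights sum to one

future = np.array([12.1, 11.4, 13.0])            # invented future projections
bma_mean = w @ future
# Total variance = within-model spread + between-model spread
bma_var = w @ (0.5**2 + (future - bma_mean) ** 2)  # 0.5 = assumed model SD
print(w, bma_mean, bma_var)
```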

  11. Classifying the Sizes of Explosive Eruptions using Tephra Deposits: The Advantages of a Numerical Inversion Approach

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L.; White, J.

    2015-12-01

    Explosive volcanic eruptions are often classified by deposit mass and eruption column height. How well are these eruption parameters determined in older deposits, and how well can we reduce uncertainty using robust numerical and statistical methods? We describe an efficient and effective inversion and uncertainty quantification approach for estimating eruption parameters given a dataset of tephra deposit thickness and granulometry. The inversion and uncertainty quantification are implemented using the open-source PEST++ code. Inversion with PEST++ can be used with a variety of forward models and is applied here using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and the uncertainty analysis support simultaneous analysis of the full eruption and wind-field parameterization. The combined inversion/uncertainty-quantification approach is applied to the 1992 Cerro Negro (Nicaragua), 2011 Kirishima-Shinmoedake (Japan), and 1913 Colima (Mexico) eruptions. These examples show that although eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind-field parameters, such as eruption column height. Supplementing the inversion dataset with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind-field parameters. We think the use of such robust methods provides a better understanding of uncertainty in eruption parameters, and hence eruption classification, than is possible with more qualitative methods that are widely used.

  12. Horsetail matching: a flexible approach to optimization under uncertainty

    NASA Astrophysics Data System (ADS)

    Cook, L. W.; Jarrett, J. P.

    2018-04-01

    It is important to design engineering systems to be robust with respect to uncertainties in the design process. Often, this is done by considering statistical moments, but over-reliance on statistical moments when formulating a robust optimization can produce designs that are stochastically dominated by other feasible designs. This article instead proposes a formulation for optimization under uncertainty that minimizes the difference between a design's cumulative distribution function and a target. A standard target is proposed that produces stochastically non-dominated designs, but the formulation also offers enough flexibility to recover existing approaches for robust optimization. A numerical implementation is developed that employs kernels to give a differentiable objective function. The method is applied to algebraic test problems and a robust transonic airfoil design problem where it is compared to multi-objective, weighted-sum and density matching approaches to robust optimization; several advantages over these existing methods are demonstrated.
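    The core of the formulation is a scalar mismatch between a design's cumulative distribution function and a target; a toy version of that objective, with an invented quantity of interest and target curve:

```python
import numpy as np

def cdf_mismatch(qoi_samples, target_inv_cdf, n=100):
    """L2 distance between the empirical quantiles of a design's quantity of
    interest and a target inverse CDF, evaluated at n probability levels."""
    h = np.linspace(0.01, 0.99, n)
    emp = np.quantile(qoi_samples, h)            # empirical inverse CDF
    return float(np.sqrt(np.mean((emp - target_inv_cdf(h)) ** 2)))

rng = np.random.default_rng(3)
drag = 0.02 + 0.002 * rng.standard_normal(5000)  # invented QoI under uncertainty
target = lambda h: 0.02 + 0.001 * h              # a steep target quantile curve
print(cdf_mismatch(drag, target))  # minimize this over the design variables
```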

  13. Evaluation of harvest and information needs for North American sea ducks

    USGS Publications Warehouse

    Koneff, Mark D.; Zimmerman, Guthrie S.; Dwyer, Chris P.; Fleming, Kathleen K.; Padding, Paul I.; Devers, Patrick K.; Johnson, Fred A.; Runge, Michael C.; Roberts, Anthony J.

    2017-01-01

    Wildlife managers routinely seek to establish sustainable limits of sport harvest or other regulated forms of take while confronted with considerable uncertainty. A growing body of ecological research focuses on methods to describe and account for uncertainty in management decision-making and to prioritize research and monitoring investments to reduce the most influential uncertainties. We used simulation methods incorporating measures of demographic uncertainty to evaluate risk of overharvest and prioritize information needs for North American sea ducks (Tribe Mergini). Sea ducks are popular game birds in North America, yet they are poorly monitored and their population dynamics are poorly understood relative to other North American waterfowl. There have been few attempts to assess the sustainability of harvest of North American sea ducks, and no formal harvest strategy exists in the U.S. or Canada to guide management. The popularity of sea duck hunting, extended hunting opportunity for some populations (i.e., special seasons and/or bag limits), and population declines have led to concern about potential overharvest. We used Monte Carlo simulation to contrast estimates of allowable harvest and observed harvest and assess risk of overharvest for 7 populations of North American sea ducks: the American subspecies of common eider (Somateria mollissima dresseri), eastern and western populations of black scoter (Melanitta americana) and surf scoter (M. perspicillata), and continental populations of white-winged scoter (M. fusca) and long-tailed duck (Clangula hyemalis). We combined information from empirical studies and the opinions of experts through formal elicitation to create probability distributions reflecting uncertainty in the individual demographic parameters used in this assessment. Estimates of maximum growth (rmax), and therefore of allowable harvest, were highly uncertain for all populations. Long-tailed duck and American common eider appeared to be at high risk of overharvest (i.e., observed harvest < allowable harvest in 5–7% and 19–26% of simulations, respectively, depending on the functional form of density dependence), whereas the other populations appeared to be at moderate to low risk (observed harvest < allowable harvest in 22–68% of simulations, again conditional on the form of density dependence). We also evaluated the sensitivity of the difference between allowable and observed harvest estimates to uncertainty in individual demographic parameters to prioritize information needs. We found that uncertainty in overall fecundity had more influence on comparisons of allowable and observed harvest than adult survival or observed harvest for all species except long-tailed duck. Although adult survival was characterized by less uncertainty than individual components of fecundity, it was identified as a high-priority information need given the sensitivity of growth rate and allowable harvest to this parameter. Uncertainty about population size was influential in the comparison of observed and allowable harvest for 5 of the 6 populations where it factored into the assessment. While this assessment highlights a high degree of uncertainty in allowable harvest, it provides a framework for integration of improved data from future research and monitoring. It could also serve as the basis for harvest strategy development as management objectives and regulatory alternatives are specified by the management community.
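    The logic of such a simulation can be sketched with the widely used potential-take formula, allowable harvest = F * rmax * N / 2, propagating parameter distributions by Monte Carlo; the distributions and safety factor below are invented, not those elicited in the study:

```python
import numpy as np

rng = np.random.default_rng(4)
n_sims = 100_000

# Parameter uncertainty expressed as distributions (all values invented):
r_max = rng.lognormal(np.log(0.12), 0.4, n_sims)   # maximum growth rate
N = rng.lognormal(np.log(400_000), 0.3, n_sims)    # population size
F_s = 1.0                                          # management safety factor

allowable = F_s * r_max * N / 2.0                  # potential-take formula
observed = rng.normal(30_000, 5_000, n_sims)       # uncertain harvest estimate

risk = np.mean(observed > allowable)               # P(harvest exceeds allowable)
print(f"risk of overharvest ~ {risk:.0%}")
```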

  14. Evaluation of harvest and information needs for North American sea ducks.

    PubMed

    Koneff, Mark D; Zimmerman, Guthrie S; Dwyer, Chris P; Fleming, Kathleen K; Padding, Paul I; Devers, Patrick K; Johnson, Fred A; Runge, Michael C; Roberts, Anthony J

    2017-01-01

    Wildlife managers routinely seek to establish sustainable limits of sport harvest or other regulated forms of take while confronted with considerable uncertainty. A growing body of ecological research focuses on methods to describe and account for uncertainty in management decision-making and to prioritize research and monitoring investments to reduce the most influential uncertainties. We used simulation methods incorporating measures of demographic uncertainty to evaluate risk of overharvest and prioritize information needs for North American sea ducks (Tribe Mergini). Sea ducks are popular game birds in North America, yet they are poorly monitored and their population dynamics are poorly understood relative to other North American waterfowl. There have been few attempts to assess the sustainability of harvest of North American sea ducks, and no formal harvest strategy exists in the U.S. or Canada to guide management. The popularity of sea duck hunting, extended hunting opportunity for some populations (i.e., special seasons and/or bag limits), and population declines have led to concern about potential overharvest. We used Monte Carlo simulation to contrast estimates of allowable harvest and observed harvest and assess risk of overharvest for 7 populations of North American sea ducks: the American subspecies of common eider (Somateria mollissima dresseri), eastern and western populations of black scoter (Melanitta americana) and surf scoter (M. perspicillata), and continental populations of white-winged scoter (M. fusca) and long-tailed duck (Clangula hyemalis). We combined information from empirical studies and the opinions of experts through formal elicitation to create probability distributions reflecting uncertainty in the individual demographic parameters used in this assessment. Estimates of maximum growth (rmax), and therefore of allowable harvest, were highly uncertain for all populations. Long-tailed duck and American common eider appeared to be at high risk of overharvest (i.e., observed harvest < allowable harvest in 5-7% and 19-26% of simulations, respectively, depending on the functional form of density dependence), whereas the other populations appeared to be at moderate to low risk (observed harvest < allowable harvest in 22-68% of simulations, again conditional on the form of density dependence). We also evaluated the sensitivity of the difference between allowable and observed harvest estimates to uncertainty in individual demographic parameters to prioritize information needs. We found that uncertainty in overall fecundity had more influence on comparisons of allowable and observed harvest than adult survival or observed harvest for all species except long-tailed duck. Although adult survival was characterized by less uncertainty than individual components of fecundity, it was identified as a high-priority information need given the sensitivity of growth rate and allowable harvest to this parameter. Uncertainty about population size was influential in the comparison of observed and allowable harvest for 5 of the 6 populations where it factored into the assessment. While this assessment highlights a high degree of uncertainty in allowable harvest, it provides a framework for integration of improved data from future research and monitoring. It could also serve as the basis for harvest strategy development as management objectives and regulatory alternatives are specified by the management community.

  15. Evaluation of harvest and information needs for North American sea ducks

    PubMed Central

    Dwyer, Chris P.; Fleming, Kathleen K.; Padding, Paul I.; Devers, Patrick K.; Johnson, Fred A.; Runge, Michael C.; Roberts, Anthony J.

    2017-01-01

    Wildlife managers routinely seek to establish sustainable limits of sport harvest or other regulated forms of take while confronted with considerable uncertainty. A growing body of ecological research focuses on methods to describe and account for uncertainty in management decision-making and to prioritize research and monitoring investments to reduce the most influential uncertainties. We used simulation methods incorporating measures of demographic uncertainty to evaluate risk of overharvest and prioritize information needs for North American sea ducks (Tribe Mergini). Sea ducks are popular game birds in North America, yet they are poorly monitored and their population dynamics are poorly understood relative to other North American waterfowl. There have been few attempts to assess the sustainability of harvest of North American sea ducks, and no formal harvest strategy exists in the U.S. or Canada to guide management. The popularity of sea duck hunting, extended hunting opportunity for some populations (i.e., special seasons and/or bag limits), and population declines have led to concern about potential overharvest. We used Monte Carlo simulation to contrast estimates of allowable harvest and observed harvest and assess risk of overharvest for 7 populations of North American sea ducks: the American subspecies of common eider (Somateria mollissima dresseri), eastern and western populations of black scoter (Melanitta americana) and surf scoter (M. perspicillata), and continental populations of white-winged scoter (M. fusca) and long-tailed duck (Clangula hyemalis). We combined information from empirical studies and the opinions of experts through formal elicitation to create probability distributions reflecting uncertainty in the individual demographic parameters used in this assessment. Estimates of maximum growth (rmax), and therefore of allowable harvest, were highly uncertain for all populations. Long-tailed duck and American common eider appeared to be at high risk of overharvest (i.e., observed harvest < allowable harvest in 5–7% and 19–26% of simulations, respectively, depending on the functional form of density dependence), whereas the other populations appeared to be at moderate to low risk (observed harvest < allowable harvest in 22–68% of simulations, again conditional on the form of density dependence). We also evaluated the sensitivity of the difference between allowable and observed harvest estimates to uncertainty in individual demographic parameters to prioritize information needs. We found that uncertainty in overall fecundity had more influence on comparisons of allowable and observed harvest than adult survival or observed harvest for all species except long-tailed duck. Although adult survival was characterized by less uncertainty than individual components of fecundity, it was identified as a high-priority information need given the sensitivity of growth rate and allowable harvest to this parameter. Uncertainty about population size was influential in the comparison of observed and allowable harvest for 5 of the 6 populations where it factored into the assessment. While this assessment highlights a high degree of uncertainty in allowable harvest, it provides a framework for integration of improved data from future research and monitoring. It could also serve as the basis for harvest strategy development as management objectives and regulatory alternatives are specified by the management community. PMID:28419113

  16. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Botterud, Audun; Zhou, Zhi

    In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed integer linear programming problems.

  17. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA determines the overall risk associated with a particular mission by factoring in all known risks to the spacecraft during its mission. The threat to the mission and to human life posed by the micrometeoroid and orbital debris (MMOD) environment is one of these risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the ISS. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. In this paper, we present possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the Shuttle and Station versions of BUMPER II.

  18. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE PAGES

    Liu, Cong; Botterud, Audun; Zhou, Zhi; ...

    2016-10-21

    In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed integer linear programming problems.

  19. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. Similarly, robust population management methods were developed to deal with uncertainties in multiple model parameters, but the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Matrix sensitivities suggested that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was the more robust decision for maintaining a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
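
    A minimal sketch of an info-gap robustness calculation for a two-stage matrix model, assuming a fractional-error uncertainty model around hypothetical nominal rates, with nest marking represented simply as a higher nominal fecundity. Growth is monotone in all three parameters here, so the worst case at each uncertainty horizon lies on the lower envelope:

      import numpy as np

      def lam(f, sj, sa):
          # Dominant eigenvalue of a two-stage projection matrix (toy model).
          return max(abs(np.linalg.eigvals(np.array([[0.0, f], [sj, sa]]))))

      def robustness(f0, sj0, sa0, lam_crit=1.0):
          # Largest uncertainty horizon alpha at which the worst-case growth
          # rate (all parameters reduced by the fraction alpha) still meets
          # the target lam_crit.
          best = 0.0
          for a in np.linspace(0.0, 1.0, 1001):
              if lam(f0 * (1 - a), sj0 * (1 - a), sa0 * (1 - a)) >= lam_crit:
                  best = a
          return best

      # Hypothetical nominal fecundity and survival rates.
      print("no marking :", round(robustness(f0=0.60, sj0=0.55, sa0=0.80), 3))
      print("mark nests :", round(robustness(f0=0.70, sj0=0.55, sa0=0.80), 3))

    In this toy setting the option with the higher nominal fecundity tolerates a larger uncertainty horizon before the worst-case growth rate drops below 1, which is the sense in which it is the more robust decision.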

  20. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  1. Using a Meniscus to Teach Uncertainty in Measurement

    NASA Astrophysics Data System (ADS)

    Backman, Philip

    2008-02-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know something about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is determined or calculated, it remains as only a number without a concrete physical connection back to the experiment. For the activity described here—presented as a challenge—groups of students are given a container and asked to make certain measurements and to estimate the uncertainty in each of those measurements. They are then challenged to complete a particular task involving the container and a volume of water. Whether the assigned task is actually achievable, however, slowly comes into question once the magnitude of the uncertainties in the original measurements is compared to the specific requirements of the challenge.
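
    The point of the activity can be made concrete with a standard propagation-of-uncertainty calculation, here for the volume of a cylindrical container, V = pi*r^2*h; the measurements and uncertainties are hypothetical:

      import math

      r, u_r = 3.50, 0.05   # radius and its uncertainty (cm)
      h, u_h = 12.0, 0.1    # height and its uncertainty (cm)

      V = math.pi * r**2 * h
      # Propagation for V = pi*r^2*h with uncorrelated inputs:
      rel_u = math.sqrt((2 * u_r / r) ** 2 + (u_h / h) ** 2)
      print(f"V = {V:.0f} +/- {rel_u * V:.0f} cm^3 ({rel_u:.1%})")

    Whether the challenge is achievable can then be phrased directly: if the task requires hitting a target volume to within, say, 5 cm^3, the roughly 3% volume uncertainty above (about 14 cm^3) already makes success uncertain.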

  2. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. There are two main uncertainties surrounding land use data: positional and categorical. This paper focuses on the second one, as the first has in general less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth; the main critique is its inability to provide a basic requirement for propagating uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.
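
    For reference, the summary statistics the confusion matrix does support are easy to compute; the critique above is that none of them indicate where on the map the errors occur. A toy example with hypothetical counts (rows are reference classes, columns are mapped classes):

      import numpy as np

      cm = np.array([[50,  5,  2],
                     [ 6, 40,  4],
                     [ 3,  7, 33]])

      overall = np.trace(cm) / cm.sum()
      producers = np.diag(cm) / cm.sum(axis=1)   # per-class, reference side
      users = np.diag(cm) / cm.sum(axis=0)       # per-class, map side
      print(f"overall accuracy: {overall:.2f}")
      print("producer's accuracy:", np.round(producers, 2))
      print("user's accuracy:    ", np.round(users, 2))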

  3. A Practical Approach to Address Uncertainty in Stakeholder Deliberations.

    PubMed

    Gregory, Robin; Keeney, Ralph L

    2017-03-01

    This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decision makers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
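
    A minimal sketch of the certainty-equivalent idea, assuming an exponential (constant risk aversion) utility u(x) = 1 - exp(-x/R); the outcome distribution and the risk tolerance R are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)
      outcomes = rng.normal(100.0, 30.0, 100_000)   # uncertain benefit estimate
      R = 50.0                                      # risk tolerance

      eu = np.mean(1.0 - np.exp(-outcomes / R))     # expected utility
      ce = -R * np.log(1.0 - eu)                    # u^{-1}(expected utility)
      print(f"mean outcome {outcomes.mean():.1f}, certainty equivalent {ce:.1f}")

    The certainty equivalent (about 91 here, versus a mean of 100) is the sure value a stakeholder with that risk attitude should treat as interchangeable with the uncertain consequence, which is what makes it usable inside a deterministic multi-objective comparison.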

  4. Illness uncertainty and treatment motivation in type 2 diabetes patients.

    PubMed

    Apóstolo, João Luís Alves; Viveiros, Catarina Sofia Castro; Nunes, Helena Isabel Ribeiro; Domingues, Helena Raquel Faustino

    2007-01-01

    The aims were to characterize the uncertainty in illness and the motivation for treatment, and to evaluate the relation between these variables, in individuals with type 2 diabetes. This was a descriptive, correlational study using a sample of 62 individuals attending diabetes consultation sessions. The Uncertainty Stress Scale and the Treatment Self-Regulation Questionnaire were used. The individuals with type 2 diabetes presented low levels of uncertainty in illness and high motivation for treatment, with intrinsic motivation stronger than extrinsic motivation. A negative correlation was found between uncertainty about prognosis and treatment and intrinsic motivation. These individuals were already adapted, acting according to the meanings they attribute to the illness. Uncertainty can function as a threat, interfering negatively with the attribution of meaning to illness-related events and with the process of adaptation and motivation to adhere to treatment. Intrinsic motivation seems to be essential for treatment adherence.

  5. Sampling for Chemical Analysis.

    ERIC Educational Resources Information Center

    Kratochvil, Byron; And Others

    1984-01-01

    This review, designed to make analysts aware of uncertainties introduced into analytical measurements during sampling, is organized under these headings: general considerations; theory; standards; and applications related to mineralogy, soils, sediments, metallurgy, atmosphere, water, biology, agriculture and food, medical and clinical areas, oil…

  6. POWER TO DETECT REGIONAL TRENDS IN HABITAT CHARACTERISTICS

    EPA Science Inventory

    The condition of stream habitat draws considerable attention concerning the protection and recovery of salmonid populations in the West. Habitat degradation continues and substantial sums of money are spent on habitat restoration. However, aided by uncertainty concerning the ad...

  7. POWER TO DETECT REGIONAL TRENDS IN PHYSICAL HABITAT

    EPA Science Inventory

    The condition of stream habitat draws considerable attention concerning the protection and recovery of salmonid populations in the West. Habitat degradation continues and substantial sums of money are spent on habitat restoration. However, aided by uncertainty concerning the ad...

  8. Quantum issues in optical communication. [noise reduction in signal reception

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.

    1973-01-01

    Various approaches to the problem of controlling quantum noise, the dominant noise in an optical communications system, are discussed. It is shown that, no matter which way the problem is approached, there always remain uncertainties. These uncertainties exist because, to date, only very few communication problems have been solved in their full quantum form.

  9. I Am Sure There May Be a Planet There: Student Articulation of Uncertainty in Argumentation Tasks

    ERIC Educational Resources Information Center

    Buck, Zoë E.; Lee, Hee-Sun; Flores, Joanna

    2014-01-01

    We investigated how students articulate uncertainty when they are engaged in structured scientific argumentation tasks where they generate, examine, and interpret data to determine the existence of exoplanets. In this study, 302 high school students completed 4 structured scientific arguments that followed a series of computer-model-based…

  10. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2015-05-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties along the entire causal chain. We estimate uncertainties in economic data, multi-pollutant emission statistics, and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. Based on our assumptions, which exclude correlations in the economic data, the uncertainty in the economic data appears to have a relatively small impact on uncertainty at the national level in comparison to emissions and metric uncertainty. Much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production-based emissions since the largest uncertainties are due to metric and emissions which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±10 to ±27 % using the Global Temperature Potential with a 50-year time horizon, with metric uncertainties dominating. National-level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9 to ±25 %, with metric and emission uncertainties contributing similarly. The absolute global temperature potential (AGTP) with a 50-year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.
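
    The propagation logic can be sketched by treating the temperature response as a sum over pollutants of emissions times a metric value, each carrying lognormal uncertainty; the emission and GTP numbers below are placeholders, not the study's data:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      emis = {"CO2": (5000.0, 0.08), "CH4": (20.0, 0.20), "SO2": (8.0, 0.30)}
      gtp50 = {"CO2": (1.0e-6, 0.10), "CH4": (14e-6, 0.35), "SO2": (-40e-6, 0.50)}

      dT = np.zeros(n)
      for gas in emis:
          e_mid, e_sd = emis[gas]          # emissions (Tg) and log-sd
          m_mid, m_sd = gtp50[gas]         # GTP50 (K/Tg) and log-sd
          e = rng.lognormal(np.log(e_mid), e_sd, n)
          m = np.sign(m_mid) * rng.lognormal(np.log(abs(m_mid)), m_sd, n)
          dT += e * m

      lo, med, hi = np.percentile(dT, [5, 50, 95])
      print(f"temperature response: median {med:.2e} K, 90% range [{lo:.2e}, {hi:.2e}] K")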

  11. Sources of Uncertainty and the Interpretation of Short-Term Fluctuations

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Cowtan, K.; Rahmstorf, S.

    2016-12-01

    The alleged significant slowdown in global warming during the first decade of the 21st century, and the appearance of a discrepancy between models and observations, has attracted considerable research attention. We trace the history of this research and show how its conclusions were shaped by several sources of uncertainty and ambiguity about models and observations. We show that as those sources of uncertainty were gradually eliminated by further research, insufficient evidence remained to infer any discrepancy between models and observations or a significant slowing of warming. Specifically, we show that early research had to contend with uncertainties about coverage biases in the global temperature record and biases in the sea surface temperature observations which turned out to have exaggerated the extent of slowing. In addition, uncertainties in the observed forcings were found to have exaggerated the mismatch between models and observations. Further sources of uncertainty that were ultimately eliminated involved the use of incommensurate sea surface temperature data between models and observations and a tacit interpretation of model projections as predictions or forecasts. After all those sources of uncertainty were eliminated, the most recent research finds little evidence for an unusual slowdown or a discrepancy between models and observations. We discuss whether these different kinds of uncertainty could have been anticipated or managed differently, and how one can apply those lessons to future short-term fluctuations in warming.

  12. Cancer Risk Assessment for Space Radiation

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.; Cruz, Angela; Bors, Karen; Curreri, Peter A. (Technical Monitor)

    2001-01-01

    Predicting the occurrence of human cancer following exposure to any agent causing genetic damage is a difficult task. This is because the uncertainty of uniform exposure to the damaging agent, and the uncertainty of uniform processing of that damage within a complex set of biological variables, degrade the confidence of predicting the delayed expression of cancer as a relatively rare event within any given clinically normal individual. The radiation health research priorities for enabling long-duration human exploration of space were established in the 1996 NRC Report entitled 'Radiation Hazards to Crews of Interplanetary Missions: Biological Issues and Research Strategies'. This report emphasized that a 15-fold uncertainty in predicting radiation-induced cancer incidence must be reduced before NASA can commit humans to extended interplanetary missions. That report concluded that the great majority of this uncertainty is biologically based, while a minority is physically based due to uncertainties in radiation dosimetry and radiation transport codes. Since that report, the biologically based uncertainty has remained large, and the relatively small uncertainty associated with radiation dosimetry has increased due to the considerations raised by concepts of microdosimetry. In a practical sense, however, the additional uncertainties introduced by microdosimetry are encouraging since they are in a direction of lowered effective dose absorbed through infrequent interactions of any given cell with the high energy particle component of space radiation. Additional information is contained in the original extended abstract.

  13. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs

    NASA Astrophysics Data System (ADS)

    Bradford, Michael J.

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.
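
    A minimal sketch of such a risk-based multiplier, assuming realized offset gains equal predicted gains times a mean-one lognormal prediction error with a given coefficient of variation (the CV and assurance level are hypothetical):

      import numpy as np
      from scipy.stats import lognorm

      cv, assurance = 0.5, 0.95
      sigma = np.sqrt(np.log(1.0 + cv**2))   # lognormal shape from the CV
      mu = -0.5 * sigma**2                   # so the error has mean exactly 1

      # Multiplier m such that m * (predicted * error) covers the loss with
      # the chosen probability: m = 1 / (lower error quantile).
      q = lognorm.ppf(1.0 - assurance, s=sigma, scale=np.exp(mu))
      print(f"multiplier for {assurance:.0%} assurance: {1.0 / q:.2f}:1")

    With a CV of 0.5 this gives roughly 2.4:1, the same order as the 1.5:1 - 2.5:1 range reported above, though the agreement is illustrative rather than a reproduction of the paper's analysis.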

  14. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis including the once-through cycle (OT), the DUPIC cycle, the MOX cycle and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much by using different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty range of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.

  15. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
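
    The core resampling idea can be sketched by subsampling a continuous series at interval dt over every possible phase and comparing the subsampled means against the full mean; the synthetic, intermittent hourly series below is illustrative only:

      import numpy as np

      rng = np.random.default_rng(3)
      hours = 30 * 24
      rain = np.maximum(0.0, rng.gamma(0.2, 2.0, hours) - 0.1)  # synthetic rates
      true_mean = rain.mean()

      for dt in (1, 3, 6, 12):
          sub_means = np.array([rain[phase::dt].mean() for phase in range(dt)])
          rms = np.sqrt(np.mean((sub_means - true_mean) ** 2))
          print(f"dt = {dt:2d} h: relative sampling error {rms / true_mean:.1%}")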

  16. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs.

    PubMed

    Bradford, Michael J

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.

  17. Cloud Condensation Nuclei Prediction Error from Application of Kohler Theory: Importance for the Aerosol Indirect Effect

    NASA Technical Reports Server (NTRS)

    Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.

    2007-01-01

    In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Köhler theory. Simulations suggest that application of Köhler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10-20%, and 5-50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Köhler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.

  18. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  19. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt different from the previous modeling efforts. The previous ones focused on addressing uncertainty in physical parameters (i.e. soil porosity) while this one aims to deal with uncertainty in mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (i.e. only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  20. Steering the measured uncertainty under decoherence through local PT-symmetric operations

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu

    2018-07-01

    The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics, which intrinsically offers a lower bound with regard to the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we attempt to observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and attractively put forward an effective strategy to manipulate its magnitude of the uncertainty of interest by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that there exist different evolution characteristics of the uncertainty in the channels considered here, i.e. the monotonic behavior in the nonunital channels, and the non-monotonic behavior in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.
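
    As a small illustration of the entropic uncertainty relation behind the letter (here without quantum memory, where the Maassen-Uffink bound H(X) + H(Z) >= -log2 c applies with c = 1/2 for complementary qubit measurements), one can track the left-hand side through a depolarizing channel; the input state is hypothetical:

      import numpy as np

      def shannon(p):
          p = p[p > 1e-12]
          return float(-(p * np.log2(p)).sum())

      def probs(rho, basis):
          return np.real(np.array([v.conj() @ rho @ v for v in basis]))

      z = [np.array([1, 0], complex), np.array([0, 1], complex)]
      x = [np.array([1, 1], complex) / np.sqrt(2),
           np.array([1, -1], complex) / np.sqrt(2)]

      psi = np.array([np.cos(0.3), np.sin(0.3)], complex)
      rho0 = np.outer(psi, psi.conj())
      for p in (0.0, 0.3, 0.6):
          rho = (1 - p) * rho0 + p * np.eye(2) / 2      # depolarizing channel
          lhs = shannon(probs(rho, x)) + shannon(probs(rho, z))
          print(f"p = {p:.1f}: H(X) + H(Z) = {lhs:.3f} (bound 1.000)")

    As the depolarizing strength grows, the summed measurement entropies rise monotonically above the bound; the letter's full setting adds quantum memory and PT-symmetric pre-measurement operations on top of this basic relation.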

  1. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    PubMed

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  2. Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.

    PubMed

    Djulbegovic, Benjamin

    2011-10-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just 1 measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) is at the intersection between epistemology, decision making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision making depends both on analytical, deliberative processes embodied in scientific method (system II), and good human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.

  3. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation.

    PubMed

    Zarr, Robert R

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors.
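
    The roll-up of component uncertainties follows the usual GUM pattern: combine relative standard uncertainties in quadrature and expand with a coverage factor k = 2. A sketch with hypothetical component values, not NIST's actual budget:

      import math

      components = {             # relative standard uncertainties (illustrative)
          "heat flow":   0.004,
          "temperature": 0.006,
          "thickness":   0.003,
          "guard gap":   0.010,
      }
      u_c = math.sqrt(sum(u**2 for u in components.values()))
      U = 2.0 * u_c              # expanded uncertainty, k = 2
      print(f"combined {u_c:.2%}, expanded (k = 2) {U:.2%}")
      for name, u in sorted(components.items(), key=lambda kv: -kv[1]):
          print(f"  {name:12s} share of variance: {(u / u_c) ** 2:.0%}")

    Ranking the variance shares is exactly how an assessment like the one above identifies the dominant component (here the stand-in 'guard gap' term) as the best target for improvement.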

  4. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation

    PubMed Central

    Zarr, Robert R.

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors. PMID:27134779

  5. Uncertainty and Equipoise: At Interplay Between Epistemology, Decision-Making and Ethics

    PubMed Central

    Djulbegovic, Benjamin

    2011-01-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned since it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. Since equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this paper, I show how uncertainty (equipoise) is at the intersection between epistemology, decision-making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision-making depends both on analytical, deliberative processes embodied in scientific method (system II) and “good” human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors, and unavoidable injustice. PMID:21817885

  6. Evaluation strategies and uncertainty calculation of isotope amount ratios measured by MC ICP-MS on the example of Sr.

    PubMed

    Horsky, Monika; Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    This paper critically reviews the state of the art of isotope amount ratio measurements by solution-based multi-collector inductively coupled plasma mass spectrometry (MC ICP-MS) and presents guidelines for corresponding data reduction strategies and uncertainty assessments based on the example of n(87Sr)/n(86Sr) isotope ratios. This ratio shows variation attributable to natural radiogenic processes and mass-dependent fractionation. The applied calibration strategies can display these differences. In addition, a proper statement of uncertainty of measurement, including all relevant influence quantities, is a metrological prerequisite. A detailed instructive procedure for the calculation of combined uncertainties is presented for Sr isotope amount ratios using three different strategies of correction for instrumental isotopic fractionation (IIF): traditional internal correction, standard-sample bracketing, and a combination of both, using Zr as internal standard. Uncertainties are quantified by means of a Kragten spreadsheet approach, including the consideration of correlations between individual input parameters to the model equation. The resulting uncertainties are compared with uncertainties obtained from the partial derivatives approach and Monte Carlo propagation of distributions. We obtain relative expanded uncertainties (U_rel; k = 2) of n(87Sr)/n(86Sr) of < 0.03 % when normalization values are not propagated. A comprehensive propagation, including certified values and the internal normalization ratio in nature, increases relative expanded uncertainties by about a factor of two, and the correction for IIF becomes the major contributor.
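
    A minimal sketch of the Kragten approach mentioned above: perturb each input by its standard uncertainty, record the change in the output as that input's contribution, and combine the contributions in quadrature. The model is a simplified stand-in for bracketing-based IIF correction, and all values are hypothetical rather than real Sr data:

      import numpy as np

      def kragten(f, x, u):
          # Kragten approximation: one-sided perturbation of each input by its
          # standard uncertainty; the output changes are the contributions.
          y0 = f(x)
          contrib = np.array([f(x + np.eye(len(x))[i] * u[i]) - y0
                              for i in range(len(x))])
          return y0, contrib, np.sqrt(np.sum(contrib**2))

      # Toy model: R_corr = R_meas * (R_cert / R_std), simple bracketing.
      f = lambda p: p[0] * (p[1] / p[2])
      x = np.array([0.71030, 0.710248, 0.71040])  # measured, certified, standard
      u = np.array([0.00004, 0.00001, 0.00005])

      y0, contrib, u_c = kragten(f, x, u)
      print(f"ratio {y0:.5f} +/- {2 * u_c:.5f} (U, k = 2)")
      print("variance shares:", np.round(contrib**2 / u_c**2, 2))

    Correlated inputs, which the paper's spreadsheets do consider, would add cross terms to the quadrature sum; the sketch assumes independence.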

  7. Robust root clustering for linear uncertain systems using generalized Lyapunov theory

    NASA Technical Reports Server (NTRS)

    Yedavalli, R. K.

    1993-01-01

    Consideration is given to the problem of matrix root clustering in subregions of a complex plane for linear state space models with real parameter uncertainty. The nominal matrix root clustering theory of Gutman & Jury (1981) using the generalized Liapunov equation is extended to the perturbed matrix case, and bounds are derived on the perturbation to maintain root clustering inside a given region. The theory makes it possible to obtain an explicit relationship between the parameters of the root clustering region and the uncertainty range of the parameter space.
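
    A sketch of the disk-region version of such a test, assuming the region D(-q, r): the eigenvalues of A lie in the disk centred at -q with radius r exactly when the shifted and scaled matrix (A + qI)/r is Schur stable, which a positive definite solution of the discrete Lyapunov equation certifies. The matrix values are illustrative:

      import numpy as np
      from scipy.linalg import solve_discrete_lyapunov

      A = np.array([[-3.0, 1.0],
                    [ 0.0, -2.5]])
      q, r = 3.0, 1.0                             # disk centre -3, radius 1

      As = (A + q * np.eye(2)) / r
      P = solve_discrete_lyapunov(As, np.eye(2))  # As @ P @ As.T - P = -I
      in_disk = bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))
      print("eigenvalues clustered in D(-3, 1):", in_disk)

    The perturbation-bound results described above go one step further, asking how large a structured perturbation of A can be before such a certificate fails; the nominal test is the building block.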

  8. Improved entrance optic for global irradiance measurements with a Brewer spectrophotometer.

    PubMed

    Gröbner, Julian

    2003-06-20

    A new entrance optic for a Brewer spectrophotometer has been designed and tested both in the laboratory and during solar measurements. The integrated cosine response deviates by 2.4% from the ideal, with an uncertainty of ±1%. The systematic uncertainties of global solar irradiance measurements with this new entrance optic are considerably reduced compared with measurements with the traditional design. Simultaneous solar irradiance measurements between the Brewer spectrophotometer and a spectroradiometer equipped with a state-of-the-art shaped diffuser agreed to within ±2% during a five-day measurement period.

  9. Using discrete choice experiments within a cost-benefit analysis framework: some considerations.

    PubMed

    McIntosh, Emma

    2006-01-01

    A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle to the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.

  10. Projecting biodiversity and wood production in future forest landscapes: 15 key modeling considerations.

    PubMed

    Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika

    2017-07-15

    A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does, however, present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture, and economics requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically, we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics - biodiversity and wood biomass production - as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature for the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate methods for decision making under uncertainty from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.

  12. Reynolds-Averaged Turbulence Model Assessment for a Highly Back-Pressured Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, Robert A.; Middleton, Troy F.; Wilson, L. G.

    2012-01-01

    The use of computational fluid dynamics in scramjet engine component development is widespread in the existing literature. Unfortunately, the quantification of model-form uncertainties is rarely addressed with anything other than sensitivity studies, requiring that the computational results be intimately tied to and calibrated against existing test data. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Due to ground test facility limitations, this expanded role is believed to be a requirement by some in the test and evaluation community if scramjet engines are to be given serious consideration as a viable propulsion device. An effort has been initiated at the NASA Langley Research Center to validate several turbulence closure models used for Reynolds-averaged simulations of scramjet isolator flows. The turbulence models considered were the Menter BSL, Menter SST, Wilcox 1998, Wilcox 2006, and the Gatski-Speziale explicit algebraic Reynolds stress models. The simulations were carried out using the VULCAN computational fluid dynamics package developed at the NASA Langley Research Center. A procedure to quantify the numerical errors was developed to account for discretization errors in the validation process. This procedure utilized the grid convergence index defined by Roache as a bounding estimate for the numerical error. The validation data was collected from a mechanically back-pressured constant area (1 × 2 inch) isolator model with an isolator entrance Mach number of 2.5. As expected, the model-form uncertainty was substantial for the shock-dominated, massively separated flowfield within the isolator as evidenced by a 6 duct height variation in shock train length depending on the turbulence model employed. Generally speaking, the turbulence models that did not include an explicit stress limiter more closely matched the measured surface pressures. This observation is somewhat surprising, given that stress-limiting models have generally been developed to better predict shock-separated flows. All of the models considered also failed to properly predict the shape and extent of the separated flow region caused by the shock boundary layer interactions. However, the best performing models were able to predict the isolator shock train length (an important metric for isolator operability margin) to within 1 isolator duct height.
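
    The grid convergence index used for the numerical-error bound follows the usual three-grid recipe; the solution values and refinement ratio below are hypothetical:

      import math

      f3, f2, f1 = 2.10, 2.04, 2.02   # coarse, medium, fine grid solutions
      r = 2.0                         # grid refinement ratio
      Fs = 1.25                       # safety factor for three-grid studies

      p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)  # observed order
      gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1.0)
      print(f"observed order p = {p:.2f}, GCI(fine) = {gci_fine:.2%}")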

  13. Adaptive integral robust control and application to electromechanical servo systems.

    PubMed

    Deng, Wenxiang; Yao, Jianyong

    2017-03-01

    This paper proposes a continuous adaptive integral robust control with robust integral of the sign of the error (RISE) feedback for a class of uncertain nonlinear systems, in which the RISE feedback gain is adapted online to ensure robustness against disturbances without prior knowledge of a bound on the additive disturbances. In addition, an adaptive compensation term integrated with the proposed adaptive RISE feedback is also constructed to further reduce design conservatism when parametric uncertainties also exist in the system. Lyapunov analysis reveals that the proposed controllers guarantee that the tracking errors asymptotically converge to zero with continuous control efforts. To illustrate the high-performance nature of the developed controllers, numerical simulations are provided. Finally, an application case of an actual motor-driven electromechanical servo system is also studied, with some specific design considerations, and comparative experimental results verify the effectiveness of the proposed controllers. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
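
    A rough simulation sketch of a non-adaptive RISE law on a double integrator with a smooth unknown disturbance; the gains, trajectory, and disturbance are illustrative, and the paper's online gain adaptation and adaptive compensation terms are omitted:

      import numpy as np

      dt, T = 1e-3, 10.0
      t = np.arange(0.0, T, dt)
      alpha, ks, beta = 2.0, 5.0, 3.0
      x1, x2, nu = 0.5, 0.0, 0.0          # plant states and RISE integral term
      e2_0 = None
      for k, tk in enumerate(t):
          e1 = np.sin(tk) - x1                          # tracking error
          e2 = (np.cos(tk) - x2) + alpha * e1           # filtered error
          if e2_0 is None:
              e2_0 = e2
          # RISE integral: integral of (ks+1)*alpha*e2 + beta*sgn(e2).
          nu += dt * ((ks + 1.0) * alpha * e2 + beta * np.sign(e2))
          u = -np.sin(tk) + (ks + 1.0) * (e2 - e2_0) + nu   # feedforward + RISE
          d = 0.8 * np.sin(2.0 * tk) + 0.5                  # unknown disturbance
          x2 += dt * (u + d)                                # double integrator
          x1 += dt * x2
      print(f"final tracking error |e1| = {abs(e1):.2e}")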

  14. Broken SU(3) antidecuplet for Θ+ and Ξ3/2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pakvasa, Sandip; Suzuki, Mahiko

    2004-05-05

    If the narrow exotic baryon resonances Θ+(1540) and Ξ3/2 are members of the J^P = 1/2+ antidecuplet with N*(1710), the octet-antidecuplet mixing is required not only by the mass spectrum but also by the decay pattern of N*(1710). This casts doubt on the validity of the Θ+ mass prediction by the chiral soliton model. While all pieces of the existing experimental information point to a small octet-antidecuplet mixing, the magnitude of mixing required by the mass spectrum is not consistent with the value needed to account for the hadronic decay rates. The discrepancy is not resolved even after the large experimental uncertainty is taken into consideration. We fail to find an alternative SU(3) assignment even with a different spin-parity assignment. When we extend the analysis to mixing with a higher SU(3) multiplet, we find one experimentally testable scenario in the case of mixing with a 27-plet.

  15. Three-dimensional phonon population anisotropy in silicon nanomembranes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McElhinny, Kyle M.; Gopalakrishnan, Gokul; Holt, Martin V.

    Nanoscale single crystals possess modified phonon dispersions due to the truncation of the crystal. The introduction of surfaces alters the population of phonons relative to the bulk and introduces anisotropy arising from the breaking of translational symmetry. Such modifications exist throughout the Brillouin zone, even in structures with dimensions of several nanometers, posing a challenge to the characterization of vibrational properties and leading to uncertainty in predicting the thermal, optical, and electronic properties of nanomaterials. Synchrotron x-ray thermal diffuse scattering studies find that freestanding Si nanomembranes with thicknesses as large as 21 nm exhibit a higher scattering intensity per unit thickness than bulk silicon. In addition, the anisotropy arising from the finite thickness of these membranes produces particularly intense scattering along reciprocal-space directions normal to the membrane surface compared to corresponding in-plane directions. These results reveal the dimensions at which calculations of materials properties and device characteristics based on bulk phonon dispersions require consideration of the nanoscale size of the crystal.

  16. The new US health care plan of 1993 and its terminology.

    PubMed

    Wilson, C N

    1993-10-01

    The Clinton Administration is moving toward a fundamental change in the United States health care system. President Bill Clinton made health care reform one of his top campaign priorities and promised to introduce a reform proposal for consideration in 1993. To that end, he asked his wife, Hillary Clinton, to lead the newly established Health Care Reform Task Force as it develops alternatives and the reform plan promised for introduction in 1993. While some uncertainty exists regarding how President Clinton will differ from candidate Clinton, current indications are that the health care reform will fall somewhere between the 'incremental reform' favoured by the opposition Republican Party and the 'Single Payor/Canadian Style' approach, under which the federal government would take over most responsibilities now carried out by private health insurance companies. The purpose of this paper is to present an overview of the health reform plan and the specialised terminology that is growing from the health reform initiatives.

  17. Exploring the Climate Change, Migration and Conflict Nexus.

    PubMed

    Burrows, Kate; Kinney, Patrick L

    2016-04-22

    The potential link between climate change, migration, and conflict has been widely discussed and is increasingly viewed by policy makers as a security issue. However, considerable uncertainty remains regarding the role that climate variability and change play among the many drivers of migration and conflict. The overall objective of this paper is to explore the potential pathways linking climate change, migration and increased risk of conflict. We review the existing literature surrounding this issue and break the problem into two components: the links between climate change and migration, and those between migration and conflict. We found a large range of views regarding the importance of climate change as a driver for increasing rates of migration and subsequently of conflict. We argue that future research should focus not only on the climate-migration-conflict pathway but also work to understand the other pathways by which climate variability and change might exacerbate conflict. We conclude by proposing five questions to help guide future research on the link between climate change, migration, and conflict.

  18. Coma and vegetative states: state of the art and proposal of a novel approach combining existing coma scales.

    PubMed

    Bonsignore, Luca Tommaso; Macrì, Simone; Orsi, Paolo; Chiarotti, Flavia; Alleva, Enrico

    2014-01-01

    Brain damage of various aetiologies can lead to different disorders of consciousness (DOC), varying from coma to vegetative to minimally conscious states. Each state is characterised by a different degree of wakefulness, awareness and pain sensitivity, and is handled differently with respect to treatment, ethical considerations and end-of-life decisions. Thus, its correct identification is crucial when devising or modulating appropriate treatment strategies. In practice, the main coma scales cannot always accurately determine the state of consciousness of an individual, while other tools (e.g. imaging techniques) carry a certain degree of uncertainty. A complementary approach is 24-hour observation of patients, for a sufficient period of days, using an ad hoc behavioural scale, further correlated with physiological and pharmacological parameters measured on the patients. The method described herein might help in recognising the presence of consciousness in the different DOC patients, and thus in discerning a vegetative from a minimally conscious state.

  19. Exploring the Climate Change, Migration and Conflict Nexus

    PubMed Central

    Burrows, Kate; Kinney, Patrick L.

    2016-01-01

    The potential link between climate change, migration, and conflict has been widely discussed and is increasingly viewed by policy makers as a security issue. However, considerable uncertainty remains regarding the role that climate variability and change play among the many drivers of migration and conflict. The overall objective of this paper is to explore the potential pathways linking climate change, migration and increased risk of conflict. We review the existing literature surrounding this issue and break the problem into two components: the links between climate change and migration, and those between migration and conflict. We found a large range of views regarding the importance of climate change as a driver for increasing rates of migration and subsequently of conflict. We argue that future research should focus not only on the climate-migration-conflict pathway but also work to understand the other pathways by which climate variability and change might exacerbate conflict. We conclude by proposing five questions to help guide future research on the link between climate change, migration, and conflict. PMID:27110806

  20. Supply network configuration—A benchmarking problem

    NASA Astrophysics Data System (ADS)

    Brandenburg, Marcus

    2018-03-01

Managing supply networks is a highly relevant task that strongly influences the competitiveness of firms from various industries. Designing supply networks is a strategic process that considerably affects the structure of the whole network. In contrast, supply networks for new products are configured without major adaptations of the existing structure, but the network has to be configured before the new product is actually launched in the marketplace. Due to dynamics and uncertainties, the resulting planning problem is highly complex. However, formal models and solution approaches that support supply network configuration decisions for new products are scant. The paper at hand aims at stimulating related model-based research. To support the formulation of mathematical models and solution procedures, a benchmarking problem is introduced, derived from a case study of a cosmetics manufacturer. Tasks, objectives, and constraints of the problem are described in great detail, and numerical values and ranges of all problem parameters are given. In addition, several directions for future research are suggested.

  1. Cirrus clouds and climate feedback: Is the sky falling and should we go tell the king

    NASA Technical Reports Server (NTRS)

    Stephens, Graeme L.

    1990-01-01

It is widely believed that thin cirrus clouds act to enhance the greenhouse effect owing to a particular combination of their optical properties. It is demonstrated that this view may rest on an inadequate resolution of the physics of cirrus clouds, and that the likely impact of cirrus clouds on climate change remains somewhat elusive. These conclusions are developed within the context of a specific feedback mechanism incorporated into a simple mechanistic climate model. A specific scientific question addressed is whether or not the observed relationship between the ice water content and temperature of cirrus provides any significant feedback to the CO2 greenhouse warming. A related question also examined concerns the specific role of cloud microphysics and radiation in this feedback. This raises several pertinent issues about the understanding of cirrus clouds and their likely role in climate change, as there presently exists considerable uncertainty about the microphysics of these clouds (size and shape of ice crystals) and their radiative influences.

  2. Interpolation Method Needed for Numerical Uncertainty Analysis of Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Groves, Curtis; Ilie, Marcel; Schallhorn, Paul

    2014-01-01

Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. The errors in CFD can be approximated via Richardson's extrapolation, a method based on progressive grid refinement. To estimate the errors on an unstructured grid, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson's extrapolation or another uncertainty method to approximate errors.
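
    To make the grid-refinement idea concrete, the following minimal Python sketch applies the classic three-grid form of Richardson's extrapolation for a constant refinement ratio; the function name, the safety factor, and the example values are illustrative, and the unstructured-grid interpolation step that this paper investigates is not shown.

```python
import math

def richardson_extrapolate(f1, f2, f3, r=2.0, fs=1.25):
    """Estimate discretization error from three systematically refined
    grid solutions (f1 finest ... f3 coarsest) with a constant
    refinement ratio r, following the classic Richardson procedure.

    Returns the observed order of convergence, the extrapolated
    (grid-independent) estimate, and a grid convergence index (GCI)
    for the finest grid using safety factor fs.
    """
    # Observed order of convergence from the three solutions.
    p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)
    # Richardson-extrapolated estimate of the exact solution.
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)
    # Relative error on the finest grid and its GCI.
    e1 = abs((f1 - f2) / f1)
    gci = fs * e1 / (r**p - 1.0)
    return p, f_exact, gci

# Example: a drag coefficient computed on coarse, medium, fine grids.
p, f_ex, gci = richardson_extrapolate(0.982, 0.970, 0.940)
print(f"order={p:.2f}, extrapolated={f_ex:.4f}, GCI={gci:.3%}")
```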

  3. International application of sugar-sweetened beverage (SSB) taxation in obesity reduction: factors that may influence policy effectiveness in country-specific contexts.

    PubMed

    Jou, Judy; Techakehakij, Win

    2012-09-01

Sugar-sweetened beverage (SSB) taxation is attracting increasing interest as a policy aimed at addressing the rising prevalence of obesity in many countries. Preliminary evidence indicates its potential not only to reduce obesity prevalence, but also to generate public revenue. However, differences in country-specific contexts create uncertainties in its possible outcomes. This paper urges careful consideration of country-specific characteristics by suggesting three points in particular that may influence the effectiveness of a volume-based soft drink excise tax: population obesity prevalence, soft drink consumption levels, and existing baseline tax rates. Data from 19 countries are compared with regard to each point. The authors suggest that SSB or soft drink taxation policy may be more effective in reducing obesity prevalence where existing obesity prevalence and soft drink consumption levels are high. Conversely, in countries where the baseline tax rate is already considered high, SSB taxation may not have a noticeable impact on consumption patterns or obesity prevalence, and may incur negative feedback from the beverage industry or the general public. Thorough evaluation of these points is recommended prior to adopting SSB or soft drink taxation as an obesity reduction measure in any given country.

  4. On the Relationship between Observed NLDN Lightning ...

    EPA Pesticide Factsheets

Lightning-produced nitrogen oxides (NOX = NO + NO2) in the middle and upper troposphere play an essential role in the production of ozone (O3) and influence the oxidizing capacity of the troposphere. Despite much effort in both observing and modeling lightning NOX during the past decade, considerable uncertainties still exist in the quantification of lightning NOX production and distribution in the troposphere. It is even more challenging for regional chemistry and transport models to accurately parameterize lightning NOX production and distribution in time and space. The Community Multiscale Air Quality Model (CMAQ) parameterizes lightning NO emissions using local scaling factors adjusted by the convective precipitation rate predicted by the upstream meteorological model; the adjustment is based on observed lightning strikes from the National Lightning Detection Network (NLDN). For this parameterization to be valid, a reasonable a priori relationship between the observed lightning strikes and the modeled convective precipitation rates must exist. In this study, we present an analysis leveraging the observed NLDN lightning strikes and CMAQ model simulations over the continental United States for a time period spanning over a decade. Based on this analysis, a new parameterization scheme for lightning NOX will be proposed and the results will be evaluated. The proposed scheme will be beneficial to modeling exercises where the obs

  5. Bulk electric system reliability evaluation incorporating wind power and demand side management

    NASA Astrophysics Data System (ADS)

    Huang, Dange

Electric power systems are experiencing dramatic changes with respect to structure, operation and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design and operation, particularly in the new competitive environment. A wide range of methods have been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. With the growth of computing power, it has become a practical and viable tool for large system reliability assessment and is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research incorporates load forecast uncertainty in bulk electric system reliability assessment, and the effects on system, load point and well-being indices and on reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. The effects of wind power additions in generating and bulk electric system reliability assessment, considering site wind speed correlations, and the interactive effects of wind power and load forecast uncertainty on system reliability are examined. The concept of the security cost associated with operating in the marginal state in the well-being framework is incorporated in the economic analyses associated with system expansion planning, including wind power and load forecast uncertainty. Overall reliability cost/worth analyses including security cost concepts are applied to select an optimal wind power injection strategy in a bulk electric system. The effects of the various demand side management measures on system reliability are illustrated using the system, load point, and well-being indices, and the reliability index probability distributions. The reliability effects of demand side management procedures in a bulk electric system including wind power and load forecast uncertainty considerations are also investigated. The system reliability effects due to specific demand side management programs are quantified and examined in terms of their reliability benefits.

  6. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
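
    As a rough illustration of the nonintrusive, black-box approach described above, the sketch below nests aleatoric sampling inside an epistemic outer loop; the `solver` function is a cheap stand-in surrogate for the flow solver, and all parameter names, distributions, and intervals are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def solver(mach_in, wall_temp, model_const):
    # Stand-in for the CFD "black box": a scalar pressure-rise metric.
    return 2.5 * mach_in - 1e-3 * wall_temp + 0.8 * model_const

curves = []
for _ in range(20):                          # epistemic outer loop
    model_const = rng.uniform(0.7, 1.3)      # turbulence model-form interval
    qois = [solver(rng.normal(2.0, 0.05),    # aleatoric inflow Mach number
                   rng.normal(600.0, 20.0),  # aleatoric wall temperature
                   model_const)
            for _ in range(200)]             # aleatoric inner loop
    curves.append(np.sort(qois))             # one empirical CDF per outer draw

# The spread across outer-loop CDFs forms a "p-box" separating
# epistemic from aleatoric contributions to the output uncertainty.
band = np.array(curves)
print("p-box width at the median:", band[:, 100].max() - band[:, 100].min())
```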

  7. Bias Characterization in Probabilistic Genotype Data and Improved Signal Detection with Multiple Imputation

    PubMed Central

    Palmer, Cameron; Pe’er, Itsik

    2016-01-01

    Missing data are an unavoidable component of modern statistical genetics. Different array or sequencing technologies cover different single nucleotide polymorphisms (SNPs), leading to a complicated mosaic pattern of missingness where both individual genotypes and entire SNPs are sporadically absent. Such missing data patterns cannot be ignored without introducing bias, yet cannot be inferred exclusively from nonmissing data. In genome-wide association studies, the accepted solution to missingness is to impute missing data using external reference haplotypes. The resulting probabilistic genotypes may be analyzed in the place of genotype calls. A general-purpose paradigm, called Multiple Imputation (MI), is known to model uncertainty in many contexts, yet it is not widely used in association studies. Here, we undertake a systematic evaluation of existing imputed data analysis methods and MI. We characterize biases related to uncertainty in association studies, and find that bias is introduced both at the imputation level, when imputation algorithms generate inconsistent genotype probabilities, and at the association level, when analysis methods inadequately model genotype uncertainty. We find that MI performs at least as well as existing methods or in some cases much better, and provides a straightforward paradigm for adapting existing genotype association methods to uncertain data. PMID:27310603
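
    For readers unfamiliar with MI, the pooling step it relies on is conventionally done with Rubin's rules. The following sketch shows that standard procedure; it is not the authors' code, and the example numbers are invented.

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Combine point estimates and within-imputation variances from m
    completed-data analyses using Rubin's rules, the standard MI
    pooling procedure.

    estimates, variances: length-m sequences (one entry per imputation).
    Returns the pooled estimate, its total standard error, and the
    (within, between) variance decomposition.
    """
    q = np.asarray(estimates, dtype=float)
    u = np.asarray(variances, dtype=float)
    m = len(q)
    q_bar = q.mean()                      # pooled point estimate
    u_bar = u.mean()                      # within-imputation variance
    b = q.var(ddof=1)                     # between-imputation variance
    t = u_bar + (1.0 + 1.0 / m) * b       # total variance
    return q_bar, np.sqrt(t), (u_bar, b)

# Example: log-odds ratios from m = 5 imputed genotype datasets.
est, se, (w, b) = rubins_rules([0.32, 0.29, 0.35, 0.31, 0.30],
                               [0.010, 0.011, 0.009, 0.010, 0.012])
print(f"pooled estimate {est:.3f} +/- {se:.3f} (within {w:.4f}, between {b:.5f})")
```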

  8. Optimization of vibratory energy harvesters with stochastic parametric uncertainty: a new perspective

    NASA Astrophysics Data System (ADS)

    Haji Hosseinloo, Ashkan; Turitsyn, Konstantin

    2016-04-01

Vibration energy harvesting has been shown to be a promising power source for many small-scale applications, mainly because of the considerable reduction in the energy consumption of electronics and the scalability issues of conventional batteries. However, energy harvesters may not be as robust as conventional batteries, and their performance can drastically deteriorate in the presence of uncertainty in their parameters. Hence, the study of uncertainty propagation and optimization under uncertainty is essential for proper and robust performance of harvesters in practice. While previous studies have focused on optimizing the expected power, we propose a new and more practical optimization perspective: optimization for the worst-case (minimum) power. We formulate the problem in a generic fashion and, as a simple example, apply it to a linear piezoelectric energy harvester. We study the effect of parametric uncertainty in its natural frequency, load resistance, and electromechanical coupling coefficient on its worst-case power, and then optimize for it under different confidence levels. The results show a significant improvement in the worst-case power of the harvester designed in this way compared to that of a naively (deterministically) optimized harvester.

  9. Estimating the spatial distribution of wintering little brown bat populations in the eastern United States

    USGS Publications Warehouse

    Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Jennifer A. Szymanski,

    2014-01-01

Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species' distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of wintering populations of the little brown bat (Myotis lucifugus) in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We then assessed the variability in our results arising from these uncertainties. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models that accurately depict the effects of this uncertainty are useful for making management decisions, as such models provide a coherent organization of the best available information.

  10. The effects of geometric uncertainties on computational modelling of knee biomechanics

    PubMed Central

    Fisher, John; Wilcox, Ruth

    2017-01-01

The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of the cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and by varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models. PMID:28879008

  11. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; however, epistemic location uncertainty has so far not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
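
    A minimal sketch of the basic idea, assuming (as the abstract suggests) that risk items with unknown coordinates are assigned sampled locations and the portfolio loss is re-evaluated per sample; the hazard field, the damage model, and all parameter values below are toy placeholders, not the authors' framework.

```python
import numpy as np

rng = np.random.default_rng(42)
n_items, n_samples = 1000, 500
values = rng.lognormal(mean=12.0, sigma=1.0, size=n_items)  # insured values

def hazard_intensity(xy):
    # Toy ground-motion field decaying with distance from a source
    # at the origin (placeholder for a real hazard model).
    return np.exp(-np.linalg.norm(xy, axis=-1) / 50.0)

losses = np.empty(n_samples)
for i in range(n_samples):
    # Unknown coordinates: resample each item's location within its
    # admissible region (here a simple square) on every iteration.
    xy = rng.uniform(-100.0, 100.0, size=(n_items, 2))
    mdr = 0.5 * hazard_intensity(xy)          # toy mean damage ratio
    losses[i] = (values * mdr).sum()          # portfolio loss per sample

print(f"loss CV from location uncertainty: {losses.std() / losses.mean():.2%}")
```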

  12. An approximately Bayesian delta-rule model explains the dynamics of belief updating in a changing environment.

    PubMed

    Nassar, Matthew R; Wilson, Robert C; Heasly, Benjamin; Gold, Joshua I

    2010-09-15

    Maintaining appropriate beliefs about variables needed for effective decision making can be difficult in a dynamic environment. One key issue is the amount of influence that unexpected outcomes should have on existing beliefs. In general, outcomes that are unexpected because of a fundamental change in the environment should carry more influence than outcomes that are unexpected because of persistent environmental stochasticity. Here we use a novel task to characterize how well human subjects follow these principles under a range of conditions. We show that the influence of an outcome depends on both the error made in predicting that outcome and the number of similar outcomes experienced previously. We also show that the exact nature of these tendencies varies considerably across subjects. Finally, we show that these patterns of behavior are consistent with a computationally simple reduction of an ideal-observer model. The model adjusts the influence of newly experienced outcomes according to ongoing estimates of uncertainty and the probability of a fundamental change in the process by which outcomes are generated. A prior that quantifies the expected frequency of such environmental changes accounts for individual variability, including a positive relationship between subjective certainty and the degree to which new information influences existing beliefs. The results suggest that the brain adaptively regulates the influence of decision outcomes on existing beliefs using straightforward updating rules that take into account both recent outcomes and prior expectations about higher-order environmental structure.
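
    The model family described here is often written as a delta rule whose learning rate grows with the inferred probability of a change point and with uncertainty about the current estimate. The sketch below illustrates that structure under simple Gaussian and uniform assumptions; it is an illustrative reconstruction, not the authors' exact formulation.

```python
import numpy as np
from scipy.stats import norm

def delta_rule_update(belief, outcome, cpp, rel_uncertainty):
    """One step of an approximately Bayesian delta rule: new
    information is weighted by a learning rate that grows with the
    probability that the generative process changed (cpp) and with
    relative uncertainty about the current estimate."""
    alpha = cpp + (1.0 - cpp) * rel_uncertainty   # adaptive learning rate
    return belief + alpha * (outcome - belief)    # delta-rule update

def changepoint_probability(outcome, belief, sigma, hazard, lo, hi):
    """Posterior probability that the outcome reflects a change point,
    assuming Gaussian noise sigma, a hazard rate for changes, and a
    uniform outcome range [lo, hi] after a change (illustrative)."""
    p_change = hazard / (hi - lo)
    p_stay = (1.0 - hazard) * norm.pdf(outcome, belief, sigma)
    return p_change / (p_change + p_stay)

# Example: a very surprising outcome drives a near-complete belief reset.
cpp = changepoint_probability(140.0, belief=50.0, sigma=10.0,
                              hazard=0.1, lo=0.0, hi=300.0)
print(delta_rule_update(50.0, 140.0, cpp, rel_uncertainty=0.2))
```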

  13. Is Recent Warming Unprecedented in the Common Era? Insights from PAGES2k data and the Last Millennium Reanalysis

    NASA Astrophysics Data System (ADS)

    Erb, M. P.; Emile-Geay, J.; McKay, N.; Hakim, G. J.; Steig, E. J.; Anchukaitis, K. J.

    2017-12-01

    Paleoclimate observations provide a critical context for 20th century warming by putting recent climate change into a longer-term perspective. Previous work (e.g. IPCC AR3-5) has claimed that recent decades are exceptional in the context of past centuries, though these statements are usually accompanied by large uncertainties and little spatial detail. Here we leverage a recent multiproxy compilation (PAGES2k Consortium, 2017) to revisit this long-standing question. We do so via two complementary approaches. The first approach compares multi-decadal averages and trends in PAGES2k proxy records, which include trees, corals, ice cores, and more. Numerous proxy records reveal that late 20th century values are extreme compared to the remainder of the recorded period, although considerable variability exists in the signals preserved in individual records. The second approach uses the same PAGES2k data blended with climate model output to produce an optimal analysis: the Last Millennium Reanalysis (LMR; Hakim et al., 2016). Unlike proxy data, LMR is spatially-complete and explicitly models uncertainty in proxy records, resulting in objective error estimates. The LMR results show that for nearly every region of the world, late 20th century temperatures exceed temperatures in previous multi-decadal periods during the Common Era, and 20th century warming rates exceed rates in previous centuries. An uncertainty with the present analyses concerns the interpretation of proxy records. PAGES2k included only records that are primarily sensitive to temperature, but many proxies may be influenced by secondary non-temperature effects. Additionally, the issue of seasonality is important as, for example, many temperature-sensitive tree ring chronologies in the Northern Hemisphere respond to summer or growing season temperature rather than annual-means. These uncertainties will be further explored. References Hakim, G. J., et al., 2016: The last millennium climate reanalysis project: Framework and first results. Journal of Geophysical Research: Atmospheres, 121(12), 6745-6764. http://doi.org/10.1002/2016JD024751 PAGES2k Consortium, 2017: A global multiproxy database for temperature reconstructions of the Common Era. Scientific Data, 1-33. http://doi.org/10.1038/sdata.2017.88

  14. An Extension to Deng's Entropy in the Open World Assumption with an Application in Sensor Data Fusion.

    PubMed

    Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S

    2018-06-11

Quantifying the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and remains almost unexplored under the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper extends a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension to Deng's entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng's entropy; it degenerates to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues remain for future work: the necessary properties of a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame is for sensor data fusion under uncertainty.
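
    For reference, the closed-world Deng entropy that EDEOW generalizes can be computed as below; the open-world extension with a nonzero mass on the empty set follows the paper and is not reproduced here. This is a sketch under the standard definition of Deng entropy.

```python
import math

def deng_entropy(mass):
    """Deng entropy of a Dempster-Shafer mass function in the closed
    world. mass maps each focal element (a frozenset of hypotheses)
    to its belief mass. The paper's EDEOW measure generalizes this to
    handle a nonzero mass on the empty set; the closed-world form
    below is the baseline it degenerates to.
    """
    e = 0.0
    for focal, m in mass.items():
        if m > 0:
            # A focal element A of cardinality |A| hides 2^|A| - 1
            # nonempty subsets, which is what scales the entropy term.
            e -= m * math.log2(m / (2 ** len(focal) - 1))
    return e

# Example: mass split between a singleton and a two-element set.
m = {frozenset({"a"}): 0.6, frozenset({"b", "c"}): 0.4}
print(f"Deng entropy = {deng_entropy(m):.4f} bits")
```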

  15. Human pursuance of equality hinges on mental processes of projecting oneself into the perspectives of others and into future situations.

    PubMed

    Takesue, Hirofumi; Miyauchi, Carlos Makoto; Sakaiya, Shiro; Fan, Hongwei; Matsuda, Tetsuya; Kato, Junko

    2017-07-19

    In the pursuance of equality, behavioural scientists disagree about distinct motivators, that is, consideration of others and prospective calculation for oneself. However, accumulating data suggest that these motivators may share a common process in the brain whereby perspectives and events that did not arise in the immediate environment are conceived. To examine this, we devised a game imitating a real decision-making situation regarding redistribution among income classes in a welfare state. The neural correlates of redistributive decisions were examined under contrasting conditions, with and without uncertainty, which affects support for equality in society. The dorsal anterior cingulate cortex (dACC) and the caudate nucleus were activated by equality decisions with uncertainty but by selfless decisions without uncertainty. Activation was also correlated with subjective values. Activation in both the dACC and the caudate nucleus was associated with the attitude to prefer accordance with others, whereas activation in the caudate nucleus reflected that the expected reward involved the prospective calculation of relative income. The neural correlates suggest that consideration of others and prospective calculation for oneself may underlie the support for equality. Projecting oneself into the perspective of others and into prospective future situations may underpin the pursuance of equality.

  16. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2015-04-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
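
    As a toy version of one numerical recipe discussed here, the sketch below estimates a linear trend and its uncertainty by combining residual resampling with the reported per-measurement uncertainties; it deliberately omits autocorrelation handling and seasonal basis functions, and all names and values are illustrative.

```python
import numpy as np

def trend_with_bootstrap(t, y, y_sigma, n_boot=2000, seed=0):
    """Least-squares linear trend with an uncertainty estimate that
    combines (i) bootstrap resampling of residuals and (ii) reported
    per-measurement uncertainties y_sigma. A simplified sketch: no
    autocorrelation or seasonal basis functions are modelled.
    """
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(t), t])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        # Perturb the fit by resampled residuals plus measurement noise.
        y_b = (X @ beta
               + rng.choice(resid, size=len(y), replace=True)
               + rng.normal(0.0, y_sigma))
        slopes[i] = np.linalg.lstsq(X, y_b, rcond=None)[0][1]
    return beta[1], slopes.std()

# Example: a synthetic 20-year monthly series with a 0.02 K/yr trend.
t = np.arange(240) / 12.0
y = 0.02 * t + np.random.default_rng(1).normal(0.0, 0.3, t.size)
sig = np.full(t.size, 0.15)   # reported measurement uncertainties
print(trend_with_bootstrap(t, y, sig))
```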

  17. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2014-12-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).

  18. Uncertainty during breast diagnostic evaluation: state of the science.

    PubMed

    Montgomery, Mariann

    2010-01-01

To present the state of the science on uncertainty in relationship to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety that persists throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship between inner strength and uncertainty. Nurses can be invaluable in assisting women to cope with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on the uncertainty experienced by women undergoing a breast diagnostic evaluation.

  19. Land Resources Allocation Strategies in an Urban Area Involving Uncertainty: A Case Study of Suzhou, in the Yangtze River Delta of China

    NASA Astrophysics Data System (ADS)

    Lu, Shasha; Guan, Xingliang; Zhou, Min; Wang, Yang

    2014-05-01

    A large number of mathematical models have been developed to support land resource allocation decisions and land management needs; however, few of them can address various uncertainties that exist in relation to many factors presented in such decisions (e.g., land resource availabilities, land demands, land-use patterns, and social demands, as well as ecological requirements). In this study, a multi-objective interval-stochastic land resource allocation model (MOISLAM) was developed for tackling uncertainty that presents as discrete intervals and/or probability distributions. The developed model improves upon the existing multi-objective programming and inexact optimization approaches. The MOISLAM not only considers economic factors, but also involves food security and eco-environmental constraints; it can, therefore, effectively reflect various interrelations among different aspects in a land resource management system. Moreover, the model can also help examine the reliability of satisfying (or the risk of violating) system constraints under uncertainty. In this study, the MOISLAM was applied to a real case of long-term urban land resource allocation planning in Suzhou, in the Yangtze River Delta of China. Interval solutions associated with different risk levels of constraint violation were obtained. The results are considered useful for generating a range of decision alternatives under various system conditions, and thus helping decision makers to identify a desirable land resource allocation strategy under uncertainty.

  20. Adaptive Control for Microgravity Vibration Isolation System

    NASA Technical Reports Server (NTRS)

    Yang, Bong-Jun; Calise, Anthony J.; Craig, James I.; Whorton, Mark S.

    2005-01-01

Most active vibration isolation systems that try to provide a quiescent acceleration environment for space science experiments have utilized linear design methods. In this paper, we address adaptive control augmentation of an existing classical controller that employs high-gain acceleration feedback together with low-gain position feedback to center the isolated platform. The control design must contend with parametric and dynamic uncertainties, because the hardware of the isolation system is built as a payload-level isolator and the acceleration sensor exhibits a significant bias. A neural network is incorporated to adaptively compensate for the system uncertainties, and a high-pass filter is introduced to mitigate the effect of the measurement bias. Simulations show that the adaptive control improves the performance of the existing acceleration controller and keeps the deviation of the isolated platform at the level achieved by the existing control system.

  1. Data-Constrained Projections of Methane Fluxes in a Northern Minnesota Peatland in Response to Elevated CO2 and Warming

    Treesearch

    Shuang Ma; Jiang Jiang; Yuanyuan Huang; Zheng Shi; Rachel M. Wilson; Daniel Ricciuto; Stephen D. Sebestyen; Paul J. Hanson; Yiqi Luo

    2017-01-01

    Large uncertainties exist in predicting responses of wetland methane (CH4) fluxes to future climate change. However, sources of the uncertainty have not been clearly identified despite the fact that methane production and emission processes have been extensively explored. In this study, we took advantage of manual CH4 flux...

  2. Reader Reaction On the generalized Kruskal-Wallis test for genetic association studies incorporating group uncertainty

    PubMed Central

    Wu, Baolin; Guan, Weihua

    2015-01-01

Acar and Sun (2013, Biometrics, 69, 427-435) presented a generalized Kruskal-Wallis (GKW) test for genetic association studies that incorporated the genotype uncertainty and showed its robust and competitive performance compared to existing methods. We present another interesting way to derive the GKW test via a rank linear model. PMID:25351417

  3. Reader reaction on the generalized Kruskal-Wallis test for genetic association studies incorporating group uncertainty.

    PubMed

    Wu, Baolin; Guan, Weihua

    2015-06-01

Acar and Sun (2013, Biometrics 69, 427-435) presented a generalized Kruskal-Wallis (GKW) test for genetic association studies that incorporated the genotype uncertainty and showed its robust and competitive performance compared to existing methods. We present another interesting way to derive the GKW test via a rank linear model.

  4. Evaluation of assigned-value uncertainty for complex calibrator value assignment processes: a prealbumin example.

    PubMed

    Middleton, John; Vaks, Jeffrey E

    2007-04-01

Errors in calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed the process to be optimized while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows calibrator uncertainty to be estimated and various value-assignment processes to be optimized, with a reduced number of measurements and lower reagent costs, while satisfying the uncertainty requirements. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
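
    The following Python sketch illustrates the general Monte Carlo idea, propagating a reference-material uncertainty through a single value-transfer step; the structure, parameter names, and numbers are illustrative assumptions, not the paper's full multilevel process.

```python
import numpy as np

def assigned_value_uncertainty(ref_value, ref_rel_u, transfer_cv,
                               n_replicates, n_sims=100_000, seed=0):
    """Monte Carlo sketch of a calibrator value-transfer step: a
    reference material with relative standard uncertainty ref_rel_u
    is measured n_replicates times with measurement CV transfer_cv,
    and the replicate mean is transferred to the new calibrator.
    Returns the relative standard uncertainty of the assigned value.
    All parameter names and values are illustrative.
    """
    rng = np.random.default_rng(seed)
    # Draw a "true" reference value per simulation, then replicate
    # measurements of it; the assigned value is the replicate mean.
    true_ref = rng.normal(ref_value, ref_value * ref_rel_u, n_sims)
    meas = rng.normal(true_ref[:, None],
                      true_ref[:, None] * transfer_cv,
                      (n_sims, n_replicates))
    assigned = meas.mean(axis=1)
    return assigned.std() / ref_value

# With a 3.7% reference uncertainty and a 2% CV averaged over six
# replicates, the transfer step adds only a small extra component,
# echoing (but not reproducing) the paper's qualitative finding.
print(f"{assigned_value_uncertainty(100.0, 0.037, 0.02, 6):.3%}")
```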

  5. Solar ultraviolet radiation and ozone depletion-driven climate change: effects on terrestrial ecosystems.

    PubMed

    Bornman, J F; Barnes, P W; Robinson, S A; Ballaré, C L; Flint, S D; Caldwell, M M

    2015-01-01

    In this assessment we summarise advances in our knowledge of how UV-B radiation (280-315 nm), together with other climate change factors, influence terrestrial organisms and ecosystems. We identify key uncertainties and knowledge gaps that limit our ability to fully evaluate the interactive effects of ozone depletion and climate change on these systems. We also evaluate the biological consequences of the way in which stratospheric ozone depletion has contributed to climate change in the Southern Hemisphere. Since the last assessment, several new findings or insights have emerged or been strengthened. These include: (1) the increasing recognition that UV-B radiation has specific regulatory roles in plant growth and development that in turn can have beneficial consequences for plant productivity via effects on plant hardiness, enhanced plant resistance to herbivores and pathogens, and improved quality of agricultural products with subsequent implications for food security; (2) UV-B radiation together with UV-A (315-400 nm) and visible (400-700 nm) radiation are significant drivers of decomposition of plant litter in globally important arid and semi-arid ecosystems, such as grasslands and deserts. This occurs through the process of photodegradation, which has implications for nutrient cycling and carbon storage, although considerable uncertainty exists in quantifying its regional and global biogeochemical significance; (3) UV radiation can contribute to climate change via its stimulation of volatile organic compounds from plants, plant litter and soils, although the magnitude, rates and spatial patterns of these emissions remain highly uncertain at present. UV-induced release of carbon from plant litter and soils may also contribute to global warming; and (4) depletion of ozone in the Southern Hemisphere modifies climate directly via effects on seasonal weather patterns (precipitation and wind) and these in turn have been linked to changes in the growth of plants across the Southern Hemisphere. Such research has broadened our understanding of the linkages that exist between the effects of ozone depletion, UV-B radiation and climate change on terrestrial ecosystems.

  6. Verification of Internal Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated the existence of an agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements, and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data gave also an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, at least it could be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous phantoms, such as the MIRD phantom and its physical representation, Mr. ADAM. The results indicated that the Reciprocity Theorem is valid within an average range of uncertainty of 8%.

  7. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

Climate change impact and adaptation assessments have traditionally adopted a 'top-down' scenario-based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Given the trade-off between computational cost and the need to include a wide range of GCMs for a fuller characterisation of uncertainties, scenarios are better used for sensitivity testing and the appraisal of adaptation options. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to the uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both uncertainty within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted by fitting the Generalised Extreme Value distribution to ~20,000 annual maxima series using the method of probability weighted moments. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is a considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure (e.g., culverts, bridges, flood defences) whose designs are normally associated with such return periods. Sensitivity results show that the impact of climate change is not as great for flood peaks with higher return periods. The average width of the uncertainty range and the size of the range for each catchment reveal that the uncertainties in low-frequency events are greater than in high-frequency events. In addition, the uncertainty interval, estimated as the average width of the uncertainty range of flow for the five return periods, grows wider with a decrease in the runoff coefficient and wetness index of each catchment, both of which tend to increase the nonlinearity in the rainfall response. A key management question that emerges is the acceptability of residual risk where high exposure of vulnerable populations and/or critical infrastructure coincides with high costs of additional capacity in safety margins.
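
    As a compact illustration of the flood-frequency step, the sketch below fits a Generalised Extreme Value distribution to a synthetic annual-maxima series and applies a +20% allowance of the kind examined here; note that scipy fits by maximum likelihood rather than the probability weighted moments used in the study, and all values are synthetic.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 50-year annual-maxima flow record (arbitrary units).
rng = np.random.default_rng(7)
amax = genextreme.rvs(c=-0.1, loc=100.0, scale=25.0,
                      size=50, random_state=rng)

# Fit the GEV by maximum likelihood and read off design flows for
# selected return periods, then apply a +20% safety margin.
c, loc, scale = genextreme.fit(amax)
for rp in (10, 50, 100):
    q = genextreme.ppf(1.0 - 1.0 / rp, c, loc=loc, scale=scale)
    print(f"T={rp:>3} yr: design flow {q:7.1f}, with +20% margin {1.2 * q:7.1f}")
```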

  8. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: Application to a smoking cessation trial

    PubMed Central

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald

    2014-01-01

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315

  9. Continuum topology optimization considering uncertainties in load locations based on the cloud model

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wen, Guilin

    2018-06-01

Few researchers have paid attention to designing structures in consideration of uncertainties in the loading locations, which may significantly influence structural performance. In this work, cloud models are employed to depict the uncertainties in the loading locations. A robust algorithm is developed in the context of minimizing the expectation of the structural compliance while conforming to a material volume constraint. To guarantee optimal solutions, sufficient cloud drops are used, which in turn leads to low efficiency. An innovative strategy is then implemented to greatly improve the computational efficiency. A modified soft-kill bi-directional evolutionary structural optimization method using derived sensitivity numbers is used to generate the robust configurations. Several numerical examples are presented to demonstrate the effectiveness and efficiency of the proposed algorithm.
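
    The forward normal cloud generator, the standard mechanism by which a cloud model (Ex, En, He) produces 'cloud drops', can be sketched as follows; its use here to represent an uncertain load coordinate is illustrative, and the numerical values are assumptions.

```python
import numpy as np

def normal_cloud_drops(ex, en, he, n, seed=0):
    """Forward normal cloud generator: each drop is a sampled value x
    with a membership degree mu(x). Ex is the expectation, En the
    entropy, and He the hyper-entropy of the cloud model.
    """
    rng = np.random.default_rng(seed)
    # The entropy itself is uncertain: each drop gets its own En'.
    en_prime = rng.normal(en, he, n)
    # Guard against (rare) negative scale draws.
    x = rng.normal(ex, np.abs(en_prime))
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))
    return x, mu

# Example: a load nominally applied at coordinate 50 with entropy 2
# and hyper-entropy 0.2; the drops would feed the expectation of the
# structural compliance in a robust optimization loop.
xs, mus = normal_cloud_drops(ex=50.0, en=2.0, he=0.2, n=1000)
print(xs.mean(), xs.std())
```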

  10. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients.

  11. Traceable Dynamic Calibration of Force Transducers by Primary Means

    PubMed Central

    Vlajic, Nicholas; Chijioke, Ako

    2018-01-01

    We describe an apparatus for traceable, dynamic calibration of force transducers using harmonic excitation, and report calibration measurements of force transducers using this apparatus. In this system, the force applied to the transducer is produced by the acceleration of an attached mass, and is determined according to Newton’s second law, F = ma. The acceleration is measured by primary means, using laser interferometry. The capabilities of this system are demonstrated by performing dynamic calibrations of two shear-web-type force transducers up to a frequency of 2 kHz, with an expanded uncertainty below 1.2 %. We give an accounting of all significant sources of uncertainty, including a detailed consideration of the effects of dynamic tilting (rocking), which is a leading source of uncertainty in such harmonic force calibration systems. PMID:29887643
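
    A minimal sketch of the measurement equation and a simplified uncertainty budget is given below; the two uncertainty terms and their magnitudes are illustrative assumptions, and the paper's full budget contains additional contributions, notably dynamic tilting.

```python
import numpy as np

def dynamic_force(mass_kg, accel_amplitude, u_mass_rel, u_accel_rel, k=2):
    """Harmonic force amplitude from F = m * a with a simple budget:
    relative standard uncertainties of the attached mass and of the
    interferometric acceleration amplitude combined in quadrature,
    then expanded with coverage factor k.
    """
    force = mass_kg * accel_amplitude
    u_rel = np.hypot(u_mass_rel, u_accel_rel)   # quadrature combination
    return force, k * u_rel * force

# Illustrative values: 0.5 kg mass, 50 m/s^2 acceleration amplitude,
# 0.01% mass uncertainty, 0.5% acceleration uncertainty.
f, u = dynamic_force(0.5, 50.0, 1e-4, 5e-3)
print(f"F = {f:.2f} N, U(k=2) = {u:.3f} N ({u / f:.2%})")
```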

  12. Paleozoic shale gas resources in the Sichuan Basin, China

    USGS Publications Warehouse

    Potter, Christopher J.

    2018-01-01

    The Sichuan Basin, China, is commonly considered to contain the world’s most abundant shale gas resources. Although its Paleozoic marine shales share many basic characteristics with successful United States gas shales, numerous geologic uncertainties exist, and Sichuan Basin shale gas production is nascent. Gas retention was likely compromised by the age of the shale reservoirs, multiple uplifts and orogenies, and migration pathways along unconformities. High thermal maturities raise questions about gas storage potential in lower Paleozoic shales. Given these uncertainties, a new look at Sichuan Basin shale gas resources is advantageous. As part of a systematic effort to quantitatively assess continuous oil and gas resources in priority basins worldwide, the US Geological Survey (USGS) completed an assessment of Paleozoic shale gas in the Sichuan Basin in 2015. Three organic-rich marine Paleozoic shale intervals meet the USGS geologic criteria for quantitative assessment of shale gas resources: the lower Cambrian Qiongzhusi Formation, the uppermost Ordovician Wufeng through lowermost Silurian Longmaxi Formations (currently producing shale gas), and the upper Permian Longtan and Dalong Formations. This study defined geologically based assessment units and calculated probabilistic distributions of technically recoverable shale gas resources using the USGS well productivity–based method. For six assessment units evaluated in 2015, the USGS estimated a mean value of 23.9 tcf (677 billion cubic meters) of undiscovered, technically recoverable shale gas. This result is considerably lower than volumes calculated in previous shale gas assessments of the Sichuan Basin, highlighting a need for caution in this geologically challenging setting.

  13. Iterative near-term ecological forecasting: Needs, opportunities, and challenges

    USGS Publications Warehouse

    Dietze, Michael C.; Fox, Andrew; Beck-Johnson, Lindsay; Betancourt, Julio L.; Hooten, Mevin B.; Jarnevich, Catherine S.; Keitt, Timothy H.; Kenney, Melissa A.; Laney, Christine M.; Larsen, Laurel G.; Loescher, Henry W.; Lunch, Claire K.; Pijanowski, Bryan; Randerson, James T.; Read, Emily; Tredennick, Andrew T.; Vargas, Rodrigo; Weathers, Kathleen C.; White, Ethan P.

    2018-01-01

    Two foundational questions about sustainability are “How are ecosystems and the services they provide going to change in the future?” and “How do human decisions affect these trajectories?” Answering these questions requires an ability to forecast ecological processes. Unfortunately, most ecological forecasts focus on centennial-scale climate responses, therefore neither meeting the needs of near-term (daily to decadal) environmental decision-making nor allowing comparison of specific, quantitative predictions to new observational data, one of the strongest tests of scientific theory. Near-term forecasts provide the opportunity to iteratively cycle between performing analyses and updating predictions in light of new evidence. This iterative process of gaining feedback, building experience, and correcting models and methods is critical for improving forecasts. Iterative, near-term forecasting will accelerate ecological research, make it more relevant to society, and inform sustainable decision-making under high uncertainty and adaptive management. Here, we identify the immediate scientific and societal needs, opportunities, and challenges for iterative near-term ecological forecasting. Over the past decade, data volume, variety, and accessibility have greatly increased, but challenges remain in interoperability, latency, and uncertainty quantification. Similarly, ecologists have made considerable advances in applying computational, informatic, and statistical methods, but opportunities exist for improving forecast-specific theory, methods, and cyberinfrastructure. Effective forecasting will also require changes in scientific training, culture, and institutions. The need to start forecasting is now; the time for making ecology more predictive is here, and learning by doing is the fastest route to drive the science forward.

  14. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to compare existing usability data to ideal goals or to data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
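
    The following sketch illustrates the core arithmetic of a single layer of such a fuzzy comprehensive evaluation: AHP-derived weights are combined with a fuzzy membership matrix through the weighted-average operator to give an appraisal vector. All matrices, weights, and grade labels are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # One layer of a fuzzy comprehensive evaluation (illustrative values only).
    # Rows of R: usability components (effectiveness, efficiency, satisfaction);
    # columns: appraisal grades given by the expert panel's membership functions.
    R = np.array([
        [0.1, 0.2, 0.5, 0.2],   # effectiveness memberships
        [0.0, 0.3, 0.4, 0.3],   # efficiency memberships
        [0.2, 0.2, 0.4, 0.2],   # satisfaction memberships
    ])

    # Hypothetical AHP-derived weights for the three components (sum to 1).
    w = np.array([0.5, 0.3, 0.2])

    # Weighted-average operator: appraisal vector B = w . R, then normalize.
    B = w @ R
    B /= B.sum()

    grades = ["poor", "fair", "good", "excellent"]
    print(dict(zip(grades, np.round(B, 3))))

    # A scalar usability index can then be read off by scoring the grades,
    # e.g. index = B @ np.array([25, 50, 75, 100]).
    ```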

  15. Iterative near-term ecological forecasting: Needs, opportunities, and challenges.

    PubMed

    Dietze, Michael C; Fox, Andrew; Beck-Johnson, Lindsay M; Betancourt, Julio L; Hooten, Mevin B; Jarnevich, Catherine S; Keitt, Timothy H; Kenney, Melissa A; Laney, Christine M; Larsen, Laurel G; Loescher, Henry W; Lunch, Claire K; Pijanowski, Bryan C; Randerson, James T; Read, Emily K; Tredennick, Andrew T; Vargas, Rodrigo; Weathers, Kathleen C; White, Ethan P

    2018-02-13

    Two foundational questions about sustainability are "How are ecosystems and the services they provide going to change in the future?" and "How do human decisions affect these trajectories?" Answering these questions requires an ability to forecast ecological processes. Unfortunately, most ecological forecasts focus on centennial-scale climate responses, therefore neither meeting the needs of near-term (daily to decadal) environmental decision-making nor allowing comparison of specific, quantitative predictions to new observational data, one of the strongest tests of scientific theory. Near-term forecasts provide the opportunity to iteratively cycle between performing analyses and updating predictions in light of new evidence. This iterative process of gaining feedback, building experience, and correcting models and methods is critical for improving forecasts. Iterative, near-term forecasting will accelerate ecological research, make it more relevant to society, and inform sustainable decision-making under high uncertainty and adaptive management. Here, we identify the immediate scientific and societal needs, opportunities, and challenges for iterative near-term ecological forecasting. Over the past decade, data volume, variety, and accessibility have greatly increased, but challenges remain in interoperability, latency, and uncertainty quantification. Similarly, ecologists have made considerable advances in applying computational, informatic, and statistical methods, but opportunities exist for improving forecast-specific theory, methods, and cyberinfrastructure. Effective forecasting will also require changes in scientific training, culture, and institutions. The need to start forecasting is now; the time for making ecology more predictive is here, and learning by doing is the fastest route to drive the science forward.

  16. Developing an evidence-based methodological framework to systematically compare HTA coverage decisions: A mixed methods study.

    PubMed

    Nicod, Elena; Kanavos, Panos

    2016-01-01

    Health Technology Assessment (HTA) often results in different coverage recommendations across countries for the same medicine despite similar methodological approaches. This paper develops and pilots a methodological framework that systematically identifies the reasons for these differences using an exploratory sequential mixed methods research design. The study countries were England, Scotland, Sweden and France. The methodological framework was built around three stages of the HTA process: (a) evidence, (b) its interpretation, and (c) its influence on the final recommendation; and was applied to two orphan medicinal products. The criteria accounted for at each stage were qualitatively analyzed through thematic analysis. In piloting the framework for the two medicines, eight trials, 43 clinical endpoints and seven economic models were coded 155 times. Eighteen different uncertainties about this evidence were coded 28 times, 56% of which pertained to evidence commonly appraised and 44% to evidence considered by only some agencies. The poor agreement in interpreting this evidence (κ = 0.183) was partly explained by stakeholder input (n_s = 48 times), or by agency-specific risk (n_u = 28 uncertainties) and value preferences (n_oc = 62 "other considerations"), derived through correspondence analysis. Accounting for variability at each stage of the process can be achieved by codifying its existence and quantifying its impact through the application of this framework. The transferability of this framework to other disease areas, medicines and countries is ensured by its iterative and flexible nature, and detailed description. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. 6th international conference on case histories in geotechnical engineering August 2008 conference report.

    DOT National Transportation Integrated Search

    2009-01-01

    Due to uncertainty in the nature of soils, a systematic study of the performance of geotechnical structures and its match with predictions is extremely important. Therefore, considerable research effort is being devoted to geotechnical engineering th...

  18. Uncertainties in biological responses that influence hazard and risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Substances (EDSs) may have certain biological effects including delayed effects, multigenerational effects, and non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evalu...

  19. Computational Toxicology in Cancer Risk Assessment

    EPA Science Inventory

    Risk assessment over the last half century has, for many individual cases, served us well, but it has proceeded at an extremely slow pace and has left us with considerable uncertainty. There are certainly thousands of compounds and thousands of exposure scenarios that remain unteste...

  20. SETAC: Uncertainties in biological responses that influence hazard or risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Substances (EDSs) may have certain biological effects including delayed effects, multigenerational effects, and non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evalu...

  1. The atmospheric effects of stratospheric aircraft: A fourth program report

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S. (Editor); Wesoky, Howard L. (Editor); Wofsy, Steven C.; Ravishankara, A. R.; Rodriguez, Jose M.; Grose, William L.

    1995-01-01

    This document presents the fourth report from the Atmospheric Effects of Stratospheric Aircraft (AESA) component of NASA's High-Speed Research Program (HSRP). Market and technology considerations continue to provide an impetus for high-speed civil transport research. A recent AESA interim assessment report and a review of that report have shown that considerable uncertainty still exists about the possible impact of aircraft on the atmosphere. The AESA has been designed to develop the body of scientific knowledge necessary for the evaluation of the impact of stratospheric aircraft on the atmosphere. The first Program report presented the basic objectives and plans for AESA. This fourth report comes after the interim assessment and sets forth directions for the 1995 assessment at the end of AESA Phase 1. It also sets forth the goals and directions for AESA Phase 2, as reported at the 1994 Atmospheric Effects of Aviation Project (AEAP) annual meeting held in June. The focus of the Phase 2 effort is to obtain the best possible closure on the outstanding problems identified in the interim assessment and NASA/NRC review. Topics discussed in this report include how high-speed civil transports (HSCTs) might affect stratospheric ozone, emissions scenarios and databases to assess potential atmospheric effects from HSCTs, calculated results from 2-D zonal mean models using emissions data, and engine trace constituent measurements.

  2. Water as consumed and its impact on the consumer--do we understand the variables?

    PubMed

    Bates, A J

    2000-01-01

    Water is the most important natural resource in the world; without it, life cannot exist. In 1854 a cholera outbreak in London caused 10,000 deaths and positively linked enteric disease with bacterial contamination of drinking water by sewage pollution. Since then, adequate water hygiene standards and sewage purification have played the most significant role in disease eradication and public health improvements everywhere. Standards for drinking water have become an extensive range of microbiological and chemical parametric values, which has not increased consumer confidence, if the media is to be believed. Customers rightly expect that the water they drink is safe and wholesome. Standard setting is perceived as a precise science and meaningful to health. Is this justified, and do the scientists and regulators who derive and set the standards understand the uncertainties in the system? Water is the universal solvent, therefore it will never be pure; it will contain impurities prior to and after treatment. Knowledge of its potential to become contaminated is necessary to understand the epidemiology associated with waterborne contaminants and their effects. Water use patterns vary considerably and affect assumptions based on toxicology derived from laboratory studies under tightly controlled conditions. Consideration must be given to the model systems used to assess toxicity and translate results from the laboratory to the real world, if sensible scientifically-based water quality standards are to be set and achieved cost effectively.

  3. Cloud Ice: A Climate Model Challenge With Signs and Expectations of Progress

    NASA Astrophysics Data System (ADS)

    Li, F.; Waliser, D.; Bacmeister, J.; Chern, J.; Del Genio, T.; Jiang, J.; Kharitondov, M.; Liou, K.; Meng, H.; Minnis, P.; Rossow, B.; Stephens, G.; Sun-Mack, S.; Tao, W.; Vane, D.; Woods, C.; Tompkins, A.; Wu, D.

    2007-12-01

    Global climate models (GCMs), including those assessed in the IPCC AR4, exhibit considerable disagreement in the amount of cloud ice, both in terms of the annual global mean and its spatial variability. Global measurements of cloud ice have been difficult due to the challenges involved in remotely sensing ice water content (IWC) and its vertical profile - including complications associated with multi-level clouds, mixed phases and multiple hydrometeor types, the uncertainty in classifying ice particle size and shape for remote retrievals, and the relatively small time and space scales associated with deep convection. Together, these measurement difficulties make it a challenge to characterize and understand the mechanisms of ice cloud formation and dissipation. Fortunately, there are new observational resources recently established that can be expected to lead to considerable reduction in the observational uncertainties of cloud ice, and in turn improve the fidelity of model representations. Specifically, these include the Microwave Limb Sounder (MLS) on the Earth Observing System (EOS) Aura satellite, and the CloudSat and Calipso satellite missions, all of which fly in formation in what is referred to as the A-Train. Based on radar and limb-sounding techniques, these new satellite measurements provide a considerable leap forward in terms of the information gathered regarding upper-tropospheric cloud IWC as well as other macrophysical and microphysical properties. In this presentation, we describe the current state of GCM representations of cloud ice and their associated uncertainties, the nature of the new observational resources for constraining cloud ice values in GCMs, the challenges in making model-data comparisons with these data resources, and prospects for near-term improvements in model representations.

  4. Incorporating the effects of socioeconomic uncertainty into priority setting for conservation investment.

    PubMed

    McBride, Marissa F; Wilson, Kerrie A; Bode, Michael; Possingham, Hugh P

    2007-12-01

    Uncertainty in the implementation and outcomes of conservation actions, if not accounted for, leaves conservation plans vulnerable to potential changes in future conditions. We used a decision-theoretic approach to investigate the effects of two types of investment uncertainty on the optimal allocation of global conservation resources for land acquisition in the Mediterranean Basin. We considered uncertainty about (1) whether investment will continue and (2) whether the acquired biodiversity assets are secure, which we termed transaction uncertainty and performance uncertainty, respectively. We also developed and tested the robustness of different rules of thumb for guiding the allocation of conservation resources when these sources of uncertainty exist. In the presence of uncertainty in future investment ability (transaction uncertainty), the optimal strategy was opportunistic, meaning the investment priority should be to act where uncertainty is highest while investment remains possible. When there was a probability that investments would fail (performance uncertainty), the optimal solution became a complex trade-off between the immediate biodiversity benefits of acting in a region and the perceived longevity of the investment. In general, regions were prioritized for investment when they had the greatest performance certainty, even if an alternative region was highly threatened or had higher biodiversity value. The improved performance of rules of thumb when accounting for uncertainty highlights the importance of explicitly incorporating sources of investment uncertainty and evaluating potential conservation investments in the context of their likely long-term success.

  5. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also has the benefit of increasing the transparency of occupational exposure limit derivation. Improved characterization of the scientific basis for uncertainty factors has led to increasing rigor and transparency in their application as part of the overall occupational exposure limit derivation process.
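
    To make the arithmetic of this pattern concrete, the sketch below divides a hypothetical point of departure by the product of the factor types listed above plus a modifying factor. Every number is an illustrative assumption, not a recommended value for any substance.

    ```python
    # Generic uncertainty-factor pattern: an occupational exposure limit is
    # derived by dividing a point of departure (here a hypothetical animal
    # NOAEL) by the product of the applicable uncertainty factors.
    pod_mg_per_m3 = 50.0   # hypothetical NOAEL from a subchronic animal study

    uncertainty_factors = {
        "interspecies (animal -> human)": 3.0,
        "intraspecies (human variability)": 3.0,
        "subchronic -> chronic duration": 2.0,
        "database insufficiency": 1.0,
        "modifying factor": 1.0,
    }

    composite_uf = 1.0
    for name, uf in uncertainty_factors.items():
        composite_uf *= uf

    oel = pod_mg_per_m3 / composite_uf
    print(f"composite UF = {composite_uf:g}, OEL ~ {oel:.1f} mg/m3")
    ```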

  7. The role of the uncertainty of measurement of serum creatinine concentrations in the diagnosis of acute kidney injury.

    PubMed

    Kin Tekce, Buket; Tekce, Hikmet; Aktas, Gulali; Uyeturk, Ugur

    2016-01-01

    Uncertainty of measurement is the numeric expression of the errors associated with all measurements taken in clinical laboratories. Serum creatinine concentration is the most common diagnostic marker for acute kidney injury. The goal of this study was to determine the effect of the uncertainty of measurement of serum creatinine concentrations on the diagnosis of acute kidney injury. We calculated the uncertainty of measurement of serum creatinine according to the Nordtest Guide. Retrospectively, we identified 289 patients who were evaluated for acute kidney injury. Of the total patient pool, 233 were diagnosed with acute kidney injury using the AKIN classification scheme and then were compared using statistical analysis. We determined nine probabilities of the uncertainty of measurement of serum creatinine concentrations. There was a statistically significant difference in the number of patients diagnosed with acute kidney injury when uncertainty of measurement was taken into consideration (first probability compared to the fifth, p = 0.023, and first probability compared to the ninth, p = 0.012). We found that the uncertainty of measurement for serum creatinine concentrations was an important factor for correctly diagnosing acute kidney injury. In addition, based on the AKIN classification scheme, minimizing the total allowable error levels for serum creatinine concentrations is necessary for the accurate diagnosis of acute kidney injury by clinicians.
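
    As a rough illustration of how such uncertainty can span a diagnostic threshold, the sketch below combines a within-laboratory reproducibility component and a bias component in the Nordtest manner and compares a hypothetical creatinine rise against the AKIN stage 1 criterion (an increase of at least 26.5 µmol/L). All input values are assumptions, not data from the study.

    ```python
    import math

    # Nordtest-style combination of reproducibility and bias into an
    # expanded uncertainty, applied to an AKIN creatinine criterion.
    u_rw   = 2.5   # within-lab reproducibility of creatinine, umol/L (assumed)
    u_bias = 1.8   # uncertainty component from method bias, umol/L (assumed)

    u_c = math.sqrt(u_rw**2 + u_bias**2)   # combined standard uncertainty
    U   = 2.0 * u_c                        # expanded uncertainty, coverage k = 2

    baseline, current = 70.0, 98.0         # umol/L, hypothetical patient values
    delta = current - baseline             # AKIN stage 1: rise >= 26.5 umol/L
    print(f"U = {U:.1f} umol/L; observed rise = {delta:.1f} umol/L")

    # The rise nominally exceeds 26.5 umol/L, but the expanded uncertainty of
    # the difference of two results, sqrt(2) * U, spans the threshold, so the
    # classification can flip once measurement uncertainty is considered.
    print(f"uncertainty of the difference ~ {math.sqrt(2) * U:.1f} umol/L")
    ```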

  8. Emotion and Decision-Making Under Uncertainty: Physiological arousal predicts increased gambling during ambiguity but not risk

    PubMed Central

    FeldmanHall, Oriel; Glimcher, Paul; Baker, Augustus L; Phelps, Elizabeth A

    2016-01-01

    Uncertainty, which is ubiquitous in decision-making, can be fractionated into known probabilities (risk) and unknown probabilities (ambiguity). Although research illustrates that individuals more often avoid decisions associated with ambiguity compared to risk, it remains unclear why ambiguity is perceived as more aversive. Here we examine the role of arousal in shaping the representation of value and subsequent choice under risky and ambiguous decisions. To investigate the relationship between arousal and decisions of uncertainty, we measure skin conductance response—a quantifiable measure reflecting sympathetic nervous system arousal—during choices to gamble under risk and ambiguity. To quantify the discrete influences of risk and ambiguity sensitivity and the subjective value of each option under consideration, we model fluctuating uncertainty, as well as the amount of money that can be gained by taking the gamble. Results reveal that while arousal tracks the subjective value of a lottery regardless of uncertainty type, arousal differentially contributes to the computation of value—i.e. choice—depending on whether the uncertainty is risky or ambiguous: enhanced arousal adaptively decreases risk-taking only when the lottery is highly risky but increases risk-taking when the probability of winning is ambiguous (even after controlling for subjective value). Together, this suggests that the role of arousal during decisions of uncertainty is modulatory and highly dependent on the context in which the decision is framed. PMID:27690508

  9. Trajectory Dispersed Vehicle Process for Space Launch System

    NASA Technical Reports Server (NTRS)

    Statham, Tamara; Thompson, Seth

    2017-01-01

    The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans that include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development as they have significant effects on focus parameters such as lift-off thrust-to-weight, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties by utilizing a 3-degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. This process utilizes a Design of Experiments (DOE) and response surface methodologies (RSM) to statistically sample uncertainties and to develop resulting vehicles using a Maximum Likelihood Estimate (MLE) process that targets uncertainty biases. These vehicles represent various missions and configurations, which are used as key inputs into a variety of analyses in the SLS design process, including 6 DOF dispersions, separation clearances, and engine out failure studies.
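
    The sketch below illustrates the general DOE-plus-response-surface pattern described here, with a made-up two-parameter response standing in for the 3 DOF trajectory simulation; it does not reproduce the actual SLS tools or data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "focus parameter" (e.g. max dynamic pressure) as a function
    # of two manufacturing uncertainties in standard-deviation units.
    def focus_parameter(x):
        return 100 + 3.0 * x[..., 0] - 2.0 * x[..., 1] + 0.5 * x[..., 0] * x[..., 1]

    # Space-filling design over the uncertainty space, plus simulated noise.
    X = rng.uniform(-3, 3, size=(200, 2))
    y = focus_parameter(X) + rng.normal(0, 0.1, 200)

    # Fit a quadratic response surface by least squares.
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Scan the cheap surrogate on a dense grid to locate the dispersed extreme.
    g = np.linspace(-3, 3, 61)
    G1, G2 = np.meshgrid(g, g)
    P = np.column_stack([np.ones(G1.size), G1.ravel(), G2.ravel(),
                         G1.ravel()**2, G2.ravel()**2, G1.ravel() * G2.ravel()])
    pred = P @ coef
    i = np.argmax(pred)
    print(f"surrogate max ~ {pred[i]:.1f} at x = ({G1.ravel()[i]:.1f}, {G2.ravel()[i]:.1f})")
    ```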

  10. Entropy of hydrological systems under small samples: Uncertainty and variability

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Wang, Yuankun; Wu, Jichun; Singh, Vijay P.; Zeng, Xiankui; Wang, Lachun; Chen, Yuanfang; Chen, Xi; Zhang, Liyuan; Gu, Shenghua

    2016-01-01

    Entropy theory has been increasingly applied in hydrology in both descriptive and inferential ways. However, little attention has been given to the small-sample condition widespread in hydrological practice, where hydrological measurements are limited or even nonexistent. Accordingly, entropy estimated under this condition may incur considerable bias. In this study, the small-sample condition is considered and two innovative entropy estimators, the Chao-Shen (CS) estimator and the James-Stein-type shrinkage (JSS) estimator, are introduced. Simulation tests conducted with distributions common in hydrology show the JSS estimator to perform best. Then, multi-scale moving entropy-based hydrological analyses (MM-EHA) are applied to indicate the changing patterns of uncertainty of streamflow data collected from the Yangtze River and the Yellow River, China. To further investigate the intrinsic properties of entropy applied in hydrological uncertainty analyses, correlations between entropy and other statistics at different time scales are also calculated; these show connections between the concepts of uncertainty and variability.
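
    A minimal implementation of the JSS estimator, following Hausser and Strimmer's analytic shrinkage intensity, is sketched below; the example counts are invented, not Yangtze or Yellow River data.

    ```python
    import numpy as np

    # James-Stein-type shrinkage (JSS) entropy estimator: maximum-likelihood
    # cell frequencies are shrunk toward the uniform distribution before
    # plugging into the Shannon entropy formula.
    def jss_entropy(counts):
        counts = np.asarray(counts, dtype=float)
        n = counts.sum()
        K = counts.size
        theta_ml = counts / n              # maximum-likelihood frequencies
        target = np.full(K, 1.0 / K)       # shrinkage target: uniform

        # Analytic shrinkage intensity, clipped to [0, 1].
        num = 1.0 - np.sum(theta_ml**2)
        den = (n - 1.0) * np.sum((target - theta_ml)**2)
        lam = 1.0 if den == 0 else min(1.0, max(0.0, num / den))

        theta = lam * target + (1.0 - lam) * theta_ml
        theta = theta[theta > 0]           # guard against log(0)
        return -np.sum(theta * np.log(theta))  # entropy in nats

    # e.g. a small sample of binned streamflow values (hypothetical counts)
    print(jss_entropy([12, 7, 3, 1, 0, 0, 1, 0]))
    ```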

  11. Gaussian Process Interpolation for Uncertainty Estimation in Image Registration

    PubMed Central

    Wachinger, Christian; Golland, Polina; Reuter, Martin; Wells, William

    2014-01-01

    Intensity-based image registration requires resampling images on a common grid to evaluate the similarity function. The uncertainty of interpolation varies across the image, depending on the location of resampled points relative to the base grid. We propose to perform Bayesian inference with Gaussian processes, where the covariance matrix of the Gaussian process posterior distribution estimates the uncertainty in interpolation. The Gaussian process replaces a single image with a distribution over images that we integrate into a generative model for registration. Marginalization over resampled images leads to a new similarity measure that includes the uncertainty of the interpolation. We demonstrate that our approach increases the registration accuracy and propose an efficient approximation scheme that enables seamless integration with existing registration methods. PMID:25333127
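
    The sketch below shows the underlying idea in one dimension: a Gaussian process conditioned on base-grid samples yields a posterior variance that vanishes at grid points and peaks between them, mirroring the location-dependent interpolation uncertainty described in the abstract. The kernel, signal, and noise values are illustrative assumptions.

    ```python
    import numpy as np

    # Squared-exponential (RBF) kernel between two sets of 1-D locations.
    def rbf(a, b, ell=1.0, sigma=1.0):
        d = a[:, None] - b[None, :]
        return sigma**2 * np.exp(-0.5 * (d / ell)**2)

    x_grid = np.arange(0.0, 10.0)            # base grid locations
    y_grid = np.sin(x_grid)                  # observed intensities (toy image row)
    x_new = np.linspace(0.0, 9.0, 91)        # resampling locations

    noise = 1e-6
    K = rbf(x_grid, x_grid) + noise * np.eye(len(x_grid))
    Ks = rbf(x_new, x_grid)
    Kss = rbf(x_new, x_new)

    # Standard GP posterior via Cholesky factorization.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_grid))
    mean = Ks @ alpha                        # posterior mean "image"
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v**2, axis=0)  # posterior variance

    # Variance is ~0 at grid points and largest midway between them.
    print(var[::10].round(4))
    ```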

  12. Economic and environmental costs of regulatory uncertainty for coal-fired power plants.

    PubMed

    Patiño-Echeverri, Dalia; Fischbeck, Paul; Kriegler, Elmar

    2009-02-01

    Uncertainty about the extent and timing of CO2 emissions regulations for the electricity-generating sector exacerbates the difficulty of selecting investment strategies for retrofitting or alternatively replacing existing coal-fired power plants. This may result in inefficient investments imposing economic and environmental costs on society. In this paper, we construct a multiperiod decision model with an embedded multistage stochastic dynamic program minimizing the expected total costs of plant operation, installations, and pollution allowances. We use the model to forecast optimal sequential investment decisions of a power plant operator with and without uncertainty about future CO2 allowance prices. The comparison of the two cases demonstrates that uncertainty about future CO2 emissions regulations might cause significant economic costs and higher air emissions.
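
    Collapsed to a single now-or-wait choice, the expected-cost logic of such a model looks roughly like the sketch below; all prices, probabilities, and plant figures are invented and the multistage recursion is deliberately omitted.

    ```python
    # Expected-cost comparison for one retrofit decision under an uncertain
    # future CO2 allowance price (illustrative two-scenario toy problem).
    scenarios = [        # (probability, allowance price in $/tCO2)
        (0.5, 10.0),     # weak regulation
        (0.5, 60.0),     # strict regulation
    ]
    emissions_t = 4e6        # annual tCO2 of the existing coal unit
    retrofit_cost = 6e8      # capital cost of a CO2 capture retrofit
    captured_frac = 0.9      # fraction of emissions avoided after retrofit
    years = 10               # evaluation horizon (undiscounted, for brevity)

    def expected_cost(retrofit_now):
        cost = retrofit_cost if retrofit_now else 0.0
        for p, price in scenarios:
            residual = emissions_t * ((1 - captured_frac) if retrofit_now else 1.0)
            cost += p * residual * price * years
        return cost

    for choice in (True, False):
        print(f"retrofit now = {choice}: expected cost = ${expected_cost(choice):,.0f}")

    # With perfect foresight the operator would retrofit only in the strict
    # scenario; the gap between that cost and the best here-and-now choice is
    # one way to express the expected cost of regulatory uncertainty.
    ```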

  13. Water resources in the twenty-first century; a study of the implications of climate uncertainty

    USGS Publications Warehouse

    Moss, Marshall E.; Lins, Harry F.

    1989-01-01

    The interactions of the water resources on and within the surface of the Earth with the atmosphere that surrounds it are exceedingly complex. Increased uncertainty therefore attaches to the availability of water of usable quality in the 21st century because of potential anthropogenic changes in the global climate system. For the U.S. Geological Survey to continue to fulfill its mission with respect to assessing the Nation's water resources, an expanded program to study the hydrologic implications of climate uncertainty will be required. The goal for this program is to develop knowledge and information concerning the potential water-resources implications for the United States of uncertainties in climate that may result from both anthropogenic and natural changes of the Earth's atmosphere. Like most past and current water-resources programs of the Geological Survey, the climate-uncertainty program should be composed of three elements: (1) research, (2) data collection, and (3) interpretive studies. However, unlike most other programs, the climate-uncertainty program necessarily will be dominated by its research component during its early years. Critical new concerns to be addressed by the research component are (1) areal estimates of evapotranspiration, (2) hydrologic resolution within atmospheric (climatic) models at the global scale and at mesoscales, (3) linkages between hydrology and climatology, and (4) methodology for the design of data networks that will help to track the impacts of climate change on water resources. Other ongoing activities in U.S. Geological Survey research programs will be enhanced to make them more compatible with climate-uncertainty research needs. The existing hydrologic data base of the Geological Survey serves as a key element in assessing hydrologic and climatologic change. However, this data base has evolved in response to other needs for hydrologic information and probably is not as sensitive to climate change as is desirable. Therefore, as measurement and network-design methodologies are improved to account for climate-change potential, new data-collection activities will be added to the existing programs. One particular area of data-collection concern pertains to the phenomenon of evapotranspiration. Interpretive studies of the hydrologic implications of climate uncertainty will be initiated by establishing several studies at the river-basin scale in diverse hydroclimatic and demographic settings. These studies will serve as tests of the existing methodologies for studying the impacts of climate change and also will help to define subsequent research priorities. A prototype for these studies was initiated in early 1988 in the Delaware River basin.

  14. Uncertainty of inhalation dose coefficients for representative physical and chemical forms of iodine-131

    NASA Astrophysics Data System (ADS)

    Harvey, Richard Paul, III

    Releases of radioactive material have occurred at various Department of Energy (DOE) weapons facilities and facilities associated with the nuclear fuel cycle in the generation of electricity. Many different radionuclides have been released to the environment with resulting exposure of the population to these various sources of radioactivity. Radioiodine has been released from a number of these facilities and is a potential public health concern due to its physical and biological characteristics. Iodine exists as various isotopes, but our focus is on 131I due to its relatively long half-life, its prevalence in atmospheric releases and its contribution to offsite dose. The assumption of physical and chemical form is speculated to have a profound impact on the deposition of radioactive material within the respiratory tract. In the case of iodine, it has been shown that more than one type of physical and chemical form may be released to, or exist in, the environment; iodine can exist as a particle or as a gas. The gaseous species can be further segregated based on chemical form: elemental, inorganic, and organic iodides. Chemical compounds in each class are assumed to behave similarly with respect to biochemistry. Studies at Oak Ridge National Laboratory have demonstrated that 131I is released as a particulate, as well as in elemental, inorganic and organic chemical form. The internal dose estimate from 131I may be very different depending on the effect that chemical form has on fractional deposition, gas uptake, and clearance in the respiratory tract. There are many sources of uncertainty in the estimation of environmental dose including source term, airborne transport of radionuclides, and internal dosimetry. Knowledge of uncertainty in internal dosimetry is essential for estimating dose to members of the public and for determining total uncertainty in dose estimation. An important calculational step in any lung model is the regional estimation of deposition fractions and gas uptake of radionuclides in various regions of the lung. Variability in regional radionuclide deposition within lung compartments may significantly contribute to the overall uncertainty of the lung model. The uncertainty of lung deposition and biological clearance is dependent upon physiological and anatomical parameters of individuals as well as characteristic parameters of the particulate material. These parameters introduce uncertainty into internal dose estimates due to their inherent variability. Anatomical and physiological input parameters are age and gender dependent. This work has determined the uncertainty in internal dose estimates and the sensitive parameters involved in modeling particulate deposition and gas uptake of different physical and chemical forms of 131I with age and gender dependencies.

  15. Monitoring, reporting and verifying emissions in the climate economy

    NASA Astrophysics Data System (ADS)

    Bellassen, Valentin; Stephan, Nicolas; Afriat, Marion; Alberola, Emilie; Barker, Alexandra; Chang, Jean-Pierre; Chiquet, Caspar; Cochran, Ian; Deheza, Mariana; Dimopoulos, Christopher; Foucherot, Claudine; Jacquier, Guillaume; Morel, Romain; Robinson, Roderick; Shishlov, Igor

    2015-04-01

    The monitoring, reporting and verification (MRV) of greenhouse-gas emissions is the cornerstone of carbon pricing and management mechanisms. Here we consider peer-reviewed articles and 'grey literature' related to existing MRV requirements and their costs. A substantial part of the literature is the regulatory texts of the 15 most important carbon pricing and management mechanisms currently implemented. Based on a comparison of key criteria such as the scope, cost, uncertainty and flexibility of procedures, we conclude that conventional wisdom on MRV is not often promoted in existing carbon pricing mechanisms. Quantification of emissions uncertainty and incentives to reduce this uncertainty are usually only partially applied, if at all. Further, the time and resources spent on small sources of emissions would be expected to be limited. Although provisions aiming at an effort proportionate to the amount of emissions at stake -- 'materiality' -- are widespread, they are largely outweighed by economies of scale: in all schemes, MRV costs per tonne are primarily driven by the size of the source.

  16. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
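
    A highly simplified version of this emulate-then-rank workflow, assuming scikit-learn is available and with a toy function standing in for CLASS, might look like the following.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(1)

    # Toy "land model": a hypothetical snow output driven mostly by
    # parameters 0 and 2 (stand-in for the dynamical CLASS simulation).
    def toy_land_model(X):
        return 2.0 * X[:, 0] + 0.1 * X[:, 1] + 1.5 * X[:, 2]**2 \
            + rng.normal(0, 0.05, len(X))

    # Training design: 400 sampled parameter sets over normalized ranges.
    X_train = rng.uniform(0, 1, size=(400, 4))
    y_train = toy_land_model(X_train)

    # Cheap SVR emulator of the expensive model.
    emulator = SVR(C=10.0, gamma="scale").fit(X_train, y_train)

    # Rank parameter influence with random-forest permutation importance
    # on a held-out design.
    X_test = rng.uniform(0, 1, size=(200, 4))
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    imp = permutation_importance(rf, X_test, toy_land_model(X_test),
                                 n_repeats=10, random_state=0)
    print("parameter importances:", imp.importances_mean.round(3))
    ```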

  17. Uncertainty in Estimates of Net Seasonal Snow Accumulation on Glaciers from In Situ Measurements

    NASA Astrophysics Data System (ADS)

    Pulwicki, A.; Flowers, G. E.; Radic, V.

    2017-12-01

    Accurately estimating the net seasonal snow accumulation (or "winter balance") on glaciers is central to assessing glacier health and predicting glacier runoff. However, measuring and modeling snow distribution is inherently difficult in mountainous terrain, resulting in high uncertainties in estimates of winter balance. Our work focuses on uncertainty attribution within the process of converting direct measurements of snow depth and density to estimates of winter balance. We collected more than 9000 direct measurements of snow depth across three glaciers in the St. Elias Mountains, Yukon, Canada in May 2016. Linear regression (LR) and simple kriging (SK), combined with cross correlation and Bayesian model averaging, are used to interpolate estimates of snow water equivalent (SWE) from snow depth and density measurements. Snow distribution patterns are found to differ considerably between glaciers, highlighting strong inter- and intra-basin variability. Elevation is found to be the dominant control of the spatial distribution of SWE, but the relationship varies considerably between glaciers. A simple parameterization of wind redistribution is also a small but statistically significant predictor of SWE. The SWE estimated for one study glacier has a short range parameter (90 m) and both LR and SK estimate a winter balance of 0.6 m w.e. but are poor predictors of SWE at measurement locations. The other two glaciers have longer SWE range parameters (~450 m) and, due to differences in extrapolation, SK estimates are more than 0.1 m w.e. (up to 40%) lower than LR estimates. By using a Monte Carlo method to quantify the effects of various sources of uncertainty, we find that the interpolation of estimated values of SWE is a larger source of uncertainty than the assignment of snow density or than the representation of the SWE value within a terrain model grid cell. For our study glaciers, the total winter balance uncertainty ranges from 0.03 (8%) to 0.15 (54%) m w.e. depending primarily on the interpolation method. Despite the challenges associated with accurately and precisely estimating winter balance, our results are consistent with the previously reported regional accumulation gradient.

  18. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, experimental data collection is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
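
    The statistical step can be sketched as follows: the spread of a focus quantity across N perturbed runs is turned into a t-based uncertainty interval. The run values below are invented, and scipy is assumed to be available.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical heat transfer coefficients from N CFD runs with
    # perturbed inputs (tolerance / bias variations), W/(m^2 K).
    h_runs = np.array([101.2, 98.7, 103.5, 99.9, 102.1, 100.4])

    n = len(h_runs)
    mean = h_runs.mean()
    s = h_runs.std(ddof=1)                 # sample standard deviation
    t95 = stats.t.ppf(0.975, df=n - 1)     # two-sided 95% coverage factor

    U = t95 * s / np.sqrt(n)               # uncertainty of the mean estimate
    print(f"h = {mean:.1f} +/- {U:.1f} W/(m^2 K) (95%, t-distribution, n={n})")
    ```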

  19. Buoyancy contribution to uncertainty of mass, conventional mass and force

    NASA Astrophysics Data System (ADS)

    Malengo, Andrea; Bich, Walter

    2016-04-01

    Conventional mass is a useful concept introduced to reduce the impact of the buoyancy correction in everyday mass measurements, thus avoiding in most cases its accurate determination, necessary in measurements of ‘true’ mass. Although usage of conventional mass is universal and standardized, the concept is considered a sort of second-choice tool, to be avoided in high-accuracy applications. In this paper we show that this is a false belief, by elucidating the role played by covariances between volume and mass and between volume and conventional mass at the various stages of the dissemination chain and in the relationship between the uncertainties of mass and conventional mass. We arrive at somewhat counter-intuitive results: the volume of the transfer standard plays a comparatively minor role in the uncertainty budget of the standard under calibration. In addition, conventional mass is preferable to mass in normal, in-air operation, as its uncertainty is smaller than that of mass, if covariance terms are properly taken into account, and the overstatement of uncertainty that (typically) results from neglecting them is less severe than the one that (always) occurs with mass. The same considerations hold for force. In this respect, we show that the associated uncertainty is the same using mass or conventional mass, and, again, that the latter is preferable if covariance terms are neglected.
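
    For reference, conventional mass is defined by equating the in-air balance condition of the body with that of a hypothetical reference weight of density 8000 kg/m³ weighed in air of the reference density 1.2 kg/m³ (the OIML convention):

    ```latex
    % Conventional mass m_c of a body of mass m and density \rho:
    % the mass of a reference weight of density \rho_c = 8000 kg/m^3
    % that balances the body in air of density \rho_0 = 1.2 kg/m^3.
    \[
      m_c \left( 1 - \frac{\rho_0}{\rho_c} \right)
      = m \left( 1 - \frac{\rho_0}{\rho} \right),
      \qquad
      m_c = m \, \frac{1 - \rho_0/\rho}{1 - \rho_0/\rho_c}.
    \]
    ```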

  20. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    PubMed

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments, the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation on the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given a too narrow definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  1. Artificial neural network modelling of uncertainty in gamma-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Dragović, S.; Onjia, A.; Stanković, S.; Aničin, I.; Bačić, G.

    2005-03-01

    An artificial neural network (ANN) model for the prediction of measuring uncertainties in gamma-ray spectrometry was developed and optimized. A three-layer feed-forward ANN with a back-propagation learning algorithm was used to model uncertainties of measurement of activity levels of eight radionuclides (226Ra, 238U, 235U, 40K, 232Th, 134Cs, 137Cs and 7Be) in soil samples as a function of measurement time. It was shown that the neural network provides useful data even from small experimental databases. The performance of the optimized neural network was found to be very good, with correlation coefficients (R²) between measured and predicted uncertainties ranging from 0.9050 to 0.9915. The correlation coefficients did not significantly deteriorate when the network was tested on samples with greatly different uranium-to-thorium (238U/232Th) ratios. The differences between measured and predicted uncertainties were not influenced by the absolute values of uncertainties of measured radionuclide activities. Once the ANN is trained, it could be employed in analyzing soil samples regardless of the 238U/232Th ratio. It was concluded that a considerable saving in time could be obtained using the trained neural network model for predicting the measurement times needed to attain the desired statistical accuracy.

  2. Proteomic analysis of a model fish species exposed to individual pesticides and a binary mixture

    EPA Science Inventory

    Aquatic organisms are often exposed to multiple pesticides simultaneously. Due to the relatively poor characterization of mixture constituent interactions and the potential for highly complex exposure scenarios, there is considerable uncertainty in understanding the toxicity of m...

  3. Uncertainties in biological responses that influence hazard or risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Chemicals (EDCs) may have delayed or transgenerational effects and display non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evaluated for the SETAC Pellston Workshop&...

  4. Does grazing management matter for soil carbon sequestration in shortgrass steppe?

    USDA-ARS?s Scientific Manuscript database

    Considerable uncertainty remains regarding the potential of grazing management on semiarid rangelands to sequester soil carbon. Short-term (less than 1 decade) studies have determined that grazing management potentially influences fluxes of carbon, but such studies are strongly influenced by prevail...

  5. A study on the impact of parameter uncertainty on the emission-based ranking of transportation projects.

    DOT National Transportation Integrated Search

    2014-01-01

    With the growing concern with air quality levels and, hence, the livability of urban regions in the nation, it has become increasingly common to incorporate vehicular emission considerations in the ranking of transportation projects. Network assignme...

  6. Risk management consideration in the bioeconomy

    Treesearch

    Camilla Abbati de Assis; Ronalds Gonzalez; Stephen Kelley; Hasan Jameel; Ted Bilek; Jesse Daystar; Robert Handfield; Jay Golden; Jeff Prestemon; Damien Singh

    2017-01-01

    In investing in a new venture, companies aim to increase their competitiveness and generate value in scenarios where volatile markets, geopolitical instabilities, and disruptive technologies create uncertainty and risk. The biobased industry poses additional challenges as it competes in a mature, highly efficient market, dominated by...

  7. Credibilistic multi-period portfolio optimization based on scenario tree

    NASA Astrophysics Data System (ADS)

    Mohebbi, Negin; Najafi, Amir Abbas

    2018-02-01

    In this paper, we consider a multi-period fuzzy portfolio optimization model that accounts for transaction costs and the possibility of risk-free investment. We formulate a bi-objective mean-VaR portfolio selection model based on the integration of fuzzy credibility theory and a scenario tree in order to deal with market uncertainty. The scenario tree is also a proper method for modeling multi-period portfolio problems, given the length and continuity of their horizon. We take return and risk, as well as cardinality, threshold, class, and liquidity constraints, into consideration for further compliance of the model with reality. Then, an interactive dynamic programming method, which is based on a two-phase fuzzy interactive approach, is employed to solve the proposed model. In order to verify the proposed model, we present an empirical application in the NYSE under different circumstances. The results show that the consideration of data uncertainty and other real-world assumptions leads to more practical and efficient solutions.

  8. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    PubMed

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
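
    The adaptation amounts to treating the cause-of-death table as a probability distribution, as in the sketch below; the counts are invented for illustration, not Chilean vital statistics.

    ```python
    import math
    from collections import Counter

    # Shannon entropy of a cause-of-death frequency table as a summary of
    # the uncertainty carried by the register (toy counts).
    deaths = Counter({"circulatory": 25000, "cancer": 22000,
                      "respiratory": 9000, "external": 8000,
                      "ill-defined": 4000})

    n = sum(deaths.values())
    H = -sum((c / n) * math.log2(c / n) for c in deaths.values())
    print(f"H = {H:.3f} bits (max {math.log2(len(deaths)):.3f} bits)")

    # Entropy is maximal when deaths are spread evenly across categories;
    # concentration of the counts in a few causes lowers it.
    ```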

  9. On entropic uncertainty relations in the presence of a minimal length

    NASA Astrophysics Data System (ADS)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. The position and momentum operators then satisfy a modified commutation relation, for which more than one algebraic representation is known. One of them is described by an auxiliary momentum, so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with a correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account the finiteness of measurement resolution.
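
    For orientation, the unmodified position-momentum relation for differential Shannon entropies, and the minimal-length commutator that such studies start from, are commonly quoted in the following forms:

    ```latex
    % Bialynicki-Birula--Mycielski relation for differential entropies:
    \[
      h(x) + h(p) \;\ge\; \ln(e \pi \hbar).
    \]
    % Generalized uncertainty principle with deformation parameter \beta > 0,
    % which implies a minimal observable length \Delta x_{\min} = \hbar\sqrt{\beta}:
    \[
      [\hat{x}, \hat{p}] = i\hbar \left( 1 + \beta \hat{p}^{\,2} \right),
      \qquad
      \Delta x \;\ge\; \frac{\hbar}{2}
      \left( \frac{1}{\Delta p} + \beta\, \Delta p \right).
    \]
    ```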

  10. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study

    PubMed Central

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-01-01

    Background There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g. cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what were the attitudes toward this theme and topic, and how these attitudes and responses reflect and inform this area of medical pedagogy. Methods A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. Results There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Conclusion Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls, rather than improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g. biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, and accommodate disregard for subjectivity, over-reliance upon technology and thereby foster incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as a moral enterprise. Uncertainty is inherent in cultural diversity so this part of the curriculum provides an opportunity to address the issue as it relates to patient care. PMID:17462089

  11. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study.

    PubMed

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-04-26

    There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g. cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what were the attitudes toward this theme and topic, and how these attitudes and responses reflect and inform this area of medical pedagogy. A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls, rather than improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g. biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, and accommodate disregard for subjectivity, over-reliance upon technology and thereby foster incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as a moral enterprise. Uncertainty is inherent in cultural diversity so this part of the curriculum provides an opportunity to address the issue as it relates to patient care.

  12. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose, but these studies often ignore issues related to real-world groundwater models, such as computational expense, existing observation data, and high parameter dimension. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established D-optimality criterion and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology, based on the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model and subsequently applied to an existing, complex regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
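
    As a rough illustration of the greedy, minimax-style selection the abstract describes, the sketch below picks observations one at a time, each time keeping the candidate whose worst-case data worth across posterior parameter samples is largest. This is a minimal sketch, not the authors' implementation; the `worth` function, candidate set, and sample format are all illustrative assumptions.

    ```python
    import numpy as np

    def greedy_minimax_design(candidates, samples, worth, n_select):
        """Greedily build a robust (maximin) observation design.

        candidates : candidate observation locations
        samples    : posterior parameter samples (e.g. from Null-Space
                     Monte Carlo)
        worth      : worth(design, sample) -> float; data worth of a design
                     under one parameter sample (higher is better)
        n_select   : number of observations to select
        """
        design, remaining = [], list(candidates)
        for _ in range(n_select):
            # Score each candidate by its worst case over all samples,
            # then keep the candidate whose worst case is best.
            best = max(
                remaining,
                key=lambda c: min(worth(design + [c], s) for s in samples),
            )
            design.append(best)
            remaining.remove(best)
        return design

    # Toy usage: worth is the negative distance from the design to a
    # sample's "informative" location, so designs covering all samples win.
    locs = list(np.linspace(0.0, 1.0, 11))
    post = [0.2, 0.5, 0.8]
    w = lambda d, s: -min(abs(x - s) for x in d)
    print(greedy_minimax_design(locs, post, w, n_select=2))
    ```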

  13. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.

  14. When, not if: The inescapability of an uncertain future

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Ballard, T.

    2014-12-01

    Uncertainty is an inherent feature of most scientific endeavours, and many political decisions must be made in the presence of scientific uncertainty. In the case of climate change, there is evidence that greater scientific uncertainty increases the risk associated with the impact of climate change. Scientific uncertainty thus provides an impetus for cutting emissions rather than delaying action. In contrast to those normative considerations, uncertainty is frequently cited in political and public discourse as a reason to delay mitigation. We examine ways in which this gap between public and scientific understanding of uncertainty can be bridged. In particular, we sought ways to communicate uncertainty that better calibrate people's risk perceptions with the projected impact of climate change. We report two behavioural experiments in which uncertainty about the future was expressed either as outcome uncertainty or temporal uncertainty. The conventional presentation of uncertainty involves uncertainty about an outcome at a given time, for example, the range of possible sea level rise (say 50 cm ± 20 cm) by a certain date. An alternative presentation of the same situation presents a certain outcome ("sea levels will rise by 50 cm") but places the uncertainty into the time of arrival ("this may occur as early as 2040 or as late as 2080"). We presented participants with a series of statements and graphs indicating projected increases in temperature, sea levels, and ocean acidification, and a decrease in Arctic sea ice. In the uncertain-magnitude condition, the statements and graphs reported the upper and lower confidence bounds of the projected magnitude and the mean projected time of arrival. In the uncertain time-of-arrival condition, they reported the upper and lower confidence bounds of the projected time of arrival and the mean projected magnitude. The results show that when uncertainty was presented as an uncertain time of arrival rather than an uncertain outcome, people expressed greater concern about the projected outcomes. In a further experiment involving repeated "games" with a simulated economy, we similarly showed that people allocate more resources to mitigation if there is uncertainty about the timing of an adverse event rather than about the magnitude of its impact.
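
    To make the reframing concrete, here is a toy computation (not the authors' procedure) that converts magnitude uncertainty at a fixed horizon into a time-of-arrival window for a fixed milestone, under the simplifying assumption of linear growth from zero in a base year; all numbers are illustrative.

    ```python
    def arrival_window(target, base_year, horizon_year, low, mid, high):
        """Translate magnitude bounds (low, mid, high) at horizon_year into
        arrival years for a fixed target, assuming linear growth from zero
        in base_year (an illustrative assumption only)."""
        span = horizon_year - base_year
        year_to_reach = lambda magnitude: base_year + target * span / magnitude
        # Faster growth (the high bound) reaches the target earlier.
        return year_to_reach(high), year_to_reach(mid), year_to_reach(low)

    # E.g. 50 +/- 20 cm of sea level rise by 2060, measured from 2000:
    earliest, central, latest = arrival_window(50, 2000, 2060, 30, 50, 70)
    print(f"{earliest:.0f} / {central:.0f} / {latest:.0f}")  # ~2043 / 2060 / 2100
    ```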

  15. Managing uncertainty in flood protection planning with climate projections

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise a methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is robust to moderate changes in uncertainty as well as in trend. In contrast, planning without consideration of bias and dependencies in and between uncertainty components leads to strongly suboptimal planning recommendations.

  16. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility of assessing second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating the uncertainties in predicting the risk for organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients as examples. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
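
    The headline result, that absolute risks carry roughly 100% uncertainty while the modality ratio may be uncertain by under 5%, follows from shared risk-model parameters cancelling in the ratio. A hedged Monte Carlo illustration (all distributions and numbers are invented for the sketch, not taken from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative only: a multiplicative risk-model factor shared by both
    # modalities (e.g. an uncertain dose-response slope) with large
    # relative uncertainty, as reported for organ-specific risks.
    shared = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)

    # Hypothetical modality-specific dose terms with small dosimetric errors.
    dose_proton = 1.0 * (1 + rng.normal(0, 0.03, shared.size))
    dose_photon = 2.5 * (1 + rng.normal(0, 0.03, shared.size))

    lar_proton = shared * dose_proton
    lar_photon = shared * dose_photon
    ratio = lar_proton / lar_photon

    # The shared factor dominates each absolute risk but cancels in the ratio.
    print(np.std(lar_proton) / np.mean(lar_proton))  # large (roughly 1)
    print(np.std(ratio) / np.mean(ratio))            # small (roughly 0.04)
    ```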

  17. Structured decision making for managing pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.

    2016-01-01

    Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.

  18. Impact of Biogenic Emission Uncertainties on the Simulated Response of Ozone and Fine Particulate Matter to Anthropogenic Emission Reductions

    PubMed Central

    Hogrefe, Christian; Isukapalli, Sastry S.; Tang, Xiaogang; Georgopoulos, Panos G.; He, Shan; Zalewsky, Eric E.; Hao, Winston; Ku, Jia-Yeong; Key, Tonalee; Sistla, Gopal

    2011-01-01

    The role of emissions of volatile organic compounds and nitric oxide from biogenic sources is becoming increasingly important in regulatory air quality modeling as levels of anthropogenic emissions continue to decrease and stricter health-based air quality standards are being adopted. However, considerable uncertainties still exist in the current estimation methodologies for biogenic emissions. The impact of these uncertainties on ozone and fine particulate matter (PM2.5) levels for the eastern United States was studied, focusing on biogenic emissions estimates from two commonly used biogenic emission models, the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and the Biogenic Emissions Inventory System (BEIS). Photochemical grid modeling simulations were performed for two scenarios: one reflecting present day conditions and the other reflecting a hypothetical future year with reductions in emissions of anthropogenic oxides of nitrogen (NOx). For ozone, the use of MEGAN emissions resulted in a higher ozone response to hypothetical anthropogenic NOx emission reductions compared with BEIS. Applying the current U.S. Environmental Protection Agency guidance on regulatory air quality modeling in conjunction with typical maximum ozone concentrations, the differences in estimated future year ozone design values (DVF) stemming from differences in biogenic emissions estimates were on the order of 4 parts per billion (ppb), corresponding to approximately 5% of the daily maximum 8-hr ozone National Ambient Air Quality Standard (NAAQS) of 75 ppb. For PM2.5, the differences were 0.1–0.25 μg/m3 in the summer total organic mass component of DVFs, corresponding to approximately 1–2% of the value of the annual PM2.5 NAAQS of 15 μg/m3. Spatial variations in the ozone and PM2.5 differences also reveal that the impacts of different biogenic emission estimates on ozone and PM2.5 levels are dependent on ambient levels of anthropogenic emissions. PMID:21305893

  19. Towards active image-guidance: tracking of a fiducial in the thorax during respiration under X-ray fluoroscopy

    NASA Astrophysics Data System (ADS)

    Siddique, Sami; Jaffray, David

    2007-03-01

    A central purpose of image-guidance is to assist the interventionalist with feedback on geometric performance in the direction of therapy delivery. Tradeoffs exist between accuracy, precision and the constraints imposed by the parameters used in the generation of images. A framework that uses geometric performance as feedback to control these parameters can balance such tradeoffs in order to maintain the requisite localization precision for a given clinical procedure. We refer to this principle as Active Image-Guidance (AIG). This framework requires estimates of the uncertainty in the estimated location of the object of interest. In this study, a simple fiducial marker detected under X-ray fluoroscopy is considered, and it is shown that a relation exists between the applied imaging dose and the uncertainty in localization for a given observer. A robust estimator of the location of a fiducial in the thorax during respiration under X-ray fluoroscopy is demonstrated using a particle filter based approach that outputs estimates of the location and the associated spatial uncertainty. This approach gives an RMSE of 1.3 mm, and the uncertainty estimates are found to be correlated with the error in the estimates. Furthermore, the particle filtering approach is employed to output location estimates and the associated uncertainty not only at instances of pulsed exposure but also between exposures. Such a system has applications in image-guided interventions (surgery, radiotherapy, interventional radiology) where there are latencies between the moment of imaging and the act of intervention.

  20. Accounting for uncertainty in health economic decision models by using model averaging.

    PubMed

    Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D

    2009-04-01

    Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment.
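
    The weighting scheme the abstract describes is standard: each candidate model structure receives a weight proportional to exp(-Δ/2), where Δ is its information criterion (AIC or BIC) difference from the best model. A minimal sketch, with toy numbers, of that computation:

    ```python
    import numpy as np

    def ic_weights(ic_values):
        """Model-averaging weights from information criterion values
        (AIC or BIC); smaller IC means larger weight."""
        ic = np.asarray(ic_values, dtype=float)
        delta = ic - ic.min()
        w = np.exp(-0.5 * delta)
        return w / w.sum()

    def model_average(estimates, ic_values):
        """Weighted average of per-model estimates (e.g. incremental cost
        or effectiveness) using IC-based weights."""
        return float(np.dot(ic_weights(ic_values), estimates))

    # Toy example: three plausible model structures.
    print(ic_weights([100.0, 101.5, 104.0]))               # ~[0.62, 0.29, 0.08]
    print(model_average([5.0, 6.0, 9.0], [100.0, 101.5, 104.0]))
    ```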

  1. Spatially Distributed Assimilation of Remotely Sensed Leaf Area Index and Potential Evapotranspiration for Hydrologic Modeling in Wetland Landscapes

    EPA Science Inventory

    Evapotranspiration (ET), a highly dynamic flux in wetland landscapes, regulates the accuracy of surface/sub-surface runoff simulation in a hydrologic model. However, considerable uncertainty in simulating ET-related processes remains, including our limited ability to incorporate ...

  2. BEST MANAGEMENT PRACTICES FOR THE CONTROL OF NUTRIENTS FROM URBAN NONPOINT SOURCES

    EPA Science Inventory

    While the costs and benefits associated with the point source control of nutrients are relatively well defined, considerable uncertainties remain in the efficiency and long-term costs associated with the best management practices (BMPs) used to reduce loads from nonpoint and dif...

  3. Consideration of the FQPA Safety Factor and Other Uncertainty Factors in Cumulative Risk Assessment of Chemicals Sharing a Common Mechanism of Toxicity

    EPA Pesticide Factsheets

    This guidance document provides OPP's current thinking on application of the provision in FFDCA about an additional safety factor for the protection of infants and children in the context of cumulative risk assessments.

  4. The role of integrative, whole organism testing in monitoring applications: Back to the future

    EPA Science Inventory

    The biological effects of chemicals released to surface waters continue to be an area of uncertainty in risk assessment and risk management. Based on conventional risk assessment considerations, adequate exposure and effects information are required to reach a scientifically soun...

  5. Evidence for moxifloxacin in community-acquired pneumonia: the impact of pharmaco-economic considerations on guidelines.

    PubMed

    Simoens, Steven

    2009-10-01

    In an era of limited resources, policy makers and health care payers are concerned about the costs of treatment in addition to its effectiveness. However, guidelines do not tend to consider the cost-effectiveness of treatment options. This paper aims to conduct an international literature review with a view to assessing the impact of pharmaco-economic considerations of CAP treatment with moxifloxacin on recent guidelines. The pharmaco-economic state of the art of treating CAP with moxifloxacin is assessed and compared with guidelines issued by the European Respiratory Society and by the Infectious Diseases Society of America/American Thoracic Society. Also, evidence on moxifloxacin consumption and antimicrobial resistance, and the impact of resistance on the cost-effectiveness of moxifloxacin is reviewed. Studies were identified by searching PubMed, Centre for Reviews and Dissemination databases, Cochrane Database of Systematic Reviews, and EconLit up to January 2009. The existing pharmaco-economic evidence indicates that moxifloxacin is a cost-effective treatment for CAP. However, data limitations and uncertainty surrounding the evolution of resistance emphasize the need for caution. As recommended by guidelines, the choice of antimicrobial should consider the local frequency of causative pathogens, the local pattern of antimicrobial resistance, and risk factors for resistant bacteria. The pharmaco-economic evidence corroborates the importance of these factors as they have an impact on the cost-effectiveness of treating CAP patients with moxifloxacin. CAP guidelines need to take into account pharmaco-economic considerations by balancing the effectiveness of antimicrobial regimens against their costs. The pharmaco-economic value of moxifloxacin is influenced by the causative pathogens involved and resistance patterns. Therefore, it may be advisable to identify patient subgroups in which treatment with moxifloxacin is cost-effective and should be recommended by guidelines.

  6. Impact of 4D image quality on the accuracy of target definition.

    PubMed

    Nielsen, Tine Bjørn; Hansen, Christian Rønn; Westberg, Jonas; Hansen, Olfred; Brink, Carsten

    2016-03-01

    Delineation accuracy of target shape and position depends on the image quality. This study investigates whether the image quality on standard 4D systems has an influence comparable to the overall delineation uncertainty. A moving lung target was imaged using a dynamic thorax phantom on three different 4D computed tomography (CT) systems and a 4D cone beam CT (CBCT) system using pre-defined clinical scanning protocols. Peak-to-peak motion and target volume were registered using rigid registration and automatic delineation, respectively. A spatial distribution of the imaging uncertainty was calculated as the distance deviation between the imaged target and the true target shape. The measured motions were smaller than the actual motions, and the imaged target volume differed between respiration phases. Imaging uncertainties of >0.4 cm were measured in the motion direction, showing a large distortion of the imaged target shape. Imaging uncertainties of standard 4D systems are of similar size to typical GTV-CTV expansions (0.5-1 cm) and contribute considerably to the target definition uncertainty. Optimising and validating 4D systems is recommended in order to obtain the most accurate imaged target shape.

  7. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, an inherent amount of uncertainty is associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
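
    As a generic illustration of the per-equation analysis described, here is a hedged sketch (not APLE's actual equations) computing confidence and prediction intervals for a single-predictor regression; the data in the usage example are invented:

    ```python
    import numpy as np
    from scipy import stats

    def regression_intervals(x, y, x_new, alpha=0.05):
        """OLS fit with a confidence interval (mean response) and a
        prediction interval (new observation) at x_new."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        n = x.size
        b1, b0 = np.polyfit(x, y, 1)              # slope, intercept
        resid = y - (b0 + b1 * x)
        s2 = resid @ resid / (n - 2)              # residual variance
        sxx = ((x - x.mean()) ** 2).sum()
        t = stats.t.ppf(1 - alpha / 2, n - 2)
        lever = 1 / n + (x_new - x.mean()) ** 2 / sxx
        yhat = b0 + b1 * x_new
        ci = t * np.sqrt(s2 * lever)              # half-width, mean response
        pi = t * np.sqrt(s2 * (1 + lever))        # half-width, new observation
        return yhat, ci, pi

    x = [1, 2, 3, 4, 5, 6]
    y = [1.1, 1.9, 3.2, 3.8, 5.1, 5.9]
    print(regression_intervals(x, y, x_new=4.5))
    ```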

  8. Presentation of uncertainties on web platforms for climate change information

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Wrobel, Markus; Reusser, Dominik

    2014-05-01

    Climate research has a long tradition; however, there is still uncertainty about the specific effects of climate change. One of the key tasks is, beyond discussing climate change and its impacts in specialist groups, to present these to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups require easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented such that it is commonly understood, while the complexity of the science behind it is still incorporated. In particular, this requires the explicit representation of spatial and temporal uncertainty information to lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various web-based climate service platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach to communicating uncertainty, based on a typology distinguishing between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including our own solutions within the web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are perceived by untrained users.

  9. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
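
    Reading the stated proportionalities as a formula gives the following hedged transcription (proportionality only; prefactors and exact definitions are in the paper and not reproduced here):

    ```latex
    % sigma_v : fundamental limit of the velocity uncertainty
    % v       : flow velocity,   P_s : scattered light power
    \[
      \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}
    \]
    ```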

  10. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as is envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation by separating value and uncertainty maps, suitable for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, and sliders to interactively move through time, space and uncertainty (thresholds).

  11. Radiometer uncertainty equation research of 2D planar scanning PMMW imaging system

    NASA Astrophysics Data System (ADS)

    Hu, Taiyang; Xu, Jianzhong; Xiao, Zelong

    2009-07-01

    With advances in millimeter-wave technology, passive millimeter-wave (PMMW) imaging technology has received considerable attention and has established itself in a wide range of practical military and civil applications, such as remote sensing, blind landing, precision guidance and security inspection. Both the high transparency of clothing at millimeter wavelengths and the spatial resolution required to generate adequate images combine to make imaging at millimeter wavelengths a natural approach to screening people for concealed contraband. At the same time, the passive operation mode does not present a safety hazard to the person under inspection. Based on a description of the design and engineering implementation of a W-band two-dimensional (2D) planar scanning imaging system, a series of scanning methods utilized in PMMW imaging are compared and analyzed, followed by a discussion of the operational principle of the 2D planar scanning mode in particular. Furthermore, it is found that the traditional radiometer uncertainty equation, which is derived for a moving platform, does not hold under this 2D planar scanning mode, because there is no fixed relation between the scanning rates in the horizontal and vertical directions. Consequently, an improved radiometer uncertainty equation is derived in this paper, taking into consideration the total time spent on scanning and imaging. In addition, the factors that affect the quality of radiometric images are further investigated under the improved radiometer uncertainty equation, and some original results are presented and analyzed to demonstrate the significance and validity of the new methodology.
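
    For context, the classical total-power radiometer sensitivity equation is ΔT = T_sys / √(Bτ). The sketch below applies it per pixel, with the dwell time derived from the total imaging time and the pixel count; that derivation is our hedged reading of the abstract's correction, not the paper's actual equation.

    ```python
    import math

    def radiometer_sensitivity(t_sys, bandwidth, dwell_time):
        """Classical total-power radiometer temperature uncertainty:
        delta_T = T_sys / sqrt(B * tau)."""
        return t_sys / math.sqrt(bandwidth * dwell_time)

    def planar_scan_sensitivity(t_sys, bandwidth, total_time, n_pixels):
        """Hedged reading of the 2D-planar-scanning case: the per-pixel
        dwell time follows from the total scan/imaging time and the pixel
        count rather than from a single scan rate (our assumption)."""
        tau = total_time / n_pixels
        return radiometer_sensitivity(t_sys, bandwidth, tau)

    # Example: 500 K system temperature, 1 GHz bandwidth,
    # a 10 s frame over a 100 x 100 raster -> tau = 1 ms per pixel.
    print(planar_scan_sensitivity(500.0, 1e9, 10.0, 100 * 100))  # ~0.5 K
    ```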

  12. Uncertainty quantification of Antarctic contribution to sea-level rise using the fast Elementary Thermomechanical Ice Sheet (f.ETISh) model

    NASA Astrophysics Data System (ADS)

    Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel

    2017-04-01

    Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013), which provides a rather limited view of input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin by deducing a probabilistic description of input parameter uncertainties from the available data. Then, the impact of these input parameter uncertainties on output quantities is assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanically coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing the Antarctic contribution to sea-level rise over the next centuries using the f.ETISh model. We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise as a reflection of the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties in order of the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how limitations posed by the available information (poorly constrained data) pose challenges that motivate our current research.
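
    The generic Monte Carlo workflow described (sample uncertain inputs from priors, push each sample through the simulator, summarize the output distribution) looks roughly like the sketch below. The stand-in model, priors, and numbers are all invented for illustration; f.ETISh itself is far more expensive, which is exactly why the abstract discusses computational cost.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def ice_sheet_model(melt_factor, calving_factor):
        """Cheap stand-in for an expensive simulator such as f.ETISh:
        returns a toy sea-level contribution in metres (illustrative only)."""
        return 0.02 * melt_factor + 0.01 * calving_factor ** 1.5

    n = 10_000
    # Hypothetical priors on poorly constrained inputs.
    melt = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # sub-shelf melt factor
    calving = rng.uniform(0.5, 2.0, size=n)             # calving-rate factor

    # Propagate every input sample through the model.
    slr = np.array([ice_sheet_model(m, c) for m, c in zip(melt, calving)])
    lo, mid, hi = np.percentile(slr, [5, 50, 95])
    print(f"sea-level contribution: {mid:.3f} m (90% band {lo:.3f}-{hi:.3f})")
    ```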

  13. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

    Analyzing, visualizing and communicating uncertainties are important issues, as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 1950s and assigns every location in the model a scalar value quantifying predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how the model quality is interconnected with the input data used) and model evolution (i.e. whether new data, or a changed geological hypothesis, optimizes the model). In other words, information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; and (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to classify the indefiniteness of single units and the mean entropy of a model quantitatively. Due to the relationship of this measure to the missing information, we expect the method to have great potential in many types of geoscientific data assimilation problems, beyond pure visualization.
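
    A minimal sketch of the per-cell computation the abstract implies, under the assumption that uncertainty is represented by a suite of equally weighted model realizations (our assumption): the entropy H = -Σ p_i log2(p_i) is computed from the empirical unit probabilities in each cell, and its mean over cells gives the whole-model measure.

    ```python
    import numpy as np

    def cell_entropy(realizations):
        """Information entropy per model cell.

        realizations : (n_models, n_cells) integer array of geological unit
                       IDs from a suite of simulated models.
        Returns an (n_cells,) array of entropies in bits; 0 means the cell
        is fully determined across the suite.
        """
        n_models, n_cells = realizations.shape
        h = np.zeros(n_cells)
        for unit in np.unique(realizations):
            p = (realizations == unit).sum(axis=0) / n_models
            nz = p > 0
            h[nz] -= p[nz] * np.log2(p[nz])
        return h

    # Toy suite: 4 realizations of a 5-cell section with units {0, 1}.
    suite = np.array([[0, 0, 1, 1, 1],
                      [0, 0, 0, 1, 1],
                      [0, 1, 0, 1, 1],
                      [0, 0, 1, 1, 1]])
    h = cell_entropy(suite)
    print(h)          # 0 where all models agree, up to 1 bit where they split
    print(h.mean())   # mean entropy of the whole model
    ```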

  14. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
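
    For orientation, the (unmarked) Strauss process underlying such a prior has unnormalized density f(x) ∝ β^n γ^s(x), where s(x) counts point pairs within an interaction distance r and γ < 1 penalizes clustering. The sketch below evaluates that log-density; it ignores the marks and all fault-specific structure, so it is a toy illustration of the prior's form, not the authors' model.

    ```python
    import numpy as np

    def strauss_log_density(points, beta, gamma, r):
        """Unnormalized log-density of a Strauss point process:
        log f(x) = n*log(beta) + s(x)*log(gamma), where s(x) counts pairs
        of points within interaction distance r; 0 < gamma < 1 discourages
        tightly clustered configurations.
        points : (n, d) array of point locations."""
        pts = np.asarray(points, float)
        n = len(pts)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        s = (d[np.triu_indices(n, k=1)] < r).sum()
        return n * np.log(beta) + s * np.log(gamma)

    # Two 2-D configurations with equal intensity, different clustering:
    spread = [[0.1, 0.1], [0.5, 0.5], [0.9, 0.9]]
    tight = [[0.10, 0.10], [0.12, 0.11], [0.11, 0.13]]
    print(strauss_log_density(spread, beta=5.0, gamma=0.3, r=0.1))
    print(strauss_log_density(tight, beta=5.0, gamma=0.3, r=0.1))  # lower
    ```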

  15. Exploring Best Practice Skills to Predict Uncertainties in Venture Capital Investment Decision-Making

    NASA Astrophysics Data System (ADS)

    Blum, David Arthur

    Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petrol diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital (IVC) decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best practice skills at IVC firms for predicting uncertainty between early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.

  16. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
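
    To indicate what the global sensitivity analysis computes, here is a crude, hedged sketch of a first-order variance-based index, S_i = Var(E[Y|X_i]) / Var(Y), estimated by binning one input against the output; the authors' GSA implementation is not specified here, and the toy model is invented.

    ```python
    import numpy as np

    def first_order_sobol(x, y, bins=20):
        """Crude first-order sensitivity index S_i = Var(E[Y|X_i]) / Var(Y),
        estimated by binning samples of one input x against the output y."""
        x, y = np.asarray(x), np.asarray(y)
        edges = np.quantile(x, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        counts = np.array([(idx == b).sum() for b in range(bins)])
        var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
        return var_cond / y.var()

    # Toy HSI-like model whose output is dominated by the first input.
    rng = np.random.default_rng(1)
    x1, x2 = rng.uniform(0, 1, (2, 50_000))
    y = 0.8 * x1 + 0.2 * x2 ** 2
    print(first_order_sobol(x1, y))   # large (~0.9)
    print(first_order_sobol(x2, y))   # small
    ```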

  17. Application of identified sensitive physical parameters in reducing the uncertainty of numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2016-04-01

    An important source of uncertainty, which then causes further uncertainty in numerical simulations, is that residing in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce uncertainties in all of them. Therefore, identifying the subset of relatively more sensitive and important parameters, and reducing the errors in just those parameters, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify the relatively more sensitive and important physical parameters, but also that it is viable to apply "target observations" to reduce the uncertainties in model parameters.

  18. Common but unappreciated sources of error in one, two, and multiple-color pyrometry

    NASA Technical Reports Server (NTRS)

    Spjut, R. Erik

    1988-01-01

    The most common sources of error in optical pyrometry are examined. They can be classified as either noise and uncertainty errors, stray radiation errors, or speed-of-response errors. Through judicious choice of detectors and optical wavelengths the effect of noise errors can be minimized, but one should strive to determine as many of the system properties as possible. Careful consideration of the optical-collection system can minimize stray radiation errors. Careful consideration must also be given to the slowest elements in a pyrometer when measuring rapid phenomena.

  19. 78 FR 25204 - Segregation of Lands-Renewable Energy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-30

    ....L13400000] RIN 1004-AE19 Segregation of Lands--Renewable Energy AGENCY: Bureau of Land Management, Interior... pending solar or wind renewable energy generation project, or for public lands identified by the BLM under... consideration of renewable energy ROWs. As explained below, the BLM seeks to avoid the delays and uncertainty...

  20. Estimating US federal wildland fire managers' preferences toward competing strategic suppression objectives

    Treesearch

    David E. Calkin; Tyron Venn; Matthew Wibbenmeyer; Matthew P. Thompson

    2012-01-01

    Wildfire management involves significant complexity and uncertainty, requiring simultaneous consideration of multiple, non-commensurate objectives. This paper investigates the tradeoffs fire managers are willing to make among these objectives using a choice experiment methodology that provides three key advancements relative to previous stated-preference studies...

  1. Multiple microbial activity-based measures reflect effects of cover cropping and tillage on soils

    USDA-ARS?s Scientific Manuscript database

    Agricultural producers, conservation professionals, and policy makers are eager to learn of soil analytical techniques and data that document improvement in soil health by agricultural practices such as no-till and incorporation of cover crops. However, there is considerable uncertainty within the r...

  2. 77 FR 29257 - Registration of Copyright: Definition of Claimant

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... considerable legal uncertainty while offering no clear benefits to the registration system. Removing it will... individuals or entities that have obtained the contractual right to claim legal title to copyright in an... author, the contractual right to claim legal title to the copyright in an application for copyright...

  3. Predicting the unpredictable: potential climate change impacts on vegetation in the Pacific Northwest

    Treesearch

    Marie Oliver; David W. Peterson; Becky Kerns

    2016-01-01

    Earth's climate is changing, as evidenced by warming temperatures, increased temperature variability, fluctuating precipitation patterns, and climate-related environmental disturbances. And with considerable uncertainty about the future, Forest Service land managers are now considering climate change adaptation in their planning efforts. They want practical...

  4. Integrability and Chaos: The Classical Uncertainty

    ERIC Educational Resources Information Center

    Masoliver, Jaume; Ros, Ana

    2011-01-01

    In recent years there has been a considerable increase in the publishing of textbooks and monographs covering what was formerly known as random or irregular deterministic motion, now referred to as deterministic chaos. There is still substantial interest in a matter that is included in many graduate and even undergraduate courses on classical…

  5. Risk-Aversion: Understanding Teachers' Resistance to Technology Integration

    ERIC Educational Resources Information Center

    Howard, Sarah K.

    2013-01-01

    Teachers who do not integrate technology are often labelled as "resistant" to change. Yet, considerable uncertainties remain about appropriate uses and actual value of technology in teaching and learning, which can make integration and change seem risky. The purpose of this article is to explore the nature of teachers' analytical and…

  6. Hydrologic influences on stream temperatures for Little Creek and Scotts Creek, Santa Cruz County, California

    Treesearch

    Justin M. Louen; Christopher G. Surfleet

    2017-01-01

    Stream temperature impacts have resulted in increased restrictions on land management, such as timber harvest and riparian restoration, creating considerable uncertainty for future planning and management of redwood (Sequoia sempervirens (D.Don) Endl.) forestlands. Challenges remain in the assessment of downstream cumulative stream...

  7. Learned Helplessness and Dyslexia: A Carts and Horses Issue?

    ERIC Educational Resources Information Center

    Kerr, H.

    2001-01-01

    Surveys attitudes towards and beliefs about dyslexia among Adult Basic Education (ABE) teachers and providers. Finds doubt, uncertainty and confusion about dyslexia and considerable misgiving. Discusses attribution theory and learned helplessness in the context of ABE. Argues that a diagnosis of dyslexia may be a maladaptive attribution and so…

  8. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
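
    The cumulative vertical uncertainty described is conventionally obtained by combining independent error sources in quadrature, with the 95% half-width (~1.96 sigma) setting the minimum sea-level rise increment that the data can resolve. A hedged arithmetic sketch with invented numbers (not the study's values):

    ```python
    import math

    def cumulative_vertical_uncertainty(*sigmas):
        """Root-sum-of-squares combination of independent vertical error
        sources (e.g. lidar RMSE, water-level uncertainty, datum
        transformation error), assuming independence."""
        return math.sqrt(sum(s * s for s in sigmas))

    # Illustrative numbers only, in metres:
    sigma = cumulative_vertical_uncertainty(0.10, 0.05, 0.05)
    minimum_increment = 1.96 * sigma        # 95% confidence half-width
    print(f"cumulative sigma = {sigma:.3f} m")           # ~0.122 m
    print(f"minimum SLR increment ~ {minimum_increment:.2f} m")  # ~0.24 m
    ```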

  9. Emotion and decision-making under uncertainty: Physiological arousal predicts increased gambling during ambiguity but not risk.

    PubMed

    FeldmanHall, Oriel; Glimcher, Paul; Baker, Augustus L; Phelps, Elizabeth A

    2016-10-01

    Uncertainty, which is ubiquitous in decision-making, can be fractionated into known probabilities (risk) and unknown probabilities (ambiguity). Although research has illustrated that individuals more often avoid decisions associated with ambiguity compared to risk, it remains unclear why ambiguity is perceived as more aversive. Here we examine the role of arousal in shaping the representation of value and subsequent choice under risky and ambiguous decisions. To investigate the relationship between arousal and decisions under uncertainty, we measure the skin conductance response (a quantifiable measure reflecting sympathetic nervous system arousal) during choices to gamble under risk and ambiguity. To quantify the discrete influences of risk and ambiguity sensitivity and the subjective value of each option under consideration, we model fluctuating uncertainty, as well as the amount of money that can be gained by taking the gamble. Results reveal that although arousal tracks the subjective value of a lottery regardless of uncertainty type, arousal differentially contributes to the computation of value (that is, choice) depending on whether the uncertainty is risky or ambiguous: enhanced arousal adaptively decreases risk-taking only when the lottery is highly risky, but increases risk-taking when the probability of winning is ambiguous (even after controlling for subjective value). Together, this suggests that the role of arousal during decisions of uncertainty is modulatory and highly dependent on the context in which the decision is framed.

  10. Quality versus quantity: The complexities of quality of life determinations for neonatal nurses.

    PubMed

    Green, Janet; Darbyshire, Philip; Adams, Anne; Jackson, Debra

    2017-11-01

    The ability to save the life of an extremely premature baby has increased substantially over the last decade. This survival, however, can be associated with unfavourable outcomes for both baby and family. Questions are now being asked about quality of life for survivors of extreme prematurity. Quality of life is rightly deemed to be an important consideration in high technology neonatal care; yet, it is notoriously difficult to determine or predict. How does one define and operationalise what is considered to be in the best interest of a surviving extremely premature baby, especially when the full extent of the outcomes might not be known for several years? The research investigates the caregiving dilemmas often faced by neonatal nurses when caring for extremely premature babies. This article explores the issues arising for neonatal nurses when they considered the philosophical and ethical questions about quality of life in babies ≤24 weeks gestation. Data were collected via a questionnaire to Australian neonatal nurses and semi-structured interviews with 24 neonatal nurses in New South Wales, Australia. Ethical considerations: Ethical processes and procedures have been adhered to by the researchers. A qualitative approach was used to analyse the data. The theme 'difficult choices' was generated, comprising three sub-themes: 'damaged through survival', 'the importance of the brain' and 'families are important'. The results show that neonatal nurses believed that quality of life was an important consideration; yet they experienced significant inner conflict and uncertainty when asked to define or suggest specific elements of quality of life, or to suggest how it might be determined. It was even more difficult for the nurses to say when an extremely premature baby's life possessed quality. Their previous clinical and personal experiences led the nurses to believe that the quality of the family's life was important, and possibly more so than the quality of life of the surviving baby. This finding contrasts markedly with much of the existing literature in this field. Quality of life for extremely premature babies was an important consideration for neonatal nurses; however, they experienced difficulty deciding how to operationalise such considerations in their everyday clinical practice.

  11. Concepts of ‘personalization’ in personalized medicine: implications for economic evaluation

    PubMed Central

    Rogowski, Wolf; Payne, Katherine; Schnell-Inderst, Petra; Manca, Andrea; Rochau, Ursula; Jahn, Beate; Alagoz, Oguzhan; Leidl, Reiner; Siebert, Uwe

    2015-01-01

    Context: This paper assesses if, and how, existing methods for economic evaluation are applicable to the evaluation of PM and if not, where extension to methods may be required. Method: Structured workshop with a pre-defined group of experts (n=47), run using a modified nominal group technique. Workshop findings were recorded using extensive note taking and summarised using thematic data analysis. The workshop was complemented by structured literature searches. Results: The key finding emerging from the workshop, using an economic perspective, was that two distinct, but linked, interpretations of the concept of PM exist (personalization by ‘physiology’ or ‘preferences’). These interpretations involve specific challenges for the design and conduct of economic evaluations. Existing evaluative (extra-welfarist) frameworks were generally considered appropriate for evaluating PM. When ‘personalization’ is viewed as using physiological biomarkers, challenges include: representing complex care pathways; representing spill-over effects; meeting data requirements such as evidence on heterogeneity; choosing appropriate time horizons for the value of further research in uncertainty analysis. When viewed as tailoring medicine to patient preferences, further work is needed regarding: revealed preferences, e.g. treatment (non)adherence; stated preferences, e.g. risk interpretation and attitude; consideration of heterogeneity in preferences; and the appropriate framework (welfarism vs. extra-welfarism) to incorporate non-health benefits. Conclusion: Ideally, economic evaluations should take account of both interpretations of PM and consider physiology and preferences. It is important for decision makers to be cognizant of the issues involved with the economic evaluation of PM to appropriately interpret the evidence and target future research funding. PMID:25249200

  12. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    NASA Astrophysics Data System (ADS)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole-of-ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.

  13. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    NASA Astrophysics Data System (ADS)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rocks use measured orientations of structures such as rock contacts and fractures, and lineated objects such as foliation and rock stress, mapped in boreholes as their foundation. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and effects on the inferred orientations have been reported. Relying only on the specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainties, develops inference models of their magnitudes, and points out possible implications for the inference on orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations, performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented here hence relies on fracture orientations. However, the techniques to infer the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and the magnitudes correctly inferred. The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to be able to infer a correct orientation model and parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most important parameter to decrease the uncertainty space is to avoid drilling steeper than about -80°.

  14. Uncertainties in building a strategic defense.

    PubMed

    Zraket, C A

    1987-03-27

    Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.

  15. Non-Static error tracking control for near space airship loading platform

    NASA Astrophysics Data System (ADS)

    Ni, Ming; Tao, Fei; Yang, Jiandong

    2018-01-01

    A control scheme based on an internal model with non-static error is presented to address the uncertainty of the near space airship loading platform system. The uncertainty in the tracking problem is represented as interval variations in stability and control derivatives. By formulating the tracking problem of the uncertain system as a robust state feedback stabilization problem of an augmented system, a sufficient condition for the existence of a robust tracking controller is derived in the form of a linear matrix inequality (LMI). Finally, simulation results show that the new method not only has better anti-jamming performance, but also improves the dynamic performance of high-order systems.

  16. Considering Risk and Resilience in Decision-Making

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This paper examines the concepts of decision-making, risk analysis, uncertainty and resilience analysis. The relation between risk, vulnerability, and resilience is analyzed. The paper describes how complexity, uncertainty, and ambiguity are the most critical factors in the definition of the approach and criteria for decision-making. Uncertainty in its various forms is what limits our ability to offer definitive answers to questions about the outcomes of alternatives in a decision-making process. It is shown that, although resilience-informed decision-making would seem fundamentally different from risk-informed decision-making, this is not the case, as resilience analysis can be easily incorporated within existing analytic-deliberative decision-making frameworks.

  17. Seismotectonic framework of the 2010 February 27 Mw 8.8 Maule, Chile earthquake sequence

    USGS Publications Warehouse

    Hayes, Gavin P.; Bergman, Eric; Johnson, Kendra J.; Benz, Harley M.; Brown, Lucy; Meltzer, Anne S.

    2013-01-01

    After the 2010 Mw 8.8 Maule earthquake, an international collaboration involving teams and instruments from Chile, the US, the UK, France and Germany established the International Maule Aftershock Deployment temporary network over the source region of the event to facilitate detailed, open-access studies of the aftershock sequence. Using data from the first 9 months of this deployment, we have analyzed the detailed spatial distribution of over 2500 well-recorded aftershocks. All earthquakes have been relocated using a hypocentral decomposition algorithm to study the details of and uncertainties in both their relative and absolute locations. We have computed regional moment tensor solutions for the largest of these events to produce a catalogue of 465 mechanisms, and have used all of these data to study the spatial distribution of the aftershock sequence with respect to the Chilean megathrust. We refine models of co-seismic slip distribution of the Maule earthquake, and show how small changes in fault geometries assumed in teleseismic finite fault modelling significantly improve fits to regional GPS data, implying that the accuracy of rapid teleseismic fault models can be substantially improved by consideration of existing fault geometry model databases. We interpret all of these data in an integrated seismotectonic framework for the Maule earthquake rupture and its aftershock sequence, and discuss the relationships between co-seismic rupture and aftershock distributions. While the majority of aftershocks are interplate thrust events located away from regions of maximum co-seismic slip, interesting clusters of aftershocks are identified in the lower plate at both ends of the main shock rupture, implying internal deformation of the slab in response to large slip on the plate boundary interface. We also perform Coulomb stress transfer calculations to compare aftershock locations and mechanisms to static stress changes following the Maule rupture. Without the incorporation of uncertainties in earthquake locations, just 55 per cent of aftershock nodal planes align with faults promoted towards failure by co-seismic slip. When epicentral uncertainties are considered (on the order of just ±2–3 km), 90 per cent of aftershocks are consistent with occurring along faults demonstrating positive stress transfer. These results imply large sensitivities of Coulomb stress transfer calculations to uncertainties in both earthquake locations and models of slip distributions, particularly when applied to aftershocks close to a heterogeneous fault rupture; such uncertainties should therefore be considered in similar studies used to argue for or against models of static stress triggering.

  18. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
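
    The Bayesian refinement of failure rate estimates mentioned above is commonly done with a conjugate gamma-Poisson update, in which accumulated operating time shrinks the epistemic spread of the rate estimate. The sketch below shows that update under this assumption; the prior parameters and observation counts are hypothetical, not ISS data.

      # Gamma(alpha, beta) prior on a failure rate [failures/hour];
      # Poisson likelihood for the number of failures observed in service.
      def update_failure_rate(alpha_prior, beta_prior, failures, exposure_hours):
          alpha_post = alpha_prior + failures
          beta_post = beta_prior + exposure_hours
          mean = alpha_post / beta_post
          variance = alpha_post / beta_post ** 2  # epistemic spread shrinks with exposure
          return mean, variance

      mean, var = update_failure_rate(alpha_prior=0.5, beta_prior=10_000.0,
                                      failures=2, exposure_hours=50_000.0)
      print(f"posterior mean rate: {mean:.2e} /h, sd: {var ** 0.5:.2e} /h")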

  19. Constraints on the Early Terrestrial Surface UV Environment Relevant to Prebiotic Chemistry.

    PubMed

    Ranjan, Sukrit; Sasselov, Dimitar D

    2017-03-01

    The UV environment is a key boundary condition to abiogenesis. However, considerable uncertainty exists as to planetary conditions and hence surface UV at abiogenesis. Here, we present two-stream multilayer clear-sky calculations of the UV surface radiance on Earth at 3.9 Ga to constrain the UV surface fluence as a function of albedo, solar zenith angle (SZA), and atmospheric composition. Variation in albedo and latitude (through SZA) can affect maximum photoreaction rates by a factor of >10.4; for the same atmosphere, photoreactions can proceed an order of magnitude faster at the equator of a snowball Earth than at the poles of a warmer world. Hence, surface conditions are important considerations when computing prebiotic UV fluences. For climatically reasonable levels of CO2, fluence shortward of 189 nm is screened out, meaning that prebiotic chemistry is robustly shielded from variations in UV fluence due to solar flares or variability. Strong shielding from CO2 also means that the UV surface fluence is insensitive to plausible levels of CH4, O2, and O3. At scattering wavelengths, UV fluence drops off comparatively slowly with increasing CO2 levels. However, if SO2 and/or H2S can build up to the ≥1-100 ppm level as hypothesized by some workers, then they can dramatically suppress surface fluence and hence prebiotic photoprocesses. H2O is a robust UV shield for λ < 198 nm. This means that regardless of the levels of other atmospheric gases, fluence ≲198 nm is only available for cold, dry atmospheres, meaning sources with emission ≲198 nm (e.g., ArF excimer lasers) can only be used in simulations of cold environments with low abundance of volcanogenic gases. On the other hand, fluence at 254 nm is unshielded by H2O and is available across a broad range of NCO2, meaning that mercury lamps are suitable for initial studies regardless of the uncertainty in primordial H2O and CO2 levels. Key Words: Radiative transfer; Origin of life; Planetary environments; UV radiation; Prebiotic chemistry. Astrobiology 17, 169-204.

  20. Constraints on the Early Terrestrial Surface UV Environment Relevant to Prebiotic Chemistry

    NASA Astrophysics Data System (ADS)

    Ranjan, Sukrit; Sasselov, Dimitar D.

    2017-03-01

    The UV environment is a key boundary condition to abiogenesis. However, considerable uncertainty exists as to planetary conditions and hence surface UV at abiogenesis. Here, we present two-stream multilayer clear-sky calculations of the UV surface radiance on Earth at 3.9 Ga to constrain the UV surface fluence as a function of albedo, solar zenith angle (SZA), and atmospheric composition. Variation in albedo and latitude (through SZA) can affect maximum photoreaction rates by a factor of >10.4; for the same atmosphere, photoreactions can proceed an order of magnitude faster at the equator of a snowball Earth than at the poles of a warmer world. Hence, surface conditions are important considerations when computing prebiotic UV fluences. For climatically reasonable levels of CO2, fluence shortward of 189 nm is screened out, meaning that prebiotic chemistry is robustly shielded from variations in UV fluence due to solar flares or variability. Strong shielding from CO2 also means that the UV surface fluence is insensitive to plausible levels of CH4, O2, and O3. At scattering wavelengths, UV fluence drops off comparatively slowly with increasing CO2 levels. However, if SO2 and/or H2S can build up to the ≥1-100 ppm level as hypothesized by some workers, then they can dramatically suppress surface fluence and hence prebiotic photoprocesses. H2O is a robust UV shield for λ < 198 nm. This means that regardless of the levels of other atmospheric gases, fluence ≲198 nm is only available for cold, dry atmospheres, meaning sources with emission ≲198 nm (e.g., ArF excimer lasers) can only be used in simulations of cold environments with low abundance of volcanogenic gases. On the other hand, fluence at 254 nm is unshielded by H2O and is available across a broad range of NCO2, meaning that mercury lamps are suitable for initial studies regardless of the uncertainty in primordial H2O and CO2 levels.
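
    As a rough illustration of how absorber columns and solar zenith angle control surface fluence, the sketch below applies direct-beam Beer-Lambert attenuation. This is far cruder than the paper's two-stream multilayer treatment (it ignores scattering entirely), and the cross-sections and column densities used are hypothetical.

      import numpy as np

      # Direct-beam attenuation: surface flux = TOA flux * exp(-tau / cos(SZA)).
      def surface_flux(toa_flux, sigma_cm2, column_cm2, sza_deg):
          tau = sigma_cm2 * column_cm2          # vertical optical depth
          mu = np.cos(np.radians(sza_deg))      # slant path lengthens as SZA grows
          return toa_flux * np.exp(-tau / mu)

      # A large cross-section drives surface flux to ~0 (strong shielding);
      # a tiny one leaves the flux nearly unattenuated even at high SZA.
      print(surface_flux(toa_flux=1.0, sigma_cm2=1e-19, column_cm2=1e21, sza_deg=0))
      print(surface_flux(toa_flux=1.0, sigma_cm2=1e-24, column_cm2=1e21, sza_deg=60))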

  1. Topology optimization under stochastic stiffness

    NASA Astrophysics Data System (ADS)

    Asadpoure, Alireza

    Topology optimization is a systematic computational tool for optimizing the layout of materials within a domain for engineering design problems. It allows variation of structural boundaries and connectivities. This freedom in the design space often enables discovery of new, high performance designs. However, solutions obtained by performing the optimization in a deterministic setting may be impractical or suboptimal when considering real-world engineering conditions with inherent variabilities including (for example) variabilities in fabrication processes and operating conditions. The aim of this work is to provide a computational methodology for topology optimization in the presence of uncertainties associated with structural stiffness, such as uncertain material properties and/or structural geometry. Existing methods for topology optimization under deterministic conditions are first reviewed. Modifications are then proposed to improve the numerical performance of the so-called Heaviside Projection Method (HPM) in continuum domains. Next, two approaches, perturbation and Polynomial Chaos Expansion (PCE), are proposed to account for uncertainties in the optimization procedure. These approaches are intrusive, allowing tight and efficient coupling of the uncertainty quantification with the optimization sensitivity analysis. The work herein develops a robust topology optimization framework aimed at reducing the sensitivity of optimized solutions to uncertainties. The perturbation-based approach combines deterministic topology optimization with a perturbation method for the quantification of uncertainties. The use of perturbation transforms the problem of topology optimization under uncertainty to an augmented deterministic topology optimization problem. The PCE approach combines the spectral stochastic approach for the representation and propagation of uncertainties with an existing deterministic topology optimization technique. The resulting compact representations for the response quantities allow for efficient and accurate calculation of sensitivities of response statistics with respect to the design variables. The proposed methods are shown to be successful at generating robust optimal topologies. Examples from topology optimization in continuum and discrete domains (truss structures) under uncertainty are presented. It is also shown that the proposed methods lead to significant computational savings when compared to Monte Carlo-based optimization, which involves multiple formations and inversions of the global stiffness matrix, and that results obtained from the proposed method are in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
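
    The perturbation approach described above rests on first-order second-moment propagation: response statistics are approximated from sensitivities with respect to the uncertain stiffness. A minimal sketch of that idea, using a single bar in tension as a hypothetical stand-in for a full finite element model:

      # First-order second-moment (perturbation) estimate of compliance statistics
      # for a bar under axial load, with uncertain Young's modulus E.
      F, L, A = 1000.0, 2.0, 1e-4           # load [N], length [m], area [m^2]
      E_mean, E_sd = 200e9, 10e9            # uncertain Young's modulus [Pa]

      def compliance(E):
          return F ** 2 * L / (E * A)       # work done by the load [J]

      dc_dE = -F ** 2 * L / (E_mean ** 2 * A)  # analytic sensitivity at the mean
      c_mean = compliance(E_mean)
      c_sd = abs(dc_dE) * E_sd              # first-order std-dev estimate

      print(f"compliance mean: {c_mean:.3e} J, sd: {c_sd:.3e} J")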

  2. Capacity planning in a transitional economy: What issues? Which models?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubayi, V.; Leigh, R.W.; Bright, R.N.

    1996-03-01

    This paper is devoted to an exploration of the important issues facing the Russian power generation system and its evolution in the foreseeable future and the kinds of modeling approaches that capture those issues. These issues include, for example, (1) trade-offs between investments in upgrading and refurbishment of existing thermal (fossil-fired) capacity and safety enhancements in existing nuclear capacity versus investment in new capacity, (2) trade-offs between investment in completing unfinished (under construction) projects based on their original design versus investment in new capacity with improved design, (3) incorporation of demand-side management options (investments in enhancing end-use efficiency, for example) within the planning framework, (4) consideration of the spatial dimensions of system planning including investments in upgrading electric transmission networks or fuel shipment networks and incorporating hydroelectric generation, (5) incorporation of environmental constraints and (6) assessment of uncertainty and evaluation of downside risk. Models for exploring these issues range from low power shutdown (LPS) models, which are computationally very efficient, though approximate, and can be used to perform extensive sensitivity analyses, to more complex models which can provide more detailed answers but are computationally cumbersome and can only deal with limited issues. The paper discusses which models can usefully treat a wide range of issues within the priorities facing decision makers in the Russian power sector and integrate the results with investment decisions in the wider economy.

  3. Integrating asthma hazard characterization methods for consumer products.

    PubMed

    Maier, A; Vincent, M J; Gadagbui, B; Patterson, J; Beckett, W; Dalton, P; Kimber, I; Selgrade, M J K

    2014-10-01

    Despite extensive study, definitive conclusions regarding the relationship between asthma and consumer products remain elusive. Uncertainties reflect the multi-faceted nature of asthma (i.e., contributions of immunologic and non-immunologic mechanisms). Many substances used in consumer products are associated with occupational asthma or asthma-like syndromes. However, risk assessment methods do not adequately predict the potential for consumer product exposures to trigger asthma and related syndromes under lower-level end-user conditions. A decision tree system is required to characterize asthma and respiratory-related hazards associated with consumer products. A system can be built to incorporate the best features of existing guidance, frameworks, and models using a weight-of-evidence (WoE) approach. With this goal in mind, we have evaluated chemical hazard characterization methods for asthma and asthma-like responses. Despite the wealth of information available, current hazard characterization methods do not definitively identify whether a particular ingredient will cause or exacerbate asthma, asthma-like responses, or sensitization of the respiratory tract at lower levels associated with consumer product use. Effective use of hierarchical lines of evidence relies on consideration of the relevance and potency of assays, organization of assays by mode of action, and better assay validation. It is anticipated that the analysis of existing methods will support the development of a refined WoE approach. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  4. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When quantum gravity theories such as string theory are applied, theoretical results suggest that there should be a "minimum length of observation", which is about the size of the Planck scale (10^-35 m). Taking this basic scale of existence into account, we need to fix a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially for conditions at high temperature and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. And we designed three typical systems from micro to macro size to estimate the feasibility of our theoretical model and method, respectively in the chemical solution condition, crystal lattice condition and nuclear fission reactor condition.
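
    For reference, one commonly quoted form of the GUP is the following (conventions and the correction parameter β vary between papers, so this is an illustration rather than necessarily the form used in this work), written in LaTeX:

      \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta\,(\Delta p)^{2}\right],
      \qquad
      \Delta x_{\min} \;=\; \hbar\sqrt{\beta} \;\sim\; 10^{-35}\ \mathrm{m}.

    Minimizing the right-hand side over Δp shows where the minimum observable length comes from; identifying ħ√β with the Planck length is the assumption that fixes β.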

  5. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, the combination of fuzzy system theory with finite element methods is presented and discussed as a means to deal with uncertainties. Addressing the presence of uncertainties is necessary to prevent failure of materials in engineering. There are three types of uncertainties: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties have been considered. Epistemic uncertainty exists as a result of incomplete information and lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory comprises a number of processes, starting with converting the crisp input to a fuzzy input through a fuzzification process, followed by the main process, known as the mapping process. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results compared with the conventional finite element method.
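
    In practice the extension principle is often implemented with alpha-cuts: each membership level yields an interval for the input, and for a monotone response the output interval is the image of the interval endpoints. A minimal sketch under that assumption; the response function here is a hypothetical stand-in for a finite element solve:

      import numpy as np

      def alpha_cut(a, b, c, alpha):
          """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
          return a + alpha * (b - a), c - alpha * (c - b)

      def response(x):
          return 2.0 * x + 1.0  # placeholder for a monotone FE model evaluation

      for alpha in np.linspace(0.0, 1.0, 5):
          lo, hi = alpha_cut(1.0, 2.0, 4.0, alpha)
          # For a monotone response, the output interval is the image of the endpoints.
          print(f"alpha={alpha:.2f}: output interval [{response(lo):.2f}, {response(hi):.2f}]")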

  6. The Harm that Underestimation of Uncertainty Does to Our Community: A Case Study Using Sunspot Area Measurements

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, Andres

    2017-08-01

    Data products in heliospheric physics are very often provided without clear estimates of uncertainty. From helioseismology in the solar interior, all the way to in situ solar wind measurements beyond 1 AU, uncertainty estimates are typically hard for users to find (buried inside long documents that are separate from the data products), or simply non-existent. There are two main reasons why uncertainty measurements are hard to find: (1) understanding instrumental systematic errors is given a much higher priority inside instrumental teams; and (2) the desire to perfectly understand all sources of uncertainty postpones indefinitely the actual quantification of uncertainty in our measurements. Using the cross calibration of 200 years of sunspot area measurements as a case study, in this presentation we will discuss the negative impact that inadequate measurements of uncertainty have on users, through the appearance of toxic and unnecessary controversies, and on data providers, through the creation of unrealistic expectations regarding the information that can be extracted from their data. We will discuss how empirical estimates of uncertainty represent a very good alternative to not providing any estimates at all, and finalize by discussing the bare essentials that should become our standard practice for future instruments and surveys.

  7. Avionics Integrity Program (AVIP). Volume 4. Force Management - Economic Life Considerations.

    DTIC Science & Technology

    1984-03-01

    often used to refer to the period of time during which financial considerations justify the continued use of an existing system. The study addresses...reports include contractor efforts between September 1983 and March 1984. Each report represents a completed study in a specific area and stands alone...during which financial considerations justify the selection or continued use of a system. A variety of definitions currently exist for economic life. One

  8. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
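
    A minimal sketch of the sampling-based (Monte Carlo) approach discussed above, propagating exposure uncertainty through a linear dose-response to risk percentiles; the distributions and slope factor are hypothetical, not values from a real DBP assessment:

      import numpy as np

      rng = np.random.default_rng(1)

      n = 100_000
      concentration = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)   # ug/L
      intake = rng.normal(loc=2.0, scale=0.3, size=n).clip(min=0.1)         # L/day
      body_weight = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=30)   # kg
      slope = 1e-6                                                          # risk per (ug/kg/day)

      dose = concentration * intake / body_weight    # ug/kg/day
      risk = slope * dose
      print(f"median risk: {np.median(risk):.2e}; 95th percentile: {np.percentile(risk, 95):.2e}")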

  9. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiency in models and data. Uncertainty in hydroclimatic projections arises due to uncertainty in the hydrologic model as well as the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period of 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
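
    The transfer-function idea can be illustrated with a Gaussian copula: transform both margins to normal scores, estimate their correlation in the historical period, and condition the observed margin on a new simulated value. The sketch below is a simplified stand-in for the study's Bayesian copula framework, using synthetic (hypothetical) data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Synthetic historical pairs: a biased, noisy simulation of "observed" flow.
      sim = rng.gamma(shape=2.0, scale=50.0, size=2000)
      obs = 0.8 * sim + rng.normal(0.0, 20.0, size=2000)

      def normal_scores(x):
          # Empirical-CDF transform of each margin to standard normal scores.
          ranks = stats.rankdata(x) / (len(x) + 1.0)
          return stats.norm.ppf(ranks)

      z_sim, z_obs = normal_scores(sim), normal_scores(obs)
      rho = np.corrcoef(z_sim, z_obs)[0, 1]

      # Condition on a new simulated value: P(obs | sim) is normal in score space.
      z_new = stats.norm.ppf(stats.percentileofscore(sim, 120.0) / 100.0)
      z_mean = rho * z_new

      # Map the conditional median back through the empirical quantiles of obs.
      median_post = np.quantile(obs, stats.norm.cdf(z_mean))
      print(f"post-processed median given sim=120: {median_post:.1f}")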

  10. Informative Bayesian Type A uncertainty evaluation, especially applicable to a small number of observations

    NASA Astrophysics Data System (ADS)

    Cox, M.; Shirono, K.

    2017-10-01

    A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM’s Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
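
    The contrast the paper addresses can be made concrete: the frequentist Type A standard uncertainty is s/sqrt(n), while the conventional objective Bayesian treatment yields a scaled-t posterior whose standard deviation carries an extra factor sqrt((n-1)/(n-3)), finite only for n >= 4. The sketch below shows both; the closed-form informative-prior factor derived in the paper itself is not reproduced here, and the data are hypothetical:

      import math

      def type_a_frequentist(observations):
          # GUM Type A: standard deviation of the mean, s / sqrt(n).
          n = len(observations)
          mean = sum(observations) / n
          s = math.sqrt(sum((x - mean) ** 2 for x in observations) / (n - 1))
          return s / math.sqrt(n)

      def type_a_objective_bayes(observations):
          # Objective Bayesian posterior sd: s/sqrt(n) * sqrt((n-1)/(n-3)), n >= 4.
          n = len(observations)
          if n < 4:
              raise ValueError("objective Bayesian Type A needs n >= 4")
          return type_a_frequentist(observations) * math.sqrt((n - 1) / (n - 3))

      data = [10.02, 9.98, 10.05, 10.01, 9.99]
      print(type_a_frequentist(data), type_a_objective_bayes(data))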

  11. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  12. Uncertainty Analysis on Heat Transfer Correlations for RP-1 Fuel in Copper Tubing

    NASA Technical Reports Server (NTRS)

    Driscoll, E. A.; Landrum, D. B.

    2004-01-01

    NASA is studying kerosene (RP-1) for application in Next Generation Launch Technology (NGLT). Accurate heat transfer correlations in narrow passages at high temperatures and pressures are needed. Hydrocarbon fuels, such as RP-1, produce carbon deposition (coke) along the inside of tube walls when heated to high temperatures. A series of tests to measure the heat transfer using RP-1 fuel and examine the coking were performed in NASA Glenn Research Center's Heated Tube Facility. The facility models regenerative cooling by flowing room temperature RP-1 through resistively heated copper tubing. A regression analysis is performed on the data to determine the heat transfer correlation for Nusselt number as a function of Reynolds and Prandtl numbers. Each measurement and calculation is analyzed to identify sources of uncertainty, including RP-1 property variations. Monte Carlo simulation is used to determine how each uncertainty source propagates through the regression, yielding an overall uncertainty in the predicted heat transfer coefficient. The implications of these uncertainties on engine design and ways to minimize existing uncertainties are discussed.
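
    A common way to carry out such an analysis is to fit the correlation Nu = a·Re^b·Pr^c by log-linear least squares and re-fit under Monte Carlo perturbations of the measurements. The sketch below illustrates this on synthetic data; the baseline coefficients, noise level and sample sizes are hypothetical, not the facility's:

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic "true" data generated from a Dittus-Boelter-like baseline.
      Re = rng.uniform(1e4, 1e5, 40)
      Pr = rng.uniform(5.0, 15.0, 40)
      Nu_true = 0.023 * Re ** 0.8 * Pr ** 0.4

      coefs = []
      for _ in range(2000):
          Nu_meas = Nu_true * (1.0 + 0.05 * rng.standard_normal(40))  # 5% noise
          X = np.column_stack([np.ones(40), np.log(Re), np.log(Pr)])
          beta, *_ = np.linalg.lstsq(X, np.log(Nu_meas), rcond=None)
          coefs.append(beta)

      coefs = np.array(coefs)
      print("mean [ln a, b, c]:", coefs.mean(axis=0))
      print("sd   [ln a, b, c]:", coefs.std(axis=0))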

  13. Material Issues of Blanket Systems for Fusion Reactors - Compatibility with Cooling Water -

    NASA Astrophysics Data System (ADS)

    Miwa, Yukio; Tsukada, Takashi; Jitsukawa, Shiro

    Environmental assisted cracking (EAC) is one of the material issues for the reactor core components of light water power reactors (LWRs). Much experience and knowledge about EAC have been obtained in the LWR field, and they will be useful in preventing EAC in water-cooled blanket systems of fusion reactors. For the austenitic stainless steels and the reduced-activation ferritic/martensitic steels, this experience indicates that EAC in a water-cooled blanket does not seem to be a critical issue. However, some uncertainties about the influences of water temperatures, water chemistries and stress conditions may affect EAC. Considerations and further investigations elucidating these uncertainties are discussed.

  14. Evaluation of risk from acts of terrorism :the adversary/defender model using belief and fuzzy sets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, John L.

    Risk from an act of terrorism is a combination of the likelihood of an attack, the likelihood of success of the attack, and the consequences of the attack. The considerable epistemic uncertainty in each of these three factors can be addressed using the belief/plausibility measure of uncertainty from the Dempster/Shafer theory of evidence. The adversary determines the likelihood of the attack. The success of the attack and the consequences of the attack are determined by the security system and mitigation measures put in place by the defender. This report documents a process for evaluating risk of terrorist acts using an adversary/defender model with belief/plausibility as the measure of uncertainty. Also, the adversary model is a linguistic model that applies belief/plausibility to fuzzy sets used in an approximate reasoning rule base.
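
    In Dempster-Shafer theory, belief and plausibility bracket the epistemic uncertainty in a hypothesis: belief sums the mass committed exactly to subsets of the hypothesis, plausibility sums all mass consistent with it. A minimal sketch over a two-element frame of discernment, with hypothetical mass assignments for "attack succeeds":

      # Basic probability assignment over the frame {success, failure};
      # mass on the whole frame represents ignorance. Values are hypothetical.
      frame = frozenset({"success", "failure"})
      masses = {
          frozenset({"success"}): 0.3,
          frozenset({"failure"}): 0.4,
          frame: 0.3,
      }

      def belief(hypothesis):
          # Total mass committed to subsets of the hypothesis.
          return sum(m for s, m in masses.items() if s <= hypothesis)

      def plausibility(hypothesis):
          # Total mass not contradicting the hypothesis.
          return sum(m for s, m in masses.items() if s & hypothesis)

      h = frozenset({"success"})
      print(f"Bel={belief(h):.2f}, Pl={plausibility(h):.2f}")  # Bel=0.30, Pl=0.60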

  15. Uncertainty of Videogrammetric Techniques used for Aerodynamic Testing

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Liu, Tianshu; DeLoach, Richard

    2002-01-01

    The uncertainty of videogrammetric techniques used for the measurement of static aeroelastic wind tunnel model deformation and wind tunnel model pitch angle is discussed. Sensitivity analyses and geometrical considerations of uncertainty are augmented by analyses of experimental data in which videogrammetric angle measurements were taken simultaneously with precision servo accelerometers corrected for dynamics. An analysis of variance (ANOVA) to examine error dependence on angle of attack, sensor used (inertial or optical), and on tunnel state variables such as Mach number is presented. Experimental comparisons with a high-accuracy indexing table are presented. Small roll angles are found to introduce a zero-shift in the measured angles. It is shown experimentally that, provided the proper constraints necessary for a solution are met, a single-camera solution can be comparable to a 2-camera intersection result. The relative immunity of optical techniques to dynamics is illustrated.

  16. Accounting for uncertainty in health economic decision models by using model averaging

    PubMed Central

    Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D

    2009-01-01

    Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment. PMID:19381329
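
    Both weighting schemes mentioned above reduce to the same arithmetic: weights proportional to exp(-0.5·ΔIC) relative to the best model, applied to model-specific estimates. A minimal sketch with hypothetical AIC values and cost-effectiveness estimates:

      import math

      def ic_weights(ic_values):
          # Information-criterion weights, e.g. Akaike weights for AIC.
          best = min(ic_values)
          raw = [math.exp(-0.5 * (ic - best)) for ic in ic_values]
          total = sum(raw)
          return [r / total for r in raw]

      aic = [102.3, 104.1, 108.9]           # e.g. three covariate choices
      estimates = [1.20, 1.35, 1.10]        # model-specific estimates (hypothetical)

      weights = ic_weights(aic)
      averaged = sum(w * e for w, e in zip(weights, estimates))
      print(weights, averaged)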

  17. Markov logic network based complex event detection under uncertainty

    NASA Astrophysics Data System (ADS)

    Lu, Jingyang; Jia, Bin; Chen, Genshe; Chen, Hua-mei; Sullivan, Nichole; Pham, Khanh; Blasch, Erik

    2018-05-01

    In a cognitive reasoning system, the four-stage Observe-Orient-Decide-Act (OODA) reasoning loop is of interest. The OODA loop is essential for situational awareness, especially in heterogeneous data fusion. Cognitive reasoning for making decisions can take advantage of different formats of information such as symbolic observations, various real-world sensor readings, or the relationships between intelligent modalities. Markov Logic Networks (MLNs) provide a mathematically sound technique for representing and fusing data at multiple levels of abstraction, and across multiple intelligent sensors, to conduct complex decision-making tasks. In this paper, a scenario about vehicle interaction is investigated, in which uncertainty is taken into consideration since no systematic approach can perfectly characterize the complex event scenario. MLNs are applied to the terrestrial domain, where the dynamic features and relationships among vehicles are captured through multiple sensors and information sources, with data uncertainty taken into account.
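
    The MLN semantics can be stated compactly: the probability of a possible world is proportional to the exponential of the summed weights of the ground formulas it satisfies. A minimal sketch over two ground atoms with two hypothetical weighted rules (not the rules used in the paper):

      import math
      from itertools import product

      # Ground atoms: A = "vehicle stops", B = "pedestrian crossing".
      rules = [
          (1.5, lambda A, B: (not B) or A),   # B => A, weight 1.5
          (0.5, lambda A, B: B),              # weak prior in favour of B
      ]

      def score(world):
          # exp(sum of weights of rules satisfied in this world)
          return math.exp(sum(w for w, f in rules if f(*world)))

      worlds = list(product([False, True], repeat=2))
      Z = sum(score(w) for w in worlds)       # partition function
      for w in worlds:
          print(w, round(score(w) / Z, 3))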

  18. A cholinergic feedback circuit to regulate striatal population uncertainty and optimize reinforcement learning.

    PubMed

    Franklin, Nicholas T; Frank, Michael J

    2015-12-25

    Convergent evidence suggests that the basal ganglia support reinforcement learning by adjusting action values according to reward prediction errors. However, adaptive behavior in stochastic environments requires the consideration of uncertainty to dynamically adjust the learning rate. We consider how cholinergic tonically active interneurons (TANs) may endow the striatum with such a mechanism in computational models spanning Marr's three levels of analysis. In the neural model, TANs modulate the excitability of spiny neurons, their population response to reinforcement, and hence the effective learning rate. Long TAN pauses facilitated robustness to spurious outcomes by increasing divergence in synaptic weights between neurons coding for alternative action values, whereas short TAN pauses facilitated stochastic behavior but increased responsiveness to change-points in outcome contingencies. A feedback control system allowed TAN pauses to be dynamically modulated by uncertainty across the spiny neuron population, allowing the system to self-tune and optimize performance across stochastic environments.
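
    The algorithmic core of the model is an uncertainty-modulated learning rate: a running estimate of outcome unreliability scales how strongly reward prediction errors update action values. The sketch below is a highly simplified abstraction of that idea, not the authors' multi-level model; the gain constants and environment are hypothetical:

      import numpy as np

      rng = np.random.default_rng(0)

      q = np.zeros(2)                  # action values
      uncertainty = 1.0                # running estimate of outcome unreliability
      p_reward = np.array([0.8, 0.2])  # true contingencies

      for t in range(1000):
          if t == 500:
              p_reward = p_reward[::-1]        # change-point in contingencies
          # Epsilon-greedy action selection.
          a = int(rng.random() < 0.5) if rng.random() < 0.1 else int(np.argmax(q))
          r = float(rng.random() < p_reward[a])
          delta = r - q[a]                      # reward prediction error
          uncertainty += 0.05 * (abs(delta) - uncertainty)
          lr = 0.1 + 0.4 * uncertainty          # higher uncertainty -> faster learning
          q[a] += lr * delta

      print(q)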

  19. An inexact multistage fuzzy-stochastic programming for regional electric power system management constrained by environmental quality.

    PubMed

    Fu, Zhenghui; Wang, Han; Lu, Wentao; Guo, Huaicheng; Li, Wei

    2017-12-01

    Electric power systems involve different fields and disciplines, encompassing the economic system, energy system, and environment system. Uncertainty within this compound system is an inevitable problem. Therefore, an inexact multistage fuzzy-stochastic programming (IMFSP) method was developed for regional electric power system management constrained by environmental quality. A model which combined interval-parameter programming, multistage stochastic programming, and fuzzy probability distributions was built to reflect the uncertain information and dynamic variation in the case study, and scenarios under different credibility degrees were considered. For all scenarios under consideration, corrective actions were allowed to be taken dynamically in accordance with the pre-regulated policies and the uncertainties in reality. The results suggest that the methodology is applicable to handling the uncertainty of regional electric power management systems and can help decision makers to establish an effective development plan.

  20. Reconstructing signals from noisy data with unknown signal and noise covariance.

    PubMed

    Oppermann, Niels; Robbers, Georg; Ensslin, Torsten A

    2011-10-01

    We derive a method to reconstruct Gaussian signals from linear measurements with Gaussian noise. This new algorithm is intended for applications in astrophysics and other sciences. The starting point of our considerations is the principle of minimum Gibbs free energy, which was previously used to derive a signal reconstruction algorithm handling uncertainties in the signal covariance. We extend this algorithm to simultaneously uncertain noise and signal covariances using the same principles in the derivation. The resulting equations are general enough to be applied in many different contexts. We demonstrate the performance of the algorithm by applying it to specific example situations and compare it to algorithms not allowing for uncertainties in the noise covariance. The results show that the method we suggest performs very well under a variety of circumstances and is indeed qualitatively superior to the other methods in cases where uncertainty in the noise covariance is present.
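
    The known-covariance special case that this work generalizes is the classical Wiener filter, m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d. A minimal sketch with hypothetical dimensions and covariances:

      import numpy as np

      rng = np.random.default_rng(2)

      n_pix = 50
      # Smooth Gaussian-kernel signal covariance S (jitter keeps it invertible),
      # white noise covariance N, identity response R for simplicity.
      d_ij = np.subtract.outer(np.arange(n_pix), np.arange(n_pix))
      S = 4.0 * np.exp(-0.5 * (d_ij / 5.0) ** 2) + 1e-3 * np.eye(n_pix)
      N = 0.5 * np.eye(n_pix)
      R = np.eye(n_pix)

      signal = rng.multivariate_normal(np.zeros(n_pix), S)
      data = R @ signal + rng.multivariate_normal(np.zeros(n_pix), N)

      # Wiener filter: posterior mean m and covariance D for known S and N.
      D = np.linalg.inv(np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R)
      m = D @ (R.T @ np.linalg.inv(N) @ data)
      print(f"residual rms: {np.sqrt(np.mean((m - signal) ** 2)):.3f}")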
