Sample records for uncertainty reduction strategies

  1. Uncertainty in the Work-Place: Hierarchical Differences of Uncertainty Levels and Reduction Strategies.

    ERIC Educational Resources Information Center

    Petelle, John L.; And Others

    A study examined the uncertainty levels and types reported by supervisors and employees at three hierarchical levels of an organization: first-line supervisors, full-time employees, and part-time employees. It investigated differences in uncertainty-reduction strategies employed by these three hierarchical groups. The 61 subjects who completed…

  2. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  3. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) the optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
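
    The four-step loop above maps naturally onto a small value-of-information routine. The sketch below is a minimal, hypothetical skeleton of that loop; the three callables (solve_management, design_network, project_uncertainty) are placeholders standing in for the chance-constrained management model, the integer-programming network design model, and the uncertainty projection, not Wagner's actual implementations.

    ```python
    def data_worth_analysis(budgets, prior_uncertainty,
                            solve_management, design_network, project_uncertainty):
        """Return the management/monitoring alternative with the greatest net benefit."""
        # Step 1: optimal management cost at the present level of model uncertainty
        baseline_cost = solve_management(prior_uncertainty)
        alternatives = []
        for budget in budgets:
            # Step 2: sampling plan that minimizes projected model uncertainty for this budget
            plan, sampling_cost = design_network(budget, prior_uncertainty)
            projected = project_uncertainty(prior_uncertainty, plan)
            # Step 3: re-optimize management under the projected post-sampling uncertainty
            new_cost = solve_management(projected)
            # Step 4: worth of monitoring = management cost saved minus the cost of data collection
            net_benefit = (baseline_cost - new_cost) - sampling_cost
            alternatives.append((budget, plan, net_benefit))
        # Steps 2-4 repeated over budgets; pick the best alternative
        return max(alternatives, key=lambda alt: alt[2])
    ```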

  4. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  5. Launcher Systems Development Cost: Behavior, Uncertainty, Influences, Barriers and Strategies for Reduction

    NASA Technical Reports Server (NTRS)

    Shaw, Eric J.

    2001-01-01

    This paper will report on the activities of the IAA Launcher Systems Economics Working Group in preparation for its Launcher Systems Development Cost Behavior Study. The Study goals include: improving the accuracy of launcher system and other space system parametric cost analysis; improving the credibility of launcher system and other space system cost analysis; and providing launcher system and technology development program managers and other decision-makers with useful information on the development cost impacts of their decisions. The Working Group plans to explore at least the following five areas in the Study: defining and explaining development cost behavior terms and concepts for use in the Study; identifying and quantifying sources of development cost and cost-estimating uncertainty; identifying and quantifying significant influences on development cost behavior; identifying common barriers to development cost understanding and reduction; and recommending practical, realistic strategies to accomplish reductions in launcher system development cost.

  6. Aerodynamic design of electric and hybrid vehicles: A guidebook

    NASA Technical Reports Server (NTRS)

    Kurtz, D. W.

    1980-01-01

    A typical present-day subcompact electric hybrid vehicle (EHV), operating on an SAE J227a D driving cycle, consumes up to 35% of its road energy requirement overcoming aerodynamic resistance. The application of an integrated system design approach, where drag reduction is an important design parameter, can increase the cycle range by more than 15%. This guidebook highlights a logic strategy for including aerodynamic drag reduction in the design of electric and hybrid vehicles to the degree appropriate to the mission requirements. Backup information and procedures are included in order to implement the strategy. Elements of the procedure are based on extensive wind tunnel tests involving generic subscale models and full-scale prototype EHVs. The user need not have any previous aerodynamic background. By necessity, the procedure utilizes many generic approximations and assumptions resulting in various levels of uncertainty. Dealing with these uncertainties, however, is a key feature of the strategy.

  7. Romantic relationship stages and social networking sites: uncertainty reduction strategies and perceived relational norms on facebook.

    PubMed

    Fox, Jesse; Anderegg, Courtney

    2014-11-01

    Due to their pervasiveness and unique affordances, social media play a distinct role in the development of modern romantic relationships. This study examines how a social networking site is used for information seeking about a potential or current romantic partner. In a survey, Facebook users (N=517) were presented with Facebook behaviors categorized as passive (e.g., reading a partner's profile), active (e.g., "friending" a common third party), or interactive (e.g., commenting on the partner's wall) uncertainty reduction strategies. Participants reported how normative they perceived these behaviors to be during four possible stages of relationship development (before meeting face-to-face, after meeting face-to-face, casual dating, and exclusive dating). Results indicated that as relationships progress, perceived norms for these behaviors change. Sex differences were also observed, as women perceived passive and interactive strategies as more normative than men during certain relationship stages.

  8. Evaluation strategies and uncertainty calculation of isotope amount ratios measured by MC ICP-MS on the example of Sr.

    PubMed

    Horsky, Monika; Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    This paper critically reviews the state-of-the-art of isotope amount ratio measurements by solution-based multi-collector inductively coupled plasma mass spectrometry (MC ICP-MS) and presents guidelines for corresponding data reduction strategies and uncertainty assessments based on the example of n(⁸⁷Sr)/n(⁸⁶Sr) isotope ratios. This ratio shows variation attributable to natural radiogenic processes and mass-dependent fractionation. The applied calibration strategies can display these differences. In addition, a proper statement of uncertainty of measurement, including all relevant influence quantities, is a metrological prerequisite. A detailed instructive procedure for the calculation of combined uncertainties is presented for Sr isotope amount ratios using three different strategies of correction for instrumental isotopic fractionation (IIF): traditional internal correction, standard-sample bracketing, and a combination of both, using Zr as internal standard. Uncertainties are quantified by means of a Kragten spreadsheet approach, including the consideration of correlations between individual input parameters to the model equation. The resulting uncertainties are compared with uncertainties obtained from the partial derivatives approach and Monte Carlo propagation of distributions. We obtain relative expanded uncertainties (U_rel; k = 2) of n(⁸⁷Sr)/n(⁸⁶Sr) of < 0.03% when normalization values are not propagated. A comprehensive propagation, including certified values and the internal normalization ratio in nature, increases relative expanded uncertainties by about a factor of two, and the correction for IIF becomes the major contributor.
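
    The bracketing and Monte Carlo propagation strategies named above can be illustrated with a toy calculation. The sketch below is a minimal example, assuming a simple standard-sample bracketing correction and purely illustrative ratio values and uncertainties; it omits internal normalization, the Zr internal standard, and the Kragten spreadsheet treatment described in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 200_000  # Monte Carlo draws

    # Measured ratios as normal distributions (mean, standard uncertainty) - hypothetical values
    r_sample = rng.normal(0.70918, 0.00002, N)    # sample, measured
    r_std_1 = rng.normal(0.71034, 0.00002, N)     # bracketing standard, measured before
    r_std_2 = rng.normal(0.71030, 0.00002, N)     # bracketing standard, measured after
    r_cert = rng.normal(0.710248, 0.000006, N)    # certified value of the standard

    # Standard-sample bracketing: divide out the mean instrumental bias of the two standards
    bias = 0.5 * (r_std_1 + r_std_2) / r_cert
    r_corrected = r_sample / bias

    mean = r_corrected.mean()
    U_rel = 2 * r_corrected.std(ddof=1) / mean    # relative expanded uncertainty, k = 2
    print(f"n(87Sr)/n(86Sr) = {mean:.6f}, U_rel (k=2) = {100 * U_rel:.4f} %")
    ```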

  9. Social network profiles as information sources for adolescents' offline relations.

    PubMed

    Courtois, Cédric; All, Anissa; Vanwynsberghe, Hadewijch

    2012-06-01

    This article presents the results of a study concerning the use of online profile pages by adolescents to know more about "offline" friends and acquaintances. Previous research has indicated that social networking sites (SNSs) are used to gather information on new online contacts. However, several studies have demonstrated a substantial overlap between offline and online social networks. Hence, we question whether online connections are meaningful in gathering information on offline friends and acquaintances. First, the results indicate that a combination of passive uncertainty reduction (monitoring a target's profile) and interactive uncertainty reduction (communication through the target's profile) explains a considerable amount of variance in the level of uncertainty about both friends and acquaintances. More specifically, adolescents generally get to know much more about their acquaintances. Second, the results of online uncertainty reduction positively affect the degree of self-disclosure, which is imperative in building a solid friend relation. Further, we find that uncertainty reduction strategies positively mediate the effect of social anxiety on the level of certainty about friends. This implies that socially anxious teenagers benefit from SNSs by getting the conditions right to build a more solid relation with their friends. Hence, we conclude that SNSs play a substantial role in today's adolescents' everyday interpersonal communication.

  10. Development of robust building energy demand-side control strategy under uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Sean Hay

    The prospect of carbon emission regulations applied to individual buildings will encourage building owners to purchase utility-provided green power or to employ onsite renewable energy generation. As both options rely on intermittent renewable energy sources, demand-side control is a fundamental precondition for maximizing their effectiveness. Such control reduces peak demand and/or energy demand variability, and a smoother demand profile in turn improves the efficiency with which an erratic renewable supply can be used. The combined operation of active thermal energy storage and passive building thermal mass has shown substantial improvement in demand-side control performance when compared to current state-of-the-art demand-side control measures. In particular, "model-based" optimal control of this combined operation has the potential to significantly increase performance and bring economic advantages. However, uncertainty in field operating conditions can diminish or seriously degrade its control effectiveness, resulting in poor performance. This dissertation pursues improvements to current demand-side controls under uncertainty by proposing a robust supervisory demand-side control strategy designed to be insensitive to uncertainty and to perform consistently under uncertain conditions. The proposed robust demand-side controls are distinguished as follows: (a) they are developed from fundamental studies of uncertainty and a systematic approach to uncertainty analysis; (b) they reduce the variability of performance under varied conditions and thus avoid the worst-case scenario; and (c) they react to critical discrepancies caused by unpredictable scenario uncertainty and thus increase control efficiency. The latter is achieved by means of (i) multi-source composition of weather forecasts, drawing on both historical archives and online sources, and (ii) adaptive multiple-model-based control (MMC) to mitigate the detrimental impacts of varying scenario uncertainties. The proposed robust demand-side control strategy demonstrates strong demand-side control performance in varied, unfamiliar conditions compared with existing control strategies, including deterministic optimal controls. This result reemphasizes the importance of demand-side control for buildings in the global carbon economy. It also demonstrates the risk-management capability of the proposed robust demand-side controls in highly uncertain situations, which ultimately yields the maximum benefit from both theoretical and practical perspectives.

  11. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    PubMed

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  12. Predicting future uncertainty constraints on global warming projections

    DOE PAGES

    Shiogama, H.; Stone, D.; Emori, S.; ...

    2016-01-11

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  13. Predicting future uncertainty constraints on global warming projections

    PubMed Central

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by “current knowledge” of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  14. Predicting future uncertainty constraints on global warming projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiogama, H.; Stone, D.; Emori, S.

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  15. Evolution of motion uncertainty in rectal cancer: implications for adaptive radiotherapy

    NASA Astrophysics Data System (ADS)

    Kleijnen, Jean-Paul J. E.; van Asselen, Bram; Burbach, Johannes P. M.; Intven, Martijn; Philippens, Marielle E. P.; Reerink, Onne; Lagendijk, Jan J. W.; Raaymakers, Bas W.

    2016-01-01

    Reduction of motion uncertainty by applying adaptive radiotherapy strategies depends largely on the temporal behavior of this motion. To fully optimize adaptive strategies, insight into target motion is needed. The purpose of this study was to analyze the stability and evolution in time of the motion uncertainty of both the gross tumor volume (GTV) and clinical target volume (CTV) for patients with rectal cancer. We scanned 16 patients daily during one week, on a 1.5 T MRI scanner in treatment position, prior to each radiotherapy fraction. Single-slice sagittal cine MRIs were made at the beginning, middle, and end of each scan session, for one minute at 2 Hz temporal resolution. GTV and CTV motion were determined by registering a delineated reference frame to time-points later in time. The 95th percentile of observed motion (dist95%) was taken as a measure of motion. The stability of motion in time was evaluated within each cine-MRI separately. The evolution of motion was investigated between the reference frame and the cine-MRIs of a single scan session and between the reference frame and the cine-MRIs of several days later in the course of treatment. This observed motion was then converted into a PTV-margin estimate. Within a one-minute cine-MRI scan, motion was found to be stable and small. Independent of the time-point within the scan session, the average dist95% remains below 3.6 mm and 2.3 mm for CTV and GTV, respectively, 90% of the time. We found similar motion over time intervals from 18 min to 4 days. When the time interval is reduced from 18 min to 1 min, a large reduction in motion uncertainty is observed: the motion uncertainty, and thus the PTV-margin estimate, decreases by 71% and 75% for the CTV and tumor, respectively. Time intervals of 15 and 30 s yield no further reduction in motion uncertainty compared to a 1 min time interval.
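
    The dist95% measure above is a simple percentile statistic over registered boundary displacements. The sketch below computes it on synthetic displacement data (an assumption; the study derives displacements from cine-MRI registration) and shows how a quoted percentage reduction translates into the margin estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    displacements_mm = np.abs(rng.normal(0.0, 1.2, size=500))  # simulated boundary motion (mm)

    dist95 = np.percentile(displacements_mm, 95)               # dist95%: 95th percentile of motion
    print(f"dist95% = {dist95:.1f} mm")

    # A 71% reduction in motion uncertainty (the CTV figure quoted above) shrinks the
    # corresponding margin estimate proportionally:
    print(f"after a 71% reduction: {0.29 * dist95:.1f} mm")
    ```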

  16. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of the trading ratio, and uncertainty analysis has rarely been linked to trading ratio determination. This paper presents a practical methodology for estimating the "equivalent trading ratio" (ETR) and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of tradeoffs between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. The rigorous quantification of the trading ratio will enhance the scientific basis, and thus public perception, for more informed decisions in the overall watershed-based pollutant trading program. (c) IWA Publishing 2008.

  17. The efficiency of asset management strategies to reduce urban flood risk.

    PubMed

    ten Veldhuis, J A E; Clemens, F H L R

    2011-01-01

    In this study, three asset management strategies were compared with respect to their efficiency to reduce flood risk. Data from call centres at two municipalities were used to quantify urban flood risks associated with three causes of urban flooding: gully pot blockage, sewer pipe blockage and sewer overloading. The efficiency of three flood reduction strategies was assessed based on their effect on the causes contributing to flood risk. The sensitivity of the results to uncertainty in the data source, citizens' calls, was analysed through incorporation of uncertainty ranges taken from customer complaint literature. Based on the available data it could be shown that reducing gully pot blockage is the most efficient action to reduce flood risk, given data uncertainty. If differences between cause incidences are large, as in the presented case study, call data are sufficient to decide how flood risk can be most efficiently reduced. According to the results of this analysis, enlargement of sewer pipes is not an efficient strategy to reduce flood risk, because flood risk associated with sewer overloading is small compared to other failure mechanisms.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, L; Soldner, A; Kirk, M

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, "Curr"), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. Tumor control probability (TCP) and normal tissue complication probabilities (NTCPs) for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman [3] formalism and published model parameters (Terahara [4]; QUANTEC S10; Burman, Red Journal v21, pp 123). Our plan optimization strategy was to bring the PTV dose close to prescription while maintaining OAR NTCP values at or better than those of the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55% and 65%. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
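
    The LKB calculation named in the Methods can be written compactly: a generalized equivalent uniform dose (EUD) is computed from the dose-volume histogram and mapped through a probit function. The sketch below is a hedged, generic implementation; the DVH bins and the parameter values (n, m, TD50) are illustrative placeholders rather than the abstract's published model parameters.

    ```python
    import numpy as np
    from scipy.stats import norm

    def lkb_ntcp(doses_gy, volumes, n, m, td50_gy):
        """NTCP from a differential DVH via generalized EUD and the LKB probit model."""
        volumes = np.asarray(volumes, dtype=float)
        volumes = volumes / volumes.sum()                          # fractional volumes
        eud = (volumes * np.asarray(doses_gy) ** (1.0 / n)).sum() ** n
        t = (eud - td50_gy) / (m * td50_gy)
        return norm.cdf(t)

    # Example: a coarse organ-at-risk DVH (dose bins in Gy, relative volume per bin)
    doses = [10.0, 30.0, 50.0, 60.0]
    vols = [0.4, 0.3, 0.2, 0.1]
    print(f"NTCP = {lkb_ntcp(doses, vols, n=0.16, m=0.14, td50_gy=65.0):.3f}")
    ```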

  19. Evaluation of Cost Leadership Strategy in Shipping Enterprises with Simulation Model

    NASA Astrophysics Data System (ADS)

    Ferfeli, Maria V.; Vaxevanou, Anthi Z.; Damianos, Sakas P.

    2009-08-01

    The present study attempts an evaluation of the cost leadership strategy that prevails in certain shipping enterprises and the creation of simulation models based on the strategic model STAIR, an alternative method for evaluating strategic applications. The aim is to determine whether the cost leadership strategy creates competitive advantage [1]; this is assessed via simulation, which captures the interactions between the operations of an enterprise and its decision-making strategy under conditions of uncertainty, with a reduction of the undertaken risk.

  20. Two-agent cooperative search using game models with endurance-time constraints

    NASA Astrophysics Data System (ADS)

    Sujit, P. B.; Ghose, Debasish

    2010-07-01

    In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells and each cell is assumed to possess an uncertainty value. The UAVs have to cooperatively search these cells taking limited endurance, sensor and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent will return to any one of the available bases. A set of paths are formed using these cells which the game theoretical strategies use to select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative and security strategies from game theory to enhance the search effectiveness. Monte-Carlo simulations are carried out which show the superiority of the game theoretical strategies over greedy strategy for different look ahead step length paths. Within the game theoretical strategies, non-cooperative Nash and cooperative strategy perform similarly in an ideal case, but Nash strategy performs better than the cooperative strategy when the perceived information is different. We also propose a heuristic based on partitioning of the search space into sectors to reduce computational overhead without performance degradation.
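
    As a point of comparison for the game-theoretic strategies above, the greedy baseline can be sketched in a few lines: repeatedly move to the most uncertain reachable cell and discount its uncertainty after a visit. The grid, the 4-connected neighbourhood (standing in for the hexagonal cells), and the reduction factor are assumptions for illustration only; endurance, refuelling, and communication constraints are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    uncertainty = rng.uniform(0.2, 1.0, size=(8, 8))   # one uncertainty value per cell
    beta = 0.4                                          # fraction of uncertainty left after a visit

    def neighbours(cell, shape):
        r, c = cell
        steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]      # 4-connected stand-in for hex cells
        return [(r + dr, c + dc) for dr, dc in steps
                if 0 <= r + dr < shape[0] and 0 <= c + dc < shape[1]]

    pos = (0, 0)
    for _ in range(20):
        # greedy strategy: always search the most uncertain neighbouring cell
        pos = max(neighbours(pos, uncertainty.shape), key=lambda c: uncertainty[c])
        uncertainty[pos] *= beta                        # searching a cell reduces its uncertainty
    print(f"total residual uncertainty: {uncertainty.sum():.2f}")
    ```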

  1. The concept of comparative information yield curves and its application to risk-based site characterization

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Rubin, Yoram; Maxwell, Reed M.

    2009-06-01

    Defining rational and effective hydrogeological data acquisition strategies is of crucial importance as such efforts are always resource limited. Usually, strategies are developed with the goal of reducing parameter uncertainty, but less often in the context of how that reduction propagates to the predictions of interest, such as human health risk. This paper presents an approach for determining site characterization needs on the basis of human health risk. The main challenge is in striking a balance between reduction in uncertainty in hydrogeological, behavioral, and physiological parameters. Striking this balance can provide clear guidance on setting priorities for data acquisition and for better estimating adverse health effects in humans. This paper addresses this challenge through theoretical developments and numerical simulation. A wide range of factors that affect site characterization needs are investigated, including the dimensions of the contaminant plume and additional length scales that characterize the transport problem, as well as the model of human health risk. The concept of comparative information yield curves is used for investigating the relative impact of hydrogeological and physiological parameters on risk. Results show that characterization needs are dependent on the ratios between flow and transport scales within a risk-driven approach. Additionally, the results indicate that human health risk becomes less sensitive to hydrogeological measurements for large plumes. This indicates that under near-ergodic conditions, uncertainty reduction in human health risk may benefit from better understanding of the physiological component as opposed to a more detailed hydrogeological characterization.

  2. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed in fulfilling the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Experiences of Uncertainty in Men With an Elevated PSA

    PubMed Central

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2016-01-01

    A significant proportion of men ages 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men’s reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. PMID:25979635

  4. Carbon Monitoring System Flux Estimation and Attribution: Impact of ACOS-GOSAT X(CO2) Sampling on the Inference of Terrestrial Biospheric Sources and Sinks

    NASA Technical Reports Server (NTRS)

    Liu, Junjie; Bowman, Kevin W.; Lee, Memong; Henze, David K.; Bousserez, Nicolas; Brix, Holger; Collatz, G. James; Menemenlis, Dimitris; Ott, Lesley; Pawson, Steven

    2014-01-01

    Using an Observing System Simulation Experiment (OSSE), we investigate the impact of JAXA Greenhouse gases Observing SATellite 'IBUKI' (GOSAT) sampling on the estimation of terrestrial biospheric flux with the NASA Carbon Monitoring System Flux (CMS-Flux) estimation and attribution strategy. The simulated observations in the OSSE use the actual column carbon dioxide (X(CO2)) b2.9 retrieval sensitivity and quality control for the year 2010 processed through the Atmospheric CO2 Observations from Space algorithm. CMS-Flux is a variational inversion system that uses the GEOS-Chem forward and adjoint model forced by a suite of observationally constrained fluxes from ocean, land and anthropogenic models. We investigate the impact of GOSAT sampling on flux estimation in two aspects: 1) random error uncertainty reduction and 2) the global and regional bias in posterior flux resulting from the spatiotemporally biased GOSAT sampling. Based on Monte Carlo calculations, we find that global average flux uncertainty reduction ranges from 25% in September to 60% in July. When aggregated to the 11 land regions designated by phase 3 of the Atmospheric Tracer Transport Model Intercomparison Project, the annual mean uncertainty reduction ranges from 10% over North American boreal to 38% over South American temperate, which is driven by observational coverage and the magnitude of prior flux uncertainty. The uncertainty reduction over the South American tropical region is 30%, even with sparse observation coverage. We show that this reduction results from the large prior flux uncertainty and the impact of non-local observations. Given the assumed prior error statistics, the degree of freedom for signal is approx. 1132 for 1 yr of the 74 055 GOSAT X(CO2) observations, which indicates that GOSAT provides approx. 1132 independent pieces of information about surface fluxes. We quantify the impact of GOSAT's spatiotemporal sampling on the posterior flux, and find that a bias of 0.7 gigatons of carbon in the global annual posterior flux resulted from the seasonally and diurnally biased sampling when using a diagonal prior flux error covariance.
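
    Two of the diagnostics quoted above (percentage uncertainty reduction and the degrees of freedom for signal) have compact matrix forms. The sketch below illustrates them on toy covariance matrices, which are placeholders, not CMS-Flux output; DOFS is taken as the trace of the averaging kernel I - S_post S_prior^-1, a standard estimation-theory diagnostic rather than a quantity defined in the abstract itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 50
    A = rng.normal(size=(n, n))
    S_prior = A @ A.T + n * np.eye(n)     # synthetic prior flux-error covariance
    S_post = 0.4 * S_prior                # pretend assimilation shrank the errors uniformly

    # Per-element uncertainty reduction: 1 - sigma_posterior / sigma_prior
    reduction = 1.0 - np.sqrt(np.diag(S_post)) / np.sqrt(np.diag(S_prior))
    # Degrees of freedom for signal: trace of the averaging kernel
    dofs = np.trace(np.eye(n) - S_post @ np.linalg.inv(S_prior))
    print(f"mean uncertainty reduction: {100 * reduction.mean():.0f} %, DOFS: {dofs:.1f}")
    ```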

  5. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    NASA Astrophysics Data System (ADS)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2014-07-01

    The uncertainty brought about by intermittent volcanic activity is fairly common at volcanoes worldwide. While better knowledge of any one volcano's behavioural characteristics has the potential to reduce this uncertainty, the subsequent reduction of risk from volcanic threats is only realised if that knowledge is pertinent to stakeholders and effectively communicated to inform good decision making. Success requires integration of methods, skills and expertise across disciplinary boundaries. This research project develops and trials a novel interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). For the first time, volcanological techniques, probabilistic decision support and social scientific methods were integrated in a single study. New data were produced that (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience; and (5) evaluated the effectiveness of a scenario planning approach, both as a method for integrating the different strands of the research and as a way of enabling on-island decision makers to take ownership of risk identification and management, and capacity building within their community. The paper provides empirical evidence of the value of an innovative interdisciplinary framework for reducing volcanic risk. It also provides evidence for the strength that comes from integrating social and physical sciences with the development of effective, tailored engagement and communication strategies in volcanic risk reduction.

  6. Adaptive Management and the Value of Information: Learning Via Intervention in Epidemiology

    PubMed Central

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding. PMID:25333371
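
    The "value of information" framing above is often summarized by the expected value of perfect information (EVPI): the gap between the expected cost of acting once model uncertainty is resolved and the expected cost of the best single action chosen under the prior model weights. The sketch below shows that calculation with an invented two-model, three-action cost matrix; the numbers are placeholders, not the study's FMD simulation outputs.

    ```python
    import numpy as np

    # cost[m, a]: expected loss (e.g. £M of livestock) under transmission model m and action a
    cost = np.array([[120.0, 95.0, 140.0],
                     [100.0, 130.0, 90.0]])
    weights = np.array([0.5, 0.5])                      # prior probability of each competing model

    expected_cost_per_action = cost.T @ weights         # average each action over the models
    static_cost = expected_cost_per_action.min()        # best fixed action under uncertainty
    informed_cost = (cost.min(axis=1) * weights).sum()  # best action if the true model were known

    evpi = static_cost - informed_cost                  # expected benefit of resolving uncertainty
    print(f"EVPI = {evpi:.1f}")
    ```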

  7. Adaptive management and the value of information: learning via intervention in epidemiology

    USGS Publications Warehouse

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding.

  8. The impact of uncertainty on optimal emission policies

    NASA Astrophysics Data System (ADS)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainties on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.

  9. Salt Reduction Initiatives around the World - A Systematic Review of Progress towards the Global Target.

    PubMed

    Trieu, Kathy; Neal, Bruce; Hawkes, Corinna; Dunford, Elizabeth; Campbell, Norm; Rodriguez-Fernandez, Rodrigo; Legetic, Branka; McLaren, Lindsay; Barberio, Amanda; Webster, Jacqui

    2015-01-01

    To quantify progress with the initiation of salt reduction strategies around the world in the context of the global target to reduce population salt intake by 30% by 2025. A systematic review of the published and grey literature was supplemented by questionnaires sent to country program leaders. Core characteristics of strategies were extracted and categorised according to a pre-defined framework. A total of 75 countries now have a national salt reduction strategy, more than double the number reported in a similar review done in 2010. The majority of programs are multifaceted and include industry engagement to reformulate products (n = 61), establishment of sodium content targets for foods (39), consumer education (71), front-of-pack labelling schemes (31), taxation on high-salt foods (3) and interventions in public institutions (54). Legislative action related to salt reduction such as mandatory targets, front of pack labelling, food procurement policies and taxation have been implemented in 33 countries. 12 countries have reported reductions in population salt intake, 19 reduced salt content in foods and 6 improvements in consumer knowledge, attitudes or behaviours relating to salt. The large and increasing number of countries with salt reduction strategies in place is encouraging although activity remains limited in low- and middle-income regions. The absence of a consistent approach to implementation highlights uncertainty about the elements most important to success. Rigorous evaluation of ongoing programs and initiation of salt reduction programs, particularly in low- and middle- income countries, will be vital to achieving the targeted 30% reduction in salt intake.

  10. Salt Reduction Initiatives around the World – A Systematic Review of Progress towards the Global Target

    PubMed Central

    Trieu, Kathy; Neal, Bruce; Hawkes, Corinna; Dunford, Elizabeth; Campbell, Norm; Rodriguez-Fernandez, Rodrigo; Legetic, Branka; McLaren, Lindsay; Barberio, Amanda; Webster, Jacqui

    2015-01-01

    Objective To quantify progress with the initiation of salt reduction strategies around the world in the context of the global target to reduce population salt intake by 30% by 2025. Methods A systematic review of the published and grey literature was supplemented by questionnaires sent to country program leaders. Core characteristics of strategies were extracted and categorised according to a pre-defined framework. Results A total of 75 countries now have a national salt reduction strategy, more than double the number reported in a similar review done in 2010. The majority of programs are multifaceted and include industry engagement to reformulate products (n = 61), establishment of sodium content targets for foods (39), consumer education (71), front-of-pack labelling schemes (31), taxation on high-salt foods (3) and interventions in public institutions (54). Legislative action related to salt reduction such as mandatory targets, front of pack labelling, food procurement policies and taxation have been implemented in 33 countries. 12 countries have reported reductions in population salt intake, 19 reduced salt content in foods and 6 improvements in consumer knowledge, attitudes or behaviours relating to salt. Conclusion The large and increasing number of countries with salt reduction strategies in place is encouraging although activity remains limited in low- and middle-income regions. The absence of a consistent approach to implementation highlights uncertainty about the elements most important to success. Rigorous evaluation of ongoing programs and initiation of salt reduction programs, particularly in low- and middle- income countries, will be vital to achieving the targeted 30% reduction in salt intake. PMID:26201031

  11. Experiences of Uncertainty in Men With an Elevated PSA.

    PubMed

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2015-05-15

    A significant proportion of men ages 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.

  12. Sequential updating of multimodal hydrogeologic parameter fields using localization and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta

    2009-07-01

    Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.
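
    The core update behind the scheme described above is an ensemble Kalman filter analysis step, with a localization taper applied to the parameter-observation cross-covariance. The sketch below is a generic perturbed-observation EnKF update under those assumptions; the GMM clustering, block updating, and dimension-reduction strategies discussed in the abstract are omitted.

    ```python
    import numpy as np

    def enkf_update(X, y, H, R, loc, seed=0):
        """One localized, perturbed-observation EnKF analysis step.

        X: (n_par, n_ens) parameter ensemble; y: (n_obs,) observations;
        H: (n_obs, n_par) linear observation operator; R: (n_obs, n_obs) obs-error covariance;
        loc: (n_par, n_obs) localization taper applied to the cross-covariance."""
        n_ens = X.shape[1]
        Xp = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
        HXp = H @ Xp
        P_xy = loc * (Xp @ HXp.T) / (n_ens - 1)           # localized cross-covariance
        P_yy = (HXp @ HXp.T) / (n_ens - 1) + R
        K = P_xy @ np.linalg.inv(P_yy)                    # Kalman gain
        rng = np.random.default_rng(seed)
        Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
        return X + K @ (Y - H @ X)                        # updated ensemble
    ```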

  13. Recent advancements in GRACE mascon regularization and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Loomis, B. D.; Luthcke, S. B.

    2017-12-01

    The latest release of the NASA Goddard Space Flight Center (GSFC) global time-variable gravity mascon product applies a new regularization strategy along with new methods for estimating noise and leakage uncertainties. The critical design component of mascon estimation is the construction of the applied regularization matrices, and different strategies exist between the different centers that produce mascon solutions. The new approach from GSFC directly applies the pre-fit Level 1B inter-satellite range-acceleration residuals in the design of time-dependent regularization matrices, which are recomputed at each step of our iterative solution method. We summarize this new approach, demonstrating the simultaneous increase in recovered time-variable gravity signal and reduction in the post-fit inter-satellite residual magnitudes, until solution convergence occurs. We also present our new approach for estimating mascon noise uncertainties, which are calibrated to the post-fit inter-satellite residuals. Lastly, we present a new technique for end users to quickly estimate the signal leakage errors for any selected grouping of mascons, and we test the viability of this leakage assessment procedure on the mascon solutions produced by other processing centers.

  14. Implications of uncertainty on regional CO2 mitigation policies for the U.S. onroad sector based on a high-resolution emissions estimate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendoza, D.; Gurney, Kevin R.; Geethakumar, Sarath

    2013-04-01

    In this study we present onroad fossil fuel CO2 emissions estimated by the Vulcan Project, an effort quantifying fossil fuel CO2 emissions for the U.S. in high spatial and temporal resolution. This high-resolution data, aggregated at the state-level and classified in broad road and vehicle type categories, is compared to a commonly used national-average approach. We find that the use of national averages incurs state-level biases for road groupings that are almost twice as large as for vehicle groupings. The uncertainty for all groups exceeds the bias, and both quantities are positively correlated with total state emissions. States with the largest emissions totals are typically similar to one another in terms of emissions fraction distribution across road and vehicle groups, while smaller-emitting states have a wider range of variation in all groups. Errors in reduction estimates as large as ±60% corresponding to ±0.2 MtC are found for a national-average emissions mitigation strategy focused on a 10% emissions reduction from a single vehicle class, such as passenger gas vehicles or heavy diesel trucks. Recommendations are made for reducing CO2 emissions uncertainty by addressing its main drivers: VMT and fuel efficiency uncertainty.

  15. Likelihood of achieving air quality targets under model uncertainties.

    PubMed

    Digar, Antara; Cohan, Daniel S; Cox, Dennis D; Kim, Byeong-Uk; Boylan, James W

    2011-01-01

    Regulatory attainment demonstrations in the United States typically apply a bright-line test to predict whether a control strategy is sufficient to attain an air quality standard. Photochemical models are the best tools available to project future pollutant levels and are a critical part of regulatory attainment demonstrations. However, because photochemical models are uncertain and future meteorology is unknowable, future pollutant levels cannot be predicted perfectly and attainment cannot be guaranteed. This paper introduces a computationally efficient methodology for estimating the likelihood that an emission control strategy will achieve an air quality objective in light of uncertainties in photochemical model input parameters (e.g., uncertain emission and reaction rates, deposition velocities, and boundary conditions). The method incorporates Monte Carlo simulations of a reduced form model representing pollutant-precursor response under parametric uncertainty to probabilistically predict the improvement in air quality due to emission control. The method is applied to recent 8-h ozone attainment modeling for Atlanta, Georgia, to assess the likelihood that additional controls would achieve fixed (well-defined) or flexible (due to meteorological variability and uncertain emission trends) targets of air pollution reduction. The results show that in certain instances ranking of the predicted effectiveness of control strategies may differ between probabilistic and deterministic analyses.
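
    To illustrate the probabilistic attainment idea (not the authors' reduced-form model), the sketch below propagates assumed parametric uncertainty through a toy linear ozone-response surrogate by Monte Carlo sampling and reports the probability that a control strategy attains the standard; every distribution and number is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_mc = 10_000

base_ozone = 78.0    # ppb, illustrative baseline design value
target = 75.0        # ppb, standard to be attained
nox_cut = 0.20       # 20% emission reduction under evaluation

# Assumed parametric uncertainty: emission-inventory bias and ozone sensitivity to NOx cuts
emis_factor = rng.lognormal(mean=0.0, sigma=0.2, size=n_mc)
sensitivity = rng.normal(loc=12.0, scale=3.0, size=n_mc)   # ppb per unit fractional NOx cut

future_ozone = base_ozone - sensitivity * nox_cut * emis_factor
p_attain = np.mean(future_ozone <= target)
print(f"Probability the control strategy attains the standard: {p_attain:.2f}")
```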

  16. Modeling with uncertain science: estimating mitigation credits from abating lead poisoning in Golden Eagles.

    PubMed

    Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A

    2015-09-01

    Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.

  17. Uncertainty in Bioenergy Scenarios for California: Lessons Learned in Communicating with Different Stakeholder Groups

    NASA Astrophysics Data System (ADS)

    Youngs, H.

    2013-12-01

    Projecting future bioenergy use involves incorporating several critical inter-related parameters with high uncertainty. Among these are: technology adoption, infrastructure and capacity building, investment, political will, and public acceptance. How, when, where, and to what extent the various bioenergy options are implemented has profound effects on the environmental impacts incurred. California serves as an interesting case study for bioenergy implementation because it has very strong competing forces that can influence these critical factors. The state has aggressive greenhouse gas reduction goals, which will require some biofuels, and has invested accordingly in new technology. At the same time, political will and public acceptance of bioenergy have wavered, seriously stalling bioenergy expansion efforts. We have constructed scenarios for bioenergy implementation in California to 2050, in conjunction with efforts to reach AB32 GHG reduction goals of 80% below 1990 emissions. The state has the potential to produce 3 to 10 TJ of biofuels and electricity; however, this potential will be severely limited in some scenarios. This work examines sources of uncertainty in bioenergy implementation, how uncertainty is or is not incorporated into future bioenergy scenarios, and what this means for assessing environmental impacts. How uncertainty is communicated and perceived also affects future scenarios. Often, there is a disconnect between scenarios for widespread implementation and the actual development of individual projects, resulting in "artificial uncertainty" with very real impacts. Bringing stakeholders to the table is only the first step. Strategies to tailor and stage discussions of uncertainty to stakeholder groups are equally important. Lessons learned in the process of communicating the California's Energy Future biofuels assessment will be discussed.

  18. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.

  19. Comparing the effects of different land management strategies across several land types on California's landscape carbon and associated greenhouse gas budgets

    NASA Astrophysics Data System (ADS)

    Di Vittorio, A. V.; Simmonds, M.; Nico, P. S.

    2017-12-01

    Land-based carbon sequestration and GreenHouse Gas (GHG) reduction strategies are often implemented in small patches and evaluated independently from each other, which poses several challenges to determining their potential benefits at the regional scales at which carbon/GHG targets are defined. These challenges include inconsistent methods, uncertain scalability to larger areas, and lack of constraints such as land ownership and competition among multiple strategies. To address such challenges we have developed an integrated carbon and GHG budget model of California's entire landscape, delineated by geographic region, land type, and ownership. This empirical model has annual time steps and includes net ecosystem carbon exchange, wildfire, multiple forest management practices including wood and bioenergy production, cropland and rangeland soil management, various land type restoration activities, and land cover change. While the absolute estimates vary considerably due to uncertainties in initial carbon densities and ecosystem carbon exchange rates, the estimated effects of particular management activities with respect to baseline are robust across these uncertainties. Uncertainty in land use/cover change data is also critical, as different rates of shrubland to grassland conversion can switch the system from a carbon source to a sink. The results indicate that reducing urban area expansion has substantial and consistent benefits, while the effects of direct land management practices vary and depend largely on the available management area. Increasing forest fuel reduction extent over the baseline contributes to annual GHG costs during increased management, and annual benefits after increased management ceases. Cumulatively, it could take decades to recover the cost of 14 years of increased fuel reduction. However, forest carbon losses can be completely offset within 20 years through increases in urban forest fraction and marsh restoration. Additionally, highly uncertain black carbon estimates dominate the overall GHG budget due to wildfire, forest management, and bioenergy production. Overall, this tool is well suited for exploring suites of management options and extents throughout California in order to quantify potential regional carbon sequestration and GHG emission benefits.

  20. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    NASA Astrophysics Data System (ADS)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2013-12-01

    This research project adopted an interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). New data were produced that: (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience. Despite their isolation and prolonged periods of hardship, islanders have demonstrated an ability to cope with and recover from adverse events. This resilience is likely a function of remoteness, strong kinship ties, bonding social capital, and persistence of shared values and principles established at community inception. While there is good knowledge of the styles of volcanic activity on Tristan, given the high degree of scientific uncertainty about the timing, size and location of future volcanism, a qualitative scenario planning approach was used as a vehicle to convey this information to the islanders. This deliberative, anticipatory method allowed on-island decision makers to take ownership of risk identification, management and capacity building within their community. This paper demonstrates the value of integrating social and physical sciences with development of effective, tailored communication strategies in volcanic risk reduction.

  1. Breaking through the uncertainty ceiling in LA-ICP-MS U-Pb geochronology

    NASA Astrophysics Data System (ADS)

    Horstwood, M.

    2016-12-01

    Sources of systematic uncertainty associated with session-to-session bias are the dominant contributor to the 2% (2s) uncertainty ceiling that currently limits the accuracy of LA-ICP-MS U-Pb geochronology. Sources include differential downhole fractionation (LIEF), 'matrix effects' and ablation volume differences, which result in irreproducibility of the same reference material across sessions. Current mitigation methods include correcting for LIEF mathematically, using matrix-matched reference materials, annealing material to reduce or eliminate radiation damage effects and tuning for robust plasma conditions. Reducing the depth and volume of ablation can also mitigate these problems and should contribute to the reduction of the uncertainty ceiling. Reducing the analysed volume increases detection efficiency, reduces matrix effects, eliminates LIEF, obviates ablation rate differences, and reduces the likelihood of intercepting complex growth zones with depth, thereby apparently improving material homogeneity. High detection efficiencies (% level) and low sampling volumes (20 µm box, 1-2 µm deep) can now be achieved using MC-ICP-MS such that low volume ablations should be considered part of the toolbox of methods targeted at improving the reproducibility of LA-ICP-MS U-Pb geochronology. In combination with other strategies these improvements should be feasible on any ICP platform. However, reducing the volume of analysis reduces detected counts and requires a change of analytical approach in order to mitigate this. Appropriate strategies may include the use of high efficiency cell and torch technologies and the optimisation of acquisition protocols and data handling techniques such as condensing signal peaks, using log ratios and total signal integration. The tools required to break the 2% (2s) uncertainty ceiling in LA-ICP-MS U-Pb geochronology are likely now known but require a coherent strategy and change of approach to combine their implementation and realise this goal. This study will highlight these changes and efforts towards reducing the uncertainty contribution for LA-ICP-MS U-Pb geochronology.

  2. Estimating the Health Effects of Greenhouse Gas Mitigation Strategies: Addressing Parametric, Model, and Valuation Challenges

    PubMed Central

    Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid

    2014-01-01

    Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270

  3. Potential of European 14CO2 observation network to estimate the fossil fuel CO2 emissions via atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Wang, Yilong; Broquet, Grégoire; Ciais, Philippe; Chevallier, Frédéric; Vogel, Felix; Wu, Lin; Yin, Yi; Wang, Rong; Tao, Shu

    2018-03-01

    Combining measurements of atmospheric CO2 and its radiocarbon (14CO2) fraction and transport modeling in atmospheric inversions offers a way to derive improved estimates of CO2 emitted from fossil fuel (FFCO2). In this study, we solve for the monthly FFCO2 emission budgets at regional scale (i.e., the size of a medium-sized country in Europe) and investigate the performance of different observation networks and sampling strategies across Europe. The inversion system is built on the LMDZv4 global transport model at 3.75° × 2.5° resolution. We conduct Observing System Simulation Experiments (OSSEs) and use two types of diagnostics to assess the potential of the observation and inverse modeling frameworks. The first one relies on the theoretical computation of the uncertainty in the estimate of emissions from the inversion, known as posterior uncertainty, and on the uncertainty reduction compared to the uncertainty in the inventories of these emissions, which are used as a prior knowledge by the inversion (called prior uncertainty). The second one is based on comparisons of prior and posterior estimates of the emission to synthetic true emissions when these true emissions are used beforehand to generate the synthetic fossil fuel CO2 mixing ratio measurements that are assimilated in the inversion. With 17 stations currently measuring 14CO2 across Europe using 2-week integrated sampling, the uncertainty reduction for monthly FFCO2 emissions in a country where the network is rather dense like Germany, is larger than 30 %. With the 43 14CO2 measurement stations planned in Europe, the uncertainty reduction for monthly FFCO2 emissions is increased for the UK, France, Italy, eastern Europe and the Balkans, depending on the configuration of prior uncertainty. Further increasing the number of stations or the sampling frequency improves the uncertainty reduction (up to 40 to 70 %) in high emitting regions, but the performance of the inversion remains limited over low-emitting regions, even assuming a dense observation network covering the whole of Europe. This study also shows that both the theoretical uncertainty reduction (and resulting posterior uncertainty) from the inversion and the posterior estimate of emissions itself, for a given prior and true estimate of the emissions, are highly sensitive to the choice between two configurations of the prior uncertainty derived from the general estimate by inventory compilers or computations on existing inventories. In particular, when the configuration of the prior uncertainty statistics in the inversion system does not match the difference between these prior and true estimates, the posterior estimate of emissions deviates significantly from the truth. This highlights the difficulty of filtering the targeted signal in the model-data misfit for this specific inversion framework, the need to strongly rely on the prior uncertainty characterization for this and, consequently, the need for improved estimates of the uncertainties in current emission inventories for real applications with actual data. We apply the posterior uncertainty in annual emissions to the problem of detecting a trend of FFCO2, showing that increasing the monitoring period (e.g., more than 20 years) is more efficient than reducing uncertainty in annual emissions by adding stations. 
The coarse spatial resolution of the atmospheric transport model used in this OSSE (typical of models used for global inversions of natural CO2 fluxes) leads to large representation errors (related to the inability of the transport model to capture the spatial variability of the actual fluxes and mixing ratios at subgrid scales), which is a key limitation of our OSSE setup to improve the accuracy of the monitoring of FFCO2 emissions in European regions. Using a high-resolution transport model should improve the potential to retrieve FFCO2 emissions, and this needs to be investigated.
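
    The posterior-uncertainty diagnostic used above follows from standard Gaussian linear inverse theory; a minimal numpy sketch (with purely illustrative matrix sizes, not the LMDZ-based system) is:

```python
import numpy as np

def posterior_uncertainty_reduction(H, B, R):
    """Gaussian linear inversion: B is the prior (emission) covariance, R the
    observation-error covariance, H the Jacobian mapping emissions to observations.
    Returns the fractional reduction of the prior 1-sigma uncertainty per element."""
    A = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))  # posterior covariance
    return 1.0 - np.sqrt(np.diag(A)) / np.sqrt(np.diag(B))

# Toy setup: 12 monthly budgets observed through 60 synthetic measurements
rng = np.random.default_rng(0)
H = rng.normal(size=(60, 12))
B = np.diag(np.full(12, 0.5 ** 2))   # 50% prior uncertainty on each monthly budget
R = np.eye(60)
print(posterior_uncertainty_reduction(H, B, R))
```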

  4. Advanced Variance Reduction Strategies for Optimizing Mesh Tallies in MAVRIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Blakeman, Edward D; Wagner, John C

    2007-01-01

    More often than in the past, Monte Carlo methods are being used to compute fluxes or doses over large areas using mesh tallies (a set of region tallies defined on a mesh that overlays the geometry). For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas. The CADIS (Consistent Adjoint Driven Importance Sampling) methodology has been shown to very efficiently optimize the calculation of a response (flux or dose) for a single point or a small region using weight windows and a biased source based on the adjoint of that response. This has been incorporated into codes such as ADVANTG (based on MCNP) and the new sequence MAVRIC, which will be available in the next release of SCALE. In an effort to compute lower uncertainties everywhere in the problem, Larsen's group has also developed several methods to help distribute particles more evenly, based on forward estimates of flux. This paper focuses on the use of a forward estimate to weight the placement of the source in the adjoint calculation used by CADIS, which we refer to as a forward-weighted CADIS (FW-CADIS).
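
    For orientation, the standard CADIS relations (paraphrased from the general variance-reduction literature, with notation chosen here for illustration) define the response estimate, the biased source, and the target weights from the adjoint flux; FW-CADIS differs mainly in weighting the adjoint source by the inverse of a forward flux estimate so that low-flux regions receive more importance.

```latex
% q is the true source and \phi^{\dagger} the adjoint flux for the response of interest.
R \;=\; \int_V\!\int_E q(\vec r,E)\,\phi^{\dagger}(\vec r,E)\,\mathrm{d}E\,\mathrm{d}V,
\qquad
\hat q(\vec r,E) \;=\; \frac{q(\vec r,E)\,\phi^{\dagger}(\vec r,E)}{R},
\qquad
\bar w(\vec r,E) \;=\; \frac{R}{\phi^{\dagger}(\vec r,E)}.
% FW-CADIS (mesh tally of flux): adjoint source taken as
% q^{\dagger}(\vec r,E) \propto 1/\phi_{\mathrm{fwd}}(\vec r,E).
```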

  5. Diagnosing Crime and Diagnosing Disease: Bias Reduction Strategies in the Forensic and Clinical Sciences.

    PubMed

    Lockhart, Joseph J; Satya-Murti, Saty

    2017-11-01

    Cognitive effort is an essential part of both forensic and clinical decision-making. Errors occur in both fields because the cognitive process is complex and prone to bias. We performed a selective review of full-text English language literature on cognitive bias leading to diagnostic and forensic errors. Earlier work (1970-2000) concentrated on classifying and raising bias awareness. Recently (2000-2016), the emphasis has shifted toward strategies for "debiasing." While the forensic sciences have focused on the control of misleading contextual cues, clinical debiasing efforts have relied on checklists and hypothetical scenarios. No single generally applicable and effective bias reduction strategy has emerged so far. Generalized attempts at bias elimination have not been particularly successful. It is time to shift focus to the study of errors within specific domains, and how to best communicate uncertainty in order to improve decision making on the part of both the expert and the trier-of-fact. © 2017 American Academy of Forensic Sciences.

  6. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  7. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and hence is far from trivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of EUR in a practical scenario, where two associated nodes of a one-dimensional XXX-spin chain, under an inhomogeneous magnetic field, are linked by thermal entanglement. We show that the temperature and magnetic field effect can lead to the inflation of the measuring uncertainty, stemming from the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system; secondly, the dynamical behaviors of the measuring uncertainty differ markedly between ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measuring uncertainty is dramatically correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measuring uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
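
    The quantum-memory-assisted entropic uncertainty relation that such analyses typically build on (Berta et al.) is, in standard notation (shown for orientation, not quoted from the abstract):

```latex
% Q and R are the two incompatible observables measured on qubit A, B is the quantum memory,
% and c = \max_{i,j} |\langle q_i | r_j \rangle|^{2} is the maximal overlap of their eigenbases.
S(Q|B) + S(R|B) \;\ge\; \log_2\frac{1}{c} + S(A|B)
```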

  8. Developing Hydrogeological Site Characterization Strategies based on Human Health Risk

    NASA Astrophysics Data System (ADS)

    de Barros, F.; Rubin, Y.; Maxwell, R. M.

    2013-12-01

    In order to provide better sustainable groundwater quality management and minimize the impact of contamination on humans, improved understanding and quantification of the interaction between hydrogeological models, geological site information and human health are needed. Considering the joint influence of these components in the overall human health risk assessment, and the corresponding sources of uncertainty, aids decision makers in better allocating resources in data acquisition campaigns. This is important to (1) achieve remediation goals in a cost-effective manner, (2) protect human health and (3) keep water supplies clean in order to comply with quality standards. Such a task is challenging since a full characterization of the subsurface is unfeasible due to financial and technological constraints. In addition, human exposure and physiological response to contamination are subject to uncertainty and variability. Normally, sampling strategies are developed with the goal of reducing uncertainty, but less often they are developed in the context of their impacts on the overall system uncertainty. Therefore, quantifying the impact from each of these components (hydrogeological, behavioral and physiological) in final human health risk prediction can provide guidance for decision makers to best allocate resources towards minimal prediction uncertainty. In this presentation, a multi-component human health risk-based framework is presented which allows decision makers to set priorities through an information entropy-based visualization tool. Results highlight the role of characteristic length-scales characterizing flow and transport in determining data needs within an integrated hydrogeological-health framework. Conditions where uncertainty reduction in human health risk predictions may benefit from better understanding of the health component, as opposed to a more detailed hydrogeological characterization, are also discussed. Finally, results illustrate how different dose-response models can impact the probability of human health risk exceeding a regulatory threshold.
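
    A compact Monte Carlo sketch of the kind of end-to-end exceedance calculation described here (the distributions and the linear risk chain are illustrative assumptions, not the framework of the presentation):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20_000

# Assumed sources of uncertainty along the risk chain (all values illustrative)
conc = rng.lognormal(mean=np.log(5e-3), sigma=0.8, size=n)     # concentration at the well (mg/L)
intake = rng.lognormal(mean=np.log(0.03), sigma=0.3, size=n)   # exposure (L/kg/day equivalent)
slope = rng.lognormal(mean=np.log(1.5), sigma=0.5, size=n)     # dose-response slope ((mg/kg/day)^-1)

risk = conc * intake * slope                                   # lifetime excess risk, linear model
threshold = 1e-4
print("P(risk exceeds threshold):", np.mean(risk > threshold))

# Crude attribution: squared correlation of log-risk with each log-component
for name, v in [("concentration", conc), ("exposure", intake), ("dose-response", slope)]:
    print(name, round(np.corrcoef(np.log(v), np.log(risk))[0, 1] ** 2, 2))
```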

  9. A Comparative Study of Uncertainty Reduction Theory in High- and Low-Context Cultures.

    ERIC Educational Resources Information Center

    Kim, Myoung-Hye; Yoon, Tae-Jin

    To test the cross-cultural validity of uncertainty reduction theory, a study was conducted using students from South Korea and the United States who were chosen to represent high- and low-context cultures respectively. Uncertainty reduction theory is based upon the assumption that the primary concern of strangers upon meeting is one of uncertainty…

  10. Multi-objective Extremum Seeking Control for Enhancement of Wind Turbine Power Capture with Load Reduction

    NASA Astrophysics Data System (ADS)

    Xiao, Yan; Li, Yaoyu; Rotea, Mario A.

    2016-09-01

    The primary objective in below rated wind speed (Region 2) is to maximize the turbine's energy capture. Due to uncertainty, variability of turbine characteristics and lack of inexpensive but precise wind measurements, model-free control strategies that do not use wind measurements such as Extremum Seeking Control (ESC) have received significant attention. Based on a dither-demodulation scheme, ESC can maximize the wind power capture in real time despite uncertainty, variabilities and lack of accurate wind measurements. The existing work on ESC based wind turbine control focuses on power capture only. In this paper, a multi-objective extremum seeking control strategy is proposed to achieve nearly optimum wind energy capture while decreasing structural fatigue loads. The performance index of the ESC combines the rotor power and penalty terms of the standard deviations of selected fatigue load variables. Simulation studies of the proposed multi-objective ESC demonstrate that the damage-equivalent loads of tower and/or blade loads can be reduced with slight compromise in energy capture.
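
    A minimal sketch of the dither-demodulation loop at the core of ESC is given below; the gains, filter constant and toy power map are illustrative assumptions, not the multi-objective controller of the paper.

```python
import numpy as np

def extremum_seeking(J, theta0, a=0.05, omega=0.8, k=0.5, lp=0.99, n_steps=5000, dt=0.1):
    """Minimal discrete-time dither-demodulation ESC loop (illustrative gains).
    J: unknown performance map to maximize (e.g., rotor power vs. a torque-gain setting)."""
    theta_hat, y_mean = theta0, J(theta0)
    for i in range(n_steps):
        t = i * dt
        theta = theta_hat + a * np.sin(omega * t)     # inject dither on the setpoint
        y = J(theta)                                  # measure the objective
        y_mean = lp * y_mean + (1 - lp) * y           # slow low-pass estimate of the mean
        grad_est = (y - y_mean) * np.sin(omega * t)   # demodulation ~ local gradient
        theta_hat += dt * k * grad_est                # integrate to climb the gradient
    return theta_hat

# Toy concave "power" map with its optimum at theta = 2.0 (illustrative)
print(extremum_seeking(lambda th: -(th - 2.0) ** 2, theta0=0.5))
```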

  11. Public perceptions of hurricane modification.

    PubMed

    Klima, Kelly; Bruine de Bruin, Wändi; Morgan, M Granger; Grossmann, Iris

    2012-07-01

    If hurricane modification were to become a feasible strategy for potentially reducing hurricane damages, it would likely generate public discourse about whether to support its implementation. To facilitate an informed and constructive discourse, policymakers need to understand how people perceive hurricane modification. Here, we examine Florida residents' perceptions of hurricane modification techniques that aim to alter path and wind speed. Following the mental models approach, we conducted a survey study about public perceptions of hurricane modification that was guided by formative interviews on the topic. We report a set of four primary findings. First, hurricane modification was perceived as a relatively ineffective strategy for damage reduction, compared to other strategies for damage reduction. Second, hurricane modification was expected to lead to changes in projected hurricane path, but not necessarily to the successful reduction of projected hurricane strength. Third, more anger was evoked when a hurricane was described as having changed from the initially forecasted path or strength after an attempted modification. Fourth, unlike what we expected, participants who more strongly agreed with statements that recognized the uncertainty inherent in forecasts reported more rather than less anger at scientists across hurricane modification scenarios. If the efficacy of intensity-reduction techniques can be increased, people may be willing to support hurricane modification. However, such an effort would need to be combined with open and honest communications to members of the general public. © 2011 Society for Risk Analysis.

  12. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    PubMed

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas, such as gas stations. As a key issue, reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that uncertainties associated with soil sampling are much higher than those with chemical analyses, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and distortion of the reliability of environmental site assessments. In this research, uncertainties associated with soil sampling and sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The research results showed that the TPH mass assessed by sampling tends to be overestimated and sampling errors are high, especially for the low range of TPH concentrations. Homogenization of soil was found to be an efficient method to suppress uncertainty, but high-resolution sampling could be an essential way to minimize this.

  13. Efficient experimental design for uncertainty reduction in gene regulatory networks.

    PubMed

    Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R

    2015-01-01

    An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
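
    A toy numerical sketch of the MOCU-based selection rule may help fix the idea; the four-model uncertainty class, cost matrix and 'experiments' (which simply reveal which subset of the class contains the true model) are hand-made illustrations, not a real gene regulatory network.

```python
import numpy as np

# Uncertainty class: 4 candidate models with prior probabilities, 3 candidate interventions,
# and a cost matrix cost[model, intervention] (all values illustrative).
prior = np.array([0.4, 0.3, 0.2, 0.1])
cost = np.array([[2., 5., 6.],
                 [5., 1., 6.],
                 [4., 6., 2.],
                 [3., 4., 5.]])

def mocu(p, cost):
    robust = np.argmin(p @ cost)                     # intervention minimizing expected cost
    return np.sum(p * (cost[:, robust] - cost.min(axis=1)))

def expected_remaining_mocu(p, cost, subset):
    """Expected MOCU after an experiment revealing whether the true model is in `subset`."""
    out = np.setdiff1d(np.arange(len(p)), subset)
    remaining = 0.0
    for idx in (subset, out):
        p_outcome = p[idx].sum()
        if p_outcome > 0:
            cond = np.zeros_like(p)
            cond[idx] = p[idx] / p_outcome           # posterior over models for this outcome
            remaining += p_outcome * mocu(cond, cost)
    return remaining

experiments = {"E1": np.array([0, 1]), "E2": np.array([0, 2])}
ranking = {name: round(expected_remaining_mocu(prior, cost, s), 3) for name, s in experiments.items()}
print("conduct first the experiment with the smallest value:", ranking)
```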

  14. Efficient experimental design for uncertainty reduction in gene regulatory networks

    PubMed Central

    2015-01-01

    Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515

  15. The Cost-Effectiveness of Surgical Fixation of Distal Radial Fractures: A Computer Model-Based Evaluation of Three Operative Modalities.

    PubMed

    Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena

    2018-02-07

    There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at the patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated (i.e., produced greater QALYs at lower costs than) open reduction and internal fixation and dominated external fixation. From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY and external fixation was dominated. In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data produces difficulty in distinguishing either strategy as being more cost-effective overall and thus it may be left to surgeon and patient shared decision-making. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.

  16. Integrating non-animal test information into an adaptive testing strategy - skin sensitization proof of concept case.

    PubMed

    Jaworska, Joanna; Harol, Artsiom; Kern, Petra S; Gerberick, G Frank

    2011-01-01

    There is an urgent need to develop data integration and testing strategy frameworks allowing interpretation of results from animal alternative test batteries. To this end, we developed a Bayesian Network Integrated Testing Strategy (BN ITS) with the goal to estimate skin sensitization hazard as a test case of previously developed concepts (Jaworska et al., 2010). The BN ITS combines in silico, in chemico, and in vitro data related to skin penetration, peptide reactivity, and dendritic cell activation, and guides testing strategy by Value of Information (VoI). The approach offers novel insights into testing strategies: there is no one best testing strategy, but the optimal sequence of tests depends on information at hand, and is chemical-specific. Thus, a single generic set of tests as a replacement strategy is unlikely to be most effective. BN ITS offers the possibility of evaluating the impact of generating additional data on the target information uncertainty reduction before testing is commenced.

  17. Novel health economic evaluation of a vaccination strategy to prevent HPV-related diseases: the BEST study.

    PubMed

    Favato, Giampiero; Baio, Gianluca; Capone, Alessandro; Marcellusi, Andrea; Costa, Silvano; Garganese, Giorgia; Picardo, Mauro; Drummond, Mike; Jonsson, Bengt; Scambia, Giovanni; Zweifel, Peter; Mennini, Francesco S

    2012-12-01

    The development of human papillomavirus (HPV)-related diseases is not understood perfectly and uncertainties associated with commonly utilized probabilistic models must be considered. The study assessed the cost-effectiveness of a quadrivalent-based multicohort HPV vaccination strategy within a Bayesian framework. A full Bayesian multicohort Markov model was used, in which all unknown quantities were associated with suitable probability distributions reflecting the state of currently available knowledge. These distributions were informed by observed data or expert opinion. The model cycle lasted 1 year, whereas the follow-up time horizon was 90 years. Precancerous cervical lesions, cervical cancers, and anogenital warts were considered as outcomes. The base case scenario (2 cohorts of girls aged 12 and 15 y) and other multicohort vaccination strategies (additional cohorts aged 18 and 25 y) were cost-effective, with a discounted cost per quality-adjusted life-year gained that corresponded to €12,013, €13,232, and €15,890 for vaccination programs based on 2, 3, and 4 cohorts, respectively. With multicohort vaccination strategies, the reduction in the number of HPV-related events occurred earlier (range, 3.8-6.4 y) when compared with a single cohort. The analysis of the expected value of information showed that the results of the model were subject to limited uncertainty (cost per patient = €12.6). This methodological approach is designed to incorporate the uncertainty associated with HPV vaccination. Modeling the cost-effectiveness of a multicohort vaccination program with Bayesian statistics confirmed the value for money of quadrivalent-based HPV vaccination. The expected value of information gave the most appropriate and feasible representation of the true value of this program.

  18. Impact of five tobacco endgame strategies on future smoking prevalence, population health and health system costs: two modelling studies to inform the tobacco endgame.

    PubMed

    van der Deen, Frederieke S; Wilson, Nick; Cleghorn, Christine L; Kvizhinadze, Giorgi; Cobiac, Linda J; Nghiem, Nhung; Blakely, Tony

    2018-05-01

    There is growing international interest in advancing 'the tobacco endgame'. We use New Zealand (Smokefree goal for 2025) as a case study to model the impacts on smoking prevalence (SP), health gains (quality-adjusted life-years (QALYs)) and cost savings of (1) 10% annual tobacco tax increases, (2) a tobacco-free generation (TFG), (3) a substantial outlet reduction strategy, (4) a sinking lid on tobacco supply and (5) a combination of 1, 2 and 3. Two models were used: (1) a dynamic population forecasting model for SP and (2) a closed cohort (population alive in 2011) multistate life table model (including 16 tobacco-related diseases) for health gains and costs. All selected tobacco endgame strategies were associated with reductions in SP by 2025, down from 34.7%/14.1% for Māori (indigenous population)/non-Māori in 2011 to 16.0%/6.8% for tax increases; 11.2%/5.6% for the TFG; 17.8%/7.3% for the outlet reduction; 0% for the sinking lid; and 9.3%/4.8% for the combined strategy. Major health gains accrued over the remainder of the 2011 population's lives, ranging from 28 900 QALYs (95% Uncertainty Interval (UI): 16 500 to 48 200; outlet reduction) to 282 000 QALYs (95% UI: 189 000 to 405 000; sinking lid) compared with business-as-usual (3% discounting). The timing of health gain and cost savings greatly differed for the various strategies (with accumulated health gain peaking in 2040 for the sinking lid and 2070 for the TFG). Implementing endgame strategies is needed to achieve tobacco endgame targets and reduce inequalities in smoking. Given such strategies are new, modelling studies provide provisional information on what approaches may be best. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Decentralized Control of Sound Radiation from an Aircraft-Style Panel Using Iterative Loop Recovery

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Cabell, Randolph H.; Fuller, Chris R.

    2008-01-01

    A decentralized LQG-based control strategy is designed to reduce low-frequency sound transmission through periodically stiffened panels. While modern control strategies have been used to reduce sound radiation from relatively simple structural acoustic systems, significant implementation issues have to be addressed before these control strategies can be extended to large systems such as the fuselage of an aircraft. For instance, centralized approaches typically require a high level of connectivity and are computationally intensive, while decentralized strategies face stability problems caused by the unmodeled interaction between neighboring control units. Since accurate uncertainty bounds are not known a priori, it is difficult to ensure the decentralized control system will be robust without making the controller overly conservative. Therefore an iterative approach is suggested, which utilizes frequency-shaped loop recovery. The approach accounts for modeling error introduced by neighboring control loops, requires no communication between subsystems, and is relatively simple. The control strategy is validated using real-time control experiments performed on a built-up aluminum test structure representative of the fuselage of an aircraft. Experiments demonstrate that the iterative approach is capable of achieving 12 dB peak reductions and a 3.6 dB integrated reduction in radiated sound power from the stiffened panel.

  20. Model-based adaptive sliding mode control of the subcritical boiler-turbine system with uncertainties.

    PubMed

    Tian, Zhen; Yuan, Jingqi; Xu, Liang; Zhang, Xiang; Wang, Jingcheng

    2018-05-25

    As higher requirements are proposed for the load regulation and efficiency enhancement, the control performance of boiler-turbine systems has become much more important. In this paper, a novel robust control approach is proposed to improve the coordinated control performance for subcritical boiler-turbine units. To capture the key features of the boiler-turbine system, a nonlinear control-oriented model is established and validated with the history operation data of a 300 MW unit. To achieve system linearization and decoupling, an adaptive feedback linearization strategy is proposed, which could asymptotically eliminate the linearization error caused by the model uncertainties. Based on the linearized boiler-turbine system, a second-order sliding mode controller is designed with the super-twisting algorithm. Moreover, the closed-loop system is proved robustly stable with respect to uncertainties and disturbances. Simulation results are presented to illustrate the effectiveness of the proposed control scheme, which achieves excellent tracking performance, strong robustness and chattering reduction. Copyright © 2018. Published by Elsevier Ltd.
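
    For reference, the second-order sliding mode law referred to here is the standard super-twisting algorithm; its generic form is shown below for orientation (the paper's sliding variable and gain design are not reproduced).

```latex
% s is the sliding variable and k_1, k_2 > 0 are controller gains; the sign function
% enters only through the integral term, which is what attenuates chattering.
u = -k_1\,|s|^{1/2}\,\mathrm{sign}(s) + v, \qquad \dot v = -k_2\,\mathrm{sign}(s)
```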

  1. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
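
    A deliberately simplified sketch of the 'validate first, then update' idea on a scalar toy problem is shown below; the linear stand-in model, segment bounds and error threshold are assumptions made for illustration, not the rotorcraft hub application.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(theta, x):                      # stand-in for the structure-material model
    return theta * x

x_obs = np.linspace(1.0, 10.0, 40)
y_obs = 2.0 * x_obs + rng.normal(0.0, 0.5, x_obs.size)
y_obs[x_obs > 7.0] += 3.0                 # deliberate model-form error in the upper segment

# Discretize the observation domain into segments and keep only those where a crude
# validation check (error of a nominal prediction) deems the model reliable.
segments = [(1.0, 4.0), (4.0, 7.0), (7.0, 10.0)]
mask = np.zeros(x_obs.size, dtype=bool)
for lo, hi in segments:
    in_seg = (x_obs >= lo) & (x_obs < hi)
    if np.mean(np.abs(y_obs[in_seg] - model(2.0, x_obs[in_seg]))) < 1.0:
        mask |= in_seg

# Grid-based Bayesian update of theta using only the validated segments
theta_grid = np.linspace(1.0, 3.0, 501)
loglik = np.array([-0.5 * np.sum((y_obs[mask] - model(t, x_obs[mask])) ** 2) / 0.5 ** 2
                   for t in theta_grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()
print("posterior mean of theta:", float(np.sum(theta_grid * post)))
```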

  2. Expert elicitation survey on future wind energy costs

    DOE PAGES

    Wiser, Ryan; Jenni, Karen; Seel, Joachim; ...

    2016-09-12

    Wind energy supply has grown rapidly over the last decade. However, the long-term contribution of wind to future energy supply, and the degree to which policy support is necessary to motivate higher levels of deployment, depends - in part - on the future costs of both onshore and offshore wind. In this paper, we summarize the results of an expert elicitation survey of 163 of the world's foremost wind experts, aimed at better understanding future costs and technology advancement possibilities. Results suggest significant opportunities for cost reductions, but also underlying uncertainties. Under the median scenario, experts anticipate 24-30% reductions by 2030 and 35-41% reductions by 2050 across the three wind applications studied. Costs could be even lower: experts predict a 10% chance that reductions will be more than 40% by 2030 and more than 50% by 2050. Insights gained through expert elicitation complement other tools for evaluating cost-reduction potential, and help inform policy and planning, R & D and industry strategy.

  5. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
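
    The combination step described above, weighting BMD draws from alternative dose-response models by their posterior model probabilities and reading the BMDL off the mixed distribution, can be sketched in a few lines. This assumes posterior BMD samples and model weights are already available from an MCMC fit; the numbers are illustrative placeholders rather than the TCDD results.

      import numpy as np

      rng = np.random.default_rng(0)

      # Placeholder posterior BMD samples from two fitted dose-response models;
      # in practice these would come from MCMC fits of logistic and quantal-linear models.
      bmd_logistic = rng.lognormal(mean=np.log(1.0), sigma=0.30, size=20_000)
      bmd_quantal_linear = rng.lognormal(mean=np.log(0.8), sigma=0.45, size=20_000)

      # Posterior model probability of the logistic model (placeholder) used as the BMA weight.
      w_logistic = 0.6

      # Bayesian model averaging: each pooled draw comes from a model chosen with
      # probability equal to its posterior weight.
      pick = rng.random(20_000) < w_logistic
      bmd_bma = np.where(pick, bmd_logistic, bmd_quantal_linear)

      bmd_central = np.median(bmd_bma)       # central BMD estimate
      bmdl = np.percentile(bmd_bma, 5)       # BMDL as the 5th percentile
      width = np.percentile(bmd_bma, 95) - bmdl
      print(f"BMD = {bmd_central:.2f}, BMDL = {bmdl:.2f}, 90% interval width = {width:.2f}")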

  6. Improving labeling efficiency in automatic quality control of MRSI data.

    PubMed

    Pedrosa de Barros, Nuno; McKinley, Richard; Wiest, Roland; Slotboom, Johannes

    2017-12-01

    To improve the efficiency of the labeling task in automatic quality control of MR spectroscopy imaging data. 28,432 short and long echo time (TE) spectra (1.5 tesla; point resolved spectroscopy (PRESS); repetition time (TR) = 1,500 ms) from 18 different brain tumor patients were labeled by two experts as either accept or reject, depending on their quality. For each spectrum, 47 signal features were extracted. The data were then used to run several simulations and test an active learning approach using uncertainty sampling. The performance of the classifiers was evaluated as a function of the number of patients in the training set, the number of spectra in the training set, and a parameter α used to control the level of classification uncertainty required for a new spectrum to be selected for labeling. The results showed that the proposed strategy allows reductions of up to 72.97% for short TE and 62.09% for long TE in the amount of data that needs to be labeled, without significant impact on classification accuracy. Further reductions are possible with significant but minimal impact on performance. Active learning using uncertainty sampling is an effective way to increase the labeling efficiency for training automatic quality control classifiers. Magn Reson Med 78:2399-2405, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
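
    The uncertainty-sampling loop itself is simple: train on the labeled spectra, score the unlabeled pool, and request labels only for spectra whose predicted class probability is close to 0.5 (a margin parameter below stands in for the paper's α). The following is a generic sketch using a logistic-regression classifier on placeholder features; it is not the authors' pipeline or feature set.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)

      # Placeholder data: 47 signal features per spectrum, binary accept/reject labels.
      X = rng.normal(size=(2000, 47))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

      labeled = list(range(100))       # spectra labeled so far
      pool = list(range(100, 2000))    # unlabeled pool
      alpha = 0.15                     # request a label only if |p - 0.5| < alpha

      for _ in range(10):              # a few active-learning rounds
          clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
          proba = clf.predict_proba(X[pool])[:, 1]
          uncertain = [i for i, p in zip(pool, proba) if abs(p - 0.5) < alpha]
          request = uncertain[:50]     # send up to 50 uncertain spectra to the expert
          labeled += request
          requested = set(request)
          pool = [i for i in pool if i not in requested]

      print(f"labeled {len(labeled)} of 2000 spectra")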

  7. The role of uncertainty and reward on eye movements in a virtual driving task

    PubMed Central

    Sullivan, Brian T.; Johnson, Leif; Rothkopf, Constantin A.; Ballard, Dana; Hayhoe, Mary

    2012-01-01

    Eye movements during natural tasks are well coordinated with ongoing task demands and many variables could influence gaze strategies. Sprague and Ballard (2003) proposed a gaze-scheduling model that uses a utility-weighted uncertainty metric to prioritize fixations on task-relevant objects and predicted that human gaze should be influenced by both reward structure and task-relevant uncertainties. To test this conjecture, we tracked the eye movements of participants in a simulated driving task where uncertainty and implicit reward (via task priority) were varied. Participants were instructed to simultaneously perform a Follow Task where they followed a lead car at a specific distance and a Speed Task where they drove at an exact speed. We varied implicit reward by instructing the participants to emphasize one task over the other and varied uncertainty in the Speed Task with the presence or absence of uniform noise added to the car's velocity. Subjects' gaze data were classified for the image content near fixation and segmented into looks. Gaze measures, including look proportion, duration and interlook interval, showed that drivers more closely monitor the speedometer if it had a high level of uncertainty, but only if it was also associated with high task priority or implicit reward. The interaction observed appears to be an example of a simple mechanism whereby the reduction of visual uncertainty is gated by behavioral relevance. This lends qualitative support for the primary variables controlling gaze allocation proposed in the Sprague and Ballard model. PMID:23262151
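
    The scheduling rule being tested here, fixate the task whose uncertainty, weighted by its reward or priority, is currently largest, can be written down compactly. The following is a simplified sketch under assumed noise-growth and reset dynamics, not the Sprague and Ballard model itself.

      # Two task-relevant variables: following distance and speed.
      priority = {"follow": 1.0, "speed": 2.0}        # implicit reward (assumed weights)
      noise_growth = {"follow": 0.02, "speed": 0.08}  # variance growth per step when unobserved
      variance = {"follow": 0.0, "speed": 0.0}

      looks = []
      for t in range(200):
          # Utility-weighted uncertainty for each task.
          score = {k: priority[k] * variance[k] for k in variance}
          target = max(score, key=score.get)          # fixate the highest-scoring target
          looks.append(target)
          for k in variance:                          # uncertainty grows while unobserved...
              variance[k] += noise_growth[k]
          variance[target] = 0.0                      # ...and is reset by a fixation

      print({k: round(looks.count(k) / len(looks), 2) for k in priority})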

  8. Optimization Under Uncertainty for Wake Steering Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N.

    Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
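
    The OUU formulation replaces the deterministic objective (power at the commanded yaw angle) with an expectation over yaw-angle error, typically estimated by Monte Carlo or quadrature inside the optimizer. Below is a minimal sketch of that idea for a toy two-turbine power model; the power function and the assumed 5-degree yaw uncertainty are stand-ins, not the wind-farm model used in this work.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def farm_power(yaw_deg):
          """Toy two-turbine power model (stand-in): yawing the upstream turbine costs it
          some power but deflects its wake away from the downstream turbine."""
          upstream = np.cos(np.radians(yaw_deg)) ** 3            # cosine-cubed yaw loss
          wake_deficit = 0.4 * np.exp(-(yaw_deg / 15.0) ** 2)    # deficit shrinks as the wake is steered away
          downstream = 1.0 - wake_deficit
          return upstream + downstream

      def expected_power(yaw_deg, sigma_deg=5.0, n=2000, seed=0):
          """Expectation over Gaussian yaw-angle error, estimated by Monte Carlo with
          common random numbers (fixed seed) so the optimizer sees a smooth objective."""
          rng = np.random.default_rng(seed)
          realized = yaw_deg + rng.normal(0.0, sigma_deg, size=n)
          return farm_power(realized).mean()

      det = minimize_scalar(lambda y: -farm_power(y), bounds=(0, 40), method="bounded")
      ouu = minimize_scalar(lambda y: -expected_power(y), bounds=(0, 40), method="bounded")
      print(f"deterministic optimum: {det.x:.1f} deg, OUU optimum: {ouu.x:.1f} deg")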

  9. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  10. Optimization Under Uncertainty for Wake Steering Strategies

    NASA Astrophysics Data System (ADS)

    Quick, Julian; Annoni, Jennifer; King, Ryan; Dykes, Katherine; Fleming, Paul; Ning, Andrew

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as “wake steering,” in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  11. Optimization Under Uncertainty for Wake Steering Strategies

    DOE PAGES

    Quick, Julian; Annoni, Jennifer; King, Ryan N.; ...

    2017-06-13

    Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  12. Dynamic ambulance reallocation for the reduction of ambulance response times using system status management.

    PubMed

    Lam, Sean Shao Wei; Zhang, Ji; Zhang, Zhong Cheng; Oh, Hong Choon; Overton, Jerry; Ng, Yih Yng; Ong, Marcus Eng Hock

    2015-02-01

    Dynamically reassigning ambulance deployment locations throughout a day to balance ambulance availability and demands can be effective in reducing response times. The objectives of this study were to model dynamic ambulance allocation plans in Singapore based on the system status management (SSM) strategy and to evaluate the dynamic deployment plans using a discrete event simulation (DES) model. The geographical information system-based analysis and mathematical programming were used to develop the dynamic ambulance deployment plans for SSM based on ambulance calls data from January 1, 2011, to June 30, 2011. A DES model that incorporated these plans was used to compare the performance of the dynamic SSM strategy against static reallocation policies under various demands and travel time uncertainties. When the deployment plans based on the SSM strategy were followed strictly, the DES model showed that the geographical information system-based plans resulted in approximately 13-second reduction in the median response times compared to the static reallocation policy, whereas the mathematical programming-based plans resulted in approximately a 44-second reduction. The response times and coverage performances were still better than the static policy when reallocations happened for only 60% of all the recommended moves. Dynamically reassigning ambulance deployment locations based on the SSM strategy can result in superior response times and coverage performance compared to static reallocation policies even when the dynamic plans were not followed strictly. Copyright © 2014 Elsevier Inc. All rights reserved.
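
    The core comparison, a fixed deployment plan versus one that is periodically repositioned against the current demand pattern, can be illustrated with a much simpler Monte Carlo than the full discrete event simulation used in the study. The sketch below uses straight-line distance as a proxy for response time and a toy demand pattern that shifts during the day; it is not the Singapore model.

      import numpy as np

      rng = np.random.default_rng(2)

      def demand_center(hour):
          # Toy demand pattern: calls cluster downtown by day and in the suburbs at night.
          return np.array([0.7, 0.7]) if 8 <= hour < 20 else np.array([0.2, 0.2])

      def simulate(dynamic, n_days=200, calls_per_hour=5, n_amb=4):
          static_posts = rng.random((n_amb, 2))            # fixed deployment plan
          responses = []
          for _ in range(n_days):
              for hour in range(24):
                  # Dynamic (SSM-style) plan: redeploy posts around the current demand center.
                  posts = (demand_center(hour) + 0.1 * rng.standard_normal((n_amb, 2))
                           if dynamic else static_posts)
                  calls = demand_center(hour) + 0.15 * rng.standard_normal((calls_per_hour, 2))
                  # Response proxy: distance from each call to its nearest post.
                  d = np.linalg.norm(calls[:, None, :] - posts[None, :, :], axis=2)
                  responses.extend(d.min(axis=1))
          return np.median(responses)

      print("static median response proxy :", round(simulate(False), 3))
      print("dynamic median response proxy:", round(simulate(True), 3))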

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Heng, E-mail: hengli@mdanderson.org; Zhu, X. Ronald; Zhang, Xiaodong

    Purpose: To develop and validate a novel delivery strategy for reducing the respiratory motion–induced dose uncertainty of spot-scanning proton therapy. Methods and Materials: The spot delivery sequence was optimized to reduce dose uncertainty. The effectiveness of the delivery sequence optimization was evaluated using measurements and patient simulation. One hundred ninety-one 2-dimensional measurements using different delivery sequences of a single-layer uniform pattern were obtained with a detector array on a 1-dimensional moving platform. Intensity modulated proton therapy plans were generated for 10 lung cancer patients, and dose uncertainties for different delivery sequences were evaluated by simulation. Results: Without delivery sequence optimization, the maximum absolute dose error can be up to 97.2% in a single measurement, whereas the optimized delivery sequence results in a maximum absolute dose error of ≤11.8%. In patient simulation, the optimized delivery sequence reduces the mean of fractional maximum absolute dose error compared with the regular delivery sequence by 3.3% to 10.6% (32.5-68.0% relative reduction) for different patients. Conclusions: Optimizing the delivery sequence can reduce dose uncertainty due to respiratory motion in spot-scanning proton therapy, assuming the 4-dimensional CT is a true representation of the patients' breathing patterns.

  14. Unrealized Global Temperature Increase: Implications of Current Uncertainties

    NASA Astrophysics Data System (ADS)

    Schwartz, Stephen E.

    2018-04-01

    Unrealized increase in global mean surface air temperature (GMST) may result from the climate system not being in steady state with forcings and/or from cessation of negative aerosol forcing that would result from decreases in emissions. An observation-constrained method is applied to infer the dependence of Earth's climate sensitivity on forcing by anthropogenic aerosols within the uncertainty on that forcing given by the Fifth (2013) Assessment Report of the Intergovernmental Panel on Climate Change. Within these uncertainty ranges the increase in GMST due to temperature lag for future forcings held constant is slight (0.09-0.19 K over 20 years; 0.12-0.26 K over 100 years). However, the incremental increase in GMST that would result from a hypothetical abrupt cessation of sources of aerosols could be quite large but is highly uncertain, 0.1-1.3 K over 20 years. Decrease in CO2 abundance and forcing following abrupt cessation of emissions would offset these increases in GMST over 100 years by as little as 0.09 K to as much as 0.8 K. The uncertainties quantified here greatly limit confidence in projections of change in GMST that would result from any strategy for future reduction of emissions.

  15. Impact of climate change on irrigation management for olive orchards at southern Spain

    NASA Astrophysics Data System (ADS)

    Lorite, Ignacio; Gabaldón-Leal, Clara; Santos, Cristina; Belaj, Angjelina; de la Rosa, Raul; Leon, Lorenzo; Ruiz-Ramos, Margarita

    2017-04-01

    Irrigation management for olive orchards under future weather conditions requires the development of advanced tools that account for the specific physiological and phenological processes affected by the foreseen changes in climate and atmospheric [CO2]. In this study, a new simulation model named AdaptaOlive was used to develop controlled deficit irrigation and full irrigation scheduling for the traditional olive orchards of the Andalusia region (southern Spain) under the projected climate generated by an ensemble of 11 climate models from the ENSEMBLES European project corresponding to the SRES A1B scenario. Irrigation requirements, irrigation water productivity (IWP) and net margin (NM) were evaluated for three periods (baseline, near future and far future) and three irrigation strategies (rainfed, RF; controlled deficit irrigation, CDI; and full irrigation, FI). For irrigation requirements, only a very limited average increase in the far future compared with the baseline period was found (2.6 and 1.3% for CDI and FI, respectively). Similarly, for IWP, significant increases were identified for both irrigation strategies (77.4 and 72.2% for CDI and FI, respectively) due to the large simulated increase in yield. Finally, when net margin was analyzed, the cost of irrigation water was a key factor. For low water costs, FI provided higher net margin values than CDI. However, for high water costs (expected in the future because of the foreseen reduction in rainfall and the increased competition for the available water resources), net margin is reduced significantly, generating a large number of years with negative net margin. All the described results carry a high level of uncertainty, as the projections from the ensemble of 11 climate models show a large spread. For a representative location within the Andalusia region such as Baeza, a reduction of irrigation requirements under the full irrigation strategy of 0.5% was found for the ensemble mean; however, when the individual projections from the 11 climate models were considered, the change in irrigation requirements for the far future relative to the baseline period ranged from an increase of 8.5% to a reduction of 10.7%. This demonstrates the need to consider ensembles of climate models to identify average impacts and the range of variability of those impacts, thereby quantifying the uncertainty in estimates related to future water management. The study concludes that promoting controlled deficit irrigation is an excellent adaptation strategy, but it must be supported by improved farmer training through the implementation of local or regional irrigation advisory services.

  16. Multiple effects and uncertainties of emission control policies in China: Implications for public health, soil acidification, and global temperature.

    PubMed

    Zhao, Yu; McElroy, Michael B; Xing, Jia; Duan, Lei; Nielsen, Chris P; Lei, Yu; Hao, Jiming

    2011-11-15

    Policies to control emissions of criteria pollutants in China may have conflicting impacts on public health, soil acidification, and climate. Two scenarios for 2020, a base case without anticipated control measures and a more realistic case including such controls, are evaluated to quantify the effects of the policies on emissions and resulting environmental outcomes. Large benefits to public health can be expected from the controls, attributed mainly to reduced emissions of primary PM and gaseous PM precursors, and thus lower ambient concentrations of PM2.5. Approximately 4% of all-cause mortality in the country can be avoided (95% confidence interval: 1-7%), particularly in eastern and north-central China, regions with large population densities and high levels of PM2.5. Surface ozone levels, however, are estimated to increase in parts of those regions, despite NOX reductions. This implies VOC-limited conditions. Even with significant reduction of SO2 and NOX emissions, the controls will not significantly mitigate risks of soil acidification, judged by the exceedance levels of critical load (CL). This is due to the decrease in primary PM emissions, with the consequent reduction in deposition of alkaline base cations. Compared to 2005, even larger CL exceedances are found for both scenarios in 2020, implying that PM control may negate any recovery from soil acidification due to SO2 reductions. Noting large uncertainties, current policies to control emissions of criteria pollutants in China will not reduce climate warming, since controlling SO2 emissions also reduces reflective secondary aerosols. Black carbon emissions are an important source of uncertainty concerning the effects of Chinese control policies on global temperature change. Given these conflicts, greater consideration should be paid to reconciling varied environmental objectives, and emission control strategies should target not only criteria pollutants but also species such as VOCs and CO2. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Risk-based prioritization among air pollution control strategies in the Yangtze River Delta, China.

    PubMed

    Zhou, Ying; Fu, Joshua S; Zhuang, Guoshun; Levy, Jonathan I

    2010-09-01

    The Yangtze River Delta (YRD) in China is a densely populated region with recent dramatic increases in energy consumption and atmospheric emissions. We studied how different emission sectors influence population exposures and the corresponding health risks, to inform air pollution control strategy design. We applied the Community Multiscale Air Quality (CMAQ) Modeling System to model the marginal contribution to baseline concentrations from different sectors. We focused on nitrogen oxide (NOx) control while considering other pollutants that affect fine particulate matter [aerodynamic diameter ≤ 2.5 μm (PM2.5)] and ozone concentrations. We developed concentration-response (C-R) functions for PM2.5 and ozone mortality for China to evaluate the anticipated health benefits. In the YRD, health benefits per ton of emission reductions varied significantly across pollutants, with reductions of primary PM2.5 from the industry sector and mobile sources showing the greatest benefits of 0.1 fewer deaths per year per ton of emission reduction. Combining estimates of health benefits per ton with potential emission reductions, the greatest mortality reduction of 12,000 fewer deaths per year [95% confidence interval (CI), 1,200-24,000] was associated with controlling primary PM2.5 emissions from the industry sector and reducing sulfur dioxide (SO2) from the power sector, respectively. Benefits were lower for reducing NOx emissions given lower consequent reductions in the formation of secondary PM2.5 (compared with SO2) and increases in ozone concentrations that would result in the YRD. Although uncertainties related to C-R functions are significant, the estimated health benefits of emission reductions in the YRD are substantial, especially for sectors and pollutants with both higher health benefits per unit emission reductions and large potential for emission reductions.

  18. Projected Impact of Salt Restriction on Prevention of Cardiovascular Disease in China: A Modeling Study

    PubMed Central

    Liu, Jing; Coxson, Pamela G.; Penko, Joanne; Goldman, Lee; Bibbins-Domingo, Kirsten; Zhao, Dong

    2016-01-01

    Objectives To estimate the effects of achieving China’s national goals for dietary salt (NaCl) reduction or implementing culturally-tailored dietary salt restriction strategies on cardiovascular disease (CVD) prevention. Methods The CVD Policy Model was used to project blood pressure lowering and subsequent downstream prevented CVD that could be achieved by population-wide salt restriction in China. Outcomes were annual CVD events prevented, relative reductions in rates of CVD incidence and mortality, quality-adjusted life-years (QALYs) gained, and CVD treatment costs saved. Results Reducing mean dietary salt intake to 9.0 g/day gradually over 10 years could prevent approximately 197 000 incident annual CVD events [95% uncertainty interval (UI): 173 000–219 000], reduce annual CVD mortality by approximately 2.5% (2.2–2.8%), gain 303 000 annual QALYs (278 000–329 000), and save approximately 1.4 billion international dollars (Int$) in annual CVD costs (Int$1.2–1.6 billion). Reducing mean salt intake to 6.0 g/day could approximately double these benefits. Implementing cooking salt-restriction spoons could result in 183 000 fewer incident CVD cases (153 000–215 000) and avoid Int$1.4 billion in CVD treatment costs annually (1.2–1.7 billion). Implementing a cooking salt substitute strategy could lead to approximately three times the health benefits of the salt-restriction spoon program. More than three-quarters of benefits from any dietary salt reduction strategy would be realized in hypertensive adults. Conclusion China could derive substantial health gains from implementation of population-wide dietary salt reduction policies. Most health benefits from any dietary salt reduction program would be realized in adults with hypertension. PMID:26840409

  19. Attentional Mechanisms in Simple Visual Detection: A Speed-Accuracy Trade-Off Analysis

    ERIC Educational Resources Information Center

    Liu, Charles C.; Wolfgang, Bradley J.; Smith, Philip L.

    2009-01-01

    Recent spatial cuing studies have shown that detection sensitivity can be increased by the allocation of attention. This increase has been attributed to one of two mechanisms: signal enhancement or uncertainty reduction. Signal enhancement is an increase in the signal-to-noise ratio at the cued location; uncertainty reduction is a reduction in the…

  20. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    NASA Astrophysics Data System (ADS)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling - High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected by mostly only first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the rates of reactions involving the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its uncertainty bounds. Therefore, this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
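
    The reduction step built on Principal Component Analysis operates on a matrix of local sensitivity coefficients, with one row per target across the canonical cases and one column per reaction; reactions that load strongly on the leading principal components are retained. A generic sketch with a random placeholder sensitivity matrix standing in for real kinetic data is shown below.

      import numpy as np

      rng = np.random.default_rng(3)

      # Placeholder local sensitivity matrix: rows = targets across canonical cases
      # (ignition, propagation, extinction at various T, p, phi), columns = reactions.
      n_targets, n_reactions = 60, 784
      S = rng.normal(size=(n_targets, n_reactions)) * rng.random(n_reactions)

      # Principal components of the column-centered sensitivity matrix via SVD.
      Sc = S - S.mean(axis=0)
      _, svals, Vt = np.linalg.svd(Sc, full_matrices=False)

      # Keep components explaining 99% of the variance, then keep reactions that
      # load appreciably on any retained component.
      var_ratio = np.cumsum(svals**2) / np.sum(svals**2)
      k = int(np.searchsorted(var_ratio, 0.99)) + 1
      loadings = np.abs(Vt[:k]).max(axis=0)
      keep = np.where(loadings > 0.1 * loadings.max())[0]
      print(f"retained {k} components and {keep.size} of {n_reactions} reactions")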

  1. The effectiveness of group-based comprehensive risk-reduction and abstinence education interventions to prevent or reduce the risk of adolescent pregnancy, human immunodeficiency virus, and sexually transmitted infections: two systematic reviews for the Guide to Community Preventive Services.

    PubMed

    Chin, Helen B; Sipe, Theresa Ann; Elder, Randy; Mercer, Shawna L; Chattopadhyay, Sajal K; Jacob, Verughese; Wethington, Holly R; Kirby, Doug; Elliston, Donna B; Griffith, Matt; Chuke, Stella O; Briss, Susan C; Ericksen, Irene; Galbraith, Jennifer S; Herbst, Jeffrey H; Johnson, Robert L; Kraft, Joan M; Noar, Seth M; Romero, Lisa M; Santelli, John

    2012-03-01

    Adolescent pregnancy, HIV, and other sexually transmitted infections (STIs) are major public health problems in the U.S. Implementing group-based interventions that address the sexual behavior of adolescents may reduce the incidence of pregnancy, HIV, and other STIs in this group. Methods for conducting systematic reviews from the Guide to Community Preventive Services were used to synthesize scientific evidence on the effectiveness of two strategies for group-based behavioral interventions for adolescents: (1) comprehensive risk reduction and (2) abstinence education on preventing pregnancy, HIV, and other STIs. Effectiveness of these interventions was determined by reductions in sexual risk behaviors, pregnancy, HIV, and other STIs and increases in protective sexual behaviors. The literature search identified 6579 citations for comprehensive risk reduction and abstinence education. Of these, 66 studies of comprehensive risk reduction and 23 studies of abstinence education assessed the effects of group-based interventions that address the sexual behavior of adolescents, and were included in the respective reviews. Meta-analyses were conducted for each strategy on the seven key outcomes identified by the coordination team: current sexual activity; frequency of sexual activity; number of sex partners; frequency of unprotected sexual activity; use of protection (condoms and/or hormonal contraception); pregnancy; and STIs. The results of these meta-analyses for comprehensive risk reduction showed favorable effects for all of the outcomes reviewed. For abstinence education, the meta-analysis showed a small number of studies, with inconsistent findings across studies that varied by study design and follow-up time, leading to considerable uncertainty around effect estimates. Based on these findings, group-based comprehensive risk reduction was found to be an effective strategy to reduce adolescent pregnancy, HIV, and STIs. No conclusions could be drawn on the effectiveness of group-based abstinence education. Published by Elsevier Inc.
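
    The pooled effects behind these conclusions come from standard random-effects meta-analysis of per-study effect estimates. A minimal DerSimonian-Laird sketch follows, using placeholder log-odds-ratio inputs rather than the review's extracted data.

      import numpy as np

      # Placeholder per-study effect estimates (log odds ratios) and their variances;
      # in the review these would be extracted from the included studies.
      y = np.array([-0.30, -0.10, -0.45, -0.20, 0.05, -0.35])
      v = np.array([0.04, 0.09, 0.06, 0.05, 0.12, 0.07])

      # DerSimonian-Laird estimate of the between-study variance tau^2.
      w = 1.0 / v
      ybar = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - ybar) ** 2)
      C = np.sum(w) - np.sum(w**2) / np.sum(w)
      tau2 = max(0.0, (Q - (len(y) - 1)) / C)

      # Random-effects pooled estimate and 95% confidence interval.
      w_star = 1.0 / (v + tau2)
      pooled = np.sum(w_star * y) / np.sum(w_star)
      se = np.sqrt(1.0 / np.sum(w_star))
      print(f"pooled log OR = {pooled:.3f} "
            f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")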

  2. An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation

    NASA Astrophysics Data System (ADS)

    McLean, N. M.; Bowring, J. F.; Bowring, S. A.

    2011-06-01

    High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of calculated U-Pb dates. As analytical techniques have advanced, formerly small sources of uncertainty are increasingly important, and thus previous simplifications for data reduction and uncertainty propagation are no longer valid. Although notable previous efforts have treated propagation of correlated uncertainties for the U-Pb system, the equations, uncertainties, and correlations have been limited in number and subject to simplification during propagation through intermediary calculations. We derive and present a transparent U-Pb data reduction algorithm that transforms raw isotopic data and measured or assumed laboratory parameters into the isotopic ratios and dates geochronologists interpret without making assumptions about the relative size of sample components. To propagate uncertainties and their correlations, we describe, in detail, a linear algebraic algorithm that incorporates all input uncertainties and correlations without limiting or simplifying covariance terms to propagate them through intermediate calculations. Finally, a weighted mean algorithm is presented that utilizes matrix elements from the uncertainty propagation algorithm to propagate random and systematic uncertainties for data comparison between other U-Pb labs and other geochronometers. The linear uncertainty propagation algorithms are verified with Monte Carlo simulations of several typical analyses. We propose that our algorithms be considered by the community for implementation to improve the collaborative science envisioned by the EARTHTIME initiative.
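
    The heart of the linear-algebraic propagation is the first-order rule cov_out = J cov_in J^T, applied with a full input covariance matrix rather than variances alone. Below is a minimal sketch for a single decay-equation date, t = ln(1 + r)/lambda, with r a radiogenic ratio such as 206Pb*/238U; the numerical values are placeholders, not a real analysis.

      import numpy as np

      # Inputs: measured radiogenic ratio r and decay constant lam, with a covariance
      # matrix (placeholder values; any correlation would come from upstream reduction steps).
      r, lam = 0.0543, 1.55125e-10           # ratio; 238U decay constant (1/yr)
      cov_in = np.array([
          [(r * 0.001) ** 2, 0.0],           # var(r): 0.1% relative uncertainty (placeholder)
          [0.0, (lam * 0.0005) ** 2],        # var(lambda): 0.05% (placeholder)
      ])

      # Date and its Jacobian with respect to (r, lambda).
      t = np.log(1.0 + r) / lam
      J = np.array([[1.0 / ((1.0 + r) * lam), -t / lam]])

      # First-order propagation: var(t) = J cov_in J^T.
      var_t = (J @ cov_in @ J.T)[0, 0]
      print(f"t = {t / 1e6:.3f} Ma +/- {np.sqrt(var_t) / 1e6:.3f} Ma (1 sigma)")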

  3. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  4. Estimating statistical uncertainty of Monte Carlo efficiency-gain in the context of a correlated sampling Monte Carlo code for brachytherapy treatment planning with non-normal dose distribution.

    PubMed

    Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr

    2012-01-01

    Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
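
    The bootstrap procedure itself is generic: resample the per-history contributions, recompute the efficiency gain on each resample, and take the shortest interval containing 95% of the bootstrap replicates. A stripped-down sketch with placeholder data follows; it omits the correlated-sampling specifics of the brachytherapy code.

      import numpy as np

      rng = np.random.default_rng(4)

      # Placeholder per-history scores for the correlated and conventional estimators
      # (a heavy right tail mimics the rare high-weight photons described above).
      corr = rng.lognormal(mean=0.0, sigma=0.4, size=5000)
      conv = rng.lognormal(mean=0.0, sigma=1.2, size=5000)

      def gain(c, u):
          # Efficiency gain taken here as the ratio of estimator variances (equal run time assumed).
          return np.var(u) / np.var(c)

      B = 5000
      boots = np.empty(B)
      for b in range(B):
          boots[b] = gain(rng.choice(corr, corr.size, replace=True),
                          rng.choice(conv, conv.size, replace=True))

      # Shortest interval containing 95% of the bootstrap replicates.
      boots.sort()
      k = int(np.ceil(0.95 * B))
      widths = boots[k - 1:] - boots[:B - k + 1]
      i = int(np.argmin(widths))
      print(f"gain = {gain(corr, conv):.2f}, "
            f"shortest 95% CI = [{boots[i]:.2f}, {boots[i + k - 1]:.2f}]")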

  5. Impact of hydrogeological data on measures of uncertainty, site characterization and environmental performance metrics

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Ezzedine, Souheil; Rubin, Yoram

    2012-02-01

    The significance of conditioning predictions of environmental performance metrics (EPMs) on hydrogeological data in heterogeneous porous media is addressed. Conditioning EPMs on available data reduces uncertainty and increases the reliability of model predictions. We present a rational and concise approach to investigate the impact of conditioning EPMs on data as a function of the location of the environmentally sensitive target receptor, data types and spacing between measurements. We illustrate how the concept of comparative information yield curves introduced in de Barros et al. [de Barros FPJ, Rubin Y, Maxwell R. The concept of comparative information yield curves and its application to risk-based site characterization. Water Resour Res 2009;45:W06401. doi:10.1029/2008WR007324] could be used to assess site characterization needs as a function of flow and transport dimensionality and EPMs. For a given EPM, we show how alternative uncertainty reduction metrics yield distinct gains of information from a variety of sampling schemes. Our results show that uncertainty reduction is EPM dependent (e.g., travel times) and does not necessarily indicate uncertainty reduction in an alternative EPM (e.g., human health risk). The results show how the position of the environmental target, flow dimensionality and the choice of the uncertainty reduction metric can be used to assist in field sampling campaigns.

  6. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    PubMed

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μA) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. ηg (anoxic growth rate correction factor) and ηh (anoxic hydrolysis rate correction factor), becomes less important when an S(NO) controller manipulating an external carbon source addition is implemented.
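
    The SRC step amounts to a linear regression of the Monte Carlo outputs on the standardized inputs; the squared coefficients then apportion the output variance to the extent the model behaves linearly. The following is a generic sketch on a placeholder model, not the BSM1 plant model itself.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)

      # Placeholder uncertain inputs (standing in for ASM1 parameters and influent fractions).
      n = 2000
      X = np.column_stack([
          rng.normal(0.8, 0.08, n),   # e.g. mu_A, autotrophic growth rate
          rng.normal(0.8, 0.10, n),   # e.g. eta_g, anoxic growth correction factor
          rng.normal(0.4, 0.05, n),   # e.g. eta_h, anoxic hydrolysis correction factor
      ])
      # Placeholder output (standing in for the Effluent Quality Index).
      y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * X[:, 2] ** 2 + rng.normal(0, 0.1, n)

      # Standardized regression coefficients: regress standardized output on standardized inputs.
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      ys = (y - y.mean()) / y.std()
      src = LinearRegression().fit(Xs, ys).coef_
      print("SRCs:", np.round(src, 3), " sum of SRC^2:", round(float(np.sum(src**2)), 3))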

  7. How Confident can we be in Flood Risk Assessments?

    NASA Astrophysics Data System (ADS)

    Merz, B.

    2017-12-01

    Flood risk management should be based on risk analyses quantifying the risk and its reduction for different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and their impacts that have not been observed before. Hence, risk analyses are strongly based on assumptions and expert judgement. This situation opens the door for cognitive biases, such as 'illusion of certainty', 'overconfidence', or 'recency bias'. Such biases operate especially in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments, and reflects on more rigorous approaches to their validation.

  8. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies (shaping the market, adapting to it, or reserving the right to play at a later time) can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty, and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  9. Location and Modality Effects in Online Dating: Rich Modality Profile and Location-Based Information Cues Increase Social Presence, While Moderating the Impact of Uncertainty Reduction Strategy.

    PubMed

    Jung, Soyoung; Roh, Soojin; Yang, Hyun; Biocca, Frank

    2017-09-01

    This study investigates how different interface modality features of online dating sites, such as location awareness cues and modality of profiles, affect the sense of social presence of a prospective date. We also examined how various user behaviors aimed at reducing uncertainty about online interactions affect social presence perceptions and are affected by the user interface features. Male users felt a greater sense of social presence when exposed to both location and accessibility cues (geographical proximity) and a richer medium (video profiles). Viewing a richer medium significantly increased the sense of social presence among female participants whereas location-based information sharing features did not directly affect their social presence perception. Augmented social presence, as a mediator, contributed to users' greater intention to meet potential dating partners in a face-to-face setting and to buy paid memberships on online dating sites.

  10. Comparison among different retrofitting strategies for the vulnerability reduction of masonry bell towers

    NASA Astrophysics Data System (ADS)

    Milani, Gabriele; Shehu, Rafael; Valente, Marco

    2017-11-01

    This paper investigates the effectiveness of reducing the seismic vulnerability of masonry towers by means of innovative and traditional strengthening techniques. The strategy followed for providing optimal retrofitting of masonry towers subjected to seismic risk relies on preventing active failure mechanisms. These vulnerable mechanisms are pre-assigned failure patterns based on the crack patterns observed during past seismic events. An upper bound limit analysis strategy is found suitable for simplified tower models, both in their present state and in the proposed retrofitted configurations. Taking into consideration the variability of geometrical features and the uncertainty of the strengthening techniques, Monte Carlo simulations are incorporated into the limit analysis. In this framework, a wide range of idealized cases is covered by the analyses. The retrofitting strategies aim to increase the shear strength and the overturning load carrying capacity in order to reduce vulnerability. This methodology also makes it possible to use different materials that fulfill the structural implementability requirements.

  11. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
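
    The distinguishing feature of DSA is that the sensitivity measure is indexed by how much of an input's uncertainty could realistically be removed, rather than assuming it can all be eliminated. A crude Monte Carlo sketch of that idea, shrinking one input's variance by a chosen fraction and recording the resulting drop in output variance, is shown below for a placeholder model; it is not the implementation developed in the paper.

      import numpy as np

      rng = np.random.default_rng(6)
      z = rng.standard_normal((200_000, 3))      # common random numbers for all evaluations
      base_std = np.array([1.0, 0.8, 0.5])       # placeholder input standard deviations

      def model(x):
          # Placeholder output quantity of interest (not the jetliner design model).
          return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.2 * x[:, 0] * x[:, 2]

      def output_variance(which, reduction):
          """Output variance when a fraction `reduction` of input `which`'s variance is removed."""
          std = base_std.copy()
          std[which] *= np.sqrt(1.0 - reduction)
          return model(z * std).var()

      v_full = model(z * base_std).var()
      for which, name in enumerate(["x1", "x2", "x3"]):
          for reduction in (0.25, 0.5, 1.0):
              drop = 1.0 - output_variance(which, reduction) / v_full
              print(f"{name}: remove {reduction:.0%} of its variance -> "
                    f"output variance falls {drop:.1%}")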

  12. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  13. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory, recognizing and responding to uncertainty, characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective on uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  14. Unrealized Global Temperature Increase: Implications of Current Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Stephen E.

    Unrealized increase in global mean surface air temperature (GMST) may result from the climate system not being in steady state with forcings and/or from cessation of negative aerosol forcing that would result from decreases in emissions. An observation-constrained method is applied to infer the dependence of Earth's climate sensitivity on forcing by anthropogenic aerosols within the uncertainty on that forcing given by the Fifth (2013) Assessment Report of the Intergovernmental Panel on Climate Change. Within these uncertainty ranges the increase in GMST due to temperature lag for future forcings held constant is slight (0.09–0.19 K over 20 years; 0.12–0.26 K over 100 years). However, the incremental increase in GMST that would result from a hypothetical abrupt cessation of sources of aerosols could be quite large, but is highly uncertain, 0.1–1.3 K over 20 years. Decrease in CO2 abundance and forcing following abrupt cessation of emissions would offset these increases in GMST over 100 years by as little as 0.09 K to as much as 0.8 K. The uncertainties quantified here greatly limit confidence in projections of change in GMST that would result from any strategy for future reduction of emissions.

  15. Unrealized Global Temperature Increase: Implications of Current Uncertainties

    DOE PAGES

    Schwartz, Stephen E.

    2018-03-07

    Unrealized increase in global mean surface air temperature (GMST) may result from the climate system not being in steady state with forcings and/or from cessation of negative aerosol forcing that would result from decreases in emissions. An observation-constrained method is applied to infer the dependence of Earth's climate sensitivity on forcing by anthropogenic aerosols within the uncertainty on that forcing given by the Fifth (2013) Assessment Report of the Intergovernmental Panel on Climate Change. Within these uncertainty ranges the increase in GMST due to temperature lag for future forcings held constant is slight (0.09–0.19 K over 20 years; 0.12–0.26 K over 100 years). However, the incremental increase in GMST that would result from a hypothetical abrupt cessation of sources of aerosols could be quite large, but is highly uncertain, 0.1–1.3 K over 20 years. Decrease in CO2 abundance and forcing following abrupt cessation of emissions would offset these increases in GMST over 100 years by as little as 0.09 K to as much as 0.8 K. The uncertainties quantified here greatly limit confidence in projections of change in GMST that would result from any strategy for future reduction of emissions.

  16. Development of perspective-based water management strategies for the Rhine and Meuse basins.

    PubMed

    van Deursen, W P A; Middelkoop, H

    2005-01-01

    Water management is surrounded by uncertainties. Water managers thus have to answer the question: given the uncertainties, what is the best management strategy? This paper describes the application of the perspectives method to water management in the Rhine and Meuse basins. The perspectives method provides a structured framework for analysing water management strategies under uncertainty. Various strategies are clustered into perspectives according to their underlying assumptions. This framework allows for an analysis of current water management strategies, but also for evaluation of the robustness of proposed future water strategies. It becomes clear that no water management strategy is superior to the others, but that inherent choices on risk acceptance and costs pose a real political dilemma that will not be solved by further optimisation.

  17. The Importance of Implementation Strategy in Scaling Up Xpert MTB/RIF for Diagnosis of Tuberculosis in the Indian Health-Care System: A Transmission Model

    PubMed Central

    Salje, Henrik; Andrews, Jason R.; Deo, Sarang; Satyanarayana, Srinath; Sun, Amanda Y.; Pai, Madhukar; Dowdy, David W.

    2014-01-01

    Background India has announced a goal of universal access to quality tuberculosis (TB) diagnosis and treatment. A number of novel diagnostics could help meet this important goal. The rollout of one such diagnostic, Xpert MTB/RIF (Xpert), is being considered, but if Xpert is used mainly for people with HIV or high risk of multidrug-resistant TB (MDR-TB) in the public sector, population-level impact may be limited. Methods and Findings We developed a model of TB transmission, care-seeking behavior, and diagnostic/treatment practices in India and explored the impact of six different rollout strategies. Providing Xpert to 40% of public-sector patients with HIV or prior TB treatment (similar to current national strategy) reduced TB incidence by 0.2% (95% uncertainty range [UR]: −1.4%, 1.7%) and MDR-TB incidence by 2.4% (95% UR: −5.2%, 9.1%) relative to existing practice but required 2,500 additional MDR-TB treatments and 60 four-module GeneXpert systems at maximum capacity. Further including 20% of unselected symptomatic individuals in the public sector required 700 systems and reduced incidence by 2.1% (95% UR: 0.5%, 3.9%); a similar approach involving qualified private providers (providers who have received at least some training in allopathic or non-allopathic medicine) reduced incidence by 6.0% (95% UR: 3.9%, 7.9%) with similar resource outlay, but only if high treatment success was assured. Engaging 20% of all private-sector providers (qualified and informal [providers with no formal medical training]) had the greatest impact (14.1% reduction, 95% UR: 10.6%, 16.9%), but required >2,200 systems and reliable treatment referral. Improving referrals from informal providers for smear-based diagnosis in the public sector (without Xpert rollout) had substantially greater impact (6.3% reduction) than Xpert scale-up within the public sector. These findings are subject to substantial uncertainty regarding private-sector treatment patterns, patient care-seeking behavior, symptoms, and infectiousness over time; these uncertainties should be addressed by future research. Conclusions The impact of new diagnostics for TB control in India depends on implementation within the complex, fragmented health-care system. Transformative strategies will require private/informal-sector engagement, adequate referral systems, improved treatment quality, and substantial resources. PMID:25025235

  18. Uncertainty in life cycle greenhouse gas emissions from United States natural gas end-uses and its effects on policy.

    PubMed

    Venkatesh, Aranya; Jaramillo, Paulina; Griffin, W Michael; Matthews, H Scott

    2011-10-01

    Increasing concerns about greenhouse gas (GHG) emissions in the United States have spurred interest in alternate low carbon fuel sources, such as natural gas. Life cycle assessment (LCA) methods can be used to estimate potential emissions reductions through the use of such fuels. Some recent policies have used the results of LCAs to encourage the use of low carbon fuels to meet future energy demands in the U.S., without, however, acknowledging and addressing the uncertainty and variability prevalent in LCA. Natural gas is a particularly interesting fuel since it can be used to meet various energy demands, for example, as a transportation fuel or in power generation. Estimating the magnitudes and likelihoods of achieving emissions reductions from competing end-uses of natural gas using LCA offers one way to examine optimal strategies of natural gas resource allocation, given that its availability is likely to be limited in the future. In this study, the uncertainty in life cycle GHG emissions of natural gas (domestic and imported) consumed in the U.S. was estimated using probabilistic modeling methods. Monte Carlo simulations are performed to obtain sample distributions representing life cycle GHG emissions from the use of 1 MJ of domestic natural gas and imported LNG. Life cycle GHG emissions per energy unit of average natural gas consumed in the U.S. were found to range between -8 and 9% of the mean value of 66 g CO(2)e/MJ. The probabilities of achieving emissions reductions by using natural gas for transportation and power generation, as a substitute for incumbent fuels such as gasoline, diesel, and coal were estimated. The use of natural gas for power generation instead of coal was found to have the highest and most likely emissions reductions (almost a 100% probability of achieving reductions of 60 g CO(2)e/MJ of natural gas used), while there is a 10-35% probability of the emissions from natural gas being higher than the incumbent if it were used as a transportation fuel. This likelihood of an increase in GHG emissions is indicative of the potential failure of a climate policy targeting reductions in GHG emissions.
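
    A minimal sketch of the kind of Monte Carlo comparison described above, not the study's model: the triangular and normal distributions below and all their parameters are invented for illustration, and the comparison is limited to natural gas displacing coal in power generation.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000   # Monte Carlo draws

      # Hypothetical life cycle emission factors (g CO2e per MJ of natural gas used);
      # the distributions and parameters below are illustrative, not the study's values.
      gas_lc = rng.triangular(left=60.0, mode=66.0, right=72.0, size=n)   # natural gas life cycle
      coal_displaced = rng.normal(loc=126.0, scale=8.0, size=n)           # coal emissions displaced

      reduction = coal_displaced - gas_lc   # emissions reduction per MJ of natural gas
      print(f"P(reduction > 0)   = {np.mean(reduction > 0):.3f}")
      print(f"P(reduction >= 60) = {np.mean(reduction >= 60):.3f}")
      print(f"mean reduction     = {reduction.mean():.1f} g CO2e/MJ")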

  19. Relationships among trust in messages, risk perception, and risk reduction preferences based upon avian influenza in Taiwan.

    PubMed

    Fang, David; Fang, Chen-Ling; Tsai, Bi-Kun; Lan, Li-Chi; Hsu, Wen-Shan

    2012-08-01

    Improvements in communications technology enable consumers to receive information through diverse channels. In the case of avian influenza, information repeated by the mass media socially amplifies consumer awareness of risks. Facing indeterminate risks, consumers may feel anxious and increase their risk perception. When consumers trust the information published by the media, their uncertainty toward avian influenza may decrease. Consumers might take some actions to reduce risk. Therefore, this study focuses on relationships among trust in messages, risk perception and risk reduction preferences. This study administered consumer survey questionnaires to a random sample of 525 consumers in different cities of Taiwan in 2007. Through statistical analysis, the results demonstrate: (1) the higher the trust consumers have in messages about avian influenza, the lower their risk perceptions are; (2) the higher the consumers' risk perceptions are and, therefore, the higher their desired level of risk reduction, the more likely they are to accept risk reduction strategies; (3) consumer attributes such as age, education level, and marital status correlate with significant differences in risk perception and in acceptance of risk reduction preferences. Gender shows significant differences only in risk reduction preferences and not in risk perception.

  20. Relationships among Trust in Messages, Risk Perception, and Risk Reduction Preferences Based upon Avian Influenza in Taiwan

    PubMed Central

    Fang, David; Fang, Chen-Ling; Tsai, Bi-Kun; Lan, Li-Chi; Hsu, Wen-Shan

    2012-01-01

    Improvements in communications technology enable consumers to receive information through diverse channels. In the case of avian influenza, information repeated by the mass media socially amplifies consumer awareness of risks. Facing indeterminate risks, consumers may feel anxious and increase their risk perception. When consumers trust the information published by the media, their uncertainty toward avian influenza may decrease. Consumers might take some actions to reduce risk. Therefore, this study focuses on relationships among trust in messages, risk perception and risk reduction preferences. This study administered consumer survey questionnaires to a random sample of 525 consumers in different cities of Taiwan in 2007. Through statistical analysis, the results demonstrate: (1) the higher the trust consumers have in messages about avian influenza, the lower their risk perceptions are; (2) the higher the consumers’ risk perceptions are and, therefore, the higher their desired level of risk reduction, the more likely they are to accept risk reduction strategies; (3) consumer attributes such as age, education level, and marital status correlate with significant differences in risk perception and in acceptance of risk reduction preferences. Gender shows significant differences only in risk reduction preferences and not in risk perception. PMID:23066394

  1. Security under Uncertainty: Adaptive Attackers Are More Challenging to Human Defenders than Random Attackers

    PubMed Central

    Moisan, Frédéric; Gonzalez, Cleotilde

    2017-01-01

    Game Theory is a common approach used to understand attacker and defender motives, strategies, and allocation of limited security resources. For example, many defense algorithms are based on game-theoretic solutions that conclude that randomization of defense actions assures unpredictability, creating difficulties for a human attacker. However, game-theoretic solutions often rely on idealized assumptions of decision making that underplay the role of human cognition and information uncertainty. The consequence is that we know little about how effective these algorithms are against human players. Using a simplified security game, we study the type of attack strategy and the uncertainty about an attacker's strategy in a laboratory experiment where participants play the role of defenders against a simulated attacker. Our goal is to compare a human defender's behavior in three levels of uncertainty (Information Level: Certain, Risky, Uncertain) and three types of attacker's strategy (Attacker's strategy: Minimax, Random, Adaptive) in a between-subjects experimental design. Best defense performance is achieved when defenders play against a minimax or a random attack strategy, compared to an adaptive strategy. Furthermore, when payoffs are certain, defenders are as efficient against a random attack strategy as they are against an adaptive strategy, but when payoffs are uncertain, defenders have the most difficulty defending against an adaptive attacker compared to a random attacker. We conclude that given conditions of uncertainty in many security problems, defense algorithms would be more efficient if they adapt to the attacker's actions, taking advantage of the attacker's human inefficiencies. PMID:28690557

  2. A Theory of Perceptual Learning: Uncertainty Reduction and Reading.

    ERIC Educational Resources Information Center

    Henk, William A.

    Behaviorism cannot adequately explain language processing. A synthesis of the psycholinguistic and information processing approaches of cognitive psychology, however, can provide the basis for a speculative analysis of reading, if this synthesis is tempered by a perceptual learning theory of uncertainty reduction. Theorists of information…

  3. Perceptions of Electronic Cigarettes Among Medicaid-Eligible Pregnant and Postpartum Women.

    PubMed

    Fallin, Amanda; Miller, Alana; Assef, Sara; Ashford, Kristin

    2016-01-01

    To describe perceptions and beliefs about electronic cigarette (e-cigarette) use during pregnancy among pregnant and newly postpartum women. An exploratory, qualitative descriptive study. University-affiliated prenatal clinics. Twelve pregnant or recently postpartum women who reported use of tobacco and electronic cigarettes. Semistructured focus groups were audio recorded and professionally transcribed. The transcripts were coded to consensus and analyzed with MAXQDA software (version 11) using content analysis. Four overarching themes emerged: (a) Attraction to E-Cigarettes as a Harm Reduction Strategy, (b) Uncertainty Regarding the Health Effects of E-Cigarettes; (c) Ambivalence Regarding Novel Product Characteristics; and (d) Behaviors Reflected Dual Use and Often Complete Relapse to Traditional Cigarettes. Pregnant women are initially attracted to e-cigarettes as a harm reduction strategy, but they often return to traditional cigarettes in the postpartum period. Nurses should counsel pregnant women on the adverse effects of fetal exposure to nicotine. Evidence-based nursing interventions are needed to prevent relapse during the postpartum period. Copyright © 2016 AWHONN, the Association of Women’s Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.

  4. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
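
    As a hedged illustration of a sampling-based evidence-theory propagation (not the strategy described in the report), the sketch below assumes two epistemic inputs with invented basic probability assignments, uses a toy stand-in for the model, and estimates belief and plausibility of an output interval by sampling each joint focal element.

      import numpy as np

      rng = np.random.default_rng(0)

      def model(x, y):
          # Stand-in for an expensive simulation; purely illustrative.
          return x ** 2 + np.sin(y)

      # Hypothetical basic probability assignments: (interval, mass) for two epistemic inputs.
      bpa_x = [((0.0, 0.5), 0.6), ((0.3, 1.0), 0.4)]
      bpa_y = [((0.0, 1.0), 0.7), ((0.5, 2.0), 0.3)]

      target = (0.0, 1.2)   # output interval whose belief and plausibility are wanted

      bel, pl = 0.0, 0.0
      for (xlo, xhi), mx in bpa_x:
          for (ylo, yhi), my in bpa_y:
              # Sampling-based estimate of the model's output range on this joint focal element.
              out = model(rng.uniform(xlo, xhi, 200), rng.uniform(ylo, yhi, 200))
              out_lo, out_hi = out.min(), out.max()
              mass = mx * my
              if target[0] <= out_lo and out_hi <= target[1]:   # range contained in target
                  bel += mass
              if out_hi >= target[0] and out_lo <= target[1]:   # range intersects target
                  pl += mass

      print(f"Belief = {bel:.2f}, Plausibility = {pl:.2f}")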

  5. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    NASA Astrophysics Data System (ADS)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. Direct variance method was adopted to analyze the manner by which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and has a considerable percentage of uncertainty contribution in monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-ignorable uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
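
    The sketch below illustrates a simple main-effect variance attribution over a factorial ensemble, in the spirit of (but not identical to) the direct variance analysis described above; the ensemble values, the factor levels, and the constructed dominance of the climate-model factor are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical factorial ensemble of streamflow-change projections (%), indexed by
      # emission scenario (ES), climate model (CM), downscaling method (SD) and
      # hydrological model (HM). Values are synthetic; CM dominates by construction.
      levels = {"ES": 3, "CM": 4, "SD": 4, "HM": 4}
      proj = rng.normal(0.0, 1.0, tuple(levels.values()))
      proj += np.arange(levels["CM"]).reshape(1, -1, 1, 1) * 2.0

      total_var = proj.var()
      for axis, name in enumerate(levels):
          # Main effect of this factor: variance of the means taken over all other factors.
          other_axes = tuple(a for a in range(proj.ndim) if a != axis)
          main_effect_var = proj.mean(axis=other_axes).var()
          print(f"{name}: {100 * main_effect_var / total_var:.1f}% of total variance")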

  6. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    USGS Publications Warehouse

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

    Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty to achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar when the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications. The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to evaluate adaptive management performance and value of learning. Although natural resource decisions are characterized by uncertainty, not all uncertainty will cause decisions to be altered substantially, as we found in this case. It is important to incorporate uncertainty into the decision framing and evaluate the effect of reducing that uncertainty on achieving the desired outcomes.
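
    The following is a minimal sketch of the model-weight updating idea that underlies active adaptive management as described above; the two competing models, their parameters, the observation error, and the monitoring data are all invented for illustration and are not the framework's actual model set.

      import numpy as np
      from scipy.stats import norm

      # Two competing (hypothetical) models of red knot weight gain as a function of
      # horseshoe crab spawning abundance; parameters are invented for illustration.
      def model_strong(spawning):
          return 0.8 * spawning   # strong-link hypothesis

      def model_weak(spawning):
          return 0.2 * spawning   # weak-link hypothesis

      weights = np.array([0.5, 0.5])   # prior model weights
      obs_sd = 1.0                     # assumed observation error (invented)

      # Simulated monitoring data: (spawning index, observed weight gain).
      observations = [(3.0, 2.3), (5.0, 4.1), (4.0, 3.4)]

      for spawning, observed in observations:
          likelihoods = np.array([
              norm.pdf(observed, loc=model_strong(spawning), scale=obs_sd),
              norm.pdf(observed, loc=model_weak(spawning), scale=obs_sd),
          ])
          weights = weights * likelihoods
          weights /= weights.sum()     # Bayes' rule: renormalise after each observation

      print(f"posterior weights: strong-link={weights[0]:.2f}, weak-link={weights[1]:.2f}")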

  7. Dealing with deep uncertainties in landslide modelling for disaster risk reduction under climate change

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Elizabeth Ann; Pianosi, Francesca; Wagener, Thorsten

    2017-02-01

    Landslides have large negative economic and societal impacts, including loss of life and damage to infrastructure. Slope stability assessment is a vital tool for landslide risk management, but high levels of uncertainty often challenge its usefulness. Uncertainties are associated with the numerical model used to assess slope stability and its parameters, with the data characterizing the geometric, geotechnical and hydrologic properties of the slope, and with hazard triggers (e.g. rainfall). Uncertainties associated with many of these factors are also likely to be exacerbated further by future climatic and socio-economic changes, such as increased urbanization and resultant land use change. In this study, we illustrate how numerical models can be used to explore the uncertain factors that influence potential future landslide hazard using a bottom-up strategy. Specifically, we link the Combined Hydrology And Stability Model (CHASM) with sensitivity analysis and Classification And Regression Trees (CART) to identify critical thresholds in slope properties and climatic (rainfall) drivers that lead to slope failure. We apply our approach to a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. For this particular slope, we find that uncertainties regarding some slope properties (namely thickness and effective cohesion of topsoil) are as important as the uncertainties related to future rainfall conditions. Furthermore, we show that 89% of the expected behaviour of the studied slope can be characterized based on only two variables - the ratio of topsoil thickness to cohesion and the ratio of rainfall intensity to duration.
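
    A hedged sketch of the sensitivity-analysis-plus-CART step described above: a toy stability rule stands in for the CHASM simulation, and the sampled ranges and the threshold values that the tree recovers are illustrative assumptions only.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(7)
      n = 5000

      # Monte Carlo samples of uncertain slope and rainfall properties (illustrative ranges).
      thickness = rng.uniform(0.5, 3.0, n)    # topsoil thickness (m)
      cohesion = rng.uniform(2.0, 20.0, n)    # effective cohesion (kPa)
      intensity = rng.uniform(5.0, 100.0, n)  # rainfall intensity (mm/h)
      duration = rng.uniform(1.0, 48.0, n)    # rainfall duration (h)

      # Toy stand-in for the CHASM stability simulation: failure when both ratios are
      # large. The rule and its thresholds are purely illustrative.
      failure = ((thickness / cohesion > 0.15) & (intensity / duration > 2.0)).astype(int)

      # CART recovers the critical thresholds from the sampled behaviour.
      X = np.column_stack([thickness / cohesion, intensity / duration])
      tree = DecisionTreeClassifier(max_depth=2).fit(X, failure)
      print(export_text(tree, feature_names=["thickness/cohesion", "intensity/duration"]))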

  8. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  9. The Relationship of Cultural Similarity, Communication Effectiveness and Uncertainty Reduction.

    ERIC Educational Resources Information Center

    Koester, Jolene; Olebe, Margaret

    To investigate the relationship of cultural similarity/dissimilarity, communication effectiveness, and communication variables associated with uncertainty reduction theory, a study examined two groups of students--a multinational group living on an "international floor" in a dormitory at a state university and an unrelated group of U.S.…

  10. Policy implications of uncertainty in modeled life-cycle greenhouse gas emissions of biofuels.

    PubMed

    Mullins, Kimberley A; Griffin, W Michael; Matthews, H Scott

    2011-01-01

    Biofuels have received legislative support recently in California's Low-Carbon Fuel Standard and the Federal Energy Independence and Security Act. Both present new fuel types, but neither provides methodological guidelines for dealing with the inherent uncertainty in evaluating their potential life-cycle greenhouse gas emissions. Emissions reductions are based on point estimates only. This work demonstrates the use of Monte Carlo simulation to estimate life-cycle emissions distributions from ethanol and butanol from corn or switchgrass. Life-cycle emissions distributions for each feedstock and fuel pairing modeled span an order of magnitude or more. Using a streamlined life-cycle assessment, corn ethanol emissions range from 50 to 250 g CO(2)e/MJ, for example, and each feedstock-fuel pathway studied shows some probability of greater emissions than a distribution for gasoline. Potential GHG emissions reductions from displacing fossil fuels with biofuels are difficult to forecast given this high degree of uncertainty in life-cycle emissions. This uncertainty is driven by the importance and uncertainty of indirect land use change emissions. Incorporating uncertainty in the decision making process can illuminate the risks of policy failure (e.g., increased emissions), and a calculated risk of failure due to uncertainty can be used to inform more appropriate reduction targets in future biofuel policies.

  11. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.

  12. Integration of the Uncertainties of Anion and TOC Measurements into the Flammability Control Strategy for Sludge Batch 8 at the DWPF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T. B.

    2013-03-14

    The Savannah River National Laboratory (SRNL) has been working with the Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) in the development and implementation of a flammability control strategy for DWPF’s melter operation during the processing of Sludge Batch 8 (SB8). SRNL’s support has been in response to technical task requests that have been made by SRR’s Waste Solidification Engineering (WSE) organization. The flammability control strategy relies on measurements that are performed on Slurry Mix Evaporator (SME) samples by the DWPF Laboratory. Measurements of nitrate, oxalate, formate, and total organic carbon (TOC) standards generated by the DWPF Laboratory are presented in this report, and an evaluation of the uncertainties of these measurements is provided. The impact of the uncertainties of these measurements on DWPF’s strategy for controlling melter flammability also is evaluated. The strategy includes monitoring each SME batch for its nitrate content and its TOC content relative to the nitrate content and relative to the antifoam additions made during the preparation of the SME batch. A linearized approach for monitoring the relationship between TOC and nitrate is developed, equations are provided that integrate the measurement uncertainties into the flammability control strategy, and sample calculations for these equations are shown to illustrate the impact of the uncertainties on the flammability control strategy.

  13. Coordinate metrology using scanning probe microscopes

    NASA Astrophysics Data System (ADS)

    Marinello, F.; Savio, E.; Bariani, P.; Carmignato, S.

    2009-08-01

    New positioning, probing and measuring strategies in coordinate metrology are needed for the accomplishment of true three-dimensional characterization of microstructures, with uncertainties in the nanometre range. In the present work, the implementation of scanning probe microscopes (SPMs) as systems for coordinate metrology is discussed. A new non-raster measurement approach is proposed, where the probe is moved to sense points along free paths on the sample surface, with no loss of accuracy with respect to traditional raster scanning and scan time reduction. Furthermore, new probes featuring long tips with innovative geometries suitable for coordinate metrology through SPMs are examined and reported.

  14. Optimisation of GaN LEDs and the reduction of efficiency droop using active machine learning

    DOE PAGES

    Rouet-Leduc, Bertrand; Barros, Kipton Marcos; Lookman, Turab; ...

    2016-04-26

    A fundamental challenge in the design of LEDs is to maximise electro-luminescence efficiency at high current densities. We simulate GaN-based LED structures that delay the onset of efficiency droop by spreading carrier concentrations evenly across the active region. Statistical analysis and machine learning effectively guide the selection of the next LED structure to be examined based upon its expected efficiency as well as model uncertainty. This active learning strategy rapidly constructs a model that predicts Poisson-Schrödinger simulations of devices, and that simultaneously produces structures with higher simulated efficiencies.
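
    The sketch below illustrates an active learning loop of the general kind described above, with a Gaussian process surrogate and an acquisition rule that balances expected efficiency against model uncertainty; the one-dimensional design variable, the toy efficiency function standing in for Poisson-Schrodinger simulations, and the acquisition weight are assumptions for illustration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(3)

      def simulated_efficiency(x):
          # Toy stand-in for a Poisson-Schrodinger device simulation (illustrative only).
          return np.exp(-(x - 0.7) ** 2 / 0.05) + 0.1 * np.sin(8 * x)

      candidates = np.linspace(0.0, 1.0, 200).reshape(-1, 1)  # candidate "structures"
      X = rng.uniform(0.0, 1.0, (4, 1))                       # a few initial simulations
      y = simulated_efficiency(X).ravel()

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-6)
      for _ in range(10):
          gp.fit(X, y)
          mean, std = gp.predict(candidates, return_std=True)
          # Acquisition: expected efficiency plus a bonus for model uncertainty (UCB-style).
          nxt = candidates[np.argmax(mean + 1.5 * std)].reshape(1, -1)
          X = np.vstack([X, nxt])
          y = np.append(y, simulated_efficiency(nxt).ravel())

      best = X[np.argmax(y)]
      print(f"best structure found: x={best[0]:.3f}, simulated efficiency={y.max():.3f}")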

  15. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in face of the projected alterations of rainfall patterns and increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining a sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability are obtained analytically, as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.
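
    As a minimal illustration of a yield-reduction risk index of the kind described above (not the paper's analytical solution), the sketch below draws stochastic seasonal rainfall, applies a toy water-limited yield response, and estimates the probability of falling short of a target yield; all distributions and parameters are invented.

      import numpy as np

      rng = np.random.default_rng(11)
      n_seasons = 50_000

      # Illustrative stochastic growing-season rainfall (mm) and a fixed irrigation volume.
      rainfall = rng.gamma(shape=4.0, scale=80.0, size=n_seasons)
      irrigation = 150.0   # water available for supplemental irrigation per season (mm)

      # Toy water-limited yield response (fraction of potential yield); not the paper's model.
      water = np.minimum(rainfall + irrigation, 600.0)
      yield_frac = np.clip((water - 200.0) / 350.0, 0.0, 1.0)

      target = 0.8   # target relative yield
      risk_index = np.mean(yield_frac < target)   # long-term yield-reduction risk
      print(f"P(yield < {target:.0%} of potential) = {risk_index:.2f}")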

  16. Do sophisticated epistemic beliefs predict meaningful learning? Findings from a structural equation model of undergraduate biology learning

    NASA Astrophysics Data System (ADS)

    Lee, Silvia Wen-Yu; Liang, Jyh-Chong; Tsai, Chin-Chung

    2016-10-01

    This study investigated the relationships among college students' epistemic beliefs in biology (EBB), conceptions of learning biology (COLB), and strategies of learning biology (SLB). EBB includes four dimensions, namely 'multiple-source,' 'uncertainty,' 'development,' and 'justification.' COLB is further divided into 'constructivist' and 'reproductive' conceptions, while SLB represents deep strategies and surface learning strategies. Questionnaire responses were gathered from 303 college students. The results of the confirmatory factor analysis and structural equation modelling showed acceptable model fits. Mediation testing further revealed two paths with complete mediation. In sum, students' epistemic beliefs of 'uncertainty' and 'justification' in biology were statistically significant in explaining the constructivist and reproductive COLB, respectively; and 'uncertainty' was statistically significant in explaining the deep SLB as well. The results of mediation testing further revealed that 'uncertainty' predicted surface strategies through the mediation of 'reproductive' conceptions; and the relationship between 'justification' and deep strategies was mediated by 'constructivist' COLB. This study provides evidence for the essential roles some epistemic beliefs play in predicting students' learning.

  17. A typology of uncertainty derived from an analysis of critical incidents in medical residents: A mixed methods study.

    PubMed

    Hamui-Sutton, Alicia; Vives-Varela, Tania; Gutiérrez-Barreto, Samuel; Leenen, Iwin; Sánchez-Mendiola, Melchor

    2015-11-04

    Medical uncertainty is inherently related to the practice of the physician and generally affects his or her patient care, job satisfaction, continuing education, as well as the overall goals of the health care system. In this paper, some new types of uncertainty, which extend existing typologies, are identified and the contexts and strategies to deal with them are studied. We carried out a mixed-methods study, consisting of a qualitative and a quantitative phase. For the qualitative study, 128 residents reported critical incidents in their clinical practice and described how they coped with the uncertainty in the situation. Each critical incident was analyzed and the most salient situations, 45 in total, were retained. In the quantitative phase, a distinct group of 120 medical residents indicated for each of these situations whether they have been involved in the described situations and, if so, which coping strategy they applied. The analysis examines the relation between characteristics of the situation and the coping strategies. From the qualitative study, a new typology of uncertainty was derived which distinguishes between technical, conceptual, communicational, systemic, and ethical uncertainty. The quantitative analysis showed that, independently of the type of uncertainty, critical incidents are most frequently resolved by consulting senior physicians (49 % overall), which underscores the importance of the hierarchical relationships in the hospital. The insights gained by this study are combined into an integrative model of uncertainty in medical residencies, which combines the type and perceived level of uncertainty, the strategies employed to deal with it, and context elements such as the actors present in the situation. The model considers the final resolution at each of three levels: the patient, the health system, and the physician's personal level. This study gives insight into how medical residents make decisions under different types of uncertainty, giving account of the context in which the interactions take place and of the strategies used to resolve the incidents. These insights may guide the development of organizational policies that reduce uncertainty and stress in residents during their clinical training.

  18. Uncertainty propagation from raw data to final results. [ALEX]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1985-01-01

    Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown.
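
    The following is a generic first-order covariance-propagation sketch in the spirit of the procedure described above, not the ALEX code itself: the toy reduction function, the nominal measurand values, and the assumed input covariance are all illustrative.

      import numpy as np

      # Toy data-reduction function: background-corrected counts reduced to a transmission
      # and then a cross section. Purely illustrative; this is not the ALEX formulation.
      def reduce_data(p):
          c_in, c_out, bkg, n_thick = p
          transmission = (c_out - bkg) / (c_in - bkg)
          return np.array([-np.log(transmission) / n_thick])

      p = np.array([1.0e5, 6.0e4, 2.0e3, 0.05])         # nominal measurand values (assumed)
      cov_in = np.diag([1.0e5, 6.0e4, 4.0e2, 1.0e-6])   # assumed input covariance

      # First-order propagation: C_out = J C_in J^T, with J estimated by finite differences.
      eps = 1e-6
      f0 = reduce_data(p)
      J = np.zeros((f0.size, p.size))
      for i in range(p.size):
          dp = np.zeros_like(p)
          dp[i] = eps * max(abs(p[i]), 1.0)
          J[:, i] = (reduce_data(p + dp) - f0) / dp[i]

      cov_out = J @ cov_in @ J.T
      print(f"cross section = {f0[0]:.4f} +/- {np.sqrt(cov_out[0, 0]):.4f} (1 sigma)")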

  19. Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?

    PubMed

    Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian

    2016-09-01

    The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainties. TCP and NTCP were computed, based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% lead to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80Gy EQD2. TCP is robust to dosimetric uncertainties when dose prescription is in the more flat region of the dose-response curve at doses >75Gy. For OARs, improved clinical outcome is expected by reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
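
    A minimal sketch of how systematic and random dose uncertainties can be folded into a dose-response simulation of the kind described above; the logistic TCP curve, its parameters, the planned dose, and the uncertainty magnitudes are assumptions for illustration rather than the study's fitted models.

      import numpy as np

      rng = np.random.default_rng(5)

      def tcp(dose, d50=70.0, gamma=2.0):
          # Illustrative logistic dose-response curve; d50 and gamma are invented values.
          return 1.0 / (1.0 + np.exp(-4.0 * gamma * (dose - d50) / d50))

      planned_dose = 85.0   # planned target dose (Gy EQD2), illustrative
      n_fractions = 4       # number of brachytherapy fractions
      n_patients = 100_000  # simulated cohort size

      sys_sd = 0.05         # 5% systematic dose uncertainty (constant per patient)
      rand_sd = 0.10        # 10% random dose uncertainty (independent per fraction)

      systematic = rng.normal(1.0, sys_sd, n_patients)
      random_part = rng.normal(1.0, rand_sd, (n_patients, n_fractions)).mean(axis=1)
      delivered = planned_dose * systematic * random_part

      print(f"TCP at planned dose       : {tcp(planned_dose):.3f}")
      print(f"mean TCP under uncertainty: {tcp(delivered).mean():.3f}")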

  20. Addressing subjective decision-making inherent in GLUE-based multi-criteria rainfall-runoff model calibration

    NASA Astrophysics Data System (ADS)

    Shafii, Mahyar; Tolson, Bryan; Shawn Matott, L.

    2015-04-01

    GLUE is one of the most commonly used informal methodologies for uncertainty estimation in hydrological modelling. Despite the ease-of-use of GLUE, it involves a number of subjective decisions such as the strategy for identifying the behavioural solutions. This study evaluates the impact of behavioural solution identification strategies in GLUE on the quality of model output uncertainty. Moreover, two new strategies are developed to objectively identify behavioural solutions. The first strategy considers Pareto-based ranking of parameter sets, while the second one is based on ranking the parameter sets based on an aggregated criterion. The proposed strategies, as well as the traditional strategies in the literature, are evaluated with respect to reliability (coverage of observations by the envelope of model outcomes) and sharpness (width of the envelope of model outcomes) in different numerical experiments. These experiments include multi-criteria calibration and uncertainty estimation of three rainfall-runoff models with different number of parameters. To demonstrate the importance of behavioural solution identification strategy more appropriately, GLUE is also compared with two other informal multi-criteria calibration and uncertainty estimation methods (Pareto optimization and DDS-AU). The results show that the model output uncertainty varies with the behavioural solution identification strategy, and furthermore, a robust GLUE implementation would require considering multiple behavioural solution identification strategies and choosing the one that generates the desired balance between sharpness and reliability. The proposed objective strategies prove to be the best options in most of the case studies investigated in this research. Implementing such an approach for a high-dimensional calibration problem enables GLUE to generate robust results in comparison with Pareto optimization and DDS-AU.
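
    The sketch below illustrates a GLUE-style workflow with an aggregated-criterion behavioural selection strategy and the reliability and sharpness measures described above; the toy model, the synthetic observations, the two criteria, and the 10% behavioural threshold are all assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(9)

      # Toy "rainfall-runoff" model and synthetic observations with known parameters plus noise.
      t = np.arange(50)

      def model(a, b):
          return a * np.exp(-b * t) + 0.2

      obs = model(2.0, 0.1) + rng.normal(0.0, 0.05, t.size)

      # GLUE-style Monte Carlo sampling of parameter sets.
      n = 5000
      a = rng.uniform(0.5, 4.0, n)
      b = rng.uniform(0.01, 0.3, n)
      sims = model(a[:, None], b[:, None])   # shape (parameter sets, time steps)

      # Two criteria aggregated into a single ranking score (an "objective" selection strategy).
      rmse = np.sqrt(((sims - obs) ** 2).mean(axis=1))
      bias = np.abs(sims.mean(axis=1) - obs.mean())
      score = rmse / rmse.max() + bias / bias.max()   # lower is better

      behavioural = sims[np.argsort(score)[: n // 10]]   # keep the best 10% as behavioural
      lower, upper = np.percentile(behavioural, [5, 95], axis=0)

      reliability = np.mean((obs >= lower) & (obs <= upper))   # coverage of observations
      sharpness = np.mean(upper - lower)                       # width of the envelope
      print(f"reliability = {reliability:.2f}, sharpness = {sharpness:.3f}")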

  1. What strategy is needed for attaining the EU air quality regulations under future climate change scenarios? A sensitivity analysis over Europe

    NASA Astrophysics Data System (ADS)

    Jiménez-Guerrero, P.; Baró, R.; Gómez-Navarro, J. J.; Lorente-Plazas, R.; García-Valero, J. A.; Hernández, Z.; Montávez, J. P.

    2012-04-01

    A large number of studies show that several areas over Europe exceed some of the air quality thresholds established in the legislation. These exceedances will become more frequent under future climate change scenarios, since the policies aimed at improving air quality in the EU directives have not accounted for the variations in the climate. Climate change alone will influence the future concentrations of atmospheric pollutants through modifications of gas-phase chemistry, transport, removal, and natural emissions. In this sense, chemistry transport models (CTMs) play a key role in assessing and understanding emission abatement plans through the use of sensitivity analysis strategies. These sensitivity analyses characterize the change in model output due to variations in model input parameters. Since the management of air pollutant emissions is one of the predominant factors for controlling urban air quality, this work assesses the impact of various emission reduction scenarios on air pollution levels over Europe under two climate change scenarios. The methodology includes the use of a climate version of the meteorological model MM5 coupled with the CHIMERE chemistry transport model. Experiments span the periods 1971-2000, as a reference, and 2071-2100, under two future enhanced greenhouse gas and aerosol scenarios (SRES A2 and B2). The atmospheric simulations have a horizontal resolution of 25 km and 23 vertical layers up to 100 hPa, and are driven by the global climate model ECHO-G. In order to represent the sensitivity of the chemistry and transport of aerosols, tropospheric ozone and other photochemical species, several hypothetical scenarios of emission control have been implemented to quantify the influence of diverse emission sources in the area, such as on-road traffic, port and industrial emissions, among others. The modeling strategy relies on a sensitivity analysis to determine the emission reduction and strategy needed in the target area in order to attain the standards and thresholds set in the European Directive 2008/50/EC. Results show that the system is able to characterize the exceedances occurring in Europe, mainly related to the maximum 8h moving average exceeding the target value of 120 μg/m3 over southern Europe. Also, compliance with the PM10 daily limit value (50 μg/m3) is not achieved over wide areas in Europe. The sensitivity analysis indicates that large reductions of precursor emissions are needed in all the scenarios examined for attaining the thresholds set in the European Directive. In most cases this abatement strategy is hard to put into practice (e.g. unrealistic percentages of emission reductions in on-road traffic, industry or harbor activity); however, ozone and particulate matter air pollution improve considerably in most of the scenarios included. Results also unveil the propagation of uncertainties from the meteorological projections into future air quality and call for future studies aimed at deepening the knowledge about the parameterized processes, the definition of emissions and, last, reducing uncertainties.

  2. Performance evaluation of a smart buffer control at a wastewater treatment plant.

    PubMed

    van Daal-Rombouts, P; Benedetti, L; de Jonge, J; Weijers, S; Langeveld, J

    2017-11-15

    Real time control (RTC) is increasingly seen as a viable method to optimise the functioning of wastewater systems. Model exercises and case studies reported in the literature claim a positive impact of RTC based on results lacking uncertainty analysis and using flawed evaluation periods. This paper describes two integrated RTC strategies at the wastewater treatment plant (WWTP) Eindhoven, the Netherlands, that aim to improve the use of the available tanks at the WWTP and storage in the contributing catchments to reduce the impact on the receiving water. For the first time it is demonstrated that a significant improvement can be achieved through the application of RTC in practice. The Storm Tank Control is evaluated based on measurements and reduces the number of storm water settling tank discharges by 44% and the discharged volume by an estimated 33%, decreasing dissolved oxygen depletion in the river. The Primary Clarifier Control is evaluated based on model simulations. The maximum event NH4 concentration in the effluent was reduced on average by 19% for large events, while the load was reduced by 20%. For all 31 events, the reductions are 11% and 4%, respectively. Reductions are significant taking uncertainties into account, while using representative evaluation periods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies this has enabled development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that by means of Monte Carlo simulations of flood risk assessments incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that the uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.

  4. Managing uncertainty: information and insurance under the risk of starvation.

    PubMed Central

    Dall, Sasha R X; Johnstone, Rufus A

    2002-01-01

    In an uncertain world, animals face both unexpected opportunities and danger. Such outcomes can select for two potential strategies: collecting information to reduce uncertainty, or insuring against it. We investigate the relative value of information and insurance (energy reserves) under starvation risk by offering model foragers a choice between constant and varying food sources over finite foraging bouts. We show that sampling the variable option (choosing it when it is not expected to be good) should decline both with lower reserves and late in foraging bouts; in order to be able to reap the reduction in uncertainty associated with exploiting a variable resource effectively, foragers must be able to afford and compensate for an initial increase in the risk of an energetic shortfall associated with choosing the option when it is bad. Consequently, expected exploitation of the varying option increases as it becomes less variable, and when the overall risk of energetic shortfall is reduced. In addition, little activity on the variable alternative is expected until reserves are built up early in a foraging bout. This indicates that gathering information is a luxury while insurance is a necessity, at least when foraging on stochastic and variable food under the risk of starvation. PMID:12495509

  5. Metafitting: Weight optimization for least-squares fitting of PTTI data

    NASA Technical Reports Server (NTRS)

    Douglas, Rob J.; Boulanger, J.-S.

    1995-01-01

    For precise time intercomparisons between a master frequency standard and a slave time scale, we have found it useful to quantitatively compare different fitting strategies by examining the standard uncertainty in time or average frequency. It is particularly useful when designing procedures which use intermittent intercomparisons, with some parameterized fit used to interpolate or extrapolate from the calibrating intercomparisons. We use the term 'metafitting' for the choices that are made before a fitting procedure is operationally adopted. We present methods for calculating the standard uncertainty for general, weighted least-squares fits and a method for optimizing these weights for a general noise model suitable for many PTTI applications. We present the results of the metafitting of procedures for the use of a regular schedule of (hypothetical) high-accuracy frequency calibration of a maser time scale. We have identified a cumulative series of improvements that give a significant reduction of the expected standard uncertainty, compared to the simplest procedure of resetting the maser synthesizer after each calibration. The metafitting improvements presented include the optimum choice of weights for the calibration runs, optimized over a period of a week or 10 days.

  6. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from more varied sources and used more and different types of uncertainty management strategies in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner a socially supportive response from their peers, their options for managing uncertainty were greatly reduced.

  7. A probabilistic approach to examine the impacts of mitigation policies on future global PM emissions from on-road vehicles

    NASA Astrophysics Data System (ADS)

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2012-12-01

    There is a deficiency in determining future emission reduction potential, especially when uncertainty is taken into account. Mitigation measures for some economic sectors have been proposed, but few studies provide an evaluation of the amount of PM emission reduction that can be obtained in future years by different emission reduction strategies. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which result in the inability to relate technological or regulatory intervention to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. In this work, we introduce a probabilistic approach to explore the impacts of retrofit and scrappage on global PM emissions from on-road vehicles in the coming decades. This approach includes scenario analysis, sensitivity analysis and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and make policy evaluations. Three basic questions will be answered in this work: (1) what contribution can these two programs make to reducing global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters in describing the dynamic vehicle fleet?

  8. Waste in the U.S. Health care system: a conceptual framework.

    PubMed

    Bentley, Tanya G K; Effros, Rachel M; Palar, Kartika; Keeler, Emmett B

    2008-12-01

    Health care costs in the United States are much higher than those in industrial countries with similar or better health system performance. Wasteful spending has many undesirable consequences that could be alleviated through waste reduction. This article proposes a conceptual framework to guide researchers and policymakers in evaluating waste, implementing waste-reduction strategies, and reducing the burden of unnecessary health care spending. This article divides health care waste into administrative, operational, and clinical waste and provides an overview of each. It explains how researchers have used both high-level and sector- or procedure-specific comparisons to quantify such waste, and it discusses examples and challenges in both waste measurement and waste reduction. Waste is caused by factors such as health insurance and medical uncertainties that encourage the production of inefficient and low-value services. Various efforts to reduce such waste have encountered challenges, such as the high costs of initial investment, unintended administrative complexities, and trade-offs among patients', payers', and providers' interests. While categorizing waste may help identify and measure general types and sources of waste, successful reduction strategies must integrate the administrative, operational, and clinical components of care, and proceed by identifying goals, changing systemic incentives, and making specific process improvements. Classifying, identifying, and measuring waste elucidate its causes, clarify systemic goals, and specify potential health care reforms that-by improving the market for health insurance and health care-will generate incentives for better efficiency and thus ultimately decrease waste in the U.S. health care system.

  9. Stochastic multi-objective model for optimal energy exchange optimization of networked microgrids with presence of renewable generation under risk-based strategies.

    PubMed

    Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad

    2018-02-01

    The inherent volatility and unpredictable nature of renewable generation and load demand pose considerable challenges for energy exchange optimization of microgrids (MG). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. In so doing, three distinct risk-based strategies are distinguished using the conditional value at risk (CVaR) approach. The proposed model is formulated with two distinct objective functions. The first function minimizes the operation and maintenance costs, the cost of power transactions between the upstream network and MGs, as well as the power loss cost, whereas the second function minimizes the energy not supplied (ENS) value. Furthermore, a stochastic scenario-based approach is incorporated into the model in order to handle the uncertainty. Also, the Kantorovich distance scenario reduction method is implemented to reduce the computational burden. Finally, the non-dominated sorting genetic algorithm (NSGAII) is applied to minimize the objective functions simultaneously and the best solution is extracted by the fuzzy satisfying method with respect to the risk-based strategies. To demonstrate the performance of the proposed model, it is applied to the modified IEEE 33-bus distribution system, and the obtained results show that the presented approach can be considered an efficient tool for optimal energy exchange optimization of MGs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
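
    As a hedged illustration of the CVaR element of the risk-based strategies described above (not the paper's full NSGA-II optimization), the sketch below computes the value at risk, the conditional value at risk, and a risk-weighted objective from a set of hypothetical scenario costs.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical scenario costs ($) for one candidate energy-exchange schedule,
      # e.g. after scenario reduction to a manageable set of equiprobable scenarios.
      costs = rng.normal(10_000, 2_500, 200)
      probs = np.full(costs.size, 1.0 / costs.size)

      alpha = 0.95
      order = np.argsort(costs)
      cum = np.cumsum(probs[order])
      var = costs[order][np.searchsorted(cum, alpha)]      # value at risk (VaR)
      tail = costs >= var
      cvar = np.average(costs[tail], weights=probs[tail])  # conditional value at risk (CVaR)

      beta = 0.5   # risk-aversion weight (assumed)
      expected = np.average(costs, weights=probs)
      objective = (1 - beta) * expected + beta * cvar      # risk-weighted cost objective
      print(f"expected={expected:.0f}, VaR={var:.0f}, CVaR={cvar:.0f}, objective={objective:.0f}")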

  10. How It's Done: Using "Hitch" as a Guide to Uncertainty Reduction Theory

    ERIC Educational Resources Information Center

    Dawkins, Marcia Alesan

    2010-01-01

    Popular films can be important pedagogical tools in today's communication courses. Constructing classroom experiences that use film can make theory come alive for students. At the same time, theory can be used to probe deeper into the complexities of human behavior via astute film analysis. In the case of Uncertainty Reduction Theory (URT), a…

  11. A compression algorithm for the combination of PDF sets.

    PubMed

    Carrazza, Stefano; Latorre, José I; Rojo, Juan; Watt, Graeme

    The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
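
    The compression step can be pictured as choosing a subset of replicas that best reproduces the statistics of the full combined ensemble. The sketch below is a deliberately simplified stand-in (random search over subsets, matching only means and standard deviations) for the published algorithm, which minimises a much richer set of statistical estimators; the ensemble itself is synthetic.

        import numpy as np

        rng = np.random.default_rng(0)

        def moment_distance(subset, full):
            # Squared mismatch of mean and standard deviation, summed over grid points.
            dm = (subset.mean(axis=0) - full.mean(axis=0)) ** 2
            ds = (subset.std(axis=0) - full.std(axis=0)) ** 2
            return float(dm.sum() + ds.sum())

        def compress(replicas, n_keep, n_trials=5000):
            # Random-search stand-in for the minimisation used in the paper.
            n_rep = replicas.shape[0]
            best_idx, best_d = None, np.inf
            for _ in range(n_trials):
                idx = rng.choice(n_rep, size=n_keep, replace=False)
                d = moment_distance(replicas[idx], replicas)
                if d < best_d:
                    best_idx, best_d = idx, d
            return best_idx, best_d

        # Stand-in "prior" ensemble: 900 replicas evaluated on a 50-point grid.
        prior = rng.normal(size=(900, 50)) * np.linspace(0.5, 2.0, 50)
        keep, dist = compress(prior, n_keep=100)
        print(f"kept {len(keep)} replicas, moment distance {dist:.4f}")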

  12. Monte Carlo dose calculation in dental amalgam phantom

    PubMed Central

    Aziz, Mohd. Zahri Abdul; Yusoff, A. L.; Osman, N. D.; Abdullah, R.; Rabaie, N. A.; Salikin, M. S.

    2015-01-01

    Ensuring the accuracy of treatment delivery in electron beam therapy is a major challenge in modern radiation treatment. Tissue inhomogeneity complicates accurate dose calculation and requires complex algorithms such as Monte Carlo (MC). In addition, the computed tomography (CT) images used in the treatment planning system must be trustworthy, as they are the input to radiotherapy treatment. However, when metal amalgam is present in the treatment volume, the CT images show prominent streak artefacts and thus contribute sources of error to the dose calculation. A streak artefact reduction technique was therefore applied to correct the images, and as a result, better images were obtained in terms of structure delineation and density assignment. Furthermore, the amalgam density data were corrected to give the amalgam voxels accurate density values. Using these strategies, dose uncertainties due to metal amalgam were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax). Considering the number of vital and radiosensitive organs in the head and neck regions, this correction strategy is suggested for reducing calculation uncertainties in MC dose calculation. PMID:26500401

  13. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.

    PubMed

    Cox, Louis Anthony Tony

    2015-10-01

    Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example. © 2015 Society for Risk Analysis.
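
    The exploration-exploitation balance invoked above is the defining property of low-regret bandit algorithms. The sketch below runs UCB1 over three hypothetical interventions whose net benefits are unknown to the learner; all payoff values are invented for illustration and are not drawn from the article.

        import math
        import random

        random.seed(1)

        # Hypothetical mean net benefits of three candidate interventions (unknown to the learner).
        TRUE_MEAN = {"tighten_standard": 0.40, "monitor_only": 0.25, "subsidize_controls": 0.55}

        def pull(action):
            # Noisy observed net benefit of trying an action for one period.
            return random.gauss(TRUE_MEAN[action], 0.5)

        def ucb1(actions, horizon=500):
            counts = {a: 0 for a in actions}
            means = {a: 0.0 for a in actions}
            for t in range(1, horizon + 1):
                untried = [a for a in actions if counts[a] == 0]
                if untried:
                    a = untried[0]          # play every action once before using the bonus
                else:
                    a = max(actions, key=lambda x: means[x] + math.sqrt(2 * math.log(t) / counts[x]))
                r = pull(a)
                counts[a] += 1
                means[a] += (r - means[a]) / counts[a]   # running average of observed benefit
            return counts, means

        counts, means = ucb1(list(TRUE_MEAN))
        print(counts)   # exploitation concentrates on the action with the best estimated benefit
        print(means)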

  14. Analysis of sheltering and evacuation strategies for an urban nuclear detonation scenario.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, Ann S.; Brandt, Larry D.

    2009-05-01

    Development of an effective strategy for shelter and evacuation is among the most important planning tasks in preparation for response to a low yield, nuclear detonation in an urban area. This study examines shelter-evacuate policies and effectiveness focusing on a 10 kt scenario in Los Angeles. The goal is to provide technical insights that can support development of urban response plans. Results indicate that extended shelter-in-place can offer the most robust protection when high quality shelter exists. Where less effective shelter is available and the fallout radiation intensity level is high, informed evacuation at the appropriate time can substantially reduce the overall dose to personnel. However, uncertainties in the characteristics of the fallout region and in the exit route can make evacuation a risky strategy. Analyses indicate that only a relatively small fraction of the total urban population may experience significant dose reduction benefits from even a well-informed evacuation plan.

  15. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  16. Evaluation of FAD-associated purse seine fishery reduction strategies for bigeye tuna ( Thunnus obesus) in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Tong, Yuhe; Chen, Xinjun; Xu, Liuxiong; Chen, Yong

    2013-07-01

    In the Indian Ocean, bigeye tuna supports one of the most important fisheries in the world. This fishery mainly consists of two components: longline and purse seine fisheries. Evidence of overfishing and stock depletion of bigeye tuna calls for an evaluation of alternative management strategies. Using an age-structured operating model, parameterized with the results derived in a recent stock assessment, we evaluated the effectiveness of applying constant fishing mortality (CF) and quasi-constant fishing mortality (QCF) strategies to reduce fishing effort of purse seining with fish aggregating devices (FADs) at different rates. Three different levels of productivity accounted for the uncertainty in our understanding of stock productivity. The study shows that the results of CF and QCF are similar. Average SSB and catch during simulation years would be higher if fishing mortality of FAD-associated purse seining was reduced rapidly. The banning or rapid reduction of purse seining with FAD resulted in a mean catch, and catch in the last simulation year, higher than that of the base case in which no change was made to the purse seine fishery. This could be caused by growth overfishing by purse seine fisheries with FADs according to the per-recruit analysis. These differences would be more obvious when stock productivity was low. Transferring efforts of FAD-associated purse seining to longline fisheries is also not feasible. Our study suggests that changes are necessary to improve the performance of the current management strategy.
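
    A constant-fishing-mortality rule can be illustrated with a toy age-structured projection using the Baranov catch equation. The sketch below is not the operating model used in the study, and all biological inputs are hypothetical placeholders.

        import numpy as np

        def project(n0, M, sel, F, recruits, years=20):
            # Project an age-structured stock under a constant fishing mortality F (the CF rule).
            # n0: initial numbers at age; M: natural mortality; sel: fleet selectivity at age;
            # recruits: constant recruitment added to the first age class each year.
            n = np.asarray(n0, float).copy()
            catches = []
            for _ in range(years):
                Z = M + F * sel                                        # total mortality at age
                catch = np.sum(F * sel / Z * n * (1.0 - np.exp(-Z)))   # Baranov catch equation
                catches.append(catch)
                survivors = n * np.exp(-Z)
                n = np.roll(survivors, 1)                              # ageing
                n[0] = recruits
                n[-1] += survivors[-1]                                 # plus group
            return np.array(catches), n

        # Hypothetical inputs (not the assessment values): 8 age classes.
        catches, n_final = project(n0=np.full(8, 1000.0), M=0.4,
                                   sel=np.array([0.9, 0.8, 0.6, 0.4, 0.3, 0.2, 0.2, 0.2]),
                                   F=0.15, recruits=1200.0)
        print(catches[:5].round(1), round(n_final.sum(), 1))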

  17. A Strategy for Uncertainty Visualization Design

    DTIC Science & Technology

    2009-10-01

    (Fragmentary abstract) …and Stasko [20] to bridge analytic gaps in visualization design, when tasks in the strategy overlap (and therefore complement) design frameworks. Cited: Thomson, J., Hetzler, E., MacEachren, A., Gahegan, M. and Pavel, M. (2005), A Typology for Visualizing Uncertainty…, pp. 143–156, Magdeburg, Germany.

  18. Uncertainty, robustness, and the value of information in managing an expanding Arctic goose population

    USGS Publications Warehouse

    Johnson, Fred A.; Jensen, Gitte H.; Madsen, Jesper; Williams, Byron K.

    2014-01-01

    We explored the application of dynamic-optimization methods to the problem of pink-footed goose (Anser brachyrhynchus) management in western Europe. We were especially concerned with the extent to which uncertainty in population dynamics influenced an optimal management strategy, the gain in management performance that could be expected if uncertainty could be eliminated or reduced, and whether an adaptive or robust management strategy might be most appropriate in the face of uncertainty. We combined three alternative survival models with three alternative reproductive models to form a set of nine annual-cycle models for pink-footed geese. These models represent a wide range of possibilities concerning the extent to which demographic rates are density dependent or independent, and the extent to which they are influenced by spring temperatures. We calculated state-dependent harvest strategies for these models using stochastic dynamic programming and an objective function that maximized sustainable harvest, subject to a constraint on desired population size. As expected, attaining the largest mean objective value (i.e., the relative measure of management performance) depended on the ability to match a model-dependent optimal strategy with its generating model of population dynamics. The nine models suggested widely varying objective values regardless of the harvest strategy, with the density-independent models generally producing higher objective values than models with density-dependent survival. In the face of uncertainty as to which of the nine models is most appropriate, the optimal strategy assuming that both survival and reproduction were a function of goose abundance and spring temperatures maximized the expected minimum objective value (i.e., maxi–min). In contrast, the optimal strategy assuming equal model weights minimized the expected maximum loss in objective value. The expected value of eliminating model uncertainty was an increase in objective value of only 3.0%. This value represents the difference between the best that could be expected if the most appropriate model were known and the best that could be expected in the face of model uncertainty. The value of eliminating uncertainty about the survival process was substantially higher than that associated with the reproductive process, which is consistent with evidence that variation in survival is more important than variation in reproduction in relatively long-lived avian species. Comparing the expected objective value if the most appropriate model were known with that of the maxi–min robust strategy, we found the value of eliminating uncertainty to be an expected increase of 6.2% in objective value. This result underscores the conservatism of the maxi–min rule and suggests that risk-neutral managers would prefer the optimal strategy that maximizes expected value, which is also the strategy that is expected to minimize the maximum loss (i.e., a strategy based on equal model weights). The low value of information calculated for pink-footed geese suggests that a robust strategy (i.e., one in which no learning is anticipated) could be as nearly effective as an adaptive one (i.e., a strategy in which the relative credibility of models is assessed through time). Of course, an alternative explanation for the low value of information is that the set of population models we considered was too narrow to represent key uncertainties in population dynamics. 
Yet we know that questions about the presence of density dependence must be central to the development of a sustainable harvest strategy. And while there are potentially many environmental covariates that could help explain variation in survival or reproduction, our admission of models in which vital rates are drawn randomly from reasonable distributions represents a worst-case scenario for management. We suspect that much of the value of the various harvest strategies we calculated is derived from the fact that they are state dependent, such that appropriate harvest rates depend on population abundance and weather conditions, as well as our focus on an infinite time horizon for sustainability.
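
    For illustration, the sketch below shows how a maximin (robust) choice and an equal-model-weights expected-value choice are read off a strategy-by-model table of objective values; the payoff entries are hypothetical and are not the study's results.

        import numpy as np

        # Rows: candidate harvest strategies; columns: alternative population models.
        # Entries: hypothetical relative management performance values.
        payoff = np.array([
            [0.90, 0.55, 0.60],   # strategy tuned to a density-independent model
            [0.80, 0.70, 0.65],   # strategy assuming density-dependent survival and reproduction
            [0.75, 0.68, 0.72],   # strategy optimised under equal model weights
        ])
        weights = np.full(payoff.shape[1], 1.0 / payoff.shape[1])   # equal model credibility

        expected = payoff @ weights            # risk-neutral score of each strategy
        worst = payoff.min(axis=1)             # guaranteed (worst-case) score of each strategy

        print("risk-neutral choice :", int(np.argmax(expected)), expected.round(3))
        print("maximin choice      :", int(np.argmax(worst)), worst.round(3))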

  19. The uncertainty room: strategies for managing uncertainty in a surgical waiting room.

    PubMed

    Stone, Anne M; Lammers, John C

    2012-01-01

    To describe experiences of uncertainty and management strategies for staff working with families in a hospital waiting room. A 288-bed, nonprofit community hospital in a Midwestern city. Data were collected during individual, semistructured interviews with 3 volunteers, 3 technical staff members, and 1 circulating nurse (n = 7), and during 40 hours of observation in a surgical waiting room. Interview transcripts were analyzed using constant comparative techniques. The surgical waiting room represents the intersection of several sources of uncertainty that families experience. Findings also illustrate the ways in which staff manage the uncertainty of families in the waiting room by communicating support. Staff in surgical waiting rooms are responsible for managing family members' uncertainty related to insufficient information. Practically, this study provided some evidence that staff are expected to help manage the uncertainty that is typical in a surgical waiting room, further highlighting the important role of communication in improving family members' experiences.

  20. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    NASA Astrophysics Data System (ADS)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
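
    A stripped-down sketch of curvature-weighted refinement (ignoring the statistical-error and space-filling terms the authors also use) illustrates how new sample locations can be concentrated where the signal bends most; the test signal and parameters below are assumptions for demonstration only.

        import numpy as np

        def refine(x, y, n_new):
            # Propose n_new sample locations, favouring intervals of high local curvature.
            order = np.argsort(x)
            x, y = x[order], y[order]
            curv = np.abs(np.gradient(np.gradient(y, x), x))      # second-derivative magnitude
            # Weight each interval by the mean curvature of its end points, with a small floor
            # so that flat regions keep a nonzero chance of being sampled.
            w = 0.5 * (curv[:-1] + curv[1:]) + 1e-3 * curv.max()
            w = w / w.sum()
            counts = np.random.default_rng(0).multinomial(n_new, w)
            new_x = []
            for i, c in enumerate(counts):
                if c:
                    new_x.extend(np.linspace(x[i], x[i + 1], c + 2)[1:-1])   # interior points
            return np.array(new_x)

        # Coarse initial survey of a signal with a sharp feature near x = 0.3.
        x0 = np.linspace(0.0, 1.0, 12)
        y0 = np.tanh(40 * (x0 - 0.3)) + 0.2 * x0
        print(np.sort(refine(x0, y0, n_new=15)).round(3))   # new points cluster around x = 0.3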

  1. Effectiveness and cost-effectiveness of different immunization strategies against whooping cough to reduce child morbidity and mortality.

    PubMed

    Rivero-Santana, Amado; Cuéllar-Pompa, Leticia; Sánchez-Gómez, Luis M; Perestelo-Pérez, Lilisbeth; Serrano-Aguilar, Pedro

    2014-03-01

    In the last years there has been a significant increase in reported cases of pertussis in developed countries, in spite of high rates of childhood immunization. Health institutions have recommended different vaccination strategies to reduce child morbidity and mortality: vaccination of adolescents and adults, pregnant women, people in contact with the newborn (cocoon strategy) and health care workers. The aim of this paper is to review the scientific evidence supporting these recommendations. Systematic review on the effectiveness and cost-effectiveness of the above strategies for the reduction of morbidity and mortality from pertussis in infants under 12 months. The electronic databases Medline, PreMedline, Embase, CRD, Cochrane Central, and Trip Database were consulted from 1990 to October 2012. The evidence was assessed using the GRADE system. There were eight studies on the efficacy or safety of the strategies analyzed, and 18 economic evaluations. Direct evidence on the efficacy of these strategies is scarce. Economic evaluations suggest that vaccination of adolescents and adults would be cost-effective, although there is major uncertainty over the parameters used. From the perspective of health technology assessment, there is insufficient evidence to recommend the vaccination strategies evaluated. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. 'In-between' and other reasonable ways to deal with risk and uncertainty: A review article.

    PubMed

    Zinn, Jens O

    2016-11-16

    How people deal with risk and uncertainty has fuelled public and academic debate in recent decades. Researchers have shown that common distinctions between rational and 'irrational' strategies underestimate the complexity of how people approach an uncertain future. I suggested in 2008 that strategies in-between neither follow standards of instrumental rationality nor are 'irrational' but follow their own logic, which works well under particular circumstances. Strategies such as trust, intuition and emotion are an important part of the mix when people deal with risk and uncertainty. In this article, I develop my original argument. It explores in-between strategies to deal with possible undesired outcomes of decisions. I examine 'non-rational strategies' and in particular the notions of active, passive and reflexive hope. Furthermore, I argue that my original typology should be seen as a triangle of reasonable strategies which work well under specific circumstances. Finally, I highlight a number of different ways in which these strategies combine.

  3. Smart EV Energy Management System to Support Grid Services

    NASA Astrophysics Data System (ADS)

    Wang, Bin

    Under smart grid scenarios, advanced sensing and metering technologies have been applied to the legacy power grid to improve system observability and real-time situational awareness. Meanwhile, there is an increasing amount of distributed energy resources (DERs), such as renewable generation, electric vehicles (EVs) and battery energy storage systems (BESS), being integrated into the power system. However, the integration of EVs, which can be modeled as controllable mobile energy devices, brings both challenges and opportunities to grid planning and energy management, due to the intermittency of renewable generation, uncertainties of EV driver behaviors, and related factors. This dissertation aims to solve the real-time EV energy management problem in order to improve the overall grid efficiency, reliability and economics, using online and predictive optimization strategies. Most of the previous research on EV energy management strategies and algorithms is based on simplified models with unrealistic assumptions that the EV charging behaviors are perfectly known or follow known distributions, such as the arrival time, departure time and energy consumption values. These approaches fail to obtain the optimal solutions in real time because of the system uncertainties. Moreover, there is a lack of data-driven strategies that perform online and predictive scheduling of EV charging behaviors under microgrid scenarios. Therefore, we develop an online predictive EV scheduling framework, considering uncertainties of renewable generation, building load and EV driver behaviors, based on real-world data. A kernel-based estimator is developed to predict the charging session parameters in real time with improved estimation accuracy. The efficacy of various optimization strategies supported by this framework, including valley-filling, cost reduction and event-based control, has been demonstrated. In addition, the existing simulation-based approaches do not consider a variety of practical concerns of implementing such a smart EV energy management system, including driver preferences, communication protocols, data models, and customized integration of existing standards to provide grid services. Therefore, this dissertation also addresses these issues by designing and implementing a scalable system architecture to capture user preferences, enable multi-layer communication and control, and ultimately improve system reliability and interoperability.
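
    The dissertation abstract does not spell out the kernel-based estimator, so the sketch below shows one plausible reading: a Nadaraya-Watson kernel regression predicting session stay duration from arrival hour, on an invented history of charging sessions.

        import numpy as np

        def kernel_predict(x_query, x_hist, y_hist, bandwidth=1.0):
            # Nadaraya-Watson kernel regression with a Gaussian kernel.
            w = np.exp(-0.5 * ((x_hist - x_query) / bandwidth) ** 2)
            return float(np.sum(w * y_hist) / np.sum(w))

        # Hypothetical history: arrival hour of day -> stay duration in hours.
        arrival = np.array([7.5, 8.0, 8.2, 9.0, 12.5, 13.0, 17.5, 18.0, 18.3])
        duration = np.array([8.5, 8.0, 7.8, 7.0, 1.5, 2.0, 10.0, 11.0, 10.5])

        for t in (8.1, 13.2, 18.1):
            print(f"arrival {t:4.1f} h -> predicted stay {kernel_predict(t, arrival, duration):4.1f} h")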

  4. On the role of budget sufficiency, cost efficiency, and uncertainty in species management

    USGS Publications Warehouse

    van der Burg, Max Post; Bly, Bartholomew B.; Vercauteren, Tammy; Grand, James B.; Tyre, Andrew J.

    2014-01-01

    Many conservation planning frameworks rely on the assumption that one should prioritize locations for management actions based on the highest predicted conservation value (i.e., abundance, occupancy). This strategy may underperform relative to the expected outcome if one is working with a limited budget or the predicted responses are uncertain. Yet, cost and tolerance to uncertainty rarely become part of species management plans. We used field data and predictive models to simulate a decision problem involving western burrowing owls (Athene cunicularia hypugaea) using prairie dog colonies (Cynomys ludovicianus) in western Nebraska. We considered 2 species management strategies: one maximized abundance and the other maximized abundance in a cost-efficient way. We then used heuristic decision algorithms to compare the 2 strategies in terms of how well they met a hypothetical conservation objective. Finally, we performed an info-gap decision analysis to determine how these strategies performed under different budget constraints and uncertainty about owl response. Our results suggested that when budgets were sufficient to manage all sites, the maximizing strategy was optimal and suggested investing more in expensive actions. This pattern persisted for restricted budgets up to approximately 50% of the sufficient budget. Below this budget, the cost-efficient strategy was optimal and suggested investing in cheaper actions. When uncertainty in the expected responses was introduced, the strategy that maximized abundance remained robust under a sufficient budget. Reducing the budget induced a slight trade-off between expected performance and robustness, which suggested that the most robust strategy depended both on one's budget and tolerance to uncertainty. Our results suggest that wildlife managers should explicitly account for budget limitations and be realistic about their expected levels of performance.

  5. Advanced Stochastic Collocation Methods for Polynomial Chaos in RAVEN

    NASA Astrophysics Data System (ADS)

    Talbot, Paul W.

    As experiment complexity in fields such as nuclear engineering continually increases, so does the demand for robust computational methods to simulate them. In many simulations, input design parameters and intrinsic experiment properties are sources of uncertainty. Often small perturbations in uncertain parameters have significant impact on the experiment outcome. For instance, in nuclear fuel performance, small changes in fuel thermal conductivity can greatly affect maximum stress on the surrounding cladding. The difficulty quantifying input uncertainty impact in such systems has grown with the complexity of numerical models. Traditionally, uncertainty quantification has been approached using random sampling methods like Monte Carlo. For some models, the input parametric space and corresponding response output space is sufficiently explored with few low-cost calculations. For other models, it is computationally costly to obtain good understanding of the output space. To combat the expense of random sampling, this research explores the possibilities of using advanced methods in Stochastic Collocation for generalized Polynomial Chaos (SCgPC) as an alternative to traditional uncertainty quantification techniques such as Monte Carlo (MC) and Latin Hypercube Sampling (LHS) methods for applications in nuclear engineering. We consider traditional SCgPC construction strategies as well as truncated polynomial spaces using Total Degree and Hyperbolic Cross constructions. We also consider applying anisotropy (unequal treatment of different dimensions) to the polynomial space, and offer methods whereby optimal levels of anisotropy can be approximated. We contribute development to existing adaptive polynomial construction strategies. Finally, we consider High-Dimensional Model Reduction (HDMR) expansions, using SCgPC representations for the subspace terms, and contribute new adaptive methods to construct them. We apply these methods on a series of models of increasing complexity. We use analytic models of various levels of complexity, then demonstrate performance on two engineering-scale problems: a single-physics nuclear reactor neutronics problem, and a multiphysics fuel cell problem coupling fuels performance and neutronics. Lastly, we demonstrate sensitivity analysis for a time-dependent fuels performance problem. We demonstrate the application of all the algorithms in RAVEN, a production-level uncertainty quantification framework.
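
    The truncated polynomial spaces mentioned above are defined by multi-index sets. The sketch below enumerates total-degree and hyperbolic-cross index sets so the difference in size can be seen directly; the construction is the standard textbook one, not code from RAVEN.

        import math
        from itertools import product

        def total_degree(dim, p):
            # Multi-indices alpha with |alpha|_1 <= p (isotropic total-degree space).
            return [a for a in product(range(p + 1), repeat=dim) if sum(a) <= p]

        def hyperbolic_cross(dim, p):
            # Multi-indices with prod(alpha_i + 1) <= p + 1: keeps high univariate orders
            # but discards most mixed (interaction) terms.
            return [a for a in product(range(p + 1), repeat=dim)
                    if math.prod(ai + 1 for ai in a) <= p + 1]

        dim, p = 3, 4
        print(len(total_degree(dim, p)), len(hyperbolic_cross(dim, p)))   # 35 vs 16 basis terms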

  6. Assessment of technologies to meet a low carbon fuel standard.

    PubMed

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada.

  7. Stochastic DG Placement for Conservation Voltage Reduction Based on Multiple Replications Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhaoyu; Chen, Bokan; Wang, Jianhui

    2015-06-01

    Conservation voltage reduction (CVR) and distributed-generation (DG) integration are popular strategies implemented by utilities to improve energy efficiency. This paper investigates the interactions between CVR and DG placement to minimize load consumption in distribution networks, while keeping the lowest voltage level within the predefined range. The optimal placement of DG units is formulated as a stochastic optimization problem considering the uncertainty of DG outputs and load consumptions. A sample average approximation algorithm-based technique is developed to solve the formulated problem effectively. A multiple replications procedure is developed to test the stability of the solution and calculate the confidence interval of the gap between the candidate solution and optimal solution. The proposed method has been applied to the IEEE 37-bus distribution test system with different scenarios. The numerical results indicate that the implementations of CVR and DG, if combined, can achieve significant energy savings.
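
    The multiple replications procedure is summarized above only verbally. The sketch below applies the same idea to a toy newsvendor-style problem rather than the DG placement model (all costs and distributions are hypothetical): re-solve the sample-average problem on independent scenario batches and form a confidence interval on the optimality gap of a candidate solution.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        UNDER, OVER = 4.0, 1.0           # unit costs of under- and over-supply (hypothetical)

        def sample_cost(x, demand):
            return np.mean(UNDER * np.maximum(demand - x, 0.0) + OVER * np.maximum(x - demand, 0.0))

        def saa_solve(demand):
            # For this toy problem the sample-average optimum is (approximately) a demand quantile.
            return np.quantile(demand, UNDER / (UNDER + OVER))

        # Candidate solution from one scenario batch.
        x_hat = saa_solve(rng.lognormal(3.0, 0.4, size=200))

        # Multiple replications procedure: estimate the optimality gap of x_hat.
        M, N = 20, 200
        gaps = []
        for _ in range(M):
            d = rng.lognormal(3.0, 0.4, size=N)
            gaps.append(sample_cost(x_hat, d) - sample_cost(saa_solve(d), d))
        gaps = np.array(gaps)
        half = stats.t.ppf(0.95, M - 1) * gaps.std(ddof=1) / np.sqrt(M)
        print(f"gap estimate {gaps.mean():.3f}, 95% upper bound {gaps.mean() + half:.3f}")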

  8. Conservative strategy-based ensemble surrogate model for optimal groundwater remediation design at DNAPLs-contaminated sites

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo

    2017-08-01

    The surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted to compare with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty. (2) The ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.

  9. A decision-support tool to inform Australian strategies for preventing suicide and suicidal behaviour.

    PubMed

    Page, Andrew; Atkinson, Jo-An; Heffernan, Mark; McDonnell, Geoff; Hickie, Ian

    2017-04-27

    Dynamic simulation modelling is increasingly being recognised as a valuable decision-support tool to help guide investments and actions to address complex public health issues such as suicide. In particular, participatory system dynamics (SD) modelling provides a useful tool for asking high-level 'what if' questions, and testing the likely impacts of different combinations of policies and interventions at an aggregate level before they are implemented in the real world. We developed an SD model for suicide prevention in Australia, and investigated the hypothesised impacts over the next 10 years (2015-2025) of a combination of current intervention strategies proposed for population interventions in Australia: 1) general practitioner (GP) training, 2) coordinated aftercare in those who have attempted suicide, 3) school-based mental health literacy programs, 4) brief-contact interventions in hospital settings, and 5) psychosocial treatment approaches. Findings suggest that the largest reductions in suicide were associated with GP training (6%) and coordinated aftercare approaches (4%), with total reductions of 12% for all interventions combined. This paper highlights the value of dynamic modelling methods for managing complexity and uncertainty, and demonstrates their potential use as a decision-support tool for policy makers and program planners for community suicide prevention actions.

  10. Participatory Water Resources Modeling in a Water-Scarce Basin (Rio Sonora, Mexico) Reveals Uncertainty in Decision-Making

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Vivoni, E. R.; Halvorsen, K. E.; Kossak, D.

    2014-12-01

    The Rio Sonora Basin (RSB) in northwest Mexico has a semi-arid and highly variable climate along with urban and agricultural pressures on water resources. Three participatory modeling workshops were held in the RSB in spring 2013. A model of the water resources system, consisting of a watershed hydrology model, a model of the water infrastructure, and groundwater models, was developed deliberatively in the workshops, along with scenarios of future climate and development. Participants were asked to design water resources management strategies by choosing from a range of supply augmentation and demand reduction measures associated with water conservation. Participants assessed water supply reliability, measured as the average daily supply divided by daily demand for historical and future periods, by probing with the climate and development scenarios. Pre- and post-workshop-surveys were developed and administered, based on conceptual models of workshop participants' beliefs regarding modeling and local water resources. The survey results indicate that participants believed their modeling abilities increased and beliefs in the utility of models increased as a result of the workshops. The selected water resources strategies varied widely among participants. Wastewater reuse for industry and aquifer recharge were popular options, but significant numbers of participants thought that inter-basin transfers and desalination were viable. The majority of participants indicated that substantial increases in agricultural water efficiency could be achieved. On average, participants chose strategies that produce reliabilities over the historical and future periods of 95%, but more than 20% of participants were apparently satisfied with reliabilities lower than 80%. The wide range of strategies chosen and associated reliabilities indicate that there is a substantial degree of uncertainty in how future water resources decisions could be made in the region.

  11. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
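
    As one concrete realization of the string-search strategy listed above, the sketch below counts occurrences of a small, purely illustrative lexicon of hedge phrases and reports a hedges-per-1000-words score; a production system would use a curated lexicon and the NLP techniques the article mentions.

        import re
        from collections import Counter

        # Illustrative hedge phrases; a real lexicon would be curated by domain experts.
        UNCERTAINTY_TERMS = [
            "cannot exclude", "cannot rule out", "may represent", "possibly",
            "probably", "suspicious for", "likely", "equivocal", "indeterminate",
        ]

        def uncertainty_profile(report_text):
            text = report_text.lower()
            counts = Counter()
            for term in UNCERTAINTY_TERMS:
                counts[term] = len(re.findall(r"\b" + re.escape(term) + r"\b", text))
            n_words = max(len(text.split()), 1)
            score = 1000.0 * sum(counts.values()) / n_words    # hedges per 1000 words
            return counts, score

        report = ("Findings are suspicious for early pneumonia; a small effusion is possibly present. "
                  "Follow-up imaging is probably warranted.")
        counts, score = uncertainty_profile(report)
        print({k: v for k, v in counts.items() if v}, round(score, 1))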

  12. Effectiveness of mitigation measures in reducing future primary particulate matter emissions from on-road vehicle exhaust.

    PubMed

    Yan, Fang; Bond, Tami C; Streets, David G

    2014-12-16

    This work evaluates the effectiveness of on-road primary particulate matter emission reductions that can be achieved by long-term vehicle scrappage and retrofit measures on regional and global levels. Scenario analysis shows that scrappage can provide significant emission reductions as soon as the measures begin, whereas retrofit provides greater emission reductions in later years, when more advanced technologies become available in most regions. Reductions are compared with a baseline that already accounts for implementation of clean vehicle standards. The greatest global emission reductions from a scrappage program occur 5 to 10 years after its introduction and can reach as much as 70%. The greatest reductions with retrofit occur around 2030 and range from 16-31%. Monte Carlo simulations are used to evaluate how uncertainties in the composition of the vehicle fleet affect predicted reductions. Scrappage and retrofit reduce global emissions by 22-60% and 15-31%, respectively, within 95% confidence intervals, under a midrange scenario in the year 2030. The simulations provide guidance about which strategies are most effective for specific regions. Retrofit is preferable for high-income regions. For regions where early emission standards are in place, scrappage is suggested, followed by retrofit after more advanced emission standards are introduced. The early implementation of advanced emission standards is recommended for Western and Eastern Africa.

  13. Effectiveness of Mitigation Measures in Reducing Future Primary Particulate Matter Emissions from On-Road Vehicle Exhaust

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Fang; Bond, Tami C.; Streets, David G.

    This work evaluates the effectiveness of on-road primary particulate matter emission reductions that can be achieved by long-term vehicle scrappage and retrofit measures on regional and global levels. Scenario analysis shows that scrappage can provide significant emission reductions as soon as the measures begin, whereas retrofit provides greater emission reductions in later years, when more advanced technologies become available in most regions. Reductions are compared with a baseline that already accounts for implementation of clean vehicle standards. The greatest global emission reductions from a scrappage program occur 5 to 10 years after its introduction and can reach as much as 70%. The greatest reductions with retrofit occur around 2030 and range from 16-31%. Monte Carlo simulations are used to evaluate how uncertainties in the composition of the vehicle fleet affect predicted reductions. Scrappage and retrofit reduce global emissions by 22-60% and 15-31%, respectively, within 95% confidence intervals, under a midrange scenario in the year 2030. The simulations provide guidance about which strategies are most effective for specific regions. Retrofit is preferable for high-income regions. For regions where early emission standards are in place, scrappage is suggested, followed by retrofit after more advanced emission standards are introduced. The early implementation of advanced emission standards is recommended for Western and Eastern Africa.

  14. Uncertainties in the governance of animal disease: an interdisciplinary framework for analysis

    PubMed Central

    Fish, Robert; Austin, Zoe; Christley, Robert; Haygarth, Philip M.; Heathwaite, Louise A.; Latham, Sophia; Medd, William; Mort, Maggie; Oliver, David M.; Pickup, Roger; Wastling, Jonathan M.; Wynne, Brian

    2011-01-01

    Uncertainty is an inherent feature of strategies to contain animal disease. In this paper, an interdisciplinary framework for representing strategies of containment, and analysing how uncertainties are embedded and propagated through them, is developed and illustrated. Analysis centres on persistent, periodic and emerging disease threats, with a particular focus on cryptosporidiosis, foot and mouth disease and avian influenza. Uncertainty is shown to be produced at strategic, tactical and operational levels of containment, and across the different arenas of disease prevention, anticipation and alleviation. The paper argues for more critically reflexive assessments of uncertainty in containment policy and practice. An interdisciplinary approach has an important contribution to make, but is absent from current real-world containment policy. PMID:21624922

  15. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef M.; Coles, T.; Spantini, A.

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality "slow manifolds" on which a multiscale dynamical system evolves. Introducing uncertainty in this context raised fundamentally new issues, e.g., how is the topology of slow manifolds transformed by parametric uncertainty? How should dynamical models be constructed on these uncertain manifolds? To address these questions, we used stochastic spectral polynomial chaos (PC) methods to reformulate uncertain network models and analyzed them using CSP in probabilistic terms. Finding uncertain manifolds involved the solution of stochastic eigenvalue problems, facilitated by projection onto PC bases. These problems motivated us to explore the spectral properties of stochastic Galerkin systems. We also introduced novel methods for rank-reduction in stochastic eigensystems—transformations of an uncertain dynamical system that lead to lower storage and solution complexity. These technical accomplishments are detailed below. This report focuses on the MIT portion of the joint project.

  16. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.

  17. Reduction of the uncertainties in the water level-discharge relation of a 1D hydraulic model in the context of operational flood forecasting

    NASA Astrophysics Data System (ADS)

    Habert, J.; Ricci, S.; Le Pape, E.; Thual, O.; Piacentini, A.; Goutal, N.; Jonville, G.; Rochoux, M.

    2016-01-01

    This paper presents a data-driven hydrodynamic simulator based on a 1-D hydraulic solver dedicated to flood forecasting with lead times of one hour up to 24 h. The goal of the study is to reduce uncertainties in the hydraulic model and thus provide more reliable simulations and forecasts in real time for operational use by the national hydrometeorological flood forecasting center in France. Previous studies have shown that sequential assimilation of water level or discharge data makes it possible to adjust the inflows to the hydraulic network, resulting in a significant improvement of the discharge while leaving the water level state imperfect. Two strategies are proposed here to improve the water level-discharge relation in the model. First, a modeling strategy improves the description of the river bed geometry using topographic and bathymetric measurements. Second, an inverse modeling strategy locally corrects friction coefficients in the river bed and the flood plain through the assimilation of in situ water level measurements. This approach is based on an extended Kalman filter algorithm that sequentially assimilates data to infer first the upstream and lateral inflows and then the friction coefficients, providing a time-varying correction of the hydrological boundary conditions and hydraulic parameters. The merits of both strategies are demonstrated on the Marne catchment in France for eight validation flood events, and the January 2004 flood event is used as an illustrative example throughout the paper. With the data assimilation strategy, the Nash-Sutcliffe criterion for water level is improved from 0.135 to 0.832 for a 12-h forecast lead time. These developments have been implemented at the SAMA SPC (the local flood forecasting service in the Haute-Marne French department) and used for operational forecasting since 2013. They were shown to provide an efficient tool for evaluating flood risk and to improve the flood early warning system. Complementing the deterministic forecast of the hydraulic state, an uncertainty range is estimated using off-line and on-line diagnostics. The possibilities for further extending the control vector while limiting the computational cost and the equifinality problem are finally discussed.
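
    To make the filtering step concrete, the sketch below performs a single extended Kalman filter analysis of a friction coefficient against a gauged water level. It uses a zero-dimensional stand-in for the 1-D solver (a Manning relation for a wide rectangular channel), and the discharge, geometry, and gauge values are hypothetical, not those of the Marne catchment.

        import numpy as np

        # Toy observation operator: water depth from Manning's equation for a wide rectangular
        # channel, h = (n * Q / (B * sqrt(S)))**(3/5). The control variable is the friction n.
        Q, B, S = 350.0, 80.0, 2e-4          # discharge, width, slope (hypothetical)

        def h_of_n(n):
            return (n * Q / (B * np.sqrt(S))) ** 0.6

        def ekf_update(n_b, var_b, h_obs, var_obs):
            # One extended Kalman filter analysis of the friction coefficient.
            eps = 1e-4
            H = (h_of_n(n_b + eps) - h_of_n(n_b)) / eps        # linearised observation operator
            K = var_b * H / (H * var_b * H + var_obs)          # Kalman gain
            n_a = n_b + K * (h_obs - h_of_n(n_b))              # analysis
            var_a = (1.0 - K * H) * var_b
            return n_a, var_a

        n_b, var_b = 0.030, 0.005 ** 2       # background Manning coefficient and its variance
        h_obs, var_obs = 3.60, 0.05 ** 2     # gauged water level and observation-error variance
        n_a, var_a = ekf_update(n_b, var_b, h_obs, var_obs)
        print(f"background h = {h_of_n(n_b):.2f} m, analysed n = {n_a:.4f}, analysed h = {h_of_n(n_a):.2f} m")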

  18. Decision strategies for handling the uncertainty of future extreme rainfall under the influence of climate change.

    PubMed

    Gregersen, I B; Arnbjerg-Nielsen, K

    2012-01-01

    Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems in urban areas occurred as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of future extreme rainfall. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, and are best applied together, as they all yield information that improves decision making and thus enables more robust decisions.

  19. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by the adaptive surrogate-based multi-objective optimization procedure, using a MARS model for approximating the parameter-response relationship and the SCE-UA algorithm for searching the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40-85% reduction in 1-NSE, and 35-90% reduction in |RB|. Overall, this uncertainty quantification framework is robust, effective and efficient for parametric uncertainty analysis, the results of which provide useful information that helps to understand the model behaviors and improve the model simulations.
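
    The study quantifies first-order effects with MARS-based Sobol' indices. As a self-contained stand-in, the sketch below estimates first-order Sobol' indices by plain Monte Carlo (Saltelli's pick-freeze estimator) on the Ishigami test function rather than on the CREST model.

        import numpy as np

        rng = np.random.default_rng(0)

        def ishigami(x, a=7.0, b=0.1):
            return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

        def first_order_sobol(model, dim, n=100_000):
            # Pick-freeze estimator of first-order indices S_i = V(E[Y|X_i]) / V(Y).
            A = rng.uniform(-np.pi, np.pi, size=(n, dim))
            B = rng.uniform(-np.pi, np.pi, size=(n, dim))
            yA, yB = model(A), model(B)
            var_y = np.var(np.concatenate([yA, yB]))
            S = np.empty(dim)
            for i in range(dim):
                ABi = A.copy()
                ABi[:, i] = B[:, i]              # resample all inputs except the i-th
                S[i] = np.mean(yB * (model(ABi) - yA)) / var_y
            return S

        print(first_order_sobol(ishigami, dim=3).round(3))   # analytic values are about 0.31, 0.44, 0.00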

  20. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight time testing and increase flight safety. The success of ASE models is determined by the ability to take into account varying flight conditions and the possibility to perform flight monitoring under the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control relevant robust identification and model validation of aeroservoelastic structures. The closed-loop model robust identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  1. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    USDA-ARS?s Scientific Manuscript database

    Cumulative nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. This study used an agroecosystems simulation model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2...

  2. Which uncertainty? Using expert elicitation and expected value of information to design an adaptive program

    USGS Publications Warehouse

    Runge, Michael C.; Converse, Sarah J.; Lyons, James E.

    2011-01-01

    Natural resource management is plagued with uncertainty of many kinds, but not all uncertainties are equally important to resolve. The promise of adaptive management is that learning in the short-term will improve management in the long-term; that promise is best kept if the focus of learning is on those uncertainties that most impede achievement of management objectives. In this context, an existing tool of decision analysis, the expected value of perfect information (EVPI), is particularly valuable in identifying the most important uncertainties. Expert elicitation can be used to develop preliminary predictions of management response under a series of hypotheses, as well as prior weights for those hypotheses, and the EVPI can be used to determine how much management could improve if uncertainty was resolved. These methods were applied to management of whooping cranes (Grus americana), an endangered migratory bird that is being reintroduced in several places in North America. The Eastern Migratory Population of whooping cranes had exhibited almost no successful reproduction through 2009. Several dozen hypotheses can be advanced to explain this failure, and many of them lead to very different management responses. An expert panel articulated the hypotheses, provided prior weights for them, developed potential management strategies, and made predictions about the response of the population to each strategy under each hypothesis. Multi-criteria decision analysis identified a preferred strategy in the face of uncertainty, and analysis of the expected value of information identified how informative each strategy could be. These results provide the foundation for design of an adaptive management program.
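
    The expected value of perfect information described above can be computed directly from an elicited strategy-by-hypothesis table; the sketch below shows the calculation with illustrative placeholder numbers, not the panel's elicited values.

        import numpy as np

        # Expert-elicited predictions: rows = management strategies, columns = hypotheses
        # about the cause of reproductive failure. Entries are a value metric (hypothetical).
        value = np.array([
            [0.60, 0.20, 0.10],   # manage predators at the release site
            [0.30, 0.55, 0.15],   # change the rearing / release protocol
            [0.25, 0.30, 0.50],   # manage habitat on the breeding grounds
        ])
        prior = np.array([0.5, 0.3, 0.2])     # prior credibility of each hypothesis

        expected = value @ prior              # expected value of each strategy under uncertainty
        best_under_uncertainty = expected.max()
        best_if_known = float(np.dot(prior, value.max(axis=0)))   # best strategy per hypothesis

        evpi = best_if_known - best_under_uncertainty
        print(f"best strategy now: {int(expected.argmax())}, EVPI = {evpi:.3f}")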

  3. ‘In-between’ and other reasonable ways to deal with risk and uncertainty: A review article

    PubMed Central

    Zinn, Jens O.

    2016-01-01

    How people deal with risk and uncertainty has fuelled public and academic debate in recent decades. Researchers have shown that common distinctions between rational and ‘irrational’ strategies underestimate the complexity of how people approach an uncertain future. I suggested in 2008 that strategies in-between neither follow standards of instrumental rationality nor are ‘irrational’ but follow their own logic, which works well under particular circumstances. Strategies such as trust, intuition and emotion are an important part of the mix when people deal with risk and uncertainty. In this article, I develop my original argument. It explores in-between strategies to deal with possible undesired outcomes of decisions. I examine ‘non-rational strategies’ and in particular the notions of active, passive and reflexive hope. Furthermore, I argue that my original typology should be seen as a triangle of reasonable strategies which work well under specific circumstances. Finally, I highlight a number of different ways in which these strategies combine. PMID:28392747

  4. Using analogues to quantify geological uncertainty in stochastic reserve modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, B.; Brown, I.

    1995-08-01

    The petroleum industry seeks to minimize exploration risk by employing the best possible expertise, methods and tools. Is it possible to quantify the success of this process of risk reduction? Due to inherent uncertainty in predicting geological reality and due to changing environments for hydrocarbon exploration, it is not enough simply to record the proportion of successful wells drilled; in various parts of the world it has been noted that pseudo-random drilling would apparently have been as successful as the actual drilling programme. How, then, should we judge the success of risk reduction? For many years the E&P industry has routinely used Monte Carlo modelling to generate a probability distribution for prospect reserves. One aspect of Monte Carlo modelling which has received insufficient attention, but which is essential for quantifying risk reduction, is the consistency and repeatability with which predictions can be made. Reducing the subjective element inherent in the specification of geological uncertainty allows better quantification of uncertainty in the prediction of reserves, in both exploration and appraisal. Building on work reported at the AAPG annual conventions in 1994 and 1995, the present paper incorporates analogue information with uncertainty modelling. Analogues provide a major step forward in the quantification of risk, but their significance is potentially greater still. The two principal contributors to uncertainty in field and prospect analysis are the hydrocarbon life-cycle and the geometry of the trap. These are usually treated separately. Combining them into a single model is a major contribution to the reduction of risk. This work is based in part on a joint project with Oryx Energy UK Ltd., and thanks are due in particular to Richard Benmore and Mike Cooper.

  5. User's guide for ALEX: uncertainty propagation from raw data to final results for ORELA transmission measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1984-02-01

    This report describes a computer code (ALEX) developed to assist in AnaLysis of EXperimental data at the Oak Ridge Electron Linear Accelerator (ORELA). Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure; propagation of experimental uncertainties through that reduction procedure has in the past been viewed as even more difficult - if not impossible. The purpose of the code ALEX is to correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is required for the data reduction itself. This report describes ALEX in detail, with special attention given to the case of transmission measurements (the code itself is applicable, with few changes, to any type of data). Application to the natural iron measurements of D.C. Larson et al. is described in some detail.
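
    The generic operation an ALEX-style code performs, propagating raw-data covariances through a reduction function, can be sketched with the linear "sandwich" rule; the toy reduction and counting statistics below are illustrative assumptions, not the ORELA transmission procedure:

```python
# Hedged sketch of linear ("sandwich-rule") covariance propagation through a
# data-reduction function. The toy reduction below (counts -> transmission ->
# cross section) and its inputs are illustrative, not the ORELA procedure.
import numpy as np

def reduce_data(x, n_areal=0.05):
    """x = [sample_counts, open_counts, background]; returns a cross section."""
    sample, open_, bkg = x
    transmission = (sample - bkg) / (open_ - bkg)
    return -np.log(transmission) / n_areal

x = np.array([8.0e4, 1.2e5, 5.0e3])        # raw quantities (hypothetical)
cov_x = np.diag([8.0e4, 1.2e5, 2.5e3])     # e.g. counting statistics + background model

# numerical Jacobian of the reduction with respect to the raw inputs
eps = 1e-3 * np.abs(x)
jac = np.array([(reduce_data(x + dx) - reduce_data(x - dx)) / (2 * d)
                for d, dx in zip(eps, np.diag(eps))])

sigma = reduce_data(x)
var_sigma = jac @ cov_x @ jac               # propagated variance of the reduced quantity
print(f"cross section = {sigma:.4f} +/- {np.sqrt(var_sigma):.4f} (toy units)")
```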

  6. Experimental Research Examining How People Can Cope with Uncertainty Through Soft Haptic Sensations.

    PubMed

    van Horen, Femke; Mussweiler, Thomas

    2015-09-16

    Human beings are constantly surrounded by uncertainty and change. The question arises how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal events (e.g., economic crises, political revolutions, terrorism threats) whose impact on one's future life is impossible to judge, cognitive strategies (like seeking additional information) are likely to fail to combat uncertainty. Instead, the current paper discusses a method demonstrating that people might deal with uncertainty experientially through soft haptic sensations. More specifically, because touching something soft creates a feeling of comfort and security, people prefer objects with softer as compared to harder properties when feeling uncertain. Seeking softness is a highly efficient and effective tool for dealing with uncertainty, as our hands are available at all times. This protocol describes a set of methods demonstrating 1) how environmental (un)certainty can be situationally activated with an experiential priming procedure, 2) that the quality of the softness experience (what type of softness and how it is experienced) matters and 3) how uncertainty can be reduced using different methods.

  7. Waste in the U.S. Health Care System: A Conceptual Framework

    PubMed Central

    Bentley, Tanya G K; Effros, Rachel M; Palar, Kartika; Keeler, Emmett B

    2008-01-01

    Context Health care costs in the United States are much higher than those in industrial countries with similar or better health system performance. Wasteful spending has many undesirable consequences that could be alleviated through waste reduction. This article proposes a conceptual framework to guide researchers and policymakers in evaluating waste, implementing waste-reduction strategies, and reducing the burden of unnecessary health care spending. Methods This article divides health care waste into administrative, operational, and clinical waste and provides an overview of each. It explains how researchers have used both high-level and sector- or procedure-specific comparisons to quantify such waste, and it discusses examples and challenges in both waste measurement and waste reduction. Findings Waste is caused by factors such as health insurance and medical uncertainties that encourage the production of inefficient and low-value services. Various efforts to reduce such waste have encountered challenges, such as the high costs of initial investment, unintended administrative complexities, and trade-offs among patients', payers', and providers' interests. While categorizing waste may help identify and measure general types and sources of waste, successful reduction strategies must integrate the administrative, operational, and clinical components of care, and proceed by identifying goals, changing systemic incentives, and making specific process improvements. Conclusions Classifying, identifying, and measuring waste elucidate its causes, clarify systemic goals, and specify potential health care reforms that—by improving the market for health insurance and health care—will generate incentives for better efficiency and thus ultimately decrease waste in the U.S. health care system. PMID:19120983

  8. Can conservation funding be left to carbon finance? Evidence from participatory future land use scenarios in Peru, Indonesia, Tanzania, and Mexico

    NASA Astrophysics Data System (ADS)

    Ravikumar, Ashwin; Larjavaara, Markku; Larson, Anne; Kanninen, Markku

    2017-01-01

    Revenues derived from carbon have been seen as an important tool for supporting forest conservation over the past decade. At the same time, there is high uncertainty about how much revenue can reasonably be expected from land use emissions reductions initiatives. Despite this uncertainty, REDD+ projects and conservation initiatives that aim to take advantage of available or, more commonly, future funding from carbon markets have proliferated. This study used participatory multi-stakeholder workshops to develop divergent future scenarios of land use in eight landscapes in four countries around the world: Peru, Indonesia, Tanzania, and Mexico. The results of these future scenario building exercises were analyzed using a new tool, CarboScen, for calculating the landscape carbon storage implications of different future land use scenarios. The findings suggest that potential revenues from carbon storage or emissions reductions are significant in some landscapes (most notably the peat forests of Indonesia), and much less significant in others (such as the low-carbon forests of Zanzibar and the interior of Tanzania). The findings call into question the practicality of many conservation programs that hinge on expectations of future revenue from carbon finance. The future scenarios-based approach is useful to policy-makers and conservation program developers in distinguishing between landscapes where carbon finance can substantially support conservation, and landscapes where other strategies for conservation and land use should be prioritized.

  9. Cost-effectiveness analysis of salt reduction policies to reduce coronary heart disease in Syria, 2010-2020.

    PubMed

    Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim

    2015-01-01

    This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.

  10. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of the overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.

  11. Slepton pair production at the LHC in NLO+NLL with resummation-improved parton densities

    NASA Astrophysics Data System (ADS)

    Fiaschi, Juri; Klasen, Michael

    2018-03-01

    Novel PDFs taking into account resummation-improved matrix elements, albeit only in the fit of a reduced data set, allow for consistent NLO+NLL calculations of slepton pair production at the LHC. We apply a factorisation method to this process that minimises the effect of the data set reduction, avoids the problem of outlier replicas in the NNPDF method for PDF uncertainties and preserves the reduction of the scale uncertainty. For Run II of the LHC, left-handed selectron/smuon, right-handed and maximally mixed stau production, we confirm that the consistent use of threshold-improved PDFs partially compensates the resummation contributions in the matrix elements. Together with the reduction of the scale uncertainty at NLO+NLL, the described method further increases the reliability of slepton pair production cross sections at the LHC.

  12. The wicked problem of earthquake hazard in developing countries: the example of Bangladesh

    NASA Astrophysics Data System (ADS)

    Steckler, M. S.; Akhter, S. H.; Stein, S.; Seeber, L.

    2017-12-01

    Many developing nations in earthquake-prone areas confront a tough problem: how much of their limited resources to use mitigating earthquake hazards? This decision is difficult because it is unclear when an infrequent major earthquake may happen, how big it could be, and how much harm it may cause. This issue faces nations with profound immediate needs and ongoing rapid urbanization. Earthquake hazard mitigation in Bangladesh is a wicked problem. It is the world's most densely populated nation, with 160 million people in an area the size of Iowa. Complex geology and sparse data make assessing a possibly large earthquake hazard difficult. Hence it is hard to decide how much of the limited resources available should be used for earthquake hazard mitigation, given other more immediate needs. Per capita GDP is $1200; Bangladesh is committed to economic growth, and resources are needed to address many critical challenges and hazards. In their subtropical environment, rural Bangladeshis traditionally relied on modest mud or bamboo homes. Their rapidly growing, crowded capital, Dhaka, is filled with multistory concrete buildings likely to be vulnerable to earthquakes. The risk is compounded by the potential collapse of services and accessibility after a major temblor. However, extensive construction as the population shifts from rural to urban provides an opportunity for earthquake-risk reduction. While this situation seems daunting, it is not hopeless. Robust risk management is practical, even for developing nations. It involves recognizing uncertainties and developing policies that should give a reasonable outcome for a range of the possible hazard and loss scenarios. Over decades, Bangladesh has achieved a thousandfold reduction in risk from tropical cyclones by building shelters and setting up a warning system. Similar efforts are underway for earthquakes. Smart investments can be very effective, even if modest. Hence, we suggest strategies consistent with high uncertainty and limited resources. The most crucial steps are enforcing building codes and public education on earthquake risk reduction. Requiring moderate investments that increase building costs by 5-10% can substantially improve safety and is a cost-effective strategy. Over time, natural building turnover will make communities more resilient.

  13. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, different from previous modeling efforts: earlier work focused on addressing uncertainty in physical parameters (e.g., soil porosity), while this study aims to deal with uncertainty in the mathematical simulator (arising from model residuals). Compared to existing modeling approaches (where only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence levels of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  14. The MeteoMet2 project—highlights and results

    NASA Astrophysics Data System (ADS)

    Merlone, A.; Sanna, F.; Beges, G.; Bell, S.; Beltramino, G.; Bojkovski, J.; Brunet, M.; del Campo, D.; Castrillo, A.; Chiodo, N.; Colli, M.; Coppa, G.; Cuccaro, R.; Dobre, M.; Drnovsek, J.; Ebert, V.; Fernicola, V.; Garcia-Benadí, A.; Garcia-Izquierdo, C.; Gardiner, T.; Georgin, E.; Gonzalez, A.; Groselj, D.; Heinonen, M.; Hernandez, S.; Högström, R.; Hudoklin, D.; Kalemci, M.; Kowal, A.; Lanza, L.; Miao, P.; Musacchio, C.; Nielsen, J.; Nogueras-Cervera, M.; Oguz Aytekin, S.; Pavlasek, P.; de Podesta, M.; Rasmussen, M. K.; del-Río-Fernández, J.; Rosso, L.; Sairanen, H.; Salminen, J.; Sestan, D.; Šindelářová, L.; Smorgon, D.; Sparasci, F.; Strnad, R.; Underwood, R.; Uytun, A.; Voldan, M.

    2018-02-01

    Launched in 2011 within the European Metrology Research Programme (EMRP) of EURAMET, the joint research project ‘MeteoMet’—Metrology for Meteorology—is the largest EMRP consortium; national metrology institutes, universities, meteorological and climate agencies, research institutes, collaborators and manufacturers are working together, developing new metrological techniques, as well as improving existing ones, for use in meteorological observations and climate records. The project focuses on humidity in the upper and surface atmosphere, air temperature, surface and deep-sea temperatures, soil moisture, salinity, permafrost temperature, precipitation, and the snow albedo effect on air temperature. All tasks are performed using a rigorous metrological approach and include the design and study of new sensors, new calibration facilities, the investigation of sensor characteristics, improved techniques for measurements of essential climate variables with uncertainty evaluation, traceability, laboratory proficiency and the inclusion of field influencing parameters, long-lasting measurements, and campaigns in remote and extreme areas. The vision for MeteoMet is to take a step further towards establishing full data comparability, coherency, consistency, and long-term continuity, through a comprehensive evaluation of the measurement uncertainties for the quantities involved in the global climate observing systems and the derived observations. The improvement in quality of essential climate variables records, through the inclusion of measurement uncertainty budgets, will also highlight possible strategies for the reduction of the uncertainty. This contribution presents selected highlights of the MeteoMet project and reviews the main ongoing activities, tasks and deliverables, with a view to its possible future evolution and extended impact.

  15. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
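
    A variance-based first-order sensitivity index of the kind used in this line of work is S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates it by binning Monte Carlo samples for a toy temperature reduction function; the model and parameter ranges are stand-ins, not the paper's nitrogen model:

```python
# Hedged sketch: first-order variance-based sensitivity indices estimated by
# binning Monte Carlo samples, S_i = Var(E[Y|X_i]) / Var(Y). The toy model and
# ranges are illustrative; the nitrification/denitrification model is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
n, n_bins = 200_000, 50

# three uncertain parameters (hypothetical ranges)
k_den = rng.uniform(0.1, 1.0, n)     # optimal denitrification rate
t_ref = rng.uniform(10.0, 30.0, n)   # reference temperature
q10   = rng.uniform(1.5, 3.0, n)     # temperature coefficient

def model(k_den, t_ref, q10, temp=18.0):
    # toy "actual rate" = base rate times a temperature reduction function
    return k_den * q10 ** ((temp - t_ref) / 10.0)

y = model(k_den, t_ref, q10)

def first_order_index(x, y, n_bins):
    # Var(E[Y|X]) approximated by the variance of bin-conditional means
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    return cond_means.var() / y.var()

for name, x in [("k_den", k_den), ("t_ref", t_ref), ("q10", q10)]:
    print(f"S_{name} ~ {first_order_index(x, y, n_bins):.2f}")
```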

  16. Controller Strategies for Automation Tool Use under Varying Levels of Trajectory Prediction Uncertainty

    NASA Technical Reports Server (NTRS)

    Morey, Susan; Prevot, Thomas; Mercer, Joey; Martin, Lynne; Bienert, Nancy; Cabrall, Christopher; Hunt, Sarah; Homola, Jeffrey; Kraut, Joshua

    2013-01-01

    A human-in-the-loop simulation was conducted to examine the effects of varying levels of trajectory prediction uncertainty on air traffic controller workload and performance, as well as how strategies and the use of decision support tools change in response. This paper focuses on the strategies employed by two controllers from separate teams who worked in parallel but independently under identical conditions (airspace, arrival traffic, tools) with the goal of ensuring schedule conformance and safe separation for a dense arrival flow in en route airspace. Despite differences in strategy and methods, both controllers achieved high levels of schedule conformance and safe separation. Overall, results show that trajectory uncertainties introduced by wind and aircraft performance prediction errors do not affect the controllers' ability to manage traffic. Controller strategies were fairly robust to changes in error, though strategies were affected by the amount of delay to absorb (scheduled time of arrival minus estimated time of arrival). Using the results and observations, this paper proposes an ability to dynamically customize the display of information including delay time based on observed error to better accommodate different strategies and objectives.

  17. Sustainable Cost Models for mHealth at Scale: Modeling Program Data from m4RH Tanzania.

    PubMed

    Mangone, Emily R; Agarwal, Smisha; L'Engle, Kelly; Lasway, Christine; Zan, Trinity; van Beijma, Hajo; Orkis, Jennifer; Karam, Robert

    2016-01-01

    There is increasing evidence that mobile phone health interventions ("mHealth") can improve health behaviors and outcomes and are critically important in low-resource, low-access settings. However, the majority of mHealth programs in developing countries fail to reach scale. One reason may be the challenge of developing financially sustainable programs. The goal of this paper is to explore strategies for mHealth program sustainability and develop cost-recovery models for program implementers using 2014 operational program data from Mobile for Reproductive Health (m4RH), a national text-message (SMS) based health communication service in Tanzania. We delineated 2014 m4RH program costs and considered three strategies for cost-recovery for the m4RH program: user pay-for-service, SMS cost reduction, and strategic partnerships. These inputs were used to develop four different cost-recovery scenarios. The four scenarios leveraged strategic partnerships to reduce per-SMS program costs and create per-SMS program revenue and varied the structure for user financial contribution. Finally, we conducted break-even and uncertainty analyses to evaluate the costs and revenues of these models at the 2014 user volume (125,320) and at any possible break-even volume. In three of four scenarios, costs exceeded revenue by $94,596, $34,443, and $84,571 at the 2014 user volume. However, these costs represented large reductions (54%, 83%, and 58%, respectively) from the 2014 program cost of $203,475. Scenario four, in which the lowest per-SMS rate ($0.01 per SMS) was negotiated and users paid for all m4RH SMS sent or received, achieved a $5,660 profit at the 2014 user volume. A Monte Carlo uncertainty analysis demonstrated that break-even points were driven by user volume rather than variations in program costs. These results reveal that breaking even was only probable when all SMS costs were transferred to users and the lowest per-SMS cost was negotiated with telecom partners. While this strategy was sustainable for the implementer, a central concern is that health information may not reach those who are too poor to pay, limiting the program's reach and impact. Incorporating strategies presented here may make mHealth programs more appealing to funders and investors but need further consideration to balance sustainability, scale, and impact.
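
    The break-even logic described above amounts to finding the user volume at which per-user revenue covers per-user cost plus the fixed program cost, with Monte Carlo sampling of the uncertain unit figures. A minimal sketch with hypothetical cost and revenue inputs (not the actual m4RH figures):

```python
# Hedged sketch of the break-even arithmetic described in the abstract.
# The fixed cost and per-user figures below are hypothetical placeholders,
# not the actual m4RH scenario inputs.
import numpy as np

def break_even_volume(fixed_cost, revenue_per_user, cost_per_user):
    """Users needed for revenue to cover fixed plus variable cost."""
    margin = revenue_per_user - cost_per_user
    return np.inf if margin <= 0 else fixed_cost / margin

rng = np.random.default_rng(1)
fixed_cost = 60_000.0                               # USD per year, hypothetical
rev = rng.triangular(0.40, 0.60, 0.80, 10_000)      # USD per user, uncertain
cost = rng.triangular(0.30, 0.45, 0.70, 10_000)     # USD per user, uncertain

volumes = np.array([break_even_volume(fixed_cost, r, c) for r, c in zip(rev, cost)])
feasible = np.isfinite(volumes)
print(f"P(break-even possible) = {feasible.mean():.2f}")
print(f"Median break-even volume = {np.median(volumes[feasible]):,.0f} users")
```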

  18. Sustainable Cost Models for mHealth at Scale: Modeling Program Data from m4RH Tanzania

    PubMed Central

    Mangone, Emily R.; Agarwal, Smisha; L’Engle, Kelly; Lasway, Christine; Zan, Trinity; van Beijma, Hajo; Orkis, Jennifer; Karam, Robert

    2016-01-01

    Background There is increasing evidence that mobile phone health interventions (“mHealth”) can improve health behaviors and outcomes and are critically important in low-resource, low-access settings. However, the majority of mHealth programs in developing countries fail to reach scale. One reason may be the challenge of developing financially sustainable programs. The goal of this paper is to explore strategies for mHealth program sustainability and develop cost-recovery models for program implementers using 2014 operational program data from Mobile for Reproductive Health (m4RH), a national text-message (SMS) based health communication service in Tanzania. Methods We delineated 2014 m4RH program costs and considered three strategies for cost-recovery for the m4RH program: user pay-for-service, SMS cost reduction, and strategic partnerships. These inputs were used to develop four different cost-recovery scenarios. The four scenarios leveraged strategic partnerships to reduce per-SMS program costs and create per-SMS program revenue and varied the structure for user financial contribution. Finally, we conducted break-even and uncertainty analyses to evaluate the costs and revenues of these models at the 2014 user volume (125,320) and at any possible break-even volume. Results In three of four scenarios, costs exceeded revenue by $94,596, $34,443, and $84,571 at the 2014 user volume. However, these costs represented large reductions (54%, 83%, and 58%, respectively) from the 2014 program cost of $203,475. Scenario four, in which the lowest per-SMS rate ($0.01 per SMS) was negotiated and users paid for all m4RH SMS sent or received, achieved a $5,660 profit at the 2014 user volume. A Monte Carlo uncertainty analysis demonstrated that break-even points were driven by user volume rather than variations in program costs. Conclusions These results reveal that breaking even was only probable when all SMS costs were transferred to users and the lowest per-SMS cost was negotiated with telecom partners. While this strategy was sustainable for the implementer, a central concern is that health information may not reach those who are too poor to pay, limiting the program’s reach and impact. Incorporating strategies presented here may make mHealth programs more appealing to funders and investors but need further consideration to balance sustainability, scale, and impact. PMID:26824747

  19. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
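
    Of the three procedures named, the DerSimonian-Laird approach is the simplest to sketch: a moment estimate of the between-laboratory ("dark") variance followed by a reweighted mean. The values below are made up for illustration; the NIST Consensus Builder itself is a separate web application:

```python
# Hedged sketch of the DerSimonian-Laird consensus procedure named in the
# abstract, with made-up laboratory values and stated uncertainties.
import numpy as np

x = np.array([9.98, 10.05, 10.01, 9.90, 10.12])   # measured values per laboratory
u = np.array([0.03, 0.05, 0.02, 0.06, 0.04])      # stated standard uncertainties

w = 1.0 / u**2
x_fixed = np.sum(w * x) / np.sum(w)               # fixed-effect (weighted) mean

# DerSimonian-Laird moment estimate of the "dark uncertainty" tau^2
q = np.sum(w * (x - x_fixed) ** 2)
k = len(x)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# random-effects weights, consensus value, and its standard uncertainty
w_star = 1.0 / (u**2 + tau2)
consensus = np.sum(w_star * x) / np.sum(w_star)
u_consensus = np.sqrt(1.0 / np.sum(w_star))

print(f"tau = {np.sqrt(tau2):.3f}, consensus = {consensus:.3f} +/- {u_consensus:.3f}")
```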

  20. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.

  1. Anticipatory strategies of team-handball goalkeepers.

    PubMed

    Gutierrez-Davila, Marcos; Rojas, F Javier; Ortega, Manuel; Campos, Jose; Parraga, Juan

    2011-09-01

    This study seeks to discover whether handball goalkeepers employ a general anticipatory strategy when facing long distance throws and the effect of uncertainty on these strategies. Seven goalkeepers and four throwers took part. We used a force platform to analyse the goalkeeper's movements on the basis of reaction forces and two video cameras synchronised at 500 Hz to film the throw using 3D video techniques. The goalkeepers initiated their movement towards the side of the throw 193 ± 67 ms before the release of the ball and when the uncertainty was reduced the time increased to 349 ± 71 ms. The kinematics analysis of their centre of mass indicated that there was an anticipatory strategy of movement with certain modifications when there was greater uncertainty. All the average scores referring to velocity and lateral movement of the goalkeeper's centre of mass are significantly greater than those recorded for the experimental situation with bigger uncertainty. The methodology used has enabled us to tackle the study of anticipation from an analysis of the movement used by goalkeepers to save the ball.

  2. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty.

    PubMed

    Flores-Alsina, Xavier; Rodríguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2008-11-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (e.g. economic, environmental, technical and legal) must be taken into account at the same time, i.e. the evaluation of the alternatives is a multi-criteria problem. Activated sludge models are not well characterized and some of the parameters can present uncertainty, e.g. the influent fractions arriving at the facility and the effect of either temperature or toxic compounds on the kinetic parameters, which have a strong influence on the model predictions used during the evaluation of the alternatives and affect the resulting rank of preferences. Using a simplified version of the IWA Benchmark Simulation Model No. 2 as a case study, this article shows the variations in decision making when the uncertainty in activated sludge model (ASM) parameters is either included or not during the evaluation of WWTP control strategies. This paper comprises two main sections. First, six WWTP control strategies are evaluated using multi-criteria decision analysis with the ASM parameters set at their default values. In the following section, uncertainty is introduced, i.e. input uncertainty, which is characterized by probability distribution functions based on the available process knowledge. Next, Monte Carlo simulations are run to propagate input uncertainty through the model and quantify its effect on the different outcomes. Thus (i) the variation in the overall degree of satisfaction of the control objectives for the generated WWTP control strategies is quantified, (ii) the contributions of environmental, legal, technical and economic objectives to the existing variance are identified and finally (iii) the influence of the relative importance of the control objectives during the selection of alternatives is analyzed. The results show that the control strategies with an external carbon source reduce the output uncertainty in the criteria used to quantify the degree of satisfaction of environmental, technical and legal objectives, but increase the economic costs and their variability as a trade-off. It is also shown how a preliminarily selected alternative with a cascade ammonium controller becomes less desirable when input uncertainty is included, giving simpler alternatives a greater chance of success.
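
    The Monte Carlo step described above, sampling uncertain model inputs and examining the resulting spread of each alternative's criteria, can be sketched as follows; the toy plant response and input distributions are placeholders, not the Benchmark Simulation Model No. 2:

```python
# Hedged sketch of Monte Carlo propagation of input uncertainty into the
# criteria used to compare control strategies. The "plant model", strategy
# names, and distributions below are toy stand-ins, not BSM2 or ASM values.
import numpy as np

rng = np.random.default_rng(7)
n = 5_000

# uncertain inputs (hypothetical): an influent fraction and a kinetic parameter
influent_frac = rng.normal(0.25, 0.03, n)
mu_max = rng.normal(0.8, 0.1, n)

def evaluate(strategy_gain, influent_frac, mu_max):
    """Toy plant response: returns (effluent quality index, operating cost)."""
    quality = influent_frac / (mu_max * strategy_gain)   # lower is better
    cost = 100.0 * strategy_gain + 20.0 / mu_max         # higher gain, higher cost
    return quality, cost

for name, gain in [("baseline", 1.0), ("cascade NH4 control", 1.4), ("ext. carbon source", 1.8)]:
    q, c = evaluate(gain, influent_frac, mu_max)
    print(f"{name:20s} quality {q.mean():.3f} +/- {q.std():.3f}   "
          f"cost {c.mean():.1f} +/- {c.std():.1f}")
```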

  3. Chapter 8: Uncertainty assessment for quantifying greenhouse gas sources and sinks

    Treesearch

    Jay Breidt; Stephen M. Ogle; Wendy Powers; Coeli Hoover

    2014-01-01

    Quantifying the uncertainty of greenhouse gas (GHG) emissions and reductions from agriculture and forestry practices is an important aspect of decision-making for farmers, ranchers and forest landowners as the uncertainty range for each GHG estimate communicates our level of confidence that the estimate reflects the actual balance of GHG exchange between...

  4. Understanding the ways in which health visitors manage anxiety in cross-cultural work: a qualitative study.

    PubMed

    Cuthill, Fiona

    2014-10-01

    This paper is a report of part of a study that explored the ways in which health visitors manage uncertainty and anxiety when working with clients across cultures. Internationally, health care professionals are required to deliver a high standard of culturally appropriate care to increasingly diverse communities, and yet problems persist. Research evidence informing cultural 'competence' is focused largely around student experience, and consequently little is known about the day-to-day experiences of health professionals in diverse community settings. Anxiety and uncertainty are increasingly recognised as important emotions experienced by a variety of health care professionals when working across cultures, and yet the ways in which anxiety and uncertainty are managed in practice are less well understood. Grounded theory methodology was used and 21 semi-structured interviews were conducted with participating health visitors in the North East of England between May 2008 and September 2009. All participants described themselves as white. This study identified three different positions adopted by the health visitors to manage uncertainty and anxiety in their work across cultures. Identified as 'Fixing a culture', 'Reworking the equality agenda' and 'Asserting the professional self', these strategies identify the ways in which health visitors try to manage the uncertainty and anxiety they feel when working in diverse communities. All of these strategies attempt in different ways to negate cultural difference and to render culture as static and known. Given that health professionals report anxiety and uncertainty when working across diverse community settings, identification of the strategies used by health visitors to manage that anxiety is important for both policy and practice. New strategies need to be developed to help health professionals manage uncertainty and anxiety in ways that promote both culturally safe care and health equity.

  5. Relative effects of antiretroviral therapy and harm reduction initiatives on HIV incidence in British Columbia, Canada, 1996-2013: a modelling study.

    PubMed

    Nosyk, Bohdan; Zang, Xiao; Min, Jeong E; Krebs, Emanuel; Lima, Viviane D; Milloy, M-J; Shoveller, Jean; Barrios, Rolando; Harrigan, P Richard; Kerr, Thomas; Wood, Evan; Montaner, Julio S G

    2017-07-01

    Antiretroviral therapy (ART) and harm reduction services have been cited as key contributors to control of HIV epidemics; however, the specific contribution of ART has been questioned due to uncertainty of its true efficacy on HIV transmission through needle sharing. We aimed to isolate the independent effects of harm reduction services (opioid agonist treatment uptake and needle distribution volumes) and ART on HIV transmission via needle sharing in British Columbia, Canada, from 1996 to 2013. We used comprehensive linked individual health administrative and registry data for the population of diagnosed people living with HIV in British Columbia to populate a dynamic, compartmental transmission model to simulate the HIV/AIDS epidemic in British Columbia from 1996 to 2013. We estimated HIV incidence, mortality, and quality-adjusted life-years (QALYs). We also estimated scenarios designed to isolate the independent effects of harm reduction services and ART, assuming 50% (10-90%) efficacy, in reducing HIV incidence through needle sharing, and we investigated structural and parameter uncertainty. We estimate that 3204 (upper bound-lower bound 2402-4589) incident HIV cases were averted between 1996 and 2013 as a result of the combined effect of the expansion of harm reduction services and ART coverage on HIV transmission via needle sharing. In a hypothetical scenario assuming ART had zero effect on transmission through needle sharing, we estimated harm reduction services alone would have accounted for 77% (upper bound-lower bound 62-95%) of averted HIV incidence. In a separate hypothetical scenario where harm reduction services remained at 1996 levels, we estimated ART alone would have accounted for 44% (10-67%) of averted HIV incidence. As a result of high distribution volumes, needle distribution predominantly accounted for incidence reductions attributable to harm reduction but opioid agonist treatment provided substantially greater QALY gains. If the true efficacy of ART in preventing HIV transmission through needle sharing is closer to its efficacy in sexual transmission, ART's effect on incident cases averted could be greater than that of harm reduction. Nonetheless, harm reduction services had a vital role in reducing HIV incidence in British Columbia, and should be viewed as essential and cost-effective tools in combination implementation strategies to reduce the public health and economic burden of HIV/AIDS. BC Ministry of Health; National Institutes of Health (R01DA041747); Genome Canada (142HIV). Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. By using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport with simultaneous consideration of these three uncertainties. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that, for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to the soil moisture status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, the changing importance of nitrification activity with respect to temperature change depends strongly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution by reducing the possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); and the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not temperature, plays a predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting the important parameters when future temperature and soil moisture carry uncertainties or when modelers are faced with multiple ways of establishing nitrogen models.

  7. DNAPL distribution in the source zone: Effect of soil structure and uncertainty reduction with increased sampling density

    NASA Astrophysics Data System (ADS)

    Pantazidou, Marina; Liu, Ke

    2008-02-01

    This paper focuses on parameters describing the distribution of dense nonaqueous phase liquid (DNAPL) contaminants and investigates the variability of these parameters that results from soil heterogeneity. In addition, it quantifies the uncertainty reduction that can be achieved with increased density of soil sampling. Numerical simulations of DNAPL releases were performed using stochastic realizations of hydraulic conductivity fields generated with the same geostatistical parameters and conditioning data at two sampling densities, thus generating two simulation ensembles of low and high density (three-fold increase) of soil sampling. The results showed that DNAPL plumes in aquifers identical in a statistical sense exhibit qualitatively different patterns, ranging from compact to finger-like. The corresponding quantitative differences were expressed by defining several alternative measures that describe the DNAPL plume and computing these measures for each simulation of the two ensembles. The uncertainty in the plume features under study was affected to different degrees by the variability of the soil, with coefficients of variation ranging from about 20% to 90%, for the low-density sampling. Meanwhile, the increased soil sampling frequency resulted in reductions of uncertainty varying from 7% to 69%, for low- and high-uncertainty variables, respectively. In view of the varying uncertainty in the characteristics of a DNAPL plume, remedial designs that require estimates of the less uncertain features of the plume may be preferred over others that need a more detailed characterization of the source zone architecture.

  8. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    NASA Astrophysics Data System (ADS)

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
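
    The error ellipse itself follows from the eigendecomposition of the 2x2 positional covariance matrix: the eigenvectors give the orientation and the square roots of the scaled eigenvalues give the semi-axes. A brief sketch with hypothetical covariance values:

```python
# Hedged sketch of the standard error-ellipse construction from a 2x2 point
# covariance matrix. The covariance values are hypothetical.
import numpy as np

cov = np.array([[4.0, 1.5],
                [1.5, 1.0]])        # positional covariance, map units^2
confidence_scale = 2.4477           # ~ sqrt(chi2 quantile at 95%, df=2)

eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
order = eigvals.argsort()[::-1]                 # reorder to descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

semi_major = confidence_scale * np.sqrt(eigvals[0])
semi_minor = confidence_scale * np.sqrt(eigvals[1])
orientation = np.degrees(np.arctan2(eigvecs[1, 0], eigvecs[0, 0]))

print(f"95% ellipse: a = {semi_major:.2f}, b = {semi_minor:.2f}, "
      f"orientation = {orientation:.1f} deg")
```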

  9. Resolving structural uncertainty in natural resources management using POMDP approaches

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    In recent years there has been a growing focus on the uncertainties of natural resources management, and the importance of accounting for uncertainty in assessing management effectiveness. This paper focuses on uncertainty in resource management in terms of discrete-state Markov decision processes (MDP) under structural uncertainty and partial observability. It describes the treatment of structural uncertainty with approaches developed for partially observable resource systems. In particular, I show how value iteration for partially observable MDPs (POMDP) can be extended to structurally uncertain MDPs. A key difference between these process classes is that structurally uncertain MDPs require the tracking of system state as well as a probability structure for the structural uncertainty, whereas POMDPs require only a probability structure for the observation uncertainty. The added complexity of the optimization problem under structural uncertainty is compensated by reduced dimensionality in the search for the optimal strategy. A solution algorithm for structurally uncertain processes is outlined for a simple example in conservation biology. By building on the conceptual framework developed for POMDPs, natural resource analysts and decision makers who confront structural uncertainties in natural resources can take advantage of the rapid growth in POMDP methods and approaches, and thereby produce better conservation strategies over a larger class of resource problems. © 2011.

  10. Disturbance observer based model predictive control for accurate atmospheric entry of spacecraft

    NASA Astrophysics Data System (ADS)

    Wu, Chao; Yang, Jun; Li, Shihua; Li, Qi; Guo, Lei

    2018-05-01

    Facing the complex aerodynamic environment of Mars atmosphere, a composite atmospheric entry trajectory tracking strategy is investigated in this paper. External disturbances, initial states uncertainties and aerodynamic parameters uncertainties are the main problems. The composite strategy is designed to solve these problems and improve the accuracy of Mars atmospheric entry. This strategy includes a model predictive control for optimized trajectory tracking performance, as well as a disturbance observer based feedforward compensation for external disturbances and uncertainties attenuation. 500-run Monte Carlo simulations show that the proposed composite control scheme achieves more precise Mars atmospheric entry (3.8 km parachute deployment point distribution error) than the baseline control scheme (8.4 km) and integral control scheme (5.8 km).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.

    Severe accident management can be defined as the use of existing and/or alternative resources, systems and actors to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates such criteria as feasibility, effectiveness, and adverse effects, for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization and PWR feed and bleed.

  12. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.

  13. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    USGS Publications Warehouse

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate that science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Board's Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms.
This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk, which allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is ongoing. As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer

  14. Changes in intolerance of uncertainty during cognitive behavior group therapy for social phobia.

    PubMed

    Mahoney, Alison E J; McEvoy, Peter M

    2012-06-01

    Recent research suggests that intolerance of uncertainty (IU), most commonly associated with generalized anxiety disorder, also contributes to symptoms of social phobia. This study examines the relationship between IU and social anxiety symptoms across treatment. Changes in IU, social anxiety symptoms, and depression symptoms were examined following cognitive behavior group therapy (CBGT) for social phobia (N=32). CBGT led to significant improvements in symptoms of social anxiety and depression, as well as reductions in IU. Reductions in IU were associated with reductions in social anxiety but were unrelated to improvements in depression symptoms. Reductions in IU were predictive of post-treatment social phobia symptoms after controlling for pre-treatment social phobia symptoms and changes in depression symptoms following treatment. The relationship between IU and social anxiety requires further examination within experimental and longitudinal designs, and needs to take into account additional constructs that are thought to maintain social phobia. Current findings suggest that enhancing tolerance of uncertainty may play a role in the optimal management of social phobia. Theoretical and clinical implications are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Probabilistic Description of the Hydrologic Risk in Agriculture

    NASA Astrophysics Data System (ADS)

    Vico, G.; Porporato, A. M.

    2011-12-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climatic variability on agroecosystem productivity and profitability, at the expense of increasing water requirements for irrigation purposes. Optimizing water allocation for crop yield preservation and sustainable development needs to account for hydro-climatic variability, which is by far the main source of uncertainty affecting crop yields and irrigation water requirements. In this contribution, a widely applicable probabilistic framework is proposed to quantitatively define the hydrologic risk of yield reduction for both rainfed and irrigated agriculture. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season. Based on these linkages, long-term and real-time yield reduction risk indices are defined as a function of climate, soil and crop parameters, as well as irrigation strategy. The former risk index is suitable for long-term irrigation strategy assessment and investment planning, while the latter provides a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season. This probabilistic framework also allows assessing the impact of limited water availability on crop yield, thus guiding the optimal allocation of water resources for human and environmental needs. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios, thus facilitating the assessment of the impact of increasingly frequent water shortages on agricultural productivity, profitability, and sustainability.

  16. Optimization Under Uncertainty for Wake Steering Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N

    Offsetting turbines' yaw orientations from the incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow-direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, with ten rotor diameters of spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine. The OUU solution generally prefers less steering. We then perform this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examined how different levels of uncertainty in the inflow direction affect the ratio of the expected values of the deterministic and OUU solutions for steering strategies in the large wind farm, assuming the directional uncertainty used to reach the OUU solution (this ratio is defined as the value of the stochastic solution, or VSS).
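
    The sketch below is a hedged, toy version of the two-turbine intuition described in the abstract, not the authors' wake-model setup: a surrogate power function of front-turbine yaw and inflow direction is optimized deterministically and then for its expected value under Gaussian inflow-direction uncertainty (the expectation is taken with Gauss-Hermite quadrature as a one-dimensional stand-in for a polynomial chaos treatment). The wake-deflection scaling, wake width, and all constants are invented for illustration; consistent with the abstract, the OUU optimum comes out at less yaw than the deterministic optimum.

```python
# Toy surrogate; all constants are illustrative assumptions.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def farm_power(yaw_deg, winddir_deg):
    """Toy normalized power of a two-turbine row with the front turbine yawed."""
    front = np.cos(np.radians(yaw_deg)) ** 3               # cosine-cubed yaw loss
    # lateral wake offset at the rear rotor from yaw steering and inflow misalignment
    offset = 0.3 * yaw_deg + 1.0 * winddir_deg             # toy scaling, rotor diameters
    rear = 1.0 - 0.4 * np.exp(-0.5 * (offset / 3.0) ** 2)  # Gaussian wake deficit
    return front + rear

def expected_power(yaw_deg, sigma_dir_deg, n_quad=41):
    """E[power] for winddir ~ N(0, sigma^2), via Gauss-Hermite quadrature."""
    nodes, weights = hermegauss(n_quad)                    # weight function exp(-x^2/2)
    return np.sum(weights * farm_power(yaw_deg, sigma_dir_deg * nodes)) / np.sqrt(2 * np.pi)

yaws = np.linspace(0.0, 30.0, 301)
det_opt = yaws[np.argmax([farm_power(y, 0.0) for y in yaws])]
ouu_opt = yaws[np.argmax([expected_power(y, sigma_dir_deg=5.0) for y in yaws])]
print(f"deterministic optimum yaw: {det_opt:.1f} deg; OUU optimum yaw: {ouu_opt:.1f} deg")
```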

  17. Constructing (un-)certainty: An exploration of journalistic decision-making in the reporting of neuroscience.

    PubMed

    Lehmkuhl, Markus; Peters, Hans Peter

    2016-11-01

    Based on 21 individual case studies, this article inventories the ways journalism deals with scientific uncertainty. The study identifies the decisions that impact a journalist's perception of a truth claim as unambiguous or ambiguous and the strategies to deal with uncertainty that arise from this perception. Key to understanding journalistic action is the outcome of three evaluations: What is the story about? How shall the story be told? What type of story is it? We reconstructed the strategies used to overcome journalistic decision-making uncertainty in those cases in which journalists perceived scientific contingency as a problem. Journalism deals with uncertainty by way of omission, by contrasting the conflicting messages, or by acknowledging the problem via structure or language. One finding deserves particular mention: the lack of focus on scientific uncertainty is not only a problem of how journalists perceive and communicate but also a problem of how science communicates. © The Author(s) 2016.

  18. [Social learning as an uncertainty-reduction strategy: an adaptationist approach].

    PubMed

    Nakanishi, Daisuke; Kameda, Tatsuya; Shinada, Mizuho

    2003-04-01

    Social learning is an effective mechanism for reducing uncertainty about environmental knowledge, helping individuals adopt adaptive behavior in the environment at small cost. Although this is evident for learning about temporally stable targets (e.g., acquiring avoidance of toxic foods culturally), the functional value of social learning in a temporally unstable environment is less clear; knowledge acquired by social learning may be outdated. This paper empirically addressed the adaptive value of social learning in a non-stationary environment. When individual learning about the non-stationary environment is costly, a hawk-dove-game-like equilibrium is expected to emerge in the population, in which members who engage in costly individual learning and members who skip the information search and free-ride on other members' search efforts coexist at a stable ratio. Such a "producer-scrounger" structure should severely constrain the effectiveness of social/cultural learning, especially of the "conformity bias" in the use of social information (Boyd & Richerson, 1985). We tested these predictions in an experiment implementing a non-stationary uncertain environment in a laboratory. The results supported our thesis. Implications of these findings and some future directions are discussed.

  19. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  20. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  1. The influence of weight-of-evidence strategies on audience perceptions of (un)certainty when media cover contested science.

    PubMed

    Kohl, Patrice Ann; Kim, Soo Yun; Peng, Yilang; Akin, Heather; Koh, Eun Jeong; Howell, Allison; Dunwoody, Sharon

    2016-11-01

    Controversy in science news accounts attracts audiences and draws attention to important science issues. But sometimes covering multiple sides of a science issue does the audience a disservice. Counterbalancing a truth claim backed by strong scientific support with a poorly backed argument can unnecessarily heighten audience perceptions of uncertainty. At the same time, journalistic norms often constrain reporters to "get both sides of the story" even when there is little debate in the scientific community about which truth claim is most valid. In this study, we look at whether highlighting the way in which experts are arrayed across truth claims-a strategy we label "weight-of-evidence reporting"-can attenuate heightened perceptions of uncertainty that can result from coverage of conflicting claims. The results of our study suggest weight-of-evidence strategies can indeed play a role in reducing some of the uncertainty audiences may perceive when encountering lop-sided truth claims. © The Author(s) 2015.

  2. Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness

    NASA Astrophysics Data System (ADS)

    Irias, X.; Cicala, D.

    2013-12-01

    Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is entirely dependent on four potentially fragile water transmission mains for its day-to-day water supply. Using info-gap analysis, EBMUD is evaluating competing strategies for providing water supply to the island, for example submarine pipelines versus tunnels. The analysis considers not only the likely or 'average' results for each strategy, but also the worst-case performance of each strategy under varying levels of uncertainty. This analysis is improving the quality of the planning process, since it can identify strategies that ensure minimal disruption of water supply following a major earthquake, even if the earthquake and resulting damage fail to conform to our expectations. Results to date are presented, including a discussion of how info-gap analysis complements existing tools for comparing alternative strategies, and how info-gap improves our ability to quantify our tolerance for uncertainty.
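
    As a rough illustration of the info-gap logic described here (and not EBMUD's actual models or numbers), the sketch below computes robustness values for two hypothetical supply strategies: the robustness of a strategy is the largest "horizon of uncertainty" for which its worst-case outage still meets a service requirement. Strategy names, nominal outages, and error sensitivities are invented; the example is chosen so that the preferred strategy changes as the requirement is relaxed, which is the kind of insight an info-gap analysis is meant to surface.

```python
# Invented numbers for illustration; not EBMUD's analysis.
import numpy as np

# nominal (best-estimate) post-earthquake outage in days, and sensitivity to model error
NOMINAL_OUTAGE = {"submarine_pipelines": 12.0, "tunnel": 18.0}
ERROR_GAIN     = {"submarine_pipelines": 30.0, "tunnel": 6.0}

def worst_case_outage(strategy, alpha):
    """Worst outage over a fractional-error uncertainty set of size alpha."""
    return NOMINAL_OUTAGE[strategy] + alpha * ERROR_GAIN[strategy]

def robustness(strategy, max_acceptable_outage, alphas=np.linspace(0.0, 3.0, 3001)):
    """Largest alpha whose worst case still satisfies the requirement."""
    ok = worst_case_outage(strategy, alphas) <= max_acceptable_outage
    return float(alphas[ok][-1]) if ok.any() else 0.0

for requirement in (15.0, 25.0, 40.0):
    curve = {s: robustness(s, requirement) for s in NOMINAL_OUTAGE}
    print(f"requirement <= {requirement:4.0f} outage days -> robustness:", curve)
```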

  3. Recent changes in terrestrial water storage in the Upper Nile Basin: an evaluation of commonly used gridded GRACE products

    NASA Astrophysics Data System (ADS)

    Shamsudduha, Mohammad; Taylor, Richard G.; Jones, Darren; Longuevergne, Laurent; Owor, Michael; Tindimugaya, Callist

    2017-09-01

    GRACE (Gravity Recovery and Climate Experiment) satellite data monitor large-scale changes in total terrestrial water storage (ΔTWS), providing an invaluable tool where in situ observations are limited. Substantial uncertainty remains, however, in the amplitude of GRACE gravity signals and the disaggregation of TWS into individual terrestrial water stores (e.g. groundwater storage). Here, we test the phase and amplitude of three GRACE ΔTWS signals from five commonly used gridded products (i.e. NASA's GRCTellus: CSR, JPL, GFZ; JPL-Mascons; GRGS GRACE) using in situ data and modelled soil moisture from the Global Land Data Assimilation System (GLDAS) in two sub-basins (LVB: Lake Victoria Basin; LKB: Lake Kyoga Basin) of the Upper Nile Basin. The analysis extends from January 2003 to December 2012, but focuses on a large and accurately observed reduction in ΔTWS of 83 km3 from 2003 to 2006 in the Lake Victoria Basin. We reveal substantial variability in current GRACE products to quantify the reduction of ΔTWS in Lake Victoria that ranges from 80 km3 (JPL-Mascons) to 69 and 31 km3 for GRGS and GRCTellus respectively. Representation of the phase in TWS in the Upper Nile Basin by GRACE products varies but is generally robust with GRGS, JPL-Mascons, and GRCTellus (ensemble mean of CSR, JPL, and GFZ time-series data), explaining 90, 84, and 75 % of the variance respectively in "in situ" or "bottom-up" ΔTWS in the LVB. Resolution of changes in groundwater storage (ΔGWS) from GRACE ΔTWS is greatly constrained by both uncertainty in changes in soil-moisture storage (ΔSMS) modelled by GLDAS LSMs (CLM, NOAH, VIC) and the low annual amplitudes in ΔGWS (e.g. 1.8-4.9 cm) observed in deeply weathered crystalline rocks underlying the Upper Nile Basin. Our study highlights the substantial uncertainty in the amplitude of ΔTWS that can result from different data-processing strategies in commonly used, gridded GRACE products; this uncertainty is disregarded in analyses of ΔTWS and individual stores applying a single GRACE product.
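
    For readers unfamiliar with the disaggregation step the abstract refers to, the following is a minimal sketch of the standard water-budget residual approach, using placeholder arrays rather than the study's data: groundwater storage anomalies are what remains of GRACE ΔTWS after subtracting modelled soil-moisture and observed surface-water anomalies, and the residual inherits the (here assumed independent) uncertainties of all three terms.

```python
# Placeholder arrays and uncertainties; not the Upper Nile Basin data.
import numpy as np

rng = np.random.default_rng(1)
months = 12
dTWS = rng.normal(0.0, 5.0, months)               # GRACE anomaly (cm equivalent water)
dSMS = 0.6 * dTWS + rng.normal(0.0, 1.0, months)  # GLDAS-modelled soil-moisture anomaly
dSWS = 0.2 * dTWS                                 # surface-water anomaly from lake levels

dGWS = dTWS - dSMS - dSWS                         # groundwater storage anomaly (residual)

sigma = np.sqrt(1.5**2 + 1.0**2 + 0.5**2)         # illustrative per-term uncertainties (cm)
print("dGWS (cm):", np.round(dGWS, 2))
print("approximate per-month residual uncertainty (cm):", round(sigma, 2))
```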

  4. Are head-to-head trials of biologics needed? The role of value of information methods in arthritis research.

    PubMed

    Welton, Nicky J; Madan, Jason; Ades, Anthony E

    2011-09-01

    Reimbursement decisions are typically based on cost-effectiveness analyses. While a cost-effectiveness analysis can identify the optimum strategy, there is usually some degree of uncertainty around this decision. Sources of uncertainty include statistical sampling error in treatment efficacy measures, underlying baseline risk, utility measures and costs, as well as uncertainty in the structure of the model. The optimal strategy is therefore only optimal on average, and a decision to adopt this strategy might still be the wrong decision if all uncertainty could be eliminated. This means that there is a quantifiable expected (average) loss attaching to decisions made under uncertainty, and hence a value in collecting information to reduce that uncertainty. Value of information (VOI) analyses can be used to provide guidance on whether more research would be cost-effective, which particular model inputs (parameters) have the most bearing on decision uncertainty, and can also help with the design and sample size of further research. Here, we introduce the key concepts in VOI analyses, and highlight the inputs required to calculate it. The adoption of the new biologic treatments for RA and PsA tends to be based on placebo-controlled trials. We discuss the possible role of VOI analyses in deciding whether head-to-head comparisons of the biologic therapies should be carried out, illustrating with examples from other fields. We emphasize the need for a model of the natural history of RA and PsA, which reflects a consensus view.
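
    A minimal sketch of the core VOI quantity discussed above is given below: the per-patient expected value of perfect information (EVPI) computed from probabilistic sensitivity analysis (PSA) samples of net benefit. The strategies, distributions, and willingness-to-pay threshold are illustrative placeholders, not values from an RA or PsA model.

```python
# Simulated placeholder PSA output; illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
n_samples, wtp = 10_000, 20_000           # PSA draws; willingness to pay per QALY

# hypothetical PSA output for three strategies: incremental QALYs and costs vs. baseline
qalys = rng.normal([0.0, 0.15, 0.22], [0.0, 0.08, 0.10], size=(n_samples, 3))
costs = rng.normal([0.0, 2000.0, 4500.0], [0.0, 400.0, 900.0], size=(n_samples, 3))
net_benefit = wtp * qalys - costs         # monetary net benefit, per draw and strategy

enb_current = net_benefit.mean(axis=0).max()     # expected value of deciding now
enb_perfect = net_benefit.max(axis=1).mean()     # expected value with uncertainty resolved
print(f"optimal strategy now: {net_benefit.mean(axis=0).argmax()}, "
      f"per-patient EVPI: {enb_perfect - enb_current:,.0f}")
```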

  5. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Treesearch

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  6. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: (1) evaluation of uncertainty in model-based assessments; (2) collection of field data; and (3) assessment of sites using EPA and state protocols.

  7. Inviting Uncertainty into the Classroom

    ERIC Educational Resources Information Center

    Beghetto, Ronald A.

    2017-01-01

    Most teachers try to avoid having students experience uncertainty in their schoolwork. But if we want to prepare students to tackle complex problems (and the uncertainty that accompanies such problems), we must give them learning experiences that involve feeling unsure and sometimes even confused. Beghetto presents five strategies that help…

  8. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  9. [Benefit-risk assessment of vaccination strategies].

    PubMed

    Hanslik, Thomas; Boëlle, Pierre Yves

    2007-04-01

    This article summarises the various stages of the risk/benefit assessment of vaccination strategies. Establishing the awaited effectiveness of a vaccination strategy supposes to have an epidemiologic description of the disease to be prevented. The effectiveness of the vaccine strategy will be thus expressed in numbers of cases, hospitalizations or deaths avoided. The effectiveness can be direct, expressed as the reduction of the incidence of the infectious disease in the vaccinated subjects compared to unvaccinated subjects. It can also be indirect, the unvaccinated persons being protected by the suspension in circulation of the pathogenic agent, consecutive to the implementation of the vaccination campaign. The risks of vaccination related to the adverse effects detected during the clinical trials preceding marketing are well quantified, but other risks can occur after marketing: e.g., serious and unexpected adverse effects detected by vaccinovigilance systems, or risk of increase in the age of cases if the vaccination coverage is insufficient. The medico-economic evaluation forms a part of the risks/benefit assessment, by positioning the vaccine strategy comparatively with other interventions for health. Epidemiologic and vaccinovigilance informations must be updated very regularly, which underlines the need for having an operational and reliable real time monitoring system to accompany the vaccination strategies. Lastly, in the context of uncertainty which often accompanies the risks/benefit assessments, it is important that an adapted communication towards the public and the doctors is planned.

  10. Uncertainty quantification of overpressure buildup through inverse modeling of compaction processes in sedimentary basins

    NASA Astrophysics Data System (ADS)

    Colombo, Ivo; Porta, Giovanni M.; Ruffo, Paolo; Guadagnini, Alberto

    2017-03-01

    This study illustrates a procedure conducive to a preliminary risk analysis of overpressure development in sedimentary basins characterized by alternating depositional events of sandstone and shale layers. The approach rests on two key elements: (1) forward modeling of fluid flow and compaction, and (2) application of a model-complexity reduction technique based on a generalized polynomial chaos expansion (gPCE). The forward model considers a one-dimensional vertical compaction processes. The gPCE model is then used in an inverse modeling context to obtain efficient model parameter estimation and uncertainty quantification. The methodology is applied to two field settings considered in previous literature works, i.e. the Venture Field (Scotian Shelf, Canada) and the Navarin Basin (Bering Sea, Alaska, USA), relying on available porosity and pressure information for model calibration. It is found that the best result is obtained when porosity and pressure data are considered jointly in the model calibration procedure. Uncertainty propagation from unknown input parameters to model outputs, such as pore pressure vertical distribution, is investigated and quantified. This modeling strategy enables one to quantify the relative importance of key phenomena governing the feedback between sediment compaction and fluid flow processes and driving the buildup of fluid overpressure in stratified sedimentary basins characterized by the presence of low-permeability layers. The results here illustrated (1) allow for diagnosis of the critical role played by the parameters of quantitative formulations linking porosity and permeability in compacted shales and (2) provide an explicit and detailed quantification of the effects of their uncertainty in field settings.

  11. Explanation-aware computing of the prognosis for breast cancer supported by IK-DCBRC: Technical innovation.

    PubMed

    Khelassi, Abdeldjalil

    2014-01-01

    Active research is being conducted to determine the prognosis for breast cancer. However, the uncertainty is a major obstacle in this domain of medical research. In that context, explanation-aware computing has the potential for providing meaningful interactions between complex medical applications and users, which would ensure a significant reduction of uncertainty and risks. This paper presents an explanation-aware agent, supported by Intensive Knowledge-Distributed Case-Based Reasoning Classifier (IK-DCBRC), to reduce the uncertainty and risks associated with the diagnosis of breast cancer. A meaningful explanation is generated by inferring from a rule-based system according to the level of abstraction and the reasoning traces. The computer-aided detection is conducted by IK-DCBRC, which is a multi-agent system that applies the case-based reasoning paradigm and a fuzzy similarity function for the automatic prognosis by the class of breast tumors, i.e. malignant or benign, from a pattern of cytological images. A meaningful interaction between the physician and the computer-aided diagnosis system, IK-DCBRC, is achieved via an intelligent agent. The physician can observe the trace of reasoning, terms, justifications, and the strategy to be used to decrease the risks and doubts associated with the automatic diagnosis. The capability of the system we have developed was proven by an example in which conflicts were clarified and transparency was ensured. The explanation agent ensures the transparency of the automatic diagnosis of breast cancer supported by IK-DCBRC, which decreases uncertainty and risks and detects some conflicts.

  12. Speed accuracy trade-off under response deadlines

    PubMed Central

    Karşılar, Hakan; Simen, Patrick; Papadakis, Samantha; Balcı, Fuat

    2014-01-01

    Perceptual decision making has been successfully modeled as a process of evidence accumulation up to a threshold. In order to maximize the rewards earned for correct responses in tasks with response deadlines, participants should collapse decision thresholds dynamically during each trial so that a decision is reached before the deadline. This strategy ensures on-time responding, though at the cost of reduced accuracy, since slower decisions are based on lower thresholds and less net evidence later in a trial (compared to a constant threshold). Frazier and Yu (2008) showed that the normative rate of threshold reduction depends on deadline delays and on participants' uncertainty about these delays. Participants should start collapsing decision thresholds earlier when making decisions under shorter deadlines (for a given level of timing uncertainty) or when timing uncertainty is higher (for a given deadline). We tested these predictions using human participants in a random dot motion discrimination task. Each participant was tested in free-response, short deadline (800 ms), and long deadline conditions (1000 ms). Contrary to optimal-performance predictions, the resulting empirical function relating accuracy to response time (RT) in deadline conditions did not decline to chance level near the deadline; nor did the slight decline we typically observed relate to measures of endogenous timing uncertainty. Further, although this function did decline slightly with increasing RT, the decline was explainable by the best-fitting parameterization of Ratcliff's diffusion model (Ratcliff, 1978), whose parameters are constant within trials. Our findings suggest that at the very least, typical decision durations are too short for participants to adapt decision parameters within trials. PMID:25177265
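
    The sketch below illustrates the modeling idea under test: a drift-diffusion process with a linearly collapsing decision bound under a response deadline, simulated with and without collapse. It is a generic simulation with assumed parameter values, not the fitted model from the study.

```python
# Generic drift-diffusion simulation with assumed parameters; not fitted values.
import numpy as np

rng = np.random.default_rng(7)

def simulate(drift=0.8, noise=1.0, dt=0.001, deadline=0.8, a0=1.0,
             collapse_rate=0.0, n_trials=2000):
    """Return (accuracy, mean RT) for accumulation to a possibly collapsing bound."""
    correct, rts = [], []
    for _ in range(n_trials):
        x = 0.0
        for step in range(int(deadline / dt)):
            t = step * dt
            bound = max(0.05, a0 - collapse_rate * t)     # linearly collapsing bound
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            if abs(x) >= bound:
                correct.append(x > 0)
                rts.append(t)
                break
        else:                                             # deadline reached: forced response
            correct.append(x > 0)
            rts.append(deadline)
    return float(np.mean(correct)), float(np.mean(rts))

print("fixed bound      (acc, RT):", simulate(collapse_rate=0.0))
print("collapsing bound (acc, RT):", simulate(collapse_rate=1.0))
```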

  13. The HTA Risk Analysis Chart: Visualising the Need for and Potential Value of Managed Entry Agreements in Health Technology Assessment.

    PubMed

    Grimm, Sabine Elisabeth; Strong, Mark; Brennan, Alan; Wailoo, Allan J

    2017-12-01

    Recent changes to the regulatory landscape of pharmaceuticals may sometimes require reimbursement authorities to issue guidance on technologies that have a less mature evidence base. Decision makers need to be aware of risks associated with such health technology assessment (HTA) decisions and the potential to manage this risk through managed entry agreements (MEAs). This work develops methods for quantifying risk associated with specific MEAs and for clearly communicating this to decision makers. We develop the 'HTA risk analysis chart', in which we present the payer strategy and uncertainty burden (P-SUB) as a measure of overall risk. The P-SUB consists of the payer uncertainty burden (PUB), the risk stemming from decision uncertainty as to which is the truly optimal technology from the relevant set of technologies, and the payer strategy burden (PSB), the additional risk of approving a technology that is not expected to be optimal. We demonstrate the approach using three recent technology appraisals from the UK National Institute for Health and Clinical Excellence (NICE), each of which considered a price-based MEA. The HTA risk analysis chart was calculated using results from standard probabilistic sensitivity analyses. In all three HTAs, the new interventions were associated with substantial risk as measured by the P-SUB. For one of these technologies, the P-SUB was reduced to zero with the proposed price reduction, making this intervention cost effective with near complete certainty. For the other two, the risk reduced substantially with a much reduced PSB and a slightly increased PUB. The HTA risk analysis chart shows the risk that the healthcare payer incurs under unresolved decision uncertainty and when considering recommending a technology that is not expected to be optimal given current evidence. This allows the simultaneous consideration of financial and data-collection MEA schemes in an easily understood format. The use of HTA risk analysis charts will help to ensure that MEAs are considered within a standard utility-maximising health economic decision-making framework.
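
    The following sketch shows one plausible operationalization of the quantities named in the abstract, not necessarily the authors' exact formulas: from PSA samples of net benefit, the payer strategy burden (PSB) is taken as the shortfall in expected net benefit from approving a non-optimal technology, the payer uncertainty burden (PUB) as an EVPI-like expected loss from residual decision uncertainty, and the P-SUB as their sum. The PSA draws are simulated placeholders.

```python
# One plausible operationalization; simulated placeholder PSA draws.
import numpy as np

def hta_risk(net_benefit, approved):
    """net_benefit: (n_samples, n_technologies) PSA draws; approved: technology index."""
    enb = net_benefit.mean(axis=0)
    psb = enb.max() - enb[approved]                       # payer strategy burden
    pub = net_benefit.max(axis=1).mean() - enb.max()      # payer uncertainty burden
    return {"PSB": psb, "PUB": pub, "P-SUB": psb + pub}

rng = np.random.default_rng(3)
nb = rng.normal([1000.0, 1200.0, 1150.0], [300.0, 500.0, 250.0], size=(20_000, 3))
print(hta_risk(nb, approved=2))    # e.g. the payer is considering approving technology 2
```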

  14. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, G.E.; Dawson, J.W.

    1983-10-04

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800. 6 figs.

  15. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, George E.; Dawson, John W.

    1983-01-01

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800.

  16. Collaborative decision-analytic framework to maximize resilience of tidal marshes to climate change

    USGS Publications Warehouse

    Thorne, Karen M.; Mattsson, Brady J.; Takekawa, John Y.; Cummings, Jonathan; Crouse, Debby; Block, Giselle; Bloom, Valary; Gerhart, Matt; Goldbeck, Steve; Huning, Beth; Sloop, Christina; Stewart, Mendel; Taylor, Karen; Valoppi, Laura

    2015-01-01

    Decision makers that are responsible for stewardship of natural resources face many challenges, which are complicated by uncertainty about impacts from climate change, expanding human development, and intensifying land uses. A systematic process for evaluating the social and ecological risks, trade-offs, and cobenefits associated with future changes is critical to maximize resilience and conserve ecosystem services. This is particularly true in coastal areas where human populations and landscape conversion are increasing, and where intensifying storms and sea-level rise pose unprecedented threats to coastal ecosystems. We applied collaborative decision analysis with a diverse team of stakeholders who preserve, manage, or restore tidal marshes across the San Francisco Bay estuary, California, USA, as a case study. Specifically, we followed a structured decision-making approach and, using expert judgment, developed alternative management strategies to increase the capacity and adaptability to manage tidal marsh resilience while considering uncertainties through 2050. Because sea-level rise projections are relatively well constrained to 2050, we focused on uncertainties regarding intensity and frequency of storms and funding. Elicitation methods allowed us to make predictions in the absence of fully compatible models and to assess short- and long-term trade-offs. Specifically we addressed two questions. (1) Can collaborative decision analysis lead to consensus among a diverse set of decision makers responsible for environmental stewardship and faced with uncertainties about climate change, funding, and stakeholder values? (2) What is an optimal strategy for the conservation of tidal marshes, and what strategy is robust to the aforementioned uncertainties? We found that when taking this approach, consensus was reached among the stakeholders about the best management strategies to maintain tidal marsh integrity. A Bayesian decision network revealed that a strategy considering sea-level rise and storms explicitly in wetland restoration planning and designs was optimal, and it was robust to uncertainties about management effectiveness and budgets. We found that strategies that avoided explicitly accounting for future climate change had the lowest expected performance based on input from the team. Our decision-analytic framework is sufficiently general to offer an adaptable template, which can be modified for use in other areas that include a diverse and engaged stakeholder group.

  17. Elucidating the Role of Electron Shuttles in Reductive Transformations in Anaerobic Sediments

    EPA Science Inventory

    Model studies have demonstrated that electron shuttles (ES) such as dissolved organic matter (DOM) can participate in the reduction of organic contaminants; however, much uncertainty exists concerning the significance of this solution phase pathway for contaminant reduction in na...

  18. Optimization Under Uncertainty for Wake Steering Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N

    2017-08-03

    This presentation covers the motivation for this research, optimization under the uncertainty problem formulation, a two-turbine case, the Princess Amalia Wind Farm case, and conclusions and next steps.

  19. Uncertainty, robustness, and the value of information in managing a population of northern bobwhites

    USGS Publications Warehouse

    Johnson, Fred A.; Hagan, Greg; Palmer, William E.; Kemmerer, Michael

    2014-01-01

    The abundance of northern bobwhites (Colinus virginianus) has decreased throughout their range. Managers often respond by considering improvements in harvest and habitat management practices, but this can be challenging if substantial uncertainty exists concerning the cause(s) of the decline. We were interested in how application of decision science could be used to help managers on a large, public management area in southwestern Florida where the bobwhite is a featured species and where abundance has severely declined. We conducted a workshop with managers and scientists to elicit management objectives, alternative hypotheses concerning population limitation in bobwhites, potential management actions, and predicted management outcomes. Using standard and robust approaches to decision making, we determined that improved water management and perhaps some changes in hunting practices would be expected to produce the best management outcomes in the face of uncertainty about what is limiting bobwhite abundance. We used a criterion called the expected value of perfect information to determine that a robust management strategy may perform nearly as well as an optimal management strategy (i.e., a strategy that is expected to perform best, given the relative importance of different management objectives) with all uncertainty resolved. We used the expected value of partial information to determine that management performance could be increased most by eliminating uncertainty over excessive-harvest and human-disturbance hypotheses. Beyond learning about the factors limiting bobwhites, adoption of a dynamic management strategy, which recognizes temporal changes in resource and environmental conditions, might produce the greatest management benefit. Our research demonstrates that robust approaches to decision making, combined with estimates of the value of information, can offer considerable insight into preferred management approaches when great uncertainty exists about system dynamics and the effects of management.

  20. Design of optimal groundwater remediation systems under flexible environmental-standard constraints.

    PubMed

    Fan, Xing; He, Li; Lu, Hong-Wei; Li, Jing

    2015-01-01

    In developing optimal groundwater remediation strategies, limited effort has been devoted to addressing uncertainty in environmental quality standards. When such uncertainty is not considered, either overly optimistic or overly pessimistic optimization strategies may be developed, probably leading to the formulation of rigid remediation strategies. This study advances a mathematical programming modeling approach for optimizing groundwater remediation design. This approach not only prevents the formulation of overly optimistic and overly pessimistic optimization strategies but also provides a satisfaction level that indicates the degree to which the environmental quality standard is satisfied. The approach may therefore be expected to be significantly more acceptable to decision makers than approaches that do not consider standard uncertainty. The proposed approach is applied to a petroleum-contaminated site in western Canada. Results from the case study show that (1) the peak benzene concentrations can always satisfy the environmental standard under the optimal strategy, (2) the pumping rates of all wells decrease under a relaxed standard or long-term remediation approach, (3) the pumping rates are less affected by environmental quality constraints under short-term remediation, and (4) increasingly flexible environmental standards have a reduced effect on the optimal remediation strategy.

  1. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  2. The Development and Evaluation of an Operational Aerobraking Strategy for the Mars 2001 Odyssey Orbiter

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Munk, Michelle M.; Powell, Richard W.

    2002-01-01

    The Mars 2001 Odyssey Orbiter successfully completed the aerobraking phase of its mission on January 11, 2002. This paper discusses the support provided by NASA's Langley Research Center to the navigation team at the Jet Propulsion Laboratory in the planning and operational support of Mars Odyssey Aerobraking. Specifically, the development of a three-degree-of-freedom aerobraking trajectory simulation and its application to pre-flight planning activities as well as operations is described. The importance of running the simulation in a Monte Carlo fashion to capture the effects of mission and atmospheric uncertainties is demonstrated, and the utility of including predictive logic within the simulation that could mimic operational maneuver decision-making is shown. A description is also provided of how the simulation was adapted to support flight operations as both a validation and risk reduction tool and as a means of obtaining a statistical basis for maneuver strategy decisions. This latter application was the first use of Monte Carlo trajectory analysis in an aerobraking mission.
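
    A purely illustrative toy dispersion analysis in the same spirit is sketched below; it is not the Langley three-degree-of-freedom simulation. Orbit-to-orbit atmospheric density variability is sampled in a crude apoapsis-decay model, and the resulting statistics are the kind of quantities a maneuver decision might draw on. All constants are invented.

```python
# Toy model with invented constants; not the mission simulation.
import numpy as np

rng = np.random.default_rng(11)

def campaign(nominal_density=40.0, density_sigma=0.35, heat_limit=130.0, max_passes=600):
    """Return (passes flown, True if the heating corridor was ever violated)."""
    apoapsis = 26_000.0                                     # km, starting apoapsis altitude
    for k in range(1, max_passes + 1):
        rho = nominal_density * rng.lognormal(sigma=density_sigma)  # pass-to-pass variability
        if rho > heat_limit:                                # corridor (heating) violation
            return k, True
        apoapsis -= 2.1 * rho                               # toy apoapsis decay per drag pass
        if apoapsis <= 400.0:                               # aerobraking endgame reached
            return k, False
    return max_passes, False

results = [campaign() for _ in range(2000)]
passes = np.array([r[0] for r in results])
violation_rate = np.mean([r[1] for r in results])
print(f"median passes to target: {np.median(passes):.0f}, "
      f"P(corridor violation): {violation_rate:.3f}")
```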

  3. Hotspots of climate change impacts in sub-Saharan Africa and implications for adaptation and development.

    PubMed

    Müller, Christoph; Waha, Katharina; Bondeau, Alberte; Heinke, Jens

    2014-08-01

    Development efforts for poverty reduction and food security in sub-Saharan Africa will have to consider future climate change impacts. Large uncertainties in climate change impact assessments do not necessarily complicate, but can inform development strategies. The design of development strategies will need to consider the likelihood, strength, and interaction of climate change impacts across biosphere properties. We here explore the spread of climate change impact projections and develop a composite impact measure to identify hotspots of climate change impacts, addressing likelihood and strength of impacts. Overlapping impacts in different biosphere properties (e.g. flooding, yields) will not only claim additional capacity to respond, but will also narrow the options to respond and develop. Regions with severest projected climate change impacts often coincide with regions of high population density and poverty rates. Science and policy need to propose ways of preparing these areas for development under climate change impacts. © 2014 John Wiley & Sons Ltd.

  4. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
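
    A short sketch of the error budget motivating this work follows: SWE mapped from Lidar snow depth requires a modelled bulk density, and under an independence assumption the fractional SWE uncertainty combines the depth and density terms in quadrature. The numbers are illustrative, not SnowEx results.

```python
# Illustrative numbers; not SnowEx observations.
import numpy as np

depth, depth_err = 1.8, 0.08          # m, Lidar snow depth and its uncertainty
density, density_err = 320.0, 45.0    # kg m-3, modelled density and across-model spread

swe = depth * density / 1000.0        # metres of water equivalent
rel_err = np.sqrt((depth_err / depth) ** 2 + (density_err / density) ** 2)
print(f"SWE = {swe:.2f} +/- {swe * rel_err:.2f} m w.e. ({100 * rel_err:.0f}%); "
      f"density term dominates: {density_err / density > depth_err / depth}")
```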

  5. Variance Reduction Factor of Nuclear Data for Integral Neutronics Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiba, G., E-mail: go_chiba@eng.hokudai.ac.jp; Tsuji, M.; Narabayashi, T.

    We propose a new quantity, a variance reduction factor, to identify nuclear data for which further improvements are required to reduce uncertainties of target integral neutronics parameters. Important energy ranges can be also identified with this variance reduction factor. Variance reduction factors are calculated for several integral neutronics parameters. The usefulness of the variance reduction factors is demonstrated.

  6. Uncertainty quantification of effective nuclear interactions

    DOE PAGES

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-02

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  7. Uncertainty quantification of effective nuclear interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  8. H2/H∞ control for grid-feeding converter considering system uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Zhongwen; Zang, Chuanzhi; Zeng, Peng; Yu, Haibin; Li, Shuhui; Fu, Xingang

    2017-05-01

    Three-phase grid-feeding converters (GFCs) are key components for integrating distributed generation and renewable power sources into the power utility. Conventionally, proportional-integral (PI) and proportional-resonant-based control strategies are applied to control the output power or current of a GFC. However, those control strategies have poor transient performance and are not robust against uncertainties and volatilities in the system. This paper proposes a H2/H∞-based control strategy, which can mitigate the above restrictions. The uncertainty and disturbance are included in the formulation of the GFC state-space model, making it reflect practical system conditions more accurately. The paper uses a convex optimisation method to design the H2/H∞-based optimal controller. Instead of using a guess-and-check method, the paper uses particle swarm optimisation to search for a H2/H∞ optimal controller. Several case studies, implemented in both simulation and experiment, verify the superiority of the proposed control strategy over traditional PI control methods, especially under dynamic and variable system conditions.

  9. Bias error reduction using ratios to baseline experiments. Heat transfer case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakroun, W.; Taylor, R.P.; Coleman, H.W.

    1993-10-01

    Employing a set of experiments devoted to examining the effect of surface finish (riblets) on convective heat transfer as an example, this technical note seeks to explore the notion that precision uncertainties in experiments can be reduced by repeated trials and averaging. This scheme for bias error reduction can give considerable advantage when parametric effects are investigated experimentally. When the results of an experiment are presented as a ratio with the baseline results, a large reduction in the overall uncertainty can be achieved when all the bias limits in the variables of the experimental result are fully correlated with those of the baseline case. 4 refs.

  10. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements on a pair of incompatible observables in a quantum system. Therefore, it is of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by a thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement due to the reduction of the thermal entanglement; explicitly, higher temperature, a stronger magnetic field or larger inhomogeneity of the field can result in inflation of the uncertainty. Besides, it is found that there exist distinct dynamical behaviors of the uncertainty for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we also verify that the measurement uncertainty is strongly anti-correlated with the purity of the bipartite spin system: greater purity results in a reduction of the measurement uncertainty, and vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in the framework of versatile systems, particularly solid states.

  11. Comparison of the genetic algorithm and incremental optimisation routines for a Bayesian inverse modelling based network design

    NASA Astrophysics Data System (ADS)

    Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.

    2018-05-01

    The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods assessed were the evolutionary algorithm: the genetic algorithm (GA), and the deterministic algorithm: the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine in comparison to the more computationally demanding GA routine to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction compared with the GA, and at only a quarter of the computational resources used by the lowest specified GA algorithm. The GA solution set showed more inconsistency if the number of iterations or population size was small, and more so for a complex prior flux covariance matrix. If the GA completed with a sub-optimal solution, these solutions were similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances where the GA may outperform the IO. The first scenario considered an established network, where the optimisation was required to add an additional five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction. These results suggest that the best use of resources for the network design problem would be spent in improvement of the prior estimates of the flux uncertainties rather than investing these resources in running a complex evolutionary optimisation algorithm. The authors recommend that, if time and computational resources allow, that multiple optimisation techniques should be used as a part of a comprehensive suite of sensitivity tests when performing such an optimisation exercise. This will provide a selection of best solutions which could be ranked based on their utility and practicality.
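
    The sketch below illustrates the incremental-optimisation idea in a linear Bayesian inversion setting: stations are added greedily, one at a time, choosing at each step the candidate that most reduces the posterior variance of the domain-total flux. The toy footprint matrix, prior covariance, and observation error are invented stand-ins for the paper's transport-model-based setup.

```python
# Conceptual stand-in; H, B, and the observation error are invented.
import numpy as np

rng = np.random.default_rng(5)
n_flux, n_candidates, n_pick = 50, 20, 5

H = np.abs(rng.normal(size=(n_candidates, n_flux)))   # station-to-flux sensitivity footprints
B = np.diag(rng.uniform(0.5, 2.0, n_flux))            # prior flux error covariance
obs_var = 0.25                                        # observation error variance
ones = np.ones(n_flux)                                # aggregates pixel fluxes to a total

def total_flux_variance(rows):
    """Posterior variance of the domain-total flux for a given set of station rows."""
    if not rows:
        return float(ones @ B @ ones)
    Hs = H[list(rows)]
    precision = np.linalg.inv(B) + Hs.T @ Hs / obs_var
    return float(ones @ np.linalg.inv(precision) @ ones)

chosen = []
for _ in range(n_pick):
    best = min(set(range(n_candidates)) - set(chosen),
               key=lambda j: total_flux_variance(chosen + [j]))
    chosen.append(best)
    print(f"add station {best:2d} -> total-flux posterior variance "
          f"{total_flux_variance(chosen):.2f}")
```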

  12. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  13. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameters values and hence reducing their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in details.
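
    Below is a hedged sketch of the two-stage emulation-and-ranking workflow described in the abstract, with a toy stand-in function in place of CLASS: a support vector regression emulator is trained on a small set of runs, used to predict a larger parameter sample, and parameter influence is then ranked with a random-forest permutation-importance measure. The parameter names, ranges, and toy output are assumptions introduced for illustration.

```python
# Toy stand-in for CLASS; names, ranges, and the output function are assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
names = ["albedo_refresh_threshold", "limiting_snow_depth", "fresh_snow_density"]
low, high = [0.5, 0.02, 50.0], [5.0, 0.30, 150.0]

# 400 "training cases": sampled parameters and a toy snow output standing in for CLASS
X = rng.uniform(low, high, size=(400, 3))
y = 0.8 * X[:, 0] - 2.5 * X[:, 1] + 0.001 * X[:, 2] + rng.normal(0, 0.05, 400)
emulator = SVR(C=10.0, gamma="scale").fit(X, y)            # cheap surrogate of the model

# emulate a much larger sample, then rank parameter influence on the emulated response
X_big = rng.uniform(low, high, size=(5000, 3))
y_big = emulator.predict(X_big)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_big, y_big)
imp = permutation_importance(rf, X_big, y_big, n_repeats=5, random_state=0)
for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:28s} {score:.3f}")
```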

  14. Aging and loss decision making: increased risk aversion and decreased use of maximizing information, with correlated rationality and value maximization.

    PubMed

    Kurnianingsih, Yoanna A; Sim, Sam K Y; Chee, Michael W L; Mullette-Gillman, O'Dhaniel A

    2015-01-01

    We investigated how adult aging specifically alters economic decision-making, focusing on examining alterations in uncertainty preferences (willingness to gamble) and choice strategies (what gamble information influences choices) within both the gains and losses domains. Within each domain, participants chose between certain monetary outcomes and gambles with uncertain outcomes. We examined preferences by quantifying how uncertainty modulates choice behavior as if altering the subjective valuation of gambles. We explored age-related preferences for two types of uncertainty: risk and ambiguity. Additionally, we explored how aging may alter what information participants utilize to make their choices by comparing the relative utilization of maximizing and satisficing information types through a choice strategy metric. Maximizing information was the ratio of the expected value of the two options, while satisficing information was the probability of winning. We found age-related alterations of economic preferences within the losses domain, but no alterations within the gains domain. Older adults (OA; 61-80 years old) were significantly more uncertainty averse for both risky and ambiguous choices. OA also exhibited choice strategies with decreased use of maximizing information. Within OA, we found a significant correlation between risk preferences and choice strategy. This linkage between preferences and strategy appears to derive from a convergence to risk neutrality driven by greater use of the effortful maximizing strategy. As utility maximization and value maximization intersect at risk neutrality, this result suggests that OA are exhibiting a relationship between enhanced rationality and enhanced value maximization. While there was variability in economic decision-making measures within OA, these individual differences were unrelated to variability within examined measures of cognitive ability. Our results demonstrate that aging alters economic decision-making for losses through changes in both individual preferences and the strategies individuals employ.

  15. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
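
    One way to read the grouping idea is as a first-order variance-based index computed for a set of uncertainty components at once, S_group = Var(E[Y | group]) / Var(Y). The sketch below estimates such grouped indices with a brute-force double-loop Monte Carlo on an invented three-component model (scenario, recharge, sorption); the Bayesian-network machinery of the paper is not reproduced.

        import numpy as np

        rng = np.random.default_rng(2)

        def model(scenario, recharge, kd):
            # Hypothetical reactive-transport summary output (e.g. peak concentration).
            return scenario * (1.0 + 0.5 * recharge) * np.exp(-kd)

        def sample(n):
            scenario = rng.choice([0.8, 1.0, 1.3], size=n)   # scenario layer
            recharge = rng.normal(1.0, 0.2, size=n)          # recharge process group
            kd       = rng.lognormal(0.0, 0.3, size=n)       # sorption process group
            return scenario, recharge, kd

        # First-order index for a group: S = Var_g( E[Y | group] ) / Var(Y),
        # estimated with a double loop (outer: fix the group, inner: vary the rest).
        def grouped_first_order(group, n_outer=300, n_inner=300):
            cond_means = np.empty(n_outer)
            for i in range(n_outer):
                fixed = {name: val[0] for name, val in
                         zip(("scenario", "recharge", "kd"), sample(1))}
                s, r, k = sample(n_inner)
                if "scenario" in group: s = np.full(n_inner, fixed["scenario"])
                if "recharge" in group: r = np.full(n_inner, fixed["recharge"])
                if "kd" in group:       k = np.full(n_inner, fixed["kd"])
                cond_means[i] = model(s, r, k).mean()
            y = model(*sample(20_000))
            return cond_means.var() / y.var()

        print("S_scenario      :", round(grouped_first_order({"scenario"}), 2))
        print("S_flow+transport:", round(grouped_first_order({"recharge", "kd"}), 2))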

  16. Influences of sampling size and pattern on the uncertainty of correlation estimation between soil water content and its influencing factors

    NASA Astrophysics Data System (ADS)

    Lai, Xiaoming; Zhu, Qing; Zhou, Zhiwen; Liao, Kaihua

    2017-12-01

    In this study, seven random combination sampling strategies were applied to investigate the uncertainties in estimating the hillslope mean soil water content (SWC) and correlation coefficients between the SWC and soil/terrain properties on a tea + bamboo hillslope. One of the sampling strategies is global random sampling and the other six are stratified random sampling on the top, middle, toe, top + mid, top + toe and mid + toe slope positions. When each sampling strategy was applied, sample sizes were gradually reduced and each sampling size contained 3000 replicates. Under each sampling size of each sampling strategy, the relative errors (REs) and coefficients of variation (CVs) of the estimated hillslope mean SWC and correlation coefficients between the SWC and soil/terrain properties were calculated to quantify the accuracy and uncertainty. The results showed that the uncertainty of the estimations decreased as the sampling size increased. However, larger sample sizes were required to reduce the uncertainty in correlation coefficient estimation than in hillslope mean SWC estimation. Under global random sampling, 12 randomly sampled sites on this hillslope were adequate to estimate the hillslope mean SWC with RE and CV ≤10%. However, at least 72 randomly sampled sites were needed to ensure that the estimated correlation coefficients had REs and CVs ≤10%. Among all sampling strategies, reducing the number of sampling sites on the middle slope had the least influence on the estimation of hillslope mean SWC and correlation coefficients. Under this strategy, 60 sites (10 on the middle slope and 50 on the top and toe slopes) were enough to ensure that the estimated correlation coefficients had REs and CVs ≤10%. This suggested that when designing the SWC sampling, the proportion of sites on the middle slope can be reduced to 16.7% of the total number of sites. Findings of this study will be useful for optimal SWC sampling design.
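
    A minimal sketch of the replicate-subsampling procedure, using synthetic hillslope data rather than the field observations: for each sample size, many random subsets are drawn, the hillslope mean SWC and a correlation coefficient are estimated from each subset, and the relative error (RE) and coefficient of variation (CV) summarize accuracy and uncertainty.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic "hillslope": soil water content correlated with one terrain property.
        n_sites = 200
        wetness_index = rng.normal(8.0, 1.5, n_sites)
        swc = 0.05 * wetness_index + rng.normal(0.0, 0.03, n_sites) + 0.1

        true_mean = swc.mean()
        true_corr = np.corrcoef(swc, wetness_index)[0, 1]

        def subsample_uncertainty(sample_size, replicates=3000):
            means = np.empty(replicates)
            corrs = np.empty(replicates)
            for i in range(replicates):
                idx = rng.choice(n_sites, size=sample_size, replace=False)
                means[i] = swc[idx].mean()
                corrs[i] = np.corrcoef(swc[idx], wetness_index[idx])[0, 1]
            re_mean = 100 * abs(means.mean() - true_mean) / true_mean   # relative error, %
            cv_mean = 100 * means.std() / means.mean()                  # coeff. of variation, %
            re_corr = 100 * abs(corrs.mean() - true_corr) / true_corr
            cv_corr = 100 * corrs.std() / abs(corrs.mean())
            return re_mean, cv_mean, re_corr, cv_corr

        for size in (12, 36, 72):
            print(size, [round(v, 1) for v in subsample_uncertainty(size)])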

  17. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate impacts of data on uncertainty quantification and reduction using an example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include fractions of the total surface site density of the two sites and surface complex formation constants of the three reactions. A total of seven experiments were conducted with different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinity of the strong and weak sites for uranium. Therefore, the experiments with high initial concentration of U(VI) are needed, because in those experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial concentration help modelers turn a curse into a blessing. The data impacts on uncertainty quantification and reduction are quantified using probability density functions of model parameters obtained from Markov Chain Monte Carlo simulation using the DREAM algorithm. This study provides insights to model calibration, uncertainty quantification, experiment design, and data collection in groundwater reactive transport modeling and other environmental modeling.
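
    The posterior densities in the study come from the DREAM algorithm; as a much-simplified, hedged stand-in, the sketch below uses a plain random-walk Metropolis sampler on a toy two-parameter adsorption model to show how experiments at different initial concentrations widen or narrow the posterior. The model, parameter values, and error level are all invented.

        import numpy as np

        rng = np.random.default_rng(4)

        def breakthrough(theta, c0):
            # Toy stand-in for a surface-complexation prediction: adsorbed fraction
            # as a function of two site-affinity parameters and inlet concentration c0.
            k_strong, k_weak = theta
            return (k_strong * c0) / (1 + k_strong * c0) + 0.2 * (k_weak * c0) / (1 + k_weak * c0)

        def log_posterior(theta, c0, obs, sigma=0.02):
            if np.any(theta <= 0) or np.any(theta > 50):
                return -np.inf                      # flat prior on a bounded box
            resid = obs - breakthrough(theta, c0)
            return -0.5 * np.sum((resid / sigma) ** 2)

        def metropolis(c0, obs, n=20_000, step=0.3):
            theta = np.array([5.0, 1.0])
            logp = log_posterior(theta, c0, obs)
            chain = np.empty((n, 2))
            for i in range(n):
                prop = theta + rng.normal(0.0, step, 2)
                logp_prop = log_posterior(prop, c0, obs)
                if np.log(rng.random()) < logp_prop - logp:
                    theta, logp = prop, logp_prop
                chain[i] = theta
            return chain[n // 2:]                   # discard burn-in

        true_theta = np.array([8.0, 0.5])
        for label, c0 in [("low U(VI)", np.array([0.01, 0.02])),
                          ("high U(VI)", np.array([0.5, 1.0, 2.0]))]:
            obs = breakthrough(true_theta, c0) + rng.normal(0.0, 0.02, c0.size)
            post = metropolis(c0, obs)
            print(label, "posterior std:", np.round(post.std(0), 2))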

  18. Uncertainty and probability for branching selves

    NASA Astrophysics Data System (ADS)

    Lewis, Peter J.

    Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.

  19. Robust Design Optimization via Failure Domain Bounding

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2007-01-01

    This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.

  20. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    NASA Astrophysics Data System (ADS)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values---taken from the literature---was considered (inputs to CEA2 were specified and so had no uncertainty); and data reduction mode, where inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements that were intended to represent the typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly as a result of the uncertainties in experimental measurements compared to uncertainties in thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results only apply to hydrogen-oxygen combustors and should not be generalized to every propellant combination. Species for a hydrogen-oxygen system are relatively simple, thereby resulting in low thermodynamic reference value uncertainties. Hydrocarbon combustors, solid rocket motors and hybrid rocket motors have combustion gases containing complex molecules that will likely have thermodynamic reference values with large uncertainties. Thus, every chemical system should be analyzed in a similar manner as that shown in this work.
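
    A generic sketch of the Monte Carlo propagation described above, with a deliberately simplified performance calculation standing in for CEA2/PAC99: thermodynamic reference values are perturbed within illustrative 1-sigma uncertainties, a characteristic-velocity-like figure of merit is recomputed for each draw, and the output spread gives the propagated uncertainty. The numbers are placeholders, not the study's inputs.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 20_000

        # Uncertain thermodynamic reference values (illustrative means and 1-sigma
        # uncertainties, kJ/mol); the H2O enthalpy of formation dominates here.
        dHf_H2O = rng.normal(-241.8, 0.04, n)      # gas-phase H2O
        dHf_O2  = rng.normal(0.0, 0.002, n)
        dHf_H2  = rng.normal(0.0, 0.002, n)

        # Heat released per mole of H2O formed: H2 + 1/2 O2 -> H2O.
        q = -(dHf_H2O - dHf_H2 - 0.5 * dHf_O2)     # kJ/mol

        # Toy performance parameter: a characteristic-velocity-like figure of merit
        # proportional to sqrt(energy release per unit mass of products).
        M_H2O = 0.018                               # kg/mol
        c_star = np.sqrt(2.0 * q * 1e3 / M_H2O)     # m/s (illustrative only)

        mean = c_star.mean()
        print(f"c* mean = {mean:.0f} m/s, "
              f"relative uncertainty = {100 * c_star.std() / mean:.3f} %")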

  1. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, G.E.; Dawson, J.W.

    1981-02-11

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.

  2. Simulating evolution of technology: An aid to energy policy analysis. A case study of strategies to control greenhouse gases in Canada

    NASA Astrophysics Data System (ADS)

    Nyboer, John

    Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model to provide local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region including a base case run and three runs simulating emissions charges of 75/tonne, 150/tonne and 225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region. Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds 225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.

  3. Effect of information, uncertainty and parameter variability on profits in a queue with various pricing strategies

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Li, Shiyong

    2014-08-01

    This paper presents an unobservable single-server queueing system with three types of uncertainty, where the service rate, the waiting cost, or the service quality is a random variable that may take n (n > 2) values. The information about the realised values of the parameters is known only to the server. We are concerned with the server's behaviour: revealing or concealing this information to customers. The n-value assumption and the server's behaviour enable us to consider various pricing strategies. We analyse the effect of information and uncertainty on profits and compare the profits obtained under different pricing strategies. Moreover, with parameter variability reflected by the number of possible values n of each parameter, we observe the effect of varying n on all types of profits and find that revealing the parameter information benefits the server increasingly as n grows.

  4. Integrated and adaptive management for sustainable water use along ephemeral rivers under severe uncertainty of future flood regimes

    NASA Astrophysics Data System (ADS)

    Arnold, Sven; Attinger, Sabine; Frank, Karin; Baxter, Peter; Hildebrandt, Anke

    2010-05-01

    Ephemeral rivers are located throughout the world's arid regions. They are characterised by temporary surface flow that strongly varies between seasons and years. Along the river course a coupled eco-hydrological vegetation-groundwater system has often established itself; it is referred to as a linear oasis, reflecting the ecological and socio-economic importance of ephemeral rivers in otherwise dry areas. The Kuiseb River denotes such a linear oasis and is one of the most diversely used environments among the ephemeral rivers in Namibia. Along the entire river course surface runoff and ground water are exploited for drinking, farming, and mining. The middle section of the Kuiseb River is characterised by strong eco-hydrological feedbacks between the vegetation and the ground water resource. Temporary floods infiltrate into sediments, which have accumulated in geological pools of impermeable bedrock. This enables the formation of shallow ground water. The low depth to ground water allows root water uptake by plants and the establishment of a thriving ecosystem. The sustainable use of ecological and hydrological resources along ephemeral rivers is crucial to preserve the natural ecosystem. However, the investigation of management strategies that consider both the regulation of water extraction and vegetation structure requires models that explicitly consider the feedbacks between the water resource and the ecosystem structure. Further, uncertainties arise from stochastic hydrologic drivers such as flash flood events. Particularly in the face of climate change, the management strategies have to be applicable to a wide range of possible flood regimes, i.e. they have to be robust to the uncertainty of future flood regimes. In this study we assess a variety of management strategies regarding their robustness under different theoretical ecosystems and under uncertainty in the future stochastic flood regimes along the Kuiseb River. We consider the trade-off between ecological and human requirements by investigating the management strategies in terms of their ability to sustainably exploit the ground water resource while preserving the natural vegetation structure (here: coexistence of three tree species). We apply a conceptual ecohydrological model and use information-gap decision theory to estimate the robustness of strategies to failure due to flood parameter uncertainty. The performance of every strategy decreased as flood parameter uncertainty increased. However, ecological performance was more vulnerable with increasing uncertainty than the water supply performance, suggesting that the vegetation structure can be used as a sensitive indicator and pre-warning system for changing environmental conditions. The integrated and adaptive strategy was the most likely to permit sustainable use of the ground water while preserving the natural vegetation structure, though at the cost of a reduced probability of a large total system biomass.

  5. Modeling the value for money of changing clinical practice change: a stochastic application in diabetes care.

    PubMed

    Hoomans, Ties; Abrams, Keith R; Ament, Andre J H A; Evers, Silvia M A A; Severens, Johan L

    2009-10-01

    Decision making about resource allocation for guideline implementation to change clinical practice is inevitably undertaken in a context of uncertainty surrounding the cost-effectiveness of both clinical guidelines and implementation strategies. Adopting a total net benefit approach, a model was recently developed to overcome problems with the use of combined ratio statistics when analyzing decision uncertainty. The aim was to demonstrate the stochastic application of the model for informing decision making about the adoption of an audit and feedback strategy for implementing a guideline recommending intensive blood glucose control in type 2 diabetes in primary care in the Netherlands. An integrated Bayesian approach to decision modeling and evidence synthesis is adopted, using Markov Chain Monte Carlo simulation in WinBUGS. Data on model parameters are gathered from various sources, with effectiveness of implementation being estimated using pooled, random-effects meta-analysis. Decision uncertainty is illustrated using cost-effectiveness acceptability curves and a cost-effectiveness acceptability frontier. Decisions about whether to adopt intensified glycemic control and whether to adopt audit and feedback change with the maximum value that decision makers are willing to pay for health gain. By simultaneously incorporating uncertain economic evidence on both the guideline and the implementation strategy, the cost-effectiveness acceptability curves and frontier show an increase in decision uncertainty concerning guideline implementation. The stochastic application in diabetes care demonstrates that the model provides a simple and useful tool for quantifying and exploring the (combined) uncertainty associated with decision making about adopting guidelines and implementation strategies and, therefore, for informing decisions about efficient resource allocation to change clinical practice.
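
    Reading off a cost-effectiveness acceptability curve from simulated incremental costs and effects is straightforward under the net-benefit framing: at each willingness-to-pay threshold, the curve height is the fraction of simulations with positive incremental net benefit. The sketch below assumes illustrative Gaussian draws in place of the posterior samples from the WinBUGS model.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 10_000

        # Posterior draws of incremental effects (QALYs) and costs (euros) for a
        # guideline-plus-implementation strategy vs. usual care (illustrative values).
        d_effect = rng.normal(0.05, 0.04, n)
        d_cost   = rng.normal(400.0, 250.0, n)

        thresholds = np.arange(0, 50_001, 2_500)    # willingness to pay per QALY
        for wtp in thresholds:
            inb = wtp * d_effect - d_cost            # incremental net benefit
            p_ce = np.mean(inb > 0)                  # height of the CEAC at this wtp
            print(f"wtp = {wtp:>6d}: P(cost-effective) = {p_ce:.2f}")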

  6. Transportation strategy development under economic uncertainty.

    DOT National Transportation Integrated Search

    2013-05-01

    The interests of the researchers here were to understand various modes for developing long term : that is strategic plans with particular concern for the economic uncertainties one invariably : faces in such a planning environment. Often resou...

  7. A Regional CO2 Observing System Simulation Experiment for the ASCENDS Satellite Mission

    NASA Technical Reports Server (NTRS)

    Wang, J. S.; Kawa, S. R.; Eluszkiewicz, J.; Baker, D. F.; Mountain, M.; Henderson, J.; Nehrkorn, T.; Zaccheo, T. S.

    2014-01-01

    Top-down estimates of the spatiotemporal variations in emissions and uptake of CO2 will benefit from the increasing measurement density brought by recent and future additions to the suite of in situ and remote CO2 measurement platforms. In particular, the planned NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) satellite mission will provide greater coverage in cloudy regions, at high latitudes, and at night than passive satellite systems, as well as high precision and accuracy. In a novel approach to quantifying the ability of satellite column measurements to constrain CO2 fluxes, we use a portable library of footprints (surface influence functions) generated by the WRF-STILT Lagrangian transport model in a regional Bayesian synthesis inversion. The regional Lagrangian framework is well suited to make use of ASCENDS observations to constrain fluxes at high resolution, in this case at 1 degree latitude x 1 degree longitude and weekly for North America. We consider random measurement errors only, modeled as a function of mission and instrument design specifications along with realistic atmospheric and surface conditions. We find that the ASCENDS observations could potentially reduce flux uncertainties substantially at biome and finer scales. At the 1 degree x 1 degree, weekly scale, the largest uncertainty reductions, on the order of 50 percent, occur where and when there is good coverage by observations with low measurement errors and the a priori uncertainties are large. Uncertainty reductions are smaller for a 1.57 micron candidate wavelength than for a 2.05 micron wavelength, and are smaller for the higher of the two measurement error levels that we consider (1.0 ppm vs. 0.5 ppm clear-sky error at Railroad Valley, Nevada). Uncertainty reductions at the annual, biome scale range from 40 percent to 75 percent across our four instrument design cases, and from 65 percent to 85 percent for the continent as a whole. Our uncertainty reductions at various scales are substantially smaller than those from a global ASCENDS inversion on a coarser grid, demonstrating how quantitative results can depend on inversion methodology. The a posteriori flux uncertainties we obtain, ranging from 0.01 to 0.06 Pg C yr-1 across the biomes, would meet requirements for improved understanding of long-term carbon sinks suggested by a previous study.
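
    A hedged sketch of the posterior-uncertainty-reduction calculation typical of Gaussian Bayesian synthesis inversions: the posterior flux covariance is (H^T R^-1 H + B^-1)^-1 and the fractional reduction is 1 - sigma_post/sigma_prior. The footprint matrix, error levels, and dimensions below are invented toys, not the WRF-STILT/ASCENDS configuration.

        import numpy as np

        rng = np.random.default_rng(7)

        n_flux, n_obs = 20, 500                     # toy grid cells and column observations

        H = rng.exponential(0.5, size=(n_obs, n_flux))   # stand-in surface influence functions
        B = np.diag(np.full(n_flux, 1.0 ** 2))           # prior flux error covariance
        R = np.diag(np.full(n_obs, 0.5 ** 2))            # measurement error covariance (ppm^2)

        # Gaussian Bayesian synthesis inversion: posterior covariance of the fluxes.
        A_post = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))

        sigma_prior = np.sqrt(np.diag(B))
        sigma_post = np.sqrt(np.diag(A_post))
        reduction = 100 * (1.0 - sigma_post / sigma_prior)
        print("uncertainty reduction per cell (%):", np.round(reduction, 1))
        print("mean reduction (%):", round(reduction.mean(), 1))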

  8. Rejoinder: Certainty, Doubt, and the Reduction of Uncertainty

    ERIC Educational Resources Information Center

    Kauffman, James M.; Sasso, Gary M.

    2006-01-01

    Postmodern arguments about doubt, certainty, and objectivity are both old and unsound. All philosophical relativity, or postmodernism by whatever name it is known, denies the possibility of objective truth. Postmodernists' arguments for reducing uncertainty or approximating truth are apparently nonexistent, and their method of reducing uncertainty…

  9. REDD+ emissions estimation and reporting: dealing with uncertainty

    NASA Astrophysics Data System (ADS)

    Pelletier, Johanne; Martin, Davy; Potvin, Catherine

    2013-09-01

    The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet quality standards required to serve as compliance grade offsets for developed countries’ emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis distinguishing between ‘modelling sources’ of uncertainty, which refers to model-specific parameters and assumptions, and ‘recurring sources’ of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The highest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensure consistency in REDD+ reporting which is essential for securing environmental integrity. Due to the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to datasets and methodology used to evaluate reference level and emission reductions would strengthen the credibility of the system by promoting accountability and transparency. To secure conservativeness and deal with uncertainty, we consider the need for further research using real data available to developing countries to test the applicability of conservative discounts including the trend uncertainty and other possible options that would allow real incentives and stimulate improvements over time. Finally, we argue that REDD+ result-based actions assessed on the basis of a dashboard of performance indicators, not only in ‘tonnes CO2 equ. per year’ might provide a more holistic approach, at least until better accuracy and certainty of forest carbon stocks emission and removal estimates to support a REDD+ policy can be reached.

  10. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
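
    A minimal sketch of a multi-point errors-in-variables (orthogonal distance regression) normalization fit, assuming invented reference-material values and uncertainties rather than any actual assignments; it illustrates regressing assigned against measured delta values with uncertainty on both axes and propagating the calibration uncertainty to an unknown sample.

        import numpy as np
        from scipy import odr

        # Measured and assigned delta values (per mil) for three reference materials,
        # with standard uncertainties on both axes (illustrative numbers only).
        measured   = np.array([-29.5,  0.2,  37.8])
        u_measured = np.array([  0.10, 0.08,  0.12])
        assigned   = np.array([-30.0,  0.0,  38.5])
        u_assigned = np.array([  0.05, 0.02,  0.06])

        def linear(beta, x):
            return beta[0] * x + beta[1]

        # Errors-in-variables (orthogonal distance) regression: both the measurements
        # and the assigned reference values carry uncertainty.
        data = odr.RealData(measured, assigned, sx=u_measured, sy=u_assigned)
        fit = odr.ODR(data, odr.Model(linear), beta0=[1.0, 0.0]).run()

        slope, intercept = fit.beta
        u_slope, u_intercept = fit.sd_beta
        print(f"slope = {slope:.4f} +/- {u_slope:.4f}")
        print(f"intercept = {intercept:.3f} +/- {u_intercept:.3f}")

        # Normalize an unknown sample and propagate the calibration uncertainty
        # (neglecting the slope/intercept covariance for brevity).
        x0, u_x0 = 12.3, 0.10
        y0 = slope * x0 + intercept
        u_y0 = np.sqrt((x0 * u_slope) ** 2 + u_intercept ** 2 + (slope * u_x0) ** 2)
        print(f"normalized value = {y0:.2f} +/- {u_y0:.2f}")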

  11. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  12. Dairy food products: good or bad for cardiometabolic disease?

    PubMed

    Lovegrove, Julie A; Givens, D Ian

    2016-12-01

    The prevalence of type 2 diabetes mellitus (T2DM) is increasing rapidly, and T2DM is a key risk factor for the development of CVD, now recognised as the leading cause of death globally. Dietary strategies to reduce CVD development include reduction of saturated fat intake. Milk and dairy products are the largest contributors to dietary saturated fats in the UK and reduced consumption is often recommended as a strategy for risk reduction. However, overall evidence from prospective cohort studies does not confirm a detrimental association between dairy product consumption and CVD risk. The present review critically evaluates the current evidence on the association between milk and dairy products and risk of CVD, T2DM and the metabolic syndrome (collectively, cardiometabolic disease). The effects of total and individual dairy foods on cardiometabolic risk factors and new information on the effects of the food matrix on reducing fat digestion are also reviewed. It is concluded that a policy to lower SFA intake by reducing dairy food consumption to reduce cardiometabolic disease risk is likely to have limited or possibly negative effects. There remain many uncertainties, including differential effects of different dairy products and those of differing fat content. Focused and suitably designed and powered studies are needed to provide clearer evidence not only of the mechanisms involved, but also of how they may be beneficially influenced during milk production and processing.

  13. Robust Derivation of Risk Reduction Strategies

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Port, Daniel; Feather, Martin

    2007-01-01

    Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the Strategic Method defined by Kazman, Port et al [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affects the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.

  14. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components - model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) Interpolating raw and corrected GCM outputs to a common grid; (2) Converting these to percentiles; (3) Estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) Transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is fully applicable to the assessment of uncertainties that any modelling framework may be subject to. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5 the RCP2.6, RCP4.5 and RCP8.5 scenarios, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
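
    The SREV metric itself is defined in the paper; as an illustrative stand-in only, the sketch below segregates the spread of a synthetic ensemble, indexed by model, scenario, and ensemble member, into three ANOVA-style components and a combined total, which captures the spirit of step (3) without reproducing the published formula.

        import numpy as np

        rng = np.random.default_rng(8)

        # Simulated precipitation percentile value at one grid cell and month,
        # indexed [model, scenario, ensemble member] (synthetic numbers).
        n_model, n_scen, n_ens = 13, 3, 5
        x = (rng.normal(0.0, 1.0, (n_model, 1, 1))              # model differences
             + rng.normal(0.0, 0.4, (1, n_scen, 1))             # scenario differences
             + rng.normal(0.0, 0.2, (n_model, n_scen, n_ens)))  # internal variability

        # ANOVA-style segregation of the spread into three components, each reported
        # on the scale of the variable (square root of an error variance).
        model_comp = np.sqrt(x.mean(axis=(1, 2)).var())
        scen_comp  = np.sqrt(x.mean(axis=(0, 2)).var())
        ens_comp   = np.sqrt(x.var(axis=2).mean())
        total      = np.sqrt(model_comp**2 + scen_comp**2 + ens_comp**2)

        for name, val in [("model", model_comp), ("scenario", scen_comp),
                          ("ensemble", ens_comp), ("total", total)]:
            print(f"{name:9s}: {val:.2f}")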

  15. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  16. Per-pack price reductions available from different cigarette purchasing strategies: United States, 2009-2010.

    PubMed

    Pesko, Michael F; Xu, Xin; Tynan, Michael A; Gerzoff, Robert B; Malarcher, Ann M; Pechacek, Terry F

    2014-06-01

    Following cigarette excise tax increases, smokers may use cigarette price minimization strategies to continue their usual cigarette consumption rather than reducing consumption or quitting. This reduces the public health benefits of the tax increase. This paper estimates the price reductions for a wide range of strategies, compensating for overlapping strategies. We performed regression analysis on the 2009-2010 National Adult Tobacco Survey (N=13,394) to explore price reductions that smokers in the United States obtained from purchasing cigarettes. We examined five cigarette price minimization strategies: 1) purchasing discount brand cigarettes, 2) using price promotions, 3) purchasing cartons, 4) purchasing on Indian reservations, and 5) purchasing online. Price reductions from these strategies were estimated jointly to compensate for overlapping strategies. Each strategy provided price reductions between 26 and 99 cents per pack. Combined price reductions were possible. Additionally, price promotions were used with regular brands to obtain larger price reductions than when price promotions were used with generic brands. Smokers can realize large price reductions from price minimization strategies, and there are many strategies available. Policymakers and public health officials should be aware of the extent to which these strategies can reduce cigarette prices. Published by Elsevier Inc.

  17. Mental Health Experiences of Older Adults Living with HIV: Uncertainty, Stigma, and Approaches to Resilience.

    PubMed

    Furlotte, Charles; Schwartz, Karen

    2017-06-01

    This study describes the mental health experiences of older adults living with HIV in Ottawa. Eleven participants aged 52 to 67 completed in-depth personal interviews. Mental health concerns pervaded the lives of these older adults. We identified three central themes common to the participants' stories: uncertainty, stigma, and resilience. For some of these participants, uncertainty impacting mental health centred on unexpected survival; interpretation of one's symptoms; and medical uncertainty. Participants' experiences of stigma included discrimination in health care interactions; misinformation; feeling stigmatized due to aspects of their physical appearance; compounded stigma; and anticipated stigma. Participants reported using several coping strategies, which we frame as individual approaches to resilience. These strategies include reducing the space that HIV takes up in one's life; making lifestyle changes to accommodate one's illness; and engaging with social support. These findings inform understandings of services for people aging with HIV who may experience mental health concerns.

  18. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

    Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a strong influence on lateral car dynamics. This motivates the need for robust design against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem where especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces and the achieved improvements confirm the validity of the proposed procedure.

  19. Defining and Measuring Diagnostic Uncertainty in Medicine: A Systematic Review.

    PubMed

    Bhise, Viraj; Rajan, Suja S; Sittig, Dean F; Morgan, Robert O; Chaudhary, Pooja; Singh, Hardeep

    2018-01-01

    Physicians routinely encounter diagnostic uncertainty in practice. Despite its impact on health care utilization, costs and error, measurement of diagnostic uncertainty is poorly understood. We conducted a systematic review to describe how diagnostic uncertainty is defined and measured in medical practice. We searched OVID Medline and PsycINFO databases from inception until May 2017 using a combination of keywords and Medical Subject Headings (MeSH). Additional search strategies included manual review of references identified in the primary search, use of a topic-specific database (AHRQ-PSNet) and expert input. We specifically focused on articles that (1) defined diagnostic uncertainty; (2) conceptualized diagnostic uncertainty in terms of its sources, complexity of its attributes or strategies for managing it; or (3) attempted to measure diagnostic uncertainty. We identified 123 articles for full review, none of which defined diagnostic uncertainty. Three attributes of diagnostic uncertainty were relevant for measurement: (1) it is a subjective perception experienced by the clinician; (2) it has the potential to impact diagnostic evaluation-for example, when inappropriately managed, it can lead to diagnostic delays; and (3) it is dynamic in nature, changing with time. Current methods for measuring diagnostic uncertainty in medical practice include: (1) asking clinicians about their perception of uncertainty (surveys and qualitative interviews), (2) evaluating the patient-clinician encounter (such as by reviews of medical records, transcripts of patient-clinician communication and observation), and (3) experimental techniques (patient vignette studies). The term "diagnostic uncertainty" lacks a clear definition, and there is no comprehensive framework for its measurement in medical practice. Based on review findings, we propose that diagnostic uncertainty be defined as a "subjective perception of an inability to provide an accurate explanation of the patient's health problem." Methodological advancements in measuring diagnostic uncertainty can improve our understanding of diagnostic decision-making and inform interventions to reduce diagnostic errors and overuse of health care resources.

  20. Reducing uncertainty in estimating virus reduction by advanced water treatment processes.

    PubMed

    Gerba, Charles P; Betancourt, Walter Q; Kitajima, Masaaki; Rock, Channah M

    2018-04-15

    Treatment of wastewater for potable reuse requires the reduction of enteric viruses to levels that pose no significant risk to human health. Advanced water treatment trains (e.g., chemical clarification, reverse osmosis, ultrafiltration, advanced oxidation) have been developed to provide reductions of viruses to differing levels of regulatory control depending upon the levels of human exposure and associated health risks. Important in any assessment is information on the concentration and types of viruses in the untreated wastewater, as well as the degree of removal by each treatment process. However, it is critical that the uncertainty associated with virus concentration and removal or inactivation by wastewater treatment be understood, both to improve these estimates and to identify research needs. We critically reviewed the literature to identify the uncertainty in these estimates. Biological diversity within families and genera of viruses (e.g. enteroviruses, rotaviruses, adenoviruses, reoviruses, noroviruses) and specific virus types (e.g. serotypes or genotypes) creates the greatest uncertainty. These aspects affect the methods for detection and quantification of viruses and anticipated removal efficiency by treatment processes. Approaches to reduce uncertainty may include: (1) inclusion of a virus indicator for assessing efficiency of virus concentration and detection by molecular methods for each sample, (2) use of viruses most resistant to individual treatment processes (e.g. adenoviruses for UV light disinfection and reoviruses for chlorination), (3) data on the ratio of virion or genome copies to infectivity in untreated wastewater, and (4) assessment of virus removal at field scale treatment systems to verify laboratory and pilot plant data for virus removal. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.
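
    The paper's contribution is a multi-point, constrained, robust infilling criterion; as a simpler single-point illustration of objective-oriented sequential sampling, the sketch below fits a Gaussian process metamodel and adds, at each iteration, the candidate maximizing expected improvement, which trades off exploiting the current best design against reducing metamodel uncertainty. The test function and settings are invented.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        rng = np.random.default_rng(9)

        def expensive_model(x):
            # Stand-in for a costly simulation response.
            return np.sin(3.0 * x) + 0.5 * x ** 2

        X = rng.uniform(-2, 2, size=(5, 1))          # initial design
        y = expensive_model(X).ravel()
        candidates = np.linspace(-2, 2, 400).reshape(-1, 1)

        for it in range(10):
            gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
            gp.fit(X, y)
            mu, sd = gp.predict(candidates, return_std=True)

            # Expected improvement over the best observed value (minimization).
            best = y.min()
            z = (best - mu) / np.maximum(sd, 1e-12)
            ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)

            x_new = candidates[np.argmax(ei)]
            X = np.vstack([X, x_new])
            y = np.append(y, expensive_model(x_new)[0])

        print(f"best design after infill: x = {X[np.argmin(y)][0]:.3f}, "
              f"f = {y.min():.3f}")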

  2. Decision analysis of emergency ventilation and evacuation strategies against suddenly released contaminant indoors by considering the uncertainty of source locations.

    PubMed

    Cai, Hao; Long, Weiding; Li, Xianting; Kong, Lingjuan; Xiong, Shuang

    2010-06-15

    In case hazardous contaminants are suddenly released indoors, the prompt and proper emergency responses are critical to protect occupants. This paper aims to provide a framework for determining the optimal combination of ventilation and evacuation strategies by considering the uncertainty of source locations. The certainty of source locations is classified as complete certainty, incomplete certainty, and complete uncertainty to cover all the possible situations. According to this classification, three types of decision analysis models are presented. A new concept, efficiency factor of contaminant source (EFCS), is incorporated in these models to evaluate the payoffs of the ventilation and evacuation strategies. A procedure of decision-making based on these models is proposed and demonstrated by numerical studies of one hundred scenarios with ten ventilation modes, two evacuation modes, and five source locations. The results show that the models can be useful to direct the decision analysis of both the ventilation and evacuation strategies. In addition, the certainty of the source locations has an important effect on the outcomes of the decision-making. Copyright 2010 Elsevier B.V. All rights reserved.
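
    A minimal sketch of the decision-matrix reasoning described above: payoffs (for example an EFCS-like effectiveness score) are tabulated per strategy and per candidate source location, strategies are ranked by expected payoff when a probability distribution over locations is available (incomplete certainty), and by the worst case when the location is completely uncertain. All numbers are invented.

        import numpy as np

        # Rows: candidate strategies (ventilation mode x evacuation route).
        # Columns: possible contaminant source locations.
        # Entries: payoff, e.g. an EFCS-like effectiveness score (higher is better).
        payoff = np.array([
            [0.9, 0.4, 0.6, 0.5, 0.7],
            [0.5, 0.8, 0.7, 0.6, 0.4],
            [0.6, 0.6, 0.9, 0.7, 0.6],
        ])
        strategies = ["vent A + exit 1", "vent B + exit 2", "vent C + exit 1"]

        # Incomplete certainty: a probability distribution over source locations.
        p_location = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
        expected = payoff @ p_location
        print("best under expected payoff:", strategies[int(np.argmax(expected))])

        # Complete uncertainty: fall back to the maximin (worst-case) criterion.
        worst_case = payoff.min(axis=1)
        print("best under worst case     :", strategies[int(np.argmax(worst_case))])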

  3. Sleep deprivation alters choice strategy without altering uncertainty or loss aversion preferences

    PubMed Central

    Mullette-Gillman, O'Dhaniel A.; Kurnianingsih, Yoanna A.; Liu, Jean C. J.

    2015-01-01

    Sleep deprivation alters decision making; however, it is unclear what specific cognitive processes are modified to drive altered choices. In this manuscript, we examined how one night of total sleep deprivation (TSD) alters economic decision making. We specifically examined changes in uncertainty preferences dissociably from changes in the strategy with which participants engage with presented choice information. With high test-retest reliability, we show that TSD does not alter uncertainty preferences or loss aversion. Rather, TSD alters the information the participants rely upon to make their choices. Utilizing a choice strategy metric which contrasts the influence of maximizing and satisficing information on choice behavior, we find that TSD alters the relative reliance on maximizing information and satisficing information, in the gains domain. This alteration is the result of participants both decreasing their reliance on cognitively-complex maximizing information and a concomitant increase in the use of readily-available satisficing information. TSD did not result in a decrease in overall information use in either domain. These results show that sleep deprivation alters decision making by altering the informational strategies that participants employ, without altering their preferences. PMID:26500479

  4. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  5. A detailed description of the uncertainty analysis for High Area Ratio Rocket Nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  6. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations in which subject-matter experts on pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The resulting comprehensive description of uncertainty will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. A vehicle stability control strategy with adaptive neural network sliding mode theory based on system uncertainty approximation

    NASA Astrophysics Data System (ADS)

    Ji, Xuewu; He, Xiangkun; Lv, Chen; Liu, Yahui; Wu, Jian

    2018-06-01

    Modelling uncertainty, parameter variation and unknown external disturbance are the major concerns in the development of an advanced controller for vehicle stability at the limits of handling. The sliding mode control (SMC) method has proved to be robust against parameter variation and unknown external disturbance with satisfactory tracking performance. But modelling uncertainty, such as errors introduced by model simplification, is inevitable in model-based controller design, resulting in lowered control quality. The adaptive radial basis function network (ARBFN) can effectively improve the control performance against large system uncertainty by learning to approximate arbitrary nonlinear functions and ensure the global asymptotic stability of the closed-loop system. In this paper, a novel vehicle dynamics stability control strategy is proposed using the adaptive radial basis function network sliding mode control (ARBFN-SMC) to learn system uncertainty and eliminate its adverse effects. This strategy adopts a hierarchical control structure which consists of a reference model layer, a yaw moment control layer, a braking torque allocation layer and an executive layer. Co-simulation using MATLAB/Simulink and AMESim is conducted on a verified 15-DOF nonlinear vehicle system model with the integrated-electro-hydraulic brake system (I-EHB) actuator in a Sine With Dwell manoeuvre. The simulation results show that the ARBFN-SMC scheme exhibits superior stability and tracking performance in different running conditions compared with the SMC scheme.
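
    The controller described above uses an adaptive RBF network to approximate system uncertainty inside a sliding-mode law. The sketch below illustrates that general idea on a scalar toy plant; the plant, disturbance and gains are hypothetical, and this is not the paper's hierarchical vehicle controller.

    ```python
    import numpy as np

    # Generic sketch: a radial-basis-function (RBF) network learns an unknown
    # disturbance d(x) online and cancels it inside a sliding-mode control law.
    centers = np.linspace(-2.0, 2.0, 9)          # RBF centers (hypothetical)
    width = 0.5                                  # RBF width

    def phi(x):                                  # Gaussian basis vector
        return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

    def d_true(x):                               # unknown disturbance (toy)
        return 0.8 * np.sin(2 * x) + 0.3 * x

    x, w = 0.0, np.zeros_like(centers)           # plant state, RBF weights
    k, gamma, dt = 5.0, 20.0, 1e-3               # SMC gain, adaptation gain, time step

    for step in range(5000):
        t = step * dt
        x_ref, dx_ref = np.sin(t), np.cos(t)     # reference trajectory
        s = x - x_ref                            # sliding surface (tracking error)
        u = dx_ref - k * s - w @ phi(x)          # SMC term plus learned cancellation
        w += gamma * phi(x) * s * dt             # adaptive law driven by s
        x += (u + d_true(x)) * dt                # first-order toy plant

    print(f"final sliding-surface magnitude |s| = {abs(s):.4f}")
    ```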

  8. Understanding climate policy data needs

    NASA Astrophysics Data System (ADS)

    Brown, Molly E.; Macauley, Molly

    2012-08-01

    NASA Carbon Monitoring System: Characterizing Flux Uncertainty; Washington, D.C., 11 January 2012. Climate policy in the United States is currently guided by public-private partnerships and actions at the local and state levels that focus on energy efficiency, renewable energy, agricultural practices, and implementation of technologies to reduce greenhouse gases. How will policy makers know if these strategies are working, particularly at the scales at which they are being implemented? The NASA Carbon Monitoring System (CMS) will provide information on carbon dioxide (CO2) fluxes derived from observations of Earth's land, ocean, and atmosphere used in state-of-the-art models describing their interactions. This new modeling system could be used to assess the impact of specific policy interventions on reductions of atmospheric CO2 concentrations, enabling an iterative, results-oriented policy process.

  9. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It has become increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimating measurement uncertainty is often not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition, the different ways of taking uncertainty into account in compliance assessment are explained.
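
    One of the compliance-assessment possibilities mentioned above is the common decision rule based on the expanded uncertainty U = k·u. A minimal illustration follows; the result, uncertainty and limit values are hypothetical, not taken from ISO/IEC 17025 itself.

    ```python
    # Illustrative compliance check against an upper specification limit using the
    # expanded uncertainty U = k * u (coverage factor k ~ 2 for ~95 % coverage).

    def assess_compliance(result, u_standard, limit, k=2.0):
        U = k * u_standard                      # expanded uncertainty
        if result + U <= limit:
            return "compliant (even allowing for uncertainty)"
        if result - U > limit:
            return "non-compliant (even allowing for uncertainty)"
        return "inconclusive: the limit lies inside the uncertainty interval"

    # Hypothetical example: nitrate in drinking water against a 50 mg/L limit
    print(assess_compliance(result=48.2, u_standard=1.5, limit=50.0))
    ```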

  10. Beyond optimality: Multistakeholder robustness tradeoffs for regional water portfolio planning under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.

    2014-10-01

    While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results suggest that a modest reduction in the projected rate of demand growth (from approximately 3% per year to 2.4%) will substantially improve the utilities' robustness to future uncertainty and reduce the potential for regional tensions. The proposed multistakeholder MORDM framework offers critical insights into the risks and challenges posed by rising water demands and hydrological uncertainties, providing a planning template for regions now forced to confront rapidly evolving water scarcity risks.
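
    The robustness evaluation described above can be illustrated with a generic domain-criterion robustness measure: the fraction of sampled deeply uncertain futures in which a candidate portfolio still meets its performance criteria. The toy supply model, sampling ranges and threshold below are hypothetical stand-ins for the study's regional simulation, and this is not the paper's exact metric.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def reliability(demand_growth, inflow_factor, restriction_trigger):
        """Toy supply-reliability model (placeholder for the full regional simulation)."""
        supply = 1.0 * inflow_factor + 0.3 * restriction_trigger
        demand = (1 + demand_growth) ** 15
        return min(1.0, supply / demand)

    n = 10_000
    demand_growth = rng.uniform(0.01, 0.04, n)      # deeply uncertain factors (hypothetical ranges)
    inflow_factor = rng.uniform(0.7, 1.2, n)
    portfolio = {"restriction_trigger": 0.6}        # one candidate portfolio lever

    meets = np.array([
        reliability(g, q, portfolio["restriction_trigger"]) >= 0.98
        for g, q in zip(demand_growth, inflow_factor)
    ])
    print(f"robustness (fraction of sampled futures meeting the criterion): {meets.mean():.2f}")
    ```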

  11. Incorporating uncertainty into the ranking of SPARROW model nutrient yields from Mississippi/Atchafalaya River basin watersheds

    USGS Publications Warehouse

    Robertson, Dale M.; Schwarz, Gregory E.; Saad, David A.; Alexander, Richard B.

    2009-01-01

    Excessive loads of nutrients transported by tributary rivers have been linked to hypoxia in the Gulf of Mexico. Management efforts to reduce the hypoxic zone in the Gulf of Mexico and improve the water quality of rivers and streams could benefit from targeting nutrient reductions toward watersheds with the highest nutrient yields delivered to sensitive downstream waters. One challenge is that most conventional watershed modeling approaches (e.g., mechanistic models) used in these management decisions do not consider uncertainties in the predictions of nutrient yields and their downstream delivery. The increasing use of parameter estimation procedures to statistically estimate model coefficients, however, allows uncertainties in these predictions to be reliably estimated. Here, we use a robust bootstrapping procedure applied to the results of a previous application of the hybrid statistical/mechanistic watershed model SPARROW (Spatially Referenced Regression On Watershed attributes) to develop a statistically reliable method for identifying “high priority” areas for management, based on a probabilistic ranking of delivered nutrient yields from watersheds throughout a basin. The method is designed to be used by managers to prioritize watersheds where additional stream monitoring and evaluations of nutrient-reduction strategies could be undertaken. Our ranking procedure incorporates information on the confidence intervals of model predictions and the corresponding watershed rankings of the delivered nutrient yields. From this quantified uncertainty, we estimate the probability that individual watersheds are among a collection of watersheds that have the highest delivered nutrient yields. We illustrate the application of the procedure to 818 eight-digit Hydrologic Unit Code watersheds in the Mississippi/Atchafalaya River basin by identifying 150 watersheds having the highest delivered nutrient yields to the Gulf of Mexico. Highest delivered yields were from watersheds in the Central Mississippi, Ohio, and Lower Mississippi River basins. With 90% confidence, only a few watersheds can be reliably placed into the highest 150 category; however, many more watersheds can be removed from consideration as not belonging to the highest 150 category. Results from this ranking procedure provide robust information on watershed nutrient yields that can benefit management efforts to reduce nutrient loadings to downstream coastal waters, such as the Gulf of Mexico, or to local receiving streams and reservoirs.
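
    The core of the ranking procedure can be sketched as follows, with synthetic yields standing in for SPARROW output: for each bootstrap replicate of the predicted delivered yields, record whether each watershed falls in the top 150; the per-watershed frequency then estimates its probability of belonging to the highest-yield group.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Probabilistic ranking from bootstrap replicates of predicted delivered yields
    # (synthetic numbers stand in for SPARROW predictions and their errors).
    n_watersheds, n_boot, top_k = 818, 2000, 150

    mean_yield = rng.lognormal(mean=0.0, sigma=1.0, size=n_watersheds)   # hypothetical yields
    rel_se = 0.3                                   # hypothetical relative prediction error

    in_top = np.zeros(n_watersheds)
    for _ in range(n_boot):
        replicate = mean_yield * rng.lognormal(0.0, rel_se, n_watersheds)
        top = np.argsort(replicate)[-top_k:]       # indices of the top-150 yields in this replicate
        in_top[top] += 1

    prob_top = in_top / n_boot
    print(f"watersheds reliably IN the top {top_k} (prob >= 0.9): {(prob_top >= 0.9).sum()}")
    print(f"watersheds reliably OUT of the top {top_k} (prob <= 0.1): {(prob_top <= 0.1).sum()}")
    ```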

  12. Fate of organic microcontaminants in wastewater treatment and river systems: An uncertainty assessment in view of sampling strategy, and compound consumption rate and degradability.

    PubMed

    Aymerich, I; Acuña, V; Ort, C; Rodríguez-Roda, I; Corominas, Ll

    2017-11-15

    The growing awareness of the relevance of organic microcontaminants on the environment has led to a growing number of studies on attenuation of these compounds in wastewater treatment plants (WWTP) and rivers. However, the effects of the sampling strategies (frequency and duration of composite samples) on the attenuation estimates are largely unknown. Our goal was to assess how frequency and duration of composite samples influence uncertainty of the attenuation estimates in WWTPs and rivers. Furthermore, we also assessed how compound consumption rate and degradability influence uncertainty. The assessment was conducted through simulating the integrated wastewater system of Puigcerdà (NE Iberian Peninsula) using a sewer pattern generator and a coupled model of WWTP and river. Results showed that the sampling strategy is especially critical at the influent of the WWTP, particularly when the number of toilet flushes containing the compound of interest is small (≤100 toilet flushes with compound per day), and less critical at the effluent of the WWTP and in the river due to the mixing effects of the WWTP. For example, at the WWTP, when evaluating a compound that is present in 50 pulses per day using a sampling frequency of 15 min to collect a 24-h composite sample, the attenuation uncertainty can range from 94% (0% degradability) to 9% (90% degradability). The estimation of attenuation in rivers is less critical than in WWTPs, as the attenuation uncertainty was lower than 10% for all evaluated scenarios. Interestingly, the errors in the estimates of attenuation are usually lower than those of loads for most sampling strategies and compound characteristics (e.g. consumption and degradability), although the opposite occurs for compounds with low consumption and inappropriate sampling strategies at the WWTP. Hence, when designing a sampling campaign, one should consider the influence of compounds' consumption and degradability as well as the desired level of accuracy in attenuation estimations. Copyright © 2017 Elsevier Ltd. All rights reserved.
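
    The sensitivity to influent sampling frequency can be illustrated with a toy simulation: a compound arriving in a few short pulses per day is easily missed or over-weighted by time-proportional grab samples composited over 24 h. The pulse width, grab interval and pulse counts below are hypothetical, not the Puigcerdà settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def load_error(pulses_per_day, grab_interval_min=15, pulse_width_min=2.0, n_days=500):
        """Std. dev. of the relative error of a 24-h composite load estimate (toy model)."""
        errors = []
        grab_times = np.arange(0, 24 * 60, grab_interval_min)
        for _ in range(n_days):
            starts = rng.uniform(0, 24 * 60, pulses_per_day)      # pulse start times (min)
            # concentration seen by a grab = number of pulses "in transit" at that instant
            conc = np.array([((t >= starts) & (t < starts + pulse_width_min)).sum()
                             for t in grab_times], dtype=float)
            true_load = pulses_per_day * pulse_width_min          # arbitrary mass units
            est_load = conc.mean() * 24 * 60                      # composite-sample estimate
            errors.append((est_load - true_load) / true_load)
        return np.std(errors)

    for n in (10, 50, 200, 1000):
        print(f"{n:5d} pulses/day -> relative load uncertainty ~ {load_error(n):.2f}")
    ```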

  13. Evaluating vaccination strategies to control foot-and-mouth disease: a model comparison study.

    PubMed

    Roche, S E; Garner, M G; Sanson, R L; Cook, C; Birch, C; Backer, J A; Dube, C; Patyk, K A; Stevenson, M A; Yu, Z D; Rawdon, T G; Gauntlett, F

    2015-04-01

    Simulation models can offer valuable insights into the effectiveness of different control strategies and act as important decision support tools when comparing and evaluating outbreak scenarios and control strategies. An international modelling study was performed to compare a range of vaccination strategies in the control of foot-and-mouth disease (FMD). Modelling groups from five countries (Australia, New Zealand, USA, UK, The Netherlands) participated in the study. Vaccination is increasingly being recognized as a potentially important tool in the control of FMD, although there is considerable uncertainty as to how and when it should be used. We sought to compare model outputs and assess the effectiveness of different vaccination strategies in the control of FMD. Using a standardized outbreak scenario based on data from an FMD exercise in the UK in 2010, the study showed general agreement between respective models in terms of the effectiveness of vaccination. Under the scenario assumptions, all models demonstrated that vaccination with 'stamping-out' of infected premises led to a significant reduction in predicted epidemic size and duration compared to the 'stamping-out' strategy alone. For all models there were advantages in vaccinating cattle-only rather than all species, using 3-km vaccination rings immediately around infected premises, and starting vaccination earlier in the control programme. This study has shown that certain vaccination strategies are robust even to substantial differences in model configurations. This result should increase end-user confidence in conclusions drawn from model outputs. These results can be used to support and develop effective policies for FMD control.

  14. Current and future role of genetic screening in gynecologic malignancies.

    PubMed

    Ring, Kari L; Garcia, Christine; Thomas, Martha H; Modesitt, Susan C

    2017-11-01

    The world of hereditary cancers has seen exponential growth in recent years. While hereditary breast and ovarian cancer and Lynch syndrome account for the majority of mutations encountered by gynecologists, newly identified deleterious genetic mutations continue to be unearthed with their associated risks of malignancies. However, these advances in genetic cancer predispositions then force practitioners and their patients to confront the uncertainties of these less commonly identified mutations and the fact that there is limited evidence to guide them in expected cancer risk and appropriate risk-reduction strategies. Given the speed of information, it is imperative to involve cancer genetics experts when counseling these patients. In addition, coordination of screening and care in conjunction with specialty high-risk clinics, if available, allows for patients to have centralized management for multiple cancer risks under the guidance of physicians with experience counseling these patients. The objective of this review is to present the current literature regarding genetic mutations associated with gynecologic malignancies as well to propose screening and risk-reduction options for these high-risk patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Menu labeling as a potential strategy for combating the obesity epidemic: a health impact assessment.

    PubMed

    Kuo, Tony; Jarosz, Christopher J; Simon, Paul; Fielding, Jonathan E

    2009-09-01

    We conducted a health impact assessment to quantify the potential impact of a state menu-labeling law on population weight gain in Los Angeles County, California. We utilized published and unpublished data to model consumer response to point-of-purchase calorie postings at large chain restaurants in Los Angeles County. We conducted sensitivity analyses to account for uncertainty in consumer response and in the total annual revenue, market share, and average meal price of large chain restaurants in the county. Assuming that 10% of the restaurant patrons would order reduced-calorie meals in response to calorie postings, resulting in an average reduction of 100 calories per meal, we estimated that menu labeling would avert 40.6% of the 6.75 million pound average annual weight gain in the county population aged 5 years and older. Substantially larger impacts would be realized if higher percentages of patrons ordered reduced-calorie meals or if average per-meal calorie reductions increased. Our findings suggest that mandated menu labeling could have a sizable salutary impact on the obesity epidemic, even with only modest changes in consumer behavior.
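
    The structure of the underlying arithmetic is shown below with clearly hypothetical inputs (the county-specific meal counts used in the assessment are not reproduced here), using the common approximation of roughly 3,500 kcal per pound of body weight.

    ```python
    # Back-of-the-envelope structure of the health-impact calculation; all inputs
    # except the 10%/100-kcal assumptions quoted in the abstract are hypothetical.
    KCAL_PER_POUND = 3500                 # common body-weight-change approximation

    meals_per_year = 300e6                # hypothetical: annual meals at affected chain restaurants
    responding_fraction = 0.10            # patrons ordering reduced-calorie meals
    kcal_reduction_per_meal = 100         # average reduction per affected meal

    pounds_averted = (meals_per_year * responding_fraction
                      * kcal_reduction_per_meal / KCAL_PER_POUND)
    print(f"population weight gain averted: {pounds_averted / 1e6:.2f} million lb/year")
    ```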

  16. Local setup errors in image-guided radiotherapy for head and neck cancer patients immobilized with a custom-made device.

    PubMed

    Giske, Kristina; Stoiber, Eva M; Schwarz, Michael; Stoll, Armin; Muenter, Marc W; Timke, Carmen; Roeder, Falk; Debus, Juergen; Huber, Peter E; Thieke, Christian; Bendl, Rolf

    2011-06-01

    To evaluate the local positioning uncertainties during fractionated radiotherapy of head-and-neck cancer patients immobilized using a custom-made fixation device and discuss the effect of possible patient correction strategies for these uncertainties. A total of 45 head-and-neck patients underwent regular control computed tomography scanning using an in-room computed tomography scanner. The local and global positioning variations of all patients were evaluated by applying a rigid registration algorithm. One bounding box around the complete target volume and nine local registration boxes containing relevant anatomic structures were introduced. The resulting uncertainties for a stereotactic setup and the deformations referenced to one anatomic local registration box were determined. Local deformations of the patients immobilized using our custom-made device were compared with previously published results. Several patient positioning correction strategies were simulated, and the residual local uncertainties were calculated. The patient anatomy in the stereotactic setup showed local systematic positioning deviations of 1-4 mm. The deformations referenced to a particular anatomic local registration box were similar to the reported deformations assessed from patients immobilized with commercially available Aquaplast masks. A global correction, including the rotational error compensation, decreased the remaining local translational errors. Depending on the chosen patient positioning strategy, the remaining local uncertainties varied considerably. Local deformations in head-and-neck patients occur even if an elaborate, custom-made patient fixation method is used. A rotational error correction decreased the required margins considerably. None of the considered correction strategies achieved perfect alignment. Therefore, weighting of anatomic subregions to obtain the optimal correction vector should be investigated in the future. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Per-pack price reductions available from different cigarette purchasing strategies: United States, 2009–2010

    PubMed Central

    Pesko, Michael F.; Xu, Xin; Tynan, Michael A.; Gerzoff, Robert B.; Malarcher, Ann M.; Pechacek, Terry F.

    2015-01-01

    Objective: Following cigarette excise tax increases, smokers may use cigarette price minimization strategies to continue their usual cigarette consumption rather than reducing consumption or quitting. This reduces the public health benefits of the tax increase. This paper estimates the price reductions for a wide range of strategies, compensating for overlapping strategies. Method: We performed regression analysis on the 2009–2010 National Adult Tobacco Survey (N = 13,394) to explore price reductions that smokers in the United States obtained from purchasing cigarettes. We examined five cigarette price minimization strategies: 1) purchasing discount brand cigarettes, 2) using price promotions, 3) purchasing cartons, 4) purchasing on Indian reservations, and 5) purchasing online. Price reductions from these strategies were estimated jointly to compensate for overlapping strategies. Results: Each strategy provided price reductions between 26 and 99 cents per pack. Combined price reductions were possible. Additionally, price promotions were used with regular brands to obtain larger price reductions than when price promotions were used with generic brands. Conclusion: Smokers can realize large price reductions from price minimization strategies, and there are many strategies available. Policymakers and public health officials should be aware of the extent to which these strategies can reduce cigarette prices. PMID:24594102

  18. "Maybe the Algae Was from the Filter": Maybe and Similar Modifiers as Mediational Tools and Indicators of Uncertainty and Possibility in Children's Science Talk

    ERIC Educational Resources Information Center

    Kirch, Susan A.; Siry, Christina A.

    2012-01-01

    Uncertainty is an essential component of scientific inquiry and it also permeates our daily lives. Understanding how to identify, evaluate, resolve and live in the presence of uncertainty is important for decision-making strategies and engaging in transformative actions. In contrast, confidence and certainty are prized in elementary school…

  19. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    Treesearch

    Harbin Li; Steven G. McNulty

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...

  20. A geostatistical approach for quantification of contaminant mass discharge uncertainty using multilevel sampler measurements

    NASA Astrophysics Data System (ADS)

    Li, K. Betty; Goovaerts, Pierre; Abriola, Linda M.

    2007-06-01

    Contaminant mass discharge across a control plane downstream of a dense nonaqueous phase liquid (DNAPL) source zone has great potential to serve as a metric for the assessment of the effectiveness of source zone treatment technologies and for the development of risk-based source-plume remediation strategies. However, too often the uncertainty of mass discharge estimated in the field is not accounted for in the analysis. In this paper, a geostatistical approach is proposed to estimate mass discharge and to quantify its associated uncertainty using multilevel transect measurements of contaminant concentration (C) and hydraulic conductivity (K). The approach adapts the p-field simulation algorithm to propagate and upscale the uncertainty of mass discharge from the local uncertainty models of C and K. Application of this methodology to numerically simulated transects shows that, with a regular sampling pattern, geostatistics can provide an accurate model of uncertainty for the transects that are associated with low levels of source mass removal (i.e., transects that have a large percentage of contaminated area). For high levels of mass removal (i.e., transects with a few hot spots and large areas of near-zero concentration), a total sampling area equivalent to 6-7% of the transect is required to achieve accurate uncertainty modeling. A comparison of the results for different measurement supports indicates that samples taken with longer screen lengths may lead to less accurate models of mass discharge uncertainty. The quantification of mass discharge uncertainty, in the form of a probability distribution, will facilitate risk assessment associated with various remediation strategies.
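
    The estimated quantity is the sum over transect cells of concentration × Darcy flux × cell area. The sketch below propagates local C and K uncertainty into a mass-discharge distribution by simple independent Monte Carlo sampling; the paper's p-field simulation additionally honours spatial correlation, and all values here are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Mass discharge across a control-plane transect: M = sum_i C_i * q_i * A_i,
    # with Darcy flux q_i = K_i * gradient. Independent sampling of local C and K
    # uncertainty per cell (spatial correlation deliberately omitted).
    n_cells, cell_area, gradient = 200, 0.25, 0.005          # m2 per cell, hydraulic gradient
    logC_mean, logC_sd = np.log(5.0), 0.8                    # local concentration uncertainty (mg/L)
    logK_mean, logK_sd = np.log(1e-4), 0.5                   # local conductivity uncertainty (m/s)

    n_real = 5000
    discharges = np.empty(n_real)
    for r in range(n_real):
        C = rng.lognormal(logC_mean, logC_sd, n_cells)       # mg/L = g/m3
        K = rng.lognormal(logK_mean, logK_sd, n_cells)       # m/s
        q = K * gradient                                     # Darcy flux, m/s
        discharges[r] = np.sum(C * q * cell_area) * 86400    # g/day

    lo, med, hi = np.percentile(discharges, [5, 50, 95])
    print(f"mass discharge: median {med:.1f} g/d, 90% interval [{lo:.1f}, {hi:.1f}] g/d")
    ```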

  1. Aging and loss decision making: increased risk aversion and decreased use of maximizing information, with correlated rationality and value maximization

    PubMed Central

    Kurnianingsih, Yoanna A.; Sim, Sam K. Y.; Chee, Michael W. L.; Mullette-Gillman, O’Dhaniel A.

    2015-01-01

    We investigated how adult aging specifically alters economic decision-making, focusing on examining alterations in uncertainty preferences (willingness to gamble) and choice strategies (what gamble information influences choices) within both the gains and losses domains. Within each domain, participants chose between certain monetary outcomes and gambles with uncertain outcomes. We examined preferences by quantifying how uncertainty modulates choice behavior as if altering the subjective valuation of gambles. We explored age-related preferences for two types of uncertainty: risk and ambiguity. Additionally, we explored how aging may alter what information participants utilize to make their choices by comparing the relative utilization of maximizing and satisficing information types through a choice strategy metric. Maximizing information was the ratio of the expected value of the two options, while satisficing information was the probability of winning. We found age-related alterations of economic preferences within the losses domain, but no alterations within the gains domain. Older adults (OA; 61–80 years old) were significantly more uncertainty averse for both risky and ambiguous choices. OA also exhibited choice strategies with decreased use of maximizing information. Within OA, we found a significant correlation between risk preferences and choice strategy. This linkage between preferences and strategy appears to derive from a convergence to risk neutrality driven by greater use of the effortful maximizing strategy. As utility maximization and value maximization intersect at risk neutrality, this result suggests that OA are exhibiting a relationship between enhanced rationality and enhanced value maximization. While there was variability in economic decision-making measures within OA, these individual differences were unrelated to variability within examined measures of cognitive ability. Our results demonstrate that aging alters economic decision-making for losses through changes in both individual preferences and the strategies individuals employ. PMID:26029092
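
    The two information types contrasted by the choice-strategy metric can be written down directly for a single certain-versus-gamble trial: maximizing information is the ratio of the options' expected values, and satisficing information is the gamble's probability of winning. The amounts below are illustrative, not stimuli from the study.

    ```python
    # One hypothetical gains-domain trial: a sure payoff versus a risky gamble.
    certain_amount = 20.0          # sure payoff
    gamble_amount = 55.0           # payoff if the gamble wins
    p_win = 0.40                   # probability of winning the gamble

    expected_value_gamble = p_win * gamble_amount
    maximizing_info = expected_value_gamble / certain_amount   # EV ratio (effortful to compute)
    satisficing_info = p_win                                    # readily available cue

    print(f"maximizing information (EV ratio): {maximizing_info:.2f}")
    print(f"satisficing information (P(win)) : {satisficing_info:.2f}")
    ```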

  2. A case study of view-factor rectification procedures for diffuse-gray radiation enclosure computations

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio

    1995-01-01

    The view factors which are used in diffuse-gray radiation enclosure calculations are often computed by approximate numerical integrations. These approximately calculated view factors will usually not satisfy the important physical constraints of reciprocity and closure. In this paper several view-factor rectification algorithms are reviewed and a rectification algorithm based on a least-squares numerical filtering scheme is proposed in both weighted and unweighted forms. A Monte Carlo investigation is undertaken to study the propagation of view-factor and surface-area uncertainties into the heat transfer results of the diffuse-gray enclosure calculations. It is found that the weighted least-squares algorithm is vastly superior to the other rectification schemes for the reduction of the heat-flux sensitivities to view-factor uncertainties. In a sample problem, which has proven to be very sensitive to uncertainties in view factor, the heat transfer calculations with weighted least-squares rectified view factors are very good with an original view-factor matrix computed to only one-digit accuracy. All of the algorithms had roughly equivalent effects on the reduction in sensitivity to area uncertainty in this case study.
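
    A minimal, unweighted version of the rectification idea: project an approximate view-factor matrix onto the set satisfying reciprocity (A_i F_ij = A_j F_ji) and closure (rows summing to one) by equality-constrained least squares. The paper's weighted variant additionally scales the residuals by uncertainty estimates; the three-surface enclosure below is hypothetical.

    ```python
    import numpy as np

    # Reciprocity is enforced by solving for the symmetric exchange matrix
    # S_ij = A_i * F_ij; closure then becomes a linear constraint sum_j S_ij = A_i.
    # Nonnegativity of F is not enforced by this simple projection.
    A = np.array([1.0, 2.0, 1.5])                       # surface areas (hypothetical)
    F0 = np.array([[0.00, 0.62, 0.40],                  # approximately integrated view factors
                   [0.33, 0.00, 0.69],
                   [0.28, 0.90, 0.00]])
    n = len(A)

    pairs = [(i, j) for i in range(n) for j in range(i, n)]   # unknowns: upper triangle of S
    idx = {p: k for k, p in enumerate(pairs)}
    m = len(pairs)

    # Least-squares residuals: S_ij / A_i should match F0_ij for every (i, j)
    M = np.zeros((n * n, m)); f = np.zeros(n * n)
    for r, (i, j) in enumerate((i, j) for i in range(n) for j in range(n)):
        M[r, idx[(min(i, j), max(i, j))]] = 1.0 / A[i]
        f[r] = F0[i, j]

    # Closure constraints: sum_j S_ij = A_i for each surface i
    C = np.zeros((n, m)); d = A.copy()
    for i in range(n):
        for j in range(n):
            C[i, idx[(min(i, j), max(i, j))]] += 1.0

    # KKT system for equality-constrained least squares
    KKT = np.block([[M.T @ M, C.T], [C, np.zeros((n, n))]])
    x = np.linalg.solve(KKT, np.concatenate([M.T @ f, d]))[:m]

    S = np.zeros((n, n))
    for (i, j), k in idx.items():
        S[i, j] = S[j, i] = x[k]
    F = S / A[:, None]
    print("rectified view factors:\n", np.round(F, 3))
    print("row sums:", np.round(F.sum(axis=1), 6))
    ```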

  3. Active Subspaces for Wind Plant Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Ryan N; Quick, Julian; Dykes, Katherine L

    Understanding the uncertainty in wind plant performance is crucial to cost-effective plant design and operation. However, conventional approaches to uncertainty quantification (UQ), such as Monte Carlo techniques or surrogate modeling, are often computationally intractable for utility-scale wind plants because of poor convergence rates or the curse of dimensionality. In this paper we demonstrate that wind plant power uncertainty can be well represented with a low-dimensional active subspace, thereby achieving a significant reduction in the dimension of the surrogate modeling problem. We apply the active subspaces technique to UQ of plant power output with respect to uncertainty in turbine axial induction factors, and find a single active subspace direction dominates the sensitivity in power output. When this single active subspace direction is used to construct a quadratic surrogate model, the number of model unknowns can be reduced by up to 3 orders of magnitude without compromising performance on unseen test data. We conclude that the dimension reduction achieved with active subspaces makes surrogate-based UQ approaches tractable for utility-scale wind plants.
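
    The technique's core steps can be sketched compactly: estimate the gradient covariance C = E[∇f ∇fᵀ] by sampling, take its dominant eigenvector as the active direction, and fit a low-order surrogate in the projected coordinate. The toy function below stands in for the wind plant power model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    d = 10                                          # number of uncertain inputs
    a = rng.normal(size=d)                          # hidden one-dimensional structure (toy)
    f = lambda x: np.sin(x @ a) + 0.5 * (x @ a) ** 2
    grad = lambda x: (np.cos(x @ a) + (x @ a)) * a  # analytic gradient of the toy model

    X = rng.uniform(-1, 1, size=(500, d))           # samples of the uncertain inputs
    G = np.array([grad(x) for x in X])
    C = G.T @ G / len(X)                            # Monte Carlo gradient covariance
    eigval, eigvec = np.linalg.eigh(C)
    w = eigvec[:, -1]                               # active direction (largest eigenvalue)
    print("two largest eigenvalues:", np.round(eigval[-2:], 3))

    y = X @ w                                       # 1-D active variable
    coeffs = np.polyfit(y, np.array([f(x) for x in X]), deg=2)   # quadratic surrogate

    X_test = rng.uniform(-1, 1, size=(200, d))
    pred = np.polyval(coeffs, X_test @ w)
    truth = np.array([f(x) for x in X_test])
    print(f"surrogate R^2 on unseen samples: {1 - np.var(truth - pred) / np.var(truth):.3f}")
    ```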

  4. Practical uncertainty reduction and quantification in shock physics measurements

    DOE PAGES

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.

  5. Socializing Identity Through Practice: A Mixed Methods Approach to Family Medicine Resident Perspectives on Uncertainty.

    PubMed

    Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A

    2015-01-01

    Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. This study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that growth continuing throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.

  6. Which complexity of regional climate system models is essential for downscaling anthropogenic climate change in the Northwest European Shelf?

    NASA Astrophysics Data System (ADS)

    Mathis, Moritz; Elizalde, Alberto; Mikolajewicz, Uwe

    2018-04-01

    Climate change impact studies for the Northwest European Shelf (NWES) make use of various dynamical downscaling strategies in the experimental setup of regional ocean circulation models. Projected change signals from coupled and uncoupled downscalings with different domain sizes and forcing global and regional models show substantial uncertainty. In this paper, we investigate influences of the downscaling strategy on projected changes in the physical and biogeochemical conditions of the NWES. Our results indicate that uncertainties due to different downscaling strategies are similar to uncertainties due to the choice of the parent global model and the downscaling regional model. Downscaled change signals turn out to depend more strongly on the downscaling strategy than on the models' skill in simulating present-day conditions. Uncoupled downscalings of sea surface temperature (SST) changes are found to be tightly constrained by the atmospheric forcing. The incorporation of coupled air-sea interaction, by contrast, allows the regional model system to develop independently. Changes in salinity show a higher sensitivity to open lateral boundary conditions and river runoff than to coupled or uncoupled atmospheric forcings. Dependencies on the downscaling strategy for changes in SST, salinity, stratification and circulation collectively affect changes in nutrient import and biological primary production.

  7. Integrating geological uncertainty in long-term open pit mine production planning by ant colony optimization

    NASA Astrophysics Data System (ADS)

    Gilani, Seyed-Omid; Sattarvand, Javad

    2016-02-01

    Meeting production targets in terms of ore quantity and quality is critical for a successful mining operation. In-situ grade uncertainty causes both deviations from production targets and general financial deficits. A new stochastic optimization algorithm based on the ant colony optimization (ACO) approach is developed herein to integrate geological uncertainty described through a series of simulated ore bodies. Two different strategies were developed, based on a single predefined probability value (Prob) and on multiple probability values (Probnt), respectively, in order to improve the initial solutions created by the deterministic ACO procedure. Application at the Sungun copper mine in northwest Iran demonstrates the ability of the stochastic approach to create a single schedule, control the risk of deviating from production targets over time and increase the project value. A comparison between the two strategies and the traditional approach illustrates that the multiple-probability strategy is able to produce better schedules; however, the single predefined probability is more practical in projects requiring a high degree of flexibility.

  8. Adaptive strategies for materials design using uncertainties

    DOE PAGES

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; ...

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 compounds from the M2AX family for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
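
    A compact version of the regressor-plus-selector loop follows, using a Gaussian-process regressor and an expected-improvement selector (one common uncertainty-aware choice, not necessarily the paper's exact selector). scikit-learn is assumed to be available, and the toy property function replaces the DFT-computed moduli.

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def property_fn(x):
        """Hypothetical target property versus a one-dimensional descriptor."""
        return 0.7 * np.exp(-((x - 0.25) / 0.08) ** 2) + np.exp(-((x - 0.7) / 0.06) ** 2)

    candidates = np.linspace(0, 1, 200).reshape(-1, 1)          # candidate descriptor pool
    measured_idx = list(rng.choice(len(candidates), 5, replace=False))

    for iteration in range(15):
        X = candidates[measured_idx]
        y = property_fn(X).ravel()
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-6)
        gp.fit(X, y)                                            # regressor with uncertainty
        mu, sigma = gp.predict(candidates, return_std=True)
        best = y.max()
        z = (mu - best) / np.maximum(sigma, 1e-12)
        ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)    # expected-improvement selector
        ei[measured_idx] = 0.0                                  # never re-select a measured point
        measured_idx.append(int(np.argmax(ei)))                 # next "measurement"

    found = candidates[measured_idx]
    best_x = found[np.argmax(property_fn(found))]
    print(f"best descriptor found: {best_x[0]:.3f} (true optimum of the toy function at 0.70)")
    ```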

  9. A model for medical decision making and problem solving.

    PubMed

    Werner, M

    1995-08-01

    Clinicians confront the classical problem of decision making under uncertainty, but a universal procedure by which they deal with this situation, both in diagnosis and therapy, can be defined. This consists of choosing a specific course of action from available alternatives so as to reduce uncertainty. Formal analysis shows that the expected value of this process depends on the a priori probabilities confronted, the discriminatory power of the action chosen, and the values and costs associated with possible outcomes. Clinical problem-solving represents the construction of a systematic strategy from multiple decisional building blocks. Depending on the level of uncertainty the physicians attach to their working hypothesis, they can choose among at least four prototype strategies: pattern recognition, the hypothetico-deductive process, arborization, and exhaustion. However, the resolution of real-life problems can involve a combination of these game plans. Formal analysis of each strategy permits definition of its appropriate a priori probabilities, action characteristics, and cost implications.
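
    The quantities named above (a priori probability, discriminatory power of the chosen action, and outcome values and costs) combine in a standard expected-value calculation. A minimal "treat all / treat none / test then treat" comparison follows, with hypothetical probabilities and utilities.

    ```python
    # Expected utility of three courses of action; all numbers are hypothetical.
    p_disease = 0.30                     # a priori probability of disease
    sens, spec = 0.90, 0.85              # discriminatory power of the test
    u_treated_disease   = 0.90           # outcome utilities
    u_untreated_disease = 0.40
    u_treated_healthy   = 0.85           # treatment burden in the healthy
    u_untreated_healthy = 1.00
    test_disutility     = 0.01           # cost of performing the test, in utility units

    ev_treat_all = p_disease * u_treated_disease + (1 - p_disease) * u_treated_healthy
    ev_treat_none = p_disease * u_untreated_disease + (1 - p_disease) * u_untreated_healthy
    ev_test = (p_disease * (sens * u_treated_disease + (1 - sens) * u_untreated_disease)
               + (1 - p_disease) * ((1 - spec) * u_treated_healthy + spec * u_untreated_healthy)
               - test_disutility)

    for name, ev in [("treat all", ev_treat_all), ("treat none", ev_treat_none),
                     ("test, then treat positives", ev_test)]:
        print(f"{name:28s} expected utility = {ev:.3f}")
    ```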

  10. Adaptive strategies for materials design using uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James

    Here, we compare several adaptive design strategies using a data set of 223 compounds from the M2AX family for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.

  11. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    PubMed

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
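
    The target-setting idea can be sketched as follows: choose the largest mortality target for which the Monte Carlo probability of exceeding the (uncertain) allowable mortality stays below a specified threshold. The distributions and parameters below are hypothetical, not the grizzly bear estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    n = 100_000
    allowable = rng.normal(100, 15, n).clip(min=1)   # biological uncertainty in the mortality limit

    def p_over(target):
        """Probability that realized mortality exceeds the allowable level."""
        realized = target * rng.lognormal(mean=0.0, sigma=0.25, size=n)   # outcome uncertainty
        return np.mean(realized > allowable)

    for threshold in (0.25, 0.05):
        targets = np.arange(1.0, 150.0, 1.0)
        feasible = [t for t in targets if p_over(t) <= threshold]
        best = max(feasible) if feasible else 0.0
        print(f"P(overmortality) <= {threshold:.2f} -> target = {best:.0f} animals "
              f"({100 * (1 - best / 100):.0f}% below the nominal limit of 100)")
    ```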

  12. Giving sustainable agriculture really good odds through innovative rainfall index insurance

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, C. P.; Muneepeerakul, R.

    2017-12-01

    Population growth, increasing demands for food, and increasingly uncertain and limited water availability amidst competing demands for water by other users and the environment call for a novel approach to managing water in food production systems, and it needs to be developed now. Tapping into the broad popularity of crop insurance as a risk management intervention, we propose an innovative rainfall index insurance program as a novel systems approach that addresses water conservation in food production systems by exploiting two common currencies that tie food production systems and other sectors together, namely water and money. Our methodology allows diverse farm and financial strategies to be optimized together, revealing strategy portfolios that result in greater water use efficiency and higher incomes at a lower level of water use. Furthermore, it allows targeted interventions to achieve reductions in irrigation water use, while providing financial protection to farmers against the increasing uncertainty in water availability. Not only would such a tool result in more efficient water use, it would also encourage diversification in farm practices, which reduces a farm's vulnerability to crop price volatility and pest or disease outbreaks and contributes to more sustainable agriculture.

  13. Estimates of CO2 fluxes over the city of Cape Town, South Africa, through Bayesian inverse modelling

    NASA Astrophysics Data System (ADS)

    Nickless, Alecia; Rayner, Peter J.; Engelbrecht, Francois; Brunke, Ernst-Günther; Erni, Birgit; Scholes, Robert J.

    2018-04-01

    We present a city-scale inversion over Cape Town, South Africa. Measurement sites for atmospheric CO2 concentrations were installed at Robben Island and Hangklip lighthouses, located downwind and upwind of the metropolis. Prior estimates of the fossil fuel fluxes were obtained from a bespoke inventory analysis where emissions were spatially and temporally disaggregated and uncertainty estimates determined by means of error propagation techniques. Net ecosystem exchange (NEE) fluxes from biogenic processes were obtained from the land atmosphere exchange model CABLE (Community Atmosphere Biosphere Land Exchange). Uncertainty estimates were based on the estimates of net primary productivity. CABLE was dynamically coupled to the regional climate model CCAM (Conformal Cubic Atmospheric Model), which provided the climate inputs required to drive the Lagrangian particle dispersion model. The Bayesian inversion framework included a control vector where fossil fuel and NEE fluxes were solved for separately. Due to the large prior uncertainty prescribed to the NEE fluxes, the current inversion framework was unable to adequately distinguish between the fossil fuel and NEE fluxes, but the inversion was able to obtain improved estimates of the total fluxes within pixels and across the domain. The median of the uncertainty reductions of the total weekly flux estimates for the inversion domain of Cape Town was 28%, but reached as high as 50%. At the pixel level, uncertainty reductions of the total weekly flux reached up to 98%, but these large uncertainty reductions were for NEE-dominated pixels. Improved corrections to the fossil fuel fluxes would be possible if the uncertainty around the prior NEE fluxes could be reduced. In order for this inversion framework to be operationalised for monitoring, reporting, and verification (MRV) of emissions from Cape Town, the NEE component of the CO2 budget needs to be better understood. Additional Δ14C and δ13C isotope measurements would be a beneficial component of an atmospheric monitoring programme aimed at MRV of CO2 for any city which has significant biogenic influence, allowing improved separation of contributions from NEE and fossil fuel fluxes to the observed CO2 concentration.
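
    The reported uncertainty-reduction metric follows from standard Gaussian Bayesian synthesis: the posterior flux covariance combines the prior covariance, the transport (footprint) operator and the observation-error covariance, and the per-element reduction is 1 − σ_post/σ_prior. The dimensions and matrices below are a toy stand-in for the Cape Town setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Gaussian Bayesian synthesis: posterior covariance A = (H^T R^-1 H + B^-1)^-1,
    # uncertainty reduction per flux element = 1 - sqrt(diag(A)) / sqrt(diag(B)).
    n_flux, n_obs = 20, 60
    H = rng.uniform(0, 1, size=(n_obs, n_flux))        # toy source-receptor (footprint) matrix
    B = np.diag(rng.uniform(0.5, 2.0, n_flux) ** 2)    # prior flux error covariance
    R = np.diag(np.full(n_obs, 0.3 ** 2))              # observation error covariance

    A = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))
    reduction = 1.0 - np.sqrt(np.diag(A)) / np.sqrt(np.diag(B))

    print(f"median uncertainty reduction: {np.median(reduction):.0%}")
    print(f"max uncertainty reduction:    {reduction.max():.0%}")
    ```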

  14. Cost-effectiveness of a new urinary biomarker-based risk score compared to standard of care in prostate cancer diagnostics - a decision analytical model.

    PubMed

    Dijkstra, Siebren; Govers, Tim M; Hendriks, Rianne J; Schalken, Jack A; Van Criekinge, Wim; Van Neste, Leander; Grutters, Janneke P C; Sedelaar, John P Michiel; van Oort, Inge M

    2017-11-01

    To assess the cost-effectiveness of a new urinary biomarker-based risk score (SelectMDx; MDxHealth, Inc., Irvine, CA, USA) to identify patients for transrectal ultrasonography (TRUS)-guided biopsy and to compare this with the current standard of care (SOC), using only prostate-specific antigen (PSA) to select for TRUS-guided biopsy. A decision tree and Markov model were developed to evaluate the cost-effectiveness of SelectMDx as a reflex test vs SOC in men with a PSA level of >3 ng/mL. Transition probabilities, utilities and costs were derived from the literature and expert opinion. Cost-effectiveness was expressed in quality-adjusted life years (QALYs) and healthcare costs of both diagnostic strategies, simulating the course of patients over a time horizon representing 18 years. Deterministic sensitivity analyses were performed to address uncertainty in assumptions. A diagnostic strategy including SelectMDx with a cut-off chosen at a sensitivity of 95.7% for high-grade prostate cancer resulted in savings of €128 and a gain of 0.025 QALY per patient compared to the SOC strategy. The sensitivity analyses showed that the disutility assigned to active surveillance had a high impact on the QALYs gained and the disutility attributed to TRUS-guided biopsy only slightly influenced the outcome of the model. Based on the currently available evidence, the reduction of overdiagnosis and overtreatment due to the use of the SelectMDx test in men with PSA levels of >3 ng/mL may lead to a reduction in total costs per patient and a gain in QALYs. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.

  15. Model-Based Analysis of the Role of Biological, Hydrological and Geochemical Factors Affecting Uranium Bioremediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Jiao; Scheibe, Timothy D.; Mahadevan, Radhakrishnan

    2011-01-24

    Uranium contamination is a serious concern at several sites, motivating the development of novel treatment strategies such as the Geobacter-mediated reductive immobilization of uranium. However, this bioremediation strategy has not yet been optimized for sustained uranium removal. While several reactive-transport models have been developed to represent Geobacter-mediated bioremediation of uranium, these models often lack the detailed quantitative description of the microbial process (e.g., biomass build-up in both groundwater and sediments, electron transport system, etc.) and the interaction between biogeochemical and hydrological processes. In this study, a novel multi-scale model was developed by integrating our recent model on electron capacitance of Geobacter (Zhao et al., 2010) with a comprehensive simulator of coupled fluid flow, hydrologic transport, heat transfer, and biogeochemical reactions. This mechanistic reactive-transport model accurately reproduces the experimental data for the bioremediation of uranium with acetate amendment. We subsequently performed global sensitivity analysis with the reactive-transport model in order to identify the main sources of prediction uncertainty caused by synergistic effects of biological, geochemical, and hydrological processes. The proposed approach successfully captured significant contributing factors across time and space, thereby improving the structure and parameterization of the comprehensive reactive-transport model. The global sensitivity analysis also provides a potentially useful tool to evaluate uranium bioremediation strategies. The simulations suggest that under difficult environments (e.g., highly contaminated with U(VI) at a high migration rate of solutes), the efficiency of uranium removal can be improved by adding Geobacter species to the contaminated site (bioaugmentation) in conjunction with the addition of an electron donor (biostimulation). The simulations also highlight the interactive effect of initial cell concentration and flow rate on U(VI) reduction.

  16. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    NASA Astrophysics Data System (ADS)

    Arnold, B. W.; Gardner, P.

    2013-12-01

    Calibration of groundwater flow models for the purpose of evaluating flow and aquifer heterogeneity typically uses observations of hydraulic head in wells and appropriate boundary conditions. Environmental tracers have a wide variety of decay rates and input signals in recharge, resulting in a potentially broad source of additional information to constrain flow rates and heterogeneity. A numerical study was conducted to evaluate the reduction in uncertainty during model calibration using observations of various environmental tracers and combinations of tracers. A synthetic data set was constructed by simulating steady groundwater flow and transient tracer transport in a high-resolution, 2-D aquifer with heterogeneous permeability and porosity using the PFLOTRAN software code. Data on pressure and tracer concentration were extracted at well locations and then used as observations for automated calibration of a flow and transport model using the pilot point method and the PEST code. Optimization runs were performed to estimate parameter values of permeability at 30 pilot points in the model domain for cases using 42 observations of: 1) pressure, 2) pressure and CFC11 concentrations, 3) pressure and Ar-39 concentrations, and 4) pressure, CFC11, Ar-39, tritium, and He-3 concentrations. Results show significantly lower uncertainty, as indicated by the 95% linear confidence intervals, in permeability values at the pilot points for cases including observations of environmental tracer concentrations. The average linear uncertainty range for permeability at the pilot points using pressure observations alone is 4.6 orders of magnitude, using pressure and CFC11 concentrations is 1.6 orders of magnitude, using pressure and Ar-39 concentrations is 0.9 order of magnitude, and using pressure, CFC11, Ar-39, tritium, and He-3 concentrations is 1.0 order of magnitude. Data on Ar-39 concentrations result in the greatest parameter uncertainty reduction because its half-life of 269 years is similar to the range of transport times (hundreds to thousands of years) in the heterogeneous synthetic aquifer domain. The slightly higher uncertainty range for the case using all of the environmental tracers simultaneously is probably due to structural errors in the model introduced by the pilot point regularization scheme. It is concluded that maximum information and uncertainty reduction for constraining a groundwater flow model is obtained using an environmental tracer whose half-life is well matched to the range of transport times through the groundwater flow system. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  17. Quantifying the Value of Perfect Information in Emergency Vaccination Campaigns.

    PubMed

    Bradbury, Naomi V; Probert, William J M; Shea, Katriona; Runge, Michael C; Fonnesbeck, Christopher J; Keeling, Matt J; Ferrari, Matthew J; Tildesley, Michael J

    2017-02-01

    Foot-and-mouth disease (FMD) outbreaks in non-endemic countries can lead to large economic costs and livestock losses, but the use of vaccination has been contentious, partly due to uncertainty about emergency FMD vaccination. Value of information methods can be applied to disease outbreak problems such as FMD in order to investigate the performance improvement from resolving uncertainties. Here we calculate the expected value of resolving uncertainty about vaccine efficacy, time delay to immunity after vaccination and daily vaccination capacity for a hypothetical FMD outbreak in the UK. If it were possible to resolve all uncertainty prior to the introduction of control, we could expect savings of £55 million in outbreak cost, 221,900 livestock culled and 4.3 days of outbreak duration. All vaccination strategies were found to be preferable to a culling-only strategy. However, the optimal vaccination radius was found to be highly dependent upon vaccination capacity for all management objectives. We calculate that by resolving the uncertainty surrounding vaccination capacity we would expect to return over 85% of the above savings, regardless of management objective. It may be possible to resolve uncertainty about daily vaccination capacity before an outbreak, and this would enable decision makers to select the optimal control action via careful contingency planning.

  18. Quantifying the Value of Perfect Information in Emergency Vaccination Campaigns

    PubMed Central

    Probert, William J. M.; Shea, Katriona; Fonnesbeck, Christopher J.; Ferrari, Matthew J.; Tildesley, Michael J.

    2017-01-01

    Foot-and-mouth disease (FMD) outbreaks in non-endemic countries can lead to large economic costs and livestock losses, but the use of vaccination has been contentious, partly due to uncertainty about emergency FMD vaccination. Value of information methods can be applied to disease outbreak problems such as FMD in order to investigate the performance improvement from resolving uncertainties. Here we calculate the expected value of resolving uncertainty about vaccine efficacy, time delay to immunity after vaccination and daily vaccination capacity for a hypothetical FMD outbreak in the UK. If it were possible to resolve all uncertainty prior to the introduction of control, we could expect savings of £55 million in outbreak cost, 221,900 livestock culled and 4.3 days of outbreak duration. All vaccination strategies were found to be preferable to a culling-only strategy. However, the optimal vaccination radius was found to be highly dependent upon vaccination capacity for all management objectives. We calculate that by resolving the uncertainty surrounding vaccination capacity we would expect to return over 85% of the above savings, regardless of management objective. It may be possible to resolve uncertainty about daily vaccination capacity before an outbreak, and this would enable decision makers to select the optimal control action via careful contingency planning. PMID:28207777
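
    The quantity computed in the two records above is an expected value of perfect information (EVPI). A generic Monte Carlo sketch follows; the strategy costs and the capacity distribution are hypothetical, not outputs of the FMD models.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # With remaining uncertainty, the decision maker commits to the single strategy
    # with the best expected outcome; with perfect information she could pick the
    # best strategy in each realized state of the world. In a cost framing:
    #   EVPI = min_a E[cost(a, theta)] - E[min_a cost(a, theta)]
    n = 50_000
    vaccination_capacity = rng.uniform(20_000, 90_000, n)     # uncertain doses/day

    strategies = {                      # toy outbreak cost (million GBP) per strategy
        "cull only":       np.full(n, 120.0),
        "vaccinate 3 km":  150.0 - 1.2e-3 * vaccination_capacity,
        "vaccinate 10 km": 180.0 - 1.6e-3 * vaccination_capacity,
    }
    costs = np.column_stack(list(strategies.values()))

    best_without_info = costs.mean(axis=0).min()     # commit to one strategy in advance
    best_with_info = costs.min(axis=1).mean()        # adapt to each realization
    print(f"EVPI ~ {best_without_info - best_with_info:.1f} million GBP")
    ```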

  19. Deep Uncertainties in Sea-Level Rise and Storm Surge Projections: Implications for Coastal Flood Risk Management.

    PubMed

    Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus

    2017-09-05

    Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.

  20. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

    The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems and the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to reproduce the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
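
    A one-dimensional sketch of non-intrusive polynomial chaos by point collocation, assuming a standard-normal input and a placeholder model function; the real multifidelity, multidisciplinary setting is far richer, but the regression-based construction of the chaos coefficients follows the same idea.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

rng = np.random.default_rng(1)

def model(xi):
    """Stand-in for an expensive solver with one standard-normal input."""
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 4
n_samples = 2 * (order + 1)            # oversampled collocation points
xi = rng.standard_normal(n_samples)

# Vandermonde matrix of probabilists' Hermite polynomials He_0..He_order,
# solved in the least-squares sense (non-intrusive point collocation).
V = He.hermevander(xi, order)
coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)

mean = coef[0]                         # He_0 = 1; higher modes have zero mean
variance = sum(coef[k]**2 * factorial(k) for k in range(1, order + 1))
print(f"PC mean ~ {mean:.4f}, PC variance ~ {variance:.4f}")
```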

  1. The Condition for Generous Trust.

    PubMed

    Shinya, Obayashi; Yusuke, Inagaki; Hiroki, Takikawa

    2016-01-01

    Trust has been considered the "cement" of a society and is much studied in sociology and other social sciences. Most studies, however, have neglected one important aspect of trust: it involves an act of forgiving and showing tolerance toward another's failure. In this study, we refer to this concept as "generous trust" and examine the conditions under which generous trust becomes a more viable option when compared to other types of trust. We investigate two settings. First, we introduce two types of uncertainties: uncertainty as to whether trustees have the intention to cooperate, and uncertainty as to whether trustees have enough competence to accomplish the entrusted tasks. Second, we examine the manner in which trust functions in a broader social context, one that involves matching and commitment processes. Since we expect generosity or forgiveness to work differently in the matching and commitment processes, we must differentiate trust strategies into generous trust in the matching process and that in the commitment process. Our analytical strategy is two-fold. First, we analyze the "modified" trust game that incorporates the two types of uncertainties without the matching process. This simplified setting enables us to derive mathematical results using game theory, thereby giving basic insight into the trust mechanism. Second, we investigate socially embedded trust relationships in contexts involving the matching and commitment processes, using agent-based simulation. Results show that uncertainty about a partner's intention and competence makes generous trust a viable option. In contrast, too much uncertainty undermines the possibility of generous trust. Furthermore, a strategy that is too generous cannot stand alone. Generosity should be accompanied by moderate punishment. As for socially embedded trust relationships, generosity functions differently in the matching process versus the commitment process. Indeed, these two types of generous trust coexist, and their coexistence enables a society to function well.

  2. Health benefits and cost-effectiveness of a hybrid screening strategy for colorectal cancer.

    PubMed

    Dinh, Tuan; Ladabaum, Uri; Alperin, Peter; Caldwell, Cindy; Smith, Robert; Levin, Theodore R

    2013-09-01

    Colorectal cancer (CRC) screening guidelines recommend screening schedules for each single type of test except for concurrent sigmoidoscopy and fecal occult blood test (FOBT). We investigated the cost-effectiveness of a hybrid screening strategy that was based on a fecal immunochemical test (FIT) and colonoscopy. We conducted a cost-effectiveness analysis by using the Archimedes Model to evaluate the effects of different CRC screening strategies on health outcomes and costs related to CRC in a population that represents members of Kaiser Permanente Northern California. The Archimedes Model is a large-scale simulation of human physiology, diseases, interventions, and health care systems. The CRC submodel in the Archimedes Model was derived from public databases, published epidemiologic studies, and clinical trials. A hybrid screening strategy led to substantial reductions in CRC incidence and mortality, gains in quality-adjusted life years (QALYs), and reductions in costs, comparable with those of the best single-test strategies. Screening by annual FIT of patients 50-65 years old and then a single colonoscopy when they were 66 years old (FIT/COLOx1) reduced CRC incidence by 72% and gained 110 QALYs for every 1000 people during a period of 30 years, compared with no screening. Compared with annual FIT, FIT/COLOx1 gained 1400 QALYs/100,000 persons at an incremental cost of $9700/QALY gained and required 55% fewer FITs. Compared with FIT/COLOx1, colonoscopy at 10-year intervals gained 500 QALYs/100,000 at an incremental cost of $35,100/QALY gained but required 37% more colonoscopies. Over the ranges of parameters examined, the cost-effectiveness of hybrid screening strategies was slightly more sensitive to the adherence rate with colonoscopy than the adherence rate with yearly FIT. Uncertainties associated with estimates of FIT performance within a program setting and sensitivities for flat and right-sided lesions are expected to have significant impacts on the cost-effectiveness results. In our simulation model, a strategy of annual or biennial FIT, beginning when patients are 50 years old, with a single colonoscopy when they are 66 years old, delivers clinical and economic outcomes similar to those of CRC screening by single-modality strategies, with a favorable impact on resource demand. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
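
    The core cost-effectiveness arithmetic is an incremental cost-effectiveness ratio. In the sketch below the QALY increment echoes the abstract's 1,400 QALYs per 100,000 persons, but the absolute cost and QALY totals are invented so that the ratio lands near the reported $9,700/QALY; none of these are the study's actual totals.

```python
# Incremental cost-effectiveness ratio (ICER) between two screening strategies,
# using placeholder totals per 100,000 people (not the study's figures).
cost_fit_only = 95_000_000      # USD
qaly_fit_only = 2_450_000

cost_hybrid   = 108_580_000     # USD
qaly_hybrid   = 2_451_400       # 1,400 more QALYs per 100,000

icer = (cost_hybrid - cost_fit_only) / (qaly_hybrid - qaly_fit_only)
print(f"ICER ~ ${icer:,.0f} per QALY gained")
```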

  3. Improving the quantification of flash flood hydrographs and reducing their uncertainty using noncontact streamgauging methods

    NASA Astrophysics Data System (ADS)

    Branger, Flora; Dramais, Guillaume; Horner, Ivan; Le Boursicaud, Raphaël; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    Continuous river discharge data are crucial for the study and management of floods. In most river discharge monitoring networks, these data are obtained at gauging stations, where the stage-discharge relation is modelled with a rating curve to derive discharge from the measurement of water level in the river. Rating curves are usually established using individual ratings (or gaugings). However, using traditional gauging methods during flash floods is challenging for many reasons including hazardous flow conditions (for both equipment and people), short duration of the flood events, transient flows during the time needed to perform the gauging, etc. The lack of gaugings implies that the rating curve is often extrapolated well beyond the gauged range for the highest floods, inducing large uncertainties in the computed discharges. We deployed two remote techniques for gauging floods and improving stage-discharge relations for high flow conditions at several hydrometric stations throughout the Ardèche river catchment in France: (1) permanent video-recording stations enabling the implementation of the image analysis LS-PIV technique (Large Scale Particle Image Velocimetry); and (2) mobile gaugings using handheld Surface Velocity Radars (SVR). These gaugings were used to estimate the rating curve and its uncertainty using the Bayesian method BaRatin (Le Coz et al., 2014). Importantly, this method explicitly accounts for the uncertainty of individual gaugings, which is especially relevant for remote gaugings since their uncertainty is generally much higher than that of standard intrusive gauging methods. Then, the uncertainty of streamflow records was derived by combining the uncertainty of the rating curve and the uncertainty of stage records. We assessed the impact of these methodological developments on peak flow estimation and on flood descriptors at various time steps. The combination of field measurement innovation and statistical developments allows the uncertainties of flood peak estimates and flood descriptors at gauging stations to be efficiently quantified and reduced. The noncontact streamgauging techniques used in our field campaign strategy have complementary strengths. Permanent LS-PIV stations, once installed and calibrated, can monitor floods automatically and perform many gaugings during a single event, thus documenting the rise, peak and recession of floods. SVR gaugings are more "one shot" gaugings but can be deployed quickly and at minimal cost over a large territory. Both of these noncontact techniques contribute to a significant reduction of uncertainty on peak hydrographs and flood descriptors at different time steps for a given catchment. Le Coz, J.; Renard, B.; Bonnifait, L.; Branger, F. & Le Boursicaud, R. (2014), 'Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: A Bayesian approach', Journal of Hydrology 509, 573-587.
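
    A simplified sketch of the underlying idea of fitting a stage-discharge rating curve to gaugings of unequal precision and propagating the fit uncertainty to an extrapolated peak flow; this is not BaRatin itself (which also encodes hydraulic priors and stage uncertainty in a full Bayesian framework), and the gaugings, uncertainties, and cease-to-flow stage below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical gaugings: stage h (m), discharge Q (m3/s) and 1-sigma relative
# uncertainty (remote SVR/LS-PIV gaugings are less precise than intrusive ones).
h = np.array([0.8, 1.2, 1.9, 2.6, 3.4, 4.1])
Q = np.array([12., 35., 110., 230., 420., 640.])
rel_sigma = np.array([0.05, 0.05, 0.07, 0.15, 0.15, 0.20])

b = 0.2                                # assumed cease-to-flow stage (m)
X = np.column_stack([np.ones_like(h), np.log(h - b)])
y = np.log(Q)
W = np.diag(1.0 / rel_sigma**2)        # log-space weights ~ 1 / relative variance

# Weighted least squares for log Q = log a + c * log(h - b)
cov = np.linalg.inv(X.T @ W @ X)
beta = cov @ X.T @ W @ y               # [log a, c]

# Propagate parameter uncertainty to an extrapolated peak-flow estimate.
draws = rng.multivariate_normal(beta, cov, size=5000)
Q_peak = np.exp(draws[:, 0] + draws[:, 1] * np.log(5.0 - b))
lo, hi = np.percentile(Q_peak, [2.5, 97.5])
print(f"Q(h=5.0 m) ~ {np.median(Q_peak):.0f} m3/s (95% interval {lo:.0f}-{hi:.0f})")
```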

  4. On the robust optimization to the uncertain vaccination strategy problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaerani, D., E-mail: d.chaerani@unpad.ac.id; Anggriani, N., E-mail: d.chaerani@unpad.ac.id; Firdaniza, E-mail: d.chaerani@unpad.ac.id

    2014-02-21

    In order to prevent an epidemic of infectious disease, vaccination coverage should be kept as low as possible while the basic reproduction number is maintained below 1, so that the outbreak remains confined to the small number of people who are already infected. In this paper, we discuss a vaccination strategy that minimizes vaccination coverage when the basic reproduction number is treated as an uncertain parameter lying between 0 and 1. We refer to the linear optimization model for vaccination strategy proposed by Becker and Starrzak (see [2]). When parameter uncertainty is involved, Tanner et al. (see [9]) propose solving the problem using stochastic programming. In this paper we discuss an alternative way of optimizing the uncertain vaccination strategy using Robust Optimization (see [3]). In this approach we assume that the parameter uncertainty lies within an ellipsoidal uncertainty set, so that the resulting robust counterpart can be solved by a polynomial-time algorithm (as guaranteed by the RO methodology). The robust counterpart model is presented.
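
    A small sketch of the robust-counterpart construction for an ellipsoidal uncertainty set, using cvxpy as an assumed modeling tool: a coverage constraint a^T x >= b that must hold for every a in an ellipsoid around the nominal vector becomes a second-order cone constraint. The two-group model and all numbers are illustrative, not the paper's formulation.

```python
import numpy as np
import cvxpy as cp

# Two sub-populations; x = fraction vaccinated in each group.
x = cp.Variable(2, nonneg=True)

a_bar = np.array([1.4, 0.9])     # nominal effect of coverage on reducing R0
P = np.diag([0.3, 0.2])          # ellipsoidal uncertainty around a_bar
b = 1.0                          # required total reduction to push R0 below 1
cost = np.array([1.0, 1.2])      # relative effort per unit coverage

constraints = [
    # a^T x >= b for every a in {a_bar + P u : ||u||_2 <= 1}
    a_bar @ x - cp.norm(P.T @ x, 2) >= b,
    x <= 1,
]
prob = cp.Problem(cp.Minimize(cost @ x), constraints)
prob.solve()
print("robust coverage:", np.round(x.value, 3), "objective:", round(prob.value, 3))
```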

  5. Seniors' uncertainty management of direct-to-consumer prescription drug advertising usefulness.

    PubMed

    DeLorme, Denise E; Huh, Jisu

    2009-09-01

    This study provides insight into seniors' perceptions of and responses to direct-to-consumer prescription drug advertising (DTCA) usefulness, examines support for DTCA regulation as a type of uncertainty management, and extends and gives empirical voice to previous survey results through methodological triangulation. In-depth interview findings revealed that, for most informants, DTCA usefulness was uncertain and this uncertainty stemmed from 4 sources. The majority had negative responses to DTCA uncertainty and relied on 2 uncertainty-management strategies: information seeking from physicians, and inferences of and support for some government regulation of DTCA. Overall, the findings demonstrate the viability of uncertainty management theory (Brashers, 2001, 2007) for mass-mediated health communication, specifically DTCA. The article concludes with practical implications and research recommendations.

  6. UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E

    EPA Science Inventory

    A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...

  7. Quantum issues in optical communication. [noise reduction in signal reception

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.

    1973-01-01

    Various approaches to the problem of controlling quantum noise, the dominant noise in an optical communications system, are discussed. It is shown that, no matter which way the problem is approached, there always remain uncertainties. These uncertainties exist because, to date, only very few communication problems have been solved in their full quantum form.

  8. Information-integration category learning and the human uncertainty response.

    PubMed

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  9. A Single Bout of Aerobic Exercise Reduces Anxiety Sensitivity But Not Intolerance of Uncertainty or Distress Tolerance: A Randomized Controlled Trial.

    PubMed

    LeBouthillier, Daniel M; Asmundson, Gordon J G

    2015-01-01

    Several mechanisms have been posited for the anxiolytic effects of exercise, including reductions in anxiety sensitivity through interoceptive exposure. Studies on aerobic exercise lend support to this hypothesis; however, research investigating aerobic exercise in comparison to placebo, the dose-response relationship between aerobic exercise and anxiety sensitivity, the efficacy of aerobic exercise on the spectrum of anxiety sensitivity and the effect of aerobic exercise on other related constructs (e.g. intolerance of uncertainty, distress tolerance) is lacking. We explored reductions in anxiety sensitivity and related constructs following a single session of exercise in a community sample using a randomized controlled trial design. Forty-one participants completed 30 min of aerobic exercise or a placebo stretching control. Anxiety sensitivity, intolerance of uncertainty and distress tolerance were measured at baseline, post-intervention and 3-day and 7-day follow-ups. Individuals in the aerobic exercise group, but not the control group, experienced significant reductions with moderate effect sizes in all dimensions of anxiety sensitivity. Intolerance of uncertainty and distress tolerance remained unchanged in both groups. Our trial supports the efficacy of aerobic exercise in uniquely reducing anxiety sensitivity in individuals with varying levels of the trait and highlights the importance of empirically validating the use of aerobic exercise to address specific mental health vulnerabilities. Aerobic exercise may have potential as a temporary substitute for psychotherapy aimed at reducing anxiety-related psychopathology.

  10. Strategies for the municipal waste management system to take advantage of carbon trading under competing policies: The role of energy from waste in Sydney

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Hanandeh, Ali; El-Zein, Abbas

    2009-07-15

    Climate change is a driving force behind some recent environmental legislation around the world. Greenhouse gas emission reduction targets have been set in many industrialised countries. A change in current practices of almost all greenhouse-emitting industrial sectors is unavoidable if the set targets are to be achieved. Although waste disposal contributes around 3% of the total greenhouse gas emissions in Australia (mainly due to fugitive methane emissions from landfills), the carbon credit and trading scheme set to start in 2010 presents significant challenges and opportunities to municipal solid waste practitioners. Technological advances in waste management, if adopted properly, allow the municipal solid waste sector to act as a carbon sink, hence earning tradable carbon credits. However, due to the complexity of the system and its inherent uncertainties, optimizing it for carbon credits may worsen its performance under other criteria. We use an integrated, stochastic multi-criteria decision-making tool that we developed earlier to analyse the carbon credit potential of Sydney municipal solid waste under eleven possible future strategies. We find that the changing legislative environment is likely to make current practices highly non-optimal and increase pressures for a change of waste management strategy.

  11. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
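
    A generic sketch of the kind of uncertainty propagation mentioned for the venturi flowmeter data reduction: sensitivity coefficients from numerical partial derivatives, with bias and precision contributions combined by root-sum-square (in the style of Coleman and Steele); the data-reduction equation and the input uncertainties are placeholders, not TTB values.

```python
import numpy as np

def m_dot(Cd, A, rho, dP):
    """Generic venturi-style data-reduction equation (placeholder form)."""
    return Cd * A * np.sqrt(2.0 * rho * dP)

# Nominal values and (bias, precision) uncertainties for each input.
nominal = {"Cd": 0.98, "A": 2.0e-3, "rho": 70.0, "dP": 4.0e5}
bias    = {"Cd": 0.005, "A": 1.0e-5, "rho": 0.7, "dP": 2.0e3}
prec    = {"Cd": 0.002, "A": 5.0e-6, "rho": 1.0, "dP": 4.0e3}

r0 = m_dot(**nominal)
B2 = P2 = 0.0
for name, x0 in nominal.items():
    h = 1e-6 * abs(x0)
    up = dict(nominal, **{name: x0 + h})
    dn = dict(nominal, **{name: x0 - h})
    theta = (m_dot(**up) - m_dot(**dn)) / (2 * h)   # sensitivity coefficient
    B2 += (theta * bias[name]) ** 2
    P2 += (theta * prec[name]) ** 2

U = np.sqrt(B2 + P2)                                 # root-sum-square combination
print(f"m_dot = {r0:.3f} kg/s +/- {U:.3f} "
      f"(bias {np.sqrt(B2):.3f}, precision {np.sqrt(P2):.3f})")
```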

  12. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed Central

    Kanamori, H

    1996-01-01

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding. PMID:11607657

  13. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. precipitation, streamflow, sediment, nutrient and pesticide losses measured, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted due to uncertainties in land use, climate change, and model parameter values.

  14. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    USGS Publications Warehouse

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.

  15. Hydrologic response to forest cover changes following a Mountain Pine Beetle outbreak in the context of a changing climate

    NASA Astrophysics Data System (ADS)

    Moore, Dan; Jost, Georg; Nelson, Harry; Smith, Russell

    2013-04-01

    Over the last 15 years, there has been extensive mortality of pine forests in western North America associated with an outbreak of Mountain Pine Beetle, often followed by salvage logging. The objective of this study was to quantify the separate and combined effects of forest recovery and climate change over the 21st century on catchment hydrology in the San Jose watershed, located in the semi-arid Interior Plateau of British Columbia. Forest cover changes were simulated using a dynamic spatial model that uses a decentralized planning approach. We implemented management strategies representing current timber management objectives around achieving targeted harvest levels and incorporating existing management constraints under two different scenarios, one with no climate change and one under climate change, using climate-adjusted growth and yield curves. In addition, higher rates of fire disturbance were modelled under climate change. Under climate change, while productivity improves for some species (mainly Douglas-fir on better quality sites), on drier and poorer quality sites most species, especially Lodgepole Pine, become significantly less productive, and stocking is reduced to the point that those sites transition into grasslands. The combined effect of initial age classes (where the forest has been severely impacted by MPB), increased fire, and reduced stocking results in a greater proportion of the forest in younger age classes compared to a "Business As Usual" scenario with no climate change. The hydrologic responses to changes in vegetation cover and climate were evaluated with the flexible Hydrology Emulator and Modelling Platform (HEMP) developed at the University of British Columbia. HEMP allows a flexible discretization of the landscape. Water is moved vertically within landscape units by processes such as precipitation, canopy interception and soil infiltration, and routed laterally between units as a function of local soil and groundwater storage. The model was calibrated and tested on three stream gauges and on snow course data. A 'guided' GLUE approach was used to address the effects of parameter uncertainty and uncertainty in streamflow data on the uncertainty in future projections. Overall, the establishment and growth of post-disturbance forest stands result in a substantial reduction in snow accumulation and melt rates, and an increase in evapotranspiration, together resulting in a reduction in streamflow. The influence of projected climate warming was to advance the timing of spring melt, exacerbating the reductions in late-summer streamflow associated with forest recovery. In some climate scenarios, increases in precipitation helped to offset reductions in streamflow associated with forest recovery. Some challenges associated with linking output from the forest dynamics simulations and the hydrologic model are identified and potential solutions discussed.

  16. The Impact of Mission Duration on a Mars Orbital Mission

    NASA Technical Reports Server (NTRS)

    Arney, Dale; Earle, Kevin; Cirillo, Bill; Jones, Christopher; Klovstad, Jordan; Grande, Melanie; Stromgren, Chel

    2017-01-01

    Performance alone is insufficient to assess the total impact of changing mission parameters on a space mission concept, architecture, or campaign; the benefit, cost, and risk must also be understood. This paper examines the impact to benefit, cost, and risk of changing the total mission duration of a human Mars orbital mission. The changes in the sizing of the crew habitat, including consumables and spares, were assessed as a function of duration, including trades of different life support strategies; this was used to assess the impact on transportation system requirements. The impact to benefit is minimal, while the impact on cost is dominated by the increases in transportation costs to achieve shorter total durations. The risk is expected to be reduced by decreasing total mission duration; however, large uncertainty exists around the magnitude of that reduction.

  17. Space radiation concerns for manned exploration.

    PubMed

    Stanford, M; Jones, J A

    1999-07-01

    Spaceflight exposes astronaut crews to natural ionizing radiation. To date, exposures in manned spaceflight have been well below the career limits recommended to NASA by the National Council of Radiation Protection and Measurements (NCRP). This will not be the case for long-duration exploratory class missions. Additionally, International Space Station (ISS) crews will receive higher doses than earlier flight crews. Uncertainties in our understanding of long-term bioeffects, as well as updated analyses of the Hiroshima, Nagasaki, and Chernobyl tumorigenesis data, have prompted the NCRP to recommend further reductions by 30-50% for career dose limit guidelines. Intelligent spacecraft design and material selection can provide a shielding strategy capable of maintaining crew exposures within recommended guidelines. Current studies on newer radioprotectant compounds may find combinations of agents which further diminish the risk of radiation-induced bioeffects to the crew.

  18. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/

  19. Optimization Control of the Color-Coating Production Process for Model Uncertainty

    PubMed Central

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563

  20. Optimization Control of the Color-Coating Production Process for Model Uncertainty.

    PubMed

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results.
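
    A minimal sketch of the partial least squares surrogate step described above, using scikit-learn's PLSRegression as an assumed stand-in and synthetic process data in place of the CCPP mechanistic model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Synthetic process data: operating variables -> film thickness and an
# economic-efficiency index (placeholders for the CCPP mechanistic model).
X = rng.normal(size=(200, 6))                       # oven temperatures, line speed, ...
thickness = 20 + X @ np.array([1.5, -0.8, 0.4, 0.0, 0.3, 0.1]) + rng.normal(0, 0.5, 200)
efficiency = 5 - 0.2 * thickness + X[:, 1] + rng.normal(0, 0.3, 200)
Y = np.column_stack([thickness, efficiency])

pls = PLSRegression(n_components=3)
pls.fit(X, Y)

x_new = rng.normal(size=(1, 6))
thick_pred, eff_pred = pls.predict(x_new)[0]
print(f"predicted thickness {thick_pred:.2f} um, efficiency index {eff_pred:.2f}")
```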

  1. Implicit knowledge of visual uncertainty guides decisions with asymmetric outcomes.

    PubMed

    Whiteley, Louise; Sahani, Maneesh

    2008-03-06

    Perception is an "inverse problem," in which the state of the world must be inferred from the sensory neural activity that results. However, this inference is both ill-posed (Helmholtz, 1856; Marr, 1982) and corrupted by noise (Green & Swets, 1989), requiring the brain to compute perceptual beliefs under conditions of uncertainty. Here we show that human observers performing a simple visual choice task under an externally imposed loss function approach the optimal strategy, as defined by Bayesian probability and decision theory (Berger, 1985; Cox, 1961). In concert with earlier work, this suggests that observers possess a model of their internal uncertainty and can utilize this model in the neural computations that underlie their behavior (Knill & Pouget, 2004). In our experiment, optimal behavior requires that observers integrate the loss function with an estimate of their internal uncertainty rather than simply requiring that they use a modal estimate of the uncertain stimulus. Crucially, they approach optimal behavior even when denied the opportunity to learn adaptive decision strategies based on immediate feedback. Our data thus support the idea that flexible representations of uncertainty are pre-existing, widespread, and can be propagated to decision-making areas of the brain.
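
    A numerical sketch of the decision rule the study tests: pick the action minimizing posterior expected loss under an asymmetric loss function, given Gaussian uncertainty about the stimulus. The noise level and the 4:1 loss asymmetry are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Posterior belief about the true stimulus position (Gaussian sensory noise).
mu, sigma = 0.0, 1.0
s = np.linspace(-6, 6, 2001)
w = norm.pdf(s, mu, sigma)
w /= w.sum()                           # discretised posterior weights

def loss(action, s):
    """Asymmetric loss: overshooting the target is penalised 4x more."""
    err = action - s
    return np.where(err > 0, 4.0 * err**2, err**2)

actions = np.linspace(-3, 3, 601)
expected_loss = [(loss(a, s) * w).sum() for a in actions]
best = actions[int(np.argmin(expected_loss))]
print(f"optimal action ~ {best:.2f} (shifted below the posterior mean at {mu:.1f})")
```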

  2. Development of risk management strategies for state DOTs to effectively deal with volatile prices of transportation construction materials.

    DOT National Transportation Integrated Search

    2014-06-01

    Volatility in price of critical materials used in transportation projects, such as asphalt cement, leads to : considerable uncertainty about project cost. This uncertainty may lead to price speculation and inflated : bid prices submitted by highway c...

  3. Cost-utility analysis of screening for diabetic retinopathy in Japan: a probabilistic Markov modeling study.

    PubMed

    Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu

    2015-02-01

    To evaluate the cost-effectiveness for a screening interval longer than 1 year detecting diabetic retinopathy (DR) through the estimation of incremental costs per quality-adjusted life year (QALY) based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was framed to calculate incremental costs per QALY gained by implementing a screening program detecting DR in Japan. A 1-year cycle length and population size of 50,000 with a 50-year time horizon (age 40-90 years) was used. Best available clinical data from publications and national surveillance data was used, and a model was designed including current diagnosis and management of DR with corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed considering uncertainties in the parameters. In the base-case analysis, the strategy with a screening program resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less. An eye screening program in Japan would be cost-effective in detecting DR and preventing blindness from DR, even allowing for the uncertainties in estimates of costs, utility, and current management of DR.
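
    A toy discounted Markov cohort model in the spirit of the analysis described; the states, transition probabilities, costs, utilities, and discount rate are placeholders, not the study's Japanese screening parameters.

```python
import numpy as np

# States: no retinopathy, retinopathy, blind, dead (annual cycle).
P = np.array([                       # hypothetical transition probabilities
    [0.93, 0.05, 0.00, 0.02],
    [0.00, 0.90, 0.07, 0.03],
    [0.00, 0.00, 0.95, 0.05],
    [0.00, 0.00, 0.00, 1.00],
])
cost_per_year    = np.array([200., 1500., 4000., 0.])   # thousand JPY, placeholders
utility_per_year = np.array([0.90, 0.80, 0.50, 0.00])   # QALY weights, placeholders
discount = 0.02

cohort = np.array([1.0, 0.0, 0.0, 0.0])                 # start disease-free
total_cost = total_qaly = 0.0
for year in range(50):
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * cohort @ cost_per_year
    total_qaly += df * cohort @ utility_per_year
    cohort = cohort @ P                                   # advance one cycle

print(f"discounted cost {total_cost:.0f} (thousand JPY), QALYs {total_qaly:.2f}")
# Running the same loop for a screening arm (different P plus screening costs)
# and dividing the cost difference by the QALY difference gives the ICER.
```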

  4. Electric Vehicles Charging Scheduling Strategy Considering the Uncertainty of Photovoltaic Output

    NASA Astrophysics Data System (ADS)

    Wei, Xiangxiang; Su, Su; Yue, Yunli; Wang, Wei; He, Luobin; Li, Hao; Ota, Yutaka

    2017-05-01

    The rapid development of electric vehicles (EVs) and distributed generation brings new challenges to the secure and economic operation of the power system, so coordinated study of EVs and distributed generation is important for the distribution network. Against this background, an EV charging scheduling strategy that considers the uncertainty of photovoltaic (PV) output is proposed. The characteristics of EV charging are analysed first, and a PV output prediction method is then proposed based on a PV database. On this basis, an EV charging scheduling strategy is developed with the goal of satisfying EV users' charging preferences and decreasing power losses in the distribution network. The case study shows that the proposed method predicts PV output accurately and that the charging scheduling strategy reduces power losses and smooths load fluctuations in the distribution network.

  5. Number-phase minimum-uncertainty state with reduced number uncertainty in a Kerr nonlinear interferometer

    NASA Astrophysics Data System (ADS)

    Kitagawa, M.; Yamamoto, Y.

    1987-11-01

    An alternative scheme for generating amplitude-squeezed states of photons based on unitary evolution which can properly be described by quantum mechanics is presented. This scheme is a nonlinear Mach-Zehnder interferometer containing an optical Kerr medium. The quasi-probability density (QPD) and photon-number distribution of the output field are calculated, and it is demonstrated that the reduced photon-number uncertainty and enhanced phase uncertainty maintain the minimum-uncertainty product. A self-phase-modulation of the single-mode quantized field in the Kerr medium is described based on localized operators. The spatial evolution of the state is demonstrated by QPD in the Schroedinger picture. It is shown that photon-number variance can be reduced to a level far below the limit for an ordinary squeezed state, and that the state prepared using this scheme remains a number-phase minimum-uncertainty state until the maximum reduction of number fluctuations is surpassed.

  6. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges.We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years.One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations.LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin.One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers.These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  7. Global Aerosol Direct Radiative Effect From CALIOP and C3M

    NASA Technical Reports Server (NTRS)

    Winker, Dave; Kato, Seiji; Tackett, Jason

    2015-01-01

    Aerosols are responsible for the largest uncertainties in current estimates of climate forcing. These uncertainties are due in part to the limited abilities of passive sensors to retrieve aerosols in cloudy skies. We use a dataset which merges CALIOP observations together with other A-train observations to estimate aerosol radiative effects in cloudy skies as well as in cloud-free skies. The results can be used to quantify the reduction of aerosol radiative effects in cloudy skies relative to clear skies and to reduce current uncertainties in aerosol radiative effects.

  8. Global Aerosol Direct Radiative Effect from CALIOP and C3M

    NASA Technical Reports Server (NTRS)

    Winker, Dave; Kato, Seiji; Tackett, Jason

    2015-01-01

    Aerosols are responsible for the largest uncertainties in current estimates of climate forcing. These uncertainties are due in part to the limited abilities of passive sensors to retrieve aerosols in cloudy skies. We use a dataset which merges CALIOP observations together with other A-train observations to estimate aerosol radiative effects in cloudy skies as well as in cloud-free skies. The results can be used to quantify the reduction of aerosol radiative effects in cloudy skies relative to clear skies and to reduce current uncertainties in aerosol radiative effects.

  9. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

    In the last decades the need for an effective seismic protection and vulnerability reduction of cultural heritage buildings and sites determined a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and control the response as long as a clear worsening or damaging process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel to the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform the continuous real time treatment of static data and the identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out the environmental effects and thermal cycles from the extracted features.
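
    A small sketch of one common way to filter environmental effects from monitored features, as described above: regress an identified natural frequency on temperature over a training window and track the residuals against a control band; the monitoring record here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic monitoring record: daily temperature and identified first natural
# frequency, which drifts with temperature (a typical environmental effect).
days = np.arange(365)
temp = 12 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, days.size)
freq = 2.50 - 0.004 * temp + rng.normal(0, 0.005, days.size)

# Fit frequency ~ temperature on a training window, then monitor residuals.
coef = np.polyfit(temp[:180], freq[:180], deg=1)
residual = freq - np.polyval(coef, temp)

threshold = 3 * residual[:180].std()
alarms = np.where(np.abs(residual) > threshold)[0]
print(f"temperature-corrected residual std: {residual[:180].std():.4f} Hz, "
      f"{alarms.size} exceedances of the 3-sigma band")
```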

  10. Frameworks and tools for risk assessment of manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Gottardo, Stefania; Semenzin, Elena; Oomen, Agnes; Bos, Peter; Peijnenburg, Willie; van Tongeren, Martie; Nowack, Bernd; Hunt, Neil; Brunelli, Andrea; Scott-Fordsmand, Janeck J; Tran, Lang; Marcomini, Antonio

    2016-10-01

    Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Setting the most robust effluent level under severe uncertainty: application of information-gap decision theory to chemical management.

    PubMed

    Yokomizo, Hiroyuki; Naito, Wataru; Tanaka, Yoshinari; Kamo, Masashi

    2013-11-01

    Decisions in ecological risk management for chemical substances must be made based on incomplete information due to uncertainties. To protect the ecosystems from the adverse effect of chemicals, a precautionary approach is often taken. The precautionary approach, which is based on conservative assumptions about the risks of chemical substances, can be applied selecting management models and data. This approach can lead to an adequate margin of safety for ecosystems by reducing exposure to harmful substances, either by reducing the use of target chemicals or putting in place strict water quality criteria. However, the reduction of chemical use or effluent concentrations typically entails a financial burden. The cost effectiveness of the precautionary approach may be small. Hence, we need to develop a formulaic methodology in chemical risk management that can sufficiently protect ecosystems in a cost-effective way, even when we do not have sufficient information for chemical management. Information-gap decision theory can provide the formulaic methodology. Information-gap decision theory determines which action is the most robust to uncertainty by guaranteeing an acceptable outcome under the largest degree of uncertainty without requiring information about the extent of parameter uncertainty at the outset. In this paper, we illustrate the application of information-gap decision theory to derive a framework for setting effluent limits of pollutants for point sources under uncertainty. Our application incorporates a cost for reduction in pollutant emission and a cost to wildlife species affected by the pollutant. Our framework enables us to settle upon actions to deal with severe uncertainty in ecological risk management of chemicals. Copyright © 2013 Elsevier Ltd. All rights reserved.
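
    A compact sketch of an info-gap robustness calculation: for each candidate effluent limit, the largest fractional deviation of an uncertain damage coefficient from its nominal value for which total (abatement plus ecological) cost stays acceptable; the cost model and numbers are invented for illustration.

```python
import numpy as np

k0 = 2.0          # nominal ecological damage per unit effluent concentration
C_crit = 30.0     # largest total cost judged acceptable

def abatement_cost(limit):
    """Cheaper to allow more effluent; tightening the limit is expensive."""
    return 40.0 / (limit + 1.0)

limits = np.linspace(0.5, 10.0, 200)
slack = C_crit - abatement_cost(limits) - k0 * limits
robustness = np.clip(slack / (k0 * limits), 0.0, None)   # max tolerable fractional error in k0

best = limits[np.argmax(robustness)]
print(f"most robust effluent limit ~ {best:.2f} (alpha-hat = {robustness.max():.2f})")
```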

  12. Air Force Air Refueling: The KC-X Aircraft Acquisition Program

    DTIC Science & Technology

    2008-04-04

    Report excerpts reference the National Military Strategy (NMS) and the Mobility Capability Study, note concerns about the condition and sustainment costs of the KC-135, and observe that “an early replacement program would be a hedging strategy against that uncertainty”; based on the President’s overall national security strategy, DOD periodically studies the global threat environment.

  13. Detailed Uncertainty Analysis of the Ares I A106 Liftoff/Transition Database

    NASA Technical Reports Server (NTRS)

    Hanke, Jeremy L.

    2011-01-01

    The Ares I A106 Liftoff/Transition Force and Moment Aerodynamics Database describes the aerodynamics of the Ares I Crew Launch Vehicle (CLV) from the moment of liftoff through the transition from high to low total angles of attack at low subsonic Mach numbers. The database includes uncertainty estimates that were developed using a detailed uncertainty quantification procedure. The Ares I Aerodynamics Panel developed both the database and the uncertainties from wind tunnel test data acquired in the NASA Langley Research Center's 14- by 22-Foot Subsonic Wind Tunnel Test 591 using a 1.75 percent scale model of the Ares I and the tower assembly. The uncertainty modeling contains three primary uncertainty sources: experimental uncertainty, database modeling uncertainty, and database query interpolation uncertainty. The final database and uncertainty model represent a significant improvement in the quality of the aerodynamic predictions for this regime of flight over the estimates previously used by the Ares Project. The maximum possible aerodynamic force pushing the vehicle towards the launch tower assembly in a dispersed case using this database saw a 40 percent reduction from the worst-case scenario in previously released data for Ares I.

  14. The Development of a Diagnostic-Prescriptive Tool for Undergraduates Seeking Information for a Social Science/Humanities Assignment. III. Enabling Devices.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Ungar, Andras

    2000-01-01

    This article focuses on a study of undergraduates writing an essay for a remedial writing course that tested two devices, an uncertainty expansion device and an uncertainty reduction device. Highlights include Kuhlthau's information search process model, and enabling technology devices for the information needs of information retrieval system…

  15. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
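
    A minimal sketch of the bootstrap idea underlying the framework: resample an annual-maxima record, refit a flood-frequency distribution and collect the spread of the resulting design peaks. The synthetic record and the choice of a Gumbel distribution are illustrative assumptions, not the authors' setup, which also covers hydrograph volume and shape.

```python
# Bootstrap spread of a 100-year design peak, assuming a synthetic Gumbel record.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
record = stats.gumbel_r.rvs(loc=100, scale=30, size=40, random_state=rng)  # 40-yr record

T = 100                                   # return period of interest (years)
p = 1.0 - 1.0 / T
peaks = []
for _ in range(1000):                     # bootstrap replicates
    sample = rng.choice(record, size=record.size, replace=True)
    loc, scale = stats.gumbel_r.fit(sample)
    peaks.append(stats.gumbel_r.ppf(p, loc=loc, scale=scale))

lo, hi = np.percentile(peaks, [5, 95])
print(f"100-year peak: {np.median(peaks):.0f} (90% interval {lo:.0f}-{hi:.0f})")
```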

  16. Bayesian Monte Carlo and Maximum Likelihood Approach for Uncertainty Estimation and Risk Management: Application to Lake Oxygen Recovery Model

    EPA Science Inventory

    Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...

  17. Large contribution of natural aerosols to uncertainty in indirect forcing

    NASA Astrophysics Data System (ADS)

    Carslaw, K. S.; Lee, L. A.; Reddington, C. L.; Pringle, K. J.; Rap, A.; Forster, P. M.; Mann, G. W.; Spracklen, D. V.; Woodhouse, M. T.; Regayre, L. A.; Pierce, J. R.

    2013-11-01

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  18. An integrative cross-design synthesis approach to estimate the cost of illness: an applied case to the cost of depression in Catalonia.

    PubMed

    Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis

    2013-04-01

    Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.

  19. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  20. Emissions of halocarbons from mobile vehicle air conditioning system in Hong Kong.

    PubMed

    Yan, H H; Guo, H; Ou, J M

    2014-08-15

    During the implementation of the Montreal Protocol, emission inventories of halocarbons in different sectors at the regional scale are fundamental to the formulation of relevant management strategies and to checking the efficiency of implementation. This study investigated the emission profile of halocarbons used in the mobile vehicle air conditioning system, the leading sector of the refrigeration industry in terms of refrigerant bank, market and emission, in the Hong Kong Special Administrative Region, using a bottom-up approach developed from the 2006 IPCC Good Practice Guidance. The results showed that emissions of CFC-12 peaked at 53 tons ODP (Ozone Depletion Potential) in 1992 and then gradually diminished, whereas HFC-134a presented an increasing emission trend since the 1990s and its emissions reached 65,000 tons CO2-equivalent (CO2-eq) by the end of 2011. Uncertainty analysis revealed relatively high levels of uncertainty for special-purpose vehicles and government vehicles. Moreover, greenhouse gas (GHG) abatements under different scenarios indicated that the potential emission reduction of HFC-134a ranged from 4.1 × 10^5 to 8.4 × 10^5 tons CO2-eq. The findings in this study advance our knowledge of halocarbon emissions from the mobile vehicle air conditioning system in Hong Kong. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Agent-based station for on-line diagnostics by self-adaptive laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Serafini, S.; Paone, N.; Castellini, P.

    2013-12-01

    A self-adaptive diagnostic system based on laser vibrometry is proposed for quality control of mechanical defects by vibration testing; it is developed for appliances at the end of an assembly line, but its characteristics are generally suited for testing most types of electromechanical products. It consists of a laser Doppler vibrometer, equipped with scanning mirrors and a camera, which implements self-adaptive behaviour for optimizing the measurement. The system is conceived as a Quality Control Agent (QCA) and is part of a Multi Agent System that supervises the whole production line. The QCA behaviour is defined so as to minimize measurement uncertainty during the on-line tests and to compensate for target mis-positioning under guidance of a vision system. Best measurement conditions are reached by maximizing the amplitude of the optical Doppler beat signal (signal quality), thereby minimizing uncertainty. In this paper, the optimization strategy for measurement enhancement achieved by the downhill simplex algorithm (Nelder-Mead algorithm) and its effect on signal quality improvement is discussed. Tests on a washing machine in controlled operating conditions allow the efficacy of the method to be evaluated; a significant reduction of noise on vibration velocity spectra is observed. Results from on-line tests are presented, which demonstrate the potential of the system for industrial quality control.
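
    The measurement-point optimization described above can be pictured with the small sketch below, which uses SciPy's Nelder-Mead (downhill simplex) routine to steer two mirror angles toward the position maximizing Doppler signal quality; the signal_quality surface is a made-up stand-in for the real optical signal.

```python
# Nelder-Mead search for the mirror angles that maximize a (hypothetical) signal.
import numpy as np
from scipy.optimize import minimize

def signal_quality(xy):
    """Hypothetical optical signal amplitude, peaked near mirror angles (1.2, -0.4)."""
    x, y = xy
    return np.exp(-((x - 1.2) ** 2 + (y + 0.4) ** 2))

# Nelder-Mead minimizes, so we minimize the negative of the signal quality.
result = minimize(lambda xy: -signal_quality(xy), x0=[0.0, 0.0],
                  method="Nelder-Mead", options={"xatol": 1e-4, "fatol": 1e-6})
print("best mirror angles:", result.x, "signal quality:", -result.fun)
```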

  2. An adaptive strategy for active debris removal

    NASA Astrophysics Data System (ADS)

    White, Adam E.; Lewis, Hugh G.

    2014-04-01

    Many parameters influence the evolution of the near-Earth debris population, including launch, solar, explosion and mitigation activities, as well as other future uncertainties such as advances in space technology or changes in social and economic drivers that affect the utilisation of space. These factors lead to uncertainty in the long-term debris population. This uncertainty makes it difficult to identify potential remediation strategies, involving active debris removal (ADR), that will perform effectively in all possible future cases. Strategies that cannot perform effectively, because of this uncertainty, risk either not achieving their intended purpose, or becoming a hindrance to the efforts of spacecraft manufacturers and operators to address the challenges posed by space debris. One method to tackle this uncertainty is to create a strategy that can adapt and respond to the space debris population. This work explores the concept of an adaptive strategy, in terms of the number of objects required to be removed by ADR, to prevent the low Earth orbit (LEO) debris population from growing in size. This was demonstrated by using the University of Southampton’s Debris Analysis and Monitoring Architecture to the Geosynchronous Environment (DAMAGE) tool to investigate ADR rates (number of removals per year) that change over time in response to the current space environment, with the requirement of achieving zero growth of the LEO population. DAMAGE was used to generate multiple Monte Carlo projections of the future LEO debris environment. Within each future projection, the debris removal rate was derived at five-year intervals by a new statistical debris evolutionary model called the Computational Adaptive Strategy to Control Accurately the Debris Environment (CASCADE) model. CASCADE predicted the long-term evolution of the current DAMAGE population with a variety of different ADR rates in order to identify a removal rate that produced zero net growth for that particular projection after 200 years. The results show that using an adaptive ADR rate generated by CASCADE, alongside good compliance with existing mitigation measures, increases the probability of achieving a constant LEO population of objects greater than 10 cm. This probability was 12% greater compared with removing five objects per year, with the additional advantage of requiring only 3.1 removals per year, on average.

  3. Systematic Evaluation of Stochastic Methods in Power System Scheduling and Dispatch with Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yishen; Zhou, Zhi; Liu, Cong

    2016-08-01

    As more wind power and other renewable resources are being integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factor and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty. However, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.

  4. Poverty Reduction and the World Bank. Progress in Fiscal 1996 and 1997.

    ERIC Educational Resources Information Center

    World Bank, Washington, DC.

    This report reviews progress in implementation of the World Bank's poverty reduction strategy during fiscal 1996-97. Chapter 1, "The World Bank's Poverty Reduction Strategy and Future Directions," outlines elements in the poverty reduction strategy: policies to promote broad-based labor-demanding growth and increase the productivity and…

  5. Fuel Efficient Strategies for Reducing Contrail Formations in United States Air Space

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Chen, Neil Y.; Ng, Hok K.

    2010-01-01

    This paper describes a class of strategies for reducing persistent contrail formation in the United States airspace. The primary objective is to minimize potential contrail formation regions by altering the aircraft's cruising altitude in a fuel-efficient way. The results show that contrail formation can be reduced significantly without extra fuel consumption and without adversely affecting congestion in the airspace. Contrail formation can be reduced further by using extra fuel. For the day tested, the maximal reduction strategy has a 53% contrail reduction rate. The most fuel-efficient strategy has an 8% reduction rate with 2.86% less fuel burned compared to the maximal reduction strategy. Using a cost function that penalizes extra fuel consumed while maximizing the amount of contrail reduction provides a flexible way to trade off between contrail reduction and fuel consumption; it can achieve a 35% contrail reduction rate with only 0.23% extra fuel consumption. The proposed fuel-efficient contrail reduction strategy provides a solution for reducing aviation-induced environmental impact on a daily basis.
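
    The trade-off described above can be expressed as a simple scalar cost that rewards contrail reduction and penalizes extra fuel burn; the sketch below evaluates such a cost for three candidate outcomes loosely based on the figures quoted in the abstract, with an arbitrary penalty weight. It illustrates the idea only and is not the authors' cost function.

```python
# Toy contrail-vs-fuel trade-off: lower combined cost is better.
def combined_cost(contrail_reduction_pct, extra_fuel_pct, fuel_weight=10.0):
    """Reward contrail reduction, penalize extra fuel burn (weight is arbitrary)."""
    return -contrail_reduction_pct + fuel_weight * extra_fuel_pct

# Hypothetical candidate outcomes: (contrail reduction %, extra fuel %)
candidates = {
    "maximal reduction": (53.0, 2.86),
    "most fuel-efficient": (8.0, 0.0),
    "balanced": (35.0, 0.23),
}
best = min(candidates, key=lambda name: combined_cost(*candidates[name]))
print("preferred strategy under this weighting:", best)
```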

  6. Wave-optics uncertainty propagation and regression-based bias model in GNSS radio occultation bending angle retrievals

    NASA Astrophysics Data System (ADS)

    Gorbunov, Michael E.; Kirchengast, Gottfried

    2018-01-01

    A new reference occultation processing system (rOPS) will include a Global Navigation Satellite System (GNSS) radio occultation (RO) retrieval chain with integrated uncertainty propagation. In this paper, we focus on wave-optics bending angle (BA) retrieval in the lower troposphere and introduce (1) an empirically estimated boundary layer bias (BLB) model then employed to reduce the systematic uncertainty of excess phases and bending angles in about the lowest 2 km of the troposphere and (2) the estimation of (residual) systematic uncertainties and their propagation together with random uncertainties from excess phase to bending angle profiles. Our BLB model describes the estimated bias of the excess phase transferred from the estimated bias of the bending angle, for which the model is built, informed by analyzing refractivity fluctuation statistics shown to induce such biases. The model is derived from regression analysis using a large ensemble of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) RO observations and concurrent European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields. It is formulated in terms of predictors and adaptive functions (powers and cross products of predictors), where we use six main predictors derived from observations: impact altitude, latitude, bending angle and its standard deviation, canonical transform (CT) amplitude, and its fluctuation index. Based on an ensemble of test days, independent of the days of data used for the regression analysis to establish the BLB model, we find the model very effective for bias reduction and capable of reducing bending angle and corresponding refractivity biases by about a factor of 5. The estimated residual systematic uncertainty, after the BLB profile subtraction, is lower bounded by the uncertainty from the (indirect) use of ECMWF analysis fields but is significantly lower than the systematic uncertainty without BLB correction. The systematic and random uncertainties are propagated from excess phase to bending angle profiles, using a perturbation approach and the wave-optical method recently introduced by Gorbunov and Kirchengast (2015), starting with estimated excess phase uncertainties. The results are encouraging and this uncertainty propagation approach combined with BLB correction enables a robust reduction and quantification of the uncertainties of excess phases and bending angles in the lower troposphere.

  7. Cigarette price minimization strategies in the United States: price reductions and responsiveness to excise taxes.

    PubMed

    Pesko, Michael F; Licht, Andrea S; Kruger, Judy M

    2013-11-01

    Because cigarette price minimization strategies can provide substantial price reductions for individuals continuing their usual smoking behaviors following federal and state cigarette excise tax increases, we examined independent price reductions compensating for overlapping strategies. The possible availability of larger independent price reduction opportunities in states with higher cigarette excise taxes is explored. Regression analysis used the 2006-2007 Tobacco Use Supplement of the Current Population Survey (N = 26,826) to explore national and state-level independent price reductions that smokers obtained from purchasing cigarettes (a) by the carton, (b) in a state with a lower average after-tax cigarette price than in the state of residence, and (c) in "some other way," including online or in another country. Price reductions from these strategies are estimated jointly to compensate for known overlapping strategies. Each strategy reduced the price of cigarettes by 64-94 cents per pack. These price reductions are 9%-22% lower than conventionally estimated results not compensating for overlapping strategies. Price reductions vary substantially by state. Following cigarette excise tax increases, the price reduction available from purchasing cigarettes by cartons increased. Additionally, the price reduction from purchasing cigarettes in a state with a lower average after-tax cigarette price is positively associated with state cigarette excise tax rates and border state cigarette excise tax rate differentials. Findings from this large, nationally representative study of cigarette smokers suggest that price reductions are larger in states with higher cigarette excise taxes, and increase as cigarette excise taxes rise.

  8. Effect of Natural Organic Matter on the Reduction of Nitroaromatics by Fe(II) Species

    EPA Science Inventory

    Although natural organic matter is a necessary electron source for the microbial mediated development of redox zones in nature, uncertainty still exists regarding its role(s) in the reduction of chemicals. This work studied the effect of Suwannee river humic acid (SRHA) on the r...

  9. Research on green supply chain coordination strategy for uncertain market demand.

    PubMed

    Cao, Jian; Chen, Yangyang; Lu, Bo; Tong, Chenlu; Zhou, Gengui

    2015-03-01

    Against the background of a green market that has only begun to develop in Mainland China (e.g. the pharmaceutical industry), this paper discusses how members of the green supply chain (GSC) can cooperate effectively in supply chain operations. To address the uncertainties in the market demand for green products, a GSC coordination strategy is put forward based on a Stackelberg game in which the manufacturer is the leader and the distributors are the followers. The relationship between the proposed coordination strategy and several factors, including the number of distributors, the distributors' risk aversion and the uncertainty of market demand, is analyzed. The analysis indicates that, when there is uncertainty in the market demand for the green product, the revenue of each enterprise, the overall revenue and the customers' welfare all decrease, while an increase in the number of distributors and low distributor risk aversion are beneficial to the entire GSC and to the customer. The conclusions provide useful guidance for the operational decisions of the green supply chain when the green market is in its initial formation.

  10. Numerical solution of a conspicuous consumption model with constant control delay

    PubMed Central

    Huschto, Tony; Feichtinger, Gustav; Hartl, Richard F.; Kort, Peter M.; Sager, Sebastian; Seidl, Andrea

    2011-01-01

    We derive optimal pricing strategies for conspicuous consumption products in periods of recession. To that end, we formulate and investigate a two-stage economic optimal control problem that takes uncertainty of the recession period length and delay effects of the pricing strategy into account. This non-standard optimal control problem is difficult to solve analytically, and solutions depend on the variable model parameters. Therefore, we use a numerical result-driven approach. We propose a structure-exploiting direct method for optimal control to solve this challenging optimization problem. In particular, we discretize the uncertainties in the model formulation by using scenario trees and target the control delays by introduction of slack control functions. Numerical results illustrate the validity of our approach and show the impact of uncertainties and delay effects on optimal economic strategies. During the recession, delayed optimal prices are higher than the non-delayed ones. In the normal economic period, however, this effect is reversed and optimal prices with a delayed impact are smaller compared to the non-delayed case. PMID:22267871

  11. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. This Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
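
    A minimal sketch of the Monte Carlo idea: draw a depth-damage function at random from a library of candidate curves and propagate the spread to the damage estimate. The three curves, the depth and the building value below are invented placeholders; the study itself used 272 functions from 7 damage models.

```python
# Monte Carlo over a (toy) depth-damage function library.
import numpy as np

rng = np.random.default_rng(0)

# Each entry maps water depth (m) to a damage fraction of the building value.
damage_library = [
    lambda d: np.clip(0.30 * d, 0, 1),              # linear
    lambda d: np.clip(1 - np.exp(-0.5 * d), 0, 1),  # saturating
    lambda d: np.clip(0.15 * d ** 1.5, 0, 1),       # convex
]

depth_m = 0.8            # small water depth, where uncertainty tends to be largest
building_value = 250_000.0

estimates = []
for _ in range(10_000):
    f = damage_library[rng.integers(len(damage_library))]
    estimates.append(f(depth_m) * building_value)

print(f"damage range (5th-95th pct): "
      f"{np.percentile(estimates, 5):,.0f} - {np.percentile(estimates, 95):,.0f}")
```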

  12. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. As input the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.

  13. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.

  14. Modelling the fate of organic micropollutants in stormwater ponds.

    PubMed

    Vezzaro, Luca; Eriksson, Eva; Ledin, Anna; Mikkelsen, Peter S

    2011-06-01

    Urban water managers need to estimate the potential removal of organic micropollutants (MP) in stormwater treatment systems to support MP pollution control strategies. This study documents how the potential removal of organic MP in stormwater treatment systems can be quantified by using multimedia models. The fate of four different MP in a stormwater retention pond was simulated by applying two steady-state multimedia fate models (EPI Suite and SimpleBox) commonly applied in chemical risk assessment and a dynamic multimedia fate model (Stormwater Treatment Unit Model for Micro Pollutants--STUMP). The four simulated organic stormwater MP (iodopropynyl butylcarbamate--IPBC, benzene, glyphosate and pyrene) were selected according to their different urban sources and environmental fate. This ensures that the results can be extended to other relevant stormwater pollutants. All three models use substance inherent properties to calculate MP fate but differ in their ability to represent the small physical scale and high temporal variability of stormwater treatment systems. Therefore the three models generate different results. A Global Sensitivity Analysis (GSA) highlighted that settling/resuspension of particulate matter was the most sensitive process for the dynamic model. The uncertainty of the estimated MP fluxes can be reduced by calibrating the dynamic model against total suspended solids data. This reduction in uncertainty was more significant for the substances with strong tendency to sorb, i.e. glyphosate and pyrene and less significant for substances with a smaller tendency to sorb, i.e. IPBC and benzene. The results provide support to the elaboration of MP pollution control strategies by limiting the need for extensive and complex monitoring campaigns targeting the wide range of specific organic MP found in stormwater runoff. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Assessing risks and preventing disease from environmental chemicals.

    PubMed

    Dunnette, D A

    1989-01-01

    In the last 25 years there has been considerable concern expressed about the extent to which chemical agents in the ambient and work environments are contributing to the causation of disease. This concern is a logical extension of our increased knowledge of the real and potential effects of environmental chemicals and the methodological difficulties in applying new knowledge that could help prevent environmentally induced disease. Chemical risk assessment offers an approach to estimating risks and involves consideration of relevant information including identification of chemical hazards, evaluation of the dose-response relationship, estimation of exposure and finally, risk characterization. Particularly significant uncertainties which are inherent in use of this and other risk models include animal-human and low dose-high dose extrapolation and estimation of exposure. Community public health risks from exposure to environmental chemicals appear to be small relative to other public health risks based on information related to cancer trends, dietary intake of synthetic chemicals, assessment data on substances such as DDT and "dioxin," public health effects of hazardous waste sites and contextual considerations. Because of inherent uncertainty in the chemical risk assessment process, however, we need to apply what methods are available in our efforts to prevent disease induced by environmental chemicals. There are a number of societal strategies which can contribute to overall reduction of risk from environmental chemicals. These include acquisition of information on environmental risk including toxicity, intensity and extensity of exposure, biological monitoring, disease surveillance, improvement in epidemiological methods, control of environmental chemical exposures, and dissemination of hazardous chemical information. Responsible environmental risk communication and information transfer appear to be among the most important of the available strategies for preventing disease induced by chemicals in the environment.

  16. The Foundation Supernova Survey: motivation, design, implementation, and first data release

    NASA Astrophysics Data System (ADS)

    Foley, Ryan J.; Scolnic, Daniel; Rest, Armin; Jha, S. W.; Pan, Y.-C.; Riess, A. G.; Challis, P.; Chambers, K. C.; Coulter, D. A.; Dettman, K. G.; Foley, M. M.; Fox, O. D.; Huber, M. E.; Jones, D. O.; Kilpatrick, C. D.; Kirshner, R. P.; Schultz, A. S. B.; Siebert, M. R.; Flewelling, H. A.; Gibson, B.; Magnier, E. A.; Miller, J. A.; Primak, N.; Smartt, S. J.; Smith, K. W.; Wainscoat, R. J.; Waters, C.; Willman, M.

    2018-03-01

    The Foundation Supernova Survey aims to provide a large, high-fidelity, homogeneous, and precisely calibrated low-redshift Type Ia supernova (SN Ia) sample for cosmology. The calibration of the current low-redshift SN sample is the largest component of systematic uncertainties for SN cosmology, and new data are necessary to make progress. We present the motivation, survey design, observation strategy, implementation, and first results for the Foundation Supernova Survey. We are using the Pan-STARRS telescope to obtain photometry for up to 800 SNe Ia at z ≲ 0.1. This strategy has several unique advantages: (1) the Pan-STARRS system is a superbly calibrated telescopic system, (2) Pan-STARRS has observed 3/4 of the sky in grizyP1 making future template observations unnecessary, (3) we have a well-tested data-reduction pipeline, and (4) we have observed ~3000 high-redshift SNe Ia on this system. Here, we present our initial sample of 225 SN Ia grizP1 light curves, of which 180 pass all criteria for inclusion in a cosmological sample. The Foundation Supernova Survey already contains more cosmologically useful SNe Ia than all other published low-redshift SN Ia samples combined. We expect that the systematic uncertainties for the Foundation Supernova Sample will be two to three times smaller than other low-redshift samples. We find that our cosmologically useful sample has an intrinsic scatter of 0.111 mag, smaller than other low-redshift samples. We perform detailed simulations showing that simply replacing the current low-redshift SN Ia sample with an equally sized Foundation sample will improve the precision on the dark energy equation-of-state parameter by 35 per cent, and the dark energy figure of merit by 72 per cent.

  17. The increasing importance of a continence nurse specialist to improve outcomes and save costs of urinary incontinence care: an analysis of future policy scenarios.

    PubMed

    Franken, Margreet G; Corro Ramos, Isaac; Los, Jeanine; Al, Maiwenn J

    2018-02-17

    In an ageing population, it is inevitable to improve the management of care for community-dwelling elderly with incontinence. A previous study showed that implementation of the Optimum Continence Service Specification (OCSS) for urinary incontinence in community-dwelling elderly with four or more chronic diseases results in a reduction of urinary incontinence, an improved quality of life, and lower healthcare and lower societal costs. The aim of this study was to explore the future consequences of the OCSS strategy under various healthcare policy scenarios in an ageing population. We adapted a previously developed decision analytical model in which the OCSS new care strategy was operationalised as the appointment of a continence nurse specialist located within the general practice in The Netherlands. We used a societal perspective including healthcare costs (healthcare providers, treatment costs, insured containment products, insured home care), and societal costs (informal caregiving, containment products paid out-of-pocket, travelling expenses, home care paid out-of-pocket). All outcomes were computed over a three-year time period using two different base years (2014 and 2030). Settings for future policy scenarios were based on desk research and expert opinion. Our results show that implementation of the OCSS new care strategy for urinary incontinence would yield large health gains in community-dwelling elderly (2030: 2592-2618 QALYs gained) and large cost savings in The Netherlands (2030: healthcare perspective: €32.4 million - €72.5 million; societal perspective: €182.0 million - €250.6 million). Savings can be generated in different categories, depending on healthcare policy. The uncertainty analyses and extreme case scenarios showed the robustness of the results. Implementation of the OCSS new care strategy for urinary incontinence results in an improvement in the quality of life of community-dwelling elderly, a reduction of the costs for payers and affected elderly, and a reduction in time invested by carers. Various realistic policy scenarios even forecast larger health gains and cost savings in the future. More importantly, the longer the implementation is postponed, the larger the savings foregone. The future organisation of healthcare affects the category in which the greatest savings will be generated.

  18. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed interval (1-32 days) or a rule-based (decision tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimation increased along with the fixed interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as the fixed interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux. These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
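
    The rule-based sampling idea can be sketched as follows: fit a small regression tree to simulated daily fluxes, then pick a few representative days from each terminal node instead of sampling at a fixed interval. The synthetic flux series and the two predictors are placeholders for agroecosystem-model output, not the study's data.

```python
# Decision-tree-guided selection of sampling days from a synthetic daily flux series.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
days = 365
soil_moisture = rng.uniform(0.1, 0.4, days)
soil_temp = 10 + 15 * np.sin(np.arange(days) / 365 * 2 * np.pi) + rng.normal(0, 2, days)
X = np.column_stack([soil_moisture, soil_temp])
flux = 0.5 * soil_moisture * np.maximum(soil_temp, 0) + rng.normal(0, 0.2, days)

tree = DecisionTreeRegressor(max_leaf_nodes=8, random_state=0).fit(X, flux)
leaves = tree.apply(X)                      # terminal node id for each day

samples_per_leaf = 2
sampled_days = []
for leaf in np.unique(leaves):
    members = np.flatnonzero(leaves == leaf)
    picked = rng.choice(members, size=min(samples_per_leaf, members.size), replace=False)
    sampled_days.extend(picked.tolist())

print(f"{len(sampled_days)} sampling days selected out of {days}")
```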

  19. Use of meteorological information in the risk analysis of a mixed wind farm and solar power plant portfolio

    NASA Astrophysics Data System (ADS)

    Mengelkamp, H.-T.; Bendel, D.

    2010-09-01

    The renewable energy industry has developed rapidly during the last two decades, and so have the needs for high-quality, comprehensive meteorological services. It is, however, only recently that international financial institutions bundle wind farms and solar power plants and offer shares in these aggregate portfolios. The monetary value of a mixed wind farm and solar power plant portfolio is determined by legal and technical aspects, the expected annual energy production of each wind farm and solar power plant, and the associated uncertainty of the energy yield estimation or the investment risk. Building an aggregate portfolio will reduce the overall uncertainty through diversification, in contrast to the single wind farm/solar power plant energy yield uncertainty. This is similar to equity funds based on a variety of companies or products. Meteorological aspects contribute to the diversification in various ways. There is the uncertainty in the estimation of the expected long-term mean energy production of the wind and solar power plants. Different components of uncertainty have to be considered depending on whether the power plant is already in operation or in the planning phase. The uncertainty related to a wind farm in the planning phase comprises the methodology of the wind potential estimation and the uncertainty of the site-specific wind turbine power curve, as well as the uncertainty of the wind farm effect calculation. The uncertainty related to a solar power plant in the pre-operational phase comprises the uncertainty of the radiation data base and that of the performance curve. The long-term mean annual energy yield of operational wind farms and solar power plants is estimated on the basis of the actual energy production and its relation to a climatologically stable long-term reference period. These components of uncertainty are of a technical nature and based on subjective estimations rather than on a statistically sound data analysis. And then there is the temporal and spatial variability of the wind speed and radiation. Their influence on the overall risk is determined by the regional distribution of the power plants. These uncertainty components are calculated on the basis of wind speed observations and simulations and satellite-derived radiation data. The respective volatility (temporal variability) is calculated from the site-specific time series, and the influence on the portfolio through regional correlation. For an exemplary portfolio comprising fourteen wind farms and eight solar power plants, the annual mean energy production to be expected is calculated, and the different components of uncertainty are estimated for each single wind farm and solar power plant and for the portfolio as a whole. The reduction in uncertainty (or risk) achieved by bundling the wind farms and the solar power plants (the portfolio effect) is calculated with Markowitz's Modern Portfolio Theory. This theory is applied separately to the wind farm and the solar power plant bundles and to the combination of both. The combination of wind and photovoltaic assets clearly shows potential for risk reduction. Even assets with a comparably low expected return can lead to a significant risk reduction depending on their individual characteristics.
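
    The portfolio effect invoked above reduces to a standard variance calculation: the uncertainty of a bundle of imperfectly correlated assets is lower than the weighted average of the individual uncertainties. The standard deviations, weights and correlation below are illustrative values, not the study's.

```python
# Portfolio-level yield uncertainty for a two-asset (wind + solar) bundle.
import numpy as np

sigma = np.array([0.12, 0.05])      # relative yield uncertainty: wind, solar
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])       # assumed wind-solar correlation
cov = np.outer(sigma, sigma) * corr

weights = np.array([0.6, 0.4])      # share of expected production
portfolio_sigma = np.sqrt(weights @ cov @ weights)
naive_sigma = weights @ sigma       # what full correlation would give

print(f"portfolio uncertainty: {portfolio_sigma:.3f} vs naive {naive_sigma:.3f}")
```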

  20. Efficient Data-Worth Analysis Using a Multilevel Monte Carlo Method Applied in Oil Reservoir Simulations

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Evans, K. J.

    2017-12-01

    Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great number of expectation estimations, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and are consistent with the estimation obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, with up to 600 days of cost savings when one processor is used.
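
    A simplified sketch of the multilevel Monte Carlo estimator: the expectation is written as a cheap coarse-level estimate plus corrections from successively finer levels, E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}], with many samples on the cheap levels and few on the expensive ones. The toy model below stands in for the reservoir simulator and is purely illustrative.

```python
# Toy multilevel Monte Carlo estimate of an expectation.
import numpy as np

rng = np.random.default_rng(7)

def model(theta, level):
    """Toy model: finer levels (larger `level`) are more accurate (smaller bias)."""
    bias = 0.5 ** (level + 1)
    return np.sin(theta) + bias * np.cos(3 * theta)

levels = 3
samples_per_level = [4000, 400, 40]   # fewer samples where the model is costly

estimate = 0.0
for level in range(levels):
    theta = rng.uniform(0, np.pi, samples_per_level[level])
    if level == 0:
        estimate += np.mean(model(theta, 0))
    else:
        # the correction term uses the same random inputs on both levels
        estimate += np.mean(model(theta, level) - model(theta, level - 1))

print(f"MLMC estimate of E[P]: {estimate:.4f}")
```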

  1. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. (Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)

  2. Uncertainty Analysis for the Evaluation of a Passive Runway Arresting System

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Marlowe, Jill M.; Yager, Thomas J.

    2009-01-01

    This paper considers the stopping distance of an aircraft involved in a runway overrun incident when the runway has been provided with an extension composed of a material engineered to induce high levels of rolling friction and drag. A formula for stopping distance is derived that is shown to be the product of a known formula for the case of friction without drag, and a dimensionless constant between 0 and 1 that quantifies the further reduction in stopping distance when drag is introduced. This additional quantity, identified as the Drag Reduction Factor, D, is shown to depend on the ratio of drag force to friction force experienced by the aircraft as it enters the overrun area. The specific functional form of D depends on how drag varies with speed. A detailed uncertainty analysis is presented which reveals how the uncertainty in estimates of stopping distance is influenced by experimental error in the force measurements that are acquired in a typical evaluation experiment conducted to assess candidate overrun materials.
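
    A worked numerical example of the decomposition described above, for the simple assumed case of a speed-independent drag force: the friction-only stopping distance v^2 / (2*mu*g) is multiplied by a drag reduction factor D = 1 / (1 + r), where r is the ratio of drag force to friction force at entry. The numbers are illustrative, not taken from the evaluation experiment.

```python
# Stopping distance with and without drag, assuming constant (speed-independent) drag.
g = 9.81            # m/s^2
v = 40.0            # entry speed, m/s
mu = 0.5            # effective rolling-friction coefficient of the overrun material
r = 0.6             # drag force / friction force at entry (assumed constant)

s_friction_only = v ** 2 / (2 * mu * g)
D = 1.0 / (1.0 + r)                 # drag reduction factor, between 0 and 1
s_with_drag = s_friction_only * D

print(f"friction only: {s_friction_only:.1f} m, with drag: {s_with_drag:.1f} m")
```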

  3. On solar geoengineering and climate uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison Project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  4. Impact of a Ground Network of Miniaturized Laser Heterodyne Radiometers (mini-LHRs) on Global Carbon Flux Estimates

    NASA Astrophysics Data System (ADS)

    DiGregorio, A.; Wilson, E. L.; Palmer, P. I.; Mao, J.; Feng, L.

    2017-12-01

    We present the simulated impact of a small (50-instrument) ground network of NASA Goddard Space Flight Center's miniaturized laser heterodyne radiometer (mini-LHR), a small, low-cost (~$50k), portable, and high-precision CH4 and CO2 measuring instrument. Partnered with AERONET as a non-intrusive accessory, the mini-LHR is able to leverage the 500+ instrument AERONET network for rapid network deployment and testing, and simultaneously retrieve co-located aerosol data, an important input for satellite measurements. This observing system simulation experiment (OSSE) uses the 3-D GEOS-Chem chemistry transport model and 50 strategically selected sites to model the flux estimate uncertainty reduction of both TCCON and mini-LHR instruments. We found that 50 mini-LHR sites are capable of reducing global flux uncertainty by up to 70%, with local improvements in the Southern Hemisphere reaching 90%. Our studies show that addition of the mini-LHR to current ground networks would play a major role in the reduction of global carbon flux uncertainty.

  5. Robust H∞ control of active vehicle suspension under non-stationary running

    NASA Astrophysics Data System (ADS)

    Guo, Li-Xin; Zhang, Li-Ping

    2012-12-01

    Due to the complexity of the controlled objects, the selection of control strategies and algorithms in vehicle control system design is an important task. Moreover, the control of automobile active suspensions has become an important research topic because of the constrained nature and parameter uncertainty of the mathematical models involved. In this study, after establishing a non-stationary road surface excitation model, the active suspension control for non-stationary running conditions was studied using robust H∞ control and linear matrix inequality optimization. The dynamic equation of a two-degree-of-freedom quarter-car model with parameter uncertainty was derived. An H∞ state feedback control strategy with time-domain hard constraints was proposed and then used to design the active suspension control system of the quarter-car model. Time-domain analysis and parameter robustness analysis were carried out to evaluate the stability of the proposed controller. Simulation results show that the proposed control strategy has high systemic stability under non-stationary running and parameter uncertainty (including suspension mass, suspension stiffness and tire stiffness). The proposed control strategy achieves a promising improvement in ride comfort and satisfies the requirements on dynamic suspension deflection, dynamic tire loads and required control forces within the given constraints, as well as under the non-stationary running condition.

  6. Sensitivity of land surface modeling to parameters: An uncertainty quantification method applied to the Community Land Model

    NASA Astrophysics Data System (ADS)

    Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and literature survey, and samples were generated with a quasi-Monte Carlo approach (Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (including the slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisCO) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity. This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes or both to determine the full range of sensitivity of Earth system modeling to land-surface parameters. This can facilitate sampling strategies in measurement campaigns targeted at the reduction of climate modeling uncertainties and can also provide guidance on land parameter calibration for simulation optimization.
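
    The sampling step described above can be sketched with SciPy's quasi-Monte Carlo tools: draw a scrambled Sobol sequence and scale it onto uniform prior ranges for a few land parameters. The parameter names and ranges below are illustrative placeholders, not the CLM4.5 priors used in the study.

```python
# Sobol-sequence sampling of uniform prior ranges for an ensemble of model runs.
from scipy.stats import qmc

param_ranges = {
    "slope_conductance_photosynthesis": (4.0, 12.0),
    "specific_leaf_area_top": (0.005, 0.035),
    "leaf_cn_ratio": (20.0, 60.0),
    "frac_leaf_n_rubisco": (0.05, 0.25),
}

sampler = qmc.Sobol(d=len(param_ranges), scramble=True, seed=0)
unit_samples = sampler.random_base2(m=10)        # 2**10 = 1024 parameter sets
lower = [r[0] for r in param_ranges.values()]
upper = [r[1] for r in param_ranges.values()]
samples = qmc.scale(unit_samples, lower, upper)

print(samples.shape)                             # (1024, 4) sets for ensemble runs
```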

  7. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  8. Sense of control under uncertainty depends on people's childhood environment: a life history theory approach.

    PubMed

    Mittal, Chiraag; Griskevicius, Vladas

    2014-10-01

    Past research found that environmental uncertainty leads people to behave differently depending on their childhood environment. For example, economic uncertainty leads people from poor childhoods to become more impulsive while leading people from wealthy childhoods to become less impulsive. Drawing on life history theory, we examine the psychological mechanism driving such diverging responses to uncertainty. Five experiments show that uncertainty alters people's sense of control over the environment. Exposure to uncertainty led people from poorer childhoods to have a significantly lower sense of control than those from wealthier childhoods. In addition, perceptions of control statistically mediated the effect of uncertainty on impulsive behavior. These studies contribute by demonstrating that sense of control is a psychological driver of behaviors associated with fast and slow life history strategies. We discuss the implications of this for theory and future research, including that environmental uncertainty might lead people who grew up poor to quit challenging tasks sooner than people who grew up wealthy. 2014 APA, all rights reserved

  9. An ensemble-ANFIS based uncertainty assessment model for forecasting multi-scalar standardized precipitation index

    NASA Astrophysics Data System (ADS)

    Ali, Mumtaz; Deo, Ravinesh C.; Downs, Nathan J.; Maraseni, Tek

    2018-07-01

    Forecasting drought by means of the World Meteorological Organization-approved Standardized Precipitation Index (SPI) is a fundamental task for supporting socio-economic initiatives and effectively mitigating climate risk. This study aims to develop a robust drought modelling strategy for forecasting multi-scalar SPI in drought-prone regions of Pakistan, in which statistically significant lagged combinations of antecedent SPI are used to forecast future SPI. An ensemble Adaptive Neuro-Fuzzy Inference System ('ensemble-ANFIS') was executed via a 10-fold cross-validation procedure, with each member trained on randomly partitioned input-target data. The resulting 10-member ensemble-ANFIS outputs were judged by mean square error and correlation coefficient in the training period, the optimal forecast was obtained by averaging the member simulations, and the model was benchmarked against the M5 Model Tree and Minimax Probability Machine Regression (MPMR). The results show that the proposed ensemble-ANFIS model was notably more precise (in terms of root mean square and mean absolute error, as well as Willmott's, Nash-Sutcliffe and Legates-McCabe's indices) for the 6- and 12-month forecasts than for the 3-month forecasts, as verified by the largest proportion of errors registering in the smallest error band. Using the 10-member simulations, the ensemble-ANFIS model was validated for its ability to forecast the severity (S), duration (D) and intensity (I) of drought, including the associated error bounds. This enabled uncertainty between the multiple models to be rationalized more efficiently, reducing forecast error caused by stochasticity in drought behaviour. Through cross-validation at diverse sites, a geographic signature in the modelled uncertainties was also identified. Considering the performance of the ensemble-ANFIS approach and its ability to generate uncertainty-based information, the study advocates a multi-model approach for drought-risk forecasting and highlights its importance for estimating drought properties with confidence intervals to better inform strategic decision-making.
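
The ensemble-averaging idea can be sketched as follows: build several members from random re-partitions of the input-target data, average their forecasts, and use the member spread as a rough error bound. A least-squares linear model stands in for ANFIS here purely for illustration, and the SPI-like series and lag structure are synthetic assumptions.

```python
# Sketch of the ensemble idea: build 10 members by re-partitioning the
# input-target data at random, train each member, and average the forecasts.
# A least-squares linear model stands in for ANFIS purely for illustration;
# the lag-1..3 SPI predictors and the data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
spi = rng.standard_normal(500).cumsum() * 0.1          # synthetic SPI-like series
lags = np.column_stack([spi[2:-1], spi[1:-2], spi[:-3]])
target = spi[3:]

def fit_member(X, y):
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([X, np.ones(len(X))]) @ coef

n = len(target)
members = []
for k in range(10):                                    # 10-member ensemble
    idx = rng.permutation(n)[: int(0.9 * n)]           # random 90% training split
    members.append(fit_member(lags[idx], target[idx]))

forecast = np.mean([predict(c, lags) for c in members], axis=0)
spread = np.std([predict(c, lags) for c in members], axis=0)  # crude error bound
print(f"RMSE of averaged forecast: {np.sqrt(np.mean((forecast - target) ** 2)):.3f}")
```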

  10. Optimal control, investment and utilization schemes for energy storage under uncertainty

    NASA Astrophysics Data System (ADS)

    Mirhosseini, Niloufar Sadat

    Energy storage has the potential to offer new means for added flexibility in electricity systems. This flexibility can be used in a number of ways, including adding value towards asset management, power quality and reliability, integration of renewable resources and energy bill savings for the end users. However, uncertainty about system states and volatility in system dynamics can complicate the question of when to invest in energy storage and how best to manage and utilize it. This work proposes models to address different problems associated with energy storage within a microgrid, including optimal control, investment, and utilization. Electric load, renewable resource output, storage technology cost and electricity day-ahead and spot prices are the factors that bring uncertainty to the problem. A number of analytical methodologies have been adopted to develop the aforementioned models. Model Predictive Control and discretized dynamic programming, along with a new decomposition algorithm, are used to develop optimal control schemes for energy storage for two different levels of renewable penetration. Real option theory and Monte Carlo simulation, coupled with an optimal control approach, are used to obtain optimal incremental investment decisions, considering multiple sources of uncertainty. Two-stage stochastic programming is used to develop a novel and holistic methodology, including utilization of energy storage within a microgrid, in order to interact optimally with the energy market. Energy storage can contribute in terms of value generation and risk reduction for the microgrid. The integration of the models developed here is the basis for a framework which extends from long-term investments in storage capacity to short-term operational control (charge/discharge) of storage within a microgrid. In particular, the following practical goals are achieved: (i) optimal investment in storage capacity over time to maximize savings during normal and emergency operations; (ii) optimal market strategy of buy and sell over 24-hour periods; (iii) optimal storage charge and discharge in much shorter time intervals.
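
For the optimal-control piece, a minimal sketch of discretized dynamic programming for storage charge/discharge against a known price curve is given below. The prices, storage size, and efficiency are assumptions made here for illustration, not values from the dissertation.

```python
# Minimal sketch of optimal storage charge/discharge by discretized dynamic
# programming against a known day-ahead price curve. The 24-h prices, storage
# size, and efficiency are illustrative assumptions, not values from the work.
import numpy as np

prices = 40 + 15 * np.sin(np.linspace(0, 2 * np.pi, 24))   # $/MWh, assumed
levels = np.linspace(0.0, 10.0, 21)                        # state-of-charge grid (MWh)
max_step = 2.0                                             # MWh per hour charge/discharge
eta = 0.9                                                  # round-trip efficiency (assumed)

n_t, n_s = len(prices), len(levels)
value = np.zeros((n_t + 1, n_s))                            # terminal value = 0
policy = np.zeros((n_t, n_s), dtype=int)

for t in range(n_t - 1, -1, -1):
    for i, soc in enumerate(levels):
        best, best_j = -np.inf, i
        for j, nxt in enumerate(levels):
            delta = nxt - soc
            if abs(delta) > max_step:
                continue
            # discharging (delta < 0) earns price*eta*|delta|; charging costs price*delta
            cash = prices[t] * (-delta * eta if delta < 0 else -delta)
            cand = cash + value[t + 1, j]
            if cand > best:
                best, best_j = cand, j
        value[t, i], policy[t, i] = best, best_j

print(f"Optimal profit starting empty: ${value[0, 0]:.1f}")
```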

  11. What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California

    NASA Astrophysics Data System (ADS)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

    2013-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece and Istanbul. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system will be determined by the speed and reliability of earthquake source parameter estimates. A thorough understanding of both is therefore essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing time delays introduced by the different components in the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm we further present an improved way to describe the uncertainties of every magnitude estimate by evaluating the width and shape of the probability density function that describes the relationship between waveform envelope amplitudes and magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static definitions of uncertainty based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where most effective improvements to the VS EEW system can be made.

  12. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for a variety of analyses in which there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
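
A minimal sketch of the variable-grid idea, independent of Hadoop or ESRI tooling, is shown below: a cell is subdivided only where sample density supports it, and each output cell carries a value plus an uncertainty proxy. The thresholds, the quadtree-style subdivision, and the synthetic data are assumptions for illustration, not the VGM implementation itself.

```python
# Sketch of the variable-grid idea: subdivide a coarse cell only where sample
# density supports it, and attribute each output cell with a value plus an
# uncertainty proxy (here, sample variance). Thresholds and data are made up.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.random((500, 2))                       # x, y in [0, 1)
vals = np.sin(6 * pts[:, 0]) + 0.1 * rng.standard_normal(500)

def variable_grid(pts, vals, x0, y0, size, min_count=20, min_size=0.125):
    """Recursively split a square cell while it holds enough samples."""
    inside = ((pts[:, 0] >= x0) & (pts[:, 0] < x0 + size) &
              (pts[:, 1] >= y0) & (pts[:, 1] < y0 + size))
    n = inside.sum()
    if n >= min_count and size > min_size:
        half = size / 2
        cells = []
        for dx in (0.0, half):
            for dy in (0.0, half):
                cells += variable_grid(pts, vals, x0 + dx, y0 + dy, half,
                                       min_count, min_size)
        return cells
    mean = vals[inside].mean() if n else np.nan
    var = vals[inside].var() if n > 1 else np.nan   # uncertainty proxy
    return [(x0, y0, size, n, mean, var)]

cells = variable_grid(pts, vals, 0.0, 0.0, 1.0)
print(f"{len(cells)} variable-resolution cells")
```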

  13. A design methodology for nonlinear systems containing parameter uncertainty

    NASA Technical Reports Server (NTRS)

    Young, G. E.; Auslander, D. M.

    1983-01-01

    In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter-space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of the j adjustable parameter values that maximizes the probability that the performance indices simultaneously satisfy the design criteria, in spite of the uncertainty due to the k nonadjustable parameters.
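
A toy sketch of this idea follows: an adaptive random search over the adjustable parameters, with the objective estimated by Monte Carlo sampling of the nonadjustable parameters as the probability that all design criteria are met. The system, criteria, and distributions are invented for illustration only.

```python
# Sketch of the idea: search over the j adjustable parameters while estimating,
# by Monte Carlo over the k nonadjustable (uncertain) parameters, the probability
# that all performance criteria are met. The toy system and criteria are invented.
import numpy as np

rng = np.random.default_rng(2)

def performance(adjust, nonadjust):
    """Toy performance indices of a nonlinear system (illustrative only)."""
    overshoot = abs(np.sin(adjust[0]) * nonadjust[0]) + 0.1 * adjust[1] ** 2
    settle = nonadjust[1] / (0.5 + adjust[1])
    return overshoot, settle

def success_probability(adjust, n_mc=300):
    ok = 0
    for _ in range(n_mc):
        nonadjust = rng.normal([1.0, 2.0], [0.2, 0.5])   # uncertain parameters
        overshoot, settle = performance(adjust, nonadjust)
        ok += (overshoot < 0.8) and (settle < 2.5)        # design criteria
    return ok / n_mc

best, best_p, radius = np.array([0.5, 1.0]), 0.0, 1.0
for it in range(200):                                      # adaptive random search
    cand = best + radius * rng.standard_normal(2)
    p = success_probability(cand)
    if p > best_p:
        best, best_p = cand, p
        radius = min(1.0, radius * 1.2)                    # expand on success
    else:
        radius = max(0.05, radius * 0.95)                  # shrink on failure
print(f"best adjustable parameters {best.round(2)}, P(success) ~ {best_p:.2f}")
```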

  14. Bioterrorism Preparedness in Public Health: Knowledge Needs for Robust Transformations

    ERIC Educational Resources Information Center

    Ipe, Minu

    2007-01-01

    The typical response of organizations dealing with external uncertainty is to develop strategies to adapt to the situation and focus on regaining a stable state. A crucial element of responding successfully to external uncertainties is to identify changes in knowledge needs within core organizational processes. This paper discusses the changing…

  15. Uncertainty in nutrient loads from tile drained landscapes: Effect of sampling frequency, calculation algorithm, and compositing strategies

    USDA-ARS?s Scientific Manuscript database

    Accurate estimates of annual nutrient loads are required to evaluate trends in water quality following changes in land use or management and to calibrate and validate water quality models. While much emphasis has been placed on understanding the uncertainty of watershed-scale nutrient load estimates...

  16. Menu Labeling as a Potential Strategy for Combating the Obesity Epidemic: A Health Impact Assessment

    PubMed Central

    Jarosz, Christopher J.; Simon, Paul; Fielding, Jonathan E.

    2009-01-01

    Objectives. We conducted a health impact assessment to quantify the potential impact of a state menu-labeling law on population weight gain in Los Angeles County, California. Methods. We utilized published and unpublished data to model consumer response to point-of-purchase calorie postings at large chain restaurants in Los Angeles County. We conducted sensitivity analyses to account for uncertainty in consumer response and in the total annual revenue, market share, and average meal price of large chain restaurants in the county. Results. Assuming that 10% of the restaurant patrons would order reduced-calorie meals in response to calorie postings, resulting in an average reduction of 100 calories per meal, we estimated that menu labeling would avert 40.6% of the 6.75 million pound average annual weight gain in the county population aged 5 years and older. Substantially larger impacts would be realized if higher percentages of patrons ordered reduced-calorie meals or if average per-meal calorie reductions increased. Conclusions. Our findings suggest that mandated menu labeling could have a sizable salutary impact on the obesity epidemic, even with only modest changes in consumer behavior. PMID:19608944
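
A back-of-envelope sketch of the kind of calculation behind such an estimate is shown below; the annual meal count and the 3,500 kcal-per-pound conversion are assumptions made here for illustration, not the study's actual model inputs.

```python
# Back-of-envelope sketch of the kind of calculation behind the estimate:
# calories averted at chain restaurants translated into pounds of weight gain
# averted. The meal count and the 3,500 kcal-per-pound rule are assumptions
# made here for illustration; they are not the study's actual inputs.
responding_share = 0.10          # 10% of patrons order reduced-calorie meals
calories_per_meal = 100          # average reduction per responding meal
annual_meals = 950e6             # hypothetical chain-restaurant meals per year
kcal_per_pound = 3500            # conventional conversion

pounds_averted = responding_share * calories_per_meal * annual_meals / kcal_per_pound
baseline_gain = 6.75e6           # average annual weight gain (pounds), from the abstract
print(f"~{pounds_averted/1e6:.1f} million lb averted "
      f"({100 * pounds_averted / baseline_gain:.0f}% of baseline gain)")
```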

  17. Economic evaluation of neonatal care packages in a cluster-randomized controlled trial in Sylhet, Bangladesh.

    PubMed

    LeFevre, Amnesty E; Shillcutt, Samuel D; Waters, Hugh R; Haider, Sabbir; El Arifeen, Shams; Mannan, Ishtiaq; Seraji, Habibur R; Shah, Rasheduzzaman; Darmstadt, Gary L; Wall, Steve N; Williams, Emma K; Black, Robert E; Santosham, Mathuram; Baqui, Abdullah H

    2013-10-01

    To evaluate and compare the cost-effectiveness of two strategies for neonatal care in Sylhet division, Bangladesh. In a cluster-randomized controlled trial, two strategies for neonatal care--known as home care and community care--were compared with existing services. For each study arm, economic costs were estimated from a societal perspective, inclusive of programme costs, provider costs and household out-of-pocket payments on care-seeking. Neonatal mortality in each study arm was determined through household surveys. The incremental cost-effectiveness of each strategy--compared with that of the pre-existing levels of maternal and neonatal care--was then estimated. The levels of uncertainty in our estimates were quantified through probabilistic sensitivity analysis. The incremental programme costs of implementing the home-care package were 2939 (95% confidence interval, CI: 1833-7616) United States dollars (US$) per neonatal death averted and US$ 103.49 (95% CI: 64.72-265.93) per disability-adjusted life year (DALY) averted. The corresponding total societal costs were US$ 2971 (95% CI: 1844-7628) and US$ 104.62 (95% CI: 65.15-266.60), respectively. The home-care package was cost-effective--with 95% certainty--if healthy life years were valued above US$ 214 per DALY averted. In contrast, implementation of the community-care strategy led to no reduction in neonatal mortality and did not appear to be cost-effective. The home-care package represents a highly cost-effective intervention strategy that should be considered for replication and scale-up in Bangladesh and similar settings elsewhere.

  18. Cigarette Price Minimization Strategies in the United States: Price Reductions and Responsiveness to Excise Taxes

    PubMed Central

    2013-01-01

    Introduction: Because cigarette price minimization strategies can provide substantial price reductions for individuals continuing their usual smoking behaviors following federal and state cigarette excise tax increases, we examined independent price reductions compensating for overlapping strategies. The possible availability of larger independent price reduction opportunities in states with higher cigarette excise taxes is explored. Methods: Regression analysis used the 2006–2007 Tobacco Use Supplement of the Current Population Survey (N = 26,826) to explore national and state-level independent price reductions that smokers obtained from purchasing cigarettes (a) by the carton, (b) in a state with a lower average after-tax cigarette price than in the state of residence, and (c) in “some other way,” including online or in another country. Price reductions from these strategies are estimated jointly to compensate for known overlapping strategies. Results: Each strategy reduced the price of cigarettes by 64–94 cents per pack. These price reductions are 9%–22% lower than conventionally estimated results not compensating for overlapping strategies. Price reductions vary substantially by state. Following cigarette excise tax increases, the price reduction available from purchasing cigarettes by cartons increased. Additionally, the price reduction from purchasing cigarettes in a state with a lower average after-tax cigarette price is positively associated with state cigarette excise tax rates and border state cigarette excise tax rate differentials. Conclusions: Findings from this large, nationally representative study of cigarette smokers suggest that price reductions are larger in states with higher cigarette excise taxes, and increase as cigarette excise taxes rise. PMID:23729501

  19. Balancing certainty and uncertainty in clinical medicine.

    PubMed

    Hayward, Richard

    2006-01-01

    Nothing in clinical medicine is one hundred per cent certain. Part of a doctor's education involves learning how to cope with the anxiety that uncertainty in decisions affecting life and death inevitably produces. This paper examines: (1) the role of anxiety -- both rational and irrational -- in the provision of health care; (2) the effects of uncertainty upon the doctor-patient relationship; (3) the threat uncertainty poses to medical authority (and the assumption of infallibility that props it up); (4) the contribution of clinical uncertainty to the rising popularity of alternative therapies; and (5) the clash between the medical and the legal understanding of how certainty should be defined, particularly as it affects the paediatric community. It concludes by suggesting some strategies that might facilitate successful navigation between the opposing and ever-present forces of certainty and uncertainty.

  20. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information about wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to the averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.
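
The multi-Doppler retrieval itself can be sketched with a small least-squares example: each lidar contributes only a line-of-sight (radial) velocity, and the wind vector is recovered from several viewing directions. The beam geometry, noise level, and true wind below are invented for illustration.

```python
# Sketch of a multi-Doppler retrieval: each lidar measures only the line-of-sight
# (radial) component, so the wind vector is recovered by least squares from
# several viewing directions. Geometry and the true wind are invented here.
import numpy as np

rng = np.random.default_rng(3)
true_wind = np.array([8.0, 3.0, 0.2])                    # u, v, w (m/s), assumed

# unit pointing vectors for a few beams (azimuth, elevation in degrees)
beams = [(0, 5), (90, 5), (180, 60), (270, 30), (45, 75)]
A = []
for az, el in beams:
    az, el = np.radians(az), np.radians(el)
    A.append([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)])
A = np.array(A)

vr = A @ true_wind + 0.2 * rng.standard_normal(len(beams))   # noisy radial velocities
est, res, *_ = np.linalg.lstsq(A, vr, rcond=None)

# formal uncertainty from the beam geometry (covariance of the LS estimate)
sigma_vr = 0.2
cov = sigma_vr ** 2 * np.linalg.inv(A.T @ A)
print("estimated (u, v, w):", est.round(2))
print("1-sigma uncertainty:", np.sqrt(np.diag(cov)).round(2))
```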

  1. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay

    PubMed Central

    Oddo, Perry C.; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies. PMID:28350884
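
A minimal Monte Carlo sketch of the effect described above combines a storm-surge distribution with either the mean SLR projection or a full SLR distribution, then compares the resulting 100-yr flood heights. All distribution parameters are invented for illustration and are not the study's values.

```python
# Sketch of how SLR uncertainty changes a return level: combine a storm-surge
# distribution with either the mean SLR projection or the full SLR distribution
# and compare the 100-yr (1% annual exceedance) flood height. All distribution
# parameters below are invented for illustration, not the study's values.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# annual-maximum surge ~ Gumbel (location/scale assumed)
surge = rng.gumbel(loc=1.0, scale=0.25, size=n)

slr_mean = 0.9                                            # central 2100 SLR (m), assumed
slr_draws = rng.lognormal(mean=np.log(slr_mean), sigma=0.4, size=n)  # uncertain SLR

flood_mean_slr = surge + slr_mean
flood_uncertain_slr = surge + slr_draws

def return_level(x, period=100):
    return np.quantile(x, 1 - 1 / period)

rl_mean = return_level(flood_mean_slr)
rl_unc = return_level(flood_uncertain_slr)
print(f"100-yr flood height, mean SLR:      {rl_mean:.2f} m")
print(f"100-yr flood height, uncertain SLR: {rl_unc:.2f} m")

# return period of the 'mean-SLR' 100-yr height once SLR uncertainty is included
p_exceed = np.mean(flood_uncertain_slr > rl_mean)
print(f"return period of {rl_mean:.2f} m under uncertainty: {1 / p_exceed:.0f} yr")
```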

  2. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    PubMed

    Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.

  3. Essays on the comparison of climate change policies: Land use regulations, taxes, and tradable permits

    NASA Astrophysics Data System (ADS)

    Heres Del Valle, David R.

    The California Global Warming Solutions Act of 2006 requires year 2020 greenhouse gas (GHG) emissions in the state to be reduced back to 1990 levels. Several mitigation strategies have been explored and are expected to be implemented over the next few years. Among others, land use policies have been advocated as an important means to curb GHG emissions through the reduction of vehicle miles traveled (VMT), while an economy-wide cap and trade system would ensure that a certain level of GHG reductions is achieved although at unknown costs. The first essay of this dissertation aims to contribute to the ongoing discussion over the impact of land use policies by implementing a modified two-part model (M2PM) with instrumental variables (IV), a procedure that takes into account, respectively, the large mass of observations with zero car travel and the possibility of residential self-selection, both of which could otherwise bias the estimates. The analysis takes advantage of a large dataset on travel patterns and socio-economic characteristics of more than 7,000 households across the 58 counties in the state of California. Results show that although VMT elasticities with respect to residential density are larger than others found in the recent econometric literature, the actual impact of residential density on VMT would not be as large unless very large increases in residential density occur. On the other hand, recent estimates of the elasticity of VMT with respect to the price of gasoline imply that moderate increases in the price of gasoline would suffice to reduce travel by similar magnitudes. The second essay reconsiders the debate over quantity (e.g., tradable permits) and price (e.g., taxes) controls by introducing uncertainty in the damage from the externality under a controlled environment. Economic theory predicts that quantity and price instruments for the control of externalities will produce identical outcomes as long as certain conditions obtain - namely negligible transaction costs and certainty about marginal control costs. This theoretical prediction explicitly renders irrelevant any uncertainties regarding the marginal damages in determining the market equilibrium outcome. Uncertainty about marginal damages may be important in practice, however, due to citizen participation in the permit market or to behavioral considerations. Through a laboratory experiment, the instruments' equivalence is tested under different environments (including uncertainty about the marginal damages) that comply with the mentioned conditions. Results from the comparative analysis of a tax and a tradable permit system in a market composed of individuals with heterogeneous marginal abatement costs lend support to the equivalence of instruments.
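
A bare-bones sketch of the two-part model structure, without the instrumental-variables correction for residential self-selection used in the essay, is shown below using simulated data: a logit for whether a household travels by car at all, and a log-linear regression for VMT among those that do.

```python
# Sketch of a two-part model for household VMT: a logit for whether a household
# drives at all, then a log-linear regression for VMT among drivers. The
# instrumental-variables correction for residential self-selection used in the
# essay is omitted here; data and coefficients are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
log_density = rng.normal(3.0, 1.0, n)                 # residential density (log)
income = rng.normal(0.0, 1.0, n)

# simulate zero-car-travel households and positive VMT
p_drive = 1 / (1 + np.exp(-(1.5 - 0.3 * log_density + 0.4 * income)))
drives = rng.random(n) < p_drive
log_vmt = 3.0 - 0.15 * log_density + 0.25 * income + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([log_density, income]))
part1 = sm.Logit(drives.astype(float), X).fit(disp=0)          # P(any car travel)
part2 = sm.OLS(log_vmt[drives], X[drives]).fit()               # VMT | travel > 0

print("logit coefficients:", part1.params.round(3))
print("log-VMT density elasticity (drivers):", round(part2.params[1], 3))
```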

  4. A safety rule approach to surveillance and eradication of biological invasions

    PubMed Central

    Haight, Robert G.; Koch, Frank H.; Venette, Robert; Studens, Kala; Fournier, Ronald E.; Swystun, Tom; Turgeon, Jean J.

    2017-01-01

    Uncertainty about future spread of invasive organisms hinders planning of effective response measures. We present a two-stage scenario optimization model that accounts for uncertainty about the spread of an invader, and determines survey and eradication strategies that minimize the expected program cost subject to a safety rule for eradication success. The safety rule includes a risk standard for the desired probability of eradication in each invasion scenario. Because the risk standard may not be attainable in every scenario, the safety rule defines a minimum proportion of scenarios with successful eradication. We apply the model to the problem of allocating resources to survey and eradicate the Asian longhorned beetle (ALB, Anoplophora glabripennis) after its discovery in the Greater Toronto Area, Ontario, Canada. We use historical data on ALB spread to generate a set of plausible invasion scenarios that characterizes the uncertainty of the beetle’s extent. We use these scenarios in the model to find survey and tree removal strategies that minimize the expected program cost while satisfying the safety rule. We also identify strategies that reduce the risk of very high program costs. Our results reveal two alternative strategies: (i) delimiting surveys and subsequent tree removal based on the surveys' outcomes, or (ii) preventive host tree removal without referring to delimiting surveys. The second strategy is more likely to meet the stated objectives when the capacity to detect an invader is low or the aspirations to eradicate it are high. Our results provide practical guidelines to identify the best management strategy given aspirational targets for eradication and spending. PMID:28759584
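
The safety-rule logic can be sketched with a toy scenario optimization: choose the cheapest management radius whose probability of eradication meets a risk standard in at least a required share of invasion scenarios. The scenario distribution, costs, and detection assumptions are invented, and in this toy the program cost does not vary across scenarios.

```python
# Toy sketch of the safety-rule idea: choose a survey/removal radius that
# minimizes cost while eradication succeeds (with probability above a risk
# standard) in at least a required share of invasion scenarios. Spread
# scenarios, costs, and detection probabilities are all invented.
import numpy as np

rng = np.random.default_rng(6)
scenarios = rng.gamma(shape=2.0, scale=1.5, size=200)     # true invasion radius (km)
candidate_radii = np.linspace(1.0, 10.0, 19)              # management strategies

survey_cost_per_km2 = 5.0
removal_cost_per_km2 = 20.0
risk_standard = 0.95                                       # desired P(eradication) per scenario
min_share = 0.90                                           # share of scenarios meeting it

best = None
for r in candidate_radii:
    area = np.pi * r ** 2
    cost = (survey_cost_per_km2 + removal_cost_per_km2) * area
    # P(eradication) in a scenario: high if the treated radius covers the spread
    p_erad = np.where(r >= scenarios, 0.99, 0.99 * np.exp(-(scenarios - r)))
    share_ok = np.mean(p_erad >= risk_standard)
    if share_ok >= min_share and (best is None or cost < best[1]):
        best = (r, cost, share_ok)

r, cost, share_ok = best
print(f"radius {r:.1f} km, program cost {cost:.0f}, "
      f"{100 * share_ok:.0f}% of scenarios meet the risk standard")
```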

  5. A Conceptual Model for Developing Mindsets for Strategic Insight under Conditions of Complexity and High Uncertainty

    ERIC Educational Resources Information Center

    Yorks, Lyle; Nicolaides, Aliki

    2012-01-01

    This article addresses an important, yet often underattended to, aspect of the strategy development process: fostering the use of strategic learning practices in the simultaneous practice of developing strategy and cultivating strategic mindset awareness. The need for addressing this aspect of the strategy development process is increasingly…

  6. Spanish Interdisciplinary Committee for Cardiovascular Disease Prevention and the Spanish Society of Cardiology Position Statement on Dyslipidemia Management: differences between the European and American Guidelines.

    PubMed

    Lobos Bejarano, José María; Galve, Enrique; Royo-Bordonada, Miguel Ángel; Alegría Ezquerra, Eduardo; Armario, Pedro; Brotons Cuixart, Carlos; Camafort Babkowski, Miguel; Cordero Fort, Alberto; Maiques Galán, Antonio; Mantilla Morató, Teresa; Pérez Pérez, Antonio; Pedro-Botet, Juan; Villar Álvarez, Fernando; González-Juanatey, José Ramón

    2015-01-01

    The publication of the 2013 American College of Cardiology/American Heart Association guidelines on the treatment of high blood cholesterol has had a strong impact due to the paradigm shift in its recommendations. The Spanish Interdisciplinary Committee for Cardiovascular Disease Prevention and the Spanish Society of Cardiology reviewed this guideline and compared it with current European guidelines on cardiovascular prevention and dyslipidemia management. The most striking aspect of the American guideline is the elimination of the low-density lipoprotein cholesterol treat-to-target strategy and the adoption of a risk reduction strategy in 4 major statin benefit groups. In patients with established cardiovascular disease, both guidelines recommend a similar therapeutic strategy (high-dose potent statins). However, in primary prevention, the application of the American guidelines would substantially increase the number of persons, particularly older people, receiving statin therapy. The elimination of the cholesterol treat-to-target strategy, so strongly rooted in the scientific community, could have a negative impact on clinical practice, create a certain amount of confusion and uncertainty among professionals, and decrease follow-up and patient adherence. Thus, this article reaffirms the recommendations of the European guidelines. Although both guidelines have positive aspects, doubt remains regarding the concerns outlined above. In addition to using risk charts based on the native population, the messages of the European guideline are more appropriate to the Spanish setting and avoid the possible risk of overtreatment with statins in primary prevention.

  7. Potential Cardiovascular and Total Mortality Benefits of Air Pollution Control in Urban China.

    PubMed

    Huang, Chen; Moran, Andrew E; Coxson, Pamela G; Yang, Xueli; Liu, Fangchao; Cao, Jie; Chen, Kai; Wang, Miao; He, Jiang; Goldman, Lee; Zhao, Dong; Kinney, Patrick L; Gu, Dongfeng

    2017-10-24

    Outdoor air pollution ranks fourth among preventable causes of China's burden of disease. We hypothesized that the magnitude of health gains from air quality improvement in urban China could compare with achieving recommended blood pressure or smoking control goals. The Cardiovascular Disease Policy Model-China projected coronary heart disease, stroke, and all-cause deaths in urban Chinese adults 35 to 84 years of age from 2017 to 2030 if recent air quality (particulate matter with aerodynamic diameter ≤2.5 µm, PM2.5) and traditional cardiovascular risk factor trends continue. We projected life-years gained if urban China were to reach 1 of 3 air quality goals: Beijing Olympic Games level (mean PM2.5, 55 μg/m³), China Class II standard (35 μg/m³), or World Health Organization standard (10 μg/m³). We compared projected air pollution reduction control benefits with potential benefits of reaching World Health Organization hypertension and tobacco control goals. Mean PM2.5 reduction to Beijing Olympic levels by 2030 would gain ≈241,000 (95% uncertainty interval, 189,000-293,000) life-years annually. Achieving either the China Class II or World Health Organization PM2.5 standard would yield greater health benefits (992,000 [95% uncertainty interval, 790,000-1,180,000] or 1,827,000 [95% uncertainty interval, 1,481,000-2,129,000] annual life-years gained, respectively) than World Health Organization-recommended goals of 25% improvement in systolic hypertension control and 30% reduction in smoking combined (928,000 [95% uncertainty interval, 830,000-1,033,000] life-years). Air quality improvement in different scenarios could lead to graded health benefits ranging from 241,000 life-years gained to much greater benefits equal to or greater than the combined benefits of 25% improvement in systolic hypertension control and 30% smoking reduction. © 2017 American Heart Association, Inc.

  8. Bookending the Opportunity to Lower Wind’s LCOE by Reducing the Uncertainty Surrounding Annual Energy Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark

    Reducing the performance risk surrounding a wind project can potentially lead to a lower weighted-average cost of capital (WACC), and hence a lower levelized cost of energy (LCOE), through an advantageous shift in capital structure, and possibly also a reduction in the cost of capital. Specifically, a reduction in performance risk will move the 1-year P99 annual energy production (AEP) estimate closer to the P50 AEP estimate, which in turn reduces the minimum debt service coverage ratio (DSCR) required by lenders, thereby allowing the project to be financed with a greater proportion of low-cost debt. In addition, a reduction in performance risk might also reduce the cost of one or more of the three sources of capital that are commonly used to finance wind projects: sponsor or cash equity, tax equity, and/or debt. Preliminary internal LBNL analysis of the maximum possible LCOE reduction attainable from reducing the performance risk of a wind project found a potentially significant opportunity for LCOE reduction of ~$10/MWh, by reducing the P50 DSCR to its theoretical minimum value of 1.0 (Bolinger 2015b, 2014) and by reducing the cost of sponsor equity and debt by one-third to one-half each (Bolinger 2015a, 2015b). However, with FY17 funding from the U.S. Department of Energy’s Atmosphere to Electrons (A2e) Performance Risk, Uncertainty, and Finance (PRUF) initiative, LBNL has been revisiting this “bookending” exercise in more depth, and now believes that its earlier preliminary assessment of the LCOE reduction opportunity was overstated. This reassessment is based on two new-found understandings: (1) Due to ever-present and largely irreducible inter-annual variability (IAV) in the wind resource, the minimum required DSCR cannot possibly fall to 1.0 (on a P50 basis), and (2) A reduction in AEP uncertainty will not necessarily lead to a reduction in the cost of capital, meaning that a shift in capital structure is perhaps the best that can be expected (perhaps along with a modest decline in the cost of cash equity as new investors enter the market).
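
A rough sketch of the financing mechanism, under stated assumptions only: a tighter P99/P50 spread lowers the required DSCR, which allows more low-cost debt, a lower WACC, and hence a lower LCOE. Every number below is illustrative and not from the LBNL analysis.

```python
# Rough sketch of the financing mechanism described: a tighter P99/P50 spread
# lowers the required debt service coverage ratio (DSCR), allowing more cheap
# debt and a lower WACC, which feeds into LCOE. Every number here is an
# assumption for illustration, not from the LBNL analysis.
def wacc(debt_frac, cost_debt=0.045, cost_equity=0.10):
    return debt_frac * cost_debt + (1 - debt_frac) * cost_equity

def lcoe(rate, capex_per_kw=1500.0, cf=0.40, opex_per_mwh=12.0, life=25):
    # capital recovery factor times capex, spread over annual energy
    crf = rate * (1 + rate) ** life / ((1 + rate) ** life - 1)
    annual_mwh_per_kw = 8760 * cf / 1000
    return crf * capex_per_kw / annual_mwh_per_kw + opex_per_mwh

def debt_fraction(dscr, p99_over_p50):
    # lenders size debt so that P99-year revenue still covers debt service
    return min(0.75, p99_over_p50 / dscr)   # crude cap on leverage

for label, dscr, ratio in [("today", 1.40, 0.85), ("reduced AEP uncertainty", 1.20, 0.92)]:
    d = debt_fraction(dscr, ratio)
    print(f"{label}: debt {d:.0%}, WACC {wacc(d):.2%}, LCOE ${lcoe(wacc(d)):.1f}/MWh")
```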

  9. GUM-compliant uncertainty propagations for Pu and U concentration measurements using the 1st-prototype XOS/LANL hiRX instrument; an SRNL H-Canyon Test Bed performance evaluation project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Michael K.; O'Rourke, Patrick E.

    An SRNL H-Canyon Test Bed performance evaluation project was completed jointly by SRNL and LANL on a prototype monochromatic energy dispersive x-ray fluorescence instrument, the hiRX. A series of uncertainty propagations were generated based upon plutonium and uranium measurements performed using the alpha-prototype hiRX instrument. Data reduction and uncertainty modeling provided in this report were performed by the SRNL authors. Observations and lessons learned from this evaluation were also used to predict the expected uncertainties that should be achievable at multiple plutonium and uranium concentration levels provided instrument hardware and software upgrades being recommended by LANL and SRNL are performed.
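
A minimal GUM-style propagation sketch is shown below: relative standard uncertainties of the input quantities are combined in quadrature, weighted by sensitivity coefficients, and expanded with a coverage factor. The uncertainty budget entries and values are illustrative, not the hiRX instrument's actual budget.

```python
# Minimal GUM-style propagation sketch: combine standard uncertainties of the
# input quantities in quadrature, weighted by sensitivity coefficients, to get
# a combined and expanded uncertainty for a concentration result. The inputs
# and values are illustrative, not the hiRX instrument's actual budget.
import numpy as np

# (name, relative standard uncertainty, sensitivity coefficient)
budget = [
    ("calibration standard", 0.005, 1.0),
    ("peak-area counting statistics", 0.008, 1.0),
    ("sample positioning / geometry", 0.004, 1.0),
    ("matrix / interference correction", 0.006, 1.0),
]

u_combined = np.sqrt(sum((c * u) ** 2 for _, u, c in budget))
k = 2.0                                      # coverage factor for ~95% confidence
print(f"combined relative standard uncertainty: {100 * u_combined:.2f}%")
print(f"expanded uncertainty (k=2):             {100 * k * u_combined:.2f}%")
```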

  10. Climate change adaptation and Integrated Water Resource Management in the water sector

    NASA Astrophysics Data System (ADS)

    Ludwig, Fulco; van Slobbe, Erik; Cofino, Wim

    2014-10-01

    Integrated Water Resources Management (IWRM) was introduced in the 1980s to better optimise water use between different water-demanding sectors. However, since it was introduced, water systems have become more complicated due to changes in the global water cycle as a result of climate change. The realization that climate change will have a significant impact on water availability and flood risks has driven research and policy making on adaptation. This paper discusses the main similarities and differences between climate change adaptation and IWRM. The main difference between the two is IWRM's focus on current and historic issues compared to the (long-term) future focus of adaptation. One of the main problems of implementing climate change adaptation is the large uncertainties in future projections. Two completely different approaches to adaptation have been developed in response to these large uncertainties. The first is a top-down approach based on large-scale biophysical impact analyses, which focuses on quantifying and minimizing uncertainty by using a large range of scenarios and different climate and impact models. The main problem with this approach is the propagation of uncertainties through the modelling chain. The opposite is the bottom-up approach, which largely ignores uncertainty and focuses on reducing vulnerabilities, often at the local scale, by developing resilient water systems. Both approaches, however, are difficult to integrate into water management. The bottom-up approach focuses too much on socio-economic vulnerability and too little on developing (technical) solutions, while the top-down approach often results in an “explosion” of uncertainty and therefore complicates decision making. A more promising direction for adaptation would be a risk-based approach. Future research should further develop and test an approach that starts by developing adaptation strategies based on current and future risks. These strategies should then be evaluated against a range of future scenarios in order to develop robust adaptation measures and strategies.

  11. The Condition for Generous Trust

    PubMed Central

    Shinya, Obayashi; Yusuke, Inagaki; Hiroki, Takikawa

    2016-01-01

    Trust has been considered the “cement” of a society and is much studied in sociology and other social sciences. Most studies, however, have neglected one important aspect of trust: it involves an act of forgiving and showing tolerance toward another’s failure. In this study, we refer to this concept as “generous trust” and examine the conditions under which generous trust becomes a more viable option when compared to other types of trust. We investigate two settings. First, we introduce two types of uncertainties: uncertainty as to whether trustees have the intention to cooperate, and uncertainty as to whether trustees have enough competence to accomplish the entrusted tasks. Second, we examine the manner in which trust functions in a broader social context, one that involves matching and commitment processes. Since we expect generosity or forgiveness to work differently in the matching and commitment processes, we must differentiate trust strategies into generous trust in the matching process and that in the commitment process. Our analytical strategy is two-fold. First, we analyze the “modified” trust game that incorporates the two types of uncertainties without the matching process. This simplified setting enables us to derive mathematical results using game theory, thereby giving basic insight into the trust mechanism. Second, we investigate socially embedded trust relationships in contexts involving the matching and commitment processes, using agent-based simulation. Results show that uncertainty about partner’s intention and competence makes generous trust a viable option. In contrast, too much uncertainty undermines the possibility of generous trust. Furthermore, a strategy that is too generous cannot stand alone. Generosity should be accompanied with moderate punishment. As for socially embedded trust relationships, generosity functions differently in the matching process versus the commitment process. Indeed, these two types of generous trust coexist, and their coexistence enables a society to function well. PMID:27893759

  12. Multi-Species Inversion and IAGOS Airborne Data for a Better Constraint of Continental Scale Fluxes

    NASA Astrophysics Data System (ADS)

    Boschetti, F.; Gerbig, C.; Janssens-Maenhout, G. G. A.; Thouret, V.; Totsche, K. U.; Nedelec, P.; Marshall, J.

    2016-12-01

    Airborne measurements of CO2, CO, and CH4 in the context of IAGOS (In-service Aircraft for a Global Observing System) will provide profiles from take-off and landing of airliners. These observations are useful for constraining sources and sinks in the vicinity of major metropolitan areas. A proposed improvement of the top-down method to constrain sources and sinks is the use of a multispecies inversion. Different species such as CO2 and CO have partially overlapping emission patterns for given fuel-combustion related sectors, and thus share part of the uncertainties, both related to the a priori knowledge of emissions and to model-data mismatch error. Our approach employs a regional modeling framework that combines the Lagrangian particle dispersion model STILT with the high-resolution (10 km x 10 km) EDGARv4.3 emission inventory, differentiated by emission sector and fuel type for CO2, CO, and CH4, and combined with VPRM for biospheric fluxes of CO2. We validated the modeling framework with observations of CO profiles available through IAGOS. Using synthetic IAGOS profile observations, we evaluate the benefit of using correlations between different species' uncertainties for the performance of the atmospheric inversion. With this approach we were able to reproduce CO observations with an average correlation of 0.56. However, simulated mixing ratios were lower by a factor of 2.3, reflecting a low bias in the emission inventory. Mean uncertainty reduction achieved for CO2 fossil fuel emissions amounts to 41%; for photosynthesis and respiration flux it is 41% and 45%, respectively. For CO and CH4 the uncertainty reduction is roughly 62% and 66%, respectively. Considering correlation between different species, posterior uncertainty can be reduced by up to 23%; such reduction depends on the assumed error structure of the prior and on the considered timeframe. The study suggests a significant constraint on regional emissions using multi-species inversions of IAGOS in-situ observations.
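
The benefit of cross-species correlation can be illustrated with a two-parameter linear Bayesian inversion: adding a CO observation constrains the CO scaling factor, which is correlated with the CO2 factor in the prior, and so also tightens the CO2 flux estimate. The sensitivities, error values, and prior correlation below are invented for illustration.

```python
# Sketch of why cross-species correlation helps: a linear Bayesian inversion
# for two sectoral flux scaling factors, once using a CO2 observation alone and
# once adding a CO observation, with the two factors correlated in the prior.
# Transport sensitivities and error values are invented.
import numpy as np

def posterior_cov(H, R, B):
    """Posterior covariance of a linear Gaussian inversion."""
    return np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)

# state: [CO2 fossil-fuel scaling, CO fossil-fuel scaling]
B = np.array([[0.30, 0.24],
              [0.24, 0.40]])           # prior covariance with strong correlation

# CO2-only: one observation type sensitive only to the CO2 scaling factor
H_co2 = np.array([[1.0, 0.0]])
R_co2 = np.array([[0.20]])
P_co2 = posterior_cov(H_co2, R_co2, B)

# multi-species: add a CO observation sensitive to the CO scaling factor
H_both = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
R_both = np.diag([0.20, 0.10])
P_both = posterior_cov(H_both, R_both, B)

for label, P in [("CO2 only", P_co2), ("CO2 + CO", P_both)]:
    red = 1 - np.sqrt(P[0, 0] / B[0, 0])
    print(f"{label}: CO2 flux uncertainty reduction {100 * red:.0f}%")
```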

  13. Uncertainty analysis routine for the Ocean Thermal Energy Conversion (OTEC) biofouling measurement device and data reduction procedure. [HTCOEF code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, S.P.

    1978-03-01

    Biofouling and corrosion of heat exchanger surfaces in Ocean Thermal Energy Conversion (OTEC) systems may be controlling factors in the potential success of the OTEC concept. Very little is known about the nature and behavior of marine fouling films at sites potentially suitable for OTEC power plants. To facilitate the acquisition of needed data, a biofouling measurement device developed by Professor J. G. Fetkovich and his associates at Carnegie-Mellon University (CMU) has been mass produced for use by several organizations in experiments at a variety of ocean sites. The CMU device is designed to detect small changes in thermal resistance associated with the formation of marine microfouling films. An account of the work performed at the Pacific Northwest Laboratory (PNL) to develop a computerized uncertainty analysis for estimating experimental uncertainties of results obtained with the CMU biofouling measurement device and data reduction scheme is presented. The analysis program was written as a subroutine to the CMU data reduction code and provides an alternative to the CMU procedure for estimating experimental errors. The PNL code was used to analyze sample data sets taken at Keahole Point, Hawaii; St. Croix, the Virgin Islands; and at a site in the Gulf of Mexico. The uncertainties of the experimental results were found to vary considerably with the conditions under which the data were taken. For example, uncertainties of fouling factors (where fouling factor is defined as the thermal resistance of the biofouling layer) estimated from data taken on a submerged buoy at Keahole Point, Hawaii were found to be consistently within 0.00006 hr·ft²·°F/Btu, while corresponding values for data taken on a tugboat in the Gulf of Mexico ranged up to 0.0010 hr·ft²·°F/Btu. Reasons for these differences are discussed.

  14. Matrix approach to uncertainty assessment and reduction for modeling terrestrial carbon cycle

    NASA Astrophysics Data System (ADS)

    Luo, Y.; Xia, J.; Ahlström, A.; Zhou, S.; Huang, Y.; Shi, Z.; Wang, Y.; Du, Z.; Lu, X.

    2017-12-01

    Terrestrial ecosystems absorb approximately 30% of the anthropogenic carbon dioxide emissions. This estimate has been deduced indirectly: combining analyses of atmospheric carbon dioxide concentrations with ocean observations to infer the net terrestrial carbon flux. In contrast, when knowledge about the terrestrial carbon cycle is integrated into different terrestrial carbon models they make widely different predictions. To improve the terrestrial carbon models, we have recently developed a matrix approach to uncertainty assessment and reduction. Specifically, the terrestrial carbon cycle has been commonly represented by a series of carbon balance equations to track carbon influxes into and effluxes out of individual pools in earth system models. This representation matches our understanding of carbon cycle processes well and can be reorganized into one matrix equation without changing any modeled carbon cycle processes and mechanisms. We have developed matrix equations of several global land C cycle models, including CLM3.5, 4.0 and 4.5, CABLE, LPJ-GUESS, and ORCHIDEE. Indeed, the matrix equation is generic and can be applied to other land carbon models. This matrix approach offers a suite of new diagnostic tools, such as the 3-dimensional (3-D) parameter space, traceability analysis, and variance decomposition, for uncertainty analysis. For example, predictions of carbon dynamics with complex land models can be placed in a 3-D parameter space (carbon input, residence time, and storage potential) as a common metric to measure how much model predictions are different. The latter can be traced to its source components by decomposing model predictions to a hierarchy of traceable components. Then, variance decomposition can help attribute the spread in predictions among multiple models to precisely identify sources of uncertainty. The highly uncertain components can be constrained by data as the matrix equation makes data assimilation computationally possible. We will illustrate various applications of this matrix approach to uncertainty assessment and reduction for terrestrial carbon cycle models.
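
A minimal sketch of the matrix form is given below: a three-pool carbon balance written as dX/dt = B*u(t) + A*K*X(t), from which steady-state pools and an ecosystem residence time follow by a linear solve. The pool structure and rates are invented, not taken from CLM, CABLE, LPJ-GUESS, or ORCHIDEE.

```python
# Sketch of the matrix representation: the pool-based carbon balance
# dX/dt = B*u(t) + A*K*X(t), where u is carbon input (GPP/NPP), B allocates it
# to pools, K holds turnover rates, and A routes transfers between pools.
# Pool structure and rates below are invented, not from any specific model.
import numpy as np

u = 2.0                                   # carbon input (kg C m-2 yr-1), assumed
B = np.array([0.6, 0.4, 0.0])             # allocation to leaf, root, soil
K = np.diag([1.0, 0.5, 0.02])             # turnover rates (yr-1)
A = np.array([[-1.0,  0.0,  0.0],         # transfer matrix: leaf/root losses
              [ 0.0, -1.0,  0.0],         # partly enter the soil pool
              [ 0.3,  0.4, -1.0]])

# steady state: 0 = B*u + A*K*X  ->  X = -(A K)^-1 B u
X_ss = -np.linalg.solve(A @ K, B * u)
residence_time = X_ss.sum() / u           # ecosystem-level residence time
print("steady-state pools (kg C m-2):", X_ss.round(2))
print(f"ecosystem carbon residence time: {residence_time:.1f} yr")
```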

  15. Towards Robust Energy Systems Modeling: Examining Uncertainty in Fossil Fuel-Based Life Cycle Assessment Approaches

    NASA Astrophysics Data System (ADS)

    Venkatesh, Aranya

    Increasing concerns about the environmental impacts of fossil fuels used in the U.S. transportation and electricity sectors have spurred interest in alternate energy sources, such as natural gas and biofuels. Life cycle assessment (LCA) methods can be used to estimate the environmental impacts of incumbent energy sources and the potential impact reductions achievable through the use of alternate energy sources. Some recent U.S. climate policies have used the results of LCAs to encourage the use of low carbon fuels to meet future energy demands in the U.S. However, the LCA methods used to estimate potential reductions in environmental impact have some drawbacks. First, the LCAs are predominantly based on deterministic approaches that do not account for any uncertainty inherent in life cycle data and methods. Such methods overstate the accuracy of the point estimate results, which could in turn lead to incorrect and consequently expensive decision-making. Second, the system boundaries considered by most LCA studies tend to be limited (considered a manifestation of uncertainty in LCA). Although LCAs can estimate the benefits of transitioning to energy systems of lower environmental impact, they may not characterize real-world systems perfectly. Improved modeling of energy system mechanisms can provide more accurate representations of reality and define more likely limits on potential environmental impact reductions. This dissertation quantitatively and qualitatively examines the limitations in LCA studies outlined previously. The first three research chapters address the uncertainty in life cycle greenhouse gas (GHG) emissions associated with petroleum-based fuels, natural gas and coal consumed in the U.S. The uncertainty in life cycle GHG emissions from fossil fuels was found to range between 13 and 18% of their respective mean values. For instance, the 90% confidence interval of the life cycle GHG emissions of average natural gas consumed in the U.S. was found to range from -8% to +9% (a 17% range) of the mean value of 66 g CO2e/MJ. Results indicate that uncertainty affects the conclusions of comparative life cycle assessments, especially when differences in average environmental impacts between two competing fuels/products are small. In the final two research chapters of this thesis, system boundary limitations in LCA are addressed. Simplified economic dispatch models are developed to examine changes in regional power plant dispatch that occur when coal power plants are retired and when natural gas prices drop. These models better reflect reality by estimating the order in which existing power plants are dispatched to meet electricity demand based on short-run marginal costs. Results indicate that the reductions in air emissions are lower than suggested by LCA studies, since such studies generally do not include the complexity of regional electricity grids, which is driven predominantly by comparative fuel prices. In comparison, this study estimates 7-15% reductions in emissions when natural gas prices are low. Although this is a significant reduction in itself, it is still lower than the benefits reported in traditional life cycle comparisons of coal and natural gas-based power (close to 50%), mainly due to the effects of plant dispatch.
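
A toy merit-order dispatch sketch illustrates the mechanism discussed in the final chapters: plants are stacked by short-run marginal cost and dispatched until demand is met, so cheaper gas reorders the stack and changes fleet emissions. The plant fleet, fuel prices, and emission factors are invented for illustration.

```python
# Sketch of a short-run merit-order dispatch: plants are stacked by marginal
# cost and dispatched until demand is met, so a lower gas price reorders the
# stack and changes fleet emissions. Plant data and prices are illustrative.
# (name, capacity MW, heat rate MMBtu/MWh, fuel, CO2 t/MWh)
fleet = [
    ("coal_1", 800, 10.0, "coal", 1.00),
    ("coal_2", 600, 10.5, "coal", 1.05),
    ("ngcc_1", 500,  7.0, "gas",  0.40),
    ("ngcc_2", 500,  7.5, "gas",  0.43),
    ("ct_1",   300, 10.5, "gas",  0.55),
]
demand = 2000.0  # MW

def dispatch(gas_price, coal_price=2.2):
    fuel_price = {"gas": gas_price, "coal": coal_price}   # $/MMBtu
    order = sorted(fleet, key=lambda p: p[2] * fuel_price[p[3]])  # marginal cost
    remaining, emissions = demand, 0.0
    for name, cap, hr, fuel, co2 in order:
        gen = min(cap, remaining)
        emissions += gen * co2
        remaining -= gen
        if remaining <= 0:
            break
    return emissions

high, low = dispatch(gas_price=6.0), dispatch(gas_price=2.5)
print(f"emissions reduction with cheap gas: {100 * (high - low) / high:.0f}%")
```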

  16. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  17. Sacubitril/Valsartan: From Clinical Trials to Real-world Experience.

    PubMed

    Joly, Joanna M; Desai, Akshay S

    2018-04-23

    Compared to enalapril, use of angiotensin-receptor blocker and neprilysin inhibitor sacubitril/valsartan to treat patients with heart failure and reduced ejection fraction (HFrEF) is associated with substantial reductions in both cardiovascular mortality and heart failure progression. The purpose of this review is to discuss the real-world experience of sacubitril/valsartan. In the years following the publication of the landmark PARADIGM-HF trial in 2014 and its subsequent FDA approval, a growing evidence base supports the safety and efficacy of sacubitril/valsartan in a broad spectrum of patients with HFrEF. Updated clinical practice guidelines have embraced the use of sacubitril/valsartan in preference to ACE inhibitors or ARBs in selected patients. In this review, we highlight the clinical trials that led to these key updates to clinical guidelines, offer practical strategies for patient selection and utilization in clinical practice, and identify important areas of uncertainty that require future research.

  18. Slower reacquisition after partial extinction in human contingency learning.

    PubMed

    Morís, Joaquín; Barberia, Itxaso; Vadillo, Miguel A; Andrades, Ainhoa; López, Francisco J

    2017-01-01

    Extinction is a very relevant learning phenomenon from a theoretical and applied point of view. One of its most relevant features is that relapse phenomena often take place once the extinction training has been completed. Accordingly, as extinction-based therapies constitute the most widespread empirically validated treatment of anxiety disorders, one of their most important limitations is this potential relapse. We provide the first demonstration of relapse reduction in human contingency learning using mild aversive stimuli. This effect was found after partial extinction (i.e., reinforced trials were occasionally experienced during extinction, Experiment 1) and progressive extinction treatments (Experiment 3), and it was not only because of differences in uncertainty levels between the partial and a standard extinction group (Experiment 2). The theoretical explanation of these results, the potential uses of this strategy in applied situations, and its current limitations are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Under-sampling trajectory design for compressed sensing based DCE-MRI.

    PubMed

    Liu, Duan-duan; Liang, Dong; Zhang, Na; Liu, Xin; Zhang, Yuan-ting

    2013-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) needs high temporal and spatial resolution to accurately estimate quantitative parameters and characterize tumor vasculature. Compressed Sensing (CS) has the potential to deliver both simultaneously. However, the randomness of CS under-sampling trajectories designed with the traditional variable density (VD) scheme may translate into uncertainty in kinetic parameter estimation when high reduction factors are used. Accurate parameter estimation with the VD scheme therefore usually requires multiple adjustments of the probability density function (PDF) parameters, and multiple reconstructions even with a fixed PDF, which is impractical for DCE-MRI. In this paper, an under-sampling trajectory design that is robust to changes in the PDF parameters and to the randomness under a fixed PDF is studied. The strategy is to adaptively segment k-space into low- and high-frequency regions, and to apply the VD scheme only in the high-frequency region. Simulation results demonstrate high accuracy and robustness compared with the VD design.
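
    As a rough illustration of the segmented-sampling idea summarized above (not the authors' implementation), the Python sketch below builds a one-dimensional Cartesian under-sampling mask that fully samples a central low-frequency band and applies a variable-density random pattern only in the high-frequency remainder; the band width, density exponent, and reduction factor are hypothetical parameters.

```python
import numpy as np

def segmented_vd_mask(n_lines=256, low_freq_fraction=0.12, reduction=4, decay=2.0, seed=0):
    """Binary phase-encode mask: full sampling in a central low-frequency band,
    variable-density random sampling in the high-frequency remainder."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(n_lines, dtype=bool)

    # Fully sample the central (low-frequency) band.
    half_band = int(n_lines * low_freq_fraction / 2)
    center = n_lines // 2
    mask[center - half_band:center + half_band] = True

    # Remaining sample budget for the high-frequency region.
    budget = n_lines // reduction - int(mask.sum())
    high_idx = np.where(~mask)[0]

    # Variable-density PDF: sampling probability decays with distance from the k-space center.
    dist = np.abs(high_idx - center) / (n_lines / 2)
    pdf = (1.0 - dist) ** decay
    pdf /= pdf.sum()

    chosen = rng.choice(high_idx, size=max(budget, 0), replace=False, p=pdf)
    mask[chosen] = True
    return mask

mask = segmented_vd_mask()
print(f"acceleration ~ {mask.size / mask.sum():.2f}x")
```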

  20. Climate trends and projections for the Andean Altiplano and strategies for adaptation

    NASA Astrophysics Data System (ADS)

    Valdivia, C.; Thibeault, J.; Gilles, J. L.; García, M.; Seth, A.

    2013-04-01

    Climate variability and change impact production in rainfed agricultural systems of the Bolivian highlands. Maximum temperature trends are increasing for the Altiplano. Minimum temperature increases are significant in the northern region, and decreases are significant in the southern region. Producers' perceptions of climate hazards are high in the central region, while concerns with changing climate and unemployment are high in the north. Similar high-risk perceptions involve pests and diseases in both regions. Altiplano climate projections for end-of-century highlights include increases in temperature, extreme event frequency, change in the timing of rainfall, and reduction of soil humidity. Successful adaptation to these changes will require the development of links between the knowledge systems of producers and scientists. Two-way participatory approaches to develop capacity and information that involve decision makers and scientists are appropriate approaches in this context of increased risk, uncertainty and vulnerability.

  1. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
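
    A minimal sketch of the general SROM idea under stated assumptions: a small set of samples and weights stands in for the random input, and only independent deterministic model calls are needed to propagate it. The load variable and the toy compliance model below are hypothetical and are not the authors' code.

```python
import numpy as np

def deterministic_model(load):
    """Placeholder for an expensive deterministic analysis (e.g., an FEM compliance evaluation)."""
    return 1.0 / (1.0 + 0.1 * load) + 0.01 * load ** 2

# Stochastic reduced order model of an uncertain load: a few samples with weights
# chosen so the SROM matches low-order statistics of the underlying distribution.
srom_samples = np.array([0.8, 1.0, 1.3, 1.7])   # hypothetical load realizations
srom_weights = np.array([0.2, 0.4, 0.3, 0.1])   # probabilities, sum to 1

# Propagation requires only independent deterministic model calls.
outputs = np.array([deterministic_model(s) for s in srom_samples])
mean = np.dot(srom_weights, outputs)
var = np.dot(srom_weights, (outputs - mean) ** 2)
print(f"E[compliance] ~ {mean:.4f}, Var ~ {var:.5f}")
```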

  2. Uncertainty management by relaxation of conflicting constraints in production process scheduling

    NASA Technical Reports Server (NTRS)

    Dorn, Juergen; Slany, Wolfgang; Stary, Christian

    1992-01-01

    Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.

  3. Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study

    USGS Publications Warehouse

    Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.

    2005-01-01

    Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.

  4. Adaptive Flood Risk Management Under Climate Change Uncertainty Using Real Options and Optimization.

    PubMed

    Woodward, Michelle; Kapelan, Zoran; Gouldby, Ben

    2014-01-01

    It is well recognized that adaptive and flexible flood risk strategies are required to account for future uncertainties. Development of such strategies is, however, a challenge. Climate change alone is a significant complication, but, in addition, complexities exist in trying to identify the most appropriate set of mitigation measures, or interventions. There is a range of economic and environmental performance measures that require consideration, and the spatial and temporal aspects of evaluating their performance are complex. All these elements pose severe difficulties to decisionmakers. This article describes a decision support methodology that has the capability to assess the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. The flood risk strategies have been explicitly designed to allow for flexible adaptive measures by capturing the concepts of real options and multiobjective optimization to evaluate potential flood risk management opportunities. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy over future points in time, and a multiobjective genetic algorithm is utilized to search for the optimal adaptive strategies. The modeling system has been applied to a reach on the Thames Estuary (London, England), and initial results show the inclusion of flexibility is advantageous, while the outputs provide decisionmakers with supplementary knowledge that previously has not been considered. © 2013 HR Wallingford Ltd.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Z.; Liu, C.; Botterud, A.

    Renewable energy resources have been rapidly integrated into power systems in many parts of the world, contributing to a cleaner and more sustainable supply of electricity. Wind and solar resources also introduce new challenges for system operations and planning in terms of economics and reliability because of their variability and uncertainty. Operational strategies based on stochastic optimization have been developed recently to address these challenges. In general terms, these stochastic strategies either embed uncertainties into the scheduling formulations (e.g., the unit commitment [UC] problem) in probabilistic forms or develop more appropriate operating reserve strategies to take advantage of advanced forecasting techniques. Other approaches to address uncertainty are also proposed, where operational feasibility is ensured within an uncertainty set of forecasting intervals. In this report, a comprehensive review is conducted to present the state of the art through Spring 2015 in the area of stochastic methods applied to power system operations with high penetration of renewable energy. Chapters 1 and 2 give a brief introduction and overview of power system and electricity market operations, as well as the impact of renewable energy and how this impact is typically considered in modeling tools. Chapter 3 reviews relevant literature on operating reserves and specifically probabilistic methods to estimate the need for system reserve requirements. Chapter 4 looks at stochastic programming formulations of the UC and economic dispatch (ED) problems, highlighting benefits reported in the literature as well as recent industry developments. Chapter 5 briefly introduces alternative formulations of UC under uncertainty, such as robust, chance-constrained, and interval programming. Finally, in Chapter 6, we conclude with the main observations from our review and important directions for future work.

  6. A methodology for obtaining on-orbit SI-traceable spectral radiance measurements in the thermal infrared

    NASA Astrophysics Data System (ADS)

    Dykema, John A.; Anderson, James G.

    2006-06-01

    A methodology to achieve spectral thermal radiance measurements from space with demonstrable on-orbit traceability to the International System of Units (SI) is described. This technique results in measurements of infrared spectral radiance $R(\tilde{\upsilon})$, with spectral index $\tilde{\upsilon}$ in cm-1, with a relative combined uncertainty $u_c[R(\tilde{\upsilon})]$ of 0.0015 (k = 1) for the average mid-infrared radiance emitted by the Earth. This combined uncertainty, expressed in brightness temperature units, is equivalent to ±0.1 K at 250 K at 750 cm-1. This measurement goal is achieved by utilizing a new method for infrared scale realization combined with an instrument design optimized to minimize component uncertainties and admit tests of radiometric performance. The SI traceability of the instrument scale is established by evaluation against source-based and detector-based infrared scales in defined laboratory protocols before launch. A novel strategy is executed to ensure fidelity of on-orbit calibration to the pre-launch scale. This strategy for on-orbit validation relies on the overdetermination of instrument calibration. The pre-launch calibration against scales derived from physically independent paths to the base SI units provides the foundation for a critical analysis of the overdetermined on-orbit calibration to establish an SI-traceable estimate of the combined measurement uncertainty. Redundant calibration sources and built-in diagnostic tests to assess component measurement uncertainties verify the SI traceability of the instrument calibration over the mission lifetime. This measurement strategy can be realized by a practical instrument, a prototype Fourier-transform spectrometer under development for deployment on a small satellite. The measurement record resulting from the methodology described here meets the observational requirements for climate monitoring and climate model testing and improvement.
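
    As a quick consistency check on the equivalence quoted above (a relative radiance uncertainty of 0.0015 corresponding to roughly ±0.1 K at 250 K and 750 cm-1), the sketch below converts a relative radiance uncertainty into a brightness-temperature uncertainty through the Planck function; it is an illustration, not part of the described methodology.

```python
import numpy as np

# Physical constants.
h = 6.62607015e-34   # J s
c = 2.99792458e10    # cm/s
k = 1.380649e-23     # J/K

def planck_wavenumber(nu, T):
    """Planck radiance per unit wavenumber (nu in cm^-1, T in K); overall scale is irrelevant here."""
    x = h * c * nu / (k * T)
    return nu ** 3 / np.expm1(x)

nu, T, rel_u = 750.0, 250.0, 0.0015

# Numerical sensitivity dB/dT, then convert the relative radiance uncertainty
# into an equivalent brightness-temperature uncertainty.
dT = 0.01
dB_dT = (planck_wavenumber(nu, T + dT) - planck_wavenumber(nu, T - dT)) / (2 * dT)
u_T = rel_u * planck_wavenumber(nu, T) / dB_dT
print(f"0.0015 relative radiance uncertainty ~ +/-{u_T:.2f} K at {T:.0f} K, {nu:.0f} cm^-1")
```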

  7. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty in predicting the level of impact. The foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  8. Operational strategies of anti-malarial drug campaigns for malaria elimination in Zambia's southern province: a simulation study.

    PubMed

    Stuckey, Erin M; Miller, John M; Littrell, Megan; Chitnis, Nakul; Steketee, Rick

    2016-03-09

    Malaria elimination requires reducing both the potential of mosquitoes to transmit parasites to humans and of humans to transmit parasites to mosquitoes. To achieve this goal in Southern Province, Zambia, a mass test and treat (MTAT) campaign was conducted from 2011 to 2013 to complement high coverage of long-lasting insecticide-treated nets (LLIN). To identify factors likely to increase campaign effectiveness, a modelling approach was applied to investigate the simulated effect of alternative operational strategies for parasite clearance in Southern Province. OpenMalaria, a discrete-time, individual-based stochastic model of malaria, was parameterized for the study area to simulate anti-malarial drug administration for interruption of transmission. Simulations were run for scenarios with a range of artemisinin-combination therapies, proportions of the population reached by the campaign, targeted age groups, times between campaign rounds, Plasmodium falciparum test protocols, and the addition of drugs aimed at preventing onward transmission. A sensitivity analysis was conducted to assess uncertainty of simulation results. Scenarios were evaluated based on the reduction in all-age parasite prevalence during the peak transmission month one year following the campaign, compared to the currently implemented strategy of MTAT with 19% population coverage at pilot and 40% coverage during the first year of implementation, in the presence of 56% LLIN use and 18% indoor residual spray coverage. Simulation results suggest the most important determinant of success in reducing prevalence is the population coverage achieved in the campaign, which would require more than 1 year of campaign implementation for elimination. The inclusion of single low-dose primaquine, which acts as a gametocytocide, or ivermectin, which acts as an endectocide, in the drug regimen did not further reduce parasite prevalence one year following the campaign compared to the currently implemented strategy. Simulation results indicate that a high proportion of low-density infections were missed by rapid diagnostic tests but would be treated and cleared with mass drug administration (MDA). The optimal implementation strategy for MTAT or MDA will vary by background level of prevalence, by rate of infections imported to the area, and by ability to operationally achieve high population coverage. Overall success with new parasite clearance strategies depends on continued coverage of vector control interventions to ensure sustained gains in reduction of disease burden.

  9. Algorithms and analyses for stochastic optimization for turbofan noise reduction using parallel reduced-order modeling

    NASA Astrophysics Data System (ADS)

    Yang, Huanhuan; Gunzburger, Max

    2017-06-01

    Simulation-based optimization of acoustic liner design in a turbofan engine nacelle for noise reduction purposes can dramatically reduce the cost and time needed for experimental designs. Because uncertainties are inevitable in the design process, a stochastic optimization algorithm is posed based on the conditional value-at-risk measure so that an ideal acoustic liner impedance is determined that is robust in the presence of uncertainties. A parallel reduced-order modeling framework is developed that dramatically improves the computational efficiency of the stochastic optimization solver for a realistic nacelle geometry. The reduced stochastic optimization solver takes less than 500 seconds to execute. In addition, well-posedness and finite element error analyses of the state system and optimization problem are provided.
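
    A small hedged sketch of optimizing a design variable against the conditional value-at-risk of a noisy cost, the robustness measure named above; the toy cost function, impedance range, and confidence level are hypothetical and unrelated to the paper's reduced-order acoustic model.

```python
import numpy as np

def cvar(losses, alpha=0.9):
    """Conditional value-at-risk: mean of the worst (1 - alpha) fraction of losses."""
    losses = np.sort(np.asarray(losses))
    tail_start = int(np.ceil(alpha * losses.size))
    return losses[tail_start:].mean()

rng = np.random.default_rng(1)
perturbation = rng.normal(0.0, 0.15, size=2000)   # uncertain flow/geometry effects

def noise_cost(impedance):
    """Hypothetical radiated-noise cost for a liner impedance under uncertainty."""
    return (impedance - 1.5 + perturbation) ** 2   # toy cost per random realization

candidates = np.linspace(0.5, 2.5, 41)
robust_impedance = min(candidates, key=lambda z: cvar(noise_cost(z)))
print(f"CVaR-optimal impedance ~ {robust_impedance:.2f}")
```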

  10. Forest processes from stands to landscapes: exploring model forecast uncertainties using cross-scale model comparison

    Treesearch

    Michael J. Papaik; Andrew Fall; Brian Sturtevant; Daniel Kneeshaw; Christian Messier; Marie-Josee Fortin; Neal Simon

    2010-01-01

    Forest management practices conducted primarily at the stand scale result in simplified forests with regeneration problems and low structural and biological diversity. Landscape models have been used to help design management strategies to address these problems. However, there remains a great deal of uncertainty that the actual management practices result in the...

  11. A safety rule approach to surveillance and eradication of biological invasions

    Treesearch

    Denys Yemshanov; Robert G. Haight; Frank H. Koch; Robert Venette; Kala Studens; Ronald E. Fournier; Tom Swystun; Jean J. Turgeon; Yulin Gao

    2017-01-01

    Uncertainty about future spread of invasive organisms hinders planning of effective response measures. We present a two-stage scenario optimization model that accounts for uncertainty about the spread of an invader, and determines survey and eradication strategies that minimize the expected program cost subject to a safety rule for eradication success. The safety rule...

  12. Modifying climate change habitat models using tree species-specific assessments of model uncertainty and life history-factors

    Treesearch

    Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald

    2011-01-01

    Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...

  13. A data mining approach to predict in situ chlorinated ethene detoxification potential

    NASA Astrophysics Data System (ADS)

    Lee, J.; Im, J.; Kim, U.; Loeffler, F. E.

    2015-12-01

    Despite major advances in physicochemical remediation technologies, in situ biostimulation and bioaugmentation treatment aimed at stimulating Dehalococcoides mccartyi (Dhc) reductive dechlorination activity remains a cornerstone approach to remedy sites impacted with chlorinated ethenes. In practice, selecting the best remedial strategy is challenging due to uncertainties associated with the microbiology (e.g., presence and activity of Dhc) and geochemical factors influencing Dhc activity. Extensive groundwater datasets collected over decades of monitoring exist, but have not been systematically analyzed. In the present study, geochemical and microbial data sets collected from 35 wells at 5 contaminated sites were used to develop a predictive empirical model using a machine learning algorithm (i) to rank the relative importance of parameters that affect in situ reductive dechlorination potential, and (ii) to provide recommendations for selecting the optimal remediation strategy at a specific site. Classification and regression tree (CART) analysis was applied, and a representative classification tree model was developed that allowed short-term prediction of dechlorination potential. Indirect indicators for low dissolved oxygen (e.g., low NO3- and NO2-, high Fe2+ and CH4) were the most influential factors for predicting dechlorination potential, followed by total organic carbon content (TOC) and Dhc cell abundance. These findings indicate that machine learning-based data mining techniques applied to groundwater monitoring data can lead to the development of predictive groundwater remediation models. A major need for improving the predictive capabilities of the data mining approach is a curated, up-to-date and comprehensive collection of groundwater monitoring data.
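
    A hedged sketch of the general approach (a classification tree over geochemical and Dhc indicators); the feature names, labeling rule, and synthetic data below are illustrative stand-ins, not the study's curated monitoring dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 200

# Synthetic monitoring-well records mimicking the kinds of predictors discussed above.
X = np.column_stack([
    rng.uniform(0, 20, n),     # nitrate (mg/L), indirect indicator of dissolved oxygen
    rng.uniform(0, 10, n),     # ferrous iron (mg/L)
    rng.uniform(0, 50, n),     # total organic carbon (mg/L)
    rng.uniform(2, 8, n),      # log10 Dhc cells per liter
])
# Hypothetical rule used only to label the synthetic data: reducing, carbon-rich, Dhc-rich wells dechlorinate.
y = ((X[:, 0] < 5) & (X[:, 2] > 10) & (X[:, 3] > 5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["nitrate", "Fe2+", "TOC", "log_Dhc"]))
```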

  14. Osteoporosis: the emperor has no clothes

    PubMed Central

    Järvinen, T L N; Michaëlsson, K; Aspenberg, P; Sievänen, H

    2015-01-01

    Current prevention strategies for low-trauma fractures amongst older persons depend on the notions that fractures are mainly caused by osteoporosis (pathophysiology), that patients at high risk can be identified (screening) and that the risk is amenable to bone-targeted pharmacotherapy (treatment). However, all three of these notions can be disputed. Pathophysiology: Most fracture patients have fallen, but actually do not have osteoporosis. A high likelihood of falling, in turn, is attributable to an ageing-related decline in physical functioning and general frailty. Screening: Currently available fracture risk prediction strategies, including bone densitometry and multifactorial prediction tools, are unable to identify a large proportion of patients who will sustain a fracture, whereas many of those with a high fracture risk score will not sustain a fracture. Treatment: The evidence for the viability of bone-targeted pharmacotherapy in preventing hip fracture and other clinical fragility fractures is mainly limited to women aged 65–80 years with osteoporosis, whereas the proof of hip fracture-preventing efficacy in women over 80 years of age and in men at all ages is meagre or absent. Further, the anti-hip fracture efficacy shown in clinical trials is absent in real-life studies. Many drugs for the treatment of osteoporosis have also been associated with increased risks of serious adverse events. There are also considerable uncertainties related to the efficacy of drug therapy in preventing clinical vertebral fractures, whereas the efficacy for preventing other fractures (relative risk reductions of 20–25%) remains moderate, particularly in terms of the low absolute risk reduction achieved with this treatment. PMID:25809279

  15. Irreducible Uncertainty in Terrestrial Carbon Projections

    NASA Astrophysics Data System (ADS)

    Lovenduski, N. S.; Bonan, G. B.

    2016-12-01

    We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
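
    A compact sketch of one common way to partition ensemble spread into internal-variability, scenario, and model-structure components; the array shape and toy numbers are hypothetical, not CMIP5 output.

```python
import numpy as np

# proj[model, scenario, member]: projected terrestrial carbon accumulation by 2100 (Pg C).
rng = np.random.default_rng(3)
proj = (rng.normal(150, 60, size=(10, 1, 1))          # model structure differences
        + np.array([0.0, 80.0]).reshape(1, 2, 1)       # scenario (e.g., RCP2.6 vs RCP8.5)
        + rng.normal(0, 10, size=(10, 2, 5)))          # internal variability

var_internal = proj.var(axis=2).mean()                 # spread across ensemble members
var_scenario = proj.mean(axis=2).mean(axis=0).var()    # spread across scenarios
var_model = proj.mean(axis=2).var(axis=0).mean()       # spread across models
total = var_internal + var_scenario + var_model
for name, v in [("internal", var_internal), ("scenario", var_scenario), ("model", var_model)]:
    print(f"{name:9s}: {100 * v / total:5.1f}% of partitioned variance")
```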

  16. Unveiling the decoherence effect of noise on the entropic uncertainty relation and its control by partially collapsed operations

    NASA Astrophysics Data System (ADS)

    Chen, Min-Nan; Sun, Wen-Yang; Huang, Ai-Jun; Ming, Fei; Wang, Dong; Ye, Liu

    2018-01-01

    In this work, we investigate the dynamics of quantum-memory-assisted entropic uncertainty relations in open systems, and how to steer the uncertainty under different types of decoherence. Specifically, we develop the dynamical behaviors of the uncertainty of interest under two typical categories of noise: bit-flipping and depolarizing channels. It is shown that the measurement uncertainty first increases and then decreases with the growth of the decoherence strength in bit-flipping channels. In contrast, the uncertainty monotonically increases with increasing decoherence strength in depolarizing channels. Notably, and to a large degree, the uncertainty is shown to depend on both the systematic quantum correlation and the minimal conditional entropy of the observed subsystem. Moreover, we present a possible physical interpretation for these distinctive behaviors of the uncertainty within such scenarios. Furthermore, we propose a simple and effective strategy to reduce the entropic uncertainty by means of a partially collapsed operation, the quantum weak measurement. Therefore, our investigations may offer an insight into the dynamics of the measurement uncertainty under decoherence, and be of importance to quantum precision measurement in open systems.

  17. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    NASA Astrophysics Data System (ADS)

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  18. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
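
    A hedged sketch of the kind of Monte Carlo uncertainty analysis described above, using a toy surrogate for the mass balance and three of the parameters the study found most influential; the functional form and parameter distributions are assumptions made purely for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
n = 50_000

# Hypothetical input distributions (not the study's values).
bcw_rate = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)   # base cation weathering base rate
soil_depth = rng.normal(0.6, 0.15, size=n).clip(0.1)             # m
soil_temp = rng.normal(8.0, 2.0, size=n)                         # deg C

# Toy surrogate for the simple-mass-balance critical acid load (arbitrary units).
cal = bcw_rate * soil_depth * np.exp(0.05 * (soil_temp - 8.0))

print(f"CAL mean = {cal.mean():.1f}, 5th-95th percentile = "
      f"{np.percentile(cal, 5):.1f} to {np.percentile(cal, 95):.1f}")

# Crude importance ranking: squared rank correlation of each input with CAL.
for name, x in [("BCw rate", bcw_rate), ("soil depth", soil_depth), ("soil temp", soil_temp)]:
    rho = spearmanr(x, cal)[0]
    print(f"{name:10s}: rho^2 = {rho ** 2:.2f}")
```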

  19. Silvicultural decisionmaking in an uncertain climate future: a workshop-based exploration of considerations, strategies, and approaches

    Treesearch

    Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Christopher R. Webster; Brian J. Palik; Mark J. Twery; John B. Bradford; Linda R. Parker; Andrea T. Hille; Sheela M. Johnson

    2011-01-01

    Land managers across the country face the immense challenge of developing and applying appropriate management strategies as forests respond to climate change. We hosted a workshop to explore silvicultural strategies for addressing the uncertainties surrounding climate change and forest response in the northeastern and north-central United States. Outcomes of this...

  20. Public health co-benefits of greenhouse gas emissions reduction: A systematic review.

    PubMed

    Gao, Jinghong; Kovats, Sari; Vardoulakis, Sotiris; Wilkinson, Paul; Woodward, Alistair; Li, Jing; Gu, Shaohua; Liu, Xiaobo; Wu, Haixia; Wang, Jun; Song, Xiaoqin; Zhai, Yunkai; Zhao, Jie; Liu, Qiyong

    2018-06-15

    Public health co-benefits from curbing climate change can make greenhouse gas (GHG) mitigation strategies more attractive and increase their implementation. The purpose of this systematic review is to summarize the evidence of these health co-benefits to improve our understanding of the mitigation measures involved, potential mechanisms, and relevant uncertainties. A comprehensive search for peer-reviewed studies published in English was conducted using the primary electronic databases. Reference lists from these articles were reviewed and manual searches were performed to supplement relevant studies. The identified records were screened based on inclusion criteria. We extracted data from the final retrieved papers using a pre-designed data extraction form and a quality assessment was conducted. The studies were heterogeneous, so meta-analysis was not possible; instead, evidence was synthesized using narrative summaries. Thirty-six studies were identified. We identified GHG mitigation strategies in five domains - energy generation, transportation, food and agriculture, households, and industry and economy - which usually, although not always, bring co-benefits for public health. These health gains are likely to be multiplied by comprehensive measures that include more than one sector. GHG mitigation strategies can bring about substantial and possibly cost-effective public health co-benefits. These findings are highly relevant to policy makers and other stakeholders since they point to the compounding value of taking concerted action against climate change and air pollution. Copyright © 2018. Published by Elsevier B.V.

  1. Probabilistic objective functions for margin-less IMRT planning

    NASA Astrophysics Data System (ADS)

    Bohoslavsky, Román; Witte, Marnix G.; Janssen, Tomas M.; van Herk, Marcel

    2013-06-01

    We present a method to implement probabilistic treatment planning of intensity-modulated radiation therapy using custom software plugins in a commercial treatment planning system. Our method avoids the definition of safety-margins by directly including the effect of geometrical uncertainties during optimization when objective functions are evaluated. Because the shape of the resulting dose distribution implicitly defines the robustness of the plan, the optimizer has much more flexibility than with a margin-based approach. We expect that this added flexibility helps to automatically strike a better balance between target coverage and dose reduction for surrounding healthy tissue, especially for cases where the planning target volume overlaps organs at risk. Prostate cancer treatment planning was chosen to develop our method, including a novel technique to include rotational uncertainties. Based on population statistics, translations and rotations are simulated independently following a marker-based IGRT correction strategy. The effects of random and systematic errors are incorporated by first blurring and then shifting the dose distribution with respect to the clinical target volume. For simplicity and efficiency, dose-shift invariance and a rigid-body approximation are assumed. Three prostate cases were replanned using our probabilistic objective functions. To compare clinical and probabilistic plans, an evaluation tool was used that explicitly incorporates geometric uncertainties using Monte-Carlo methods. The new plans achieved similar or better dose distributions than the original clinical plans in terms of expected target coverage and rectum wall sparing. Plan optimization times were only about a factor of two higher than in the original clinical system. In conclusion, we have developed a practical planning tool that enables margin-less probability-based treatment planning with acceptable planning times, achieving the first system that is feasible for clinical implementation.
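
    A minimal sketch of the blur-then-shift evaluation idea (random errors blur the dose, systematic errors shift it relative to the clinical target volume), assuming a one-dimensional dose profile; the error magnitudes and coverage criterion are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, shift

# 1D planned dose profile on a 1 mm grid, with a flat high-dose region over the target.
x = np.arange(200)                      # mm
dose = np.where((x > 80) & (x < 120), 60.0, 10.0)
ctv = (x > 85) & (x < 115)              # clinical target volume voxels

sigma_random = 3.0                      # mm, SD of random (per-fraction) errors
systematic_shifts = np.random.default_rng(5).normal(0.0, 2.5, size=500)  # mm, systematic errors

# Blur once for random errors, then shift for each sampled systematic error.
blurred = gaussian_filter1d(dose, sigma=sigma_random)
coverage = []
for s in systematic_shifts:
    shifted = shift(blurred, s, mode="nearest")
    coverage.append((shifted[ctv] >= 57.0).mean())   # fraction of CTV above 95% of 60 Gy

print(f"expected CTV coverage = {np.mean(coverage):.3f}")
```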

  2. Probabilistic human health risk assessment of degradation-related chemical mixtures in heterogeneous aquifers: Risk statistics, hot spots, and preferential channels

    NASA Astrophysics Data System (ADS)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2015-06-01

    The increasing presence of toxic chemicals released in the subsurface has led to a rapid growth of social concerns and the need to develop and employ models that can predict the impact of groundwater contamination on human health risk under uncertainty. Monitored natural attenuation is a common remediation action in many contamination cases. However, natural attenuation can lead to the production of daughter species of distinct toxicity that may pose challenges in pollution management strategies. The actual threat that these contaminants pose to human health depends on the interplay between the complex structure of the geological media and the toxicity of each pollutant byproduct. This work addresses human health risk for chemical mixtures resulting from the sequential degradation of a contaminant (such as a chlorinated solvent) under uncertainty through high-resolution three-dimensional numerical simulations. We systematically investigate the interaction between aquifer heterogeneity, flow connectivity, contaminant injection model, and chemical toxicity in the probabilistic characterization of health risk. We illustrate how chemical-specific travel times control the regime of the expected risk and its corresponding uncertainties. Results indicate conditions where preferential flow paths can favor the reduction of the overall risk of the chemical mixture. The overall human risk response to aquifer connectivity is shown to be nontrivial for multispecies transport. This nontriviality is a result of the interaction between aquifer heterogeneity and chemical toxicity. To quantify the joint effect of connectivity and toxicity in health risk, we propose a toxicity-based Damköhler number. Furthermore, we provide a statistical characterization in terms of low-order moments and the probability density function of the individual and total risks.

  3. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-06-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs representing conditions as they occurred during August through September 2006, and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration-response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between the 2, 4 and 12 km resolution runs, but 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements of the Clean Air Act and the uncertainty associated with human health impacts, and therefore with the results reported in this study, we conclude that health impacts calculated from population-weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2 and 4 km resolution. On average, when modeling at 36 km resolution, 7 deaths per ozone month were avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval was 2-9). When modeling at finer resolution (2, 4, or 12 km), on average 5 deaths were avoided due to the same reductions (95% confidence interval was 2-7). Initial results for this specific region show that modeling at a resolution finer than 12 km is unlikely to improve uncertainty in benefits analysis. We suggest that 12 km resolution may be appropriate for uncertainty analyses in areas with similar chemistry, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
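
    A hedged sketch of the population-weighting and concentration-response step mentioned above; the grid values, baseline mortality, and response coefficient are placeholders, not the study's inputs.

```python
import numpy as np

# Hypothetical gridded inputs for one modeled month.
population = np.array([1.2e6, 0.8e6, 2.5e6, 0.5e6])    # people per grid cell
ozone_base = np.array([72.0, 65.0, 80.0, 60.0])         # ppb, base-case daily max 8-h ozone
ozone_ctrl = np.array([66.0, 61.0, 71.0, 57.0])         # ppb, control scenario

def pop_weighted(conc, pop):
    return np.sum(conc * pop) / np.sum(pop)

delta_c = pop_weighted(ozone_base, population) - pop_weighted(ozone_ctrl, population)

# Log-linear concentration-response function (coefficient is illustrative only).
beta = 0.0004                     # per ppb
baseline_deaths = 300.0           # deaths in the exposed population over the episode
avoided = baseline_deaths * (1.0 - np.exp(-beta * delta_c))
print(f"delta population-weighted O3 = {delta_c:.2f} ppb, avoided deaths ~ {avoided:.1f}")
```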

  4. Steering the measured uncertainty under decoherence through local PT-symmetric operations

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu

    2018-07-01

    The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics, which intrinsically offers a lower bound with regard to the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we attempt to observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and put forward an effective strategy to manipulate the magnitude of the uncertainty of interest by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that the uncertainty exhibits different evolution characteristics in the channels considered here, i.e. monotonic behavior in the nonunital channels and non-monotonic behavior in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.

  5. Managing Wind Power Uncertainty Through Strategic Reserve Purchasing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Ershun; Zhang, Ning; Kang, Chongqing

    With the rapidly increasing penetration of wind power, wind producers are becoming increasingly responsible for the deviation of the wind power output from the forecast. Such uncertainty results in revenue losses to the wind power producers (WPPs) due to penalties in ex-post imbalance settlements. This paper explores the opportunities available to WPPs if they can purchase or schedule some reserves to offset part of their deviation rather than being fully penalized in the real-time market. The revenue for WPPs under such a mechanism is modeled. The optimal strategy for managing the uncertainty of wind power by purchasing reserves to maximize the WPP's revenue is analytically derived with rigorous optimality conditions. The amount of energy and reserves that should be bid in the market is explicitly quantified by the probabilistic forecast and the prices of the energy and reserves. A case study using price data from ERCOT and wind power data from NREL is performed to verify the effectiveness of the derived optimal bidding strategy and the benefits of reserve purchasing. Additionally, the proposed bidding strategy can also reduce the risk of variations in the WPP's revenue.
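
    A rough numerical sketch of the tradeoff the paper formalizes analytically: expected revenue of a wind power producer as a function of purchased reserve, given a probabilistic wind forecast, an energy price, a reserve price, and an imbalance penalty (all values hypothetical).

```python
import numpy as np

rng = np.random.default_rng(42)
wind = rng.beta(4, 2, size=10_000) * 100.0              # MW, probabilistic forecast samples
bid = 70.0                                               # MW, day-ahead energy bid
price_energy, price_reserve, penalty = 40.0, 12.0, 60.0  # $/MWh

def expected_revenue(reserve_mw):
    shortfall = np.clip(bid - wind, 0.0, None)           # MW short of the bid
    covered = np.minimum(shortfall, reserve_mw)          # deficit offset by purchased reserve
    uncovered = shortfall - covered                      # remaining deficit is penalized
    revenue = bid * price_energy - reserve_mw * price_reserve - uncovered * penalty
    return revenue.mean()

reserves = np.linspace(0, 40, 81)
best = max(reserves, key=expected_revenue)
print(f"optimal reserve purchase ~ {best:.1f} MW, "
      f"expected revenue ~ ${expected_revenue(best):,.0f}")
```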

  6. Optimal robust control strategy of a solid oxide fuel cell system

    NASA Astrophysics Data System (ADS)

    Wu, Xiaojuan; Gao, Danhui

    2018-01-01

    Optimal control can ensure safe system operation with high efficiency. However, only a few papers discuss optimal control strategies for solid oxide fuel cell (SOFC) systems. Moreover, the existing methods ignore the impact of parameter uncertainty on instantaneous system performance. In real SOFC systems, several parameters, such as the load current, may vary with operating conditions and cannot be identified exactly. Therefore, a robust optimal control strategy is proposed, which involves three parts: a SOFC model with parameter uncertainty, a robust optimizer and robust controllers. During the model building process, boundaries of the uncertain parameter are extracted based on a Monte Carlo algorithm. To achieve the maximum efficiency, a two-space particle swarm optimization approach is employed to obtain optimal operating points, which are used as the set points of the controllers. To ensure safe SOFC operation, two feed-forward controllers and a higher-order robust sliding mode controller are then used to control the fuel utilization ratio, air excess ratio and stack temperature. The results show the proposed optimal robust control method can maintain safe operation of the SOFC system with maximum efficiency under load and uncertainty variations.

  7. Decision-support tool for assessing biomanufacturing strategies under uncertainty: stainless steel versus disposable equipment for clinical trial material preparation.

    PubMed

    Farid, Suzanne S; Washbrook, John; Titchener-Hooker, Nigel J

    2005-01-01

    This paper presents the application of a decision-support tool, SIMBIOPHARMA, for assessing different manufacturing strategies under uncertainty for the production of biopharmaceuticals. SIMBIOPHARMA captures both the technical and business aspects of biopharmaceutical manufacture within a single tool that permits manufacturing alternatives to be evaluated in terms of cost, time, yield, project throughput, resource utilization, and risk. Its use for risk analysis is demonstrated through a hypothetical case study that uses the Monte Carlo simulation technique to imitate the randomness inherent in manufacturing subject to technical and market uncertainties. The case study addresses whether start-up companies should invest in a stainless steel pilot plant or use disposable equipment for the production of early phase clinical trial material. The effects of fluctuating product demands and titers on the performance of a biopharmaceutical company manufacturing clinical trial material are analyzed. The analysis highlights the impact of different manufacturing options on the range in possible outcomes for the project throughput and cost of goods and the likelihood that these metrics exceed a critical threshold. The simulation studies highlight the benefits of incorporating uncertainties when evaluating manufacturing strategies. Methods of presenting and analyzing information generated by the simulations are suggested. These are used to help determine the ranking of alternatives under different scenarios. The example illustrates the benefits to companies of using such a tool to improve management of their R&D portfolios so as to control the cost of goods.

  8. Optimal control problems of epidemic systems with parameter uncertainties: application to a malaria two-age-classes transmission model with asymptomatic carriers.

    PubMed

    Mwanga, Gasper G; Haario, Heikki; Capasso, Vicenzo

    2015-03-01

    The main scope of this paper is to study the optimal control practices of malaria, by discussing the implementation of a catalog of optimal control strategies in presence of parameter uncertainties, which is typical of infectious diseases data. In this study we focus on a deterministic mathematical model for the transmission of malaria, including in particular asymptomatic carriers and two age classes in the human population. A partial qualitative analysis of the relevant ODE system has been carried out, leading to a realistic threshold parameter. For the deterministic model under consideration, four possible control strategies have been analyzed: the use of Long-lasting treated mosquito nets, indoor residual spraying, screening and treatment of symptomatic and asymptomatic individuals. The numerical results show that using optimal control the disease can be brought to a stable disease free equilibrium when all four controls are used. The Incremental Cost-Effectiveness Ratio (ICER) for all possible combinations of the disease-control measures is determined. The numerical simulations of the optimal control in the presence of parameter uncertainty demonstrate the robustness of the optimal control: the main conclusions of the optimal control remain unchanged, even if inevitable variability remains in the control profiles. The results provide a promising framework for the designing of cost-effective strategies for disease controls with multiple interventions, even under considerable uncertainty of model parameters. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Assumptions and Grand Strategy

    DTIC Science & Technology

    2011-01-01

    Germany; The Continuity of Change,” in National Security Cultures: Patterns of Global Governance, ed. Emil Kirchner and James Sperling (London...Britain in an Age of Uncertainty: The National Security Strategy (October 2010), 10. 25. Carl von Clausewitz, On War, edited and translated by Michael E

  10. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    DOE PAGES

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; ...

    2017-01-23

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign, conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015, brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time–space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. The sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. Lastly, it was found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  11. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS III: BOUNDARY AND INITIAL CONDITIONS, MODEL GRID RESOLUTION, AND HG(II) REDUCTION MECHANISMS

    EPA Science Inventory

    In this study we investigate the CMAQ model response in terms of simulated mercury concentration and deposition to boundary/initial conditions (BC/IC), model grid resolution (12- versus 36-km), and two alternative Hg(II) reduction mechanisms. The model response to the change of g...

  12. Representation of Odds in Terms of Frequencies Reduces Probability Discounting

    ERIC Educational Resources Information Center

    Yi, Richard; Bickel, Warren K.

    2005-01-01

    In studies of probability discounting, the reduction in the value of an outcome as a result of its degree of uncertainty is calculated. Decision making studies suggest two issues with probability that may play a role in data obtained in probability discounting studies. The first issue involves the reduction of risk aversion via subdivision of…
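
    For context, probability discounting is commonly modeled with a hyperbolic function of the odds against receiving the outcome; the sketch below shows that form with an illustrative discounting parameter, not the study's fitted values.

```python
def discounted_value(amount, p, h):
    """Hyperbolic probability discounting: V = A / (1 + h * theta),
    where theta = (1 - p) / p is the odds against receiving the outcome."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# Illustrative parameter: larger h means steeper discounting of uncertain outcomes.
for p in (0.9, 0.5, 0.1):
    print(f"p = {p:.1f}: V = {discounted_value(100.0, p, h=1.0):6.2f}")
```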

  13. The effects of harvest on waterfowl populations

    USGS Publications Warehouse

    Cooch, Evan G.; Guillemain, Matthieu; Boomer, G Scott; Lebreton, Jean-Dominique; Nichols, James D.

    2014-01-01

    Overall, there is substantial uncertainty about system dynamics, about the impacts of potential management and conservation decisions on those dynamics, and how to optimise management decisions in the presence of such uncertainties. Such relationships are unlikely to be stationary over space or time, and selective harvest of some individuals can potentially alter life history allocation of resources over time – both of which will potentially influence optimal harvest strategies. These sources of variation and uncertainty argue for the use of adaptive approaches to waterfowl harvest management.

  14. Price-cap Regulation, Uncertainty and the Price Evolution of New Pharmaceuticals.

    PubMed

    Shajarizadeh, Ali; Hollis, Aidan

    2015-08-01

    This paper examines the effect of the regulations restricting price increases on the evolution of pharmaceutical prices. A novel theoretical model shows that this policy leads firms to price new drugs with uncertain demand above the expected value initially. Price decreases after drug launch are more likely, the higher the uncertainty. We empirically test the model's predictions using data from the Canadian pharmaceutical market. The level of uncertainty is shown to play a crucial role in drug pricing strategies. © 2014 The Authors. Health Economics Published by John Wiley & Sons Ltd.

  15. The new g-2 experiment at Fermilab

    NASA Astrophysics Data System (ADS)

    Anastasi, A.

    2017-04-01

    There is a long standing discrepancy between the Standard Model prediction for the muon g-2 and the value measured by the Brookhaven E821 Experiment. At present the discrepancy stands at about three standard deviations, with an uncertainty dominated by the theoretical error. Two new proposals - at Fermilab and J-PARC - plan to improve the experimental uncertainty by a factor of 4, and it is expected that there will be a significant reduction in the uncertainty of the Standard Model prediction. I will review the status of the planned experiment at Fermilab, E989, which will analyse 21 times more muons than the BNL experiment and discuss how the systematic uncertainty will be reduced by a factor of 3 such that a precision of 0.14 ppm can be achieved.

  16. Reducing uncertainties for short lived cumulative fission product yields

    DOE PAGES

    Stave, Sean; Prinke, Amanda; Greenwood, Larry; ...

    2015-09-05

    Uncertainties associated with short-lived (half-lives less than 1 day) fission product yields listed in databases such as the National Nuclear Data Center's ENDF/B-VII are large enough for certain isotopes to provide an opportunity for new precision measurements to offer significant uncertainty reductions. A series of experiments has begun in which small samples of 235U are irradiated with a pulsed, fission neutron spectrum at the Nevada National Security Site and placed between two broad-energy germanium detectors. The amount of various isotopes present immediately following the irradiation can be determined given the total counts and the calibrated properties of the detector system. The uncertainty on the fission yields for multiple isotopes has been reduced by nearly an order of magnitude.

  17. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... identify and estimate safety and environmental management risks and appropriate risk reduction strategies... responsible for identifying/estimating risks and for appropriate risk reduction strategies? 102-80.50 Section... Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for...

  18. Six steps to a successful dose-reduction strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, M.

    1995-03-01

    The increased importance of demonstrating achievement of the ALARA principle has helped produce a proliferation of dose-reduction ideas. Across a company there may be many dose-reduction items being pursued in a variety of areas. However, companies have limited resources; ensuring that funding is directed to those items which will produce the most benefit, and that all areas apply a common policy, therefore requires a dose-reduction strategy. Six steps were identified in formulating the dose-reduction strategy for Rolls-Royce and Associates (RRA): (1) collating the ideas; (2) quantitatively evaluating them on a common basis; (3) prioritizing the ideas in terms of cost benefit; (4) implementing the highest priority items; (5) monitoring their success; and (6) periodically reviewing the strategy. Inherent in producing the dose-reduction strategy have been a comprehensive dose database and the RRA-developed dose management computer code DOMAIN, which allows prediction of dose rates and doses. The database enabled high task dose items to be identified, assisted in evaluating dose benefits, and monitored dose trends once items had been implemented. The DOMAIN code was used in quantifying some of the project dose benefits, and its results, such as dose contours, were used in some of the dose-reduction items themselves. In all, over fifty dose-reduction items were evaluated in the strategy process, and the items which will give greatest benefit are being implemented. The strategy has been successful in giving renewed impetus and direction to dose-reduction management.
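
    A simple sketch of the cost-benefit prioritization in steps 2-4 above; the items, dose savings, and costs are invented for illustration and are not RRA data or DOMAIN output.

```python
# Hypothetical dose-reduction ideas: (name, collective dose averted in man-mSv/yr, cost in k$).
ideas = [
    ("extra shielding on valve gallery", 120.0, 80.0),
    ("remote inspection tooling",         60.0, 25.0),
    ("revised outage work sequencing",    45.0,  5.0),
    ("decontamination of hot spot",       30.0, 40.0),
]

# Evaluate on a common basis and rank by dose averted per unit cost.
ranked = sorted(ideas, key=lambda item: item[1] / item[2], reverse=True)
for name, dose, cost in ranked:
    print(f"{name:35s} {dose:6.1f} man-mSv/yr  {cost:5.0f} k$  {dose / cost:5.2f} per k$")
```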

  19. Simulations of the WFIRST Supernova Survey and Forecasts of Cosmological Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hounsell, R.; Scolnic, D.; Foley, R. J.

    The Wide Field InfraRed Survey Telescope (WFIRST) was the highest ranked large space-based mission of the 2010 New Worlds, New Horizons decadal survey. It is now a NASA mission in formulation with a planned launch in the mid-2020s. A primary mission objective is to precisely constrain the nature of dark energy through multiple probes, including Type Ia supernovae (SNe Ia). Here, we present the first realistic simulations of the WFIRST SN survey based on current hardware specifications and using open-source tools. We simulate SN light curves and spectra as viewed by the WFIRST wide-field channel (WFC) imager and integral field channel (IFC) spectrometer, respectively. We examine 11 survey strategies with different time allocations between the WFC and IFC, two of which are based upon the strategy described by the WFIRST Science Definition Team, which measures SN distances exclusively from IFC data. We propagate statistical and, crucially, systematic uncertainties to predict the dark energy task force figure of merit (DETF FoM) for each strategy. The increase in FoM values with SN search area is limited by the overhead times for each exposure. For IFC-focused strategies the largest individual systematic uncertainty is the wavelength-dependent calibration uncertainty, whereas for WFC-focused strategies, it is the intrinsic scatter uncertainty. We find that the best IFC-focused and WFC-exclusive strategies have comparable FoM values. Even without improvements to other cosmological probes, the WFIRST SN survey has the potential to increase the FoM by more than an order of magnitude from the current values. Although the survey strategies presented here have not been fully optimized, these initial investigations are an important step in the development of the final hardware design and implementation of the WFIRST mission.

  20. Assessing the co-benefits of greenhouse gas reduction: health benefits of particulate matter related inspection and maintenance programs in Bangkok, Thailand.

    PubMed

    Li, Ying; Crawford-Brown, Douglas J

    2011-04-15

    Since the 1990s, Bangkok, the capital city of Thailand, has been suffering from severe ambient particulate matter (PM) pollution, mainly attributable to its wide use of diesel-fueled vehicles and motorcycles with poor emission performance. While the Thai government strives to reduce emissions from transportation through enforcing policy measures, the link between specific control policies and associated health impacts is inadequately studied. This link is especially important in exploring the co-benefits of greenhouse gas emission reductions, which often bring reductions in other pollutants such as PM. This paper quantifies the health benefits potentially achieved by the new PM-related inspection and maintenance (I/M) programs targeting all diesel vehicles and motorcycles in the Bangkok Metropolitan Area (BMA). The benefits are estimated using a framework that integrates policy scenario development, exposure assessment, exposure-response assessment and economic valuation. The results indicate that the total health damage due to the year 2000 PM emissions from vehicles in the BMA was equivalent to 2.4% of Thailand's GDP. Under the business-as-usual (BAU) scenario, total vehicular PM emissions in the BMA will increase considerably over time due to the rapid growth in vehicle population, even if the fleet average emission rates are projected to decrease over time as a result of Thailand's participation in post-Copenhagen climate change strategies. By 2015, the total health damage is estimated to increase by 2.5 times relative to the year 2000. However, control policies targeting PM emissions from automobiles, such as the PM-oriented I/M programs, could yield substantial health benefits relative to the BAU scenario, and serve as co-benefits of greenhouse gas control strategies. Despite uncertainty associated with the key assumptions used to estimate benefits, we find with a high level of confidence that the I/M programs will produce health benefits whose economic value considerably outweighs the expenditures on policy implementation. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Evaluation of Contrail Reduction Strategies Based on Aircraft Flight Distances

    NASA Technical Reports Server (NTRS)

    Chen, Neil Y.; Sridhar, Banavar; Li, Jinhua; Ng, Hok Kwan

    2012-01-01

    This paper evaluates a set of contrail reduction strategies based on the flight range of aircraft as contrail reduction strategies have different impacts on aircraft depending on how they plan to fly. In general, aircraft with longer flight distances cruise at the altitudes where contrails are more likely to form. The concept of the contrail frequency index is used to quantify contrail impacts. The strategy for reducing the persistent contrail formation is to minimize the contrail frequency index by altering the aircraft's cruising altitude. A user-defined factor is used to trade off between contrail reduction and extra CO2 emissions. A higher value of tradeoff factor results in more contrail reduction and extra CO2 emissions. Results show that contrail reduction strategies using various tradeoff factors behave differently from short-range flights to long-range flights. Analysis shows that short-distance flights (less than 500 miles) are the most frequent flights but contribute least to contrail reduction. Therefore these aircraft have the lowest priority when applying contrail reduction strategies. Medium-distance flights (500 to 1000 miles) have a higher priority if the goal is to achieve maximum contrail reduction in total; long-distance flights (1000 to 1500 miles) have a higher priority if the goal is to achieve maximum contrail reduction per flight. The characteristics of transcontinental flights (greater than 1500 miles) vary with different weather days so the priority of applying contrail reduction strategies to the group needs to be evaluated based on the locations of the contrail areas during any given day. For the days tested, medium-distance flights contribute up to 42.6% of the reduction among the groups during a day. The contrail frequency index per 1,000 miles for medium-distance, long-distance, and transcontinental flights can be reduced by an average of 75%. The results provide a starting point for developing operational policies to reduce the impact of aviation on climate based on aircraft flight distances.
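
    To make the tradeoff concrete, the sketch below scores candidate cruise altitudes with a combined objective in which a user-defined factor weights the contrail frequency index against extra CO2 emissions, in the spirit of the strategy described above. The function names, candidate altitudes and index values are hypothetical illustrations, not the study's data or optimization code.

    ```python
    # Hedged sketch of the altitude-selection tradeoff: a larger tradeoff factor
    # weights contrail formation more heavily, giving more contrail reduction at
    # the price of extra CO2 from flying at an off-optimal cruise altitude.
    def combined_cost(contrail_index, extra_co2, tradeoff_factor):
        return tradeoff_factor * contrail_index + extra_co2

    def best_altitude(candidates, tradeoff_factor):
        """candidates: list of (altitude_ft, contrail_index, extra_co2) tuples."""
        return min(candidates, key=lambda c: combined_cost(c[1], c[2], tradeoff_factor))

    # Made-up candidate cruise altitudes for one flight.
    candidates = [(33000, 0.80, 0.0), (35000, 0.20, 1.5), (37000, 0.05, 3.0)]
    for factor in (0.0, 5.0, 20.0):
        alt, index, co2 = best_altitude(candidates, factor)
        print(f"tradeoff factor {factor:>4}: cruise at {alt} ft "
              f"(contrail index {index}, extra CO2 {co2})")
    ```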

  2. Interval type-2 fuzzy PID controller for uncertain nonlinear inverted pendulum system.

    PubMed

    El-Bardini, Mohammad; El-Nagar, Ahmad M

    2014-05-01

    In this paper, an interval type-2 fuzzy proportional-integral-derivative controller (IT2F-PID) is proposed for controlling an inverted pendulum on a cart system with an uncertain model. The proposed controller is designed using a new type-reduction method that we have proposed, called the simplified type-reduction method. The proposed IT2F-PID controller is able to handle the effect of structural uncertainties due to the structure of the interval type-2 fuzzy logic system (IT2-FLS). The results of the proposed IT2F-PID controller using the new type-reduction method are compared with those of an IT2F-PID controller using the uncertainty bound method and of a type-1 fuzzy PID controller (T1F-PID). The simulation and practical results show that the performance of the proposed controller is significantly improved compared with the T1F-PID controller. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Adaptive Control Based Harvesting Strategy for a Predator-Prey Dynamical System.

    PubMed

    Sen, Moitri; Simha, Ashutosh; Raha, Soumyendu

    2018-04-23

    This paper deals with designing a harvesting control strategy for a predator-prey dynamical system, with parametric uncertainties and exogenous disturbances. A feedback control law for the harvesting rate of the predator is formulated such that the population dynamics is asymptotically stabilized at a positive operating point, while maintaining a positive, steady state harvesting rate. The hierarchical block strict feedback structure of the dynamics is exploited in designing a backstepping control law, based on Lyapunov theory. In order to account for unknown parameters, an adaptive control strategy has been proposed in which the control law depends on an adaptive variable which tracks the unknown parameter. Further, a switching component has been incorporated to robustify the control performance against bounded disturbances. Proofs have been provided to show that the proposed adaptive control strategy ensures asymptotic stability of the dynamics at a desired operating point, as well as exact parameter learning in the disturbance-free case and learning with bounded error in the disturbance prone case. The dynamics, with uncertainty in the death rate of the predator, subjected to a bounded disturbance has been simulated with the proposed control strategy.

  4. Formal modeling of a system of chemical reactions under uncertainty.

    PubMed

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.

  5. The value of information for woodland management: Updating a state–transition model

    USGS Publications Warehouse

    Morris, William K.; Runge, Michael C.; Vesk, Peter A.

    2017-01-01

    Value of information (VOI) analyses reveal the expected benefit of reducing uncertainty to a decision maker. Most ecological VOI analyses have focused on population models rarely addressing more complex community models. We performed a VOI analysis for a complex state–transition model of Box-Ironbark Forest and Woodland management. With three management alternatives (limited harvest/firewood removal (HF), ecological thinning (ET), and no management), managing the system optimally (for 150 yr) with the original information would, on average, increase the amount of forest in a desirable state from 19% to 35% (a 16-percentage point increase). Resolving all uncertainty would, on average, increase the final percentage to 42% (a 19-percentage point increase). However, only resolving the uncertainty for a single parameter was worth almost two-thirds the value of resolving all uncertainty. We found the VOI to depend on the number of management options, increasing as the management flexibility increased. Our analyses show it is more cost-effective to monitor low-density regrowth forest than other states and more cost-effective to experiment with the no-management alternative than the other management alternatives. Importantly, the most cost-effective strategies did not include either the most desired forest states or the least understood management strategy, ET. This implies that managers cannot just rely on intuition to tell them where the most VOI will lie, as critical uncertainties in a complex system are sometimes cryptic.
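
    The core of such an analysis is the expected value of perfect information: the expected outcome if uncertainty were resolved before acting, minus the expected outcome of acting on current beliefs. The sketch below illustrates that calculation with a made-up payoff table whose row labels loosely mirror the three management alternatives; none of the numbers are taken from the study.

    ```python
    import numpy as np

    # Illustrative EVPI calculation for a value-of-information analysis.
    # Rows: management alternatives; columns: equally likely states of the system.
    # All payoffs (e.g., % of forest in a desirable state) are hypothetical.
    payoff = np.array([
        [35.0, 20.0, 28.0],   # limited harvest/firewood removal
        [30.0, 33.0, 25.0],   # ecological thinning
        [19.0, 19.0, 19.0],   # no management
    ])
    p = np.array([1 / 3, 1 / 3, 1 / 3])   # prior probability of each state

    value_under_uncertainty = (payoff @ p).max()              # act on current beliefs
    value_with_perfect_info = (payoff.max(axis=0) * p).sum()  # learn the state first
    evpi = value_with_perfect_info - value_under_uncertainty

    print(f"Act now: {value_under_uncertainty:.2f}  "
          f"With perfect info: {value_with_perfect_info:.2f}  EVPI: {evpi:.2f}")
    ```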

  6. Land Resources Allocation Strategies in an Urban Area Involving Uncertainty: A Case Study of Suzhou, in the Yangtze River Delta of China

    NASA Astrophysics Data System (ADS)

    Lu, Shasha; Guan, Xingliang; Zhou, Min; Wang, Yang

    2014-05-01

    A large number of mathematical models have been developed to support land resource allocation decisions and land management needs; however, few of them can address various uncertainties that exist in relation to many factors presented in such decisions (e.g., land resource availabilities, land demands, land-use patterns, and social demands, as well as ecological requirements). In this study, a multi-objective interval-stochastic land resource allocation model (MOISLAM) was developed for tackling uncertainty that presents as discrete intervals and/or probability distributions. The developed model improves upon the existing multi-objective programming and inexact optimization approaches. The MOISLAM not only considers economic factors, but also involves food security and eco-environmental constraints; it can, therefore, effectively reflect various interrelations among different aspects in a land resource management system. Moreover, the model can also help examine the reliability of satisfying (or the risk of violating) system constraints under uncertainty. In this study, the MOISLAM was applied to a real case of long-term urban land resource allocation planning in Suzhou, in the Yangtze River Delta of China. Interval solutions associated with different risk levels of constraint violation were obtained. The results are considered useful for generating a range of decision alternatives under various system conditions, and thus helping decision makers to identify a desirable land resource allocation strategy under uncertainty.

  7. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  8. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  9. Learning in Noise: Dynamic Decision-Making in a Variable Environment

    PubMed Central

    Gureckis, Todd M.; Love, Bradley C.

    2009-01-01

    In engineering systems, noise is a curse, obscuring important signals and increasing the uncertainty associated with measurement. However, the negative effects of noise and uncertainty are not universal. In this paper, we examine how people learn sequential control strategies given different sources and amounts of feedback variability. In particular, we consider people’s behavior in a task where short- and long-term rewards are placed in conflict (i.e., the best option in the short-term is worst in the long-term). Consistent with a model based on reinforcement learning principles (Gureckis & Love, in press), we find that learners differentially weight information predictive of the current task state. In particular, when cues that signal state are noisy and uncertain, we find that participants’ ability to identify an optimal strategy is strongly impaired relative to equivalent amounts of uncertainty that obscure the rewards/valuations of those states. In other situations, we find that noise and uncertainty in reward signals may paradoxically improve performance by encouraging exploration. Our results demonstrate how experimentally-manipulated task variability can be used to test predictions about the mechanisms that learners engage in dynamic decision making tasks. PMID:20161328

  10. Multilevel UQ strategies for large-scale multiphysics applications: PSAAP II solar receiver

    NASA Astrophysics Data System (ADS)

    Jofre, Lluis; Geraci, Gianluca; Iaccarino, Gianluca

    2017-06-01

    Uncertainty quantification (UQ) plays a fundamental part in building confidence in predictive science. Of particular interest is the case of modeling and simulating engineering applications where, due to the inherent complexity, many uncertainties naturally arise, e.g. domain geometry, operating conditions, errors induced by modeling assumptions, etc. In this regard, one of the pacing items, especially in high-fidelity computational fluid dynamics (CFD) simulations, is the large amount of computing resources typically required to propagate uncertainty through the models. Upcoming exascale supercomputers will significantly increase the available computational power. However, UQ approaches cannot rely solely on brute-force Monte Carlo (MC) sampling; the large number of uncertainty sources and the presence of nonlinearities in the solution would make straightforward MC analysis unaffordable. Therefore, this work explores the multilevel MC strategy, and its extension to multi-fidelity and time convergence, to accelerate the estimation of the effect of uncertainties. The approach is described in detail, and its performance demonstrated on a radiated turbulent particle-laden flow case relevant to solar energy receivers (PSAAP II: Particle-laden turbulence in a radiation environment). Investigation funded by DoE's NNSA under PSAAP II.
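
    As a rough illustration of the multilevel MC idea, the estimator writes the expectation at the finest level as the coarsest-level expectation plus a telescoping sum of level differences, so most samples can be spent on cheap coarse levels. The toy "simulator" and sample allocations below are invented for illustration and stand in for the expensive CFD solves.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy quantity of interest at resolution "level" for random input x.
    # Higher levels have smaller discretization bias; in a real application they
    # would also be far more expensive, which is what MLMC exploits.
    def model(x, level):
        bias = 0.5 ** (level + 1)
        return np.sin(x) + bias * np.cos(3 * x)

    def mlmc_estimate(levels=3, n_samples=(4000, 1000, 250)):
        """Multilevel MC: E[P_L] ~ E[P_0] + sum_l E[P_l - P_(l-1)]."""
        estimate = 0.0
        for l in range(levels):
            x = rng.normal(size=n_samples[l])           # shared inputs per level pair
            fine = model(x, l)
            coarse = model(x, l - 1) if l > 0 else 0.0
            estimate += np.mean(fine - coarse)
        return estimate

    print("MLMC estimate:              ", mlmc_estimate())
    print("Plain MC on finest level only:", np.mean(model(rng.normal(size=250), 2)))
    ```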

  11. Finding optimal vaccination strategies under parameter uncertainty using stochastic programming.

    PubMed

    Tanner, Matthew W; Sattenspiel, Lisa; Ntaimo, Lewis

    2008-10-01

    We present a stochastic programming framework for finding the optimal vaccination policy for controlling infectious disease epidemics under parameter uncertainty. Stochastic programming is a popular framework for including the effects of parameter uncertainty in a mathematical optimization model. The problem is initially formulated to find the minimum cost vaccination policy under a chance-constraint. The chance-constraint requires that the probability that R(*)

  12. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    NASA Astrophysics Data System (ADS)

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs the stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, by using Stochastic Finite Element Method (SFEM) and model reduction technique. These uncertainties are assumed having Gaussian distributions. The random mass density and elastic modulus are represented by using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom to reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while the accuracy is kept. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
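
    The sketch below illustrates the Karhunen-Loève step mentioned above: a correlated Gaussian random field (here standing in for the elastic modulus along the riser) is represented by a truncated expansion in the eigenpairs of its covariance, driven by independent standard normal variables. The covariance kernel, correlation length and material statistics are assumed values for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Truncated Karhunen-Loeve expansion of a 1-D Gaussian random field.
    # Exponential covariance kernel and all numerical values are placeholders.
    n, length, corr_len = 200, 100.0, 20.0
    z = np.linspace(0.0, length, n)
    cov = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)

    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]                 # largest eigenvalues first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    m = 10                                            # retained KL modes
    mean_E, cv = 210e9, 0.05                          # assumed mean modulus (Pa), CoV
    xi = rng.standard_normal(m)                       # independent standard normals
    field = mean_E * (1.0 + cv * (eigvecs[:, :m] * np.sqrt(eigvals[:m])) @ xi)

    print("Retained variance fraction:", eigvals[:m].sum() / eigvals.sum())
    print("Sampled modulus range (GPa):", field.min() / 1e9, field.max() / 1e9)
    ```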

  13. Modeling and sliding mode predictive control of the ultra-supercritical boiler-turbine system with uncertainties and input constraints.

    PubMed

    Tian, Zhen; Yuan, Jingqi; Zhang, Xiang; Kong, Lei; Wang, Jingcheng

    2018-05-01

    The coordinated control system (CCS) plays an important role in load regulation, efficiency optimization and pollutant reduction for coal-fired power plants. The CCS faces tough challenges, such as wide-range load variation and various uncertainties and constraints. This paper aims to improve the load tracking ability and robustness of boiler-turbine units under wide-range operation. To capture the key dynamics of the ultra-supercritical boiler-turbine system, a nonlinear control-oriented model is developed based on mechanism analysis and model reduction techniques, and validated against historical operating data of a real 1000 MW unit. To simultaneously address the issues of uncertainties and input constraints, a discrete-time sliding mode predictive controller (SMPC) is designed with a dual-mode control law. Moreover, the input-to-state stability and robustness of the closed-loop system are proved. Simulation results are presented to illustrate the effectiveness of the proposed control scheme, which achieves good tracking performance, disturbance rejection ability and compatibility with input constraints. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    PubMed

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.

  15. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty

    PubMed Central

    2017-01-01

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. PMID:28626019

  16. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn; Forsgren, Anders

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.

  17. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    PubMed

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted using probabilistic techniques from a national perspective. Published monitoring data on PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties in the risk estimates. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, deserve closer attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
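
    A minimal sketch of the probabilistic mechanics behind such an assessment is given below: a lifetime average daily dose is sampled by Monte Carlo from assumed exposure distributions and multiplied by a cancer slope factor, and the fraction of simulated risks above 1.00E-05 is reported. All distributions and parameter values are placeholders, not the monitoring data or exposure factors used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000

    # Hypothetical exposure distributions for a drinking-water ingestion pathway.
    conc = rng.lognormal(mean=np.log(5e-6), sigma=0.8, size=n)   # BaP-eq, mg/L
    intake = rng.normal(2.0, 0.3, size=n).clip(min=0.5)          # L/day
    body_weight = rng.normal(60.0, 10.0, size=n).clip(min=30)    # kg
    exposure_freq, exposure_dur, avg_time = 365, 30, 70 * 365    # d/yr, yr, d
    slope_factor = 7.3                                           # (mg/kg-day)^-1, BaP

    dose = conc * intake * exposure_freq * exposure_dur / (body_weight * avg_time)
    risk = dose * slope_factor                                   # lifetime cancer risk

    print("Median risk: %.2e" % np.median(risk))
    print("P(risk > 1e-5): %.1f%%" % (100 * np.mean(risk > 1e-5)))
    ```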

  18. Climate change on the Colorado River: a method to search for robust management strategies

    NASA Astrophysics Data System (ADS)

    Keefe, R.; Fischbach, J. R.

    2010-12-01

    The Colorado River is a principal source of water for the seven Basin States, providing approximately 16.5 maf per year to users in the southwestern United States and Mexico. Though the dynamics of the river ensure Upper Basin users a reliable supply of water, the three Lower Basin states (California, Nevada, and Arizona) are in danger of delivery interruptions as Upper Basin demand increases and climate change threatens to reduce future streamflows. In light of the recent drought and uncertain effects of climate change on Colorado River flows, we evaluate the performance of a suite of policies modeled after the shortage sharing agreement adopted in December 2007 by the Department of the Interior. We build on the current literature by using a simplified model of the Lower Colorado River to consider future streamflow scenarios given climate change uncertainty. We also generate different scenarios of parametric consumptive use growth in the Upper Basin and evaluate alternate management strategies in light of these uncertainties. Uncertainty associated with climate change is represented with a multi-model ensemble from the literature, using a nearest neighbor perturbation to increase the size of the ensemble. We use Robust Decision Making to compare near-term or long-term management strategies across an ensemble of plausible future scenarios with the goal of identifying one or more approaches that are robust to alternate assumptions about the future. This method entails using search algorithms to quantitatively identify vulnerabilities that may threaten a given strategy (including the current operating policy) and characterize key tradeoffs between strategies under different scenarios.

  19. Strategies to Prevent Cholera Introduction during International Personnel Deployments: A Computational Modeling Analysis Based on the 2010 Haiti Outbreak

    PubMed Central

    Lewnard, Joseph A.; Antillón, Marina; Gonsalves, Gregg; Miller, Alice M.; Ko, Albert I.; Pitzer, Virginia E.

    2016-01-01

    Background Introduction of Vibrio cholerae to Haiti during the deployment of United Nations (UN) peacekeepers in 2010 resulted in one of the largest cholera epidemics of the modern era. Following the outbreak, a UN-commissioned independent panel recommended three pre-deployment intervention strategies to minimize the risk of cholera introduction in future peacekeeping operations: screening for V. cholerae carriage, administering prophylactic antimicrobial chemotherapies, or immunizing with oral cholera vaccines. However, uncertainty regarding the effectiveness of these approaches has forestalled their implementation by the UN. We assessed how the interventions would have impacted the likelihood of the Haiti cholera epidemic. Methods and Findings We developed a stochastic model for cholera importation and transmission, fitted to reported cases during the first weeks of the 2010 outbreak in Haiti. Using this model, we estimated that diagnostic screening reduces the probability of cases occurring by 82% (95% credible interval: 75%, 85%); however, false-positive test outcomes may hamper this approach. Antimicrobial chemoprophylaxis at time of departure and oral cholera vaccination reduce the probability of cases by 50% (41%, 57%) and by up to 61% (58%, 63%), respectively. Chemoprophylaxis beginning 1 wk before departure confers a 91% (78%, 96%) reduction independently, and up to a 98% reduction (94%, 99%) if coupled with vaccination. These results are not sensitive to assumptions about the background cholera incidence rate in the endemic troop-sending country. Further research is needed to (1) validate the sensitivity and specificity of rapid test approaches for detecting asymptomatic carriage, (2) compare prophylactic efficacy across antimicrobial regimens, and (3) quantify the impact of oral cholera vaccine on transmission from asymptomatic carriers. Conclusions Screening, chemoprophylaxis, and vaccination are all effective strategies to prevent cholera introduction during large-scale personnel deployments such as that precipitating the 2010 Haiti outbreak. Antimicrobial chemoprophylaxis was estimated to provide the greatest protection at the lowest cost among the approaches recently evaluated by the UN. PMID:26812236

  20. In-situ Measurements of Ozone Production Rates and Comparisons to Model-derived Production Rates During the Houston, TX and Denver, CO DISCOVER-AQ Campaigns

    NASA Astrophysics Data System (ADS)

    Baier, B. C.; Brune, W. H.; Miller, D. O.; Lefer, B. L.

    2015-12-01

    Tropospheric ozone (O3) is a secondary pollutant that has harmful effects on human and plant life. The climate and urban emissions in Houston, TX and Denver, CO can be conducive to significant ozone production and thus to high ozone events. Tighter government strategies for ozone mitigation have been proposed, which involve reducing the current EPA eight-hour ozone standard from 75 ppb to 65-70 ppb. These strategies rely on the reduction of ozone precursors in order to decrease the ozone production rate, P(O3). Changes in the ozone concentration at a given location depend on P(O3), so decreasing P(O3) can decrease ozone levels provided that the ozone has not been transported from other areas. Air quality models test reduction strategies before they are implemented, locate ozone sources, and predict ozone episodes. Traditionally, P(O3) has been calculated by models. However, large uncertainties in model emissions inventories, chemical mechanisms, and meteorology can reduce confidence in this approach. A new instrument, the Measurement of Ozone Production Sensor (MOPS), directly measures P(O3) and can provide an alternate approach to determining P(O3). An updated version of the Penn State MOPS (MOPSv2.0) was deployed to Houston, TX and Denver, CO as part of NASA's DISCOVER-AQ field campaign in the summers of 2013 and 2014, respectively. We present MOPS directly-measured P(O3) rates from these areas, as well as comparisons to zero-dimensional and three-dimensional modeled P(O3) using the RACM2 and MCMv2.2 mechanisms. These comparisons demonstrate the potential of the MOPS to test and evaluate model-derived P(O3), to advance the understanding of model chemical mechanisms, and to improve predictions of high ozone events.

  1. A contribution to the calculation of measurement uncertainty and optimization of measuring strategies in coordinate measurement

    NASA Astrophysics Data System (ADS)

    Waeldele, F.

    1983-01-01

    The influence of sample shape deviations on the measurement uncertainties and the optimization of computer aided coordinate measurement were investigated for a circle and a cylinder. Using the complete error propagation law in matrix form the parameter uncertainties are calculated, taking the correlation between the measurement points into account. Theoretical investigations show that the measuring points have to be equidistantly distributed and that for a cylindrical body a measuring point distribution along a cross section is better than along a helical line. The theoretically obtained expressions to calculate the uncertainties prove to be a good estimation basis. The simple error theory is not satisfactory for estimation. The complete statistical data analysis theory helps to avoid aggravating measurement errors and to adjust the number of measuring points to the required measuring uncertainty.
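
    As a simple illustration of how measurement-point noise propagates into fitted parameters in coordinate measurement, the sketch below fits a circle to equidistant, noisy points by linear least squares and estimates the covariance of the fitted parameters from the residuals. The point count, radius and noise level are made up, and the algebraic (Kåsa-style) fit is only one possible fitting choice, not the procedure used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Equidistant measurement points on a circle, perturbed by assumed probe noise.
    n_points, radius, noise = 24, 10.0, 0.02
    theta = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    x = radius * np.cos(theta) + rng.normal(0, noise, n_points)
    y = radius * np.sin(theta) + rng.normal(0, noise, n_points)

    # Linearized circle model: x^2 + y^2 = 2a*x + 2b*y + c, with r^2 = c + a^2 + b^2.
    A = np.column_stack([2 * x, 2 * y, np.ones(n_points)])
    d = x**2 + y**2
    p, residuals, *_ = np.linalg.lstsq(A, d, rcond=None)
    a, b, c = p
    r = np.sqrt(c + a**2 + b**2)

    sigma2 = residuals[0] / (n_points - 3)        # residual variance estimate
    cov_p = sigma2 * np.linalg.inv(A.T @ A)       # covariance of (a, b, c)

    print(f"centre = ({a:.4f}, {b:.4f}), radius = {r:.4f}")
    print("std dev of centre coordinates:", np.sqrt(np.diag(cov_p))[:2])
    ```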

  2. New analysis strategies for micro aspheric lens metrology

    NASA Astrophysics Data System (ADS)

    Gugsa, Solomon Abebe

    Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, with the convex geometry a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out the surface geometry that departs from an exact conic surface and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely, scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of the base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface, or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, which in this case is an uncertainty for each Zernike coefficient. Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and to compare the individual contributions to the overall uncertainty.

  3. Calibration-induced uncertainty of the EPIC model to estimate climate change impact on global maize yield

    NASA Astrophysics Data System (ADS)

    Xiong, Wei; Skalský, Rastislav; Porter, Cheryl H.; Balkovič, Juraj; Jones, James W.; Yang, Di

    2016-09-01

    Understanding the interactions between agricultural production and climate is necessary for sound decision-making in climate policy. Gridded, high-resolution crop simulation has emerged as a useful tool for building this understanding. Large uncertainties exist in this application, limiting its capacity as a tool to devise adaptation strategies. Increasing attention has been given to uncertainties arising from climate scenarios, input data, and model structure, but uncertainties due to model parameters or calibration remain largely unquantified. Here, we use publicly available geographical data sets as input to the Environmental Policy Integrated Climate model (EPIC) for simulating global-gridded maize yield. Impacts of climate change are assessed up to the year 2099 under a climate scenario generated by HadGEM2-ES under RCP 8.5. We apply five calibration strategies, shifting one specific parameter in each simulation, to understand the effects of calibration. Regionalizing crop phenology or harvest index appears effective for calibrating the model globally, but using different phenology values generates pronounced differences in the estimated climate impact. However, projected impacts of climate change on global maize production are consistently negative regardless of the parameter being adjusted. Different model parameter values result in modest uncertainty at the global level, with differences in the global yield change of less than 30% by the 2080s. This uncertainty decreases if model calibration or input data quality control is applied. Calibration has a larger effect at local scales, pointing to the possible types and locations for adaptation.

  4. A Multifaceted Intervention to Improve the Quality of Care of Children in District Hospitals in Kenya: A Cost-Effectiveness Analysis

    PubMed Central

    Barasa, Edwine W.; Ayieko, Philip; Cleary, Susan; English, Mike

    2012-01-01

    Background To improve care for children in district hospitals in Kenya, a multifaceted approach employing guidelines, training, supervision, feedback, and facilitation was developed, for brevity called the Emergency Triage and Treatment Plus (ETAT+) strategy. We assessed the cost effectiveness of the ETAT+ strategy, in Kenyan hospitals. Further, we estimate the costs of scaling up the intervention to Kenya nationally and potential cost effectiveness at scale. Methods and Findings Our cost-effectiveness analysis from the provider's perspective used data from a previously reported cluster randomized trial comparing the full ETAT+ strategy (n = 4 hospitals) with a partial intervention (n = 4 hospitals). Effectiveness was measured using 14 process measures that capture improvements in quality of care; their average was used as a summary measure of quality. Economic costs of the development and implementation of the intervention were determined (2009 US$). Incremental cost-effectiveness ratios were defined as the incremental cost per percentage improvement in (average) quality of care. Probabilistic sensitivity analysis was used to assess uncertainty. The cost per child admission was US$50.74 (95% CI 49.26–67.06) in intervention hospitals compared to US$31.1 (95% CI 30.67–47.18) in control hospitals. Each percentage improvement in average quality of care cost an additional US$0.79 (95% CI 0.19–2.31) per admitted child. The estimated annual cost of nationally scaling up the full intervention was US$3.6 million, approximately 0.6% of the annual child health budget in Kenya. A “what-if” analysis assuming conservative reductions in mortality suggests the incremental cost per disability adjusted life year (DALY) averted by scaling up would vary between US$39.8 and US$398.3. Conclusion Improving quality of care at scale nationally with the full ETAT+ strategy may be affordable for low income countries such as Kenya. Resultant plausible reductions in hospital mortality suggest the intervention could be cost-effective when compared to incremental cost-effectiveness ratios of other priority child health interventions. Please see later in the article for the Editors' Summary PMID:22719233

  5. A multifaceted intervention to improve the quality of care of children in district hospitals in Kenya: a cost-effectiveness analysis.

    PubMed

    Barasa, Edwine W; Ayieko, Philip; Cleary, Susan; English, Mike

    2012-01-01

    To improve care for children in district hospitals in Kenya, a multifaceted approach employing guidelines, training, supervision, feedback, and facilitation was developed, for brevity called the Emergency Triage and Treatment Plus (ETAT+) strategy. We assessed the cost effectiveness of the ETAT+ strategy, in Kenyan hospitals. Further, we estimate the costs of scaling up the intervention to Kenya nationally and potential cost effectiveness at scale. Our cost-effectiveness analysis from the provider's perspective used data from a previously reported cluster randomized trial comparing the full ETAT+ strategy (n = 4 hospitals) with a partial intervention (n = 4 hospitals). Effectiveness was measured using 14 process measures that capture improvements in quality of care; their average was used as a summary measure of quality. Economic costs of the development and implementation of the intervention were determined (2009 US$). Incremental cost-effectiveness ratios were defined as the incremental cost per percentage improvement in (average) quality of care. Probabilistic sensitivity analysis was used to assess uncertainty. The cost per child admission was US$50.74 (95% CI 49.26-67.06) in intervention hospitals compared to US$31.1 (95% CI 30.67-47.18) in control hospitals. Each percentage improvement in average quality of care cost an additional US$0.79 (95% CI 0.19-2.31) per admitted child. The estimated annual cost of nationally scaling up the full intervention was US$3.6 million, approximately 0.6% of the annual child health budget in Kenya. A "what-if" analysis assuming conservative reductions in mortality suggests the incremental cost per disability adjusted life year (DALY) averted by scaling up would vary between US$39.8 and US$398.3. Improving quality of care at scale nationally with the full ETAT+ strategy may be affordable for low income countries such as Kenya. Resultant plausible reductions in hospital mortality suggest the intervention could be cost-effective when compared to incremental cost-effectiveness ratios of other priority child health interventions.
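
    The incremental cost-effectiveness ratio reported above is simply the difference in cost per admission divided by the difference in effectiveness. The sketch below reproduces that arithmetic using the point estimates from the abstract; the quality-of-care gain is back-calculated (about 25 percentage points) purely to make the example self-consistent and is an assumption, not a reported figure.

    ```python
    # ICER sketch using the point estimates quoted in the abstract. The quality
    # gain is back-calculated from the reported US$0.79 per percentage point and
    # is an illustrative assumption, not a value reported by the study.
    cost_full_etat = 50.74        # US$ per admitted child, intervention hospitals
    cost_partial = 31.10          # US$ per admitted child, control hospitals
    quality_gain_points = 24.9    # assumed improvement in average quality of care (%)

    icer = (cost_full_etat - cost_partial) / quality_gain_points
    print(f"Incremental cost per percentage-point quality improvement: US${icer:.2f}")
    ```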

  6. Identifying external nutrient reduction requirements and potential in the hypereutrophic Lake Taihu Basin, China.

    PubMed

    Peng, Jiao-Ting; Zhu, Xiao-Dong; Sun, Xiang; Song, Xiao-Wei

    2018-04-01

    Reducing external nutrient loads is the first step in controlling eutrophication. Here, we identified the external nutrient reduction requirements, and the reduction potential of available strategies, for remediating a eutrophic water body, Lake Taihu, China. A whole-lake mass balance approach was used to identify nutrient reduction requirements; an empirical export coefficient approach was introduced to estimate the nutrient reduction potential of the overall program on integrated regulation of Taihu Lake Basin (hereafter referred to as the "Guideline"). The analysis indicates that external total nitrogen (TN) and total phosphorus (TP) loads should be reduced by 41-55% and 25-50%, respectively, to prevent nutrient accumulation in Lake Taihu and to meet the planned water quality targets. For 2010, the most seriously polluted calendar year of the 2008-2014 period, the nutrient reduction requirements were estimated at 36,819 tons of N and 2442 tons of P, while the potential nutrient reduction strategies would remove approximately 25,821 tons of N and 3024 tons of P. Since an unmet N reduction requirement remains, nitrogen should be the focus and deserves more attention when identifying external nutrient reduction strategies. Moreover, abatement measures outlined in the Guideline with high P reduction potential required large monetary investments. Achieving the TP reduction requirement using the most cost-effective strategy would cost about 80.24 million USD. The design of nutrient reduction strategies should account for regional and sectoral differences and the cost-effectiveness of abatement measures.
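
    The export coefficient idea used to estimate reduction potential can be illustrated with a few lines of arithmetic: each source category contributes its export coefficient times its extent, and a measure's reduction potential is the load removed when its coefficient is cut. The categories, coefficients and cuts below are hypothetical and are not the Guideline's actual figures.

    ```python
    # Hedged sketch of an export-coefficient load and reduction-potential estimate.
    # category:              (extent, TN export coefficient, fractional cut by measures)
    sources = {
        "paddy field (km2)":     (1200.0,  9.0, 0.20),
        "dryland crops (km2)":   (800.0,  11.0, 0.30),
        "urban land (km2)":      (500.0,  13.5, 0.40),
        "livestock (1000 head)": (300.0,   4.2, 0.50),
    }

    baseline = sum(extent * coeff for extent, coeff, _ in sources.values())
    reduced = sum(extent * coeff * (1 - cut) for extent, coeff, cut in sources.values())

    print(f"Baseline TN load:    {baseline:,.0f} t/yr")
    print(f"Reduction potential: {baseline - reduced:,.0f} t/yr "
          f"({(baseline - reduced) / baseline:.0%} of baseline)")
    ```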

  7. Economic evaluation of neonatal care packages in a cluster-randomized controlled trial in Sylhet, Bangladesh

    PubMed Central

    Shillcutt, Samuel D; Waters, Hugh R; Haider, Sabbir; El Arifeen, Shams; Mannan, Ishtiaq; Seraji, Habibur R; Shah, Rasheduzzaman; Darmstadt, Gary L; Wall, Steve N; Williams, Emma K; Black, Robert E; Santosham, Mathuram; Baqui, Abdullah H

    2013-01-01

    Abstract Objective To evaluate and compare the cost-effectiveness of two strategies for neonatal care in Sylhet division, Bangladesh. Methods In a cluster-randomized controlled trial, two strategies for neonatal care – known as home care and community care – were compared with existing services. For each study arm, economic costs were estimated from a societal perspective, inclusive of programme costs, provider costs and household out-of-pocket payments on care-seeking. Neonatal mortality in each study arm was determined through household surveys. The incremental cost-effectiveness of each strategy – compared with that of the pre-existing levels of maternal and neonatal care – was then estimated. The levels of uncertainty in our estimates were quantified through probabilistic sensitivity analysis. Findings The incremental programme costs of implementing the home-care package were 2939 (95% confidence interval, CI: 1833–7616) United States dollars (US$) per neonatal death averted and US$ 103.49 (95% CI: 64.72–265.93) per disability-adjusted life year (DALY) averted. The corresponding total societal costs were US$ 2971 (95% CI: 1844–7628) and US$ 104.62 (95% CI: 65.15–266.60), respectively. The home-care package was cost-effective – with 95% certainty – if healthy life years were valued above US$ 214 per DALY averted. In contrast, implementation of the community-care strategy led to no reduction in neonatal mortality and did not appear to be cost-effective. Conclusion The home-care package represents a highly cost-effective intervention strategy that should be considered for replication and scale-up in Bangladesh and similar settings elsewhere. PMID:24115797

  8. A decision tree model for the implementation of a safety strategy in the horse-racing industry.

    PubMed

    Hitchens, Peta L; Curry, Beverley; Blizzard, C Leigh; Palmer, Andrew J

    2015-04-01

    The profession of a horse-racing jockey is a dangerous one. We developed a decision tree model quantifying the effects of implementing different safety strategies on jockey fall and injury rates and their associated costs. Data on race-day falls were obtained from stewards' reports from August 2002 to July 2009. Insurance claim data were provided by Principal Racing Authorities and workers' compensation authorities in each jurisdiction. Fall and claim incidence data were used as baseline rates. The model considered (1) the status quo, in which policy was unchanged, and (2) four hypothetical changes in policy that restricted apprentice jockeys from riding less-accomplished horses, with the aim of improving safety by reducing the incidence of injurious jockey falls. Second-order Monte Carlo simulations were conducted to account for uncertainties. The point estimate for the mean cost of falls under the status quo was $30.73/ride, with falls by apprentice jockeys with <250 career race rides riding horses with fewer than five race starts contributing the highest costs ($98.49/ride). The hypothetical safety strategies resulted in a 1.04%-5.07% decrease in fall rates versus the status quo. For three of the four strategies, significant reductions of 8.74%-13.13% in workers' compensation costs over a single race season were predicted. Costs were highly sensitive to large claims. This model is a useful instrument for comparing potential changes in cost and risk associated with implementing new safety strategies in the horse-racing industry. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    NASA Astrophysics Data System (ADS)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
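
    A bare-bones sketch of the reliability-ensemble-averaging idea is given below: each model's weight grows as its present-day NPP gets closer to the observational estimate, and the weighted mean and spread of the projected changes are compared with the unweighted ones. The synthetic ensemble assumes a link between present-day bias and projected change so that the weighting visibly narrows the spread; none of the numbers are ISIMIP or observational values, and the weighting scheme is only one simple variant of REA.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic multi-model ensemble (all values invented for illustration).
    n_models = 30
    obs_npp = 55.0                                          # "observed" present-day NPP
    present = obs_npp + rng.normal(0, 8, n_models)          # modelled present-day NPP
    future_change = 15 + 0.8 * (present - obs_npp) + rng.normal(0, 3, n_models)

    # Skill-based weights: larger present-day bias -> smaller weight.
    eps = 1.0                                               # floor to avoid division by zero
    weights = 1.0 / np.maximum(np.abs(present - obs_npp), eps)
    weights /= weights.sum()

    unweighted_mean = future_change.mean()
    unweighted_sd = future_change.std(ddof=1)
    rea_mean = np.sum(weights * future_change)
    rea_sd = np.sqrt(np.sum(weights * (future_change - rea_mean) ** 2))

    print(f"Unweighted projection: {unweighted_mean:.1f} +/- {unweighted_sd:.1f}")
    print(f"REA projection:        {rea_mean:.1f} +/- {rea_sd:.1f}")
    ```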

  10. Designing optimal greenhouse gas monitoring networks for Australia

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Law, R. M.; Rayner, P. J.; Roff, G.

    2016-01-01

    Atmospheric transport inversion is commonly used to infer greenhouse gas (GHG) flux estimates from concentration measurements. The optimal location of ground-based observing stations that supply these measurements can be determined by network design. Here, we use a Lagrangian particle dispersion model (LPDM) in reverse mode together with a Bayesian inverse modelling framework to derive optimal GHG observing networks for Australia. This extends the network design for carbon dioxide (CO2) performed by Ziehn et al. (2014) to also minimise the uncertainty on the flux estimates for methane (CH4) and nitrous oxide (N2O), both individually and in a combined network using multiple objectives. Optimal networks are generated by adding up to five new stations to the base network, which is defined as two existing stations, Cape Grim and Gunn Point, in southern and northern Australia respectively. The individual networks for CO2, CH4 and N2O and the combined observing network show large similarities because the flux uncertainties for each GHG are dominated by regions of biologically productive land. There is little penalty, in terms of flux uncertainty reduction, for the combined network compared to individually designed networks. The location of the stations in the combined network is sensitive to variations in the assumed data uncertainty across locations. A simple assessment of economic costs has been included in our network design approach, considering both establishment and maintenance costs. Our results suggest that, while site logistics change the optimal network, there is only a small impact on the flux uncertainty reductions achieved with increasing network size.
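
    The sketch below illustrates the Bayesian machinery behind such a network design: with a linear observation operator, the posterior flux error covariance follows from the prior covariance and the station "footprints", and candidate stations can be added greedily to maximize the uncertainty reduction. The footprints, error settings and station counts are random placeholders rather than LPDM output, and the greedy search is only one of several possible design strategies.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Toy inversion setup: prior flux errors, candidate-station footprints, obs error.
    n_fluxes, n_candidates, obs_per_site = 10, 6, 50
    B = np.eye(n_fluxes)                                   # prior flux error covariance
    footprints = [rng.random((obs_per_site, n_fluxes)) for _ in range(n_candidates)]
    obs_err = 0.5                                          # observation error (same units)

    def posterior_cov(selected):
        H = (np.vstack([footprints[i] for i in selected])
             if selected else np.zeros((0, n_fluxes)))
        precision = np.linalg.inv(B) + H.T @ H / obs_err**2
        return np.linalg.inv(precision)

    def uncertainty_reduction(selected):
        return 1.0 - np.sqrt(np.trace(posterior_cov(selected)) / np.trace(B))

    # Greedy design: add the candidate station giving the largest incremental gain.
    chosen = []
    for _ in range(3):
        best = max((i for i in range(n_candidates) if i not in chosen),
                   key=lambda i: uncertainty_reduction(chosen + [i]))
        chosen.append(best)
        print(f"add station {best}: total uncertainty reduction = "
              f"{uncertainty_reduction(chosen):.1%}")
    ```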

  11. Fuels planning: science synthesis and integration; economic uses fact sheet 09: Mechanical treatment costs

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2005-01-01

    Although fuel reduction treatments are widespread, there is great variability and uncertainty in the cost of conducting treatments. Researchers from the Rocky Mountain Research Station, USDA Forest Service, have developed a model for estimating the per-acre cost for mechanical fuel reduction treatments. Although these models do a good job of identifying factors that...

  12. Living with uncertainty and hope: A qualitative study exploring parents' experiences of living with childhood multiple sclerosis.

    PubMed

    Hinton, Denise; Kirk, Susan

    2017-06-01

    Background There is growing recognition that multiple sclerosis is a possible, albeit uncommon, diagnosis in childhood. However, very little is known about the experiences of families living with childhood multiple sclerosis and this is the first study to explore this in depth. Objective Our objective was to explore the experiences of parents of children with multiple sclerosis. Methods Qualitative in-depth interviews were conducted with 31 parents using a grounded theory approach. Parents were sampled and recruited via health service and voluntary sector organisations in the United Kingdom. Results Parents' accounts of life with childhood multiple sclerosis were dominated by feelings of uncertainty associated with four sources: diagnostic uncertainty, daily uncertainty, interaction uncertainty and future uncertainty. Parents attempted to manage these uncertainties using specific strategies, which could in turn create further uncertainties about their child's illness. However, over time, ongoing uncertainty appeared to give parents hope for their child's future with multiple sclerosis. Conclusion Illness-related uncertainties appear to play a role in generating hope among parents of a child with multiple sclerosis. However, this may lead parents to avoid sources of information and support that threaten their fragile optimism. Professionals need to be sensitive to the role hope plays in supporting parental coping with childhood multiple sclerosis.

  13. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits non-monotonic dynamical behavior: the amount of uncertainty first inflates and subsequently decreases as the decoherence strength of the two channels grows. In contrast, the uncertainty decreases monotonically with increasing purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
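
    The flavour of the result can be reproduced with a much simpler, memoryless single-qubit calculation. The sketch below evaluates the Maassen-Uffink bound H(X) + H(Z) >= log2(1/c) for Pauli X and Z measurements after an amplitude-damping channel; it omits the quantum memory and the weak-measurement steering that are central to the paper, and the initial state and damping strengths are arbitrary choices.

```python
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def amplitude_damp(rho, gamma):
    """Single-qubit amplitude-damping channel with strength gamma."""
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# |+> state; measure Pauli X and Pauli Z after the noisy channel
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
c = 0.5                          # max squared overlap of X and Z eigenbases -> bound log2(1/c) = 1
h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for gamma in (0.0, 0.3, 0.7, 1.0):
    rho = amplitude_damp(plus, gamma)
    pz = np.real(np.diag(rho))                 # Z-basis outcome probabilities
    px = np.real(np.diag(h @ rho @ h))         # X-basis outcome probabilities
    print(f"gamma={gamma:.1f}: H(X)+H(Z) = {shannon(px) + shannon(pz):.3f} >= {np.log2(1/c):.1f}")
```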

  14. Integrated ground-water monitoring strategy for NRC-licensed facilities and sites: Case study applications

    USGS Publications Warehouse

    Price, V.; Temples, T.; Hodges, R.; Dai, Z.; Watkins, D.; Imrich, J.

    2007-01-01

    This document discusses results of applying the Integrated Ground-Water Monitoring Strategy (the Strategy) to actual waste sites using existing field characterization and monitoring data. The Strategy is a systematic approach to dealing with complex sites. Application of such a systematic approach will reduce uncertainty associated with site analysis, and therefore uncertainty associated with management decisions about a site. The Strategy can be used to guide the development of a ground-water monitoring program or to review an existing one. The sites selected for study fall within a wide range of geologic and climatic settings, waste compositions, and site design characteristics and represent realistic cases that might be encountered by the NRC. No one case study illustrates a comprehensive application of the Strategy using all available site data. Rather, within each case study we focus on certain aspects of the Strategy, to illustrate concepts that can be applied generically to all sites. The test sites selected include: the Naval Weapons Station in Charleston, South Carolina; Brookhaven National Laboratory on Long Island, New York; the USGS Amargosa Desert Research Site in Nevada; Rocky Flats in Colorado; C-Area at the Savannah River Site in South Carolina; and the Hanford 300 Area. A Data Analysis section provides examples of detailed data analysis of monitoring data.

  15. Analyzing Uncertainty and Risk in the Management of Water Resources in the State Of Texas

    NASA Astrophysics Data System (ADS)

    Singh, A.; Hauffpauir, R.; Mishra, S.; Lavenue, M.

    2010-12-01

    The State of Texas updates its state water plan every five years to determine the water demand required to meet its growing population. The plan compiles forecasts of water deficits from state-wide regional water planning groups as well as the water supply strategies to address these deficits. To date, the plan has adopted a deterministic framework, where reference values (e.g., best estimates, worst-case scenario) are used for key factors such as population growth, demand for water, severity of drought, water availability, etc. These key factors can, however, be affected by multiple sources of uncertainties such as - the impact of climate on surface water and groundwater availability, uncertainty in population projections, changes in sectoral composition of the economy, variability in water usage, feasibility of the permitting process, cost of implementation, etc. The objective of this study was to develop a generalized and scalable methodology for addressing uncertainty and risk in water resources management both at the regional and the local water planning level. The study proposes a framework defining the elements of an end-to-end system model that captures the key components of demand, supply and planning modules along with their associated uncertainties. The framework preserves the fundamental elements of the well-established planning process in the State of Texas, promoting an incremental and stakeholder-driven approach to adding different levels of uncertainty (and risk) into the decision-making environment. The uncertainty in the water planning process is broken down into two primary categories: demand uncertainty and supply uncertainty. Uncertainty in Demand is related to the uncertainty in population projections and the per-capita usage rates. Uncertainty in Supply, in turn, is dominated by the uncertainty in future climate conditions. Climate is represented in terms of time series of precipitation, temperature and/or surface evaporation flux for some future time period of interest, which can be obtained as outputs of global climate models (GCMs). These are then linked with hydrologic and water-availability models (WAMs) to estimate water availability for the worst drought conditions under each future climate scenario. Combining the demand scenarios with the water availability scenarios yields multiple scenarios for water shortage (or surplus). Given multiple shortage/surplus scenarios, various water management strategies can be assessed to evaluate the reliability of meeting projected deficits. These reliabilities are then used within a multi-criteria decision-framework to assess trade-offs between various water management objectives, thus helping to make more robust decisions while planning for the water needs of the future.
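
    A stripped-down numerical version of the scenario-combination step is sketched below: demand scenarios (population times per-capita use) are crossed with supply scenarios to produce shortage distributions, and candidate strategies are scored by the fraction of scenarios in which they cover the deficit. All distributions and numbers are invented for illustration and are not drawn from the Texas planning data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000                                   # Monte Carlo scenario combinations

# Demand uncertainty: population projection x per-capita use (illustrative units)
population = rng.normal(1.5e6, 1.5e5, n)
per_capita = rng.normal(160, 20, n)          # gallons per person per day
demand = population * per_capita / 1e6       # million gallons per day (MGD)

# Supply uncertainty: firm yield under drought-of-record conditions per climate scenario
supply = rng.normal(230, 40, n)              # MGD

shortage = demand - supply                   # positive = deficit
for added_supply in (0, 20, 40, 60):         # candidate management strategies (new supply, MGD)
    reliability = np.mean(shortage <= added_supply)
    print(f"strategy adds {added_supply:3d} MGD -> reliability {reliability:.1%}")
```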

  16. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in the hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. Therefore, they give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, there is a need to assess these uncertainties. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods are becoming available, there is a need to facilitate climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has been demonstrated so far on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily temperature and PET observations at Uccle and a large ensemble of 160 global climate model runs (CMIP5). They cover all four representative concentration pathway based greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g. for precipitation, by means of intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties for each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
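
    The simplest member of the family of downscaling methods mentioned above, a change-factor perturbation of an observed series, can be illustrated as follows. The observed record is synthetic and the change factors are arbitrary; the Climate Perturbation Tool's actual algorithms (weather typing, weather generators, quantile perturbation) are considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 30 years of daily precipitation observations at a single station (mm)
obs = rng.gamma(shape=0.4, scale=8.0, size=30 * 365)

def perturb(obs, intensity_factor, wet_threshold=0.1):
    """Very simplified perturbation-based downscaling: scale wet-day intensities
    by a change factor derived from a climate model run (not the tool's algorithm)."""
    fut = obs.copy()
    fut[fut > wet_threshold] *= intensity_factor
    return fut

# Each climate model run / downscaling method contributes its own change factor;
# the spread across members represents the projection uncertainty.
for intensity_factor in (0.95, 1.10, 1.25):
    fut = perturb(obs, intensity_factor)
    print(f"factor {intensity_factor:.2f}: 99.5th percentile "
          f"{np.percentile(obs, 99.5):5.1f} -> {np.percentile(fut, 99.5):5.1f} mm/day")
```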

  17. Sensitivity of collective action to uncertainty about climate tipping points

    NASA Astrophysics Data System (ADS)

    Barrett, Scott; Dannenberg, Astrid

    2014-01-01

    Despite more than two decades of diplomatic effort, concentrations of greenhouse gases continue to trend upwards, creating the risk that we may someday cross a threshold for `dangerous' climate change. Although climate thresholds are very uncertain, new research is trying to devise `early warning signals' of an approaching tipping point. This research offers a tantalizing promise: whereas collective action fails when threshold uncertainty is large, reductions in this uncertainty may bring about the behavioural change needed to avert a climate `catastrophe'. Here we present the results of an experiment, rooted in a game-theoretic model, showing that behaviour differs markedly on either side of a dividing line for threshold uncertainty. On one side of the dividing line, where threshold uncertainty is relatively large, free riding proves irresistible and trust elusive, making it virtually inevitable that the tipping point will be crossed. On the other side, where threshold uncertainty is small, the incentive to coordinate is strong and trust more robust, often leading the players to avoid crossing the tipping point. Our results show that uncertainty must be reduced to this `good' side of the dividing line to stimulate the behavioural shift needed to avoid `dangerous' climate change.

  18. Irrigation offsets wheat yield reductions from warming temperatures

    NASA Astrophysics Data System (ADS)

    Tack, Jesse; Barkley, Andrew; Hendricks, Nathan

    2017-11-01

    Temperature increases due to climate change are expected to cause substantial reductions in global wheat yields. However, uncertainty remains regarding the potential role for irrigation as an adaptation strategy to offset heat impacts. Here we utilize over 7000 observations spanning eleven Kansas field-trial locations, 180 varieties, and 29 years to show that irrigation significantly reduces the negative impact of warming temperatures on winter wheat yields. Dryland wheat yields are estimated to decrease about eight percent for every one-degree Celsius increase in temperature, yet irrigation completely offsets this negative impact in our sample. As in previous studies, we find that important interactions exist between heat stress and precipitation for dryland production. Here, uniquely, we observe both dryland and irrigated trials side-by-side at the same locations and find that precipitation does not provide the same reduction in heat stress as irrigation. This is likely to be because the timing, intensity, and volume of water applications influence wheat yields, so the ability to irrigate—rather than relying on rainfall alone—has a stronger influence on heat stress. We find evidence of extensive differences of water-deficit stress impacts across varieties. This provides some evidence of the potential for adapting to hotter and drier climate conditions using optimal variety selection. Overall, our results highlight the critical role of water management for future global food security. Water scarcity not only reduces crop yields through water-deficit stress, but also amplifies the negative effects of warming temperatures.
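
    A back-of-the-envelope projection using the headline sensitivities quoted above is shown below: dryland yields are assumed to fall about 8% per degree Celsius of warming while irrigated yields are held flat, as in the sample. The baseline yields and warming increments are illustrative assumptions, not values from the field trials.

```python
# Illustrative projection: dryland winter wheat loses ~8% of yield per deg C of warming
# (the sensitivity reported above), while irrigation offsets the loss entirely.
baseline = {"dryland": 45.0, "irrigated": 70.0}       # bu/acre, assumed baselines
sensitivity = {"dryland": -0.08, "irrigated": 0.0}    # fractional yield change per deg C

for warming in (1.0, 2.0, 3.0):
    row = ", ".join(
        f"{system}: {baseline[system] * (1 + sensitivity[system] * warming):5.1f} bu/ac"
        for system in ("dryland", "irrigated")
    )
    print(f"+{warming:.0f} C  ->  {row}")
```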

  19. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.

    2006-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.

  20. Managing Space Radiation Risks On Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.

    2005-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.

  1. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.; Ponomarev, A.; Ren, L.; Shavers, M. R.; Wu, H.

    2005-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.

  2. [Spanish Interdisciplinary Committee for Cardiovascular Disease Prevention and the Spanish Society of Cardiology Position Statement on Dyslipidemia Management. Differences Between the European and American Guidelines].

    PubMed

    Lobos Bejarano, José María; Galve, Enrique; Royo-Bordonada, Miguel Ángel; Alegría Ezquerra, Eduardo; Armario, Pedro; Brotons Cuixart, Carlos; Camafort Babkowski, Miguel; Cordero Fort, Alberto; Maiques Galán, Antonio; Mantilla Morató, Teresa; Pérez Pérez, Antonio; Pedro-Botet, Juan; Villar Álvarez, Fernando; González-Juanatey, José Ramón

    2015-01-01

    The publication of the 2013 American College of Cardiology/American Heart Association guidelines on the treatment of high blood cholesterol has had a strong impact due to the paradigm shift in its recommendations. The Spanish Interdisciplinary Committee for Cardiovascular Disease Prevention and the Spanish Society of Cardiology reviewed this guideline and compared it with current European guidelines on cardiovascular prevention and dyslipidemia management. The most striking aspect of the American guideline is the elimination of the low-density lipoprotein cholesterol treat-to-target strategy and the adoption of a risk reduction strategy in 4 major statin benefit groups. In patients with established cardiovascular disease, both guidelines recommend a similar therapeutic strategy (high-dose potent statins). However, in primary prevention, the application of the American guidelines would substantially increase the number of persons, particularly older people, receiving statin therapy. The elimination of the cholesterol treat-to-target strategy, so strongly rooted in the scientific community, could have a negative impact on clinical practice, create a certain amount of confusion and uncertainty among professionals, and decrease follow-up and patient adherence. Thus, this article reaffirms the recommendations of the European guidelines. Although both guidelines have positive aspects, doubt remains regarding the concerns outlined above. In addition to using risk charts based on the native population, the messages of the European guideline are more appropriate to the Spanish setting and avoid the possible risk of overtreatment with statins in primary prevention. Copyright © 2014 SEHLELHA. Published by Elsevier Espana. All rights reserved.

  3. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessment of the impact of climate change on crop production while considering uncertainties is essential for properly identifying and deciding on agricultural practices that are sustainable. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios representing the climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, in ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed the increasing uncertainty in the yield simulation in the future periods.
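
    The variance-decomposition step can be illustrated with synthetic data: yield changes are generated for every combination of climate projection and crop-model parameter set, and the total ensemble variance is split into the parts explained by each source. The effect sizes below are invented; only the structure (24 projections by 100 parameter sets, climate dominating) mirrors the study, and the simple mean-based decomposition is not necessarily the method the authors used.

```python
import numpy as np

rng = np.random.default_rng(4)
n_climate, n_params = 24, 100                 # climate projections x crop-model parameter sets

# Illustrative simulated yield changes (%) for each combination
climate_effect = rng.normal(-3, 6, (n_climate, 1))     # dominant source, by construction
param_effect = rng.normal(0, 1.5, (1, n_params))
yield_change = climate_effect + param_effect + rng.normal(0, 0.5, (n_climate, n_params))

total = yield_change.var()
from_climate = yield_change.mean(axis=1).var()          # variance of climate-projection means
from_params = yield_change.mean(axis=0).var()           # variance of parameter-set means
print(f"ensemble mean yield change: {yield_change.mean():+.1f}%")
print(f"variance fraction from climate projections:   {from_climate / total:.0%}")
print(f"variance fraction from crop-model parameters: {from_params / total:.0%}")
```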

  4. Benefits of on-wafer calibration standards fabricated in membrane technology

    NASA Astrophysics Data System (ADS)

    Rohland, M.; Arz, U.; Büttgenbach, S.

    2011-07-01

    In this work we compare on-wafer calibration standards fabricated in membrane technology with standards built in conventional thin-film technology. We perform this comparison by investigating the propagation of uncertainties in the geometry and material properties to the broadband electrical properties of the standards. For coplanar waveguides used as line standards the analysis based on Monte Carlo simulations demonstrates an up to tenfold reduction in uncertainty depending on the electromagnetic waveguide property we look at.
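
    The Monte Carlo propagation idea can be sketched with a deliberately crude model: substrate-permittivity samples are pushed through a rough coplanar-waveguide approximation (effective permittivity of roughly (eps_r + 1)/2) to compare the spread of the phase constant for a conventional substrate and a membrane. The permittivity values, their uncertainties and the approximation itself are illustrative assumptions, not the electromagnetic model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
c0, f = 299_792_458.0, 40e9              # speed of light (m/s), frequency (Hz)

def phase_constant(eps_r):
    """Rough CPW approximation: eps_eff ~ (eps_r + 1)/2 on a thick substrate,
    beta = 2*pi*f*sqrt(eps_eff)/c0 (illustrative only)."""
    eps_eff = (eps_r + 1.0) / 2.0
    return 2 * np.pi * f * np.sqrt(eps_eff) / c0

# Conventional thin-film substrate vs thin membrane (relative permittivity near 1)
eps_substrate = rng.normal(3.8, 0.05, n)        # assumed permittivity and uncertainty
eps_membrane = rng.normal(1.08, 0.005, n)       # membrane: less dielectric, smaller spread

for name, eps in (("thin-film on substrate", eps_substrate), ("membrane", eps_membrane)):
    beta = phase_constant(eps)
    print(f"{name:24s}: beta = {beta.mean():.1f} rad/m +/- {beta.std():.2f}")
```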

  5. Decision-making and evacuation planning for flood risk management in the Netherlands.

    PubMed

    Kolen, Bas; Helsloot, Ira

    2014-07-01

    A traditional view of decision-making for evacuation planning is that, given an uncertain threat, there is a deterministic way of defining the best decision. In other words, there is a linear relation between threat, decision, and execution consequences. Alternatives and the impact of uncertainties are not taken into account. This study considers the 'top strategic decision-making' for mass evacuation owing to flooding in the Netherlands. It reveals that the top strategic decision-making process itself is probabilistic because of the decision-makers involved and their crisis managers (as advisers). The paper concludes that deterministic planning is not sufficient, and it recommends probabilistic planning that considers uncertainties in the decision-making process itself as well as other uncertainties, such as forecasts, citizens responses, and the capacity of infrastructure. This results in less optimistic, but more realistic, strategies and a need to pay attention to alternative strategies. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  6. The development and application of multi-criteria decision-making tool with consideration of uncertainty: the selection of a management strategy for the bio-degradable fraction in the municipal solid waste.

    PubMed

    El Hanandeh, Ali; El-Zein, Abbas

    2010-01-01

    A modified version of the multi-criteria decision aid, ELECTRE III has been developed to account for uncertainty in criteria weightings and threshold values. The new procedure, called ELECTRE-SS, modifies the exploitation phase in ELECTRE III, through a new definition of the pre-order and the introduction of a ranking index (RI). The new approach accommodates cases where incomplete or uncertain preference data are present. The method is applied to a case of selecting a management strategy for the bio-degradable fraction in the municipal solid waste of Sydney. Ten alternatives are compared against 11 criteria. The results show that anaerobic digestion (AD) and composting of paper are less environmentally sound options than recycling. AD is likely to out-perform incineration where a market for heating does not exist. Moreover, landfilling can be a sound alternative, when considering overall performance and conditions of uncertainty.
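
    The ranking-index idea, i.e. propagating weight uncertainty into the final ranking, can be imitated with a much cruder scheme than ELECTRE-SS: sample criteria weights, score alternatives by a net concordance measure, and report how often each alternative comes out on top. The performance table, weight distribution and scoring rule below are all invented for illustration and fall well short of ELECTRE III's full concordance/discordance machinery.

```python
import numpy as np

rng = np.random.default_rng(6)
# Illustrative performance table: 4 waste-strategy alternatives x 5 criteria (higher = better)
perf = np.array([
    [0.6, 0.8, 0.5, 0.7, 0.4],   # recycling
    [0.5, 0.6, 0.7, 0.5, 0.6],   # anaerobic digestion
    [0.4, 0.5, 0.8, 0.4, 0.7],   # incineration with energy recovery
    [0.3, 0.4, 0.9, 0.3, 0.8],   # landfill
])
names = ["recycling", "AD", "incineration", "landfill"]

def concordance_score(perf, w):
    """Net concordance: weight of criteria on which a beats or ties b, minus the
    reverse, summed over all pairs (a crude outranking score for illustration)."""
    n = len(perf)
    score = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a != b:
                score[a] += w[perf[a] >= perf[b]].sum() - w[perf[b] >= perf[a]].sum()
    return score

# Sample uncertain criteria weights; the ranking index is the share of samples
# in which each alternative ranks first.
wins = np.zeros(len(perf))
for _ in range(2000):
    w = rng.dirichlet(np.ones(perf.shape[1]))
    wins[np.argmax(concordance_score(perf, w))] += 1
for name, ri in zip(names, wins / wins.sum()):
    print(f"{name:12s} ranking index = {ri:.2f}")
```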

  7. Panaceas, uncertainty, and the robust control framework in sustainability science

    PubMed Central

    Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan

    2007-01-01

    A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
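
    The sensitivity of a nominally optimal policy to parametric uncertainty can be seen even in a minimal Monte Carlo experiment on the logistic (Gordon-Schaefer) stock dynamics: the effort that is optimal when the growth rate is known exactly is compared with a more conservative effort when that rate is uncertain. This is only a sensitivity check under assumed parameter ranges, not the robust control synthesis developed in the article.

```python
import numpy as np

def simulate(r, K, effort, x0=0.5, q=1.0, years=200, dt=0.1):
    """Gordon-Schaefer dynamics dx/dt = r x (1 - x/K) - q E x, Euler-integrated."""
    x = x0
    for _ in range(int(years / dt)):
        x = max(x + dt * (r * x * (1 - x / K) - q * effort * x), 0.0)
    return x

r_nom, K = 1.0, 1.0
E_opt = r_nom / 2.0          # effort giving MSY when r is known exactly (q = 1)
E_robust = 0.35 * r_nom      # a more conservative, "robust" effort level (assumed)

rng = np.random.default_rng(7)
for E, label in ((E_opt, "nominal-optimal"), (E_robust, "conservative")):
    stocks = [simulate(r, K, E) for r in rng.normal(r_nom, 0.25, 500)]  # uncertain growth rate
    collapses = np.mean(np.array(stocks) < 0.05)
    print(f"{label:16s}: mean long-run stock {np.mean(stocks):.2f}, "
          f"collapse frequency {collapses:.1%}")
```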

  8. Incorporating the effects of socioeconomic uncertainty into priority setting for conservation investment.

    PubMed

    McBride, Marissa F; Wilson, Kerrie A; Bode, Michael; Possingham, Hugh P

    2007-12-01

    Uncertainty in the implementation and outcomes of conservation actions that is not accounted for leaves conservation plans vulnerable to potential changes in future conditions. We used a decision-theoretic approach to investigate the effects of two types of investment uncertainty on the optimal allocation of global conservation resources for land acquisition in the Mediterranean Basin. We considered uncertainty about (1) whether investment will continue and (2) whether the acquired biodiversity assets are secure, which we termed transaction uncertainty and performance uncertainty, respectively. We also developed and tested the robustness of different rules of thumb for guiding the allocation of conservation resources when these sources of uncertainty exist. In the presence of uncertainty in future investment ability (transaction uncertainty), the optimal strategy was opportunistic, meaning the investment priority should be to act where uncertainty is highest while investment remains possible. When there was a probability that investments would fail (performance uncertainty), the optimal solution became a complex trade-off between the immediate biodiversity benefits of acting in a region and the perceived longevity of the investment. In general, regions were prioritized for investment when they had the greatest performance certainty, even if an alternative region was highly threatened or had higher biodiversity value. The improved performance of rules of thumb when accounting for uncertainty highlights the importance of explicitly incorporating sources of investment uncertainty and evaluating potential conservation investments in the context of their likely long-term success.

  9. Achieving Realistic Energy and Greenhouse Gas Emission Reductions in U.S. Cities

    NASA Astrophysics Data System (ADS)

    Blackhurst, Michael F.

    2011-12-01

    Recognizing that energy markets and greenhouse gas emissions are significantly influenced by local factors, this research examines opportunities for achieving realistic energy and greenhouse gas emission reductions in U.S. cities through provision of more sustainable infrastructure. Greenhouse gas reduction opportunities are examined through the lens of a public program administrator charged with reducing emissions given realistic financial constraints and authority over emissions reductions and energy use. Opportunities are evaluated with respect to traditional public policy metrics, such as benefit-cost analysis, net benefit analysis, and cost-effectiveness. Section 2 summarizes current practices used to estimate greenhouse gas emissions from communities. I identify improved and alternative emissions inventory techniques such as disaggregating the sectors reported, reporting inventory uncertainty, and aligning inventories with local organizations that could facilitate emissions mitigation. The potential advantages and challenges of supplementing inventories with comparative benchmarks are also discussed. Finally, I highlight the need to integrate growth (population and economic) and business as usual implications (such as changes to electricity supply grids) into climate action planning. I demonstrate how these techniques could improve decision making when planning reductions, help communities set meaningful emission reduction targets, and facilitate CAP implementation and progress monitoring. Section 3 estimates the costs and benefits of building energy efficiency as a means of reducing greenhouse gas emissions in Pittsburgh, PA, and Austin, TX. Two policy objectives were evaluated: maximize GHG reductions given initial budget constraints or maximize social savings given target GHG reductions. This approach explicitly evaluates the trade-offs between three primary and often conflicting program design parameters: initial capital constraints, social savings, and GHG reductions. Results suggest uncertainty in local stocks, demands, and efficiency significantly impacts anticipated outcomes. Annual greenhouse gas reductions of 1 ton CO2 eq/capita/yr in Pittsburgh could cost near nothing or over $20 per capita annually. Capital-constrained policies generate slightly less social savings (a present value of a few hundred dollars per capita) than policies that maximize social savings. However, sectors, technologies, and end uses targeted for intervention vary depending on policy objectives and constraints. The optimal efficiency investment strategy for some end uses varies significantly (in excess of 100%) between Pittsburgh and Austin, suggesting that resources and guidance conducted at the national scale may mislead state and local decision-makers. Section 3 then evaluates the impact of rebound effects on modeled efficiency program outcomes. Differential rebound effects across end-uses can change the optimal program design strategy, i.e., the end-uses and technologies targeted for intervention. The rebound effect results suggest that rebound should be integral to effective efficiency program design. Section 4 uses life cycle assessment to evaluate the costs and benefits of the widespread retrofit of green roofs in a typical urban mixed-use neighborhood. Shadow-cost analysis was used to evaluate the cost-effectiveness of green roofs' many benefits. 
Results suggest green roofs are currently not cost effective on a private cost basis, but multi-family and commercial building green roofs are competitive when social benefits are included. Multifamily and commercial green roofs are also competitive alternatives for reducing greenhouse gases and storm water run-off. However, green roofs are not competitive energy conservation techniques. GHG impacts are dominated by the material production and use phases. Energy impacts are dominated by the use phase, with urban heat island (UHI) impacts being an order of magnitude higher than direct building impacts. Results highlight the importance of clarifying sustainable infrastructure costs and benefits across many public and private organizations (e.g., private building owners, storm water agencies, efficiency stakeholders, and roofing contractors) to identify appropriate incentives and effective program design strategies. Section 5 synthesizes the work and provides guidance for local and state sustainability program administrators. Section 5 highlights the unrealized social benefits associated with sustainability and reflects upon the role of local and state governments in overcoming barriers to achieving more sustainable infrastructure. Section 5 encourages program administrators to consider their local markets for sustainability as influenced by resource pricing, weather, infrastructure condition, jurisdiction, and other factors. The differences between sustainability programming and traditional municipal programming are highlighted, namely that sustainability programming often requires self-selection for participation and is subject to new sources of uncertainty regarding user behavior, technology breadth and change, and the scope of costs and benefits. These characteristic issues of sustainable infrastructure opportunities provide new challenges to program administrators, requiring new paradigms and support resources. (Abstract shortened by UMI.)

  10. Economic and environmental costs of regulatory uncertainty for coal-fired power plants.

    PubMed

    Patiño-Echeverri, Dalia; Fischbeck, Paul; Kriegler, Elmar

    2009-02-01

    Uncertainty about the extent and timing of CO2 emissions regulations for the electricity-generating sector exacerbates the difficulty of selecting investment strategies for retrofitting or alternatively replacing existing coal-fired power plants. This may result in inefficient investments that impose economic and environmental costs on society. In this paper, we construct a multiperiod decision model with an embedded multistage stochastic dynamic program minimizing the expected total costs of plant operation, installations, and pollution allowances. We use the model to forecast optimal sequential investment decisions of a power plant operator with and without uncertainty about future CO2 allowance prices. The comparison of the two cases demonstrates that uncertainty about future CO2 emissions regulations might cause significant economic costs and higher air emissions.
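
    The value of resolving regulatory uncertainty can be shown with a deliberately tiny two-stage version of the retrofit decision; the paper itself uses a multiperiod, multistage stochastic dynamic program. All costs, emission rates and allowance-price scenarios below are invented.

```python
# Two-stage retrofit-or-wait decision under uncertain future CO2 allowance prices.
retrofit_cost = 300.0                 # $M, cuts emissions by 80% (assumed)
emissions = 5.0                       # Mt CO2 per period without retrofit (assumed)
co2_price_now = 10.0                  # $/t today (assumed)
scenarios = {"strict cap": (0.5, 60.0), "weak cap": (0.5, 15.0)}   # (probability, future $/t)

def period_cost(price, retrofitted):
    return emissions * (0.2 if retrofitted else 1.0) * price

# Strategy 1: retrofit immediately, before the regulation is known
cost_now = retrofit_cost + period_cost(co2_price_now, True) + sum(
    p * period_cost(price, True) for p, price in scenarios.values())

# Strategy 2: wait, observe the regulation, then retrofit only if it pays off
cost_wait = period_cost(co2_price_now, False) + sum(
    p * min(retrofit_cost + period_cost(price, True), period_cost(price, False))
    for p, price in scenarios.values())

print(f"expected cost, retrofit now: {cost_now:6.1f} $M")
print(f"expected cost, wait & see:   {cost_wait:6.1f} $M")
```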

  11. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  12. Acting on uncertainty in landscape management—options forestry.

    Treesearch

    Jonathan Thompson

    2005-01-01

    In response to the highly uncertain outcomes inherent in forest management, “options forestry” has been introduced as a novel approach that includes an honest appraisal of uncertainties and learning as a specific objective. The strategy is unique in that it uses a variety of management pathways, all designed to reach the same goal, and structures them in a rigorous...

  13. Orbit control of a stratospheric satellite with parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Xu, Ming; Huo, Wei

    2016-12-01

    When a stratospheric satellite is carried by prevailing winds in the stratosphere, its cross-track displacement needs to be controlled to maintain constant-latitude orbital flight. To design the orbit control system, a 6 degree-of-freedom (DOF) model of the satellite is established based on the second Lagrangian formulation. It is proven that input/output feedback linearization cannot be directly applied to orbit control with this model, so three subsystem models are derived from the 6-DOF model to develop a sequential nonlinear control strategy. The control strategy includes an adaptive controller for the balloon-tether subsystem with uncertain balloon parameters, a PD controller based on feedback linearization for the tether-sail subsystem, and a sliding mode controller for the sail-rudder subsystem with uncertain sail parameters. Simulation studies demonstrate that the proposed control strategy is robust to uncertainties and satisfies high precision requirements for the orbit flight of the satellite.

  14. Robust optimization based energy dispatch in smart grids considering demand uncertainty

    NASA Astrophysics Data System (ADS)

    Nassourou, M.; Puig, V.; Blesa, J.

    2017-01-01

    In this study we discuss the application of robust optimization to the problem of economic energy dispatch in smart grids. Robust-optimization-based MPC strategies for tackling uncertain load demands are developed. Unexpected additive disturbances are modelled by defining an affine dependence between the control inputs and the uncertain load demands. The developed strategies were applied to a hybrid power system connected to an electrical power grid. Furthermore, to demonstrate the superiority of the standard Economic MPC over the MPC tracking, a comparison (e.g., of average daily cost) between the standard MPC tracking, the standard Economic MPC, and the integration of both in one-layer and two-layer approaches was carried out. The goal of this research is to design a controller based on Economic MPC strategies that tackles uncertainties, in order to minimise economic costs and guarantee service reliability of the system.
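
    A single-step, worst-case version of the dispatch problem can be written as a small linear program: generation must cover the upper end of the demand uncertainty interval at minimum cost. This is only a sketch of the robust idea, not the paper's MPC formulation; the unit costs, capacities and demand interval are assumed values, and scipy is assumed to be available.

```python
import numpy as np
from scipy.optimize import linprog

# One-step robust economic dispatch: two local units plus grid import must cover
# the worst-case demand in an uncertainty interval. All numbers are invented.
costs = np.array([20.0, 35.0, 50.0])      # $/MWh: cheap unit, peaker, grid import
p_max = np.array([60.0, 40.0, 100.0])     # MW capacity of each source
demand_nominal, demand_dev = 90.0, 15.0   # forecast demand and worst-case deviation

for demand in (demand_nominal, demand_nominal + demand_dev):
    # minimize costs^T p  subject to  sum(p) >= demand, 0 <= p <= p_max
    res = linprog(costs, A_ub=[-np.ones(3)], b_ub=[-demand],
                  bounds=list(zip(np.zeros(3), p_max)))
    label = "nominal" if demand == demand_nominal else "robust "
    print(f"{label} dispatch: {np.round(res.x, 1)} MW, cost {res.fun:.0f} $/h")
```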

  15. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements can only provide a qualitative and non-descript estimation of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. 
To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology as well as its limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
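
    The core quantitative step, turning technology uncertainty distributions into probabilities of meeting program requirements, can be illustrated with a simple Monte Carlo sample. The distributions, thresholds and the notion of a single enabling technology below are hypothetical and are not taken from the ENTERPRISE demonstration case.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50_000

# Hypothetical enabling technology with uncertain performance gain, schedule and cost
performance_gain = rng.triangular(0.05, 0.12, 0.20, n)            # fractional capability gain
dev_time_months = rng.lognormal(mean=np.log(36), sigma=0.25, size=n)
dev_cost_musd = rng.normal(80, 15, n)

# Assumed program requirements: at least 10% gain, fielded within 48 months, under $100M
meets_capability = performance_gain >= 0.10
meets_schedule = dev_time_months <= 48
meets_budget = dev_cost_musd <= 100

print(f"P(capability requirement met): {meets_capability.mean():.2f}")
print(f"P(schedule requirement met):   {meets_schedule.mean():.2f}")
print(f"P(all requirements met):       {(meets_capability & meets_schedule & meets_budget).mean():.2f}")
```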

  16. Developing a conservation strategy to maximize persistence of an endangered freshwater mussel species while considering management effectiveness and cost

    USGS Publications Warehouse

    Smith, David R.; McRae, Sarah E.; Augspurger, Tom; Ratcliffe, Judith A.; Nichols, Robert B.; Eads, Chris B.; Savidge, Tim; Bogan, Arthur E.

    2015-01-01

    We used a structured decision-making process to develop conservation strategies to increase persistence of Dwarf Wedgemussel (Alasmidonta heterodon) in North Carolina, USA, while accounting for uncertainty in management effectiveness and considering costs. Alternative conservation strategies were portfolios of management actions that differed by location of management actions on the landscape. Objectives of the conservation strategy were to maximize species persistence, maintain genetic diversity, maximize public support, and minimize management costs. We compared 4 conservation strategies: 1) the ‘status quo’ strategy represented current management, 2) the ‘protect the best’ strategy focused on protecting the best populations in the Tar River basin, 3) the ‘expand the distribution’ strategy focused on management of extant populations and establishment of new populations in the Neuse River basin, and 4) the ‘hybrid’ strategy combined elements of each strategy to balance conservation in the Tar and Neuse River basins. A population model informed requirements for population management, and experts projected performance of alternative strategies over a 20-y period. The optimal strategy depended on the relative value placed on competing objectives, which can vary among stakeholders. The protect the best and hybrid strategies were optimal across a wide range of relative values with 2 exceptions: 1) if minimizing management cost was of overriding concern, then status quo was optimal, or 2) if maximizing population persistence in the Neuse River basin was emphasized, then expand the distribution strategy was optimal. The optimal strategy was robust to uncertainty in management effectiveness. Overall, the structured decision process can help identify the most promising strategies for endangered species conservation that maximize conservation benefit given the constraint of limited funding.

  17. Comparative health impact assessment of local and regional particulate air pollutants in Scandinavia.

    PubMed

    Forsberg, Bertil; Hansson, Hans-Christen; Johansson, Christer; Areskoug, Hans; Persson, Karin; Järvholm, Bengt

    2005-02-01

    The ongoing program Clean Air for Europe (CAFE) is an initiative from the EU Commission to establish a coordinated effort to reach better air quality in the EU. The focus is on particulate matter as it has been shown to have large impact on human health. CAFE requested that WHO make a review of the latest findings on air pollutants and health to facilitate assessments of the different air pollutants and their health effects. The WHO review project on health aspects of air pollution in Europe confirmed that exposure to particulate matter (PM), despite the lower levels we face today, still poses a significant risk to human health. Using the recommended uniform risk coefficients for health impact assessment of PM, regardless of sources, premature mortality related to long-range transported anthropogenic particles has been estimated to be about 3500 deaths per year for the Swedish population, corresponding to a reduction in life expectancy of up to about seven months. The influence of local sources is more difficult to estimate due to large uncertainties when linking available risk coefficients to exposure data, but the estimates indicate about 1800 deaths brought forward each year with a life expectancy reduction of about 2-3 months. However, some sectors of the population are exposed to quite high locally induced concentrations and are likely to suffer excessive reductions in life expectancy. Since the literature increasingly supports assumptions that combustion related particles are associated with higher relative risks, further studies may shift the focus for abatement strategies. CAFE sets out to establish a general cost effective abatement strategy for atmospheric particles. Our results, based on studies of background exposure, show that long-range transported sulfate rich particles dominate the health effects of PM in Sweden. The same results would be found for the whole of Scandinavia and many countries influenced by transboundary air pollution. However, several health studies, including epidemiological studies with a finer spatial resolution, indicate that engine exhaust particles are more damaging to health than other particles. These contradictory findings must be understood and source specific risk estimates have to be established by expert bodies, otherwise it will not be possible to find the most cost effective abatement strategy for Europe. We are not happy with today's situation where every strategy to reduce PM concentrations is estimated to have the same impact per unit change in the mass concentration. Obviously there is a striking need to introduce more specific exposure variables and a higher geographical resolution in epidemiology as well as in health impact assessments.

  18. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, Jeremy Alan; Blaylock, Myra L.; Domino, Stefan P.

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost versus accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  19. Development of risk-based air quality management strategies under impacts of climate change.

    PubMed

    Liao, Kuo-Jen; Amar, Praveen; Tagaris, Efthimios; Russell, Armistead G

    2012-05-01

    Climate change is forecast to adversely affect air quality through perturbations in meteorological conditions, photochemical reactions, and precursor emissions. To protect the environment and human health from air pollution, there is an increasing recognition of the necessity of developing effective air quality management strategies under the impacts of climate change. This paper presents a framework for developing risk-based air quality management strategies that can help policy makers improve their decision-making processes in response to current and future climate change about 30-50 years from now. Development of air quality management strategies under the impacts of climate change is fundamentally a risk assessment and risk management process involving four steps: (1) assessment of the impacts of climate change and associated uncertainties; (2) determination of air quality targets; (3) selections of potential air quality management options; and (4) identification of preferred air quality management strategies that minimize control costs, maximize benefits, or limit the adverse effects of climate change on air quality when considering the scarcity of resources. The main challenge relates to the level of uncertainties associated with climate change forecasts and advancements in future control measures, since they will significantly affect the risk assessment results and development of effective air quality management plans. The concept presented in this paper can help decision makers make appropriate responses to climate change, since it provides an integrated approach for climate risk assessment and management when developing air quality management strategies. Development of climate-responsive air quality management strategies is fundamentally a risk assessment and risk management process. The risk assessment process includes quantification of climate change impacts on air quality and associated uncertainties. Risk management for air quality under the impacts of climate change includes determination of air quality targets, selections of potential management options, and identification of effective air quality management strategies through decision-making models. The risk-based decision-making framework can also be applied to develop climate-responsive management strategies for the other environmental dimensions and assess costs and benefits of future environmental management policies.

  20. Optimal control of an invasive species using a reaction-diffusion model and linear programming

    USGS Publications Warehouse

    Bonneau, Mathieu; Johnson, Fred A.; Smith, Brian J.; Romagosa, Christina M.; Martin, Julien; Mazzotti, Frank J.

    2017-01-01

    Managing an invasive species is particularly challenging as little is generally known about the species’ biological characteristics in its new habitat. In practice, removal of individuals often starts before the species is studied to provide the information that will later improve control. Therefore, the locations and the amount of control have to be determined in the face of great uncertainty about the species characteristics and with a limited amount of resources. We propose framing spatial control as a linear programming optimization problem. This formulation, paired with a discrete reaction-diffusion model, permits calculation of an optimal control strategy that minimizes the remaining number of invaders for a fixed cost or that minimizes the control cost for containment or protecting specific areas from invasion. We propose computing the optimal strategy for a range of possible model parameters, representing current uncertainty on the possible invasion scenarios. Then, a best strategy can be identified depending on the risk attitude of the decision-maker. We use this framework to study the spatial control of the Argentine black and white tegus (Salvator merianae) in South Florida. There is uncertainty about tegu demography and we considered several combinations of model parameters, exhibiting various dynamics of invasion. For a fixed one-year budget, we show that the risk-averse strategy, which optimizes the worst-case scenario of tegus’ dynamics, and the risk-neutral strategy, which optimizes the expected scenario, both concentrated control close to the point of introduction. A risk-seeking strategy, which optimizes the best-case scenario, focuses more on models where eradication of the species in a cell is possible and consists of spreading control as much as possible. For the establishment of a containment area, assuming an exponential growth we show that with current control methods it might not be possible to implement such a strategy for some of the models that we considered. Including different possible models allows an examination of how the strategy is expected to perform in different scenarios. Then, a strategy that accounts for the risk attitude of the decision-maker can be designed.
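
    The linear-programming formulation can be illustrated with a toy one-step problem: choose removals per cell, subject to a budget, so as to minimise next year's projected population. The abundances, removal costs and growth rate are invented, and the sketch omits the reaction-diffusion spread between cells that the full model includes.

```python
import numpy as np
from scipy.optimize import linprog

# Toy spatial-control LP. Cells are ordered by distance from the introduction point;
# removal becomes more expensive farther out. All numbers are illustrative.
n = np.array([40.0, 25.0, 10.0, 5.0, 2.0])      # current abundance per cell
cost = np.array([1.0, 1.2, 1.5, 2.0, 2.5])      # cost per individual removed
growth = 1.4                                    # assumed per-year multiplication factor

for budget in (20.0, 40.0, 80.0):
    # next population ~ growth * (n - u); minimising it is equivalent to maximising sum(u)
    res = linprog(-growth * np.ones_like(n), A_ub=[cost], b_ub=[budget],
                  bounds=[(0, ni) for ni in n])
    u = res.x
    print(f"budget {budget:5.1f}: removals {np.round(u, 1)}, "
          f"projected total {growth * (n - u).sum():6.1f}")
```

    With these assumed numbers, control concentrates near the introduction point and expands outward only as the budget grows, which is consistent with the risk-averse and risk-neutral behaviour described above.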
