Sample records for uncertainty surrounding indicators

  1. Uncertainty in Population Growth Rates: Determining Confidence Intervals from Point Estimates of Parameters

    PubMed Central

    Devenish Nelson, Eleanor S.; Harris, Stephen; Soulsbury, Carl D.; Richards, Shane A.; Stephens, Philip A.

    2010-01-01

    Background: Demographic models are widely used in conservation and management, and their parameterisation often relies on data collected for other purposes. When underlying data lack clear indications of associated uncertainty, modellers often fail to account for that uncertainty in model outputs, such as estimates of population growth. Methodology/Principal Findings: We applied a likelihood approach to infer uncertainty retrospectively from point estimates of vital rates. Combining this with resampling techniques and projection modelling, we show that confidence intervals for population growth estimates are easy to derive. We used similar techniques to examine the effects of sample size on uncertainty. Our approach is illustrated using data on the red fox, Vulpes vulpes, a predator of ecological and cultural importance, and the most widespread extant terrestrial mammal. We show that uncertainty surrounding estimated population growth rates can be high, even for relatively well-studied populations. Halving that uncertainty typically requires a quadrupling of sampling effort. Conclusions/Significance: Our results compel caution when comparing demographic trends between populations without accounting for uncertainty. Our methods will be widely applicable to demographic studies of many species. PMID:21049049
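
    The resampling-and-projection idea in this abstract can be sketched as follows. This is an illustrative toy, not the authors' code: the two-stage matrix, vital-rate point estimates, sample size n, and uncertainty distributions are all assumptions. Vital rates are drawn from their assumed sampling distributions, each draw is projected through a Leslie-style matrix, and percentile bounds on the dominant eigenvalue give a confidence interval for the growth rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def growth_rate(survival, fecundities):
    """Dominant eigenvalue of a two-stage Leslie-style matrix,
    i.e. the asymptotic population growth rate."""
    A = np.array([[fecundities[0], fecundities[1]],
                  [survival,       0.0]])
    return max(abs(np.linalg.eigvals(A)))

# Invented point estimates: survival 0.5 estimated from n marked animals
# (binomial uncertainty); stage fecundities 1.0 and 1.5 with assumed SE 0.1.
n = 50
lams = []
for _ in range(2000):
    s = rng.binomial(n, 0.5) / n                       # resampled survival
    f = np.clip(rng.normal([1.0, 1.5], 0.1), 0, None)  # resampled fecundities
    lams.append(growth_rate(s, f))

lo, hi = np.percentile(lams, [2.5, 97.5])              # 95% percentile CI
print(f"lambda 95% CI: [{lo:.2f}, {hi:.2f}]")
```

    Doubling precision here works exactly as the abstract warns: the CI width shrinks with the square root of sampling effort, so halving it requires roughly quadrupling n.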

  2. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    PubMed

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
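
    Approach 1 (non-parametric bootstrapping) can be sketched as follows. This is a hedged illustration, not the study's model: the exponential time-to-event distribution, sample size, and scale are assumed for simplicity, whereas health economic models would typically fit richer distributions such as Weibull or lognormal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented individual-patient time-to-event data (months), standing in for
# the patient-level data that parametric distributions are fitted to.
times = rng.exponential(scale=12.0, size=200)

# Bootstrap the patients and refit the parametric distribution on every
# draw, so uncertainty in the distribution's own parameters propagates
# into the probabilistic sensitivity analysis.
boot_scales = []
for _ in range(1000):
    resample = rng.choice(times, size=times.size, replace=True)
    boot_scales.append(resample.mean())    # MLE of the exponential scale

lo, hi = np.percentile(boot_scales, [2.5, 97.5])
print(f"scale 95% CI: [{lo:.1f}, {hi:.1f}] months")
```

    Each probabilistic sensitivity analysis iteration would then sample patient-level outcomes from a distribution refitted this way, rather than from one fixed point-estimate distribution.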

  3. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Treesearch

    Tyler Jon Smith; Lucy Amanda Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  4. Health risks of climate change: an assessment of uncertainties and its implications for adaptation policies.

    PubMed

    Wardekker, J Arjan; de Jong, Arie; van Bree, Leendert; Turkenburg, Wim C; van der Sluijs, Jeroen P

    2012-09-19

    Projections of health risks of climate change are surrounded by uncertainties in knowledge. Understanding these uncertainties will help the selection of appropriate adaptation policies. We made an inventory of conceivable health impacts of climate change, explored the type and level of uncertainty for each impact, and discussed their implications for adaptation policy. A questionnaire-based expert elicitation was performed using an ordinal scoring scale. Experts were asked to indicate the level of precision with which health risks can be estimated, given the present state of knowledge. We assessed the individual scores, the expertise-weighted descriptive statistics, and the argumentation given for each score. Suggestions were made for how these uncertainties could be taken into account in climate change adaptation policy strategies. The results showed that the direction of change could be indicated for most anticipated health effects. For several potential effects, too little knowledge exists to indicate whether any impact will occur, or whether the impact will be positive or negative. For several effects, rough 'order-of-magnitude' estimates were considered possible. Factors limiting health impact quantification include: lack of data, multi-causality, unknown impacts given a high-quality health system, complex cause-effect relations leading to multi-directional impacts, possible changes of present-day response relations, and difficulties in predicting local climate impacts. Participants considered heat-related mortality and non-endemic vector-borne diseases particularly relevant for climate change adaptation. For possible climate-related health impacts characterised by ignorance, adaptation policies that focus on enhancing the health system's and society's capability of dealing with possible future changes, uncertainties and surprises (e.g. through resilience, flexibility, and adaptive capacity) are most appropriate. For climate-related health effects for which rough risk estimates are available, 'robust decision-making' is recommended. For health effects with limited societal and policy relevance, we recommend focusing on no-regret measures. For highly relevant health effects, precautionary measures can be considered. This study indicated that analysing and characterising uncertainty by means of a typology can be a useful approach for selecting and prioritising preferred adaptation policies to reduce future climate-related health risks.
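
    The expertise-weighted descriptive statistics mentioned in the abstract can be sketched numerically. This is an illustrative toy: the scores, scale anchors, and weights are invented, not the study's data. Ordinal precision scores from several experts are combined with expertise weights into a weighted mean and spread.

```python
import numpy as np

# Invented elicitation data: ordinal precision scores for one health impact
# (0 = direction of change unknown ... 4 = reliable quantitative estimate)
# and each expert's self-assessed expertise weight.
scores  = np.array([1, 2, 2, 3, 1, 2], dtype=float)
weights = np.array([2, 3, 1, 2, 3, 2], dtype=float)

w = weights / weights.sum()                    # normalised expertise weights
weighted_mean = float(np.dot(w, scores))
weighted_sd = float(np.sqrt(np.dot(w, (scores - weighted_mean) ** 2)))
print(f"weighted score {weighted_mean:.2f} +/- {weighted_sd:.2f}")
```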

  5. A crustal seismic velocity model for the UK, Ireland and surrounding seas

    USGS Publications Warehouse

    Kelly, A.; England, R.W.; Maguire, Peter K.H.

    2007-01-01

    A regional model of the 3-D variation in seismic P-wave velocity structure in the crust of NW Europe has been compiled from wide-angle reflection/refraction profiles. Along each 2-D profile a velocity-depth function has been digitised at 5 km intervals. These 1-D velocity functions were mapped into three dimensions using ordinary kriging with weights determined to minimise the difference between digitised and interpolated values. An analysis of variograms of the digitised data suggested a radial isotropic weighting scheme was most appropriate. Horizontal dimensions of the model cells are optimised at 40 × 40 km and the vertical dimension at 1 km. The resulting model provides a higher resolution image of the 3-D variation in seismic velocity structure of the UK, Ireland and surrounding areas than existing models. The construction of the model through kriging allows the uncertainty in the velocity structure to be assessed. This uncertainty indicates the high density of data required to confidently interpolate the crustal velocity structure, and shows that for this region the velocity is poorly constrained for large areas away from the input data. © 2007 The Authors. Journal compilation © 2007 RAS.
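
    The kriging-with-uncertainty idea can be sketched in one dimension. This is an illustrative toy, not the compiled model: the exponential variogram, its sill and range, and the velocity-depth picks are all assumptions. Ordinary kriging returns both an interpolated value and a kriging variance, the uncertainty measure the abstract refers to.

```python
import numpy as np

def variogram(h, sill=1.0, vrange=30.0):
    """Exponential variogram model (sill and range are invented)."""
    return sill * (1.0 - np.exp(-3.0 * h / vrange))

def ordinary_krige(x_data, v_data, x0):
    """Ordinary kriging at one location: estimate plus kriging variance."""
    n = len(x_data)
    A = np.ones((n + 1, n + 1))            # kriging system, with an extra
    A[:n, :n] = variogram(np.abs(x_data[:, None] - x_data[None, :]))
    A[n, n] = 0.0                          # row/column for the Lagrange
    b = np.ones(n + 1)                     # multiplier (unbiasedness)
    b[:n] = variogram(np.abs(x_data - x0))
    w = np.linalg.solve(A, b)
    estimate = float(w[:n] @ v_data)
    variance = float(w @ b)                # kriging variance
    return estimate, variance

# Invented 1-D velocity picks along a profile: position (km), Vp (km/s).
x = np.array([0.0, 10.0, 25.0, 40.0])
v = np.array([5.8, 6.0, 6.3, 6.9])

est, var = ordinary_krige(x, v, 5.0)                     # between data
est_at_datum, var_at_datum = ordinary_krige(x, v, 10.0)  # at a datum
print(f"Vp(5 km) = {est:.2f} km/s, kriging variance {var:.3f}")
```

    Kriging is an exact interpolator: at a datum the estimate reproduces the observation and the variance collapses to zero; away from the data the variance grows, which is how poorly constrained areas show up in the compiled model.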

  6. ISA implementation and uncertainty: a literature review and expert elicitation study.

    PubMed

    van der Pas, J W G M; Marchau, V A W J; Walker, W E; van Wee, G P; Vlassenroot, S H

    2012-09-01

    Each day, an average of over 116 people die in traffic accidents in the European Union. One out of three fatalities is estimated to be the result of speeding. The current state of technology makes it possible to make speeding more difficult, or even impossible, by placing intelligent speed limiters (so-called ISA devices) in vehicles. Although the ISA technology has been available for some years now, and reducing the number of road traffic fatalities and injuries has been high on the European political agenda, implementation still seems to be far away. Experts indicate that there are still too many uncertainties surrounding ISA implementation, and dealing with these uncertainties is essential for implementing ISA. In this paper, a systematic and representative inventory of the uncertainties is made based upon the literature. Furthermore, experts in the field of ISA were surveyed and asked which uncertainties are barriers for ISA implementation, and how uncertain these uncertainties are. We found that the long-term effects and the effects of large-scale implementation of ISA are still uncertain and are the most important barriers for the implementation of the most effective types of ISA. One way to deal with these uncertainties would be to start implementation on a small scale and gradually expand the penetration, in order to learn how ISA influences the transport system over time. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Plural Forms of Evidence and Uncertainty in Environmental Health: A Comparison of Two Chinese Cases

    ERIC Educational Resources Information Center

    Lora-Wainwright, Anna

    2013-01-01

    This paper examines the plural forms of evidence of harm presented by the residents of two Chinese villages affected by severe pollution. Conversely, it scrutinises how and why the antonym to evidence--uncertainty--is emphasised and with what effects. It argues that their uncertainty surrounding environmental health harm is a result of the…

  8. Simulation of CO2 Sequestration at Rock Springs Uplift, Wyoming: Heterogeneity and Uncertainties in Storage Capacity, Injectivity and Leakage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Hailin; Dai, Zhenxue; Jiao, Zunsheng

    2011-01-01

    Many geological, geochemical, geomechanical and hydrogeological factors control CO2 storage in the subsurface. Among them, heterogeneity in the saline aquifer can seriously influence the design of injection wells, CO2 injection rate, CO2 plume migration, storage capacity, and potential leakage and risk assessment. This study applies indicator geostatistics, transition probability and Markov chain models at the Rock Springs Uplift, Wyoming, generating facies-based heterogeneous fields for porosity and permeability in the target saline aquifer (Pennsylvanian Weber sandstone) and surrounding rocks (Phosphoria, Madison and cap-rock Chugwater). A multiphase flow simulator, FEHM, is then used to model injection of CO2 into the target saline aquifer involving field-scale heterogeneity. The results reveal that (1) CO2 injection rates in different injection wells significantly change with local permeability distributions; (2) brine production rates in different pumping wells are also significantly impacted by the spatial heterogeneity in permeability; (3) liquid pressure evolution during and after CO2 injection in the saline aquifer varies greatly for different realizations of random permeability fields, and this has potentially important effects on hydraulic fracturing of the reservoir rock, reactivation of pre-existing faults and the integrity of the cap-rock; (4) the CO2 storage capacity estimate for the Rock Springs Uplift is 6614 ± 256 Mt at the 95% confidence interval, which is about 36% of a previous estimate based on a homogeneous and isotropic storage formation; (5) density profiles show that the density of injected CO2 below 3 km is close to that of the ambient brine with the given geothermal gradient and brine concentration, which indicates the CO2 plume can sink to depth before reaching thermal equilibrium with the brine. Finally, we present an uncertainty analysis of CO2 leakage into overlying formations due to heterogeneity in both the target saline aquifer and surrounding formations. This uncertainty in leakage will be used to feed into risk assessment modeling.
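
    The transition-probability/Markov-chain facies step can be sketched as follows. This is an illustrative toy, not the study's T-PROGS/FEHM workflow: the facies set, transition matrix, cell count, and facies-to-permeability mapping are all invented.

```python
import numpy as np

rng = np.random.default_rng(7)

facies = ["sand", "silt", "shale"]
# Invented vertical transition probabilities (each row sums to 1).
P = np.array([[0.80, 0.15, 0.05],
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])
# Invented log10 permeability (mD) assigned per facies.
log10_k = {"sand": 2.0, "silt": 0.0, "shale": -3.0}

state, column = 0, []
for _ in range(200):                 # simulate a 200-cell vertical column
    column.append(facies[state])
    state = rng.choice(3, p=P[state])

perm = np.array([10.0 ** log10_k[f] for f in column])
print(f"first cells: {column[:5]}, mean permeability {perm.mean():.1f} mD")
```

    Repeating the simulation gives multiple equally plausible permeability realizations; running the flow model on each is what turns facies heterogeneity into the spread of injection rates, pressures, and capacity estimates the abstract reports.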

  9. The American Work Force, 1992-2005. Historical Trends, 1950-92, and Current Uncertainties.

    ERIC Educational Resources Information Center

    Kutscher, Ronald E.

    1993-01-01

    Reviews the trends of the last four decades in terms of the labor force, economics, employment by industry, and employment by occupation. Considers uncertainties surrounding projections to 2005: end of the cold war, European unification, and the North American Free Trade Agreement. (SK)

  10. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  11. Accepting uncertainty, assessing risk: decision quality in managing wildfire, forest resource values, and new technology

    Treesearch

    Jeffrey G. Borchers

    2005-01-01

    The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...

  12. Preliminary volcanic hazards evaluation for Los Alamos National Laboratory Facilities and Operations : current state of knowledge and proposed path forward

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.

    2010-09-01

    The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.

  13. Effectively Communicating the Uncertainties Surrounding Ebola Virus Transmission.

    PubMed

    Kilianski, Andy; Evans, Nicholas G

    2015-10-01

    The current Ebola virus outbreak has highlighted the uncertainties surrounding many aspects of Ebola virus virology, including routes of transmission. The scientific community played a leading role during the outbreak (potentially the largest of its kind), as many of the questions surrounding ebolaviruses have only been interrogated in the laboratory. Scientists provided an invaluable resource for clinicians, public health officials, policy makers, and the lay public in understanding the progress of Ebola virus disease and the continuing outbreak. Not all of the scientific communication, however, was accurate or effective. There were multiple instances of published articles during the height of the outbreak containing potentially misleading scientific language that spurred media overreaction and potentially jeopardized preparedness and policy decisions at critical points. Here, we use articles declaring the potential for airborne transmission of Ebola virus as a case study in the inaccurate reporting of basic science, and we provide recommendations for improving the communication about unknown aspects of disease during public health crises.

  14. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
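
    The block bootstrap time-series sampling mentioned in the abstract can be sketched as follows. This is a generic moving-block bootstrap on an invented AR(1) series, not the study's data or code; the block length is an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

def block_bootstrap(series, block_len, rng):
    """Moving-block bootstrap: rebuild a series from random contiguous
    blocks, preserving the short-range autocorrelation that plain
    i.i.d. resampling would destroy."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

# Invented AR(1) series standing in for an autocorrelated model input
# such as annual recharge or allocation data.
eps = rng.normal(size=300)
series = np.empty(300)
series[0] = eps[0]
for t in range(1, 300):
    series[t] = 0.7 * series[t - 1] + eps[t]

resampled = block_bootstrap(series, block_len=10, rng=rng)
print(len(resampled), round(float(resampled.mean()), 2))
```

    Feeding many such resampled input series through the coupled model yields an ensemble of outputs whose spread reflects input uncertainty, one of the tasks the framework organises.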

  15. Sedimentary Records of the Paleohurricane Activity in the Bahamas

    NASA Astrophysics Data System (ADS)

    Wallace, E. J.; Donnelly, J. P.; Wiman, C.; Cashman, M.

    2015-12-01

    Hurricanes pose a threat to human lives and can cause significant destruction of coastal areas. This threat has become more pronounced with recent rises in sea level and coastal populations. Currently, there is a large degree of uncertainty surrounding future changes in tropical cyclone activity. This is due to the limitations of climate models as well as the scarcity and unreliability of the current observational record. With so much uncertainty surrounding the current projections of hurricane activity, it is crucial to establish a longer and more accurate historical record. This study uses sediment cores extracted from blue holes in the Bahamas to develop a record of intense hurricane landfalls in the region dating back more than a millennium. The collected cores were sectioned, split, and scanned on an X-ray fluorescence scanner to obtain a high resolution core profile of the sediments' elemental composition and to identify potential sedimentary structures. Age control of the samples was determined using radiocarbon dating, coarse fraction was measured every centimeter, and hurricane event bed frequency was established for each core. We assess the statistical significance of the patterns observed in the sedimentary record using a coupled ocean-atmosphere hurricane model to simulate storms representative of modern climatology. Cores extracted from two blue holes near South Andros Island provide approximately a 1600-year and a 600-year record, respectively, with sedimentation rates exceeding 1 cm/year. Both records contain coarse-grained event deposits that correlate with known historical intense hurricane strikes in the Bahamas within age uncertainties. The 1600-year record confirms previous hurricane reconstructions from the Caribbean indicating higher tropical cyclone activity from 500 to 1400 CE.
In addition, these new high-resolution records indicate elevated intense hurricane activity in the 17th and 18th centuries CE, when activity is also elevated in lower resolution records from Abaco, Bahamas and Vieques, Puerto Rico. However, records from the northeast United States and Gulf of Mexico are relatively inactive. This spatial variability in intense hurricane landfalls suggests significant regional controls on hurricane activity.

  16. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and index data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process.
Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
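
    The final integration step, simple model averaging, can be sketched as follows. The biomass estimates and delta-AIC values are hypothetical, and equal weighting is only one possible choice.

```python
import numpy as np

# Hypothetical spawning-stock biomass estimates (tonnes) from four model
# variants with different natural-mortality and growth assumptions.
estimates = np.array([52000.0, 48000.0, 61000.0, 55000.0])

# Simple (equal-weight) model averaging, as in the abstract's final step.
avg = float(estimates.mean())

# An alternative: information-criterion weights, w_i proportional to
# exp(-0.5 * delta_AIC_i), with hypothetical delta-AIC values.
delta_aic = np.array([0.0, 2.0, 4.0, 1.0])
w = np.exp(-0.5 * delta_aic)
w /= w.sum()
aic_avg = float(w @ estimates)
print(f"equal-weight {avg:.0f} t, AIC-weighted {aic_avg:.0f} t")
```

    Either way, the averaged result carries the spread across model assumptions instead of discarding every model but the 'best' one.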

  17. Unraveling Diagnostic Uncertainty Surrounding Lyme Disease in Children with Neuropsychiatric Illness.

    PubMed

    Koster, Michael P; Garro, Aris

    2018-01-01

    Lyme disease is endemic in parts of the United States, including New England, the Atlantic seaboard, and Great Lakes region. The presentation has various manifestations, many of which can mimic psychiatric diseases in children. Distinguishing manifestations of Lyme disease from those of psychiatric illnesses is complicated by inexact diagnostic tests and misuse of these tests when they are not clinically indicated. This article aims to describe manifestations of Lyme disease in children with an emphasis on Lyme neuroborreliosis. Clinical scenarios will be presented and discussed. Finally, recommendations for clinical psychiatrists who encounter children with possible Lyme disease are presented. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Methods for handling uncertainty within pharmaceutical funding decisions

    NASA Astrophysics Data System (ADS)

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

    This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing, as we review here.

  19. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    PubMed

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concern has been raised in recent decades because pollutant emissions from municipal solid waste (MSW) disposal processes pose risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and the objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning a MSW management system, where a mixed-integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis of decisions on timing, sizing and siting of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on economic-oriented planning. The results can help identify desired alternatives for managing MSW, with the advantage of providing compromise schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.
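
    The interval part of the IBPP idea can be sketched with plain interval arithmetic. This toy only evaluates a fixed allocation under interval-valued unit costs (all tonnages and cost intervals are invented), whereas the paper optimises allocations within an interactive MILP framework.

```python
# Interval arithmetic sketch: unit treatment costs are known only as
# intervals, so the total cost of a fixed waste-allocation plan is itself
# an interval. All tonnages and unit costs are invented.
facilities = {
    # name: (tonnes allocated, (unit cost low, unit cost high) in $/t)
    "landfill":    (300.0, (30.0, 45.0)),
    "incinerator": (150.0, (55.0, 70.0)),
    "composting":  ( 80.0, (20.0, 30.0)),
}

cost_lo = sum(t * c[0] for t, c in facilities.values())
cost_hi = sum(t * c[1] for t, c in facilities.values())
print(f"system cost lies in [{cost_lo:.0f}, {cost_hi:.0f}] $")
```

    An interval-aware optimiser compares candidate plans by such bounds rather than by a single cost figure, which is what lets decisions hedge against the uncertain inputs.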

  20. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which major sources of uncertainty remain in predicting the level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  1. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role each location has, where the role is inferred from the pattern of occupation of the land. Two main uncertainties surround land use data: positional and categorical. This paper focuses on the second, as the first generally has less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to inform on a basic requirement for propagating uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.
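
    The confusion matrix under discussion can be sketched as follows (the matrix entries are invented). Note how all the derived summaries are aggregate rates, which is the basis of the paper's critique.

```python
import numpy as np

# Invented confusion matrix for three land-use classes
# (rows = reference data, columns = classified map).
cm = np.array([[50,  5,  5],
               [ 4, 40,  6],
               [ 6,  4, 30]])

overall_accuracy = np.trace(cm) / cm.sum()
producers_acc = np.diag(cm) / cm.sum(axis=1)   # omission view, per class
users_acc     = np.diag(cm) / cm.sum(axis=0)   # commission view, per class
print(round(float(overall_accuracy), 3), np.round(producers_acc, 2))
```

    None of these numbers says where in the grid the misclassified cells sit, yet that spatial distribution of errors is exactly what a distributed hydrological model needs in order to propagate the uncertainty.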

  2. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties, through dedicated guidelines for operational use, to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  3. Experimental Research Examining How People Can Cope with Uncertainty Through Soft Haptic Sensations.

    PubMed

    van Horen, Femke; Mussweiler, Thomas

    2015-09-16

    Human beings are constantly surrounded by uncertainty and change. The question arises how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal events (e.g., economic crises, political revolutions, terrorism threats) whose impact on one's future life is impossible to judge, cognitive strategies (like seeking additional information) are likely to fail to combat uncertainty. Instead, the current paper discusses a method demonstrating that people might deal with uncertainty experientially through soft haptic sensations. More specifically, because touching something soft creates a feeling of comfort and security, people prefer objects with softer as compared to harder properties when feeling uncertain. Seeking softness is a highly efficient and effective tool to deal with uncertainty, as our hands are available at all times. This protocol describes a set of methods demonstrating 1) how environmental (un)certainty can be situationally activated with an experiential priming procedure, 2) that the quality of the softness experience (what type of softness and how it is experienced) matters and 3) how uncertainty can be reduced using different methods.

  4. Optimal allocation of resources over health care programmes: dealing with decreasing marginal utility and uncertainty.

    PubMed

    Al, Maiwenn J; Feenstra, Talitha L; Hout, Ben A van

    2005-07-01

    This paper addresses the problem of how to value health care programmes with different ratios of costs to effects, specifically when taking into account that these costs and effects are uncertain. First, the traditional framework of maximising health effects with a given health care budget is extended to a flexible budget using a value function over money and health effects. Second, uncertainty surrounding costs and effects is included in the model using expected utility. Other approaches to uncertainty that do not specify a utility function are discussed and it is argued that these also include implicit notions about risk attitude.

  5. Air-water gas exchange and CO2 flux in a mangrove-dominated estuary

    USGS Publications Warehouse

    Ho, David T.; Ferrón, Sara; Engel, Victor C.; Larsen, Laurel G.; Barr, Jordan G.

    2014-01-01

    Mangrove forests are highly productive ecosystems, but the fate of mangrove-derived carbon remains uncertain. Part of that uncertainty stems from the fact that gas transfer velocities in mangrove-surrounded waters are not well determined, leading to uncertainty in air-water CO2 fluxes. Two SF6 tracer release experiments were conducted to determine gas transfer velocities (k(600) = 8.3 ± 0.4 and 8.1 ± 0.6 cm h−1), along with simultaneous measurements of pCO2 to determine the air-water CO2 fluxes from Shark River, Florida (232.11 ± 23.69 and 171.13 ± 20.28 mmol C m−2 d−1), an estuary within the largest contiguous mangrove forest in North America. The gas transfer velocity results are consistent with turbulent kinetic energy dissipation measurements, indicating a higher rate of turbulence and gas exchange than predicted by commonly used wind speed/gas exchange parameterizations. The results have important implications for carbon fluxes in mangrove ecosystems.
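    The bulk flux calculation underlying such estimates can be sketched as F = k · K0 · (pCO2,water − pCO2,air). The values below are illustrative stand-ins (only k600 echoes the reported magnitude, and the solubility and pCO2 numbers are invented), and a full treatment would also rescale k600 to the in-situ Schmidt number:

```python
# Minimal sketch of the bulk air-water CO2 flux formula.
k600_cm_per_h = 8.3      # tracer-derived gas transfer velocity [cm/h]
K0 = 0.035               # CO2 solubility [mol L^-1 atm^-1]; T- and S-dependent
pco2_water = 2000.0      # partial pressure of CO2 in water [uatm] (illustrative)
pco2_air = 400.0         # partial pressure of CO2 in air [uatm] (illustrative)

k_m_per_d = k600_cm_per_h * 24.0 / 100.0  # cm/h -> m/d
# mol L^-1 atm^-1 is numerically identical to mmol m^-3 uatm^-1
# (the factors of 1e6 cancel), so no further solubility conversion is needed.
flux = k_m_per_d * K0 * (pco2_water - pco2_air)  # [mmol C m^-2 d^-1]
```

With these stand-in numbers the flux comes out at roughly 1e2 mmol C m^-2 d^-1, i.e. the same order of magnitude as the fluxes reported for Shark River.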

  6. Plug-in electric vehicles: future market conditions and adoption rates

    EIA Publications

    2017-01-01

    This report, the first of four Issues in Focus articles from the International Energy Outlook 2017, analyzes the effects of uncertainties in the adoption of plug-in electric vehicles (PEVs) on worldwide transportation energy consumption. Uncertainties surrounding consumer acceptance, vehicle cost, policies, and other market conditions could affect future adoption rates of plug-in electric vehicles. Two side cases are presented in this report that assume different levels of PEV adoption and result in different levels of worldwide transportation energy consumption.

  7. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e. the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping, to target potential management activities at the most important factors or processes in order to influence the system towards behavioural (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).

  8. What is white?

    PubMed Central

    Bosten, J. M.; Beer, R. D.; MacLeod, D. I. A.

    2015-01-01

    To shed light on the perceptual basis of the color white, we measured settings of unique white in a dark surround. We find that settings reliably show more variability in an oblique (blue-yellow) direction in color space than along the cardinal axes of the cone-opponent mechanisms. This is against the idea that white perception arises at the null point of the cone-opponent mechanisms, but one alternative possibility is that it occurs through calibration to the visual environment. We found that the locus of maximum variability in settings lies close to the locus of natural daylights, suggesting that variability may result from uncertainty about the color of the illuminant. We tested this by manipulating uncertainty. First, we altered the extent to which the task was absolute (requiring knowledge of the illumination) or relative. We found no clear effect of this factor on the reduction in sensitivity in the blue-yellow direction. Second, we provided a white surround as a cue to the illumination or left the surround dark. Sensitivity was selectively worse in the blue-yellow direction when the surround was black than when it was white. Our results can be functionally related to the statistics of natural images, where a greater blue-yellow dispersion is characteristic of both reflectances (where anisotropy is weak) and illuminants (where it is very pronounced). Mechanistically, the results could suggest a neural signal responsive to deviations from the blue-yellow locus or an adaptively matched range of contrast response functions for signals that encode different directions in color space. PMID:26641948

  9. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior—that is, the amount of the uncertainty will first inflate, and subsequently decrease, with the growth of decoherence strengths in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
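    The quantum-memory-assisted entropic uncertainty relation that such studies build on is, in its standard (Berta et al.) form, stated here for context; the specific channel parameters of the paper are not reproduced:

```latex
% Entropic uncertainty relation in the presence of quantum memory B,
% for two incompatible measurements X and Z on system A:
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad c = \max_{i,j}\,\bigl|\langle x_i | z_j \rangle\bigr|^2 ,
% where S(\cdot|\cdot) is the conditional von Neumann entropy.
% A negative S(A|B) (entanglement between A and B) tightens the bound,
% which is why decoherence of the memory can inflate the uncertainty.
```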

  10. Relative Mesothelioma Potencies for Unregulated Respirable Elongated Mineral and Synthetic Particles

    EPA Science Inventory

    For decades uncertainties and contradictions have surrounded the issue of whether exposures to respirable elongated mineral and synthetic particles (REMPs and RESPs) present health risks such as those recognized for exposures to elongated asbestiform mineral particles from the fi...

  11. Measurement of high-energy neutron flux above ground utilizing a spallation based multiplicity technique

    DOE PAGES

    Roecker, Caleb; Bernstein, Adam; Marleau, Peter; ...

    2016-11-14

    Cosmogenic high-energy neutrons are a ubiquitous, difficult-to-shield, poorly measured background. Above ground, the high-energy neutron energy-dependent flux has been measured, with significantly varying results. Below ground, high-energy neutron fluxes are largely unmeasured. Here we present a reconstruction algorithm to unfold the incident neutron energy-dependent flux measured using the Multiplicity and Recoil Spectrometer (MARS), simulated test cases to verify the algorithm, and a new measurement of the above-ground high-energy neutron energy-dependent flux with a detailed systematic uncertainty analysis. Uncertainty estimates are provided based upon the measurement statistics, the incident angular distribution, the surrounding environment of the Monte Carlo model, and the MARS triggering efficiency. The quantified systematic uncertainty is dominated by the assumed incident neutron angular distribution and the surrounding environment of the Monte Carlo model. The energy-dependent neutron flux between 90 MeV and 400 MeV is reported. Between 90 MeV and 250 MeV the MARS results are comparable to previous Bonner sphere measurements. Over the total energy regime measured, the MARS results lie within the span of previous measurements. Lastly, these results demonstrate the feasibility of future below-ground measurements with MARS.

  13. [Optimization of measurement methods for a multi-frequency electromagnetic field from mobile phone base station using broadband EMF meter].

    PubMed

    Bieńkowski, Paweł; Cała, Paweł; Zubrzak, Bartłomiej

    2015-01-01

    This paper presents the characteristics of the mobile phone base station (BS) as an electromagnetic field (EMF) source. The most common system configurations and their construction are described. The parameters of radiated EMF are discussed in the context of the access methods and other parameters of the radio transmission. Attention is also paid to the antennas used in this technology. The influence of individual components of a multi-frequency EMF, most commonly found in BS surroundings, on the resultant EMF strength value indicated by popular broadband EMF meters was analyzed. Examples of the metrological characteristics of the most common EMF probes and 2 measurement scenarios for a multisystem base station, with and without microwave relays, are shown. The presented method for measuring multi-frequency EMF using 2 broadband probes allows for significant minimization of measurement uncertainty. Equations and formulas that can be used to calculate the actual EMF intensity from multi-frequency sources are given; they have been verified in laboratory conditions on a standard setup as well as in real conditions, in a survey of an existing base station with microwave relays. The presented measurement methodology for multi-frequency EMF from a BS with microwave relays, validated both in laboratory and real conditions, has been proven to be the optimal approach to the evaluation of EMF exposure in BS surroundings. Alternative approaches with much greater uncertainty (the precaution method) or a more complex measuring procedure (the source-exclusion method) are also presented. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  14. The Low-mass Population in the Young Cluster Stock 8: Stellar Properties and Initial Mass Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jose, Jessy; Herczeg, Gregory J.; Fang, Qiliang

    The evolution of H ii regions/supershells can trigger a new generation of stars/clusters at their peripheries, with environmental conditions that may affect the initial mass function, disk evolution, and star formation efficiency. In this paper we study the stellar content and star formation processes in the young cluster Stock 8, which itself is thought to have formed during the expansion of a supershell. We present deep optical photometry along with JHK and 3.6 and 4.5 μm photometry from UKIDSS and Spitzer-IRAC. We use multicolor criteria to identify candidate young stellar objects in the region. Using evolutionary models, we obtain a median log(age) of ∼6.5 (∼3.0 Myr) with an observed age spread of ∼0.25 dex for the cluster. Monte Carlo simulations of the population of Stock 8, based on estimates of the photometric uncertainty, differential reddening, binarity, and variability, indicate that these uncertainties introduce an age spread of ∼0.15 dex. The intrinsic age spread in the cluster is ∼0.2 dex. The fraction of young stellar objects surrounded by disks is ∼35%. The K-band luminosity function of Stock 8 is similar to that of the Trapezium cluster. The initial mass function (IMF) of Stock 8 has a Salpeter-like slope at >0.5 M⊙ and flattens and peaks at ∼0.4 M⊙, below which it declines into the substellar regime. Although Stock 8 is surrounded by several massive stars, there seems to be no severe environmental effect on the form of the IMF due to the proximity of massive stars around the cluster.

  15. An Examination of Government Relations Offices and State Funding

    ERIC Educational Resources Information Center

    Brumfield, Randall W.; Miller, Michael T.; Miles, Jennifer M.

    2009-01-01

    With soaring uncertainty surrounding the financing of public higher education, institutions are faced with developing strategies that will enable them to compete effectively for state funding. One component of cultivating resources and relationships for colleges and universities is the government relations office. Utilized for…

  16. The dynamics of local quantum uncertainty and trace distance discord for two-qubit X states under decoherence: a comparative study

    NASA Astrophysics Data System (ADS)

    Slaoui, A.; Daoud, M.; Laamara, R. Ahl

    2018-07-01

    We employ the concepts of local quantum uncertainty and geometric quantum discord based on the trace norm to investigate environmental effects on the quantum correlations of two bipartite quantum systems. The first concerns a two-qubit system coupled to two independent bosonic reservoirs. We show that the trace distance discord exhibits a freezing phenomenon, in contrast to local quantum uncertainty. The second scenario deals with a two-level system, initially prepared in a separable state, interacting with quantized electromagnetic radiation. Our results show that there exists an exchange of quantum correlations between the two-level system and its surroundings, which is responsible for the revival phenomenon of non-classical correlations.

  17. Harnessing the uncertainty monster: Putting quantitative constraints on the intergenerational social discount rate

    NASA Astrophysics Data System (ADS)

    Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.

    2017-09-01

    There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end, and the spot (horizon) discount rate drops below 2% by 2100 and drops below previous estimates by 2070.
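    The core mechanism behind a certainty-equivalent declining discount rate can be sketched numerically: under persistent uncertainty about the rate itself, one averages discount factors rather than rates, and the implied rate falls with horizon. The two candidate rates and their probabilities below are invented for illustration, not the paper's values:

```python
import numpy as np

# Uncertain constant annual discount rate: 1% or 4%, equally likely (illustrative).
rates = np.array([0.01, 0.04])
probs = np.array([0.5, 0.5])

years = np.arange(1, 101)
# Certainty-equivalent discount factor: expectation over the uncertain rates
ce_factor = np.array([(probs * np.exp(-rates * t)).sum() for t in years])
# Spot (horizon) rate implied by that factor
spot = -np.log(ce_factor) / years
# Forward (year-over-year) rate: decline of the log factor
forward = -np.diff(np.log(ce_factor), prepend=0.0)
```

Because the low-rate scenario dominates the expectation at long horizons, both rates decline toward the lowest candidate rate, which is the qualitative behaviour the abstract describes.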

  18. Cost-effectiveness of a complex workplace dietary intervention: an economic evaluation of the Food Choice at Work study

    PubMed Central

    Fitzgerald, Sarah; Murphy, Aileen; Kirby, Ann; Geaney, Fiona; Perry, Ivan J

    2018-01-01

    Objective To evaluate the costs, benefits and cost-effectiveness of complex workplace dietary interventions, involving nutrition education and system-level dietary modification, from the perspective of healthcare providers and employers. Design Single-study economic evaluation of a cluster-controlled trial (Food Choice at Work (FCW) study) with 1-year follow-up. Setting Four multinational manufacturing workplaces in Cork, Ireland. Participants 517 randomly selected employees (18–65 years) from four workplaces. Interventions Cost data were obtained from the FCW study. Nutrition education included individual nutrition consultations, nutrition information (traffic light menu labelling, posters, leaflets and emails) and presentations. System-level dietary modification included menu modification (restriction of fat, sugar and salt), increased fibre, fruit discounts, strategic positioning of healthier alternatives and portion size control. The combined intervention included both nutrition education and system-level dietary modification. No intervention was implemented in the control. Outcomes The primary outcome was an improvement in health-related quality of life, measured using the EuroQoL 5 Dimensions 5 Levels questionnaire. The secondary outcome was a reduction in absenteeism, measured in monetary terms. Probabilistic sensitivity analysis (Monte Carlo simulation) assessed parameter uncertainty. Results The system-level intervention dominated the education and combined interventions. When compared with the control, the incremental cost-effectiveness ratio (€101.37/quality-adjusted life-year) is less than the nationally accepted ceiling ratio, so the system-level intervention can be considered cost-effective. The cost-effectiveness acceptability curve indicates some decision uncertainty surrounding this, arising from uncertainty about the differences in effectiveness. These results are reiterated when the secondary outcome is considered in a cost–benefit analysis, whereby the system-level intervention yields the highest net benefit (€56.56 per employee). Conclusions System-level dietary modification alone offers the most value, improving employee health-related quality of life and generating net benefit for employers by reducing absenteeism. While system-level dietary modification strategies are potentially sustainable obesity prevention interventions, future research should include long-term outcomes to determine whether the improvements persist. Trial registration number ISRCTN35108237; Post-results. PMID:29502090
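    The probabilistic sensitivity analysis machinery described here (an ICER point estimate plus a cost-effectiveness acceptability curve) can be sketched as follows. The distributions of incremental costs and QALY gains are invented stand-ins (their means loosely echo an ICER of roughly €100/QALY), not the trial's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000  # number of probabilistic sensitivity analysis draws

# Hypothetical incremental cost (EUR) and QALY gain, intervention vs control.
d_cost = rng.normal(loc=5.0, scale=40.0, size=n)
d_qaly = rng.normal(loc=0.05, scale=0.03, size=n)

# Point-estimate incremental cost-effectiveness ratio (EUR per QALY)
icer = d_cost.mean() / d_qaly.mean()

# Cost-effectiveness acceptability curve: probability that the net
# monetary benefit is positive, across willingness-to-pay thresholds.
thresholds = np.linspace(0.0, 500.0, 51)
ceac = [float((wtp * d_qaly - d_cost > 0).mean()) for wtp in thresholds]
```

The CEAC never reaching 1 over plausible thresholds is exactly the "decision uncertainty" the abstract refers to: even a favourable ICER can coexist with a substantial probability of being the wrong choice.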

  19. CLIMATE CHANGE IN THAILAND AND ITS POTENTIAL IMPACT ON RICE YIELD

    EPA Science Inventory

    Because of the uncertainties surrounding prediction of climate change, it is common to employ climate scenarios to estimate its impacts on a system. Climate scenarios are sets of climatic perturbations used with models to test system sensitivity to projected changes. In this stud...

  20. Early Yields of Biomass Plantations in the North-Central U.S.

    Treesearch

    Edward Hansen

    1990-01-01

    A network of hybrid poplar short-rotation plantations was established across the north-central region of the U.S. during 1986-1988. This paper documents the greater-than-expected early yields from these plantations and discusses potential yields and the uncertainties surrounding potential yield estimates.

  1. Structural Abnormalities and Learning Impairments Induced by Low Level Thyroid Hormone Insufficiency: A Cross-Fostering Study

    EPA Science Inventory

    Severe reductions in thyroid hormones (TH) during development alter brain structure and impair learning. Uncertainty surrounds both the impact of lower levels of TH disruption and the sensitivity of available metrics to detect neurodevelopmental deficits of this disruption. We ha...

  2. Absolute near-infrared refractometry with a calibrated tilted fiber Bragg grating.

    PubMed

    Zhou, Wenjun; Mandia, David J; Barry, Seán T; Albert, Jacques

    2015-04-15

    The absolute refractive indices (RIs) of water and other liquids are determined with an uncertainty of ±0.001 at near-infrared wavelengths by using the tilted fiber Bragg grating (TFBG) cladding mode resonances of a standard single-mode fiber to measure the critical angle for total internal reflection at the interface between the fiber and its surroundings. The necessary condition to obtain absolute RIs (instead of measuring RI changes) is a thorough characterization of the dispersion of the core mode effective index of the TFBG across the full range of its cladding mode resonance spectrum. This technique is shown to be competitive with the best available measurements of the RIs of water and NaCl solutions at wavelengths in the vicinity of 1550 nm.

  3. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management, including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic flow values (lnNSE) was the evaluation criterion.
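    The variance decomposition described in the abstract can be sketched on a synthetic ensemble: with one simulation per (rainfall realisation, parameter set) pair, a two-way ANOVA splits the total streamflow variance into forcing, parameter, and interaction terms. All values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 20 rainfall realisations x 15 parameter sets,
# each cell a streamflow summary (e.g. annual runoff in mm). The effect
# magnitudes (30, 10, 5) are arbitrary.
n_forcing, n_params = 20, 15
flow = (500.0
        + rng.normal(0, 30, size=(n_forcing, 1))   # forcing effect
        + rng.normal(0, 10, size=(1, n_params))    # parameter effect
        + rng.normal(0, 5, size=(n_forcing, n_params)))  # interaction

grand = flow.mean()
row_means = flow.mean(axis=1, keepdims=True)  # per-forcing means
col_means = flow.mean(axis=0, keepdims=True)  # per-parameter-set means

# Two-way ANOVA sums of squares (one observation per cell, so the
# residual term plays the role of the forcing x parameter interaction).
ss_forcing = n_params * ((row_means - grand) ** 2).sum()
ss_params = n_forcing * ((col_means - grand) ** 2).sum()
ss_inter = ((flow - row_means - col_means + grand) ** 2).sum()
ss_total = ((flow - grand) ** 2).sum()

shares = {"forcing": ss_forcing / ss_total,
          "parameters": ss_params / ss_total,
          "interaction": ss_inter / ss_total}
```

The three sums of squares add up to the total exactly, so `shares` gives the fractional contribution of each source, which is the quantity the study reports for its 31 catchments.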

  4. Economic analysis of fuel treatments

    Treesearch

    D. Evan Mercer; Jeffrey P. Prestemon

    2012-01-01

    The economics of wildfire is complicated because wildfire behavior depends on the spatial and temporal scale at which management decisions are made, and because of uncertainties surrounding the results of management actions. Like the wildfire processes they seek to manage, interventions through fire prevention programs, suppression, and fuels management are scale dependent...

  5. Learning That Makes a Difference: Pedagogy and Practice for Learning Abroad

    ERIC Educational Resources Information Center

    Benham Rennick, Joanne

    2015-01-01

    Society faces significant new challenges surrounding issues in human health; global security; environmental devastation; human rights violations; economic uncertainty; population explosion and regression; recognition of diversity, difference and special populations at home and abroad. In light of these challenges, there is a great opportunity, and…

  6. Deliberating International Science Policy Controversies: Uncertainty and AIDS in South Africa

    ERIC Educational Resources Information Center

    Paroske, Marcus

    2009-01-01

    International science policy controversies involve disputes over cultural differences in the assessment of knowledge claims and competing visions of the policy-making process between different nations. This essay analyzes these dynamics in the recent controversy surrounding AIDS policy in South Africa. It develops the notion of an epistemological…

  7. Categorical Biases in Spatial Memory: The Role of Certainty

    ERIC Educational Resources Information Center

    Holden, Mark P.; Newcombe, Nora S.; Shipley, Thomas F.

    2015-01-01

    Memories for spatial locations often show systematic errors toward the central value of the surrounding region. The Category Adjustment (CA) model suggests that this bias is due to a Bayesian combination of categorical and metric information, which offers an optimal solution under conditions of uncertainty (Huttenlocher, Hedges, & Duncan,…

  8. Examining Punishment and Discipline: Defending the Use of Punishment by Coaches

    ERIC Educational Resources Information Center

    Seifried, Chad

    2008-01-01

    Confusion, uncertainty, and debate often surround the terms "discipline" and "punishment" because scholars fail to publicize that they possess distinctive meanings. This article differentiates punishment from discipline and attempts to present some rationale supporting its use, especially corporal punishment, in a sport setting from a coaching…

  9. Knowing a winning business idea when you see one.

    PubMed

    Kim, W C; Mauborgne, R

    2000-01-01

    Identifying which business ideas have real commercial potential is fraught with uncertainty, and even the most admired companies have stumbled. It's not as if they don't know what the challenges of innovation are. A new product has to offer customers exceptional utility at an attractive price, and the company must be able to deliver it at a tidy profit. But the uncertainties surrounding innovation are so great that even the most insightful managers have a hard time evaluating the commercial readiness of new business ideas. In this article, W. Chan Kim and Renée Mauborgne introduce three tools that managers can use to help strip away some of that uncertainty. The first tool, "the buyer utility map," indicates how likely it is that customers will be attracted to a new business idea. The second, "the price corridor of the mass," identifies what price will unlock the greatest number of customers. And the third tool, "the business model guide," offers a framework for figuring out whether and how a company can profitably deliver the new idea at the targeted price. Applying the tools, though, is not the end of the story. Many innovations have to overcome adoption hurdles--strong resistance from stakeholders inside and outside the company. Often overlooked in the planning process, adoption hurdles can make or break the commercial viability of even the most powerful new ideas. The authors conclude by discussing how managers can head off negative reactions from stakeholders.

  10. Development of perspective-based water management strategies for the Rhine and Meuse basins.

    PubMed

    van Deursen, W P A; Middelkoop, H

    2005-01-01

Water management is surrounded by uncertainties. Water management thus has to answer the question: given the uncertainties, what is the best management strategy? This paper describes the application of the perspectives method to water management in the Rhine and Meuse basins. The perspectives method provides a structured framework for analysing water management strategies under uncertainty. Various strategies are clustered into perspectives according to their underlying assumptions. This framework allows for an analysis of current water management strategies, but also for evaluation of the robustness of proposed future water strategies. It becomes clear that no water management strategy is superior to the others, but that inherent choices about risk acceptance and costs pose a real political dilemma that will not be solved by further optimisation.

  11. Beyond the EDGE with EDAM: Prioritising British Plant Species According to Evolutionary Distinctiveness, and Accuracy and Magnitude of Decline

    PubMed Central

    Pearse, William D.; Chase, Mark W.; Crawley, Michael J.; Dolphin, Konrad; Fay, Michael F.; Joseph, Jeffrey A.; Powney, Gary; Preston, Chris D.; Rapacciuolo, Giovanni; Roy, David B.; Purvis, Andy

    2015-01-01

Conservation biologists have only finite resources, and so must prioritise some species over others. The EDGE-listing approach ranks species according to their combined evolutionary distinctiveness and degree of threat, but ignores the uncertainty surrounding both threat and evolutionary distinctiveness. We develop a new family of measures for species, which we name EDAM, that incorporates evolutionary distinctiveness, the magnitude of decline, and the accuracy with which decline can be predicted. Further, we show how the method can be extended to explore phylogenetic uncertainty. Using the vascular plants of Britain as a case study, we find that the various EDAM measures emphasise different species and parts of Britain, and that phylogenetic uncertainty can strongly affect the prioritisation scores of some species. PMID:26018568

  12. Spreading Ebola Panic: Newspaper and Social Media Coverage of the 2014 Ebola Health Crisis.

    PubMed

    Kilgo, Danielle K; Yoo, Joseph; Johnson, Thomas J

    2018-02-23

    During times of hot crises, traditional news organizations have historically contributed to public fear and panic by emphasizing risks and uncertainties. The degree to which digital and social media platforms contribute to this panic is essential to consider in the new media landscape. This research examines news coverage of the 2014 Ebola crisis, exploring differences in presentation between newspaper coverage and news shared on the social news platform Reddit. Results suggest that news shared on Reddit amplified panic and uncertainty surrounding Ebola, while traditional newspaper coverage was significantly less likely to produce panic-inducing coverage.

  13. Coastal barrier stratigraphy for Holocene high-resolution sea-level reconstruction

    PubMed Central

    Costas, Susana; Ferreira, Óscar; Plomaritis, Theocharis A.; Leorri, Eduardo

    2016-01-01

    The uncertainties surrounding present and future sea-level rise have revived the debate around sea-level changes through the deglaciation and mid- to late Holocene, from which arises a need for high-quality reconstructions of regional sea level. Here, we explore the stratigraphy of a sandy barrier to identify the best sea-level indicators and provide a new sea-level reconstruction for the central Portuguese coast over the past 6.5 ka. The selected indicators represent morphological features extracted from coastal barrier stratigraphy, beach berm and dune-beach contact. These features were mapped from high-resolution ground penetrating radar images of the subsurface and transformed into sea-level indicators through comparison with modern analogs and a chronology based on optically stimulated luminescence ages. Our reconstructions document a continuous but slow sea-level rise after 6.5 ka with an accumulated change in elevation of about 2 m. In the context of SW Europe, our results show good agreement with previous studies, including the Tagus isostatic model, with minor discrepancies that demand further improvement of regional models. This work reinforces the potential of barrier indicators to accurately reconstruct high-resolution mid- to late Holocene sea-level changes through simple approaches. PMID:27929122

  14. Coastal barrier stratigraphy for Holocene high-resolution sea-level reconstruction.

    PubMed

    Costas, Susana; Ferreira, Óscar; Plomaritis, Theocharis A; Leorri, Eduardo

    2016-12-08

    The uncertainties surrounding present and future sea-level rise have revived the debate around sea-level changes through the deglaciation and mid- to late Holocene, from which arises a need for high-quality reconstructions of regional sea level. Here, we explore the stratigraphy of a sandy barrier to identify the best sea-level indicators and provide a new sea-level reconstruction for the central Portuguese coast over the past 6.5 ka. The selected indicators represent morphological features extracted from coastal barrier stratigraphy, beach berm and dune-beach contact. These features were mapped from high-resolution ground penetrating radar images of the subsurface and transformed into sea-level indicators through comparison with modern analogs and a chronology based on optically stimulated luminescence ages. Our reconstructions document a continuous but slow sea-level rise after 6.5 ka with an accumulated change in elevation of about 2 m. In the context of SW Europe, our results show good agreement with previous studies, including the Tagus isostatic model, with minor discrepancies that demand further improvement of regional models. This work reinforces the potential of barrier indicators to accurately reconstruct high-resolution mid- to late Holocene sea-level changes through simple approaches.

  15. Incorporating anthropogenic influences into fire probability models: Effects of development and climate change on fire activity in California

    NASA Astrophysics Data System (ADS)

    Mann, M.; Moritz, M.; Batllori, E.; Waller, E.; Krawchuk, M.; Berck, P.

    2014-12-01

    The costly interactions between humans and natural fire regimes throughout California demonstrate the need to understand the uncertainties surrounding wildfire, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires. Models estimate an increase in fire occurrence between nine and fifty-three percent by the end of the century. Our goal is to assess the role of uncertainty in climate and anthropogenic influences on the state's fire regime from 2000-2050. We develop an empirical model that integrates novel information about the distribution and characteristics of future plant communities without assuming a particular distribution, and improve on previous efforts by integrating dynamic estimates of population density at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of the total fire count, and that further housing development will incite or suppress additional fires according to their intensity. We also find that the total area burned is likely to increase but at a slower than historical rate. Previous findings of substantially increased numbers of fires may be tied to the assumption of static fuel loadings, and the use of proxy variables not relevant to plant community distributions. We also find considerable agreement between GFDL and PCM model A2 runs, with decreasing fire counts expected only in areas of coastal influence below San Francisco and above Los Angeles. Due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid deserts of the inland south. The broad shifts of wildfire between California's climatic regions forecast in this study point to dramatic shifts in the pressures plant and human communities will face by midcentury. 
The information provided by this study reduces the level of uncertainty surrounding the influence that natural and anthropogenic systems have on wildfire.

  16. Linking the M&Rfi Weather Generator with Agrometeorological Models

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin; Trnka, Miroslav

    2015-04-01

Realistic meteorological inputs (representing the present and/or future climates) for agrometeorological model simulations are often produced by stochastic weather generators (WGs). This contribution presents some methodological issues and results obtained in our recent experiments. We also address selected questions raised in the synopsis of this session. The input meteorological time series for our experiments are produced by the parametric single-site weather generator (WG) M&Rfi, which is calibrated from the available observational data (or interpolated from surrounding stations). To produce meteorological series representing the future climate, the WG parameters are modified by climate change scenarios, which are prepared by the pattern scaling method: the standardised scenarios derived from Global or Regional Climate Models are multiplied by the change in global mean temperature (ΔTG) determined by the simple climate model MAGICC. The presentation will address the following questions: (i) The dependence of the quality of the synthetic weather series and impact results on the WG settings. Emphasis will be put on the effect of conditioning the daily WG on a monthly WG (presently one of our hot topics), which aims to improve the reproduction of low-frequency weather variability. Comparison of results obtained with various WG settings is made in terms of climatic and agroclimatic indices (including extreme temperature and precipitation characteristics and drought indices). (ii) Our methodology accounts for the uncertainties coming from various sources. We will show how the climate change impact results are affected by (1) uncertainty in climate modelling, (2) uncertainty in ΔTG, and (3) uncertainty related to the complexity of the climate change scenario (focusing on the effect of including changes in variability in the climate change scenarios).
Acknowledgements: This study was funded by project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248. The weather generator is being developed within the frame of WG4VALUE project (LD12029), which is supported by Ministry of Education, Youth and Sports and linked to the COST action ES1102 VALUE.
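The pattern-scaling step described in this abstract (a standardised scenario multiplied by ΔTG and applied to the WG parameters) can be sketched as follows. This is an illustrative sketch only: the parameter names and the additive-for-temperature / multiplicative-for-precipitation conventions are assumptions, not the actual M&Rfi interface.

```python
def apply_pattern_scaled_scenario(wg_params, std_scenario, delta_tg):
    """Modify weather-generator parameters by a pattern-scaled scenario.

    wg_params:    dict of baseline WG parameters per calendar month
    std_scenario: standardised changes per 1 degree C of global warming
                  (derived from a GCM/RCM), e.g. {'dT': 1.2, 'dP_frac': -0.03}
    delta_tg:     change in global mean temperature (e.g. from MAGICC)
    """
    scaled = {}
    for month, params in wg_params.items():
        chg = std_scenario[month]
        scaled[month] = {
            # temperature offsets are applied additively ...
            "tavg": params["tavg"] + chg["dT"] * delta_tg,
            # ... precipitation changes multiplicatively (fractional change)
            "prec": params["prec"] * (1.0 + chg["dP_frac"] * delta_tg),
        }
    return scaled
```

The same scaling can then be repeated for several ΔTG values or GCM patterns to span the three uncertainty sources the abstract lists.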

  17. Resolved Millimeter Observations of the HR 8799 Debris Disk

    NASA Astrophysics Data System (ADS)

    Wilner, David J.; MacGregor, Meredith A.; Andrews, Sean M.; Hughes, A. Meredith; Matthews, Brenda; Su, Kate

    2018-03-01

We present 1.3 mm observations of the debris disk surrounding the HR 8799 multi-planet system from the Submillimeter Array to complement archival ALMA observations that spatially filtered away the bulk of the emission. The image morphology at 3.″8 (150 au) resolution indicates an optically thin circumstellar belt, which we associate with a population of dust-producing planetesimals within the debris disk. The interferometric visibilities are fit well by an axisymmetric radial power-law model characterized by a broad width, ΔR/R ≳ 1. The belt inclination and orientation parameters are consistent with the planet orbital parameters within the mutual uncertainties. The models constrain the radial location of the inner edge of the belt to R_in = 104^{+8}_{-12} au. In a simple scenario where the chaotic zone of the outermost planet b truncates the planetesimal distribution, this inner edge location translates into a constraint on the planet b mass of M_pl = 5.8^{+7.9}_{-3.1} M_Jup. This mass estimate is consistent with infrared observations of the planet luminosity and standard hot-start evolutionary models, with the uncertainties allowing for a range of initial conditions. We also present new 9 mm observations of the debris disk from the Very Large Array and determine a millimeter spectral index of 2.41 ± 0.17. This value is typical of debris disks and indicates a power-law index of the grain size distribution q = 3.27 ± 0.10, close to predictions for a classical collisional cascade.
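The conversion from millimeter spectral index to grain-size index quoted in this abstract can be sketched with the commonly used Draine (2006)-style relation; the constants below (Planck spectral index α_pl ≈ 1.88 and dust opacity index scaling β_s ≈ 1.8) are assumed typical values, not the ones the authors actually adopted, so the result only approximately reproduces their q = 3.27 ± 0.10.

```python
def grain_size_index(alpha_mm, alpha_pl=1.88, beta_s=1.8):
    """Power-law index q of the grain size distribution, estimated from a
    millimeter spectral index via q = (alpha_mm - alpha_pl) / beta_s + 3.
    alpha_pl and beta_s are assumed illustrative constants."""
    return (alpha_mm - alpha_pl) / beta_s + 3.0

# with these illustrative constants, alpha_mm = 2.41 gives q ~ 3.3,
# close to the q = 3.27 +/- 0.10 reported in the abstract
q = grain_size_index(2.41)
```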

  18. Fault parameter constraints using relocated earthquakes: A validation of first-motion focal-mechanism data

    USGS Publications Warehouse

    Kilb, Debi; Hardebeck, J.L.

    2006-01-01

We estimate the strike and dip of three California fault segments (Calaveras, Sargent, and a portion of the San Andreas near San Juan Bautista) based on principal component analysis of accurately located microearthquakes. We compare these fault orientations with two different first-motion focal mechanism catalogs: the Northern California Earthquake Data Center (NCEDC) catalog, calculated using the FPFIT algorithm (Reasenberg and Oppenheimer, 1985), and a catalog created using the HASH algorithm that tests mechanism stability relative to seismic velocity model variations and earthquake location (Hardebeck and Shearer, 2002). We assume any disagreement (misfit >30° in strike, dip, or rake) indicates inaccurate focal mechanisms in the catalogs. With this assumption, we can quantify the parameters that identify the most optimally constrained focal mechanisms. For the NCEDC/FPFIT catalogs, we find that the best quantitative discriminator of quality focal mechanisms is the station distribution ratio (STDR) parameter, an indicator of how the stations are distributed about the focal sphere. Requiring STDR > 0.65 increases the acceptable mechanisms from 34%–37% to 63%–68%. This suggests stations should be uniformly distributed surrounding, rather than aligning, known fault traces. For the HASH catalogs, the fault plane uncertainty (FPU) parameter is the best discriminator, increasing the percent of acceptable mechanisms from 63%–78% to 81%–83% when FPU ≤ 35°. The overall higher percentage of acceptable mechanisms and the usefulness of the formal uncertainty in identifying quality mechanisms validate the HASH approach of testing for mechanism stability.
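The quality cuts reported in this abstract (STDR > 0.65 for FPFIT-style catalogs, FPU ≤ 35° for HASH-style catalogs) amount to a simple catalog filter. A minimal sketch, assuming hypothetical record fields `stdr` and `fpu` (the real catalogs use their own column formats):

```python
def acceptable_mechanisms(catalog, stdr_min=0.65, fpu_max=35.0):
    """Keep focal mechanisms passing the quality cuts discussed above.

    Each record is a dict with whichever quality metric its catalog provides:
    'stdr' (station distribution ratio, FPFIT-style) and/or
    'fpu'  (fault plane uncertainty in degrees, HASH-style).
    """
    keep = []
    for mech in catalog:
        if "stdr" in mech and mech["stdr"] <= stdr_min:
            continue  # poorly distributed stations about the focal sphere
        if "fpu" in mech and mech["fpu"] > fpu_max:
            continue  # fault plane too poorly constrained
        keep.append(mech)
    return keep
```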

  19. Invasive perennial grasses in Quercus garryana meadows of southwestern British Columbia: prospects for restoration

    Treesearch

    Andrew MacDougall

    2002-01-01

    Garry oak (Quercus garryana) meadows of the Pacific Northwest are heavily invaded but the dynamics surrounding this ecosystem transformation are poorly understood. Of particular uncertainty is the role of the invasive species in structuring the community, and the potential stability of this invasive-dominated system when disturbed. Clarifying such...

  20. Estimation of the Prevalence of Autism Spectrum Disorder in South Korea, Revisited

    ERIC Educational Resources Information Center

    Pantelis, Peter C.; Kennedy, Daniel P.

    2016-01-01

    Two-phase designs in epidemiological studies of autism prevalence introduce methodological complications that can severely limit the precision of resulting estimates. If the assumptions used to derive the prevalence estimate are invalid or if the uncertainty surrounding these assumptions is not properly accounted for in the statistical inference…

  1. Synthesis of common management concerns associated with dam removal

    Treesearch

    Desirée D. Tullos; Mathias J. Collins; J. Ryan Bellmore; Jennifer A. Bountry; Patrick J. Connolly; Patrick B. Shafroth; Andrew C. Wilcox

    2016-01-01

    Managers make decisions regarding if and how to remove dams in spite of uncertainty surrounding physical and ecological responses, and stakeholders often raise concerns about certain negative effects, regardless of whether these concerns are warranted at a particular site. We used a dam-removal science database supplemented with other information sources to explore...

  2. Topographic forcing and related uncertainties on glacier surface energy balance in High Mountain Asia

    NASA Astrophysics Data System (ADS)

    Olson, M.; Rupper, S.; Shean, D. E.

    2017-12-01

Topography directly influences the amount of global radiation, as well as other key energy flux terms, arriving on a glacier surface. This is particularly important in regions of variable and complex topography such as High Mountain Asia (HMA). In this region, surface energy and mass balance estimates often rely heavily on modeling, and thus require accurate accounting of topography through available remote sensing platforms. Our previous work shows that topographic shading from surrounding terrain can alter the mean daily potential direct shortwave radiation by upwards of 20% for some valley glaciers. In this work, we find in regions of high topographic relief that shading frequently dominates in the ablation zone rather than the accumulation zone, contrary to the findings of some previous studies. This, however, is largely dependent on valley aspect and the relative relief of nearby terrain. In addition, we examine the impact of topography, primarily topographic shading, on components of surface energy balance for a large sample of glaciers across different regions in HMA. Our results show that while the impact of topographic shading is highly variable throughout HMA, the magnitude of influence can often be predicted based on simple characteristics such as latitude, valley aspect, and orientation of the immediate surrounding topography. We also explore the uncertainty in topographic shading and in calculated surface energy due to the spatial resolution and accuracy of DEMs. In particular, we compare the shading and energy balance results utilizing a suite of DEMs, including 2 m, 8 m, and 30 m WorldView DEMs, the 30 m ASTER GDEM, the 30 m SRTM DEM, and the 30 m ALOS DEM. These results will help us improve glacier energy and mass balance modeling accuracy, and demonstrate limitations and uncertainties when modeling changes in surface energy fluxes due to surrounding topography for mountain glaciers.

  3. Sea-level projections representing the deeply uncertain contribution of the West Antarctic ice sheet.

    PubMed

    Bakker, Alexander M R; Wong, Tony E; Ruckert, Kelsey L; Keller, Klaus

    2017-06-20

There is a growing awareness that uncertainties surrounding future sea-level projections may be much larger than typically perceived. Recently published projections appear widely divergent and highly sensitive to non-trivial model choices. Moreover, the West Antarctic ice sheet (WAIS) may be much less stable than previously believed, enabling a rapid disintegration. Here, we present a set of probabilistic sea-level projections that approximates the deeply uncertain WAIS contributions. The projections aim to inform robust decisions by clarifying the sensitivity to non-trivial or controversial assumptions. We show that the deeply uncertain WAIS contribution can dominate other uncertainties within decades. These deep uncertainties call for the development of robust adaptive strategies. These decision-making needs, in turn, require mission-oriented basic science, for example about potential signposts and the maximum rate of WAIS-induced sea-level changes.

  4. Time-dependent seismic hazard analysis for the Greater Tehran and surrounding areas

    NASA Astrophysics Data System (ADS)

    Jalalalhosseini, Seyed Mostafa; Zafarani, Hamid; Zare, Mehdi

    2018-01-01

This study presents a time-dependent approach to seismic hazard in Tehran and surrounding areas. Hazard is evaluated by combining background seismic activity with larger earthquakes that may emanate from fault segments. Using available historical and paleoseismological data or empirical relations, the recurrence time and maximum magnitude of characteristic earthquakes for the major faults have been explored. The Brownian passage time (BPT) distribution has been used to calculate an equivalent fictitious seismicity rate for major faults in the region. To include ground motion uncertainty, a logic tree and five ground motion prediction equations have been selected based on their applicability in the region. Finally, hazard maps have been presented.
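The BPT renewal model mentioned in this abstract has a standard closed-form density (Matthews et al.'s inverse-Gaussian parameterisation with mean recurrence μ and aperiodicity α), from which a conditional rupture probability over a forecast horizon follows by integration. A minimal sketch, not the authors' implementation; the crude midpoint integration is purely illustrative:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage time (inverse Gaussian) density.
    mu: mean recurrence interval; alpha: aperiodicity (coefficient of variation)."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def conditional_prob(elapsed, horizon, mu, alpha, dt=0.01):
    """P(event in [elapsed, elapsed + horizon] | no event by `elapsed`),
    via midpoint-rule integration of the BPT density."""
    def integral(a, b):
        n = max(1, int((b - a) / dt))
        w = (b - a) / n
        return sum(bpt_pdf(a + (i + 0.5) * w, mu, alpha) * w for i in range(n))
    numerator = integral(elapsed, elapsed + horizon)
    survival = 1.0 - integral(1e-6, elapsed)
    return numerator / survival
```

For α < 1 the conditional probability grows as the elapsed time approaches the mean recurrence interval, which is what makes the hazard "time-dependent."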

  5. Highly Enriched Uranium Metal Cylinders Surrounded by Various Reflector Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard Jones; J. Blair Briggs; Leland Monteirth

A series of experiments was performed at Los Alamos Scientific Laboratory in 1958 to determine critical masses of cylinders of Oralloy (Oy) reflected by a number of materials. The experiments were all performed on the Comet Universal Critical Assembly Machine, and consisted of discs of highly enriched uranium (93.3 wt.% 235U) reflected by half-inch and one-inch-thick cylindrical shells of various reflector materials. The experiments were performed by members of Group N-2, particularly K. W. Gallup, G. E. Hansen, H. C. Paxton, and R. H. White. This experiment was intended to ascertain critical masses for criticality safety purposes, as well as to compare neutron transport cross sections to those obtained from danger coefficient measurements with the Topsy Oralloy-Tuballoy reflected and Godiva unreflected critical assemblies. The reflector materials examined in this series of experiments are as follows: magnesium, titanium, aluminum, graphite, mild steel, nickel, copper, cobalt, molybdenum, natural uranium, tungsten, beryllium, aluminum oxide, molybdenum carbide, and polythene (polyethylene). Also included are two special configurations of composite beryllium and iron reflectors. Analyses were performed in which the uncertainty associated with six different parameters was evaluated; namely, extrapolation to the uranium critical mass, uranium density, 235U enrichment, reflector density, reflector thickness, and reflector impurities. In addition to the idealizations made by the experimenters (removal of the platen and diaphragm), two simplifications were also made to the benchmark models that resulted in a small bias and additional uncertainty. First, since impurities in core and reflector materials are only estimated, they are not included in the benchmark models. Second, the room, support structure, and other possible surrounding equipment were not included in the model. 
Bias values that result from these two simplifications were determined, and the associated uncertainty in the bias values was included in the overall uncertainty in benchmark k_eff values. Bias values were very small, ranging from 0.0004 Δk low to 0.0007 Δk low. Overall uncertainties range from ±0.0018 to ±0.0030. Major contributors to the overall uncertainty include uncertainty in the extrapolation to the uranium critical mass and the uranium density. Results are summarized in Figure 1 (Experimental, Benchmark-Model, and MCNP/KENO Calculated Results). The 32 configurations described and evaluated under ICSBEP Identifier HEU-MET-FAST-084 are judged to be acceptable for use as criticality safety benchmark experiments and should be valuable integral benchmarks for nuclear data testing of the various reflector materials. Details of the benchmark models, uncertainty analyses, and final results are given in this paper.

  6. Reactive transport modeling at uranium in situ recovery sites: uncertainties in uranium sorption on iron hydroxides

    USGS Publications Warehouse

    Johnson, Raymond H.; Tutu, Hlanganani; Brown, Adrian; Figueroa, Linda; Wolkersdorfer, Christian

    2013-01-01

    Geochemical changes that can occur down gradient from uranium in situ recovery (ISR) sites are important for various stakeholders to understand when evaluating potential effects on surrounding groundwater quality. If down gradient solid-phase material consists of sandstone with iron hydroxide coatings (no pyrite or organic carbon), sorption of uranium on iron hydroxides can control uranium mobility. Using one-dimensional reactive transport models with PHREEQC, two different geochemical databases, and various geochemical parameters, the uncertainties in uranium sorption on iron hydroxides are evaluated, because these oxidized zones create a greater risk for future uranium transport than fully reduced zones where uranium generally precipitates.
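The abstract's central point, that sorption on iron hydroxides retards uranium transport, can be illustrated with the much simpler linear-Kd approximation. This is a hedged sketch only: the study itself uses surface-complexation modeling in PHREEQC (which is pH- and carbonate-dependent), not a constant Kd, and the default densities and porosity below are generic assumed values.

```python
def retardation_factor(kd, bulk_density=1.6, porosity=0.3):
    """Linear-sorption retardation factor R = 1 + (rho_b / theta) * Kd.

    kd:           distribution coefficient in mL/g (equivalently L/kg)
    bulk_density: aquifer bulk density in g/cm^3 (assumed illustrative value)
    porosity:     effective porosity, dimensionless (assumed illustrative value)
    """
    return 1.0 + (bulk_density / porosity) * kd

def retarded_velocity(groundwater_velocity, kd, **kwargs):
    """Apparent down-gradient transport velocity of a sorbing solute."""
    return groundwater_velocity / retardation_factor(kd, **kwargs)
```

Even a modest Kd yields a large retardation factor, which is why uncertainty in the sorption parameters translates directly into large uncertainty in predicted uranium arrival times.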

  7. Folic Acid Food Fortification—Its History, Effect, Concerns, and Future Directions

    PubMed Central

    Crider, Krista S.; Bailey, Lynn B.; Berry, Robert J.

    2011-01-01

    Periconceptional intake of folic acid is known to reduce a woman’s risk of having an infant affected by a neural tube birth defect (NTD). National programs to mandate fortification of food with folic acid have reduced the prevalence of NTDs worldwide. Uncertainty surrounding possible unintended consequences has led to concerns about higher folic acid intake and food fortification programs. This uncertainty emphasizes the need to continually monitor fortification programs for accurate measures of their effect and the ability to address concerns as they arise. This review highlights the history, effect, concerns, and future directions of folic acid food fortification programs. PMID:22254102

  8. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. © 2016 John Wiley & Sons Ltd.

  9. Lecture Notes on Criticality Safety Validation Using MCNP & Whisper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

Training classes for nuclear criticality safety, MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given--best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection – c_k's, weights; extreme value theory – bias, bias uncertainty; MOS for nuclear data uncertainty – GLLS) and usage are discussed.

  10. Stick or twist? Career decision-making during contractual uncertainty for NHS junior doctors.

    PubMed

    Spooner, S; Gibson, Jon; Rigby, Dan; Sutton, Matt; Pearson, Emma; Checkland, Kath

    2017-01-25

    To examine the extent, and nature, of impact on junior doctors' career decisions, of a proposed new contract and the uncertainty surrounding it. Mixed methods. Online survey exploring: doctors' future training intentions; their preferred specialty training (ST) programmes; whether they intended to proceed immediately to ST; and other plans. Linked qualitative interviews to explore more fully how and why decisions were affected. Doctors (F2s) in second year of Foundation School (FS) Programmes in England. Invitations sent by FSs. Open to all F2s November 2015-February 2016. All FSs represented. Survey completed by 816 F2s. Sample characteristics broadly similar to national F2 cohort. Proportions of doctors intending to proceed to ST posts in the UK, to defer or to exit UK medicine. Proportion of doctors indicating changes in training and career plans as a result of the contract and/or resulting uncertainty. Distribution of changes across training programmes. Explanations of these intentions from interviews and free text comments. Among the responding junior doctors, 20% indicated that issues related to the contract had prompted them to switch specialty and a further 20% had become uncertain about switching specialty. Switching specialty choice was more prevalent among those now choosing a community-based, rather than hospital-based specialty. 30% selecting general practice had switched choice because of the new contract. Interview data suggests that doctors felt they had become less valued or appreciated in the National Health Service and in society more broadly. Doctors reported that contract-related issues have affected their career plans. The most notable effect is a move away from acute to community-based specialities, with the former perceived as more negatively affected by the proposed changes. It is concerning that young doctors feel undervalued, and this requires further investigation. Published by the BMJ Publishing Group Limited. 

  11. Stick or twist? Career decision-making during contractual uncertainty for NHS junior doctors

    PubMed Central

    Gibson, Jon; Rigby, Dan; Sutton, Matt; Pearson, Emma; Checkland, Kath

    2017-01-01

    Objectives To examine the extent, and nature, of impact on junior doctors' career decisions, of a proposed new contract and the uncertainty surrounding it. Design Mixed methods. Online survey exploring: doctors' future training intentions; their preferred specialty training (ST) programmes; whether they intended to proceed immediately to ST; and other plans. Linked qualitative interviews to explore more fully how and why decisions were affected. Setting Doctors (F2s) in second year of Foundation School (FS) Programmes in England. Participants Invitations sent by FSs. Open to all F2s November 2015–February 2016. All FSs represented. Survey completed by 816 F2s. Sample characteristics broadly similar to national F2 cohort. Main outcome measures Proportions of doctors intending to proceed to ST posts in the UK, to defer or to exit UK medicine. Proportion of doctors indicating changes in training and career plans as a result of the contract and/or resulting uncertainty. Distribution of changes across training programmes. Explanations of these intentions from interviews and free text comments. Results Among the responding junior doctors, 20% indicated that issues related to the contract had prompted them to switch specialty and a further 20% had become uncertain about switching specialty. Switching specialty choice was more prevalent among those now choosing a community-based, rather than hospital-based specialty. 30% selecting general practice had switched choice because of the new contract. Interview data suggests that doctors felt they had become less valued or appreciated in the National Health Service and in society more broadly. Conclusions Doctors reported that contract-related issues have affected their career plans. The most notable effect is a move away from acute to community-based specialities, with the former perceived as more negatively affected by the proposed changes. 
It is concerning that young doctors feel undervalued, and this requires further investigation. PMID:28122834

  12. The Future of the Computing Curriculum: How the Computing Curriculum Instills Values and Subjectivity in Young People

    ERIC Educational Resources Information Center

    Wohl, Benjamin S.; Beck, Sophie; Blair, Lynne

    2017-01-01

    In these early stages of implementation of the English computing curriculum policy reforms, there are uncertainties with regard to the intentions of computing education for young people. To date, research regarding the English computing curriculum has been mostly concerned with the content of the curriculum, its delivery and surrounding pedagogy. In…

  13. Developing an Intercultural Curriculum within the Context of the Internationalisation of Higher Education: Terminology, Typologies and Power

    ERIC Educational Resources Information Center

    Dunne, Ciaran

    2011-01-01

    Although many academics and policymakers espouse the idea of an intercultural curriculum in principle, the practical implementation of this is problematic for several reasons. Firstly, the ambiguity and uncertainty that often surrounds key concepts complicates the articulation of cogent rationales and goals. Secondly, there may be no clear vision…

  14. Silvicultural decisionmaking in an uncertain climate future: a workshop-based exploration of considerations, strategies, and approaches

    Treesearch

    Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Christopher R. Webster; Brian J. Palik; Mark J. Twery; John B. Bradford; Linda R. Parker; Andrea T. Hille; Sheela M. Johnson

    2011-01-01

    Land managers across the country face the immense challenge of developing and applying appropriate management strategies as forests respond to climate change. We hosted a workshop to explore silvicultural strategies for addressing the uncertainties surrounding climate change and forest response in the northeastern and north-central United States. Outcomes of this...

  15. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China, were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  16. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China, were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  17. Colorectal cancer screening comparing no screening, immunochemical and guaiac fecal occult blood tests: a cost-effectiveness analysis.

    PubMed

    van Rossum, Leo G M; van Rijn, Anne F; Verbeek, Andre L M; van Oijen, Martijn G H; Laheij, Robert J F; Fockens, Paul; Jansen, Jan B M J; Adang, Eddy M M; Dekker, Evelien

    2011-04-15

    Comparability of cost-effectiveness of colorectal cancer (CRC) screening strategies is limited if heterogeneous study data are combined. We analyzed prospective empirical data from a randomized-controlled trial to compare cost-effectiveness of screening with either one round of immunochemical fecal occult blood testing (I-FOBT; OC-Sensor®), one round of guaiac FOBT (G-FOBT; Hemoccult-II®) or no screening in Dutch individuals aged 50 to 75 years, completed with cancer registry and literature data, from a third-party payer perspective in a Markov model with first- and second-order Monte Carlo simulation. Costs were measured in Euros (€), effects in life-years gained, and both were discounted at 3%. Uncertainty surrounding important parameters was analyzed. I-FOBT dominated the alternatives: after one round of I-FOBT screening, a hypothetical person would on average gain 0.003 life-years and save the health care system €27 compared with G-FOBT and 0.003 life-years and €72 compared with no screening. Overall, in 4,460,265 Dutch individuals aged 50-75 years, after one round of I-FOBT screening, 13,400 life-years and €320 million would have been saved compared with no screening. I-FOBT also dominated in sensitivity analyses, varying uncertainty surrounding important effect and cost parameters. CRC screening with I-FOBT dominated G-FOBT and no screening with or without accounting for uncertainty. Copyright © 2010 UICC.

  18. Quantifying the Value of Perfect Information in Emergency Vaccination Campaigns.

    PubMed

    Bradbury, Naomi V; Probert, William J M; Shea, Katriona; Runge, Michael C; Fonnesbeck, Christopher J; Keeling, Matt J; Ferrari, Matthew J; Tildesley, Michael J

    2017-02-01

    Foot-and-mouth disease outbreaks in non-endemic countries can lead to large economic costs and livestock losses, but the use of vaccination has been contentious, partly due to uncertainty about emergency FMD vaccination. Value of information methods can be applied to disease outbreak problems such as FMD in order to investigate the performance improvement from resolving uncertainties. Here we calculate the expected value of resolving uncertainty about vaccine efficacy, time delay to immunity after vaccination and daily vaccination capacity for a hypothetical FMD outbreak in the UK. If it were possible to resolve all uncertainty prior to the introduction of control, we could expect savings of £55 million in outbreak costs, 221,900 fewer livestock culled and 4.3 days of outbreak duration. All vaccination strategies were found to be preferable to a culling-only strategy. However, the optimal vaccination radius was found to be highly dependent upon vaccination capacity for all management objectives. We calculate that by resolving the uncertainty surrounding vaccination capacity we would expect to return over 85% of the above savings, regardless of management objective. It may be possible to resolve uncertainty about daily vaccination capacity before an outbreak, and this would enable decision makers to select the optimal control action via careful contingency planning.
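The expected-value-of-perfect-information (EVPI) logic described in this abstract can be sketched in a few lines. The cost model, capacity range, and candidate radii below are purely illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Uncertain daily vaccination capacity (doses/day); range is illustrative.
capacities = rng.uniform(10_000, 50_000, size=10_000)

def outbreak_cost(radius_km, capacity):
    # Toy cost model: spread losses fall with a larger vaccination radius,
    # but logistics costs grow with area and shrink with capacity.
    return 1000.0 / radius_km + 2e5 * radius_km**2 / capacity

radii = [1, 3, 5, 10]
# Expected cost of committing to each radius before capacity is known:
exp_costs = [outbreak_cost(r, capacities).mean() for r in radii]
best_under_uncertainty = min(exp_costs)

# With perfect information, the best radius is chosen per capacity draw:
cost_with_pi = np.min(
    [outbreak_cost(r, capacities) for r in radii], axis=0
).mean()

evpi = best_under_uncertainty - cost_with_pi
print(f"EVPI = {evpi:.1f} cost units")
```

The EVPI is the gap between the best single strategy chosen under uncertainty and the average cost achievable if the uncertain capacity were known before committing to a radius.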

  19. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
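The Monte Carlo approach the authors describe can be illustrated with a minimal sketch. The quantity, distributions, and numbers below are hypothetical stand-ins, not the foodborne-illness estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical inputs for an incidence estimate; every number and
# distribution below is an illustrative assumption.
reported = 12_000                                  # surveillance case count
underreport = rng.triangular(5, 10, 20, size=n)    # true cases per reported case
false_pos = rng.uniform(0.05, 0.15, size=n)        # fraction of reports spurious

# Propagate both non-sampling uncertainties through the calculation:
incidence = reported * (1 - false_pos) * underreport

lo, mid, hi = np.percentile(incidence, [2.5, 50, 97.5])
print(f"median {mid:,.0f}; 95% uncertainty interval {lo:,.0f} to {hi:,.0f}")
```

The resulting interval conveys the combined systematic uncertainty honestly, rather than reporting a single point estimate with implied precision.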

  20. Quantifying the Value of Perfect Information in Emergency Vaccination Campaigns

    PubMed Central

    Probert, William J. M.; Shea, Katriona; Fonnesbeck, Christopher J.; Ferrari, Matthew J.; Tildesley, Michael J.

    2017-01-01

    Foot-and-mouth disease outbreaks in non-endemic countries can lead to large economic costs and livestock losses, but the use of vaccination has been contentious, partly due to uncertainty about emergency FMD vaccination. Value of information methods can be applied to disease outbreak problems such as FMD in order to investigate the performance improvement from resolving uncertainties. Here we calculate the expected value of resolving uncertainty about vaccine efficacy, time delay to immunity after vaccination and daily vaccination capacity for a hypothetical FMD outbreak in the UK. If it were possible to resolve all uncertainty prior to the introduction of control, we could expect savings of £55 million in outbreak costs, 221,900 fewer livestock culled and 4.3 days of outbreak duration. All vaccination strategies were found to be preferable to a culling-only strategy. However, the optimal vaccination radius was found to be highly dependent upon vaccination capacity for all management objectives. We calculate that by resolving the uncertainty surrounding vaccination capacity we would expect to return over 85% of the above savings, regardless of management objective. It may be possible to resolve uncertainty about daily vaccination capacity before an outbreak, and this would enable decision makers to select the optimal control action via careful contingency planning. PMID:28207777

  1. Patterns in leaf morphological traits of Chinese woody plants and the application for paleoclimate reconstruction

    NASA Astrophysics Data System (ADS)

    Li, Yaoqi; Wang, Zhiheng

    2017-04-01

    Leaf morphological traits (LMTs) directly influence carbon uptake and water loss of plants in different habitats, and hence can be sensitive indicators of plant interaction with climate. The relationships between community-aggregated LMTs and their surrounding climate have been used to reconstruct paleoclimate. However, the uncertainties in this application remain poorly explored. Using distribution maps and LMT data (leaf margin states, leaf length, leaf width, and length-width product/ratio) of 10480 Chinese woody dicots and dated family-level phylogenies, we characterized geographical patterns of variation in LMTs and analyzed their relationships with climate across different life-forms (evergreen and deciduous; trees, shrubs and lianas) and species quartiles with different family ages. Results showed that from southern to northern China, leaves became shorter and narrower, while leaf length-width ratio increased and toothed-margin percentage decreased. Our results revealed great uncertainties in leaf margin-temperature relationships induced by life-form, precipitation and evolutionary history, and suggested that the widely used method, leaf margin analysis, should be applied cautiously to paleotemperature reconstruction. In contrast, mean leaf size responded tightly to spatial variations in annual evapotranspiration (AET) and primary productivity (GPP and NPP), and these relationships remained constant across different life-forms and evolutionary history, suggesting that leaf size could be a useful surrogate for paleo primary productivity.

  2. Carcass Persistence and Detectability: Reducing the Uncertainty Surrounding Wildlife-Vehicle Collision Surveys

    PubMed Central

    Santos-Reis, Margarida; Picanço de Figueiredo, Almir; Bager, Alex; Aguiar, Ludmilla M. S.

    2016-01-01

    Carcass persistence time and detectability are two main sources of uncertainty in roadkill surveys. In this study, we evaluate the influence of these uncertainties on roadkill surveys and estimates. To estimate carcass persistence time, three observers (including the driver) surveyed 114 km by car on a monthly basis for two years, searching for wildlife-vehicle collisions (WVC). Each survey consisted of five consecutive days. To estimate carcass detectability, we randomly selected stretches of 500 m to be also surveyed on foot by two other observers (total 292 walked stretches, 146 km walked). We expected that body size of the carcass, road type, presence of scavengers and weather conditions would be the main drivers influencing the carcass persistence times, but their relative importance was unknown. We also expected detectability to be highly dependent on body size. Overall, we recorded low median persistence times (one day) and low detectability (<10%) for all vertebrates. The results indicate that body size and landscape cover (as a surrogate of scavengers’ presence) are the major drivers of carcass persistence. Detectability was lower for animals with body mass less than 100 g when compared to carcasses with higher body mass. We estimated that our recorded mortality rates underestimated actual mortality by 2–10 fold. Although persistence times were similar to previous studies, the detectability rates here described are very different from previous studies. The results suggest that detectability is the main source of bias across WVC studies. Therefore, more than persistence times, studies should carefully account for differing detectability when comparing WVC studies. PMID:27806125

  3. Cost-effectiveness of a complex workplace dietary intervention: an economic evaluation of the Food Choice at Work study.

    PubMed

    Fitzgerald, Sarah; Murphy, Aileen; Kirby, Ann; Geaney, Fiona; Perry, Ivan J

    2018-03-03

    To evaluate the costs, benefits and cost-effectiveness of complex workplace dietary interventions, involving nutrition education and system-level dietary modification, from the perspective of healthcare providers and employers. Single-study economic evaluation of a cluster-controlled trial (Food Choice at Work (FCW) study) with 1-year follow-up. Four multinational manufacturing workplaces in Cork, Ireland. 517 randomly selected employees (18-65 years) from four workplaces. Cost data were obtained from the FCW study. Nutrition education included individual nutrition consultations, nutrition information (traffic light menu labelling, posters, leaflets and emails) and presentations. System-level dietary modification included menu modification (restriction of fat, sugar and salt), increase in fibre, fruit discounts, strategic positioning of healthier alternatives and portion size control. The combined intervention included nutrition education and system-level dietary modification. No intervention was implemented in the control. The primary outcome was an improvement in health-related quality of life, measured using the EuroQoL 5 Dimensions 5 Levels questionnaire. The secondary outcome measure was reduction in absenteeism, which is measured in monetary amounts. Probabilistic sensitivity analysis (Monte Carlo simulation) assessed parameter uncertainty. The system-level intervention dominated the education and combined interventions. When compared with the control, the incremental cost-effectiveness ratio (€101.37/quality-adjusted life-year) is less than the nationally accepted ceiling ratio, so the system-level intervention can be considered cost-effective. The cost-effectiveness acceptability curve indicates there is some decision uncertainty surrounding this, arising from uncertainty surrounding the differences in effectiveness. 
These results are reiterated when the secondary outcome measure is considered in a cost-benefit analysis, whereby the system-level intervention yields the highest net benefit (€56.56 per employee). System-level dietary modification alone offers the best value for improving employee health-related quality of life and for generating net benefit for employers by reducing absenteeism. While system-level dietary modification strategies are potentially sustainable obesity prevention interventions, future research should include long-term outcomes to determine if improvements in outcomes persist. ISRCTN35108237; Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
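The probabilistic sensitivity analysis and acceptability-curve logic mentioned in this abstract can be sketched as follows. The incremental cost and effect distributions are invented for illustration and do not reproduce the FCW study results:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Hypothetical PSA draws of incremental effects (QALYs) and costs (EUR)
# for an intervention vs control; the distributions are illustrative.
d_effect = rng.normal(0.01, 0.02, size=n)   # incremental QALYs
d_cost = rng.normal(1.0, 4.0, size=n)       # incremental cost, EUR

# CEAC: probability of positive net monetary benefit at each
# willingness-to-pay threshold (EUR per QALY).
thresholds = np.linspace(0, 45_000, 200)
ceac = np.array([(lam * d_effect - d_cost > 0).mean() for lam in thresholds])
```

Plotting `ceac` against `thresholds` gives the cost-effectiveness acceptability curve: the decision uncertainty at any ceiling ratio is simply one minus the curve's height there.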

  4. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.
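As a minimal illustration of the sampling-based sensitivity techniques named above, the sketch below draws a Latin hypercube sample of two inputs and ranks their influence on a toy response. The variable names, ranges, and response model are assumptions for illustration, not WIPP parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

def lhs(n, rng):
    # Latin hypercube sample on [0, 1): one stratified uniform draw per
    # equal-probability bin, then shuffled to decouple the pairing.
    return rng.permutation((np.arange(n) + rng.uniform(size=n)) / n)

# Illustrative uncertain inputs (ranges are assumptions)
corrosion = 0.1 + 0.9 * lhs(n, rng)   # steel corrosion rate
microbial = 5.0 * lhs(n, rng)         # cellulose degradation rate

# Toy response standing in for repository pressure
pressure = 2.0 * corrosion + 0.3 * microbial + rng.normal(0, 0.1, n)

def rank_corr(x, y):
    # Spearman rank correlation via Pearson correlation of the ranks
    rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

r_corrosion = rank_corr(corrosion, pressure)
r_microbial = rank_corr(microbial, pressure)
print(f"rank correlations: corrosion {r_corrosion:.2f}, microbial {r_microbial:.2f}")
```

The rank transformation makes the correlation robust to monotone nonlinearity, which is why it features alongside stepwise regression in analyses like the one described.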

  5. When size matters: attention affects performance by contrast or response gain.

    PubMed

    Herrmann, Katrin; Montaser-Kouhsari, Leila; Carrasco, Marisa; Heeger, David J

    2010-12-01

    Covert attention, the selective processing of visual information in the absence of eye movements, improves behavioral performance. We found that attention, both exogenous (involuntary) and endogenous (voluntary), can affect performance by contrast or response gain changes, depending on the stimulus size and the relative size of the attention field. These two variables were manipulated in a cueing task while stimulus contrast was varied. We observed a change in behavioral performance consonant with a change in contrast gain for small stimuli paired with spatial uncertainty and a change in response gain for large stimuli presented at one location (no uncertainty) and surrounded by irrelevant flanking distracters. A complementary neuroimaging experiment revealed that observers' attention fields were wider with than without spatial uncertainty. Our results support important predictions of the normalization model of attention and reconcile previous, seemingly contradictory findings on the effects of visual attention.
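One standard way to formalize the contrast-gain versus response-gain distinction in this abstract is the Naka-Rushton contrast-response function, sketched below with illustrative parameters (not fitted to the study's data):

```python
import numpy as np

def naka_rushton(c, r_max=1.0, c50=0.2, n=2.0):
    # Contrast-response function: response saturates at r_max, with
    # semi-saturation contrast c50 (parameter values are illustrative).
    return r_max * c**n / (c**n + c50**n)

contrasts = np.logspace(-2, 0, 50)                  # 1% to 100% contrast
baseline = naka_rushton(contrasts)
contrast_gain = naka_rushton(contrasts, c50=0.1)    # curve shifts leftward
response_gain = naka_rushton(contrasts, r_max=1.3)  # curve scales upward
```

A contrast-gain change boosts responses mainly at intermediate contrasts while leaving the asymptote untouched; a response-gain change multiplies the whole curve, which is the behavioral signature the study dissociates via stimulus and attention-field size.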

  6. Covariance propagation in spectral indices

    DOE PAGES

    Griffin, P. J.

    2015-01-09

    The dosimetry community has a history of using spectral indices to support neutron spectrum characterization and cross section validation efforts. An important aspect of this type of analysis is the proper consideration of the contribution of the spectrum uncertainty to the total uncertainty in calculated spectral indices (SIs). This study identifies deficiencies in the traditional treatment of the SI uncertainty, provides simple bounds to the spectral component in the SI uncertainty estimates, verifies that these estimates are reflected in actual applications, details a methodology that rigorously captures the spectral contribution to the uncertainty in the SI, and provides quantified examples that demonstrate the importance of the proper treatment of the spectral contribution to the uncertainty in the SI.
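The first-order covariance propagation underlying such an analysis can be sketched for a toy spectral index defined as a ratio of two reaction rates. The rates and covariance values below are illustrative only:

```python
import numpy as np

# Toy spectral index SI = R1/R2 with correlated rate uncertainties.
R = np.array([4.0, 2.0])                 # mean rates R1, R2
cov = np.array([[0.04, 0.024],           # covariance of (R1, R2); the
                [0.024, 0.09]])          # off-diagonal term stands in for
                                         # the shared spectral contribution

SI = R[0] / R[1]
grad = np.array([1.0 / R[1], -R[0] / R[1] ** 2])   # dSI/dR1, dSI/dR2

var_full = grad @ cov @ grad                       # with correlation
var_naive = grad @ np.diag(np.diag(cov)) @ grad    # correlation ignored

print(SI, var_full ** 0.5, var_naive ** 0.5)
```

Because the shared spectrum induces positive correlation between the two rates, part of the uncertainty cancels in the ratio; dropping the off-diagonal term (the "traditional treatment") overstates the SI uncertainty in this example.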

  7. Air Coupled Acoustic Thermography (ACAT) Inspection Technique

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph; Winfree, William P.; Yost, William T.

    2007-01-01

    The scope of this effort is to determine the viability of a new heating technique using a noncontact acoustic excitation source. Because of low coupling between air and the structure, a synchronous detection method is employed. Any reduction in the out of plane stiffness improves the acoustic coupling efficiency and as a result, defective areas have an increase in temperature relative to the surrounding area. Hence a new measurement system, based on air-coupled acoustic energy and synchronous detection is presented. An analytical model of a clamped circular plate is given, experimentally tested, and verified. Repeatability confirms the technique with a measurement uncertainty of plus or minus 6.2 percent. The range of frequencies used was 800-2,000 Hertz. Acoustic excitation and consequent thermal detection of flaws in a helicopter blade is examined and results indicate that air coupled acoustic excitation enables the detection of core damage in sandwich honeycomb structures.

  8. Hindrances to precise recovery of cellular forces in fibrous biopolymer networks.

    PubMed

    Zhang, Yunsong; Feng, Jingchen; Heizler, Shay I; Levine, Herbert

    2018-01-11

    How cells move through the three-dimensional extracellular matrix (ECM) is of increasing interest in attempts to understand important biological processes such as cancer metastasis. Just as in motion on flat surfaces, it is expected that experimental measurements of cell-generated forces will provide valuable information for uncovering the mechanisms of cell migration. However, the recovery of forces in fibrous biopolymer networks may suffer from large errors. Here, within the framework of lattice-based models, we explore possible issues in force recovery by solving the inverse problem: how can one determine the forces cells exert on their surroundings from the deformation of the ECM? Our results indicate that irregular cell traction patterns, the uncertainty of local fiber stiffness, the non-affine nature of ECM deformations and inadequate knowledge of network topology will all prevent precise force determination. Finally, we discuss possible ways of overcoming these difficulties.

  9. Analyzing the Interprofessional Working of a Home-Based Primary Care Team.

    PubMed

    Smith-Carrier, Tracy; Neysmith, Sheila

    2014-09-01

    Increasingly, interprofessional teams are responsible for providing integrated health care services. Effective teams, however, are not the result of chance but require careful planning and ongoing attention to team processes. Based on a case study involving interviews, participant observation, and a survey, we identified key attributes for effective interprofessional working (IPW) within a home-based primary care (HBPC) setting. Recognizing the importance of a theoretical model that reflects the multidimensional nature of team effectiveness research, we employed the integrated team effectiveness model to analyze our findings. The results indicated that a shared vision, common goals, respect, and trust among team members – as well as processes for ongoing communication, effective leadership, and mechanisms for conflict resolution – are vital in the development of a high-functioning IPW team. The ambiguity and uncertainty surrounding the context of service provision (clients' homes), as well as the negotiation of external relationships in the HBPC field, require further investigation.

  10. Pricing and reimbursement frameworks in Central Eastern Europe: a decision tool to support choices.

    PubMed

    Kolasa, Katarzyna; Kalo, Zoltan; Hornby, Edward

    2015-02-01

    Given limited financial resources in the Central Eastern European (CEE) region, challenges in obtaining access to innovative medical technologies are formidable. The objective of this research was to develop a decision tree that supports decision makers and drug manufacturers from the CEE region in their search for optimal innovative pricing and reimbursement schemes (IPRSs). A systematic literature review was performed to search for published IPRSs, and then ten experts from the CEE region were interviewed to ascertain their opinions on these schemes. In total, 33 articles representing 46 unique IPRSs were analyzed. Based on our literature review and subsequent expert input, key decision nodes and branches of the decision tree were developed. The results indicate that outcome-based schemes are better suited to deal with uncertainties surrounding cost effectiveness, while non-outcome-based schemes are more appropriate for pricing and budget impact challenges.

  11. Be Careful Where You Smile: Culture Shapes Judgments of Intelligence and Honesty of Smiling Individuals.

    PubMed

    Krys, Kuba; Vauclair, C.-Melanie; Capaldi, Colin A; Lun, Vivian Miu-Chi; Bond, Michael Harris; Domínguez-Espinosa, Alejandra; Torres, Claudio; Lipp, Ottmar V; Manickam, L Sam S; Xing, Cai; Antalíková, Radka; Pavlopoulos, Vassilis; Teyssier, Julien; Hur, Taekyun; Hansen, Karolina; Szarota, Piotr; Ahmed, Ramadan A; Burtceva, Eleonora; Chkhaidze, Ana; Cenko, Enila; Denoux, Patrick; Fülöp, Márta; Hassan, Arif; Igbokwe, David O; Işık, İdil; Javangwe, Gwatirera; Malbran, María; Maricchiolo, Fridanna; Mikarsa, Hera; Miles, Lynden K; Nader, Martin; Park, Joonha; Rizwan, Muhammad; Salem, Radwa; Schwarz, Beate; Shah, Irfana; Sun, Chien-Ru; van Tilburg, Wijnand; Wagner, Wolfgang; Wise, Ryan; Yu, Angela Arriola

    Smiling individuals are usually perceived more favorably than non-smiling ones: they are judged as happier, more attractive, competent, and friendly. These seemingly clear and obvious consequences of smiling are assumed to be culturally universal; however, most of the psychological research is carried out in WEIRD societies (Western, Educated, Industrialized, Rich, and Democratic) and the influence of culture on social perception of nonverbal behavior is still understudied. Here we show that a smiling individual may be judged as less intelligent than the same non-smiling individual in cultures low on the GLOBE's uncertainty avoidance dimension. Furthermore, we show that corruption at the societal level may undermine the prosocial perception of smiling: in societies with high corruption indicators, trust toward smiling individuals is reduced. This research fosters understanding of the cultural framework surrounding nonverbal communication processes and reveals that in some cultures smiling may lead to negative attributions.

  12. Hindrances to precise recovery of cellular forces in fibrous biopolymer networks

    NASA Astrophysics Data System (ADS)

    Zhang, Yunsong; Feng, Jingchen; Heizler, Shay I.; Levine, Herbert

    2018-03-01

    How cells move through the three-dimensional extracellular matrix (ECM) is of increasing interest in attempts to understand important biological processes such as cancer metastasis. Just as in motion on flat surfaces, it is expected that experimental measurements of cell-generated forces will provide valuable information for uncovering the mechanisms of cell migration. However, the recovery of forces in fibrous biopolymer networks may suffer from large errors. Here, within the framework of lattice-based models, we explore possible issues in force recovery by solving the inverse problem: how can one determine the forces cells exert on their surroundings from the deformation of the ECM? Our results indicate that irregular cell traction patterns, the uncertainty of local fiber stiffness, the non-affine nature of ECM deformations and inadequate knowledge of network topology will all prevent precise force determination. Finally, we discuss possible ways of overcoming these difficulties.

  13. Helicopter electromagnetic data from Everglades National Park and surrounding areas, Florida: collected 9-14 December 1994

    USGS Publications Warehouse

    Fitterman, David V.; Deszcz-Pan, Maria

    2002-01-01

    This report describes helicopter electromagnetic (HEM) data that were collected over a portion of Everglades National Park and surrounding areas in south Florida. The survey was flown 9-14 December 1994. The original data set processed by the contractor, Dighem, is provided as an ASCII, xyz flight-line file. Apparent resistivity grids generated from the original data set and JPEG images of these grids are also provided. The data have been corrected by the U.S. Geological Survey to remove the effects of calibration errors and bird-height uncertainty. The corrected data set is included in this report as flight-line data only.

  14. Gap Size Uncertainty Quantification in Advanced Gas Reactor TRISO Fuel Irradiation Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Binh T.; Einerson, Jeffrey J.; Hawkes, Grant L.

    The Advanced Gas Reactor (AGR)-3/4 experiment is the combination of the third and fourth tests conducted within the tristructural isotropic fuel development and qualification research program. The AGR-3/4 test consists of twelve independent capsules containing a fuel stack in the center surrounded by three graphite cylinders and shrouded by a stainless steel shell. This capsule design enables temperature control of both the fuel and the graphite rings by varying the neon/helium gas mixture flowing through the four resulting gaps. Knowledge of fuel and graphite temperatures is crucial for establishing the functional relationship between fission product release and irradiation thermal conditions. These temperatures are predicted for each capsule using the commercial finite-element heat transfer code ABAQUS. Uncertainty quantification reveals that the gap size uncertainties are among the dominant factors contributing to predicted temperature uncertainty due to high input sensitivity and uncertainty. Gap size uncertainty originates from the fact that all gap sizes vary with time due to dimensional changes of the fuel compacts and three graphite rings caused by extended exposure to high temperatures and fast neutron irradiation. Gap sizes are estimated using as-fabricated dimensional measurements at the start of irradiation and post irradiation examination dimensional measurements at the end of irradiation. Uncertainties in these measurements provide a basis for quantifying gap size uncertainty. However, lack of gap size measurements during irradiation and lack of knowledge about the dimension change rates lead to gap size modeling assumptions, which could increase gap size uncertainty. In addition, the dimensional measurements are performed at room temperature, and must be corrected to account for thermal expansion of the materials at high irradiation temperatures. 
Uncertainty in the thermal expansion coefficients for the graphite materials used in the AGR-3/4 capsules also increases gap size uncertainty. This study focuses on analysis of modeling assumptions and uncertainty sources to evaluate their impacts on the gap size uncertainty.« less

  15. Facing Uncertainty: The Role of the M1 Abrams Tank in the U.S. Army of 2015-2025

    DTIC Science & Technology

    2015-06-12

surrounding the role of the main battle tank has in fact been a defining feature of the Anglo-American attitude since World War I. The modern main...training to high-intensity combined arms training. To replicate combat operations in complex terrain, IDF soldiers trained in a mock Arab city in...

  16. State of the World 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarthy, D.

    1993-01-01

    State of the World 1993 warns particularly about a global decline in food production and a rise in poverty. However, other aspects are more positive: governments responding quickly to global environmental concerns such as the ozone hole and CFCs; the Earth Summit at Rio; and the possibility that we are on the road to a sustainable society. The uncertainty surrounding the issue of global warming is also presented.

  17. The effects of 11 yr of CO2 enrichment on roots in a Florida scrub-oak ecosystem

    Treesearch

    Frank Day; Rachel Schroeder; Daniel Stover; Alisha Brown; John Butnor; John Dilustro; Bruce Hungate; Paul Dijkstra; Benjamin Duval; Troy Seiler; Bert Drake; Ross Hinkle

    2013-01-01

    Uncertainty surrounds belowground plant responses to rising atmospheric CO2 because roots are difficult to measure, requiring frequent monitoring as a result of fine root dynamics and long-term monitoring as a result of sensitivity to resource availability. We report belowground plant responses of a scrub-oak ecosystem in Florida exposed to 11...

  18. Nuclear Winter: Uncertainties Surround the Long-Term Effects of Nuclear War. Report to the Congress.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    Nuclear winter, a term used to describe potential long-term climate and environmental effects of nuclear war, has been a subject of debate and controversy. This report examines and presents scientific and policy implications of nuclear winter. Contents include: (1) an executive summary (highlighting previous and current studies on the topic); (2)…

  19. Health care providers under pressure: making the most of challenging times.

    PubMed

    Davis, Scott B; Robinson, Phillip J

    2010-01-01

    Whether it is the slowing economic recovery, tight credit markets, increasing costs, or the uncertainty surrounding health care reform, the health care industry faces some sizeable challenges. These factors have put considerable strain on the traditional financing options the industry has relied on in the past--bonds, banks, finance companies, private equity, venture capital, real estate investment trusts, private philanthropy, and grants. At the same time, providers are dealing with rising costs, lower reimbursement rates, shrinking demand for elective procedures, higher levels of charitable care and bad debt, and increased scrutiny of tax-exempt hospitals. Providers face these challenges against a background of uncertainty created by health care reform.

  20. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  1. Adaptation of water resource systems to an uncertain future

    NASA Astrophysics Data System (ADS)

    Walsh, C. L.; Blenkinsop, S.; Fowler, H. J.; Burton, A.; Dawson, R. J.; Glenis, V.; Manning, L. J.; Kilsby, C. G.

    2015-09-01

    Globally, water resources management faces significant challenges from changing climate and growing populations. At local scales, the information provided by climate models is insufficient to support the water sector in making future adaptation decisions. Furthermore, projections of change in local water resources are fraught with uncertainties surrounding natural variability, future greenhouse gas emissions, model structure, population growth and water consumption habits. To analyse the magnitude of these uncertainties, and their implications for local-scale water resource planning, we present a top-down approach for testing climate change adaptation options using probabilistic climate scenarios and demand projections. An integrated modelling framework is developed which implements a new, gridded spatial weather generator, coupled with a rainfall-runoff model and a water resource management simulation model. We use this to provide projections of the number of days, and the associated uncertainty, that will require implementation of demand saving measures such as hose pipe bans and drought orders. Results, which are demonstrated for the Thames basin, UK, indicate that existing water supplies are sensitive to a changing climate and an increasing population, and that the frequency of severe demand saving measures is projected to increase. Considering both climate projections and population growth, the median number of drought order occurrences may increase five-fold. The effectiveness of a range of demand management and supply options has been tested and shown to provide significant benefits in terms of reducing the number of demand saving days. We found that increased supply arising from various adaptation options may compensate for increasingly variable flows; however, without reductions in overall demand for water resources, such options will be insufficient on their own to adapt to uncertainties in the projected changes in climate and population. For example, a 30 % reduction in overall demand by 2050 has a greater impact on reducing the frequency of drought orders than any of the individual or combined supply options; hence a portfolio of measures is required.

  2. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. 
© The Author(s) 2016.
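The final step described above, propagating an ensemble of well-calibrated baselines through the model via Monte Carlo simulation, can be sketched as follows. The outcome function, parameter names, and ranges are hypothetical stand-ins for illustration, not the published stroke model:

```python
import numpy as np

rng = np.random.default_rng(42)

def qaly_gain(params):
    """Hypothetical outcome model: QALY gain from an intervention
    as a simple function of two uncertain parameters."""
    incidence, effectiveness = params
    return 1000.0 * incidence * effectiveness

# Stand-in for the "well-calibrated baselines": an ensemble of parameter
# vectors that all fit the calibration targets acceptably.
baselines = np.column_stack([
    rng.uniform(0.010, 0.014, size=1000),   # stroke incidence rate (assumed)
    rng.uniform(0.20, 0.30, size=1000),     # intervention effectiveness (assumed)
])

# Monte Carlo uncertainty analysis: propagate calibration uncertainty
# through the outcome model and report a percentile interval.
outcomes = np.array([qaly_gain(p) for p in baselines])
lo, med, hi = np.percentile(outcomes, [2.5, 50, 97.5])
print(f"QALY gain: {med:.2f} (95% interval {lo:.2f}-{hi:.2f})")
```

Reporting the interval rather than the point estimate is what lets the authors judge whether a conclusion (e.g. which intervention yields the largest gain) is robust to the remaining calibration uncertainty.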

  3. Where is the carbon? Carbon sequestration potential from private forestland in the Southern United States

    Treesearch

    Christopher S. Galik; Brian C. Murray; D. Evan Mercer

    2013-01-01

    Uncertainty surrounding the future supply of timber in the southern United States prompted the question, “Where is all the wood?” (Cubbage et al. 1995). We ask a similar question about the potential of southern forests to mitigate greenhouse gas (GHG) emissions by sequestering carbon. Because significant carbon sequestration potential occurs on individual nonindustrial...

  4. The Effect of Aging and Attention on Visual Crowding and Surround Suppression of Perceived Contrast Threshold.

    PubMed

    Malavita, Menaka S; Vidyasagar, Trichur R; McKendrick, Allison M

    2017-02-01

    The purpose of this study was to examine how, in midperipheral vision, aging affects visual processes that interfere with target detection (crowding and surround suppression), and to determine whether performance on such tasks is related to visuospatial attention as measured by visual search. We investigated the effect of aging on crowding and suppression in detection of a target in peripheral vision, using different types of flanking stimuli. Both thresholds were also obtained while varying the position of the flanker (placed inside or outside of the target, relative to fixation). Crowding thresholds were also estimated with spatial uncertainty (jitter). Additionally, we included a visual search task comprising Gabor stimuli to investigate whether performance is related to top-down attention. Twenty young adults (age, 18-32 years; mean age, 26.1 years; 10 males) and 19 older adults (age, 60-74 years; mean age, 70.3 years; 10 males) participated in the study. Older adults showed more surround suppression than the young (F[1,37] = 4.21; P < 0.05), but crowding was unaffected by age. In the younger group, the position of the flanker influenced the strength of crowding, but not the strength of suppression (F[1,39] = 4.11; P < 0.05). Crowding was not affected by spatial jitter of the stimuli. Neither crowding nor surround suppression was predicted by attentional efficiency measured in the visual search task. There was also no significant correlation between crowding and surround suppression. We show that aging does not affect visual crowding but does increase surround suppression of contrast, suggesting that crowding and surround suppression may be distinct visual phenomena. Furthermore, the strengths of crowding and surround suppression did not correlate with each other, nor could they be predicted by efficiency of visual search.

  5. The Importance of Model Structure in the Cost-Effectiveness Analysis of Primary Care Interventions for the Management of Hypertension.

    PubMed

    Peñaloza-Ramos, Maria Cristina; Jowett, Sue; Sutton, Andrew John; McManus, Richard J; Barton, Pelham

    2018-03-01

    Management of hypertension can lead to significant reductions in blood pressure, thereby reducing the risk of cardiovascular disease. Modeling the course of cardiovascular disease is not without complications, and uncertainty surrounding the structure of a model will almost always arise once a model structure is chosen. To provide a practical illustration of the impact of changing or adapting model structures on the cost-effectiveness results of a previously published cost-utility analysis of a primary care intervention for the management of hypertension: Targets and Self-Management for the Control of Blood Pressure in Stroke and at Risk Groups (TASMIN-SR). The case study assessed the structural uncertainty arising from model structure and from the exclusion of secondary events. Four alternative model structures were implemented. Long-term cost-effectiveness was estimated and the results compared with those from the TASMIN-SR model. The main cost-effectiveness results obtained in the TASMIN-SR study did not change with the implementation of alternative model structures. Choice of model type was limited to a cohort Markov model, and because of the lack of epidemiological data, only model 4 captured structural uncertainty arising from the exclusion of secondary events in the case study model. The results of this study indicate that the main conclusions drawn from the TASMIN-SR model of cost-effectiveness were robust to changes in model structure and the inclusion of secondary events. Even though one of the models produced results that were different from those of TASMIN-SR, the fact that the main conclusions were identical suggests that a more parsimonious model may have sufficed. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  6. A Comprehensive Study of Gridding Methods for GPS Horizontal Velocity Fields

    NASA Astrophysics Data System (ADS)

    Wu, Yanqiang; Jiang, Zaisen; Liu, Xiaoxia; Wei, Wenxin; Zhu, Shuang; Zhang, Long; Zou, Zhenyu; Xiong, Xiaohui; Wang, Qixin; Du, Jiliang

    2017-03-01

    Four gridding methods for GPS velocities are compared in terms of their precision, applicability and robustness by analyzing simulated data with uncertainties from 0.0 to ±3.0 mm/a. When the input data are 1° × 1° grid sampled and the uncertainty of the additional error is greater than ±1.0 mm/a, the gridding results show that the least-squares collocation method is highly robust while the robustness of the Kriging method is low. In contrast, the spherical harmonics and the multi-surface function are moderately robust, and the regional singular values for the multi-surface function method and the edge effects for the spherical harmonics method become more significant with increasing uncertainty of the input data. When the input data (with additional errors of ±2.0 mm/a) are decimated by 50% from the 1° × 1° grid data and then erased in three 6° × 12° regions, the gridding results in these three regions indicate that the least-squares collocation and the spherical harmonics methods have good performances, while the multi-surface function and the Kriging methods may lead to singular values. The gridding techniques are also applied to GPS horizontal velocities with an average error of ±0.8 mm/a over the Chinese mainland and the surrounding areas, and the results show that the least-squares collocation method has the best performance, followed by the Kriging and multi-surface function methods. Furthermore, the edge effects of the spherical harmonics method are significantly affected by the sparseness and geometric distribution of the input data. In general, the least-squares collocation method is superior in terms of its robustness, edge effect, error distribution and stability, while the other methods have several positive features.
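The least-squares collocation approach that performs best in the comparison above can be sketched as follows. A Gaussian covariance model with illustrative parameters is assumed; a real application would fit the covariance function (and a trend model) to the data rather than use these fixed values:

```python
import numpy as np

rng = np.random.default_rng(5)

def lsc_grid(xy_obs, v_obs, xy_grid, c0=25.0, d0=2.0, noise=1.0):
    """Least-squares collocation: predict the signal at grid points as
    C_pg (C_gg + noise)^-1 v, with an assumed Gaussian covariance
    model C(d) = c0 * exp(-(d/d0)^2). c0, d0, noise are illustrative."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return c0 * np.exp(-(d / d0) ** 2)
    c_gg = cov(xy_obs, xy_obs) + noise**2 * np.eye(len(xy_obs))
    c_pg = cov(xy_grid, xy_obs)
    return c_pg @ np.linalg.solve(c_gg, v_obs)

def truth(p):
    """Hypothetical 'true' field: a smooth east-velocity trend (mm/a)."""
    return 3.0 + 0.5 * p[:, 0]

# Synthetic stand-in for scattered GPS velocities with observation noise.
xy_obs = rng.uniform(0, 10, size=(80, 2))
v_obs = truth(xy_obs) + rng.normal(0, 1.0, 80)

xy_grid = np.array([[x, 5.0] for x in np.linspace(1, 9, 9)])
v_grid = lsc_grid(xy_obs, v_obs, xy_grid)
print(np.round(v_grid, 2))
```

The smoothing behavior of the covariance matrix inversion is what gives collocation its robustness to noisy input velocities, at the cost of some attenuation far from data, which is one face of the edge effects the study discusses.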

  7. Present-day velocity field and block kinematics of Tibetan Plateau from GPS measurements

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Qiao, Xuejun; Yang, Shaomin; Wang, Dijin

    2017-02-01

    In this study, we present a new synthesis of GPS velocities for tectonic deformation within the Tibetan Plateau and its surrounding areas, a combined data set of ~1854 GPS-derived horizontal velocity vectors. Assuming that crustal deformation is localized along major faults, a block modelling approach is employed to interpret the GPS velocity field. We construct a 30-element block model to describe present-day deformation in western China, with half of them located within the Tibetan Plateau, and the remainder located in its surrounding areas. We model the GPS velocities simultaneously for the effects of block rotations and elastic strain induced by the bounding faults. Our model yields a good fit to the GPS data with a mean residual of 1.08 mm/a compared to the mean uncertainty of 1.36 mm/a for each velocity component, indicating a good agreement between the predicted and observed velocities. The major strike-slip faults such as the Altyn Tagh, Xianshuihe, Kunlun and Haiyuan faults have relatively uniform slip rates in a range of 5-12 mm/a along most of their segments, and the estimated fault slip rates agree well with previous geologic and geodetic results. Blocks having significant residuals are located at the southern and southeastern Tibetan Plateau, suggesting complex tectonic settings and the need for further refinement of the block geometry in these regions.

  8. Product Carbon Footprints and Their Uncertainties in Comparative Decision Contexts

    PubMed Central

    Dao, Hai M.; Phan, Lam T.; de Snoo, Geert R.

    2015-01-01

    In response to growing awareness of climate change, requests to establish product carbon footprints have been increasing. Product carbon footprints are life cycle assessments restricted to just one impact category, global warming. Product carbon footprint studies generate life cycle inventory results, listing the environmental emissions of greenhouse gases from a product’s lifecycle, and characterize these by their global warming potentials, producing product carbon footprints that are commonly communicated as point values. In the present research we show that the uncertainties surrounding these point values necessitate more sophisticated ways of communicating product carbon footprints, using different sizes of catfish (Pangasius spp.) farms in Vietnam as a case study. As most product carbon footprint studies only have a comparative meaning, we used dependent sampling to produce relative results in order to increase the power for identifying environmentally superior products. We therefore argue that product carbon footprints, supported by quantitative uncertainty estimates, should be used to test hypotheses, rather than to provide point value estimates or plain confidence intervals of products’ environmental performance. PMID:25781175

  9. Product carbon footprints and their uncertainties in comparative decision contexts.

    PubMed

    Henriksson, Patrik J G; Heijungs, Reinout; Dao, Hai M; Phan, Lam T; de Snoo, Geert R; Guinée, Jeroen B

    2015-01-01

    In response to growing awareness of climate change, requests to establish product carbon footprints have been increasing. Product carbon footprints are life cycle assessments restricted to just one impact category, global warming. Product carbon footprint studies generate life cycle inventory results, listing the environmental emissions of greenhouse gases from a product's lifecycle, and characterize these by their global warming potentials, producing product carbon footprints that are commonly communicated as point values. In the present research we show that the uncertainties surrounding these point values necessitate more sophisticated ways of communicating product carbon footprints, using different sizes of catfish (Pangasius spp.) farms in Vietnam as a case study. As most product carbon footprint studies only have a comparative meaning, we used dependent sampling to produce relative results in order to increase the power for identifying environmentally superior products. We therefore argue that product carbon footprints, supported by quantitative uncertainty estimates, should be used to test hypotheses, rather than to provide point value estimates or plain confidence intervals of products' environmental performance.
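The dependent sampling approach described above, reusing the same draws of shared background uncertainties when comparing two products, can be sketched as follows. The emission factors and feed conversion ratios are hypothetical values for illustration, not the study's catfish data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Shared (background) uncertainty: a feed-production emission factor
# common to both farm sizes. All parameter values here are hypothetical.
feed_ef = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)  # kg CO2e/kg feed

# Farm-specific uncertainty: feed conversion ratios (kg feed/kg fish).
fcr_small = rng.normal(1.8, 0.10, size=n)
fcr_large = rng.normal(1.7, 0.10, size=n)

# Dependent sampling: each draw reuses the SAME feed_ef for both systems,
# so the shared uncertainty cancels out of the comparison.
diff_dep = fcr_small * feed_ef - fcr_large * feed_ef

# Independent sampling (for contrast): a fresh draw of the shared factor.
feed_ef2 = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)
diff_ind = fcr_small * feed_ef - fcr_large * feed_ef2

print(f"P(small > large), dependent sampling: {np.mean(diff_dep > 0):.3f}")
print(f"spread of difference: dependent {diff_dep.std():.2f}, "
      f"independent {diff_ind.std():.2f}")
```

Because the difference distribution is much tighter under dependent sampling, a hypothesis test on which product is environmentally superior gains power, which is the paper's argument for testing hypotheses on relative results rather than reporting point values.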

  10. The use of sequential indicator simulation to characterize geostatistical uncertainty; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.
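A minimal sketch of the sequential simulation logic is given below. For brevity it replaces indicator kriging with inverse-distance weighting of the indicator data, so it illustrates the random-path conditioning idea rather than a variogram-based production SIS implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sis_1d(x_data, i_data, x_grid, n_real=200, power=2.0):
    """Simplified sequential indicator simulation on a 1-D transect.
    At each node (visited in random order) a local probability is
    estimated from conditioning data -- original samples plus nodes
    simulated earlier in the path -- then a value is drawn from it.
    Inverse-distance weights stand in for indicator-kriging weights."""
    realizations = np.zeros((n_real, len(x_grid)))
    for r in range(n_real):
        xs, vals = list(x_data), list(i_data)
        sim = np.empty(len(x_grid))
        for j in rng.permutation(len(x_grid)):       # random visiting path
            d = np.abs(np.array(xs) - x_grid[j])
            w = 1.0 / np.maximum(d, 1e-6) ** power
            p = np.clip(np.average(vals, weights=w), 0.0, 1.0)
            sim[j] = rng.random() < p                # draw from local ccdf
            xs.append(x_grid[j])                     # condition later nodes
            vals.append(sim[j])
        realizations[r] = sim
    return realizations

# Hypothetical data: indicator = 1 where a property exceeds a threshold.
x_data = np.array([0.0, 2.0, 5.0, 9.0])
i_data = np.array([1.0, 1.0, 0.0, 0.0])
x_grid = np.linspace(0.0, 9.0, 19)

reals = sis_1d(x_data, i_data, x_grid)
prob = reals.mean(axis=0)   # per-node exceedance probability across realizations
print(np.round(prob, 2))
```

The spread of `prob` across the transect is the kind of uncertainty bound whose quality the report evaluates; with real data, the validation studies it recommends would check such bounds against held-out samples.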

  11. HEAT - Habitat Evaluation and Assessment Tools for Effective Environmental Evaluations: User’s Guide

    DTIC Science & Technology

    2012-12-01

surrounding landscape (e.g., plants, animals, detritus, soil, the atmosphere, etc.) interact through a variety of physical, chemical, and...interactive geographic information system. Ecological Modelling 114:287–304. Ray, N., and M. A. Burgman. 2006. Subjective uncertainties in habitat...environmental impacts on ecological systems at numerous scales with varying degrees of success. Advances in technology have led many agencies to automate...

  12. Materiality matters: Blurred boundaries and the domestication of functional foods.

    PubMed

    Weiner, Kate; Will, Catherine

    2015-06-01

    Previous scholarship on novel foods, including functional foods, has suggested that they are difficult to categorise for both regulators and users. It is argued that they blur the boundary between 'food' and 'drug' and that uncertainties about the products create 'experimental' or 'restless' approaches to consumption. We investigate these uncertainties drawing on data about the use of functional foods containing phytosterols, which are licensed for sale in the EU for people wishing to reduce their cholesterol. We start from an interest in the products as material objects and their incorporation into everyday practices. We consider the scripts encoded in the physical form of the products through their regulation, production and packaging and find that these scripts shape but do not determine their use. The domestication of phytosterols involves bundling the products together with other objects (pills, supplements, foodstuffs). Considering their incorporation into different systems of objects offers new understandings of the products as foods or drugs. In their accounts of their practices, consumers appear to be relatively untroubled by uncertainties about the character of the products. We conclude that attending to materials and practices offers a productive way to open up and interrogate the idea of categorical uncertainties surrounding new food products.

  13. Incorporating climate change and morphological uncertainty into coastal change hazard assessments

    USGS Publications Warehouse

    Baron, Heather M.; Ruggiero, Peter; Wood, Nathan J.; Harris, Erica L.; Allan, Jonathan; Komar, Paul D.; Corcoran, Patrick

    2015-01-01

    Documented and forecasted trends in rising sea levels and changes in storminess patterns have the potential to increase the frequency, magnitude, and spatial extent of coastal change hazards. To develop realistic adaptation strategies, coastal planners need information about coastal change hazards that recognizes the dynamic temporal and spatial scales of beach morphology, the climate controls on coastal change hazards, and the uncertainties surrounding the drivers and impacts of climate change. We present a probabilistic approach for quantifying and mapping coastal change hazards that incorporates the uncertainty associated with both climate change and morphological variability. To demonstrate the approach, coastal change hazard zones of arbitrary confidence levels are developed for the Tillamook County (State of Oregon, USA) coastline using a suite of simple models and a range of possible climate futures related to wave climate, sea-level rise projections, and the frequency of major El Niño events. Extreme total water levels are more influenced by wave height variability, whereas the magnitude of erosion is more influenced by sea-level rise scenarios. Morphological variability has a stronger influence on the width of coastal hazard zones than the uncertainty associated with the range of climate change scenarios.

  14. Selective and graded coding of reward-uncertainty by neurons in the primate anterodorsal septal region

    PubMed Central

    Monosov, Ilya E.; Hikosaka, Okihide

    2014-01-01

    Natural environments are uncertain. Uncertainty of emotional outcomes can induce anxiety and raise vigilance, promote and signal the opportunity for learning, modulate economic choice, and regulate risk seeking. Here we demonstrate that a subset of neurons in the anterodorsal region of the primate septum (ADS) are primarily devoted to processing uncertainty in a highly specific manner. Those neurons were selectively activated by visual cues indicating probabilistic delivery of reward (e.g. 25%, 50%, 75% reward) and did not respond to cues indicating certain outcomes (0% and 100% reward). The average ADS uncertainty response was graded with the magnitude of reward uncertainty, and selectively signaled uncertainty about rewards rather than punishments. The selective and graded information about reward uncertainty encoded by many neurons in the ADS may underlie uncertainty-modulation of value- and sensorimotor-related areas to regulate goal-directed behavior. PMID:23666181

  15. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    PubMed

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    PubMed

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and different water demands, brings new challenges to water transfer projects. Uncertainties exist for both transferred water and local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the shortage degree under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation and a chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferred water and local surface water and to sample from the multivariate probability distribution, which are used as inputs for the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk assessment. The possible available water and shortages can be calculated with the UWSRAM, along with the corresponding allocation measures under different water availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water supply and the degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers and achieving sustainable development.
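The copula-based Monte Carlo step can be sketched as follows. A Gaussian copula with lognormal marginals is assumed purely for illustration (lognormal marginals make the inverse-CDF mapping explicit); the study's actual copula family, marginals, and parameters are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
rho = 0.6  # assumed Gaussian-copula correlation between the two supplies

# Gaussian copula: draw correlated standard normals, then map each margin
# through the inverse CDF of its marginal distribution. For a lognormal
# marginal this is simply X = exp(mu + sigma * Z).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
transfer = np.exp(np.log(100.0) + 0.25 * z[:, 0])   # transferred water (assumed units)
local = np.exp(np.log(80.0) + 0.40 * z[:, 1])       # local surface water

# Joint samples feed a downstream risk calculation, e.g. shortage
# probability against a fixed (assumed) demand level.
supply = transfer + local
demand = 220.0
shortage_risk = np.mean(supply < demand)
print(f"P(shortage) = {shortage_risk:.3f}")
```

Sampling the two sources jointly, rather than independently, matters because positively correlated dry years make simultaneous shortfalls far more likely than independent draws would suggest.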

  17. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
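The block bootstrap of residuals can be sketched as follows, using a synthetic AR(1) series in place of the hydrograph residuals; the block length and the statistic (mean residual) are illustrative choices, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(3)

def block_bootstrap(residuals, block_len, n_boot=1000):
    """Moving-block bootstrap: resample contiguous blocks of residuals
    to preserve short-range autocorrelation, then report a percentile
    confidence interval for a statistic (here the mean residual)."""
    n = len(residuals)
    n_blocks = int(np.ceil(n / block_len))
    stats = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        sample = np.concatenate(
            [residuals[s:s + block_len] for s in starts])[:n]
        stats[b] = sample.mean()
    return np.percentile(stats, [2.5, 97.5])

# Synthetic autocorrelated residual series (AR(1)), standing in for
# daily hydrograph residuals.
n = 365
eps = rng.normal(0, 1.0, n)
res = np.empty(n)
res[0] = eps[0]
for t in range(1, n):
    res[t] = 0.7 * res[t - 1] + eps[t]

ci = block_bootstrap(res, block_len=20)
print(f"95% CI for mean residual: [{ci[0]:.3f}, {ci[1]:.3f}]")
```

Resampling individual residuals instead of blocks would ignore their autocorrelation and understate the interval width, which is why a block scheme is preferred for streamflow residuals.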

  18. To evaluate the effectiveness of health care ethics consultation based on the goals of health care ethics consultation: a prospective cohort study with randomization.

    PubMed

    Chen, Yen-Yuan; Chu, Tzong-Shinn; Kao, Yu-Hui; Tsai, Pi-Ru; Huang, Tien-Shang; Ko, Wen-Je

    2014-01-03

The growing prevalence of health care ethics consultation (HCEC) services in the U.S. has been accompanied by an increase in calls for accountability and quality assurance, and by debate surrounding why and how HCEC should be evaluated. The objective of this study was to evaluate the effectiveness of HCEC as indicated by several novel outcome measurements in East Asian medical encounters. Patients with medical uncertainty or conflict regarding value-laden issues, for whom the attending physicians or nurses requested an HCEC between December 1, 2009 and April 30, 2012, were randomly assigned to the usual care group (UC group) or the intervention group (HCEC group). The patients in the HCEC group received an HCEC conducted by an individual ethics consultant. Data analysis was based on the intention-to-treat principle. The Mann-Whitney test and the Chi-squared test were used, depending on the scale of measurement. Thirty-three patients (53.23%) were randomly assigned to the HCEC group and 29 patients to the UC group. Among the 33 patients in the HCEC group, two (6.06%) ultimately did not receive an HCEC service. Among the 29 patients in the UC group, four (13.79%) received an HCEC service. The survival rate at hospital discharge did not differ between the two groups. Patients in the HCEC group showed significant reductions in total ICU stay and total hospital stay, and HCEC significantly facilitated achieving the goal of medical care (p < .01). Furthermore, patients in the HCEC group had shorter ICU and hospital stays after the occurrence of medical uncertainty or conflict regarding value-laden issues than those in the UC group. Our findings demonstrated that HCEC was associated with reduced consumption of medical resources, as indicated by shorter total ICU and hospital stays and shorter stays after the occurrence of the medical uncertainty or conflict regarding value-laden issues. This study also showed that HCEC facilitated achieving a consensus regarding the goal of medical care, which conforms to the goal of HCEC.
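
    The two test statistics named in this record can be computed in a few lines. The data below are hypothetical illustrations, not the study's:

```python
import numpy as np

def mann_whitney_u(x, y):
    # U statistic: number of (x_i, y_j) pairs with x_i > y_j, ties counting 1/2.
    x, y = np.asarray(x, float), np.asarray(y, float)
    return (x[:, None] > y[None, :]).sum() + 0.5 * (x[:, None] == y[None, :]).sum()

def chi_squared(table):
    # Pearson chi-squared statistic for an r x c contingency table.
    o = np.asarray(table, float)
    e = o.sum(1, keepdims=True) * o.sum(0, keepdims=True) / o.sum()
    return ((o - e) ** 2 / e).sum()

# Hypothetical outcomes: length of stay in days (ordinal -> Mann-Whitney)
# and goal-of-care achieved yes/no by arm (nominal -> chi-squared).
u = mann_whitney_u([3, 5, 7, 9], [8, 10, 12, 14])
chi2 = chi_squared([[25, 8], [12, 17]])
```

    The choice between the two follows the scale of measurement, exactly as the abstract states: rank-based comparison for skewed continuous outcomes, a contingency-table test for categorical ones.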

  19. Contributions of natural and anthropogenic radiative forcing to mass loss of Northern Hemisphere mountain glaciers and quantifying their uncertainties.

    PubMed

    Hirabayashi, Yukiko; Nakano, Kazunari; Zhang, Yong; Watanabe, Satoshi; Tanoue, Masahiro; Kanae, Shinjiro

    2016-07-20

Observational evidence indicates that a number of glaciers have lost mass in the past. Given that glaciers are highly impacted by the surrounding climate, human-influenced global warming may be partly responsible for mass loss. However, previous research studies have been limited to analyzing the past several decades, and it remains unclear whether past glacier mass losses are within the range of natural internal climate variability. Here, we apply an optimal fingerprinting technique to observed and reconstructed mass losses as well as multi-model general circulation model (GCM) simulations of mountain glacier mass to detect and attribute past glacier mass changes. An 8,800-year control simulation of glaciers enabled us to evaluate detectability. The results indicate that human-induced increases in greenhouse gases have contributed to the decreased area-weighted average masses of 85 analyzed glaciers. The effect was larger than the mass increase caused by natural forcing, although the contributions of natural and anthropogenic forcing to decreases in mass varied at the local scale. We also showed that anthropogenic or natural influences could not always be fully detected and attributed when natural internal climate variability was taken into account.

  20. Contributions of natural and anthropogenic radiative forcing to mass loss of Northern Hemisphere mountain glaciers and quantifying their uncertainties

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Yukiko; Nakano, Kazunari; Zhang, Yong; Watanabe, Satoshi; Tanoue, Masahiro; Kanae, Shinjiro

    2016-07-01

Observational evidence indicates that a number of glaciers have lost mass in the past. Given that glaciers are highly impacted by the surrounding climate, human-influenced global warming may be partly responsible for mass loss. However, previous research studies have been limited to analyzing the past several decades, and it remains unclear whether past glacier mass losses are within the range of natural internal climate variability. Here, we apply an optimal fingerprinting technique to observed and reconstructed mass losses as well as multi-model general circulation model (GCM) simulations of mountain glacier mass to detect and attribute past glacier mass changes. An 8,800-year control simulation of glaciers enabled us to evaluate detectability. The results indicate that human-induced increases in greenhouse gases have contributed to the decreased area-weighted average masses of 85 analyzed glaciers. The effect was larger than the mass increase caused by natural forcing, although the contributions of natural and anthropogenic forcing to decreases in mass varied at the local scale. We also showed that anthropogenic or natural influences could not always be fully detected and attributed when natural internal climate variability was taken into account.

  1. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  2. The Sapir-Whorf hypothesis and inference under uncertainty.

    PubMed

    Regier, Terry; Xu, Yang

    2017-11-01

The Sapir-Whorf hypothesis holds that human thought is shaped by language, leading speakers of different languages to think differently. This hypothesis has sparked both enthusiasm and controversy, but despite its prominence it has only occasionally been addressed in computational terms. Recent developments support a view of the Sapir-Whorf hypothesis in terms of probabilistic inference. This view may resolve some of the controversy surrounding the Sapir-Whorf hypothesis, and may help to normalize the hypothesis by linking it to established principles that also explain other phenomena. On this view, effects of language on nonlinguistic cognition or perception reflect standard principles of inference under uncertainty. WIREs Cogn Sci 2017, 8:e1440. doi: 10.1002/wcs.1440 © 2017 Wiley Periodicals, Inc.

  3. Thermodynamic limitations on the temperature sensitivity of cell-membrane ion channels: Trouble with enthalpy uncertainty

    NASA Astrophysics Data System (ADS)

    Zheltikov, A. M.

    2018-06-01

    Energy exchange between a thermodynamic ensemble of heat- and cold-activated cell-membrane ion channels and the surrounding heat reservoir is shown to impose fundamental limitations on the performance of such channels as temperature-controlled gates for thermal cell activation. Analysis of unavoidable thermodynamic internal-energy fluctuations caused by energy exchange between the ion channels and the heat bath suggests that the resulting enthalpy uncertainty is too high for a robust ion-current gating by a single ion channel, implying that large ensembles of ion channels are needed for thermal cell activation. We argue, based on this thermodynamic analysis, that, had thermosensitive cell-membrane ion channels operated individually, rather than as large ensembles, robust thermal cell activation would have been impossible because of thermodynamic fluctuations.
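
    The scale of the internal-energy fluctuations invoked here follows from the standard canonical-ensemble relation (not quoted in the abstract; stated here from equilibrium statistical thermodynamics), where \(C_V\) is the heat capacity of the channel ensemble and \(N\) the number of channels:

\[
\langle (\Delta E)^2 \rangle = k_B T^2 C_V ,
\qquad
\frac{\sqrt{\langle (\Delta E)^2 \rangle}}{\langle E \rangle} \propto \frac{1}{\sqrt{N}} ,
\]

    so the relative uncertainty shrinks only as \(1/\sqrt{N}\), consistent with the abstract's conclusion that robust thermal gating requires large ensembles rather than single channels.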

  4. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  5. Skin friction measurements in high temperature high speed flows

    NASA Technical Reports Server (NTRS)

    Schetz, J. A.; Diller, Thomas E.; Wicks, A. L.

    1992-01-01

An experimental investigation was conducted to measure skin friction along the chamber walls of supersonic combustors. A direct force measurement device was used to simultaneously measure an axial and transverse component of the small tangential shear force passing over a non-intrusive floating element. The floating head is mounted to a stiff cantilever beam arrangement with deflection due to the flow on the order of 0.00254 mm (0.0001 in.). This allowed the instrument to be a non-nulling type. A second gauge was designed with active cooling of the floating sensor head to eliminate non-uniform temperature effects between the sensor head and the surrounding wall. Samples of measurements made in combustor test facilities at NASA Langley Research Center and at the General Applied Science Laboratory (GASL) are presented. Skin friction coefficients between 0.001 and 0.005 were measured, depending on the facility and measurement location. Analysis of the measurement uncertainties indicates an accuracy to within +/- 10-15 percent of the streamwise component.

  6. Mass-Produced, Assembly-Line Abortion—A Prime Example of Unethical, Unscientific Medicine

    PubMed Central

    Ford, James H.

    1972-01-01

    The incidence of psychologic sequelae associated with abortion cannot be established scientifically, and so continues to be disputed. Since there are no truly scientific criteria on which to make a prediction as to the psychologic outcome, it seems only proper that elective abortion be labeled “experimental,” rather than “therapeutic.” This uncertainty as to therapeutic benefit is compounded by the fact that adequate studies and information about physical sequelae are also lacking. Furthermore, preliminary statistics from the Population Council indicate that the morbidity rate of abortion performed even under proper medical auspices is unacceptably high. Viewed in this light and in relation to our own ethical code, the current practice of performing innumerable, mechanized, elective abortions can only be considered unethical. If it is argued that abortion can be ethically validated merely by surrounding it with the same controls used in other experimental procedures, then the medical profession should insist on such controls forthwith. PMID:4638411

  7. Summary of panel session 3 -- Environmental issues affecting CCT deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hausker, K.

    1997-12-31

    The panelists discussed a variety of environmental issues that affect CCT deployment, and more broadly speaking, power development in general. The issues were both international and domestic in nature. The author summarizes the issues discussed. A summary is also presented which highlights ideas from the panelists that could be characterized as solutions to the demand for improved environmental performance and the surrounding uncertainties. The author offers some personal comments and observations.

  8. Evaluation of Plume Divergence and Facility Effects on Far-Field Faraday Probe Current Density Profiles

    DTIC Science & Technology

    2009-09-01

elevated background pressure, compared nude Faraday probe designs, and evaluated design modifications to minimize uncertainty due to charge exchange … evaluated Faraday probe design and facility background pressure on collected ion current. A comparison of two nude Faraday probe designs concluded … Plasma potential in the region surrounding a nude Faraday probe has been measured to study the possibility of probe bias voltage acting as a

  9. Barriers to regaining control within a constructivist grounded theory of family resilience in ICU: Living with uncertainty.

    PubMed

    Wong, Pauline; Liamputtong, Pranee; Koch, Susan; Rawson, Helen

    2017-12-01

To discuss families' experiences of their interactions when a relative is admitted unexpectedly to an Australian intensive care unit. The overwhelming emotions associated with the unexpected admission of a relative to an intensive care unit are often due to the uncertainty surrounding the condition of their critically ill relative. There is limited in-depth understanding of the nature of uncertainty experienced by families in intensive care, and interventions perceived by families to minimise their uncertainty are not well documented. Furthermore, the interrelationships between factors, such as staff-family interactions and the intensive care unit environment, and their influence on families' uncertainty, particularly in the context of the Australian healthcare system, are not well delineated. A grounded theory methodology was adopted for the study. Data were collected between 2009 and 2013, using in-depth interviews with 25 family members of 21 critically ill patients admitted to a metropolitan, tertiary-level intensive care unit in Australia. This paper describes families' experiences of heightened emotional vulnerability and uncertainty when a relative is admitted unexpectedly to the intensive care unit. Families' uncertainty is directly influenced by their emotional state, the foreign environment and perceptions of being 'kept in the dark', as well as the interrelationships between these factors. Staff are offered an improved understanding of the barriers to families' ability to regain control, guided by a grounded theory of family resilience in the intensive care unit. The findings reveal an in-depth understanding of families' uncertainty in intensive care. It suggests that intensive care unit staff need to focus clinical interventions on reducing factors that heighten their uncertainty, while optimising strategies that help alleviate it. Families are facilitated to move beyond feelings of helplessness and loss of control, and cope better with their situation. 
© 2017 John Wiley & Sons Ltd.

  10. Development of a method for fabricating polypropylene non-articulated dorsiflexion assist ankle foot orthoses with predetermined stiffness.

    PubMed

    Ramsey, Jason Allan

    2011-03-01

A non-articulated plantarflexion resist ankle foot orthosis (AFO), commonly known as a posterior leaf spring AFO, is indicated for patients with motor impairment to the dorsiflexors. The AFO is often custom molded to a patient's lower limb anatomy and fabricated from polypropylene. There are no established guidelines for fabricating this type of AFO with predetermined stiffness of the ankle region for normal walking speeds. Therefore an AFO may not meet the biomechanical needs of the patient. Quantify the biomechanical ankle stiffness requirement for an individual with complete dorsiflexor impairment and develop a method for fabricating an AFO with ankle stiffness to meet that requirement. Experimental, bench research. The literature on sagittal biomechanics of non-pathological adults was reviewed to derive the stiffness of the ankle during loading response. Computer models of 144 AFOs were created with geometric variations to account for differences in human anthropometrics. Computer-based finite element analysis was employed to determine the stiffness and safety factor of the models. Stiffness of the AFOs ranged from 0.04 to 1.8 Nm/deg. This ample range is expected to account for the stiffness required for most adults with complete dorsiflexor impairment. At 5° deflection the factor of safety (ratio of strength to stress) ranged from 2.8 to 9.1. A computer program was generated that computes AFO stiffness from user-input variables of AFO geometry. The stiffness is compared to a theoretically appropriate stiffness based on the patient mass. The geometric variables can be modified until there is a close match, resulting in an AFO design specification that is appropriate for the patient. Through validation on human subjects, this method may benefit patient outcomes in clinical practice by avoiding the current uncertainty surrounding AFO performance and reducing the labor and time involved in rectifying a custom AFO post-fabrication. 
This method provides an avenue for improving patient outcomes by avoiding the current uncertainty surrounding non-articulated plantarflexion resist ankle foot orthosis performance. The ability to quantify the biomechanical ankle stiffness requirement for an individual with complete dorsiflexor impairment provides insight into how other AFO types should be designed as well.
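
    The matching logic this record describes (compute a candidate AFO's ankle stiffness from its geometry, then compare it with a mass-based target) can be sketched as below. The simple cantilever-beam model, the polypropylene modulus, and the mass-to-stiffness coefficient are illustrative assumptions, not the paper's finite-element results:

```python
import math

E_PP = 1.3e9  # Young's modulus of polypropylene, Pa (typical handbook value)

def afo_stiffness_nm_per_deg(width_m, thickness_m, length_m):
    # Model the posterior strut as a cantilever loaded by an end moment:
    # theta = M L / (E I), so rotational stiffness k = E I / L (Nm/rad).
    I = width_m * thickness_m ** 3 / 12.0  # second moment of area
    k_per_rad = E_PP * I / length_m
    return k_per_rad * math.pi / 180.0  # convert to Nm/deg

def target_stiffness_nm_per_deg(mass_kg, coeff=0.007):
    # Hypothetical linear scaling of required stiffness with patient mass;
    # the coefficient is a placeholder, not a value from the study.
    return coeff * mass_kg

k = afo_stiffness_nm_per_deg(0.07, 0.005, 0.28)  # 70 mm wide, 5 mm thick, 280 mm strut
target = target_stiffness_nm_per_deg(80.0)
```

    With these illustrative dimensions the strut stiffness lands near the soft end of the 0.04-1.8 Nm/deg range reported in the abstract; in the paper's workflow the geometry would be iterated until computed and target stiffness agree.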

  11. Experiences of liver health related uncertainty and self-reported stress among people who inject drugs living with hepatitis C virus: a qualitative study.

    PubMed

    Goutzamanis, Stelliana; Doyle, Joseph S; Thompson, Alexander; Dietze, Paul; Hellard, Margaret; Higgs, Peter

    2018-04-02

    People who inject drugs (PWID) are most at risk of hepatitis C virus infection in Australia. The introduction of transient elastography (TE) (measuring hepatitis fibrosis) and direct acting antiviral medications will likely alter the experience of living with hepatitis C. We aimed to explore positive and negative influences on wellbeing and stress among PWID with hepatitis C. The Treatment and Prevention (TAP) study examines the feasibility of treating hepatitis C mono-infected PWID in community settings. Semi-structured interviews were conducted with 16 purposively recruited TAP participants. Participants were aware of their hepatitis C seropositive status and had received fibrosis assessment (measured by TE) prior to interview. Questions were open-ended, focusing on the impact of health status on wellbeing and self-reported stress. Interviews were voice recorded, transcribed verbatim and thematically analysed, guided by Mishel's (1988) theory of Uncertainty in Illness. In line with Mishel's theory of Uncertainty in Illness all participants reported hepatitis C-related uncertainty, particularly mis-information or a lack of knowledge surrounding liver health and the meaning of TE results. Those with greater fibrosis experienced an extra layer of prognostic uncertainty. Experiences of uncertainty were a key motivation to seek treatment, which was seen as a way to regain some stability in life. Treatment completion alleviated hepatitis C-related stress, and promoted feelings of empowerment and confidence in addressing other life challenges. TE scores seemingly provide some certainty. However, when paired with limited knowledge, particularly among people with severe fibrosis, TE may be a source of uncertainty and increased personal stress. This suggests the need for simple education programs and resources on liver health to minimise stress.

  12. Public Perceptions of Regulatory Costs, Their Uncertainty and Interindividual Distribution.

    PubMed

    Johnson, Branden B; Finkel, Adam M

    2016-06-01

    Public perceptions of both risks and regulatory costs shape rational regulatory choices. Despite decades of risk perception studies, this article is the first on regulatory cost perceptions. A survey of 744 U.S. residents probed: (1) How knowledgeable are laypeople about regulatory costs incurred to reduce risks? (2) Do laypeople see official estimates of cost and benefit (lives saved) as accurate? (3) (How) do preferences for hypothetical regulations change when mean-preserving spreads of uncertainty replace certain cost or benefit? and (4) (How) do preferences change when unequal interindividual distributions of hypothetical regulatory costs replace equal distributions? Respondents overestimated costs of regulatory compliance, while assuming agencies underestimate costs. Most assumed agency estimates of benefits are accurate; a third believed both cost and benefit estimates are accurate. Cost and benefit estimates presented without uncertainty were slightly preferred to those surrounded by "narrow uncertainty" (a range of costs or lives entirely within a personally-calibrated zone without clear acceptance or rejection of tradeoffs). Certain estimates were more preferred than "wide uncertainty" (a range of agency estimates extending beyond these personal bounds, thus posing a gamble between favored and unacceptable tradeoffs), particularly for costs as opposed to benefits (but even for costs a quarter of respondents preferred wide uncertainty to certainty). Agency-acknowledged uncertainty in general elicited mixed judgments of honesty and trustworthiness. People preferred egalitarian distributions of regulatory costs, despite skewed actual cost distributions, and preferred progressive cost distributions (the rich pay a greater than proportional share) to regressive ones. Efficient and socially responsive regulations require disclosure of much more information about regulatory costs and risks. © 2016 Society for Risk Analysis.
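
    The "mean-preserving spread" manipulation in question (3) is easy to make concrete; the dollar figures below are invented for illustration:

```python
import numpy as np

# Replace a certain regulatory cost with two-point gambles that keep the
# expected cost fixed while widening the range ("narrow" vs. "wide" uncertainty).
certain = np.array([40.0])          # $40M for sure
narrow = np.array([35.0, 45.0])     # each outcome with probability 1/2
wide = np.array([10.0, 70.0])

means = [d.mean() for d in (certain, narrow, wide)]
spreads = [d.std() for d in (certain, narrow, wide)]
```

    Because all three options have the same expected cost, any shift in respondents' preferences between them isolates the effect of uncertainty itself, which is what the survey design exploits.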

  13. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-03

    For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. 
Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C, across a wide range of scenarios.

  14. Extraction of Black Hole Shadows Using Ridge Filtering and the Circle Hough Transform

    NASA Astrophysics Data System (ADS)

    Hennessey, Ryan; Akiyama, Kazunori; Fish, Vincent

    2018-01-01

Supermassive black holes are widely considered to reside at the center of most large galaxies. One of the foremost tasks in modern astronomy is to image the centers of local galaxies, such as that of Messier 87 (M87) and Sagittarius A* at the center of our own Milky Way, to gain the first glimpses of black holes and their surrounding structures. Using data obtained from the Event Horizon Telescope (EHT), a global collection of millimeter-wavelength telescopes designed to perform very long baseline interferometry, new imaging techniques will likely be able to yield images of these structures at fine enough resolutions to compare with the predictions of general relativity and give us more insight into the formation of black holes, their surrounding jets and accretion disks, and galaxies themselves. Techniques to extract features from these images are already being developed. In this work, we present a new method for measuring the size of the black hole shadow, a feature that encodes information about the black hole mass and spin, using ridge filtering and the circle Hough transform. Previous methods have succeeded in extracting the black hole shadow with an accuracy of about 10-20%, but using this new technique we are able to measure the shadow size with even finer accuracy. Our work indicates that the EHT will be able to significantly reduce the uncertainty in the estimate of the mass of the supermassive black hole in M87.
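
    A minimal version of the circle Hough transform step, applied to a synthetic ring with made-up sizes (this is a sketch of the general technique, not the authors' pipeline, which also includes ridge filtering):

```python
import numpy as np

def circle_hough(edge_pts, shape, radii):
    # Every edge point (x, y) votes for all candidate centres (a, b) lying at
    # distance r from it; the accumulator peak gives the best radius and centre.
    H, W = shape
    radii = list(radii)
    acc = np.zeros((len(radii), H, W))
    thetas = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    for (y, x) in edge_pts:
        for i, r in enumerate(radii):
            a = np.round(x - r * np.cos(thetas)).astype(int)
            b = np.round(y - r * np.sin(thetas)).astype(int)
            ok = (a >= 0) & (a < W) & (b >= 0) & (b < H)
            np.add.at(acc[i], (b[ok], a[ok]), 1)
    i, cb, ca = np.unravel_index(acc.argmax(), acc.shape)
    return radii[i], (ca, cb)

# Synthetic "shadow" ring: radius 12 centred at (32, 32) on a 64 x 64 grid.
ang = np.linspace(0, 2 * np.pi, 80, endpoint=False)
pts = [(int(round(32 + 12 * np.sin(t))), int(round(32 + 12 * np.cos(t))))
       for t in ang]
r, (cx, cy) = circle_hough(pts, (64, 64), radii=range(8, 17))
```

    The recovered radius is the shadow-size estimate; in the real application the edge points would come from a ridge-filtered EHT image rather than a synthetic ring.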

  15. Measures of GCM Performance as Functions of Model Parameters Affecting Clouds and Radiation

    NASA Astrophysics Data System (ADS)

    Jackson, C.; Mu, Q.; Sen, M.; Stoffa, P.

    2002-05-01

This abstract is one of three related presentations at this meeting dealing with several issues surrounding optimal parameter and uncertainty estimation of model predictions of climate. Uncertainty in model predictions of climate depends in part on the uncertainty produced by model approximations or parameterizations of unresolved physics. Evaluating these uncertainties is computationally expensive because one needs to evaluate how arbitrary choices for any given combination of model parameters affect model performance. Because the computational effort grows exponentially with the number of parameters being investigated, it is important to choose parameters carefully. Evaluating whether a parameter is worth investigating depends on two considerations: 1) do reasonable choices of parameter values produce a large range in model response relative to observational uncertainty? and 2) does the model response depend non-linearly on various combinations of model parameters? We have decided to narrow our attention to selecting parameters that affect clouds and radiation, as it is likely that these parameters will dominate uncertainties in model predictions of future climate. We present preliminary results of ~20 to 30 AMIPII style climate model integrations using NCAR's CCM3.10 that show model performance as functions of individual parameters controlling 1) critical relative humidity for cloud formation (RHMIN), and 2) boundary layer critical Richardson number (RICR). We also explore various definitions of model performance that include some or all observational data sources (surface air temperature and pressure, meridional and zonal winds, clouds, long and short-wave cloud forcings, etc.) and evaluate in a few select cases whether the model's response depends non-linearly on the parameter values we have selected.

  16. Adaptation of water resource systems to an uncertain future

    NASA Astrophysics Data System (ADS)

    Walsh, Claire L.; Blenkinsop, Stephen; Fowler, Hayley J.; Burton, Aidan; Dawson, Richard J.; Glenis, Vassilis; Manning, Lucy J.; Jahanshahi, Golnaz; Kilsby, Chris G.

    2016-05-01

Globally, water resources management faces significant challenges from changing climate and growing populations. At local scales, the information provided by climate models is insufficient to support the water sector in making future adaptation decisions. Furthermore, projections of change in local water resources are fraught with uncertainties surrounding natural variability, future greenhouse gas emissions, model structure, population growth, and water consumption habits. To analyse the magnitude of these uncertainties, and their implications for local-scale water resource planning, we present a top-down approach for testing climate change adaptation options using probabilistic climate scenarios and demand projections. An integrated modelling framework is developed which implements a new, gridded spatial weather generator, coupled with a rainfall-runoff model and water resource management simulation model. We use this to provide projections of the number of days, and the associated uncertainty, that will require implementation of demand saving measures such as hose pipe bans and drought orders. Results, which are demonstrated for the Thames Basin, UK, indicate existing water supplies are sensitive to a changing climate and an increasing population, and that the frequency of severe demand saving measures is projected to increase. Considering both climate projections and population growth, the median number of drought order occurrences may increase 5-fold by the 2050s. The effectiveness of a range of demand management and supply options has been tested and shown to provide significant benefits in terms of reducing the number of demand saving days. A decrease in per capita demand of 3.75 % reduces the median frequency of drought order measures by 50 % by the 2020s. 
We found that increased supply arising from various adaptation options may compensate for increasingly variable flows; however, without reductions in overall demand for water resources such options will be insufficient on their own to adapt to uncertainties in the projected changes in climate and population. For example, a 30 % reduction in overall demand by 2050 has a greater impact on reducing the frequency of drought orders than any of the individual or combinations of supply options; hence, a portfolio of measures is required.

  17. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes, and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in their level of complexity and assessment scale, they may differ in how uncertainty propagates through them. We conclude that comparable uncertainty indications in soil function maps are relevant to enable informed and transparent decisions on the sustainable use of soil resources.
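A minimal sketch of how DSM prediction uncertainty can be propagated through an ordinal assessment scale to give a cumulative distribution of SFF scores for one grid cell. The scoring rule, the clay-content classes, and the prediction error are all hypothetical assumptions, not the study's SFA methods:

```python
import numpy as np

rng = np.random.default_rng(0)

def sfa_score(clay_percent):
    """Toy ordinal soil-function assessment: score 1-3 from clay content.

    The class boundaries are invented for illustration; real SFA methods
    combine several mapped soil properties.
    """
    return np.where(clay_percent < 15, 1, np.where(clay_percent < 30, 2, 3))

# Hypothetical DSM prediction for one 20 x 20 m cell: mean 28% clay with
# a prediction standard error of 6% (both numbers assumed).
clay_draws = rng.normal(loc=28.0, scale=6.0, size=10_000)
scores = sfa_score(clay_draws)

# Cumulative distribution of SFF scores for this cell: its spread shows
# how DSM uncertainty propagates through the ordinal assessment.
cdf = {s: float((scores <= s).mean()) for s in (1, 2, 3)}
```

Repeating this per cell yields exactly the kind of per-indicator uncertainty map the abstract describes.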

  18. To what extent can ecosystem services motivate protecting biodiversity?

    PubMed

    Dee, Laura E; De Lara, Michel; Costello, Christopher; Gaines, Steven D

    2017-08-01

    Society increasingly focuses on managing nature for the services it provides people rather than for the existence of particular species. How much biodiversity protection would result from this modified focus? Although biodiversity contributes to ecosystem services, the details of which species are critical, and whether they will go functionally extinct in the future, are fraught with uncertainty. Explicitly considering this uncertainty, we develop an analytical framework to determine how much biodiversity protection would arise solely from optimising net value from an ecosystem service. Using stochastic dynamic programming, we find that protecting a threshold number of species is optimal, and uncertainty surrounding how biodiversity produces services makes it optimal to protect more species than are presumed critical. We define conditions under which the economically optimal protection strategy is to protect all species, no species, and cases in between. We show how the optimal number of species to protect depends upon different relationships between species and services, including considering multiple services. Our analysis provides simple criteria to evaluate when managing for particular ecosystem services could warrant protecting all species, given uncertainty. Evaluating this criterion with empirical estimates from different ecosystems suggests that optimising some services will be more likely to protect most species than others. © 2017 John Wiley & Sons Ltd/CNRS.

  19. Materiality matters: Blurred boundaries and the domestication of functional foods

    PubMed Central

    Weiner, Kate; Will, Catherine

    2015-01-01

    Previous scholarship on novel foods, including functional foods, has suggested that they are difficult to categorise for both regulators and users. It is argued that they blur the boundary between ‘food’ and ‘drug’ and that uncertainties about the products create ‘experimental’ or ‘restless’ approaches to consumption. We investigate these uncertainties drawing on data about the use of functional foods containing phytosterols, which are licensed for sale in the EU for people wishing to reduce their cholesterol. We start from an interest in the products as material objects and their incorporation into everyday practices. We consider the scripts encoded in the physical form of the products through their regulation, production and packaging and find that these scripts shape but do not determine their use. The domestication of phytosterols involves bundling the products together with other objects (pills, supplements, foodstuffs). Considering their incorporation into different systems of objects offers new understandings of the products as foods or drugs. In their accounts of their practices, consumers appear to be relatively untroubled by uncertainties about the character of the products. We conclude that attending to materials and practices offers a productive way to open up and interrogate the idea of categorical uncertainties surrounding new food products. PMID:26157471

  20. Use of Inverse-Modeling Methods to Improve Ground-Water-Model Calibration and Evaluate Model-Prediction Uncertainty, Camp Edwards, Cape Cod, Massachusetts

    USGS Publications Warehouse

    Walter, Donald A.; LeBlanc, Denis R.

    2008-01-01

    Historical weapons testing and disposal activities at Camp Edwards, which is located on the Massachusetts Military Reservation, western Cape Cod, have resulted in the release of contaminants into an underlying sand and gravel aquifer that is the sole source of potable water to surrounding communities. Ground-water models have been used at the site to simulate advective transport in the aquifer in support of field investigations. Reasonable models developed by different groups and calibrated by trial and error often yield different predictions of advective transport, and the predictions lack quantitative measures of uncertainty. A recently (2004) developed regional model of western Cape Cod, modified to include the sensitivity and parameter-estimation capabilities of MODFLOW-2000, was used in this report to evaluate the utility of inverse (statistical) methods to (1) improve model calibration and (2) assess model-prediction uncertainty. Simulated heads and flows were most sensitive to recharge and to the horizontal hydraulic conductivity of the Buzzards Bay and Sandwich Moraines and the Buzzards Bay and northern parts of the Mashpee outwash plains. Conversely, simulated heads and flows were much less sensitive to vertical hydraulic conductivity. Parameter estimation (inverse calibration) improved the match to observed heads and flows; the absolute mean residual for heads improved by 0.32 feet and the absolute mean residual for streamflows improved by about 0.2 cubic feet per second. Advective-transport predictions in Camp Edwards generally were most sensitive to the parameters with the highest precision (lowest coefficients of variation), indicating that the numerical model is adequate for evaluating prediction uncertainties in and around Camp Edwards. 
The incorporation of an advective-transport observation, representing the leading edge of a contaminant plume that had been difficult to match by using trial-and-error calibration, improved the match between the observed and simulated plume paths; however, a modified representation of local geology was needed to simultaneously maintain a reasonable calibration to heads and flows and to the plume path. Advective-transport uncertainties were expressed as about 68-, 95-, and 99-percent confidence intervals on three-dimensional simulated particle positions. The confidence intervals can be graphically represented as ellipses around individual particle positions in the X-Y (geographic) plane and in the X-Z or Y-Z (vertical) planes. The merging of individual ellipses allows uncertainties on forward particle tracks to be displayed in map or cross-sectional view as a cone of uncertainty around a simulated particle path; uncertainties on reverse particle-track endpoints - representing simulated recharge locations - can be geographically displayed as areas at the water table around the discrete particle endpoints. This information gives decision makers insight into the level of confidence they can have in particle-tracking results and can assist them in the efficient use of available field resources.
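The construction of such confidence ellipses can be sketched from a Monte Carlo cloud of simulated particle positions, assuming approximately Gaussian scatter: the sample covariance and a chi-square quantile define the ellipse axes. The function names are illustrative; this is not the report's implementation:

```python
import numpy as np

def confidence_ellipse(points, level=0.95):
    """Fit a confidence ellipse to a 2-D cloud of simulated particle positions.

    Returns the cloud mean, the ellipse semi-axis lengths, and the axis
    directions (eigenvectors of the sample covariance).  For 2 degrees of
    freedom the chi-square quantile has the closed form -2*ln(1 - level).
    """
    pts = np.asarray(points, dtype=float)
    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)
    q = -2.0 * np.log(1.0 - level)          # chi2.ppf(level, df=2)
    eigvals, eigvecs = np.linalg.eigh(cov)  # ascending, positive for valid cov
    semi_axes = np.sqrt(eigvals * q)        # ellipse half-widths
    return mean, semi_axes, eigvecs

def inside_fraction(points, mean, cov, level=0.95):
    """Fraction of points whose Mahalanobis distance falls within the quantile."""
    q = -2.0 * np.log(1.0 - level)
    diff = np.asarray(points, dtype=float) - mean
    d2 = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
    return float((d2 <= q).mean())
```

Merging ellipses computed at successive particle positions along a track gives the "cone of uncertainty" described above.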

  1. Supplement: “The Rate of Binary Black Hole Mergers Inferred from Advanced LIGO Observations Surrounding GW150914” (2016, ApJL, 833, L1)

    NASA Astrophysics Data System (ADS)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. 
B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D’Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. 
P.; Flaminio, R.; Fletcher, M.; Fong, H.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. 
A.; Merilh, E.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O’Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O’Reilly, B.; O’Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. 
I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wesels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration

    2016-12-01

    This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects of, and our model for, calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.

  2. Values Guide Us in Times of Uncertainty: DACA and Graduate Medical Education.

    PubMed

    Poll-Hunter, Norma I; Young, Geoffrey H; Shick, Matthew

    2017-11-01

    With a new administration and Congress, there is uncertainty surrounding the future of the Deferred Action for Childhood Arrivals (DACA) program. In light of this uncertainty, medical schools have tried to better understand how they can support trainees with DACA. In their article in this issue, Nakae and colleagues describe the issues often encountered by medical students with DACA as they prepare for residency and by the program directors who receive their applications. They offer recommendations for best practices to support these trainees. The authors of this Invited Commentary expand on these important considerations, based on their experiences at a national level. They argue that the core values in academic medicine should drive decision making, the student voice is critical, teamwork is essential, and wellness deserves attention. Academic medicine is part of a larger movement with partners across the health professions and higher education focused on advancing the values of access and opportunity for all. The authors of this Invited Commentary argue that remaining steadfast and committed to the core values in medicine will allow the academic medicine community to successfully navigate these uncertain times.

  3. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses.

    PubMed

    Fuller, Robert William; Wong, Tony E; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
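The workflow, inferring a prior consistent with expert assessments and then confronting it with observations, can be sketched with a toy one-parameter model and importance resampling. The linear model, the parameter range, and the 6 ± 2 cm constraint are assumptions for illustration, not the paper's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the AIS model: the sea-level contribution (cm) is
# linear in a single sensitivity parameter (illustrative assumption).
def ais_contribution(sensitivity, warming=2.0):
    return sensitivity * warming

# "Expert prior": a wide range for the sensitivity parameter, standing in
# for a distribution inferred from divergent expert assessments.
prior = rng.uniform(0.0, 10.0, size=100_000)

# Confront the prior with a hypothetical observational constraint
# (6 +/- 2 cm) via a Gaussian likelihood and normalized importance weights.
obs, obs_sigma = 6.0, 2.0
log_w = -0.5 * ((ais_contribution(prior) - obs) / obs_sigma) ** 2
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Importance resampling yields the posterior: tighter hindcasts and
# projections than the expert prior alone.
posterior = rng.choice(prior, size=50_000, p=w, replace=True)
```

The narrowing from `prior` to `posterior` mirrors the abstract's point that observational constraints tighten the expert-derived distributions.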

  4. Generally Recognized as Safe: Uncertainty Surrounding E-Cigarette Flavoring Safety.

    PubMed

    Sears, Clara G; Hart, Joy L; Walker, Kandi L; Robertson, Rose Marie

    2017-10-23

    Despite scientific uncertainty regarding the relative safety of inhaling e-cigarette aerosol and flavorings, some consumers regard the U.S. Food and Drug Administration's "generally recognized as safe" (GRAS) designation as evidence of flavoring safety. In this study, we assessed how college students' perceptions of e-cigarette flavoring safety are related to understanding of the GRAS designation. During spring 2017, an online questionnaire was administered to college students. Chi-square p-values and multivariable logistic regression were used to compare perceptions between participants who considered e-cigarette flavorings safe and those who considered them unsafe. The total sample size was 567 participants. Only 22% knew that the GRAS designation means that a product is safe to ingest, not to inhale, inject, or use topically. Of participants who considered flavorings to be GRAS, the majority recognized that the designation meant a product is safe to ingest but also considered it safe to inhale. Although scientific uncertainty about the overall safety of flavorings in e-cigarettes remains, health messaging can educate the public about the GRAS designation and its irrelevance to e-cigarette safety.

  5. Interventions to Manage Uncertainty and Fear of Recurrence in Female Breast Cancer Survivors: A Review of the Literature.

    PubMed

    Dawson, Gretchen; Madsen, Lydia T; Dains, Joyce E

    2016-12-01

    Fear of cancer recurrence (FCR) is one of the largest unmet needs in the breast cancer survivor population. This review addresses that unmet need: its purpose is to better understand potential interventions to manage FCR when caring for breast cancer survivors. Databases used were PubMed, CINAHL®, Google Scholar, EMBASE, and Scopus. Articles published in English from 2009-2014 that involved female breast cancer survivors and interventions addressing FCR as an endpoint or outcome measure, or that objectively illustrated an improvement in FCR, were included. One hundred ninety-eight articles were initially identified in the literature search. After detailed review of content for relevance, seven articles met the criteria for inclusion in this review. This literature review provides current evidence on published interventions to manage uncertainty in the female breast cancer survivor population, as well as future research recommendations. Interventions involving mindfulness, uncertainty management, more effective patient-provider communication, and stress counseling are options for managing FCR.

  6. Surrounding land cover types as predictors of palustrine wetland vegetation quality in conterminous USA

    USGS Publications Warehouse

    Stapanian, Martin A.; Gara, Brian; Schumacher, William

    2018-01-01

    The loss of wetland habitats and their often-unique biological communities is a major environmental concern. We examined vegetation data obtained from 380 wetlands sampled in a statistical survey of wetlands in the USA. Our goal was to identify which surrounding land cover types best predict two indices of vegetation quality in wetlands at the regional scale. We considered palustrine wetlands in four regions (Coastal Plains, North Central East, Interior Plains, and West) in which the dominant vegetation was emergent, forested, or scrub-shrub. For each wetland, we calculated weighted proportions of eight land cover types surrounding the area in which vegetation was assessed, in four zones radiating from the edge of the assessment area to 2 km. Using Akaike's Information Criterion, we determined the best 1-, 2- and 3-predictor models of the two indices, using the weighted proportions of the land cover types as potential predictors. Mean values of the two indices were generally higher in the North Central East and Coastal Plains than the other regions for forested and emergent wetlands. In nearly all cases, the best predictors of the indices were not the dominant surrounding land cover types. Overall, proportions of forest (positive effect) and agriculture (negative effect) surrounding the assessment area were the best predictors of the two indices. One or both of these variables were included as predictors in 65 of the 72 models supported by the data. Wetlands surrounding the assessment area had a positive effect on the indices, and ranked third (33%) among the predictors included in supported models. Development had a negative effect on the indices and was included in only 28% of supported models. These results can be used to develop regional management plans for wetlands, such as creating forest buffers around wetlands, or to conserve zones between wetlands to increase habitat connectivity.
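The model-selection step, ranking candidate predictor subsets by Akaike's Information Criterion, can be sketched as an exhaustive search over small OLS models. The helper names are hypothetical, and any data fed in would be synthetic stand-ins for the weighted land cover proportions:

```python
import numpy as np
from itertools import combinations

def aic_ols(X, y):
    """AIC for an ordinary least-squares fit (intercept included).

    Uses the Gaussian log-likelihood up to an additive constant, which is
    all that matters when ranking candidate models on the same data.
    """
    n = len(y)
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    k = A.shape[1] + 1                      # coefficients + error variance
    return n * np.log(rss / n) + 2 * k

def best_subsets(predictors, y, max_size=3):
    """Best 1..max_size-predictor models by AIC.

    `predictors` maps a land cover name to its array of weighted
    proportions across wetlands; returns {m: (aic, best_combo)}.
    """
    names = list(predictors)
    out = {}
    for m in range(1, max_size + 1):
        scored = [
            (aic_ols(np.column_stack([predictors[c] for c in combo]), y), combo)
            for combo in combinations(names, m)
        ]
        out[m] = min(scored)
    return out
```

On synthetic data in which a "forest" proportion raises the index and an "agriculture" proportion lowers it, the best two-predictor model recovers exactly that pair.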

  7. Expanding coverage via tax credits: trade-offs and outcomes.

    PubMed

    Pauly, M; Herring, B

    2001-01-01

    In this paper we discuss various options for using refundable tax credits to reduce the number of uninsured persons. The effect of tax credits on the number of uninsured depends on the form of the credit scheme adopted. Moreover, since large subsidies for private insurance directed to low-income persons have never been implemented, there is considerable uncertainty about the effect of various tax credit proposals. We find that small credits will do little to reduce the number of uninsured but that credits covering about half of the premium for a benchmark policy might have a significant effect, especially if they take a fixed-dollar form and can be used for policies with few restrictions. Finally, we discuss the normative issues surrounding the "costs" of these credit schemes and the policy issues raised by the uncertainty of their effects.

  8. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of the development of a methodology on how to assess the uncertainties related to climate impact indicators. Existing climate data portals mainly treat the uncertainties in two ways: Either they provide generic guidance and/or express with statistical measures the quantifiable fraction of the uncertainty. However, none of the climate data portals give the users a qualitative guidance how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that provides the users with climate impact indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainties associated with a specific climate impact indicator and how these sources affect the overall 'degree of confidence' of this respective indicator. To meet users requirements in the effective communication of uncertainties, their feedback has been involved during the development process of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. 
and Jacob, D.: Robustness of Ensemble Climate Projections Analyzed with Climate Signal Maps: Seasonal and Extreme Precipitation for Germany, Atmosphere, 6(5), 677-698, doi:10.3390/atmos6050677, 2015.
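
    A robustness criterion of this kind (model agreement combined with significance) can be sketched as follows. This is an illustrative reconstruction, not the published Climate Signal Maps algorithm: the 66% agreement threshold and the normal-approximation significance test are assumptions.

```python
import numpy as np

def robust_mask(signals, agree_frac=0.66, z_crit=1.96):
    """Illustrative robustness mask: a grid cell is flagged robust when a
    majority of ensemble members agree on the sign of the ensemble-mean
    change AND the mean change is significant (normal approximation).
    signals: array of shape (n_models, n_cells)."""
    n_models = signals.shape[0]
    mean = signals.mean(axis=0)
    # model agreement: fraction of members sharing the sign of the ensemble mean
    agreement = (np.sign(signals) == np.sign(mean)).mean(axis=0)
    # significance: |mean| exceeds z * standard error of the ensemble mean
    se = signals.std(axis=0, ddof=1) / np.sqrt(n_models)
    significant = np.abs(mean) > z_crit * se
    return (agreement >= agree_frac) & significant

# two toy grid cells: a consistent positive signal vs. a sign-flipping one
ens = np.array([[2.0, 1.0], [2.1, -1.0], [1.9, 1.0], [2.2, -1.0],
                [1.8, 1.0], [2.0, -1.0], [2.1, 1.0], [1.9, -1.0]])
mask = robust_mask(ens)
```

    Only the first cell, where all members agree on a significant positive change, would be shown on the map.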

  9. Uncertainty in mixing models: a blessing in disguise?

    NASA Astrophysics Data System (ADS)

    Delsman, J. R.; Oude Essink, G. H. P.

    2012-04-01

    Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few studies have addressed the associated uncertainty in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing model analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km² agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for a more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework not only quantified the uncertainty associated with the analysis; examination of the posterior parameter set also revealed catchment processes that would otherwise have been overlooked.
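
    The GLUE-flavoured Monte Carlo mixing idea can be sketched for the simplest possible case, two end-members and one tracer. All concentrations below are hypothetical (not the polder study's data), and the uniform end-member ranges and the feasibility criterion (mixing fraction inside [0, 1]) are assumptions standing in for the paper's acceptance rule.

```python
import random

def glue_mixing(c_sample, em1, em2, n=10000, seed=0):
    """Sketch of a GLUE-like two-end-member mixing analysis. End-member
    concentrations are drawn from uniform ranges reflecting their
    variability; parameter sets yielding an infeasible mixing fraction
    are discarded, and the retained 'behavioural' sets form the posterior."""
    rng = random.Random(seed)
    posterior = []
    for _ in range(n):
        c1 = rng.uniform(*em1)  # e.g. brackish seepage end-member
        c2 = rng.uniform(*em2)  # e.g. fresh flushing-water end-member
        f = (c_sample - c2) / (c1 - c2)  # fraction of end-member 1 in the sample
        if 0.0 <= f <= 1.0:
            posterior.append(f)
    return posterior

# hypothetical chloride concentrations (mg/L)
post = glue_mixing(c_sample=800.0, em1=(2000.0, 4000.0), em2=(20.0, 100.0))
mean_f = sum(post) / len(post)
```

    The spread of the retained posterior fractions, rather than a single point estimate, is then the reported result.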

  10. Beam-specific planning volumes for scattered-proton lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Flampouri, S.; Hoppe, B. S.; Slopsema, R. L.; Li, Z.

    2014-08-01

    This work describes the clinical implementation of a beam-specific planning treatment volume (bsPTV) calculation for lung cancer proton therapy and its integration into the treatment planning process. Uncertainties incorporated in the calculation of the bsPTV included setup errors, machine delivery variability, breathing effects, inherent proton range uncertainties and combinations of the above. Margins were added for translational and rotational setup errors and breathing motion variability during the course of treatment as well as for their effect on the proton range of each treatment field. The effect of breathing motion and deformation on the proton range was calculated from 4D computed tomography data. Range uncertainties were considered taking into account the individual voxel HU uncertainty along each proton beamlet. Beam-specific treatment volumes generated for 12 patients were used: a) as planning targets, b) for routine plan evaluation, c) to aid beam angle selection and d) to create beam-specific margins for organs at risk to ensure sparing. The alternative planning technique based on the bsPTVs produced target coverage similar to that of the conventional proton plans while better sparing the surrounding tissues. Conventional proton plans were evaluated by comparing the dose distributions per beam with the corresponding bsPTV. The bsPTV volume as a function of beam angle revealed some unexpected sources of uncertainty and could help the planner choose more robust beams. Beam-specific planning volume for the spinal cord was used for dose distribution shaping to ensure organ sparing laterally and distally to the beam.

  11. Early-stage valuation of medical devices: the role of developmental uncertainty.

    PubMed

    Girling, Alan; Young, Terry; Brown, Celia; Lilford, Richard

    2010-08-01

    At the concept stage, many uncertainties surround the commercial viability of a new medical device. These include the ultimate functionality of the device, the cost of producing it and whether, and at what price, it can be sold to a health-care provider (HCP). Simple assessments of value can be made by estimating such unknowns, but the levels of uncertainty may mean that their operational value for investment decisions is unclear. However, many decisions taken at the concept stage are reversible and will be reconsidered later before the product is brought to market. This flexibility can be exploited to enhance early-stage valuations. Our objective is to develop a framework for valuing a new medical device at the concept stage that balances benefit to the HCP against commercial costs, within a simplified stage-gated model of the development cycle for new products. The approach is intended to complement existing proposals for the evaluation of the commercial headroom available to new medical products. A model based on two decision gates can lead to lower bounds (underestimates) for product value that can serve to support a decision to develop the product. Quantifiable uncertainty that can be resolved before the device is brought to market will generally enhance early-stage valuations of the device, and this remains true even when some components of uncertainty cannot be fully described. Clinical trials and other evidence-gathering activities undertaken as part of the development process can contribute to early-stage estimates of value.
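
    The core two-gate argument can be illustrated with a toy calculation. All numbers are made up, and the simple scenario table stands in for whatever distribution the developer holds over the device's eventual value: because uncertainty resolves before the launch gate, loss-making scenarios can be abandoned, so the flexible valuation is never below the commit-now valuation.

```python
def concept_stage_value(scenarios, dev_cost, launch_cost):
    """Toy two-gate valuation. scenarios: (probability, gross value) pairs.
    'commit_now' values the device as if launch were irreversible;
    'with_option' launches only in scenarios where launch pays off,
    giving a higher (and still conservative) concept-stage value."""
    commit_now = sum(p * v for p, v in scenarios) - launch_cost - dev_cost
    with_option = sum(p * max(0.0, v - launch_cost) for p, v in scenarios) - dev_cost
    return commit_now, with_option

# hypothetical scenarios (probability, gross value to the HCP market, $M)
scenarios = [(0.5, 10.0), (0.3, 4.0), (0.2, 0.0)]
commit, flexible = concept_stage_value(scenarios, dev_cost=1.0, launch_cost=3.0)
```

    Here the flexible valuation exceeds the commit-now valuation because the worst scenario is abandoned at the second gate; this is the Jensen-type inequality E[max(0, V)] >= max(0, E[V]) underlying the lower-bound argument.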

  12. The potential for meta-analysis to support decision analysis in ecology.

    PubMed

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty.
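
    The precision gain that meta-analysis feeds into a decision analysis can be shown with standard inverse-variance (fixed-effect) pooling. The study estimates below are invented; the point is only that the pooled variance is smaller than any single study's variance, which tightens the uncertainty entering the decision model.

```python
def pool_fixed_effect(estimates):
    """Inverse-variance (fixed-effect) meta-analytic pooling.
    estimates: list of (mean, variance) pairs, one per study."""
    weights = [1.0 / v for _, v in estimates]
    wsum = sum(weights)
    pooled_mean = sum(w * m for (m, _), w in zip(estimates, weights)) / wsum
    pooled_var = 1.0 / wsum  # always below the smallest single-study variance
    return pooled_mean, pooled_var

# hypothetical effect-size estimates from three studies
studies = [(0.8, 0.04), (0.6, 0.09), (0.7, 0.02)]
m, v = pool_fixed_effect(studies)
```

    A decision analysis that previously had to use the single best study (variance 0.02) can instead use the pooled estimate with roughly half that variance, shrinking the uncertainty in expected loss accordingly.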

  13. Deep Uncertainty Surrounding Coastal Flood Risk Projections: A Case Study for New Orleans

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Keller, Klaus

    2017-10-01

    Future sea-level rise drives severe risks for many coastal communities. Strategies to manage these risks hinge on a sound characterization of the uncertainties. For example, recent studies suggest that large fractions of the Antarctic ice sheet (AIS) may rapidly disintegrate in response to rising global temperatures, leading to potentially several meters of sea-level rise during the next few centuries. It is deeply uncertain, for example, whether such an AIS disintegration will be triggered, how much this would increase sea-level rise, whether extreme storm surges intensify in a warming climate, or which emissions pathway future societies will choose. Here, we assess the impacts of these deep uncertainties on projected flooding probabilities for a levee ring in New Orleans, LA. We use 18 scenarios, presenting probabilistic projections within each one, to sample key deeply uncertain future projections of sea-level rise, radiative forcing pathways, storm surge characterization, and contributions from rapid AIS mass loss. The implications of these deep uncertainties for projected flood risk are thus characterized by a set of 18 probability distribution functions. We use a global sensitivity analysis to assess which mechanisms contribute to uncertainty in projected flood risk over the course of a 50-year design life. In line with previous work, we find that the uncertain storm surge drives the most substantial risk, followed by general AIS dynamics, in our simple model for future flood risk for New Orleans.
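
    One concrete reasoning step behind the 50-year design-life framing is how annual exceedance probabilities accumulate over a structure's lifetime. The annual probability below is a generic "100-year flood" figure under an assumed stationary climate, not a value from the study.

```python
def lifetime_flood_probability(annual_p, years=50):
    """Probability of at least one flood over a design life, assuming
    independent years: 1 - (1 - p)^n. Even a modest annual probability
    compounds substantially over a 50-year design life."""
    return 1.0 - (1.0 - annual_p) ** years

# a nominal 100-year flood level (annual exceedance probability 0.01)
p50 = lifetime_flood_probability(0.01)
```

    Under non-stationary sea level, the annual probability itself grows through time, which is why the deep uncertainties sampled across the 18 scenarios matter so much for lifetime risk.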

  14. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
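
    The basic objects of such a propagation can be sketched as follows. This is a minimal illustration of sampling-based belief/plausibility computation for interval focal elements, not the strategy of the report itself; the toy model and the body of evidence are invented.

```python
import random

def belief_plausibility(model, focal_elements, threshold, n=2000, seed=1):
    """Sketch of sampling-based evidence-theory propagation. Input
    uncertainty is a body of evidence: interval focal elements with basic
    probability assignments (masses). Sampling within each focal element
    locates the model's response range; a focal element whose entire range
    lies at or below `threshold` contributes its mass to belief, and one
    whose range merely intersects the region contributes to plausibility."""
    rng = random.Random(seed)
    bel = pl = 0.0
    for (lo, hi), mass in focal_elements:
        ys = [model(rng.uniform(lo, hi)) for _ in range(n)]
        if max(ys) <= threshold:
            bel += mass  # whole response range supports the proposition
        if min(ys) <= threshold:
            pl += mass   # response range is consistent with the proposition
    return bel, pl

model = lambda x: x ** 2  # stand-in for an expensive simulation
evidence = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
bel, pl = belief_plausibility(model, evidence, threshold=1.0)
```

    The interval [belief, plausibility] (here [0.5, 0.8]) replaces the single probability that a purely probabilistic treatment would force; the computational burden comes from needing the response extremes over every focal element, which is what the report's strategy aims to make affordable.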

  15. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  16. Lessons Learned from the Soviet Withdrawal from Afghanistan: Examples for U.S. Policy Concerning Central Asia and Afghanistan after 2014

    DTIC Science & Technology

    2014-12-01

    7. 248 A great deal of uncertainty surrounds the incident as the government suppressed media reporting and blocked all forms of communication ... A Post-2014 Strategy for Central Asia. Carlisle, PA: Army War College, 2012. http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html...

  17. The swine flu vaccine, public attitudes, and researcher interpretations: a systematic review of qualitative research.

    PubMed

    Carlsen, Benedicte; Glenton, Claire

    2016-06-24

    During pandemics, health authorities may be uncertain about the spread and severity of the disease and the effectiveness and safety of available interventions. This was the case during the swine flu (H1N1) pandemic of 2009-2010, and governments were forced to make decisions despite these uncertainties. While many countries chose to implement wide-scale vaccination programmes, few accomplished their vaccination goals. Many research studies aiming to explore barriers and facilitators to vaccine uptake have been conducted in the aftermath of the pandemic, including several qualitative studies. Our objectives were: 1. To explore public attitudes to the swine flu vaccine in different countries through a review of qualitative primary studies. 2. To describe and discuss the implications drawn by the primary study authors. We conducted a systematic review of qualitative research studies, using a broadly comparative cross-case study approach. Study quality was appraised using an adaptation of the Critical Appraisal Skills Programme (CASP) quality assessment tool. The review indicates that the public had varying opinions about disease risk and prevalence and had concerns about vaccine safety. Most primary study authors concluded that participants were uninformed, and that more information about the disease and the vaccine would have led to an increase in vaccine uptake. We find these conclusions problematic. We suggest instead that people's questions and concerns were legitimate given the uncertainties of the situation at the time and the fact that the authorities did not have the necessary information to convince the public. Our quality assessment of the included studies points to a lack of reflexivity and a lack of information about study context. We suggest that these study weaknesses are tied to primary study authors' lack of acknowledgement of the uncertainties surrounding the disease and the vaccine. 
While primary study authors suggest that authorities could increase vaccine uptake through increased information, we suggest instead that health authorities should be more transparent in their information and decision-making processes in future pandemic situations.

  18. A review of uncertainty in in situ measurements and data sets of sea surface temperature

    NASA Astrophysics Data System (ADS)

    Kennedy, John J.

    2014-03-01

    Archives of in situ sea surface temperature (SST) measurements extend back more than 160 years. Quality of the measurements is variable, and the area of the oceans they sample is limited, especially early in the record and during the two world wars. Measurements of SST and the gridded data sets that are based on them are used in many applications so understanding and estimating the uncertainties are vital. The aim of this review is to give an overview of the various components that contribute to the overall uncertainty of SST measurements made in situ and of the data sets that are derived from them. In doing so, it also aims to identify current gaps in understanding. Uncertainties arise at the level of individual measurements with both systematic and random effects and, although these have been extensively studied, refinement of the error models continues. Recent improvements have been made in the understanding of the pervasive systematic errors that affect the assessment of long-term trends and variability. However, the adjustments applied to minimize these systematic errors are uncertain and these uncertainties are higher before the 1970s and particularly large in the period surrounding the Second World War owing to a lack of reliable metadata. The uncertainties associated with the choice of statistical methods used to create globally complete SST data sets have been explored using different analysis techniques, but they do not incorporate the latest understanding of measurement errors, and they lack a fair benchmark against which their skill can be objectively assessed. These problems can be addressed by the creation of new end-to-end SST analyses and by the recovery and digitization of data and metadata from ship log books and other contemporary literature.
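
    Why pervasive systematic errors matter more than random ones for gridded averages can be shown with the textbook error model: averaging n independent observations shrinks the random component by 1/sqrt(n), but a shared systematic (bias-like) component is untouched. The magnitudes below are illustrative, not values from the review.

```python
import math

def gridbox_uncertainty(sigma_random, sigma_systematic, n_obs):
    """Combined standard uncertainty of a grid-box mean, assuming the
    random errors of the n observations are independent while the
    systematic error is fully shared across them."""
    u_random = sigma_random / math.sqrt(n_obs)
    return math.sqrt(u_random ** 2 + sigma_systematic ** 2)

# hypothetical per-observation errors (K): random 1.0, shared systematic 0.2
few = gridbox_uncertainty(1.0, 0.2, n_obs=4)     # sparse sampling
many = gridbox_uncertainty(1.0, 0.2, n_obs=400)  # dense sampling
```

    With dense sampling the total approaches the 0.2 K systematic floor, which is why uncertain bias adjustments, not random measurement noise, dominate the uncertainty of long-term trends.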

  19. “Wrong, but Useful”: Negotiating Uncertainty in Infectious Disease Modelling

    PubMed Central

    Christley, Robert M.; Mort, Maggie; Wynne, Brian; Wastling, Jonathan M.; Heathwaite, A. Louise; Pickup, Roger; Austin, Zoë; Latham, Sophia M.

    2013-01-01

    For infectious disease dynamical models to inform policy for containment of infectious diseases the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to evaluate sufficiently model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be ‘used’. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in negotiating model credibility. We argue that usability and stability of a model is an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that renders uncertainty in infectious disease modelling a plastic quality that is amenable to ‘interpretive flexibility’. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts. PMID:24146851

  20. NASA MEaSUREs Combined ASTER and MODIS Emissivity over Land (CAMEL) Uncertainty Estimation

    NASA Astrophysics Data System (ADS)

    Feltz, M.; Borbas, E. E.; Knuteson, R. O.; Hulley, G. C.; Hook, S. J.

    2016-12-01

    Under the NASA MEaSUREs project a new global land surface emissivity database is being made available as part of the Unified and Coherent Land Surface Temperature and Emissivity Earth System Data Record. This new CAMEL emissivity database is created by the merging of the MODIS baseline-fit emissivity database (UWIREMIS) developed at the University of Wisconsin-Madison and the ASTER Global Emissivity Dataset v4 produced at the Jet Propulsion Laboratory. The combined CAMEL product leverages the ability of ASTER's 5 bands to more accurately resolve the TIR (8-12 micron) region and the ability of UWIREMIS to provide information throughout the 3.6-12 micron IR region. It will be made available for 2000 through 2017 at monthly mean, 5 km resolution for 13 bands within the 3.6-14.3 micron region, and will also be extended to 417 infrared spectral channels using a principal component regression approach. Uncertainty estimates of the CAMEL will be provided that combine temporal, spatial, and algorithm variability as part of a total uncertainty estimate for the emissivity product. The spatial and temporal uncertainties are calculated as the standard deviation of the surrounding 5x5 pixels and the 3 neighboring months, respectively, while the algorithm uncertainty is calculated using a measure of the difference between the two CAMEL emissivity inputs: the ASTER GED and MODIS baseline-fit products. This work describes these uncertainty estimation methods in detail and shows first results. Global, monthly results for different seasons are shown as well as case study examples at locations with different land surface types. Comparisons of the case studies to both lab values and an independent emissivity climatology derived from IASI measurements (Dan Zhou et al., IEEE Trans., 2011) are included.

  1. Habitat edges have weak effects on duck nest survival at local spatial scales

    USGS Publications Warehouse

    Raquel, Amelia J; Ringelman, Kevin M.; Ackerman, Joshua T.; Eadie, John M.

    2015-01-01

    Edge effects on nesting success have been documented in breeding birds in a variety of contexts, but there is still uncertainty in how edge type and spatial scale determine the magnitude and detectability of edge effects. Habitat edges are often viewed as predator corridors that surround or penetrate core habitat and increase the risk of predation for nearby nests. We studied the effects of three different types of potential predator corridors (main perimeter roads, field boundaries, and ATV trails within fields) on waterfowl nest survival in California. We measured the distance from duck nests to the nearest edge of each type, and used distance as a covariate in a logistic exposure analysis of nest survival. We found only weak evidence for edge effects due to predation. The best supported model of nest survival included all three distance categories, and while all coefficient estimates were positive (indicating that survival increased with distance from edge), 85% coefficient confidence intervals approached or bounded zero indicating an overall weak effect of habitat edges on nest success. We suggest that given the configuration of edges at our site, there may be few areas far enough from hard edges to be considered ‘core’ habitat, making edge effects on nest survival particularly difficult to detect.
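
    The logistic exposure idea used here can be sketched in a few lines: daily survival rate (DSR) is a logistic function of the distance-to-edge covariate, and survival over an exposure interval is DSR raised to the number of exposure days. The coefficients and exposure length below are hypothetical, not the study's estimates.

```python
import math

def logistic_exposure_survival(dist, beta0, beta1, exposure_days):
    """Nest survival over an exposure interval under a logistic-exposure
    model: daily survival rate = logistic(beta0 + beta1 * distance),
    interval survival = DSR ** exposure_days."""
    dsr = 1.0 / (1.0 + math.exp(-(beta0 + beta1 * dist)))
    return dsr ** exposure_days

# a small positive beta1 encodes survival weakly increasing with distance
near = logistic_exposure_survival(dist=10.0, beta0=3.0, beta1=0.002, exposure_days=28)
far = logistic_exposure_survival(dist=500.0, beta0=3.0, beta1=0.002, exposure_days=28)
```

    Because survival compounds daily, even a coefficient whose confidence interval nearly bounds zero translates into a noticeable difference over a full nesting period, which is why the exposure formulation is used rather than a plain logistic regression on nest fate.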

  2. Low atmospheric CO(2) levels during the Permo- Carboniferous glaciation inferred from fossil lycopsids.

    PubMed

    Beerling, D J

    2002-10-01

    Earth history was punctuated during the Permo-Carboniferous [300-250 million years (Myr) ago] by the longest and most severe glaciation of the entire Phanerozoic Eon. But significant uncertainty surrounds the concentration of CO(2) in the atmosphere through this time interval and therefore its role in the evolution of this major pre-Pleistocene glaciation. Here, I derive 24 Late Paleozoic CO(2) estimates from the fossil cuticle record of arborescent lycopsids of the equatorial Carboniferous and Permian swamp communities. Quantitative calibration of Late Carboniferous (330-300 Myr ago) and Permian (270-260 Myr ago) lycopsid stomatal indices yield average atmospheric CO(2) concentrations of 344 ppm and 313 ppm, respectively. The reconstructions show a high degree of self-consistency and a degree of precision an order of magnitude greater than other approaches. Low CO(2) levels during the Permo-Carboniferous glaciation are in agreement with glaciological evidence for the presence of continental ice and coupled models of climate and ice-sheet growth on Pangea. Moreover, the Permian data indicate atmospheric CO(2) levels were low 260 Myr ago, by which time continental deglaciation was already underway. Positive biotic feedbacks on climate, and geotectonic events, therefore are implicated as mechanisms underlying deglaciation.

  3. Low atmospheric CO2 levels during the Permo- Carboniferous glaciation inferred from fossil lycopsids

    PubMed Central

    Beerling, D. J.

    2002-01-01

    Earth history was punctuated during the Permo-Carboniferous [300–250 million years (Myr) ago] by the longest and most severe glaciation of the entire Phanerozoic Eon. But significant uncertainty surrounds the concentration of CO2 in the atmosphere through this time interval and therefore its role in the evolution of this major pre-Pleistocene glaciation. Here, I derive 24 Late Paleozoic CO2 estimates from the fossil cuticle record of arborescent lycopsids of the equatorial Carboniferous and Permian swamp communities. Quantitative calibration of Late Carboniferous (330–300 Myr ago) and Permian (270–260 Myr ago) lycopsid stomatal indices yield average atmospheric CO2 concentrations of 344 ppm and 313 ppm, respectively. The reconstructions show a high degree of self-consistency and a degree of precision an order of magnitude greater than other approaches. Low CO2 levels during the Permo-Carboniferous glaciation are in agreement with glaciological evidence for the presence of continental ice and coupled models of climate and ice-sheet growth on Pangea. Moreover, the Permian data indicate atmospheric CO2 levels were low 260 Myr ago, by which time continental deglaciation was already underway. Positive biotic feedbacks on climate, and geotectonic events, therefore are implicated as mechanisms underlying deglaciation. PMID:12235372

  4. Cell-free fetal DNA testing: A pilot study of obstetric healthcare provider attitudes towards clinical implementation

    PubMed Central

    Sayres, Lauren; Allyse, Megan; Norton, Mary; Cho, Mildred

    2011-01-01

    Objective To provide a preliminary assessment of obstetric healthcare provider opinions surrounding implementation of cell-free fetal DNA testing. Methods A 37-question pilot survey was used to address questions around the translation and use of non-invasive prenatal testing using cell-free fetal DNA. The survey was distributed and collected at a Continuing Medical Education course on obstetrics and gynecology. Results Of 62 survey respondents, 73% are female and 87% hold MD/DO degrees. Respondents generally agree that patients want prenatal diagnostic information to help make decisions about a pregnancy and that cell-free fetal DNA testing will encourage the testing of more patients for more conditions. However, there is an overall lack of knowledge or conviction about using this technology. Genetic counseling and professional society approval are deemed important to implementation whereas the possibility of direct-to-consumer testing and government regulation produce mixed responses. Respondents indicate that they are more likely to offer cell-free fetal DNA testing for chromosomal abnormalities and single-gene disorders, but are cautious with respect to determination of sex and behavioral or late-onset conditions. Conclusion Preliminary assessment indicates uncertainty among obstetric providers about the details of implementing cell-free fetal DNA testing and suggests expanded research on perspectives of this stakeholder group. PMID:21793012

  5. Implications of Uncertainty in Fossil Fuel Emissions for Terrestrial Ecosystem Modeling

    NASA Astrophysics Data System (ADS)

    King, A. W.; Ricciuto, D. M.; Mao, J.; Andres, R. J.

    2017-12-01

    Given observations of the increase in atmospheric CO2, estimates of anthropogenic emissions and models of oceanic CO2 uptake, one can estimate net global CO2 exchange between the atmosphere and terrestrial ecosystems as the residual of the balanced global carbon budget. Estimates from the Global Carbon Project 2016 show that terrestrial ecosystems are a growing sink for atmospheric CO2 (averaging 2.12 Gt C y-1 for the period 1959-2015 with a growth rate of 0.03 Gt C y-1 per year) but with considerable year-to-year variability (standard deviation of 1.07 Gt C y-1). Within the uncertainty of the observations, emissions estimates and ocean modeling, this residual calculation is a robust estimate of a global terrestrial sink for CO2. A task of terrestrial ecosystem science is to explain the trend and variability in this estimate. However, "within the uncertainty" is an important caveat. The uncertainty (2σ; 95% confidence interval) in fossil fuel emissions is 8.4% (±0.8 Gt C in 2015). Combined with uncertainty in other carbon budget components, the 2σ uncertainty surrounding the global net terrestrial ecosystem CO2 exchange is ±1.6 Gt C y-1. Ignoring the uncertainty, the estimate of a general terrestrial sink includes 2 years (1987 and 1998) in which terrestrial ecosystems are a small source of CO2 to the atmosphere. However, with 2σ uncertainty, terrestrial ecosystems may have been a source in as many as 18 years. We examine how well global terrestrial biosphere models simulate the trend and interannual variability of the global-budget estimate of the terrestrial sink within the context of this uncertainty (e.g., which models fall outside the 2σ uncertainty and in what years). Models are generally capable of reproducing the trend in net terrestrial exchange, but are less able to capture interannual variability and often fall outside the 2σ uncertainty. 
The trend in the residual carbon budget estimate is primarily associated with the increase in atmospheric CO2, while interannual variation is related to variations in global land-surface temperature with weaker sinks in warmer years. We examine whether these relationships are reproduced in models. Their absence might explain weaknesses in model simulations or in the reconstruction of historical climate used as drivers in model intercomparison projects (MIPs).
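
    The residual-budget calculation and its combined uncertainty can be written out explicitly. The component values below are illustrative, roughly GCP-2016-like figures for 2015 (Gt C/yr), and the quadrature combination of the 2-sigma component uncertainties is the standard independence assumption.

```python
import math

def residual_land_sink(e_ff, e_luc, g_atm, s_ocean, u):
    """Residual global land sink from the balanced carbon budget:
    S_land = E_fossil + E_landuse - G_atm - S_ocean.
    u: 2-sigma uncertainties of the four components, combined in
    quadrature (assumes independent errors)."""
    sink = e_ff + e_luc - g_atm - s_ocean
    u_total = math.sqrt(sum(x * x for x in u))
    return sink, u_total

sink, u2 = residual_land_sink(e_ff=9.9, e_luc=1.3, g_atm=6.3, s_ocean=3.0,
                              u=(0.8, 1.0, 0.4, 1.0))
```

    A sink of roughly 2 Gt C/yr with a 2-sigma uncertainty of comparable magnitude is exactly the situation described above: the sign of the global sink is robust in most years, but individual years can flip within the uncertainty envelope.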

  6. SUPPLEMENT: “THE RATE OF BINARY BLACK HOLE MERGERS INFERRED FROM ADVANCED LIGO OBSERVATIONS SURROUNDING GW150914” (2016, ApJL, 833, L1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, B. P.; Abbott, R.; Abernathy, M. R.

    This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.

  7. Measurement uncertainty budget of an interferometric flow velocity sensor

    NASA Astrophysics Data System (ADS)

    Bermuske, Mike; Büttner, Lars; Czarske, Jürgen

    2017-06-01

    Flow rate measurements are a common topic for process monitoring in chemical engineering and food industry. To achieve the requested low uncertainties of 0.1% for flow rate measurements, a precise measurement of the shear layers of such flows is necessary. The Laser Doppler Velocimeter (LDV) is an established method for measuring local flow velocities. For exact estimation of the flow rate, the flow profile in the shear layer is of importance. For standard LDV the axial resolution and therefore the number of measurement points in the shear layer is defined by the length of the measurement volume. A decrease of this length is accompanied by a larger fringe distance variation along the measurement axis which results in a rise of the measurement uncertainty for the flow velocity (uncertainty relation between spatial resolution and velocity uncertainty). As a unique advantage, the laser Doppler profile sensor (LDV-PS) overcomes this problem by using two fan-like fringe systems to obtain the position of the measured particles along the measurement axis and therefore achieve a high spatial resolution while it still offers a low velocity uncertainty. With this technique, the flow rate can be estimated with one order of magnitude lower uncertainty, down to 0.05% statistical uncertainty, and flow profiles, especially in film flows, can be measured more accurately. The problem for this technique is, in contrast to laboratory setups where the system is quite stable, that for industrial applications the sensor needs a reliable and robust traceability to the SI units, meter and second. Small deviations in the calibration can, because of the highly position-dependent calibration function, cause large systematic errors in the measurement result. Therefore, a simple, stable and accurate tool is needed that can easily be used in industrial surroundings to check or recalibrate the sensor. 
In this work, different calibration methods are presented and their influence on the measurement uncertainty budget of the sensor is discussed. Finally, measurement results for the film flow of an impinging jet cleaning experiment are presented.

  8. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    NASA Astrophysics Data System (ADS)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the most relevant parameters for the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool in the robust design and prediction of PEH performance.
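The variance-based global sensitivity analysis described above can be sketched with Saltelli's two-matrix Monte Carlo estimator for first-order Sobol indices. The surrogate output model, parameter ranges, and variable names below are illustrative assumptions, not the paper's harvester model:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20_000

# Hypothetical surrogate for the harvester's peak FRF voltage as a function
# of the piezoelectric layer's elastic modulus E, density rho and thickness t.
def peak_voltage(E, rho, t):
    return E * t**2 / rho

def sample(n):
    E   = rng.uniform(60e9, 70e9, n)       # Pa
    rho = rng.uniform(7500, 7900, n)       # kg/m^3
    t   = rng.uniform(0.2e-3, 0.3e-3, n)   # m
    return np.column_stack([E, rho, t])

# Saltelli's scheme: two independent sample matrices A and B.
A, B = sample(N), sample(N)
fA, fB = peak_voltage(*A.T), peak_voltage(*B.T)
var = np.var(np.concatenate([fA, fB]))

S = {}
for i, name in enumerate(["E", "rho", "t"]):
    AB = A.copy()
    AB[:, i] = B[:, i]                      # A with column i taken from B
    S[name] = np.mean(fB * (peak_voltage(*AB.T) - fA)) / var
    print(f"S_{name} = {S[name]:.2f}")
```

With these (made-up) ranges the thickness dominates the output variance, mirroring the abstract's finding that a few layer parameters drive most of the variability.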

  9. Optical fibers and Fluorosensors having improved power efficiency and methods of producing same

    NASA Technical Reports Server (NTRS)

    Egalon, Claudio O. (Inventor); Rogowski, Robert S. (Inventor)

    1993-01-01

    Optical fibers may have applications including fluorosensors which sense the concentration of an analyte. Like communication fibers, these fluorosensors are modeled using a weakly guiding approximation which is only effective when the difference between the respective refractive indices of the fiber core and surrounding cladding is minimal. An optical fiber fluorosensor is provided having a portion of a fiber core which is surrounded by an active cladding which is permeable by the analyte to be sensed and contains substances which emit light waves upon excitation. A remaining portion of the fiber core is surrounded by a guide cladding which guides these light waves to a sensor which detects the intensity of the waves, which is a function of the analyte concentration. Contrary to conventional weakly guiding principles, the fiber core is surrounded by an active cladding which is thin enough that its index of refraction is effectively that of the surrounding atmosphere; the resulting large difference between the respective indices of refraction of the fiber core and the cladding results in an unexpected increase in the power efficiency of the fiber.

  10. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework, as provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
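The residual block-bootstrap idea behind the abstract can be sketched on synthetic data. The series, block length, and helper names below are hypothetical, not the study's model; the point is that resampling residuals in contiguous blocks preserves their serial correlation when building confidence bands:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily streamflow: a seasonal "simulation" plus correlated errors.
n = 365
sim = 10 + 5 * np.sin(2 * np.pi * np.arange(n) / 365)
obs = sim + 0.05 * rng.normal(0, 1, n).cumsum() + rng.normal(0, 0.5, n)

residuals = obs - sim

# Moving-block bootstrap: resample residual blocks to preserve short-range
# serial correlation, then form alternative plausible simulated series.
def block_bootstrap(res, block_len=30, n_boot=1000):
    n = len(res)
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, n // block_len + 1))
    out = np.empty((n_boot, n))
    for b in range(n_boot):
        resampled = np.concatenate([res[s:s + block_len] for s in starts[b]])
        out[b] = resampled[:n]
    return out

boot = sim + block_bootstrap(residuals)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)   # 95% band per day
coverage = np.mean((obs >= lo) & (obs <= hi))
print(f"fraction of observations inside the 95% band: {coverage:.2f}")
```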

  11. Population growth of Yellowstone grizzly bears: Uncertainty and future monitoring

    USGS Publications Warehouse

    Harris, R.B.; White, Gary C.; Schwartz, C.C.; Haroldson, M.A.

    2007-01-01

    Grizzly bears (Ursus arctos) in the Greater Yellowstone Ecosystem of the US Rocky Mountains have recently increased in numbers, but remain vulnerable due to isolation from other populations and predicted reductions in favored food resources. Harris et al. (2006) projected how this population might fare in the future under alternative survival rates, and in doing so estimated the rate of population growth, 1983–2002. We address issues that remain from that earlier work: (1) the degree of uncertainty surrounding our estimates of the rate of population change (λ); (2) the effect of correlation among demographic parameters on these estimates; and (3) how a future monitoring system using counts of females accompanied by cubs might usefully differentiate between short-term, expected, and inconsequential fluctuations versus a true change in system state. We used Monte Carlo re-sampling of beta distributions derived from the demographic parameters used by Harris et al. (2006) to derive distributions of λ during 1983–2002 given our sampling uncertainty. Approximate 95% confidence intervals were 0.972–1.096 (assuming females with unresolved fates died) and 1.008–1.115 (with unresolved females censored at last contact). We used well-supported models of Haroldson et al. (2006) and Schwartz et al. (2006a,b,c) to assess the strength of correlations among demographic processes and the effect of omitting them in projection models. Incorporating correlations among demographic parameters yielded point estimates of λ that were nearly identical to those from the earlier model that omitted correlations, but yielded wider confidence intervals surrounding λ. Finally, we suggest that fitting linear and quadratic curves to the trend suggested by the estimated number of females with cubs in the ecosystem, and using AICc model weights to infer population sizes and λ, provides an objective means of monitoring approximate population trajectories in addition to demographic analysis.
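The Monte Carlo re-sampling of beta-distributed vital rates can be illustrated as follows. The vital-rate means, standard deviations, and the two-stage projection matrix are hypothetical placeholders, not the Yellowstone estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws = 10_000

# Convert a mean m and standard deviation s into beta shape parameters.
def beta_params(m, s):
    common = m * (1 - m) / s**2 - 1
    return m * common, (1 - m) * common

# Hypothetical beta-distributed vital rates (illustrative values only).
sa = rng.beta(*beta_params(0.95, 0.02), n_draws)  # adult female survival
sj = rng.beta(*beta_params(0.85, 0.05), n_draws)  # juvenile survival
f  = rng.beta(*beta_params(0.30, 0.05), n_draws)  # female cubs per adult female

# lambda = dominant eigenvalue of the 2-stage (juvenile, adult) projection
# matrix [[0, f], [sj, sa]], computed once per Monte Carlo replicate.
lams = np.array([
    np.max(np.abs(np.linalg.eigvals(np.array([[0.0,  f[i]],
                                              [sj[i], sa[i]]]))))
    for i in range(n_draws)
])

lo, hi = np.percentile(lams, [2.5, 97.5])
print(f"approximate 95% CI for lambda: {lo:.3f}-{hi:.3f}")
```

Percentiles of the resulting λ distribution play the role of the approximate confidence intervals reported in the abstract.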

  12. Representing uncertainty in objective functions: extension to include the influence of serial correlation

    NASA Astrophysics Data System (ADS)

    Croke, B. F.

    2008-12-01

    The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in the observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If the model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends this previous work, addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
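A minimal sketch of an objective function that weights each residual by the combined observation and model-prediction variance, with the serial-correlation extension expressed as a full residual covariance matrix. All values and names are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

# Each squared residual is weighted by the combined variance of observation
# and model-prediction uncertainty, so noisier points contribute less.
def weighted_sse(obs, mod, var_obs, var_mod):
    return np.sum((obs - mod) ** 2 / (var_obs + var_mod))

obs = np.array([1.0, 2.0, 4.0, 8.0])
mod = np.array([1.1, 1.8, 4.5, 7.0])
var_obs = 0.1 * obs              # heteroscedastic: uncertainty grows with flow
var_mod = np.full(4, 0.05)

phi = weighted_sse(obs, mod, var_obs, var_mod)
print(f"weighted SSE: {phi:.4f}")

# With serial correlation, the diagonal weights generalize to a full residual
# covariance matrix C: phi = r^T C^{-1} r. Off-diagonal terms would carry the
# correlation; here C is diagonal, so the two forms agree.
r = obs - mod
C = np.diag(var_obs + var_mod)
phi_gls = r @ np.linalg.solve(C, r)
print(f"GLS form:     {phi_gls:.4f}")
```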

  13. A Framework for Modeling Emerging Diseases to Inform Management

    PubMed Central

    Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan H.C.

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge. PMID:27983501

  14. Effects of Uncertainty in TRMM Precipitation Radar Path Integrated Attenuation on Interannual Variations of Tropical Oceanic Rainfall

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Fitzjarrald, Dan E.; Kummerow, Christian D.; Arnold, James E. (Technical Monitor)

    2002-01-01

    Considerable uncertainty surrounds the issue of whether precipitation over the tropical oceans (30 deg N/S) systematically changes with interannual sea-surface temperature (SST) anomalies that accompany El Nino (warm) and La Nina (cold) events. Time series of rainfall estimates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) over the tropical oceans show marked differences with estimates from two TRMM Microwave Imager (TMI) passive microwave algorithms. We show that path-integrated attenuation derived from the effects of precipitation on the radar return from the ocean surface exhibits interannual variability that agrees closely with the TMI time series. Further analysis of the frequency distribution of PR (2A25 product) rain rates suggests that the algorithm incorporates the attenuation measurement in a very conservative fashion so as to optimize the instantaneous rain rates. Such an optimization appears to come at the expense of monitoring interannual climate variability.

  15. Radiotherapy Dose Fractionation under Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    Davison, Matt; Kim, Daero; Keller, Harald

    2011-11-01

    In radiotherapy, radiation is directed to damage a tumor while avoiding the surrounding healthy tissue. Tradeoffs ensue because the dose cannot be exactly shaped to the tumor. It is particularly important to ensure that sensitive biological structures near the tumor are not damaged more than a certain amount. Biological tissue is known to have a nonlinear response to incident radiation. The linear quadratic dose response model, which requires the specification of two clinically and experimentally observed response coefficients, is commonly used to model this effect. This model yields an optimization problem giving two different types of optimal dose sequences (fractionation schedules). Which fractionation schedule is preferred depends on the response coefficients. These coefficients are not known with certainty and may differ from patient to patient. Because of this, not only the expected outcomes but also the uncertainty around these outcomes is important, and it might not be prudent to select the strategy with the best expected outcome.
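The tradeoff can be made concrete with the standard biologically effective dose (BED) form of the linear quadratic model, BED = n·d·(1 + d/(α/β)). The schedules and α/β values below are illustrative, and show how the preferred fractionation flips with the uncertain response coefficients:

```python
# Biologically effective dose under the linear-quadratic model.
# Schedules and alpha/beta ratios are illustrative, not clinical guidance.
def bed(n_fractions, dose_per_fraction, alpha_beta):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

schedules = {"conventional (30 x 2 Gy)":   (30, 2.0),
             "hypofractionated (5 x 7 Gy)": (5, 7.0)}

# For a low alpha/beta the large-fraction schedule is more effective;
# for a high alpha/beta the ranking reverses.
for ab in (3.0, 10.0):
    print(f"alpha/beta = {ab} Gy:")
    for name, (n, d) in schedules.items():
        print(f"  {name}: BED = {bed(n, d, ab):.1f} Gy")
```

Because the ranking of schedules depends on an uncertain coefficient, the expected BED alone does not determine the prudent choice, which is the abstract's point.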

  16. A Framework for Modeling Emerging Diseases to Inform Management.

    PubMed

    Russell, Robin E; Katz, Rachel A; Richgels, Katherine L D; Walsh, Daniel P; Grant, Evan H C

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  17. Mercury study report to Congress. Volume 5. Health effects of mercury and mercury compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassett-Sipple, B.; Swartout, J.; Schoeny, R.

    1997-12-01

    This volume summarizes the available information on human health effects and animal data for hazard identification and dose-response assessment for three forms of mercury: elemental mercury, mercury chloride (inorganic mercury), and methylmercury (organic mercury). Effects are summarized by endpoint. The risk assessment evaluates carcinogenicity, mutagenicity, developmental toxicity and general systemic toxicity of these chemical species of mercury. Toxicokinetics (absorption, distribution, metabolism and excretion) are described for each of the three mercury species. Reference doses are calculated for inorganic mercury and methylmercury; a reference concentration for inhaled elemental mercury is provided. A quantitative analysis of factors contributing to variability and uncertainty in the methylmercury RfD is provided in an appendix. Interactions and sensitive populations are described. The draft volume assesses ongoing research and research needs to reduce uncertainty surrounding adverse human health consequences of methylmercury exposure.

  18. Negotiating parental accountability in the face of uncertainty for attention-deficit hyperactivity disorder.

    PubMed

    Gray Brunton, Carol; McVittie, Chris; Ellison, Marion; Willock, Joyce

    2014-02-01

    Despite extensive research into attention-deficit hyperactivity disorder (ADHD), parents' constructions of their children's behaviors have received limited attention. This is particularly true outside North American contexts, where ADHD is less established historically. Our research demonstrates how United Kingdom parents made sense of ADHD and their own identities postdiagnosis. Using discourse analysis from interviews with 12 parents, we show that they drew from biological and social environmental repertoires when talking about their child's condition, paralleling repertoires found circulating in the United Kingdom media. However, in the context of parental narratives, both these repertoires were difficult for parents to support and involved problematic subject positions for parental accountability in the child's behavior. In this article we focus on the strategies parents used to negotiate these troublesome identities and construct accounts of moral and legitimate parenting in a context in which uncertainties surrounding ADHD existed and parenting was scrutinized.

  19. A framework for modeling emerging diseases to inform management

    USGS Publications Warehouse

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L. D.; Walsh, Daniel P.; Grant, Evan H. Campbell

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  20. Language of Uncertainty: the Expression of Decisional Conflict Related to Skin Cancer Prevention Recommendations.

    PubMed

    Strekalova, Yulia A; James, Vaughan S

    2017-09-01

    User-generated information on the Internet provides opportunities for the monitoring of health information consumer attitudes. For example, information about cancer prevention may cause decisional conflict. Yet posts and conversations shared by health information consumers online are often not readily actionable for interpretation and decision-making due to their unstandardized format. This study extends prior research on the use of natural language as a predictor of consumer attitudes and provides a link to decision-making by evaluating the predictive role of uncertainty indicators expressed in natural language. Analyzed data included free-text comments and structured scale responses related to information about skin cancer prevention options. The study identified natural language indicators of uncertainty and showed that they can serve as predictors of decisional conflict. The natural language indicators of uncertainty reported here can facilitate the monitoring of health consumer perceptions about cancer prevention recommendations and inform education and communication campaign planning and evaluation.

  1. Uncertainty of large-area estimates of indicators of forest structural gamma diversity: A study based on national forest inventory data

    Treesearch

    Susanne Winter; Andreas Böck; Ronald E. McRoberts

    2012-01-01

    Tree diameter and height are commonly measured forest structural variables, and indicators based on them are candidates for assessing forest diversity. We conducted our study on the uncertainty of estimates for mostly large geographic scales for four indicators of forest structural gamma diversity: mean tree diameter, mean tree height, and standard deviations of tree...

  2. A synthesis of carbon dioxide emissions from fossil-fuel combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andres, Robert Joseph; Boden, Thomas A; Breon, F.-M.

    2012-01-01

    This synthesis discusses the emissions of carbon dioxide from fossil-fuel combustion and cement production. While much is known about these emissions, there is still much that is unknown about the details surrounding them. This synthesis explores our knowledge of these emissions in terms of why there is concern about them; how they are calculated; the major global efforts on inventorying them; their global, regional, and national totals at different spatial and temporal scales; how they are distributed on global grids (i.e. maps); how they are transported in models; and the uncertainties associated with these different aspects of the emissions. The magnitude of emissions from the combustion of fossil fuels has been almost continuously increasing with time since fossil fuels were first used by humans. Despite events in some nations specifically designed to reduce emissions, or which have had emissions reduction as a byproduct of other events, global total emissions continue their general increase with time. Global total fossil-fuel carbon dioxide emissions are known to within 10% uncertainty (95% confidence interval). Uncertainty on individual national total fossil-fuel carbon dioxide emissions ranges from a few percent to more than 50%. The information discussed in this manuscript synthesizes global, regional and national fossil-fuel carbon dioxide emissions, their distributions, their transport, and the associated uncertainties.

  3. Experimental research on the structural instability mechanism and the effect of multi-echelon support of deep roadways in a kilometre-deep well

    PubMed Central

    Peng, Rui; Zhao, Guangming; Li, Yingming; Zhu, Jianming

    2018-01-01

    We study the structural instability mechanism and the effect of multi-echelon support in very deep roadways. We conduct a scale model test for analysing the structural failure mechanism and the effect of multi-echelon support of roadways under high horizontal stress. Mechanical bearing structures are classified according to their secondary stress distribution and the strength degradation of the surrounding rock after roadway excavation. A new method is proposed by partitioning the mechanical bearing structure of the surrounding rock into weak, key and main coupling bearing strata. In the surrounding rock, the main bearing stratum is the plastic reshaping and flowing area, the weak bearing stratum is the peeling layer or the caving part, and the key bearing stratum is the shearing and yielding area. The structural fracture mechanism of roadways is considered in analysing the bearing structure instability of the surrounding rock, and multi-echelon support that considers the structural characteristics of roadway bearings is proposed. Results of the experimental study indicate that horizontal pressure seriously influences the stability of the surrounding rock, as indicated by extension of the weak bearing area and the transfer of the main and key bearing zones. The falling roof, rib spalling, and floor heave indicate the decline of the bearing capacity of the surrounding rock, thereby causing roadway structural instability. Multi-echelon support is proposed according to the mechanical bearing structure of the surrounding rock without support. The redesigned support can reduce the scope of the weak bearing area and limit the transfer of the main and key bearing areas. Consequently, kilometre-deep roadway disasters, such as wedge roof caving, floor heave, and rib spalling, can be avoided to a certain degree, and plastic flow in the surrounding rock is relieved. The adverse effect of horizontal stress on the vault, spandrel and arch foot decreases. The stability of the soft rock surrounding the roadways is maintained. PMID:29447180

  4. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. 
In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.

  5. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and life loss, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information for the evaluation of the hazards, the value of the exposed elements (e.g., residential and industrial area, population, lifelines, sensitive elements such as schools and hospitals) and the process-specific vulnerability, and due to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, as for instance for the number of buildings of various typologies, and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., depth of water during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty in the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in all three cases around 30% of the expected value. Each of the models, nevertheless, requires different assumptions and computational efforts, and provides results with a different level of detail.
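The FOSM and Monte Carlo propagation approaches compared above can be sketched on a toy multiplicative loss model. The probabilities, exposed value, vulnerability, and normal input distributions are hypothetical, not the Brescia case study's:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy risk model: expected loss = hazard probability * exposed value * vulnerability.
def loss(p, v, w):
    return p * v * w

mu = np.array([0.01, 5e6, 0.4])    # means of p, v, w (hypothetical)
sd = np.array([0.003, 1e6, 0.1])   # input uncertainties (hypothetical)

# FOSM: first-order Taylor expansion of the loss around the mean inputs.
grad = np.array([mu[1] * mu[2], mu[0] * mu[2], mu[0] * mu[1]])
fosm_mean = loss(*mu)
fosm_sd = np.sqrt(np.sum((grad * sd) ** 2))

# Monte Carlo with independent normal inputs (a simplification).
samples = rng.normal(mu, sd, size=(100_000, 3))
mc = loss(*samples.T)

print(f"FOSM: {fosm_mean:.0f} +/- {fosm_sd:.0f}")
print(f"MC:   {mc.mean():.0f} +/- {mc.std():.0f}")
```

As in the abstract, the two estimates of the expected loss agree closely, while the cheaper FOSM slightly understates the spread because it drops higher-order terms.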

  6. Electrophysiological indices of surround suppression in humans

    PubMed Central

    Vanegas, M. Isabel; Blangero, Annabelle

    2014-01-01

    Surround suppression is a well-known example of contextual interaction in visual cortical neurophysiology, whereby the neural response to a stimulus presented within a neuron's classical receptive field is suppressed by surrounding stimuli. Human psychophysical reports present an obvious analog to the effects seen at the single-neuron level: stimuli are perceived as lower-contrast when embedded in a surround. Here we report on a visual paradigm that provides relatively direct, straightforward indices of surround suppression in human electrophysiology, enabling us to reproduce several well-known neurophysiological and psychophysical effects, and to conduct new analyses of temporal trends and retinal location effects. Steady-state visual evoked potentials (SSVEP) elicited by flickering “foreground” stimuli were measured in the context of various static surround patterns. Early visual cortex geometry and retinotopic organization were exploited to enhance SSVEP amplitude. The foreground response was strongly suppressed as a monotonic function of surround contrast. Furthermore, suppression was stronger for surrounds of matching orientation than orthogonally-oriented ones, and stronger at peripheral than foveal locations. These patterns were reproduced in psychophysical reports of perceived contrast, and peripheral electrophysiological suppression effects correlated with psychophysical effects across subjects. Temporal analysis of SSVEP amplitude revealed short-term contrast adaptation effects that caused the foreground signal to either fall or grow over time, depending on the relative contrast of the surround, consistent with stronger adaptation of the suppressive drive. This electrophysiology paradigm has clinical potential in indexing not just visual deficits but possibly gain control deficits expressed more widely in the disordered brain. PMID:25411464

  7. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structure properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed from experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs on a basis of orthogonal stochastic polynomials to account for influences of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied for RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of any single random variable decreases while the coupling effect increases with increasing actuator delay.
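A minimal non-intrusive PCE sketch for a single uncertain parameter, using regression on probabilists' Hermite polynomials; the toy response model (a static 1/stiffness response with a standard-normal stiffness perturbation) is an assumption for illustration, not the RTHS model of the paper:

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(3)

# Toy model: response as a nonlinear function of a single standard-normal
# random stiffness perturbation xi (illustrative only).
def model(xi):
    k = 1.0 + 0.1 * xi          # uncertain stiffness
    return 1.0 / k              # static response ~ 1/k

# Non-intrusive PCE: regress sampled model outputs on probabilists'
# Hermite polynomials He_0..He_p evaluated at the sampled xi.
p = 4
xi = rng.standard_normal(200)
Psi = hermevander(xi, p)                 # design matrix, shape (200, p+1)
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

# Orthogonality of He_k under the Gaussian measure (E[He_k^2] = k!) gives
# the output mean and variance directly from the PCE coefficients.
mean = coef[0]
var = sum(coef[k] ** 2 * factorial(k) for k in range(1, p + 1))
print(f"PCE mean = {mean:.4f}, sd = {np.sqrt(var):.4f}")
```

Sobol indices follow the same logic in the multivariate case: each index is a partial sum of squared coefficients over the basis terms involving the corresponding variable, divided by the total variance.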

  8. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that decomposes the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. the cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
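
The cut-set calculation described above can be sketched generically: given minimal cut sets of independent basic events, the top-event probability follows by inclusion-exclusion. The event names and probabilities below are invented for illustration; they are not the case study's values.

```python
from itertools import combinations

def cutset_prob(cut_sets, p):
    """Exact top-event probability via inclusion-exclusion over minimal
    cut sets (each a set of basic-event names) of independent events
    with probabilities given in the dict p."""
    total = 0.0
    n = len(cut_sets)
    for r in range(1, n + 1):
        for combo in combinations(range(n), r):
            # The intersection of cut sets occurs iff every event in
            # their union occurs; independence gives a simple product.
            events = set().union(*(cut_sets[k] for k in combo))
            prob = 1.0
            for e in events:
                prob *= p[e]
            total += (-1) ** (r + 1) * prob
    return total
```

For example, with cut sets {cover} and {clay, liner} and probabilities 0.1, 0.2 and 0.5, the top-event probability is 0.1 + 0.1 - 0.01 = 0.19.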

  9. Modeling the value for money of changing clinical practice: a stochastic application in diabetes care.

    PubMed

    Hoomans, Ties; Abrams, Keith R; Ament, Andre J H A; Evers, Silvia M A A; Severens, Johan L

    2009-10-01

    Decision making about resource allocation for guideline implementation to change clinical practice is inevitably undertaken in a context of uncertainty surrounding the cost-effectiveness of both clinical guidelines and implementation strategies. Adopting a total net benefit approach, a model was recently developed to overcome problems with the use of combined ratio statistics when analyzing decision uncertainty. To demonstrate the stochastic application of the model for informing decision making about the adoption of an audit and feedback strategy for implementing a guideline recommending intensive blood glucose control in type 2 diabetes in primary care in the Netherlands. An integrated Bayesian approach to decision modeling and evidence synthesis is adopted, using Markov chain Monte Carlo simulation in WinBUGS. Data on model parameters are gathered from various sources, with the effectiveness of implementation being estimated using pooled, random-effects meta-analysis. Decision uncertainty is illustrated using cost-effectiveness acceptability curves and the cost-effectiveness acceptability frontier. Decisions about whether to adopt intensified glycemic control and whether to adopt audit and feedback vary with the maximum value that decision makers are willing to pay for health gain. By simultaneously incorporating uncertain economic evidence on both the guideline and the implementation strategy, the cost-effectiveness acceptability curves and frontier show an increase in decision uncertainty concerning guideline implementation. The stochastic application in diabetes care demonstrates that the model provides a simple and useful tool for quantifying and exploring the (combined) uncertainty associated with decision making about adopting guidelines and implementation strategies and, therefore, for informing decisions about efficient resource allocation to change clinical practice.
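
A cost-effectiveness acceptability curve of the kind used here can be computed directly from probabilistic-sensitivity-analysis draws. This minimal sketch (not the paper's WinBUGS model) counts, for each willingness-to-pay value, the fraction of simulated incremental cost/effect pairs with positive net monetary benefit.

```python
def ceac(delta_costs, delta_effects, lambdas):
    """Cost-effectiveness acceptability curve: for each willingness-to-pay
    value lam, the proportion of simulation draws for which the net
    monetary benefit lam * dE - dC is positive."""
    curve = []
    for lam in lambdas:
        n_pos = sum(1 for dc, de in zip(delta_costs, delta_effects)
                    if lam * de - dc > 0)
        curve.append(n_pos / len(delta_costs))
    return curve
```

With two toy draws (incremental costs 100 and 300, each gaining 1 unit of effect), the probability of cost-effectiveness rises from 0 to 0.5 to 1 as willingness to pay moves from 50 to 200 to 400.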

  10. Uncertainty in Citizen Science observations: from measurement to user perception

    NASA Astrophysics Data System (ADS)

    Lahoz, William; Schneider, Philipp; Castell, Nuria

    2016-04-01

    Citizen Science concerns the engagement of the general public in scientific research, with citizens actively contributing to science through their intellectual effort, their local knowledge, or their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits from Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories provide a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data start to have demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations present many challenges, including the representation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and the assessment, representation and visualization of uncertainty. Among these challenges, the assessment and representation of uncertainty and its communication to users is fundamental, as it provides qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in the assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.

  11. Parent Preferences for Shared Decision-making in Acute Versus Chronic Illness.

    PubMed

    Tom, Dina M; Aquino, Christian; Arredondo, Anthony R; Foster, Byron A

    2017-10-01

    The goal of this study was to examine preferences for shared decision-making (SDM) in parents of acutely ill versus chronically ill children in the inpatient setting. Additionally, we explored the effect of parental perception of illness severity and uncertainty in illness on decision-making preference. In this cross-sectional study, we surveyed parents of children admitted to pediatric inpatient units at an academic, tertiary-care hospital. Surveys were administered in person and used validated tools to assess SDM preferences and uncertainty in illness. Descriptive statistics evaluated associations stratified by acute versus chronic illness, and multivariable analyses were performed. Of the 200 parents who participated, the majority were women (78%), Hispanic (81.5%), English speaking (73%), between 30 and 39 years old (37.5%), and educated below the level of a college degree (77%). The mean age of hospitalized children was 8.1 years, and half reported a chronic illness. Most parents preferred an active (43%) or collaborative (40%) role in SDM. There was no association with SDM preference by demographics, number of previous hospitalizations, perception of illness severity, or uncertainty. However, parents of chronically ill children significantly preferred a passive role in SDM when they perceived a high level of uncertainty in illness. Most parents of hospitalized children prefer to take an active or collaborative role in SDM. However, parents of chronically ill children who perceive high levels of uncertainty surrounding their children's illness prefer a passive role, thus illustrating the complexity in decision-making among this parent population. Copyright © 2017 by the American Academy of Pediatrics.

  12. Applications of explicitly-incorporated/post-processing measurement uncertainty in watershed modeling

    USDA-ARS?s Scientific Manuscript database

    The importance of measurement uncertainty in terms of calculation of model evaluation error statistics has been recently stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...

  13. Eddy Current for Sizing Cracks in Canisters for Dry Storage of Used Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Ryan M.; Jones, Anthony M.; Pardini, Allan F.

    2014-01-01

    The storage of used nuclear fuel (UNF) in dry canister storage systems (DCSSs) at Independent Spent Fuel Storage Installation (ISFSI) sites is a temporary measure to accommodate UNF inventory until it can be reprocessed or transferred to a repository for permanent disposal. Policy uncertainty surrounding the long-term management of UNF indicates that DCSSs will need to store UNF for much longer periods than originally envisioned. Meanwhile, the structural and leak-tight integrity of DCSSs must not be compromised. The eddy current technique is presented as a potential tool for inspecting the outer surfaces of DCSS canisters for degradation, particularly atmospheric stress corrosion cracking (SCC). Results are presented that demonstrate that eddy current can detect flaws that cannot be detected reliably using standard visual techniques. In addition, simulations are performed to explore the best parameters of a pancake coil probe for sizing SCC flaws in DCSS canisters and to identify features in frequency sweep curves that may be useful for facilitating accurate depth sizing of atmospheric SCC flaws from eddy current measurements.

  14. Eyes on crowding: crowding is preserved when responding by eye and similarly affects identity and position accuracy.

    PubMed

    Yildirim, Funda; Meyer, Vincent; Cornelissen, Frans W

    2015-02-16

    Peripheral vision guides recognition and selection of targets for eye movements. Crowding—a decline in recognition performance that occurs when a potential target is surrounded by other, similar, objects—influences peripheral object recognition. A recent model study suggests that crowding may be due to increased uncertainty about both the identity and the location of peripheral target objects, but very few studies have assessed these properties in tandem. Eye tracking can integrally provide information on both the perceived identity and the position of a target and therefore could become an important approach in crowding studies. However, recent reports suggest that around the moment of saccade preparation crowding may be significantly modified. If these effects were to generalize to regular crowding tasks, it would complicate the interpretation of results obtained with eye tracking and the comparison to results obtained using manual responses. For this reason, we first assessed whether the manner by which participants responded—manually or by eye—affected their performance. We found that neither recognition performance nor response time was affected by the response type. Hence, we conclude that crowding magnitude was preserved when observers responded by eye. In our main experiment, observers made eye movements to the location of a tilted Gabor target while we varied flanker tilt to manipulate target-flanker similarity. The results indicate that this similarly affected the accuracy of peripheral recognition and saccadic target localization. Our results inform about the importance of both location and identity uncertainty in crowding. © 2015 ARVO.

  15. Assessing the Problem Formulation in an Integrated Assessment Model: Implications for Climate Policy Decision-Support

    NASA Astrophysics Data System (ADS)

    Garner, G. G.; Reed, P. M.; Keller, K.

    2014-12-01

    Integrated assessment models (IAMs) are often used with the intent to aid in climate change decisionmaking. Numerous studies have analyzed the effects of parametric and/or structural uncertainties in IAMs, but uncertainties regarding the problem formulation are often overlooked. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the problem formulation. The standard DICE model adopts a single objective to maximize a weighted sum of utilities of per-capita consumption. Decisionmakers, however, may be concerned with a broader range of values and preferences that are not captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing both (ii) the costs of abatement and (iii) the damages due to climate change. We derive a set of Pareto-optimal solutions over which decisionmakers can trade-off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multiobjective formulation to provide decision support.
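
The Pareto-optimal set described above can be extracted from any finite pool of candidate solutions with a simple dominance filter. This generic sketch (not the DICE-based machinery used in the study) assumes every objective is to be minimized; solutions are tuples of objective values.

```python
def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective tuples,
    assuming all objectives are minimized."""
    def dominates(a, b):
        # a dominates b if it is no worse everywhere and better somewhere
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

Decision makers can then trade off a posteriori across the retained solutions instead of committing to a single a priori utility function.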

  16. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses

    PubMed Central

    Wong, Tony E.; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
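
The effect of confronting a prior with observations can be illustrated with a deliberately simple conjugate normal-normal update. The study itself uses probabilistic inversion and a full Bayesian calibration; this sketch only shows, under assumed Gaussian forms, why the additional observational constraints yield tighter distributions.

```python
def normal_update(prior_mean, prior_sd, obs, obs_sd):
    """Conjugate normal-normal update: combine a Gaussian prior (e.g. one
    inferred from expert assessments) with independent observations of
    known measurement error sd. Returns the posterior mean and sd."""
    prec = 1.0 / prior_sd ** 2          # prior precision
    mean_num = prior_mean * prec
    for y in obs:
        w = 1.0 / obs_sd ** 2           # each datum adds its precision
        prec += w
        mean_num += y * w
    return mean_num / prec, prec ** -0.5
```

The posterior standard deviation is always smaller than the prior's, mirroring the tighter hindcasts and projections reported above.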

  17. A diversity index for model space selection in the estimation of benchmark and infectious doses via model averaging.

    PubMed

    Kim, Steven B; Kodell, Ralph L; Moon, Hojin

    2014-03-01

    In chemical and microbial risk assessments, risk assessors fit dose-response models to high-dose data and extrapolate downward to risk levels in the range of 1-10%. Although multiple dose-response models may be able to fit the data adequately in the experimental range, the estimated effective dose (ED) corresponding to an extremely small risk can be substantially different from model to model. In this respect, model averaging (MA) provides more robustness than a single dose-response model in the point and interval estimation of an ED. In MA, accounting for both data uncertainty and model uncertainty is crucial, but addressing model uncertainty is not achieved simply by increasing the number of models in a model space. A plausible set of models for MA can be characterized by goodness of fit and diversity surrounding the truth. We propose a diversity index (DI) to balance between these two characteristics in model space selection. It addresses a collective property of a model space rather than individual performance of each model. Tuning parameters in the DI control the size of the model space for MA. © 2013 Society for Risk Analysis.
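
One common weighting scheme for MA (a generic illustration, not the diversity index proposed here) uses Akaike weights. The sketch below weights per-model effective-dose estimates by relative AIC support; the ED and AIC values in the test are invented.

```python
import math

def akaike_weights(aics):
    """Akaike weights: normalized relative likelihoods exp(-dAIC/2),
    usable as model-averaging weights over a model space."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

def averaged_ed(eds, aics):
    # Model-averaged effective dose: weight each model's ED estimate
    w = akaike_weights(aics)
    return sum(wi * ed for wi, ed in zip(w, eds))
```

Note the point made in the abstract: adding many near-duplicate models changes these weights but not the diversity of the space, which is why a collective property such as the proposed DI matters.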

  18. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide the uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
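
The PPD formula from ISO 7730, and a first-order (GUM-style) propagation of a standard uncertainty in PMV through it, can be sketched as follows; the function names are assumptions.

```python
import math

def ppd(pmv):
    # ISO 7730 relation between PMV and predicted percentage dissatisfied
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)

def ppd_uncertainty(pmv, u_pmv):
    """First-order propagation per the GUM: u(PPD) = |dPPD/dPMV| * u(PMV),
    differentiating the ISO 7730 formula analytically."""
    expo = math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)
    dppd = 95.0 * expo * (4 * 0.03353 * pmv ** 3 + 2 * 0.2179 * pmv)
    return abs(dppd) * u_pmv
```

Note that at PMV = 0 (thermal neutrality) PPD is 5% and its first-order sensitivity to PMV vanishes, so a small uncertainty in PMV barely moves PPD there, whereas away from neutrality the propagated uncertainty grows quickly.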

  19. Impact of uncertainty in soil, climatic, and chemical information in a pesticide leaching assessment

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Green, Richard E.; Giambelluca, Thomas W.; Liang, Tony C.; Yost, Russell S.

    1990-01-01

    A simple mobility index, when combined with a geographic information system, can be used to generate rating maps which indicate qualitatively the potential for various organic chemicals to leach to groundwater. In this paper we investigate the magnitude of uncertainty associated with pesticide mobility estimates as a result of data uncertainties. Our example is for the Pearl Harbor Basin, Oahu, Hawaii. The two pesticides included in our analysis are atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and diuron [3-(3,4-dichlorophenyl)-1,1-dimethylurea]. The mobility index used here is known as the Attenuation Factor (AF); it requires soil, hydrogeologic, climatic, and chemical information as input data. We employ first-order uncertainty analysis to characterize the uncertainty in estimates of AF resulting from uncertainties in the various input data. Soils in the Pearl Harbor Basin are delineated at the order taxonomic category for this study. Our results show that there can be a significant amount of uncertainty in estimates of pesticide mobility for the Pearl Harbor Basin. This information needs to be considered if future decisions concerning chemical regulation are to be based on estimates of pesticide mobility determined from simple indices.
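
A common formulation of the Attenuation Factor, and a first-order treatment of its uncertainty, can be sketched as below. The exact form and parameter values used in the paper may differ, and the inputs in the example are purely illustrative: d is depth to groundwater, rf the retardation factor, theta_fc the field-capacity water content, q the net recharge rate, and t_half the degradation half-life.

```python
import math

def attenuation_factor(d, rf, theta_fc, q, t_half):
    """AF-style mobility index: fraction of applied pesticide predicted to
    leach past depth d (values near 1 = mobile, near 0 = attenuated)."""
    travel = d * rf * theta_fc / q            # effective travel time
    return math.exp(-0.693 * travel / t_half)

def af_first_order_u(d, rf, theta_fc, q, t_half, cvs):
    """First-order uncertainty in ln(AF): since ln(AF) is a scaled
    product/quotient of the inputs, the inputs' relative uncertainties
    (CVs) combine in quadrature."""
    ln_af = -0.693 * d * rf * theta_fc / (q * t_half)
    return abs(ln_af) * math.sqrt(sum(cv ** 2 for cv in cvs))
```

When the effective travel time equals the half-life, AF is 0.5 by construction, and a 10% relative uncertainty in a single input propagates to an uncertainty of about 0.07 in ln(AF) at that point.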

  20. Erratum to "Impact of uncertainty in soil, climatic, and chemical information in a pesticide leaching assessment"

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Green, Richard E.; Giambelluca, Thomas W.; Liang, Tony C.; Yost, Russell S.

    2016-11-01

    A simple mobility index, when combined with a geographic information system, can be used to generate rating maps which indicate qualitatively the potential for various organic chemicals to leach to groundwater. In this paper we investigate the magnitude of uncertainty associated with pesticide mobility estimates as a result of data uncertainties. Our example is for the Pearl Harbor Basin, Oahu, Hawaii. The two pesticides included in our analysis are atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and diuron [3-(3,4-dichlorophenyl)-1,1-dimethylurea]. The mobility index used here is known as the Attenuation Factor (AF); it requires soil, hydrogeologic, climatic, and chemical information as input data. We employ first-order uncertainty analysis to characterize the uncertainty in estimates of AF resulting from uncertainties in the various input data. Soils in the Pearl Harbor Basin are delineated at the order taxonomic category for this study. Our results show that there can be a significant amount of uncertainty in estimates of pesticide mobility for the Pearl Harbor Basin. This information needs to be considered if future decisions concerning chemical regulation are to be based on estimates of pesticide mobility determined from simple indices.

  1. [Application of robustness test for assessment of the measurement uncertainty at the end of development phase of a chromatographic method for quantification of water-soluble vitamins].

    PubMed

    Ihssane, B; Bouchafra, H; El Karbane, M; Azougagh, M; Saffaj, T

    2016-05-01

    In this work we propose an efficient way to evaluate the measurement uncertainty at the end of the development step of an analytical method, since this assessment provides an indication of the performance of the optimization process. The uncertainty is estimated through a robustness test applying a Plackett-Burman design, investigating six parameters that influence the simultaneous chromatographic assay of five water-soluble vitamins. The estimated effect of varying each parameter is translated into a standard uncertainty value at each concentration level. The values obtained for the relative uncertainty do not exceed the acceptance limit of 5%, showing that the method development was well conducted. In addition, a statistical comparison of the standard uncertainties after the development stage with those of the validation step indicates that the estimated uncertainties are equivalent. The results clearly show the performance and capacity of the chromatographic method to simultaneously assay the five vitamins and its suitability for routine application. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
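
Main effects from a Plackett-Burman design, and one simple convention (an assumption here, not necessarily the authors') for turning them into a standard uncertainty, can be sketched as:

```python
import math

def pb_effects(design, responses):
    """Plackett-Burman main effects: for each factor, the difference
    between the mean response at the high (+1) and low (-1) levels.
    design is a list of rows of +1/-1 codes, one row per run."""
    n_factors = len(design[0])
    effects = []
    for j in range(n_factors):
        hi = [y for row, y in zip(design, responses) if row[j] == +1]
        lo = [y for row, y in zip(design, responses) if row[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

def robustness_u(effects):
    # One convention: treat half of each effect as a rectangular
    # half-width (divide by sqrt(3)) and combine in quadrature.
    return math.sqrt(sum((e / 2.0) ** 2 / 3.0 for e in effects))
```

With an orthogonal 4-run design, a factor whose high-level runs average two units above its low-level runs contributes an effect of 2, and inactive factors contribute zero.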

  2. Optimizing integrated airport surface and terminal airspace operations under uncertainty

    NASA Astrophysics Data System (ADS)

    Bosson, Christabelle S.

    In airports and the surrounding terminal airspace, the integration of surface, arrival, and departure scheduling and routing has the potential to improve operational efficiency. Moreover, because both the airport surface and the terminal airspace are often altered by random perturbations, the consideration of uncertainty in flight schedules is crucial to the design of robust flight schedules. Previous research mainly focused on independently solving arrival scheduling problems, departure scheduling problems, and surface management scheduling problems, and most of the developed models are deterministic. This dissertation presents an alternative method of modeling the integrated operations by using a machine job-shop scheduling formulation. A multistage stochastic programming approach is chosen to formulate the problem in the presence of uncertainty, and candidate solutions are obtained by solving sample average approximation problems with finite sample size. The developed mixed-integer linear programming algorithm-based scheduler is capable of computing optimal aircraft schedules and routings that reflect the integration of air and ground operations. The assembled methodology is applied to a Los Angeles case study. To show the benefits of integrated operations over First-Come-First-Served, a preliminary proof of concept is conducted for a set of fourteen aircraft evolving under deterministic conditions in a model of the Los Angeles International Airport surface and surrounding terminal areas. Using historical data, a representative 30-minute traffic schedule and aircraft mix scenario is constructed. The results of the Los Angeles application show that the integration of air and ground operations and the use of a time-based separation strategy enable both significant surface and air time savings. The solution computed by the optimization provides more efficient routing and scheduling than the First-Come-First-Served solution.
Additionally, a data-driven analysis is performed for the Los Angeles environment, and probabilistic distributions of pertinent uncertainty sources are obtained. A sensitivity analysis is then carried out to assess the performance of the methodology and find optimal sampling parameters. Finally, simulations of increasing traffic density in the presence of uncertainty are conducted, first for integrated arrivals and departures, then for integrated surface and air operations. To compare the optimization results and show the benefits of integrated operations, two aircraft separation methods are implemented that offer different routing options. The simulations of integrated air operations and of integrated air and surface operations demonstrate that significant travel time savings, in both total and individual surface and air times, can be obtained when more direct routes are allowed to be flown, even in the presence of uncertainty. However, the resulting routings induce extra takeoff delay for departing flights. As a consequence, some flights cannot meet their initially assigned runway slot, which causes runway position shifts between the runway sequences computed under deterministic and stochastic conditions. The optimization is able to compute an optimal runway schedule that represents an optimal balance between total schedule delays and total travel times.

  3. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. 
(Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)
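
The info-gap robustness underlying this analysis — the largest uncertainty horizon at which the worst-case damage indicator still satisfies a critical threshold — can be sketched with a bisection search. Here `worst_case` is a placeholder for whatever propagation of the uncertainty space through the time-series algorithm yields the worst damage indicator at horizon alpha; it is assumed nondecreasing in alpha.

```python
def robustness(worst_case, critical, alpha_max=10.0, tol=1e-6):
    """Info-gap robustness: the largest uncertainty horizon alpha for
    which worst_case(alpha) <= critical, found by bisection, assuming
    worst_case is nondecreasing in alpha."""
    if worst_case(0.0) > critical:
        return 0.0                      # requirement fails even with no uncertainty
    if worst_case(alpha_max) <= critical:
        return alpha_max                # requirement holds over the whole search range
    lo, hi = 0.0, alpha_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if worst_case(mid) <= critical:
            lo = mid
        else:
            hi = mid
    return lo
```

A larger robustness value means the damage diagnosis tolerates more environmental variability before the indicator crosses its critical bound.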

  4. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    PubMed

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  5. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread among the various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed relationship between ensemble spread and forecast accuracy. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
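
The spread-based uncertainty estimate can be sketched in a few lines; the crude along-track translation at the end is an assumption for illustration, not the paper's trajectory model.

```python
import math

def ensemble_spread(forecasts):
    """Spread (sample standard deviation) among time-lagged ensemble
    members valid at the same time, used as a wind uncertainty estimate."""
    n = len(forecasts)
    mean = sum(forecasts) / n
    return math.sqrt(sum((f - mean) ** 2 for f in forecasts) / (n - 1))

def along_track_u(wind_spread, flight_hours):
    # Crude illustrative translation: along-track position uncertainty
    # grows in proportion to exposure time to the uncertain wind
    return wind_spread * flight_hours
```

For three lagged forecasts of 10, 12 and 14 kt valid at the same time, the spread is 2 kt; over a 1.5 h exposure that translates (under this crude model) to about 3 nmi of along-track uncertainty.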

  6. Uncertainty evaluation of dead zone of diagnostic ultrasound equipment

    NASA Astrophysics Data System (ADS)

    Souza, R. M.; Alvarenga, A. V.; Braz, D. S.; Petrella, L. I.; Costa-Felix, R. P. B.

    2016-07-01

    This paper presents a model for evaluating the measurement uncertainty of a feature used in the assessment of ultrasound images: the dead zone. The dead zone was measured by two technicians of INMETRO's Laboratory of Ultrasound using a phantom and following the standard IEC/TS 61390. The uncertainty model was proposed based on the Guide to the Expression of Uncertainty in Measurement. For the tested equipment, results indicate a dead zone of 1.01 mm and, based on the proposed model, an expanded uncertainty of 0.17 mm. The proposed uncertainty model provides a novel approach to the metrological evaluation of diagnostic ultrasound imaging.
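
The GUM-style combination that such a model rests on can be sketched generically; the component values in the example are illustrative, not the paper's actual uncertainty budget.

```python
import math

def combined_uncertainty(u_components):
    # GUM: combine independent standard uncertainty components in quadrature
    return math.sqrt(sum(u ** 2 for u in u_components))

def expanded_uncertainty(u_c, k=2.0):
    # Expanded uncertainty for ~95 % coverage with coverage factor k
    return k * u_c
```

For instance, independent components of 0.03 mm and 0.04 mm combine to a standard uncertainty of 0.05 mm and, with k = 2, an expanded uncertainty of 0.10 mm.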

  7. Deriving persistence indicators from regulatory water-sediment studies – opportunities and limitations in OECD 308 data.

    PubMed

    Honti, Mark; Fenner, Kathrin

    2015-05-19

    The OECD guideline 308 describes a laboratory test method to assess aerobic and anaerobic transformation of organic chemicals in aquatic sediment systems and is an integral part of tiered testing strategies in different legislative frameworks for the environmental risk assessment of chemicals. The results from experiments carried out according to OECD 308 are generally used to derive persistence indicators for hazard assessment or half-lives for exposure assessment. We used Bayesian parameter estimation and system representations of various complexities to systematically assess opportunities and limitations for estimating these indicators from existing data generated according to OECD 308 for 23 pesticides and pharmaceuticals. We found that there is a disparity between the uncertainty and the conceptual robustness of persistence indicators. Disappearance half-lives are directly extractable with limited uncertainty, but they lump degradation and phase transfer information and are not robust against changes in system geometry. Transformation half-lives are less system-specific but require inverse modeling to extract, resulting in considerable uncertainty. Available data were thus insufficient to derive indicators that had both acceptable robustness and uncertainty, which further supports previously voiced concerns about the usability and efficiency of these costly experiments. Despite the limitations of existing data, we suggest the time until 50% of the parent compound has been transformed in the entire system (DegT(50,system)) could still be a useful indicator of persistence in the upper, partially aerobic sediment layer in the context of PBT assessment. This should, however, be accompanied by a mandatory reporting or full standardization of the geometry of the experimental system. 
We recommend transformation half-lives determined by inverse modeling to be used as input parameters into fate models for exposure assessment, if due consideration is given to their uncertainty.
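    As a minimal illustration of how a transformation half-life is extracted from parent-decline data, a single-first-order fit is sketched below on synthetic data. This is a deliberate simplification: the study above uses Bayesian inverse modeling of multi-compartment water-sediment systems, which this toy fit does not capture.

```python
import math

def dt50_from_decline(times, fractions):
    """Single-first-order fit through the origin on log-transformed data:
    ln(C/C0) = -k*t by least squares, then DT50 = ln(2)/k."""
    num = sum(-t * math.log(f) for t, f in zip(times, fractions))
    den = sum(t * t for t in times)
    k = num / den
    return math.log(2.0) / k

# Synthetic parent-decline data (day, fraction remaining), true k = 0.1/day.
days = [0.0, 7.0, 14.0, 30.0]
frac = [1.0, math.exp(-0.7), math.exp(-1.4), math.exp(-3.0)]
dt50 = dt50_from_decline(days, frac)
```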

  8. An approach to forecasting health expenditures, with application to the U.S. Medicare system.

    PubMed

    Lee, Ronald; Miller, Timothy

    2002-10-01

    To quantify uncertainty in forecasts of health expenditures. Stochastic time series models are estimated for historical variations in fertility, mortality, and health spending per capita in the United States, and used to generate stochastic simulations of the growth of Medicare expenditures. Individual health spending is modeled to depend on the number of years until death. A simple accounting model is developed for forecasting health expenditures, using the U.S. Medicare system as an example. Medicare expenditures are projected to rise from 2.2 percent of GDP (gross domestic product) to about 8 percent of GDP by 2075. This increase is due in equal measure to increasing health spending per beneficiary and to population aging. The traditional projection method constructs high, medium, and low scenarios to assess uncertainty, an approach that has many problems. Using stochastic forecasting, we find a 95 percent probability that Medicare spending in 2075 will fall between 4 percent and 18 percent of GDP, indicating a wide band of uncertainty. Although there is substantial uncertainty about future mortality decline, it contributed little to uncertainty about future Medicare spending, since lower mortality both raises the number of elderly, tending to raise spending, and is associated with improved health of the elderly, tending to reduce spending. Uncertainty about fertility, by contrast, leads to great uncertainty about the future size of the labor force, and therefore adds importantly to uncertainty about the health-share of GDP. In the shorter term, the major source of uncertainty is health spending per capita. History is a valuable guide for quantifying our uncertainty about future health expenditures. The probabilistic model we present has several advantages over the high-low scenario approach to forecasting. It indicates great uncertainty about future Medicare expenditures relative to GDP.
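    The stochastic-forecasting idea (simulate many growth paths and read uncertainty bands off the percentiles, instead of constructing high/medium/low scenarios) can be sketched as below. The i.i.d. normal log-growth shocks and the parameter values are illustrative stand-ins for the paper's fitted time-series models.

```python
import numpy as np

def stochastic_share_projection(growth_mean, growth_sd, years,
                                n_paths=10000, start=0.022, seed=0):
    """Monte Carlo fan for a GDP share under i.i.d. normal log-growth
    shocks; percentiles of the simulated end states replace scenarios."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(growth_mean, growth_sd, size=(n_paths, years))
    final = start * np.exp(shocks.cumsum(axis=1))[:, -1]
    return np.percentile(final, [2.5, 50.0, 97.5])

# Hypothetical numbers: a share of 2.2% of GDP today, 73 years of growth.
lo, med, hi = stochastic_share_projection(0.018, 0.02, 73)
```

    The 2.5th and 97.5th percentiles form the 95% probability interval analogous to the 4-18 percent band reported above.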

  9. Mapping Uncertainty Due to Missing Data in the Global Ocean Health Index.

    PubMed

    Frazier, Melanie; Longo, Catherine; Halpern, Benjamin S

    2016-01-01

    Indicators are increasingly used to measure environmental systems; however, they are often criticized for failing to measure and describe uncertainty. Uncertainty is particularly difficult to evaluate and communicate in the case of composite indicators which aggregate many indicators of ecosystem condition. One of the ongoing goals of the Ocean Health Index (OHI) has been to improve our approach to dealing with missing data, which is a major source of uncertainty. Here we: (1) quantify the potential influence of gapfilled data on index scores from the 2015 global OHI assessment; (2) develop effective methods of tracking, quantifying, and communicating this information; and (3) provide general guidance for implementing gapfilling procedures for existing and emerging indicators, including regional OHI assessments. For the overall OHI global index score, the percent contribution of gapfilled data was relatively small (18.5%); however, it varied substantially among regions and goals. In general, smaller territorial jurisdictions and the food provision and tourism and recreation goals required the most gapfilling. We found the best approach for managing gapfilled data was to mirror the general framework used to organize, calculate, and communicate the Index data and scores. Quantifying gapfilling provides a measure of the reliability of the scores for different regions and components of an indicator. Importantly, this information highlights the importance of the underlying datasets used to calculate composite indicators and can inform and incentivize future data collection.

  10. The Significance of an Excess in a Counting Experiment: Assessing the Impact of Systematic Uncertainties and the Case with a Gaussian Background

    NASA Astrophysics Data System (ADS)

    Vianello, Giacomo

    2018-05-01

    Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (“on” measurement) is contrasted with a background-only observation free of the effect (“off” measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from Li & Ma, which assumes that both measurements are Poisson random variables. In this paper we study three other cases: (i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, (ii) the case where the background estimate b in the off measurement has an additional systematic uncertainty, and (iii) the case where b is a Gaussian random variable instead of a Poisson random variable. The latter case applies when b comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use a formula that is only valid when b is large and when its uncertainty is very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short gamma-ray bursts and of new X-ray or γ-ray sources. All the techniques presented in this paper are made available in a Python code that is ready to use.
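    The widely used Li & Ma formula referenced above (their equation 17, for the case where both measurements are Poisson) is short enough to sketch directly; the example counts are made up.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), eq. 17: significance of an on/off counting excess,
    with alpha the on/off exposure ratio and both counts Poisson."""
    total = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * (n_on / total))
    term_off = n_off * math.log((1.0 + alpha) * (n_off / total))
    return math.sqrt(2.0 * (term_on + term_off))

# Made-up counts: 200 on-source, 1000 off-source, alpha = 0.1
# (expected background 100, so an excess of about 100 counts).
s = li_ma_significance(200, 1000, 0.1)
```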

  11. Detectability of change in winter precipitation within mountain landscapes: Spatial patterns and uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, N. L.; Maneta, M. P.

    2016-06-01

    Detecting long-term change in seasonal precipitation using ground observations is dependent on the representativity of the point measurement to the surrounding landscape. In mountainous regions, representativity can be poor and lead to large uncertainties in precipitation estimates at high elevations or in areas where observations are sparse. If the uncertainty in the estimate is large compared to the long-term shifts in precipitation, then the change will likely go undetected. In this analysis, we examine the minimum detectable change across mountainous terrain in western Montana, USA. We ask the question: What is the minimum amount of change that is necessary to be detected using our best estimates of precipitation in complex terrain? We evaluate the spatial uncertainty in the precipitation estimates by conditioning historic regional climate model simulations to ground observations using Bayesian inference. By using this uncertainty as a null hypothesis, we test for detectability across the study region. To provide context for the detectability calculations, we look at a range of future scenarios from the Coupled Model Intercomparison Project 5 (CMIP5) multimodel ensemble downscaled to 4 km resolution using the MACAv2-METDATA data set. When using the ensemble averages we find that approximately 65% of the significant increases in winter precipitation go undetected at midelevations. At high elevation, approximately 75% of significant increases in winter precipitation are undetectable. Areas where change can be detected are largely controlled by topographic features. Elevation and aspect are key characteristics that determine whether or not changes in winter precipitation can be detected. Furthermore, we find that undetected increases in winter precipitation at high elevation will likely remain as snow under climate change scenarios. 
Therefore, there is potential for these areas to offset snowpack loss at lower elevations and confound the effects of climate change on water resources.

  12. Cued uncertainty modulates later recognition of emotional pictures: An ERP study.

    PubMed

    Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua

    2017-06-01

    Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and that this modulation is affected by whether cues precede the uncertain event (cued uncertainty) or not (uncued uncertainty). Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how the ERP effects of recognition are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events through changes in feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.

  13. Uncertainty in assessment of radiation-induced diffusion index changes in individual patients

    NASA Astrophysics Data System (ADS)

    Nazem-Zadeh, Mohammad-Reza; Chapman, Christopher H.; Lawrence, Theodore S.; Tsien, Christina I.; Cao, Yue

    2013-06-01

    The purpose of this study is to evaluate repeatability coefficients of diffusion tensor indices to assess whether longitudinal changes in diffusion indices were true changes beyond the uncertainty for individual patients undergoing radiation therapy (RT). Twenty-two patients who had low-grade or benign tumors and were treated by partial brain radiation therapy (PBRT) participated in an IRB-approved MRI protocol. The diffusion tensor images in the patients were acquired pre-RT, week 3 during RT, at the end of RT, and 1, 6, and 18 months after RT. As a measure of uncertainty, repeatability coefficients (RC) of diffusion indices in the segmented cingulum, corpus callosum, and fornix were estimated by using test-retest diffusion tensor datasets from the National Biomedical Imaging Archive (NBIA) database. The upper and lower limits of the 95% confidence interval of the estimated RC from the test and retest data were used to evaluate whether the longitudinal percentage changes in diffusion indices in the segmented structures in the individual patients were beyond the uncertainty and thus could be considered as true radiation-induced changes. Diffusion indices in different white matter structures showed different uncertainty ranges. The estimated RC for fractional anisotropy (FA) ranged from 5.3% to 9.6%, for mean diffusivity (MD) from 2.2% to 6.8%, for axial diffusivity (AD) from 2.4% to 5.5%, and for radial diffusivity (RD) from 2.9% to 9.7%. Overall, 23% of the patients treated by RT had FA changes, 44% had MD changes, 50% had AD changes, and 50% had RD changes beyond the uncertainty ranges. In the fornix, 85.7% and 100% of the patients showed changes beyond the uncertainty range at 6 and 18 months after RT, demonstrating that radiation has a pronounced late effect on the fornix compared to other segmented structures. It is critical to determine reliability of a change observed in an individual patient for clinical decision making. 
Assessments of the repeatability and confidence interval of diffusion tensor measurements in white matter structures allow us to determine the true longitudinal change in individual patients.
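    A repeatability coefficient of the kind used above can be estimated from paired test-retest measurements. This sketch assumes the Bland-Altman definition RC = 1.96 * sqrt(2) * s_w (the bound that 95% of repeat-measurement differences should not exceed); the FA values are invented, not NBIA data.

```python
import math

def repeatability_coefficient(test, retest):
    """Repeatability coefficient from test-retest pairs: within-subject SD
    s_w = sqrt(mean(d^2)/2), then RC = 1.96*sqrt(2)*s_w."""
    d2 = [(a - b) ** 2 for a, b in zip(test, retest)]
    s_w = math.sqrt(sum(d2) / (2.0 * len(d2)))
    return 1.96 * math.sqrt(2.0) * s_w

# Invented FA values for one white-matter structure, scanned twice.
rc = repeatability_coefficient([0.50, 0.46, 0.52, 0.48],
                               [0.51, 0.45, 0.50, 0.49])
```

    A longitudinal percentage change larger than RC (relative to baseline) would then count as a true change beyond measurement uncertainty, in the spirit of the analysis above.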

  14. Facing the Unknown: Intolerance of Uncertainty in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Hodgson, Anna R.; Freeston, Mark H.; Honey, Emma; Rodgers, Jacqui

    2017-01-01

    Background: Anxiety is a common problem for children with autism spectrum disorder (ASD). Recent research indicates that intolerance of uncertainty (IU) may be an important aspect of anxiety for this population. IU is the belief that uncertainty is upsetting, and not knowing what is going to happen is negative. There is little known about the…

  15. Toward evaluating the effect of climate change on investments in the water resources sector: insights from the forecast and analysis of hydrological indicators in developing countries

    NASA Astrophysics Data System (ADS)

    Strzepek, Kenneth; Jacobsen, Michael; Boehlert, Brent; Neumann, James

    2013-12-01

    The World Bank has recently developed a method to evaluate the effects of climate change on six hydrological indicators across 8951 basins of the world. The indicators are designed for decision-makers and stakeholders to consider climate risk when planning water resources and related infrastructure investments. Analysis of these hydrological indicators shows that, on average, mean annual runoff will decline in southern Europe, most of Africa, southern North America, and most of Central and South America. Mean reference crop water deficit, on the other hand, combines temperature and precipitation and is anticipated to increase in nearly all locations globally due to rising global temperatures, with the most dramatic increases projected to occur in southern Europe, southeastern Asia, and parts of South America. These results suggest overall guidance on which regions to focus water infrastructure solutions that could address future runoff uncertainty. Most important, we find that uncertainty in projections of mean annual runoff and high runoff events is higher in poorer countries, and increases over time. Uncertainty increases over time for all income categories, but basins in the lower and lower-middle income categories are forecast to experience dramatically higher increases in uncertainty relative to those in the upper-middle and upper income categories. The enhanced understanding of the uncertainty of climate projections for the water sector that this work provides strongly supports the adoption of rigorous approaches to infrastructure design under uncertainty, as well as designs that incorporate a high degree of flexibility, in response both to the risk of damage and to the opportunity to exploit water supply ‘windfalls’ that might result but would require smart infrastructure investments to manage to the greatest benefit.

  16. Developing a non-point source P loss indicator in R and its parameter uncertainty assessment using GLUE: a case study in northern China.

    PubMed

    Su, Jingjun; Du, Xinzhong; Li, Xuyong

    2018-05-16

    Uncertainty analysis is an important prerequisite for model application; however, existing phosphorus (P) loss indexes and indicators have rarely been evaluated. This study applied the generalized likelihood uncertainty estimation (GLUE) method to assess the uncertainty of the parameters and outputs of a non-point source (NPS) P indicator implemented in R, and examined how the subjective choices of likelihood formulation and acceptability threshold in GLUE influence model outputs. The results indicated the following. (1) The parameters RegR2, RegSDR2, PlossDPfer, PlossDPman, DPDR, and DPR were highly sensitive to the overall TP simulation, and their value ranges could be narrowed by GLUE. (2) The Nash efficiency likelihood (L1) appeared better able to accentuate high-likelihood simulations than the exponential function (L2). (3) A combined likelihood integrating criteria for multiple outputs performed better than a single likelihood in model uncertainty assessment, reducing the widths of the uncertainty bands while preserving the goodness of fit of the overall model outputs. (4) A value of 0.55 appeared to be a modest threshold choice, balancing high modeling efficiency against high bracketing efficiency. Results of this study could provide (1) an option for conducting NPS modeling on a single computing platform, (2) reference parameter settings for NPS model development in similar regions, (3) suggestions for applying the GLUE method in studies with different emphases, and (4) insights into watershed P management in similar regions.
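    The core GLUE loop (sample the prior, score each sample with a likelihood such as Nash-Sutcliffe efficiency, keep only the "behavioral" sets at or above an acceptability threshold) can be sketched on a toy one-parameter model; the model, prior, and data below are invented for illustration.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue_behavioral_range(obs, x, prior_samples, threshold=0.55):
    """GLUE in miniature: score each prior sample and keep 'behavioral'
    parameter sets above the threshold; the retained range is the
    parameter uncertainty band."""
    behavioral = [a for a in prior_samples
                  if nash_sutcliffe(obs, a * x) >= threshold]  # toy model y = a*x
    return min(behavioral), max(behavioral)

# Toy data generated from y = 2x plus noise; uniform prior on a over [0, 5].
rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 20)
obs = 2.0 * x + rng.normal(0.0, 0.5, x.size)
lo, hi = glue_behavioral_range(obs, x, rng.uniform(0.0, 5.0, 2000))
```

    Raising the threshold narrows the behavioral range at the risk of bracketing fewer observations, which is the trade-off behind the 0.55 choice discussed above.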

  17. Erratum to "Impact of uncertainty in soil, climatic, and chemical information in a pesticide leaching assessment".

    PubMed

    Loague, Keith; Green, Richard E; Giambelluca, Thomas W; Liang, Tony C; Yost, Russell S

    2016-11-01

    A simple mobility index, when combined with a geographic information system, can be used to generate rating maps that indicate qualitatively the potential for various organic chemicals to leach to groundwater. In this paper we investigate the magnitude of uncertainty associated with pesticide mobility estimates as a result of data uncertainties. Our example is the Pearl Harbor Basin, Oahu, Hawaii. The two pesticides included in our analysis are atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and diuron [3-(3,4-dichlorophenyl)-1,1-dimethylurea]. The mobility index used here is known as the Attenuation Factor (AF); it requires soil, hydrogeologic, climatic, and chemical information as input data. We employ first-order uncertainty analysis to characterize the uncertainty in estimates of AF resulting from uncertainties in the various input data. Soils in the Pearl Harbor Basin are delineated at the order taxonomic category for this study. Our results show that there can be a significant amount of uncertainty in estimates of pesticide mobility for the Pearl Harbor Basin. This information needs to be considered if future decisions concerning chemical regulation are to be based on estimates of pesticide mobility determined from simple indices. Copyright © 2016. Published by Elsevier B.V.
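    First-order uncertainty analysis of an index like AF amounts to the delta method: propagate input variances through the model via squared partial derivatives. The sketch below computes the partials numerically and uses a simplified exp(-k*t) stand-in for the full AF expression, with hypothetical input values rather than those of the study.

```python
import math

def first_order_variance(f, x, var_x, eps=1e-6):
    """Delta-method propagation: Var(Y) ~= sum_i (df/dx_i)^2 * Var(x_i),
    assuming independent inputs; partials by central differences."""
    var_y = 0.0
    for i in range(len(x)):
        h = eps * max(abs(x[i]), 1.0)
        xp = list(x); xm = list(x)
        xp[i] += h; xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2.0 * h)
        var_y += dfdx ** 2 * var_x[i]
    return var_y

# Simplified AF-style index AF = exp(-k*t): decay rate k and travel time t,
# with hypothetical means and variances (not values from the paper).
af_model = lambda p: math.exp(-p[0] * p[1])
var_af = first_order_variance(af_model, [0.05, 10.0], [1.0e-4, 4.0])
```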

  18. Characterizing the velocity of a wandering black hole and properties of the surrounding medium using convolutional neural networks

    NASA Astrophysics Data System (ADS)

    González, J. A.; Guzmán, F. S.

    2018-03-01

    We present a method for estimating the velocity of a wandering black hole and the equation of state for the gas around it based on a catalog of numerical simulations. The method uses machine-learning methods based on convolutional neural networks applied to the classification of images resulting from numerical simulations. Specifically we focus on the supersonic velocity regime and choose the direction of the black hole to be parallel to its spin. We build a catalog of 900 simulations by numerically solving Euler's equations onto the fixed space-time background of a black hole, for two parameters: the adiabatic index Γ with values in the range [1.1, 5 /3 ], and the asymptotic relative velocity of the black hole with respect to the surroundings v∞, with values within [0.2 ,0.8 ]c . For each simulation we produce a 2D image of the gas density once the process of accretion has approached a stationary regime. The results obtained show that the implemented convolutional neural networks are able to correctly classify the adiabatic index 87.78% of the time within an uncertainty of ±0.0284 , while the prediction of the velocity is correct 96.67% of the time within an uncertainty of ±0.03 c . We expect that this combination of a massive number of numerical simulations and machine-learning methods will help us analyze more complicated scenarios related to future high-resolution observations of black holes, like those from the Event Horizon Telescope.

  19. Cell-free fetal DNA testing: a pilot study of obstetric healthcare provider attitudes toward clinical implementation.

    PubMed

    Sayres, Lauren C; Allyse, Megan; Norton, Mary E; Cho, Mildred K

    2011-11-01

    To provide a preliminary assessment of obstetric healthcare provider opinions surrounding implementation of cell-free fetal DNA testing. A 37-question pilot survey was used to address questions around the translation and use of non-invasive prenatal testing using cell-free fetal DNA. The survey was distributed and collected at a Continuing Medical Education course on obstetrics and gynecology. Of 62 survey respondents, 73% were female and 87% held MD/DO degrees. Respondents generally agreed that patients want prenatal diagnostic information to help make decisions about a pregnancy and that cell-free fetal DNA testing would encourage the testing of more patients for more conditions. However, there was an overall lack of knowledge or conviction about using this technology. Genetic counseling and professional society approval were deemed important to implementation, whereas the possibility of direct-to-consumer testing and government regulation produced mixed responses. Respondents indicated that they would be more likely to offer cell-free fetal DNA testing for chromosomal abnormalities and single-gene disorders, but would be cautious with respect to determination of sex and behavioral or late-onset conditions. Preliminary assessment indicates uncertainty among obstetric providers about the details of implementing cell-free fetal DNA testing and suggests expanded research on perspectives of this stakeholder group. Copyright © 2011 John Wiley & Sons, Ltd.

  20. "It is caused of the womans part or of the mans part": the role of gender in the diagnosis and treatment of sexual dysfunction in early modern England.

    PubMed

    Evans, Jennifer

    2011-01-01

    Philip Barrough wrote in 1590 that barrenness 'is caused of the womans part or of the mans part'. By the eighteenth century, however, barrenness was perceived as a female disorder distinguished from male impotence. Few historians have addressed the uncertainty surrounding early modern definitions of infertility, choosing instead to adopt set terms that fit comfortably with modern ideas. This article will highlight the difficulties surrounding the gender distinction of the terms 'barrenness' and 'impotence' during this period. Moreover, the discussion will examine the role of gender in diagnosing these disorders to sufferers. The article will argue that ideas of gender were more central to diagnosis of poor sexual health than to effectual treatment. Although it appears that barrenness and impotence were treated with separate remedies, many treatments were described as effectual for both sexes. Additionally, the ingredients used in such recipes were often sexual stimulants explained without reference to gender.

  1. Developing a 1 km resolution daily air temperature dataset for urban and surrounding areas in the conterminous United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.

    High spatiotemporal resolution air temperature (Ta) datasets are increasingly needed for assessing the impact of temperature change on people, ecosystems, and energy systems, especially in urban domains. However, such datasets are not widely available because of the large spatiotemporal heterogeneity of Ta caused by complex biophysical and socioeconomic factors such as built infrastructure and human activities. In this study, we developed a 1-km gridded dataset of daily minimum Ta (Tmin) and maximum Ta (Tmax), and the associated uncertainties, in urban and surrounding areas in the conterminous U.S. for the 2003–2016 period. Daily geographically weighted regression (GWR) models were developed and used to interpolate Ta, using 1 km daily land surface temperature and elevation as explanatory variables. The leave-one-out cross-validation approach indicates that our method performs reasonably well, with root mean square errors of 2.1 °C and 1.9 °C, mean absolute errors of 1.5 °C and 1.3 °C, and R2 of 0.95 and 0.97, for Tmin and Tmax, respectively. The resulting dataset reasonably captures the spatial heterogeneity of Ta in urban areas, and effectively captures the urban heat island (UHI) phenomenon whereby Ta rises with increasing urban development (i.e., impervious surface area). The new dataset is valuable for studying environmental impacts of urbanization such as UHI and other related effects (e.g., on building energy consumption and human health). The proposed methodology also shows potential for building a long-term record of Ta worldwide, to fill the data gap that currently exists for studies of urban systems.
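    Leave-one-out cross-validation of the kind reported above can be sketched with an ordinary linear fit standing in for the per-day GWR models: refit with each station held out, then score the prediction at the held-out station. The station data below are synthetic.

```python
import numpy as np

def loocv_rmse(x, y):
    """Leave-one-out cross-validation RMSE for a linear fit y ~ a + b*x."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        b, a = np.polyfit(x[mask], y[mask], 1)   # slope, intercept
        errs.append(y[i] - (a + b * x[i]))
    return float(np.sqrt(np.mean(np.square(errs))))

# Synthetic "stations": Tmax (degC) falling linearly with elevation (km).
elev = [0.0, 0.4, 0.8, 1.2, 1.6]
tmax = [30.0, 27.6, 25.2, 22.8, 20.4]     # exactly 30 - 6*elev
rmse = loocv_rmse(elev, tmax)
```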

  2. Managing Uncertainty during a Corporate Acquisition: A Longitudinal Study of Communication During an Airline Acquisition

    ERIC Educational Resources Information Center

    Kramer, Michael W.; Dougherty, Debbie S.; Pierce, Tamyra A.

    2004-01-01

    This study examined pilots' (N at T1 = 140; N at T2 = 126; N at T3 = 104) reactions to communication and uncertainty during the acquisition of their airline by another airline. Quantitative results indicate that communication helped to reduce uncertainty and was predictive of affective responses to the acquisition. However, contrary to…

  3. Using a Pareto-optimal solution set to characterize trade-offs between a broad range of values and preferences in climate risk management

    NASA Astrophysics Data System (ADS)

    Garner, Gregory; Reed, Patrick; Keller, Klaus

    2015-04-01

    Integrated assessment models (IAMs) are often used to inform the design of climate risk management strategies. Previous IAM studies have broken important new ground on analyzing the effects of parametric uncertainties, but they are often silent on the implications of uncertainties regarding the problem formulation. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the definition of the objective(s). The standard DICE model adopts a single objective to maximize a weighted sum of utilities of per-capita consumption. Decision makers, however, are often concerned with a broader range of values and preferences that may be poorly captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing (ii) the costs of abatement and (iii) the climate change damages. We use advanced multi-objective optimization methods to derive a set of Pareto-optimal solutions over which decision makers can trade-off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multiobjective formulation to provide decision support.

  4. Addressing the "Replication Crisis": Using Original Studies to Design Replication Studies with Appropriate Statistical Power.

    PubMed

    Anderson, Samantha F; Maxwell, Scott E

    2017-01-01

    Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have an actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that 1) original studies are adequately powered and 2) replication studies are designed with methods that are more likely to yield the intended level of power.
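    The gap between intended and actual replication power can be reproduced with a small simulation, using a normal approximation to the two-sample t-test. All numbers (true effect, original sample size, the SE formula for d-hat) are illustrative, not taken from the article's simulations.

```python
import math
import random

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mean_actual_power(true_d, n_orig, reps=4000, seed=2):
    """The original study (n_orig per group) yields a noisy effect estimate
    d_hat; the replication is sized for 80% power at d_hat, but its actual
    power is evaluated at the true effect. Averaged over d_hat, actual
    power falls short of the intended .80, more so for small n_orig."""
    rng = random.Random(seed)
    se = math.sqrt(2.0 / n_orig)              # approximate SE of Cohen's d
    z_alpha, z_beta = 1.959964, 0.841621      # two-sided .05; power .80
    total = 0.0
    for _ in range(reps):
        d_hat = abs(rng.gauss(true_d, se))    # sign taken from the original
        n_rep = 2.0 * ((z_alpha + z_beta) / d_hat) ** 2
        total += phi(true_d * math.sqrt(n_rep / 2.0) - z_alpha)
    return total / reps

p = mean_actual_power(true_d=0.3, n_orig=25)
```

    With an underpowered original study (n = 25 per group, d = 0.3), the mean actual power lands well below the intended .80, matching the article's point.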

  5. The hidden truths of the belly: the uncertainties of pregnancy in early modern Europe. (Society for the Social History of Medicine Student Prize Essay 1999, runner-up.).

    PubMed

    McClive, Cathy

    2002-08-01

    For early modern men and women and their medical practitioners, the experience and understanding of pregnancy was primarily uncertain. This uncertainty extended to the whole process of pregnancy--from the moment of conception to delivery, the detection and bearing of a 'true fruit' was doubtful. This 'uncertainty' was heightened by the fact that both body and language could conceal the truth. The woman herself was frequently uncertain and could be mistaken in her interpretation of the condition of her belly. This ambiguity is expressed in the vague and faltering language used to describe such experiences. Women's bodies were believed to conceal the truth more readily than their male counterparts. Equally a woman's physical narrative was more likely to be distrusted. Tensions surrounding the appropriate nature of women's 'knowledge' of such hidden 'secrets' also affected the ways in which women and their practitioners described the 'truths' of the belly. This article traces the ambiguities faced by women and their midwives/accoucheurs through three areas of pregnancy: quickening, false conceptions, and the threat of miscarriage. The much-neglected source of medical texts and observations is drawn upon, alongside letters and diaries and judicial material.

  6. Evaluating land cover influences on model uncertainties—A case study of cropland carbon dynamics in the Mid-Continent Intensive Campaign region

    USGS Publications Warehouse

    Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun

    2016-01-01

    Quantifying spatial and temporal patterns of carbon sources and sinks and their uncertainties across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs can affect regional carbon estimates, but this effect had not been fully evaluated. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS) model. All three provided estimates of the major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon flux uncertainties and the relationships between those uncertainties and land cover characteristics. Results indicated that the uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the flux uncertainties: cropland percentage, cropland richness, and cropland diversity. Cropland percentage significantly influenced the uncertainties of NPP and NEP, but not the uncertainties of SOC change. Greater uncertainties of NPP and NEP were found in counties with a small cropland percentage than in counties with a large cropland percentage. Cropland species richness and diversity also correlated negatively with the model uncertainties. Our study demonstrated that land cover characteristics contribute to the uncertainties of regional carbon flux estimates. 
The approaches we used in this study can be applied to other ecosystem models to identify the areas with high uncertainties and where models can be improved to reduce overall uncertainties for regional carbon flux estimates.

  7. How to perform a cost-effectiveness analysis with surrogate endpoint: renal denervation in patients with resistant hypertension (DENERHTN) trial as an example.

    PubMed

    Bulsei, Julie; Darlington, Meryl; Durand-Zaleski, Isabelle; Azizi, Michel

    2018-04-01

    Although much uncertainty exists as to the efficacy of renal denervation (RDN), the positive results of the DENERHTN study in France motivated an economic evaluation to assess the efficiency of RDN and to inform local decision makers about the costs and benefits of this intervention. The uncertainty surrounding both the outcomes and the costs can be described using health economic methods such as the non-parametric bootstrap. Internationally, numerous health economic studies have used cost-effectiveness models to assess the impact of RDN in terms of cost and effectiveness compared to antihypertensive medical treatment. The DENERHTN cost-effectiveness study was the first health economic evaluation specifically designed to assess the cost-effectiveness of RDN using individual patient data. Using the DENERHTN results as an example, we provide here a summary of the principal methods used to perform a cost-effectiveness analysis.
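A minimal sketch of the non-parametric bootstrap mentioned above: per-patient (cost, effect) pairs are resampled jointly, so the cost-effect correlation within each arm is preserved. The data shapes and function name are illustrative, not the DENERHTN analysis itself:

```python
import random
from statistics import mean

def bootstrap_ce(treated, control, n_boot=2000, seed=1):
    """Bootstrap the incremental cost and incremental effect between two arms.
    treated/control: lists of per-patient (cost, effect) tuples."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n_boot):
        t = [rng.choice(treated) for _ in treated]   # resample patients with replacement
        c = [rng.choice(control) for _ in control]
        d_cost = mean(x[0] for x in t) - mean(x[0] for x in c)
        d_eff = mean(x[1] for x in t) - mean(x[1] for x in c)
        pairs.append((d_cost, d_eff))
    return pairs
```

The resulting cloud of (incremental cost, incremental effect) pairs can then be plotted on the cost-effectiveness plane or summarized as an acceptability curve.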

  8. The Uptake of Heat and Carbon by the Southern Ocean in the CMIP5 Earth System Models

    NASA Astrophysics Data System (ADS)

    Russell, J. L.; Stouffer, R. J.; Dunne, J. P.; John, J. G.

    2011-12-01

    The Southern Ocean surrounding the Antarctic continent accounts for a disproportionate share of the heat and carbon dioxide that is removed from contact with the atmosphere into the ocean. The vigorous air-sea exchange driven by the Southern Hemisphere Westerlies, combined with the dearth of observations, makes the Southern Ocean a major source of uncertainty in projecting the rate of warming of our atmosphere, especially considering that the vertical mixing of the ocean and the corollary air-sea fluxes may be vulnerable to climate change. We assess the heat and carbon uptake by the Southern Ocean in future simulations by the IPCC-AR5 Earth System Models (ESMs), focusing on the GFDL simulations. Using the 1860 control simulation as our baseline, we explore the differences in heat and carbon uptake between the major "Representative Concentration Pathways" (RCPs) as simulated by the various ESMs in order to quantify the uncertainties in the climate projections related to the Southern Ocean window into the deep ocean reservoir.

  9. Adaptive management for soil ecosystem services

    USGS Publications Warehouse

    Birge, Hannah E.; Bevans, Rebecca A.; Allen, Craig R.; Angeler, David G.; Baer, Sara G.; Wall, Diana H.

    2016-01-01

    Ecosystem services provided by soil include regulation of the atmosphere and climate, primary (including agricultural) production, waste processing, decomposition, nutrient conservation, water purification, erosion control, medical resources, pest control, and disease mitigation. The simultaneous production of these multiple services arises from complex interactions among diverse aboveground and belowground communities across multiple scales. When a system is mismanaged, non-linear and persistent losses in ecosystem services can arise. Adaptive management is an approach to management designed to reduce uncertainty as management proceeds. By developing alternative hypotheses, testing these hypotheses and adjusting management in response to outcomes, managers can probe dynamic mechanistic relationships among aboveground and belowground soil system components. In doing so, soil ecosystem services can be preserved and critical ecological thresholds avoided. Here, we present an adaptive management framework designed to reduce uncertainty surrounding the soil system, even when soil ecosystem services production is not the explicit management objective, so that managers can reach their management goals without undermining soil multifunctionality or contributing to an irreversible loss of soil ecosystem services.

  10. Uncertainty in Arctic climate projections traced to variability of downwelling longwave radiation

    NASA Astrophysics Data System (ADS)

    Krikken, Folmer; Bintanja, Richard; Hazeleger, WIlco; van Heerwaarden, Chiel

    2017-04-01

    The Arctic region has warmed rapidly over the last decades, and this warming is projected to increase. The uncertainty in these projections, i.e. the intermodel spread, is however very large, and a clear understanding of its sources is still lacking. Here we use 31 state-of-the-art global climate models to show that variability of May downwelling longwave radiation (DLR) in the models' control climate, primarily located over the land surrounding the Arctic Ocean, explains 2/3 of the intermodel spread in projected Arctic warming under the RCP8.5 scenario. This variability is related to the combined radiative effect of the cloud radiative forcing (CRF) and the albedo response due to snowfall, which varies strongly between the models in these regions. This mechanism dampens or enhances yearly variability of DLR in the control climate, but also dampens or enhances the climate response of DLR, sea ice cover and near-surface temperature.

  11. A statistical approach to the brittle fracture of a multi-phase solid

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Lua, Y. I.; Belytschko, T.

    1991-01-01

    A stochastic damage model is proposed to quantify the inherent statistical distribution of the fracture toughness of a brittle, multi-phase solid. The model, based on macrocrack-microcrack interaction, incorporates uncertainties in the locations and orientations of microcracks. Due to the high concentration of microcracks near the macrocrack tip, a higher order analysis based on traction boundary integral equations is formulated first for an arbitrary array of cracks. The effects of uncertainties in the locations and orientations of microcracks at a macrocrack tip are analyzed quantitatively by using the boundary integral equation method in conjunction with computer simulation of the random microcrack array. The short-range interactions resulting from the surrounding microcracks closest to the main crack tip are investigated, and the effects of the microcrack density parameter are also explored. The validity of the model is demonstrated by comparing its statistical output with the Neville distribution function, which gives correct fits to sets of experimental data from multi-phase solids.

  12. Investments in energy technological change under uncertainty

    NASA Astrophysics Data System (ADS)

    Shittu, Ekundayo

    2009-12-01

    This dissertation addresses the crucial problem of how environmental policy uncertainty influences investments in energy technological change. The rising level of carbon emissions due to increasing global energy consumption calls for a policy shift. In order to stem the negative consequences for the climate, policymakers are concerned with crafting an optimal regulation that will encourage technology investments. However, decision makers face uncertainties surrounding future environmental policy. The first part considers the treatment of technological change in theoretical models. This part has two purposes: (1) to show--through illustrative examples--that technological change can lead to quite different, and surprising, impacts on the marginal costs of pollution abatement; we demonstrate the intriguing and uncommon result that technological change can increase the marginal costs of pollution abatement over some range of abatement; and (2) to show the impact of this uncommon observation on policy. We find that under the assumption of technical change that can increase the marginal cost of pollution abatement over some range, the ranking of policy instruments is affected. The second part builds on the first by considering the impact of uncertainty in the carbon tax on investments in a portfolio of technologies. We determine the response of energy R&D investments as the carbon tax increases, both overall and for technology-specific investments, and the impact of risk in the carbon tax on the portfolio. We find that the response of the optimal investment in a portfolio of technologies to an increasing carbon tax depends on the relative costs of the programs and the elasticity of substitution between fossil and non-fossil energy inputs. In the third part, we zoom in on the portfolio model to consider how uncertainty in the magnitude and timing of a carbon tax influences investments. Under a two-stage continuous-time optimal control model, we consider the impact of these uncertainties on R&D spending that aims to lower the cost of non-fossil energy technology. We find that magnitude uncertainty tallies with the classical results in that it discourages near-term investment. Timing uncertainty, however, increases near-term investment.

  13. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires quantitative measures for evaluating treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters, such as MR slice thickness and update time, were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements: Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5°C was used to model different levels of uncertainty. Results of the simulations for each parameter configuration were averaged over the database of ten prostate geometries. The results showed that, for an update time of 5 seconds, both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3°C, while a temperature uncertainty of 5°C leads to a noticeable reduction in spatial accuracy and an increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainties of 0°C and 1°C, while temperature uncertainties of 3°C and 5°C led to reduced spatial accuracy, increased potential damage to the rectal wall, and longer treatment times for update times above 5 seconds. Overall, the results suggested that 5-mm elements showed the best performance under achievable MR imaging parameters.

  14. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  15. Cost-effectiveness acceptability curves and a reluctance to lose.

    PubMed

    Severens, Johan L; Brunenberg, Daniëlle E M; Fenwick, Elisabeth A L; O'Brien, Bernie; Joore, Manuela A

    2005-01-01

    Cost-effectiveness acceptability curves (CEACs) are a method used to present uncertainty surrounding incremental cost-effectiveness ratios (ICERs). Construction of the curves relies on the assumption that the willingness to pay (WTP) for health gain is identical to the willingness to accept (WTA) health loss. The objective of this paper is to explore the impact that differences between WTP and WTA for health changes have on CEACs. Previous empirical evidence has shown that the relationship between WTP and WTA is not 1:1. The discrepancy between WTP and WTA for health changes can be expressed as a ratio: the accept/reject ratio (which can vary between 1 and infinity). Depending on this ratio, the area within the southwest quadrant of the cost-effectiveness plane in which any bootstrap cost-effect pairs will be considered cost effective will be smaller, resulting in a lower CEAC. We used data from two clinical trials to illustrate that relaxing the 1:1 WTP/WTA assumption has an impact on the CEACs. Given the difficulty of assessing the accept/reject ratio for every evaluation, we suggest presenting a series of CEACs for a range of values of the accept/reject ratio, including 1 and infinity. Although it is not possible to explain this phenomenon within the extra-welfarist framework, it has been shown empirically that individuals give a higher valuation to the removal of effective therapies than to the introduction of new therapies that are more costly and effective. In cost-effectiveness analyses where the uncertainty of the ICER covers the southwest quadrant of the cost-effectiveness plane, the discrepancy between society's WTP and WTA should be indicated by drawing multiple CEACs.
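The accept/reject ratio can be sketched entering CEAC construction as follows (one plausible reading of the proposal, not the paper's exact formulation): a bootstrap pair is judged against WTP, except in the southwest quadrant (cost savings bought with health loss), where the threshold is inflated to WTA = ratio × WTP. The function name and rule below are illustrative:

```python
def ceac_point(pairs, wtp, ar_ratio=1.0):
    """Fraction of bootstrap (d_cost, d_effect) pairs judged cost-effective
    at willingness-to-pay `wtp`; in the southwest quadrant the threshold
    is inflated to ar_ratio * wtp to represent WTA > WTP."""
    accepted = 0
    for d_cost, d_eff in pairs:
        threshold = ar_ratio * wtp if (d_eff < 0 and d_cost < 0) else wtp
        if threshold * d_eff - d_cost >= 0:  # net monetary benefit rule
            accepted += 1
    return accepted / len(pairs)
```

Sweeping `wtp` over a grid and plotting `ceac_point` for several `ar_ratio` values (including 1 and a very large value approximating infinity) yields the series of CEACs the paper recommends.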

  16. Minimum and Maximum Times Required to Obtain Representative Suspended Sediment Samples

    NASA Astrophysics Data System (ADS)

    Gitto, A.; Venditti, J. G.; Kostaschuk, R.; Church, M. A.

    2014-12-01

    Bottle sampling is a convenient method of obtaining suspended sediment measurements for the development of sediment budgets. While these methods are generally considered reliable, recent analysis of depth-integrated sampling has identified considerable uncertainty in measurements of grain-size concentration between grain-size classes of multiple samples. Point-integrated bottle sampling is assumed to represent the mean concentration of suspended sediment, but the uncertainty surrounding this method is not well understood. Here we examine at-a-point variability in velocity, suspended sediment concentration, grain-size distribution, and grain-size moments to determine whether traditional point-integrated methods provide a representative sample of suspended sediment. We present continuous hour-long observations of suspended sediment from the sand-bedded portion of the Fraser River at Mission, British Columbia, Canada, using a LISST laser-diffraction instrument. Spectral analysis shows no statistically significant peaks in energy density, suggesting the absence of periodic fluctuations in flow and suspended sediment. However, a slope break in the spectra at 0.003 Hz corresponds to a period of 5.5 minutes. This coincides with the threshold between large-scale turbulent eddies, which scale with channel width divided by mean velocity, and hydraulic phenomena related to channel dynamics, suggesting that suspended sediment samples taken over a period longer than 5.5 minutes incorporate variability at scales larger than the turbulent phenomena in this channel. Examination of 5.5-minute periods of our time series indicates that ~20% of the time a stable mean value of volumetric concentration is reached within 30 seconds, a typical bottle sample duration. In ~12% of measurements a stable mean was not reached over the 5.5-minute sample duration. The remaining measurements achieve a stable mean in an even distribution over the intervening interval.

  17. A prospective study of the diagnostic utility of sputum Gram stain in pneumonia.

    PubMed

    Anevlavis, Stavros; Petroglou, Niki; Tzavaras, Athanasios; Maltezos, Efstratios; Pneumatikos, Ioannis; Froudarakis, Marios; Anevlavis, Eleftherios; Bouros, Demosthenes

    2009-08-01

    Sputum Gram stain and culture have been said to be unreliable indicators of the microbiological diagnosis of bacterial pneumonia. The etiological diagnosis of pneumonia is surrounded by a great degree of uncertainty; this uncertainty should be, and can be, calculated and incorporated into diagnosis and treatment. Our aim was to determine the diagnostic accuracy and diagnostic value of sputum Gram stain in the etiological diagnosis and initial selection of antimicrobial therapy for bacterial community-acquired pneumonia (CAP). DESIGN-METHOD: Prospective study of 1390 patients with CAP admitted to our institutions between January 2002 and June 2008. Of the 1390 patients, 178 (12.8%) fulfilled the criteria for inclusion in this study (good-quality sputa and presence of the same microorganism in blood and sputum cultures, which was used as the gold standard for assessing the diagnostic accuracy and diagnostic value of sputum Gram stain). The sensitivity of sputum Gram stain was 0.82 for pneumococcal pneumonia, 0.76 for staphylococcal pneumonia, 0.79 for Haemophilus influenzae pneumonia and 0.78 for Gram-negative bacilli pneumonia. The specificity was 0.93 for pneumococcal pneumonia, 0.96 for staphylococcal pneumonia, 0.96 for H. influenzae pneumonia and 0.95 for Gram-negative bacilli pneumonia. The positive likelihood ratio (LR+) was 11.58 for pneumococcal pneumonia, 19.38 for staphylococcal pneumonia, 16.84 for H. influenzae pneumonia, and 14.26 for Gram-negative bacilli pneumonia. The negative likelihood ratio (LR-) was 0.20 for pneumococcal pneumonia, 0.25 for staphylococcal pneumonia, 0.22 for H. influenzae pneumonia, and 0.23 for Gram-negative bacilli pneumonia. Sputum Gram stain is a dependable diagnostic test for the early etiological diagnosis of bacterial CAP that helps in choosing a rational and appropriate initial antimicrobial therapy.
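The likelihood ratios follow directly from sensitivity and specificity (LR+ = sens / (1 − spec), LR− = (1 − sens) / spec). Recomputing from the rounded figures in the abstract gives values close to, but not identical to, the reported ones, which were presumably derived from unrounded estimates:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Pneumococcal pneumonia figures from the study above (sens 0.82, spec 0.93)
lr_pos, lr_neg = likelihood_ratios(0.82, 0.93)  # ≈ 11.7 and ≈ 0.19
```

An LR+ above ~10 and an LR− around 0.2 are conventionally read as strong rule-in and moderate rule-out evidence, consistent with the study's conclusion.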

  18. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before being fed into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and GCM model uncertainty, which is obvious at any resolution finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among studies on the relative importance of the different uncertainty sources: the prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, climatic uncertainty. We carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, while in the hydrological part we used a formal Bayesian uncertainty assessment with two different likelihood functions: one was a time-series error model able to deal with the complicated statistical properties of hydrological model residuals; the second was a likelihood function for the flow quantiles directly. Due to better data coverage and smaller hydrological complexity in one of our test catchments, the hydrological model performed better there, and we could observe that the relative importance of the different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of the predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  19. SU-F-BRD-09: Is It Sufficient to Use Only Low Density Tissue-Margin to Compensate Inter-Fractionation Setup Uncertainties in Lung Treatment?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, K; Yue, N; Chen, T

    2014-06-15

    Purpose: In lung radiation treatment, the PTV is formed with a margin around the GTV (or CTV/ITV). Although the GTV is most likely of water-equivalent density, the PTV margin may be formed with the surrounding low-density tissues, which may lead to an unrealistic dosimetric plan. This study evaluates whether the concern about dose calculation inside a PTV with only a low-density margin is justified in lung treatment. Methods: Three SBRT cases were analyzed. The PTV in the original plan (Plan-O) was created with a 5-10 mm margin outside the ITV to incorporate setup errors and all mobility from 10 respiratory phases. Test plans were generated with the GTV shifted to the PTV edge to simulate extreme situations with maximum setup uncertainties. Two representative positions, at the very posterior-superior (Plan-PS) and anterior-inferior (Plan-AI) edge, were considered. The virtual GTV was assigned a density of 1.0 g.cm−3 and the surrounding lung, including the PTV margin, was defined as 0.25 g.cm−3. An additional plan with a 1 mm tissue margin instead of a full lung margin was created to evaluate whether a composite margin (Plan-Comp) gives a better approximation for dose calculation. All plans were generated on the average CT using the Analytical Anisotropic Algorithm with heterogeneity correction on, and all planning parameters/monitor units remained unchanged. DVH analyses were performed for comparisons. Results: Despite the non-static dose distribution, the high-dose region synchronized with tumor positions. This might be due to scatter conditions, as greater doses were absorbed in the solid tumor than in the surrounding low-density lung tissue. However, the plans still showed missing target coverage in general. A certain level of composite margin might give a better approximation for the dose calculation. Conclusion: Our exploratory results suggest that with the lung margin only, the planning dose of the PTV might overestimate the coverage of the target during treatment. The significance of this overestimation might warrant further investigation.

  20. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation.
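The Monte Carlo step can be sketched generically: sample the uncertain kinetic parameters from assumed distributions, run the model for each draw, and summarize the spread of the output. The toy growth law and the parameter distributions below are illustrative assumptions, not the KDP model from the paper:

```python
import random
from statistics import mean, stdev

def growth_model(k_g, g, supersaturation, t):
    """Toy power-law crystal growth law: size gained = k_g * S**g * t.
    Illustrative only -- not the KDP kinetics used in the paper."""
    return k_g * supersaturation ** g * t

def monte_carlo(n_samples=5000, seed=0):
    """Propagate assumed parameter uncertainty through the model."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(n_samples):
        k_g = rng.gauss(1.0, 0.1)   # growth rate constant, assumed ~10% uncertain
        g = rng.gauss(1.5, 0.15)    # growth order, assumed uncertain
        sizes.append(growth_model(k_g, g, supersaturation=0.2, t=60.0))
    return mean(sizes), stdev(sizes)

mu, sigma = monte_carlo()  # spread of predicted size gain under input uncertainty
```

Because the growth order enters as an exponent, even modest uncertainty in it produces a wide output spread, which mirrors the paper's finding that the nucleation and growth order constants dominate the CSD uncertainty.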

  1. Bookending the Opportunity to Lower Wind’s LCOE by Reducing the Uncertainty Surrounding Annual Energy Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark

    Reducing the performance risk surrounding a wind project can potentially lead to a lower weighted-average cost of capital (WACC), and hence a lower levelized cost of energy (LCOE), through an advantageous shift in capital structure, and possibly also a reduction in the cost of capital. Specifically, a reduction in performance risk will move the 1-year P99 annual energy production (AEP) estimate closer to the P50 AEP estimate, which in turn reduces the minimum debt service coverage ratio (DSCR) required by lenders, thereby allowing the project to be financed with a greater proportion of low-cost debt. In addition, a reduction in performance risk might also reduce the cost of one or more of the three sources of capital that are commonly used to finance wind projects: sponsor or cash equity, tax equity, and/or debt. Preliminary internal LBNL analysis of the maximum possible LCOE reduction attainable from reducing the performance risk of a wind project found a potentially significant opportunity for LCOE reduction of ~$10/MWh, by reducing the P50 DSCR to its theoretical minimum value of 1.0 (Bolinger 2015b, 2014) and by reducing the cost of sponsor equity and debt by one-third to one-half each (Bolinger 2015a, 2015b). However, with FY17 funding from the U.S. Department of Energy's Atmosphere to Electrons (A2e) Performance Risk, Uncertainty, and Finance (PRUF) initiative, LBNL has been revisiting this "bookending" exercise in more depth, and now believes that its earlier preliminary assessment of the LCOE reduction opportunity was overstated. This reassessment is based on two new-found understandings: (1) due to ever-present and largely irreducible inter-annual variability (IAV) in the wind resource, the minimum required DSCR cannot possibly fall to 1.0 (on a P50 basis), and (2) a reduction in AEP uncertainty will not necessarily lead to a reduction in the cost of capital, meaning that a shift in capital structure is perhaps the best that can be expected (perhaps along with a modest decline in the cost of cash equity as new investors enter the market).
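The P99-P50 link described above can be sketched under a simple normal assumption, with `sigma` the total AEP uncertainty as a fraction of P50. This is an illustrative simplification, not LBNL's financing model:

```python
from statistics import NormalDist

def p99_aep(p50, sigma):
    """1-year P99 annual energy production, assuming AEP ~ Normal(p50, sigma * p50).
    A simplified sketch of the P99-P50 relationship, not LBNL's model."""
    z99 = NormalDist().inv_cdf(0.99)  # ≈ 2.326
    return p50 * (1 - z99 * sigma)

# Halving the uncertainty moves P99 toward P50, easing the lender's DSCR constraint
high_unc = p99_aep(100.0, 0.10)  # ≈ 76.7 (units of the P50 input)
low_unc = p99_aep(100.0, 0.05)   # ≈ 88.4
```

Because inter-annual variability keeps `sigma` strictly above zero, P99 never reaches P50 under this sketch, which is consistent with the first of the two understandings above.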

  2. Does the Limpopo River Basin have sufficient water for massive irrigation development in the plains of Mozambique?

    NASA Astrophysics Data System (ADS)

    van der Zaag, Pieter; Juizo, Dinis; Vilanculos, Agostinho; Bolding, Alex; Uiterweer, Nynke Post

    This paper verifies whether the water resources of the transboundary Limpopo River Basin are sufficient for the planned massive irrigation developments in the Mozambican part of this basin, namely 73,000 ha, in addition to existing irrigation (estimated at 9400 ha) and natural growth of common-use irrigation (4000 ha). This development includes the expansion of sugar cane production for the production of ethanol as a biofuel. Total additional water requirements may amount to 1.3 × 10⁹ m³/a or more. A simple river basin simulation model was constructed in order to assess different irrigation development scenarios, at two storage capacities of the existing Massingir dam. Many uncertainties surround current and future water availability in the Lower Limpopo River Basin. Discharge measurements are incomplete and sometimes inconsistent, while upstream developments during the last 25 years have been dramatic and future trends are unknown. In Mozambique it is not precisely known how much water is currently consumed, especially by the many small-scale users of surface water and shallow alluvial groundwater. Future impacts of climate change increase existing uncertainties. Model simulations indicate that the Limpopo River does not carry sufficient water for all planned irrigation. A maximum of approx. 58,000 ha of irrigated agriculture can be sustained in the Mozambican part of the basin. This figure assumes that Massingir will be operated at increased reservoir capacity, and implies that only about 44,000 ha of new irrigation can be developed, which is 60% of the envisaged developments. Any additional water use would certainly impact downstream users and thus create tensions. Some time will elapse before 44,000 ha of new irrigated land will have been developed. This time could be used to improve monitoring networks and decrease current uncertainties. Meanwhile the four riparian Limpopo States are preparing a joint river basin study. 
In this study a methodology could be developed to estimate and safeguard water availability for those users who under the law do not need registration - but who do need water.
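
    A river basin simulation model of the kind used here can, at its simplest, be a monthly storage-bookkeeping loop. The sketch below is a hypothetical illustration (the inflow series, reservoir capacity and per-hectare demand are invented round numbers, not values from the study): it estimates how reliably a given irrigated area could be supplied.

```python
def simulate_supply(inflows, demand_per_month, capacity, storage=0.0):
    """Monthly reservoir bookkeeping: store inflow (spilling above capacity),
    then release up to the irrigation demand.

    Returns the fraction of months in which demand was fully met.
    All volumes are in 10^6 m3.
    """
    months_met = 0
    for inflow in inflows:
        storage = min(storage + inflow, capacity)   # spill anything above capacity
        release = min(storage, demand_per_month)    # can only release what is stored
        storage -= release
        if release >= demand_per_month:
            months_met += 1
    return months_met / len(inflows)

# Illustrative only: a wet/dry seasonal inflow pattern (10^6 m3/month)
inflows = [300, 250, 200, 80, 40, 20, 15, 15, 30, 90, 180, 260]
# Demand scales with irrigated area; roughly 1500 m3 per ha per month is assumed here
for area_ha in (44_000, 73_000):
    demand = area_ha * 1500 / 1e6            # 10^6 m3/month
    reliability = simulate_supply(inflows, demand, capacity=150)
    print(area_ha, "ha:", round(100 * reliability), "% of months fully supplied")
```

    In such a toy run the larger irrigated area fails in more dry-season months, which is the qualitative pattern the study's scenario analysis explores with real discharge records and the Massingir dam's actual characteristics.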

  3. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty that can exist in pesticide leaching assessments owing to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts in that the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that future regulation of agricultural chemicals should include consideration of changing land use.

  4. Morphology and ionization of the interstellar cloud surrounding the solar system.

    PubMed

    Frisch, P C

    1994-09-02

    The first encounter between the sun and the surrounding interstellar cloud appears to have occurred 2000 to 8000 years ago. The sun and cloud space motions are nearly perpendicular, an indication that the sun is skimming the cloud surface. The electron density derived for the surrounding cloud from the carbon component of the anomalous cosmic ray population in the solar system and from the interstellar ratio of Mg⁺ to neutral Mg (Mg⁰) toward Sirius supports an equilibrium model for cloud ionization (an electron density of 0.22 to 0.44 per cubic centimeter). The upwind magnetic field direction is nearly parallel to the cloud surface. The relative sun-cloud motion indicates that the solar system has a bow shock.

  5. The Ebola virus disease outbreak and the mineral sectors of Guinea, Liberia, and Sierra Leone

    USGS Publications Warehouse

    Bermúdez-Lugo, Omayra; Menzie, William D.

    2015-01-01

    In response to the uncertainty surrounding the status of mineral projects in Guinea, Liberia, and Sierra Leone, the National Minerals Information Center compiled information on the distribution of mines, mineral facilities, and mineral projects under development in the three countries. This fact sheet provides information on the role that the mineral sector plays in their respective economies, on the operating status of mining projects through yearend 2014, and on the coordinated actions by mining companies to support governments and international relief organizations in their efforts to contain the EVD outbreak.

  6. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
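
    The nested structure described above, an outer loop over epistemic samples with each sample producing one aleatory CCDF, can be sketched in a few lines. This is a generic toy model (the response function, distributions and threshold are invented for illustration), not the framework of the report:

```python
import random

random.seed(1)

def response(aleatory_x, epistemic_k):
    """Toy system response: an inherently random input scaled by a
    fixed-but-poorly-known constant."""
    return epistemic_k * aleatory_x

def ccdf_at(samples, threshold):
    """Empirical complementary CDF value: P(result > threshold)."""
    return sum(1 for s in samples if s > threshold) / len(samples)

# Outer loop: epistemic uncertainty (k is fixed but poorly known; here that
# lack of knowledge is characterized by a uniform distribution on [0.5, 1.5]).
# Inner loop: aleatory uncertainty (x is inherently random; exponential, mean 1).
family = []
for _ in range(20):                       # one CCDF value per epistemic sample
    k = random.uniform(0.5, 1.5)
    results = [response(random.expovariate(1.0), k) for _ in range(1000)]
    family.append(ccdf_at(results, threshold=2.0))

# Epistemic uncertainty leaves a *band* of exceedance probabilities,
# not a single value:
print(round(min(family), 3), "to", round(max(family), 3))
```

    Plotting the full empirical CCDF for each epistemic sample, rather than a single threshold value as here, gives exactly the family-of-curves displays the abstract describes.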

  7. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of implementing a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area in the north of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build an RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. In the remaining events, however, RR uncertainty cannot explain the whole of the uncertainty observed in the simulated flow volumes, indicating that additional sources of uncertainty must be considered, such as uncertainty in the urban drainage model structure, in its calibrated parameters, and in the measured sewer flows.
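
    Generating rainfall ensembles from an error covariance matrix is typically done by sampling correlated Gaussian noise, for example via a Cholesky factor of the covariance. The sketch below is a generic illustration with an invented 3-pixel field and covariance matrix, not the authors' actual error model (which also estimates the spatial and temporal correlation structure from data):

```python
import math
import random

random.seed(42)

def cholesky(cov):
    """Lower-triangular factor L of a symmetric positive-definite matrix,
    so that cov = L L^T."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(cov[i][i] - s)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def ensemble_member(radar_field, error_cov):
    """One rainfall ensemble member: the radar estimate perturbed by a
    spatially correlated Gaussian error sample."""
    L = cholesky(error_cov)
    z = [random.gauss(0.0, 1.0) for _ in radar_field]
    noise = [sum(L[i][k] * z[k] for k in range(len(z))) for i in range(len(z))]
    return [max(0.0, r + e) for r, e in zip(radar_field, noise)]  # no negative rain

# Invented 3-pixel radar field (mm/h) and error covariance (neighbouring
# pixels more strongly correlated than distant ones):
radar = [5.0, 6.0, 4.0]
cov = [[1.0, 0.6, 0.3],
       [0.6, 1.0, 0.6],
       [0.3, 0.6, 1.0]]
ensemble = [ensemble_member(radar, cov) for _ in range(100)]
# Each member would then be run through the sewer-flow model; the spread of
# the simulated flows gives the radar-induced uncertainty band.
```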

  8. "Maybe the Algae Was from the Filter": Maybe and Similar Modifiers as Mediational Tools and Indicators of Uncertainty and Possibility in Children's Science Talk

    ERIC Educational Resources Information Center

    Kirch, Susan A.; Siry, Christina A.

    2012-01-01

    Uncertainty is an essential component of scientific inquiry and it also permeates our daily lives. Understanding how to identify, evaluate, resolve and live in the presence of uncertainty is important for decision-making strategies and engaging in transformative actions. In contrast, confidence and certainty are prized in elementary school…

  9. Recognizing Uncertainty in the Q-Matrix via a Bayesian Extension of the DINA Model

    ERIC Educational Resources Information Center

    DeCarlo, Lawrence T.

    2012-01-01

    In the typical application of a cognitive diagnosis model, the Q-matrix, which reflects the theory with respect to the skills indicated by the items, is assumed to be known. However, the Q-matrix is usually determined by expert judgment, and so there can be uncertainty about some of its elements. Here it is shown that this uncertainty can be…

  10. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.

  11. Research on uncertainty evaluation measure and method of voltage sag severity

    NASA Astrophysics Data System (ADS)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an inevitable and serious power quality problem in power systems. This paper provides a general summary and review of the concepts, indices and evaluation methods related to voltage sag severity. Considering the complexity and uncertainty of the influencing factors and of the degree of damage, and the characteristics and requirements of voltage sag severity on the source, network and load sides, the measurement concepts and the conditions under which they hold, as well as the evaluation indices and methods for voltage sag severity, are analysed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose ideas and methods for severity research based on advanced uncertainty theory and uncertainty measures. This study may serve as a valuable guide for researchers interested in the domain of voltage sag severity.

  12. Urban green and grey space in relation to respiratory health in children.

    PubMed

    Tischer, Christina; Gascon, Mireia; Fernández-Somoano, Ana; Tardón, Adonina; Lertxundi Materola, Aitana; Ibarluzea, Jesus; Ferrero, Amparo; Estarlich, Marisa; Cirach, Marta; Vrijheid, Martine; Fuertes, Elaine; Dalmau-Bueno, Albert; Nieuwenhuijsen, Mark J; Antó, Josep M; Sunyer, Jordi; Dadvand, Payam

    2017-06-01

    We assessed the effect of three different indices of the urban built environment on allergic and respiratory conditions. This study involved 2472 children participating in the ongoing INMA birth cohort located in two bio-geographic regions (Euro-Siberian and Mediterranean) in Spain. The residential surrounding built environment was characterised as 1) residential surrounding greenness based on the satellite-derived normalised difference vegetation index (NDVI), 2) residential proximity to green spaces and 3) residential surrounding greyness based on urban land use patterns. Information on wheezing, bronchitis, asthma and allergic rhinitis up to age 4 years was obtained from parent-completed questionnaires. Logistic regression and generalised estimating equation modelling were performed. Among children from the Euro-Siberian region, higher residential surrounding greenness and higher proximity to green spaces were negatively associated with wheezing. In the Mediterranean region, higher residential proximity to green spaces was associated with a reduced risk of bronchitis, whereas a higher amount of residential surrounding greyness was found to increase the risk of bronchitis. Associations between indices of urban residential greenness and greyness and respiratory diseases thus differ by region; the pathways underlying these associations require further exploration. Copyright ©ERS 2017.

  13. Robust portfolio selection based on asymmetric measures of variability of stock returns

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper introduces a new type of uncertainty set for robust optimization: the interval random uncertainty set. Its form makes it suitable for capturing the downside and upside deviations of real-world data; these deviation measures capture distributional asymmetry and lead to better optimization results. We also apply interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.

  14. An Approach to Forecasting Health Expenditures, with Application to the U.S. Medicare System

    PubMed Central

    Lee, Ronald; Miller, Timothy

    2002-01-01

    Objective To quantify uncertainty in forecasts of health expenditures. Study Design Stochastic time series models are estimated for historical variations in fertility, mortality, and health spending per capita in the United States, and used to generate stochastic simulations of the growth of Medicare expenditures. Individual health spending is modeled to depend on the number of years until death. Data Sources/Study Setting A simple accounting model is developed for forecasting health expenditures, using the U.S. Medicare system as an example. Principal Findings Medicare expenditures are projected to rise from 2.2 percent of GDP (gross domestic product) to about 8 percent of GDP by 2075. This increase is due in equal measure to increasing health spending per beneficiary and to population aging. The traditional projection method constructs high, medium, and low scenarios to assess uncertainty, an approach that has many problems. Using stochastic forecasting, we find a 95 percent probability that Medicare spending in 2075 will fall between 4 percent and 18 percent of GDP, indicating a wide band of uncertainty. Although there is substantial uncertainty about future mortality decline, it contributed little to uncertainty about future Medicare spending, since lower mortality both raises the number of elderly, tending to raise spending, and is associated with improved health of the elderly, tending to reduce spending. Uncertainty about fertility, by contrast, leads to great uncertainty about the future size of the labor force, and therefore adds importantly to uncertainty about the health-share of GDP. In the shorter term, the major source of uncertainty is health spending per capita. Conclusions History is a valuable guide for quantifying our uncertainty about future health expenditures. The probabilistic model we present has several advantages over the high–low scenario approach to forecasting. It indicates great uncertainty about future Medicare expenditures relative to GDP. PMID:12479501
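
    The contrast with hand-picked high/medium/low scenarios can be caricatured in a few lines: draw many random growth paths and read probability intervals off the simulated distribution. All numbers below (initial share, mean excess growth, shock size) are invented round figures chosen only to land near the magnitudes quoted above; the actual study fits stochastic time series models to historical data.

```python
import random

random.seed(7)

def simulate_share(years=73, share0=0.022, growth0=0.018, step_sd=0.001):
    """One stochastic path of the Medicare share of GDP.

    The excess growth rate follows a random walk, so shocks persist; this
    persistence is what produces a wide long-run uncertainty band.
    """
    share, growth = share0, growth0
    for _ in range(years):
        growth += random.gauss(0.0, step_sd)   # persistent shock to the growth rate
        share *= 1.0 + growth
    return share

paths = sorted(simulate_share() for _ in range(2000))
lo, med, hi = paths[50], paths[1000], paths[1950]   # ~2.5th, 50th, 97.5th percentiles
print(f"95% interval for the 2075 share of GDP: {lo:.3f} to {hi:.3f} (median {med:.3f})")
```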

  15. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  16. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE PAGES

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    2018-02-21

    Reconstruction is an inverse process where a parameter space is searched to locate a set of parameters with the highest probability of describing experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters will contain some associated uncertainty. This uncertainty in the optimal parameters leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole-shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole-shot reconstruction results over a time interval are used to validate the propagated uncertainty from a single time slice.

  18. "Conflict management" and "conflict resolution" are not synonymous terms.

    PubMed

    Robbins, S P

    1978-01-01

    Robbins sees functional conflict as an absolute necessity within organizations and explicitly encourages it. He explains: "Survival can result only when an organization is able to adapt to constant changes in the environment. Adaptation is possible only through change, and change is stimulated by conflict." Robbins cites evidence indicating that conflict can be related to increased productivity and that critical thinking encourages well-developed decisions. He admits, however, that not all conflicts are good for the organization. Their functional or dysfunctional nature is determined by the impact of the conflict on the objectives of the organization. The author identifies several factors underlying the need for conflict stimulation: (1) managers who are surrounded by "yes men"; (2) subordinates who are afraid to admit ignorance or uncertainty; (3) decision-makers' excessive concern about hurting the feelings of others; or (4) an environment where new ideas are slow in coming forth. He suggests techniques for stimulating conflict: manipulating the communication channels (i.e., repression of information); changing the organizational structure (i.e., changes in size or position); and altering personal behavior factors (i.e., role incongruence). Robbins stresses that the actual method to be used in either resolving or stimulating conflict must be appropriate to the situation.

  19. Radar stage uncertainty

    USGS Publications Warehouse

    Fulford, J.M.; Davies, W.J.

    2005-01-01

    The U.S. Geological Survey is investigating the performance of radars used for stage (or water-level) measurement. This paper presents a comparison of estimated uncertainties and data for radar water-level measurements with float, bubbler, and wire weight water-level measurements. The radar sensor was also temperature-tested in a laboratory. The uncertainty estimates indicate that radar measurements are more accurate than uncorrected pressure sensors at higher water stages, but are less accurate than pressure sensors at low stages. Field data at two sites indicate that radar sensors may have a small negative bias. Comparison of field radar measurements with wire weight measurements found that the radar tends to measure slightly lower values as stage increases. Copyright ASCE 2005.

  20. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
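
    The analytical point reduces to the variance formula for a weighted sum of correlated inputs, Var(aX + bY) = a²σx² + b²σy² + 2abρσxσy: the cross term is exactly what is dropped when correlation is ignored, and its sign predicts whether the output variance is under- or overestimated. A minimal sketch with invented numbers:

```python
import math

def output_sd(a, b, sd_x, sd_y, rho=0.0):
    """Standard deviation of g = a*X + b*Y when X and Y have correlation rho."""
    var = (a * sd_x) ** 2 + (b * sd_y) ** 2 + 2 * a * b * rho * sd_x * sd_y
    return math.sqrt(var)

# Hypothetical inventory: impact = 2*electricity + 3*transport, both inputs uncertain
independent = output_sd(2, 3, sd_x=0.5, sd_y=0.4, rho=0.0)
correlated = output_sd(2, 3, sd_x=0.5, sd_y=0.4, rho=0.8)
print(round(independent, 3), round(correlated, 3))

# With a negative coefficient (e.g. an avoided burden), the same positive
# correlation instead *reduces* the output variance:
avoided = output_sd(2, -3, sd_x=0.5, sd_y=0.4, rho=0.8)
```

    Here ignoring the positive correlation understates the output standard deviation, while for the avoided-burden case it overstates it, which is the under/overestimation risk the highlights refer to.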

  1. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
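
    Quantifying parameter, scenario and model uncertainty simultaneously amounts to redrawing, in every Monte Carlo iteration, not only the continuous input parameters but also a discrete scenario and model formulation. The sketch below is a generic toy comparison (the distributions, scenario multipliers and model variants are invented), not the paper's insulation case study:

```python
import random

random.seed(3)

# Hypothetical comparative LCA: impact difference between options A and B.
SCENARIOS = [0.9, 1.0, 1.1]                   # normative-choice multipliers (scenario uncertainty)
MODELS = [lambda x: x, lambda x: 1.05 * x]    # alternative model formulations (model uncertainty)

def one_draw():
    impact_a = random.gauss(10.0, 1.0)   # parameter uncertainty, option A
    impact_b = random.gauss(11.0, 1.5)   # parameter uncertainty, option B
    scen = random.choice(SCENARIOS)      # resample a decision scenario
    model = random.choice(MODELS)        # resample a model formulation
    # The same scenario and model apply to both options within one iteration.
    return model(scen * impact_a) - model(scen * impact_b)

diffs = sorted(one_draw() for _ in range(5000))
p_a_better = sum(1 for d in diffs if d < 0) / len(diffs)
print(f"P(option A has lower impact) = {p_a_better:.2f}")
```

    Comparing the spread of `diffs` with and without the scenario and model resampling lines shows how much of the total uncertainty each type contributes.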

  2. Uncertainty-accounting environmental policy and management of water systems.

    PubMed

    Baresel, Christian; Destouni, Georgia

    2007-05-15

    Environmental policies for water quality and ecosystem management do not commonly require explicit stochastic accounts of uncertainty and risk associated with the quantification and prediction of waterborne pollutant loads and abatement effects. In this study, we formulate and investigate a possible environmental policy that does require an explicit stochastic uncertainty account. We compare both the environmental and economic resource allocation performance of such an uncertainty-accounting environmental policy with that of deterministic, risk-prone and risk-averse environmental policies under a range of different hypothetical, yet still possible, scenarios. The comparison indicates that a stochastic uncertainty-accounting policy may perform better than deterministic policies over a range of different scenarios. Even in the absence of reliable site-specific data, reported literature values appear to be useful for such a stochastic account of uncertainty.
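
    The difference between a deterministic policy and one requiring an explicit stochastic account can be illustrated in a few lines: the same simulated pollutant loads can comply with a target on the expected load yet fail a 95% probability-of-compliance requirement. All numbers are invented for illustration:

```python
import random

random.seed(11)

TARGET = 100.0   # allowed pollutant load (hypothetical units)

# Simulated waterborne loads after an abatement measure with uncertain effect
loads = [random.gauss(95.0, 10.0) for _ in range(10_000)]

# Deterministic policy: compliant if the *expected* load meets the target.
deterministic_ok = sum(loads) / len(loads) <= TARGET

# Uncertainty-accounting policy: compliant only if the target is met
# with at least 95% probability.
p_compliant = sum(1 for x in loads if x <= TARGET) / len(loads)
stochastic_ok = p_compliant >= 0.95

print(deterministic_ok, round(p_compliant, 2), stochastic_ok)
```

    With these invented figures the measure passes the deterministic test but meets the target in only about 69% of realisations, so the risk-accounting policy would demand further abatement.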

  3. Towards a Treatment for Intolerance of Uncertainty in Young People with Autism Spectrum Disorder: Development of the Coping with Uncertainty in Everyday Situations (CUES©) Programme

    ERIC Educational Resources Information Center

    Rodgers, Jacqui; Hodgson, Anna; Shields, Kerry; Wright, Catharine; Honey, Emma; Freeston, Mark

    2017-01-01

    Intolerance of uncertainty (IU) is indicated as an important transdiagnostic process variable in a range of anxiety disorders. Anxiety is very common in children with autism spectrum disorders (ASD). This study aimed to develop a parent group based manualised treatment programme for young people with ASD, which focused on IU. An eight session…

  4. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  5. Effects of Uncertainty on ERPs to Emotional Pictures Depend on Emotional Valence

    PubMed Central

    Lin, Huiyan; Jin, Hua; Liang, Jiafeng; Yin, Ruru; Liu, Ting; Wang, Yiwen

    2015-01-01

    Uncertainty about the emotional content of an upcoming event has been found to modulate neural activity to the event before its occurrence. However, it is still under debate whether uncertainty effects also occur after the occurrence of the event. To address this issue, participants were asked to view emotional pictures presented shortly after a cue that either indicated the emotion of the upcoming picture or did not. Both certain and uncertain cues were presented as neutral symbols. The anticipatory phase (i.e., inter-trial interval, ITI) between the cue and the picture was short, to enhance the effects of uncertainty. In addition, we used positive and negative pictures that differed only in valence but not in arousal, to investigate whether the uncertainty effect was dependent on emotional valence. Electroencephalography (EEG) was recorded during the presentation of the pictures. Event-related potential (ERP) results showed that negative pictures evoked smaller P2 and late LPP but larger N2 components in the uncertain as compared to the certain condition, whereas no uncertainty effect was found in the early LPP. For positive pictures, the early LPP was larger in the uncertain as compared to the certain condition, but there were no uncertainty effects in other ERP components (e.g., P2, N2, and late LPP). The findings suggest that uncertainty modulates neural activity to emotional pictures and that this modulation is altered by the valence of the pictures, indicating that individuals alter the allocation of attentional resources toward uncertain emotional pictures depending on picture valence. PMID:26733916

  6. Inequality, green spaces, and pregnant women: roles of ethnicity and individual and neighbourhood socioeconomic status.

    PubMed

    Dadvand, Payam; Wright, John; Martinez, David; Basagaña, Xavier; McEachan, Rosemary R C; Cirach, Marta; Gidlow, Christopher J; de Hoogh, Kees; Gražulevičienė, Regina; Nieuwenhuijsen, Mark J

    2014-10-01

    Evidence of the impact of green spaces on pregnancy outcomes is limited with no report on how this impact might vary by ethnicity. We investigated the association between residential surrounding greenness and proximity to green spaces and birth weight and explored the modification of this association by ethnicity and indicators of individual (maternal education) and neighbourhood (Index of Multiple Deprivation) socioeconomic status. Our study was based on 10,780 singleton live-births from the Born in Bradford cohort, UK (2007-2010). We defined residential surrounding greenness as average of satellite-based Normalized Difference Vegetation Index (NDVI) in buffers of 50 m, 100 m, 250 m, 500 m and 1000 m around each maternal home address. Residential proximity to green spaces was defined as living within 300 m of a green space with an area of ≥ 5000 m(2). We utilized mixed effects models to estimate adjusted change in birth weight associated with residential surrounding greenness as well as proximity to green spaces. We found a positive association between birth weight and residential surrounding greenness. Furthermore, we observed an interaction between ethnicity and residential surrounding greenness in that for White British participants there was a positive association between birth weight and residential surrounding greenness whereas for participants of Pakistani origin there was no such an association. For surrounding greenness in larger buffers (500 m and 1000 m) there were some indications of stronger associations for participants with lower education and those living in more deprived neighbourhoods which were not replicated for surrounding greenness in smaller buffer sizes (i.e. 50 m, 100 m, and 250 m). The findings for residential proximity to a green space were not conclusive. Our study showed that residential surrounding greenness is associated with better foetal growth and this association could vary between different ethnic and socioeconomic groups. 
Copyright © 2014 Elsevier Ltd. All rights reserved.
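The buffer-averaging step described in this record can be sketched in a few lines. This is a toy illustration on a synthetic grid; the study itself used satellite NDVI rasters and GIS buffers around geocoded addresses, and the function name and 30 m cell size here are assumptions for illustration only.

```python
import numpy as np

def buffer_mean_ndvi(ndvi, row, col, radius_m, cell_size_m=30.0):
    """Mean NDVI within a circular buffer of radius_m metres around cell (row, col)."""
    rows, cols = np.indices(ndvi.shape)
    dist_m = np.hypot(rows - row, cols - col) * cell_size_m  # distance of every cell to the address
    return float(ndvi[dist_m <= radius_m].mean())

# Synthetic 100 x 100 NDVI grid at 30 m resolution; buffer radii match the study's
rng = np.random.default_rng(0)
grid = rng.uniform(-0.1, 0.8, size=(100, 100))
means = {r: buffer_mean_ndvi(grid, 50, 50, r) for r in (50, 100, 250, 500, 1000)}
```

Each buffer radius yields one exposure value per address, which would then enter the mixed effects model as a predictor of birth weight.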

  7. Considerations in evaluating potential socioeconomic impacts of offshore platform decommissioning in California.

    PubMed

    Kruse, Sarah A; Bernstein, Brock; Scholz, Astrid J

    2015-10-01

    The 27 oil and gas platforms offshore southern California will eventually reach the end of their useful lifetimes (estimated between 2015 and 2030) and will be decommissioned. Current state and federal laws and regulations allow for alternative uses in lieu of the complete removal required in existing leases. Any decommissioning pathway will create a complex mix of costs, benefits, opportunities, and constraints for multiple user groups. To assist the California Natural Resources Agency in understanding these issues, we evaluated the potential socioeconomic impacts of the 2 most likely options: complete removal and partial removal of the structure to 85 feet below the waterline with the remaining structure left in place as an artificial reef-generally defined as a manmade structure with some properties that mimic a natural reef. We estimated impacts on commercial fishing, commercial shipping, recreational fishing, nonconsumptive boating, and nonconsumptive SCUBA diving. Available data supported quantitative estimates for some impacts, semiquantitative estimates for others, and only qualitative approximations of the direction of impact for still others. Even qualitative estimates of the direction of impacts and of user groups' likely preferred options have been useful to the public and decision makers and provided valuable input to the project's integrative decision model. Uncertainty surrounds even qualitative estimates of the likely direction of impact where interactions between multiple impacts could occur or where user groups include subsets that would experience the same option differently. In addition, we were unable to quantify effects on ecosystem value and on the larger regional ecosystem, because of data gaps on the population sizes and dynamics of key species and the uncertainty surrounding the contribution of platforms to available hard substrate and related natural populations offshore southern California. © 2015 SETAC.

  8. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  9. How well does your model capture the terrestrial ecosystem dynamics of the Arctic-Boreal Region?

    NASA Astrophysics Data System (ADS)

    Stofferahn, E.; Fisher, J. B.; Hayes, D. J.; Huntzinger, D. N.; Schwalm, C.

    2016-12-01

    The Arctic-Boreal Region (ABR) is a major source of uncertainty for terrestrial biosphere model (TBM) simulations. These uncertainties stem from a lack of observational data from the region, which affects the parameterization of cold-environment processes in the models. Addressing them requires a coordinated effort to collect and integrate data on the following key indicators of the ABR ecosystem: disturbance, flora/fauna and related ecosystem function, carbon pools and biogeochemistry, permafrost, and hydrology. We are developing a model-data integration framework for NASA's Arctic Boreal Vulnerability Experiment (ABoVE), wherein data collection is driven by matching observations and model outputs to the key ABoVE indicators. The data are used as reference datasets for a benchmarking system that evaluates TBM performance with respect to ABR processes. The benchmarking system uses performance metrics to identify intra-model and inter-model strengths and weaknesses, which in turn provides guidance to model development teams for reducing uncertainties in TBM simulations of the ABR. The system is directly connected to the International Land Model Benchmarking (ILAMB) system as an ABR-focused application.
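A minimal sketch of the kind of skill metric such a benchmarking system computes is shown below. The real ILAMB scoring methodology is considerably more elaborate; the function, the exp(-normalized RMSE) score, and the synthetic flux series are illustrative assumptions, not the ABoVE system's actual metrics.

```python
import numpy as np

def benchmark_scores(model, reference):
    """Bias, RMSE, and a normalized [0, 1] skill score of model output vs. reference data."""
    bias = float(np.mean(model - reference))
    rmse = float(np.sqrt(np.mean((model - reference) ** 2)))
    # Map error onto [0, 1]: 1 = perfect, decaying toward 0 as error grows
    score = float(np.exp(-rmse / np.std(reference)))
    return bias, rmse, score

# Synthetic monthly carbon-flux series: a "model" with bias and noise vs. a reference
t = np.linspace(0, 2 * np.pi, 24)
reference = 2.0 * np.sin(t)
model = 2.0 * np.sin(t) + 0.3 + 0.1 * np.cos(3 * t)
bias, rmse, score = benchmark_scores(model, reference)
```

Comparing such scores across models and across the ABR indicator variables is what exposes intra-model and inter-model strengths and weaknesses.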

  10. Combining Satellite Ocean Color and Hydrodynamic Model Uncertainties in Bio-Optical Forecasts

    DTIC Science & Technology

    2014-04-03

    observed chlorophyll distribution for that day (MODIS Image for October 17, 2011), without regard to sign, i.e., |Figs. 11(c)-11(a)|. Black pixels indicate...time using the current field from the model. Uncertainties in both the satellite chlorophyll values and the currents from the circulation model impact...ensemble techniques to partition the chlorophyll uncertainties into components due to atmospheric correction and bio-optical inversion. By combining

  11. Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens

    PubMed Central

    Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.

    2012-01-01

    This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. 
An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
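The input-output correlation step of such a Monte Carlo uncertainty analysis can be sketched as follows. All parameter distributions and the toy fomite-exposure model below are hypothetical stand-ins, not the paper's values; the point is ranking input importance by rank correlation between each uncertain input and the output risk.

```python
import numpy as np

def rank_corr(x, y):
    """Spearman-style rank correlation, computed without SciPy."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(42)
n = 5000  # Monte Carlo draws

# Hypothetical uncertain inputs for a surface-contact exposure pathway
decay_rate = rng.lognormal(np.log(0.05), 0.5, n)   # 1/h, environmental attenuation
transfer_eff = rng.uniform(0.05, 0.5, n)           # surface-to-hand transfer efficiency
k_dose = rng.lognormal(np.log(1e-4), 1.0, n)       # exponential dose-response parameter

c0 = 1e4                                           # initial surface loading (illustrative)
dose = c0 * np.exp(-decay_rate * 24) * transfer_eff  # 24 h of decay, then transfer
risk = 1.0 - np.exp(-k_dose * dose)                  # exponential dose-response

# Input-output correlations identify which parameter uncertainties matter most
importance = {name: rank_corr(x, risk)
              for name, x in [("decay_rate", decay_rate),
                              ("transfer_eff", transfer_eff),
                              ("k_dose", k_dose)]}
```

Inputs with correlations near zero contribute little to output uncertainty; strongly correlated inputs are the candidates for future research investment.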

  12. Evaluating uncertainty in predicting spatially variable representative elementary scales in fractured aquifers, with application to Turkey Creek Basin, Colorado

    USGS Publications Warehouse

    Wellman, Tristan P.; Poeter, Eileen P.

    2006-01-01

    Computational limitations and sparse field data often mandate use of continuum representation for modeling hydrologic processes in large‐scale fractured aquifers. Selecting appropriate element size is of primary importance because continuum approximation is not valid for all scales. The traditional approach is to select elements by identifying a single representative elementary scale (RES) for the region of interest. Recent advances indicate RES may be spatially variable, prompting unanswered questions regarding the ability of sparse data to spatially resolve continuum equivalents in fractured aquifers. We address this uncertainty of estimating RES using two techniques. In one technique we employ data‐conditioned realizations generated by sequential Gaussian simulation. For the other we develop a new approach using conditioned random walks and nonparametric bootstrapping (CRWN). We evaluate the effectiveness of each method under three fracture densities, three data sets, and two groups of RES analysis parameters. In sum, 18 separate RES analyses are evaluated, which indicate RES magnitudes may be reasonably bounded using uncertainty analysis, even for limited data sets and complex fracture structure. In addition, we conduct a field study to estimate RES magnitudes and resulting uncertainty for Turkey Creek Basin, a crystalline fractured rock aquifer located 30 km southwest of Denver, Colorado. Analyses indicate RES does not correlate to rock type or local relief in several instances but is generally lower within incised creek valleys and higher along mountain fronts. Results of this study suggest that (1) CRWN is an effective and computationally efficient method to estimate uncertainty, (2) RES predictions are well constrained using uncertainty analysis, and (3) for aquifers such as Turkey Creek Basin, spatial variability of RES is significant and complex.
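The nonparametric bootstrap used in the CRWN approach can be illustrated in miniature on synthetic data. The percentile interval below is one of several bootstrap CI variants, and the lognormal "hydraulic property" sample is an invented stand-in for sparse fracture data.

```python
import numpy as np

def bootstrap_ci(data, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic of a sparse sample."""
    rng = np.random.default_rng(seed)
    reps = np.array([stat(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])          # resample with replacement
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return float(lo), float(hi)

# e.g. 25 property measurements from a sparse fracture data set (synthetic)
rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.0, sigma=1.0, size=25)
lo, hi = bootstrap_ci(samples)
```

Because only the observed data are resampled, no distributional assumption is imposed, which is why this style of uncertainty analysis suits limited data sets and complex fracture structure.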

  13. A Rotating Coil Apparatus with Sub-Micrometer Magnetic Center Measurement Stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Cherrill M.; Anderson, Scott, D.; Jensen, David R.

    2005-12-02

    A rotating double-coil apparatus has been designed and built so that the relative magnetic center change of a quadrupole is measured to an uncertainty smaller than 0.02 micrometers (µm) for a single measurement. Furthermore, repeated measurements over about an hour vary by less than 0.1 µm, and by less than 1 µm for periods of 24 hrs or longer. Correlation analyses of long data runs show that the magnet center measurement is sensitive to mechanical effects, such as vibration and rotating-part wear, as well as to environmental effects, such as temperature and relative humidity. Evolving apparatus design has minimized mechanical noise, and environmental isolation has reduced the effects of the surrounding environment, so that sub-micron measurement uncertainties and micron-level stability have been achieved for multi-day measurement periods. The evolution of the apparatus design is described in detail, and correlation data taken on water-cooled electromagnet and adjustable permanent-magnet quadrupoles, about 350 mm in overall length, are shown. These quads were prototypes for the linac quads of the Next Linear Collider (NLC), which had to meet the requirement that their magnetic centers change by less than 1 micron during a 20% change in field strength. It was therefore necessary to develop an apparatus that could track the magnetic center with a fraction-of-a-micron uncertainty.

  14. Validation of UARS MLS ClO Measurements

    NASA Technical Reports Server (NTRS)

    Waters, J.; Read, W.; Froidevaux, L.; Lungu, T.; Perun, V.; Stachnik, R.; Jarnot, R.; Cofield, R.; Fishbein, E.; Flower, D.; et al.

    1994-01-01

    This paper describes the validation of stratospheric ClO measurements by the MLS on the UARS. The comparisons done to date between MLS and other measurements of ClO indicate general agreement to within the estimated MLS uncertainties and the uncertainties of the comparative measurements.

  15. Addressing potential local adaptation in species distribution models: implications for conservation under climate change

    USGS Publications Warehouse

    Hällfors, Maria Helena; Liao, Jishan; Dzurisin, Jason D. K.; Grundel, Ralph; Hyvärinen, Marko; Towle, Kevin; Wu, Grace C.; Hellmann, Jessica J.

    2016-01-01

    Species distribution models (SDMs) have been criticized for involving assumptions that ignore or categorize many ecologically relevant factors such as dispersal ability and biotic interactions. Another potential source of model error is the assumption that species are ecologically uniform in their climatic tolerances across their range. Typically, SDMs treat a species as a single entity, although populations of many species differ due to local adaptation or other genetic differentiation. Not taking local adaptation into account may lead to incorrect range predictions and therefore misplaced conservation efforts. A constraint, however, is that we often do not know the degree to which populations are locally adapted. Lacking experimental evidence, we can still evaluate niche differentiation within a species' range to promote better conservation decisions. We explore possible conservation implications of making type I or type II errors in this context. For each of two species, we construct three separate MaxEnt models: one considering the species as a single population, and two considering disjunct populations separately. PCA analyses and response curves indicate different climate characteristics in the current environments of the populations. Model projections into future climates indicate minimal overlap between areas predicted to be climatically suitable by the whole-species versus population-based models. We present a workflow for addressing uncertainty surrounding local adaptation in SDM application and illustrate the value of conducting population-based models to compare with whole-species models. These comparisons might result in more cautious management actions when alternative range outcomes are considered.

  16. High-resolution tephrochronology of the Wilson Creek Formation (Mono Lake, California) and Laschamp event using 238U-230Th SIMS dating of accessory mineral rims

    NASA Astrophysics Data System (ADS)

    Vazquez, Jorge A.; Lidzbarski, Marsha I.

    2012-12-01

    Sediments of the Wilson Creek Formation surrounding Mono Lake preserve a high-resolution archive of glacial and pluvial responses along the eastern Sierra Nevada due to late Pleistocene climate change. An absolute chronology for the Wilson Creek stratigraphy is critical for correlating the paleoclimate record to other archives in the western U.S. and the North Atlantic region. However, multiple attempts to date the Wilson Creek stratigraphy using carbonates and tephras yield discordant results due to open-system effects and radiocarbon reservoir uncertainties as well as abundant xenocrysts. New ion microprobe 238U-230Th dating of the final increments of crystallization recorded by allanite and zircon autocrysts from juvenile pyroclasts yield ages that effectively date eruption of key tephra beds and delimit the timing of basal Wilson Creek sedimentation to the interval between 26.8±2.1 and 61.7±1.9 ka. Tephra (Ash 15) erupted during the geomagnetic excursion originally designated the Mono Lake excursion yields an age of 40.8±1.9 ka, indicating that the event is instead the Laschamp excursion. The new ages support a depositional chronology from magnetostratigraphy that indicates quasi-synchronous glacial and hydrologic responses in the Sierra Nevada and Mono Basin to regional climate change, with intervals of lake filling and glacial-snowpack melting that are in phase with peaks in spring insolation.

  17. High-resolution tephrochronology of the Wilson Creek Formation (Mono Lake, California) and Laschamp event using 238U-230Th SIMS dating of accessory mineral rims

    USGS Publications Warehouse

    Vazquez, Jorge A.; Lidzbarski, Marsha I.

    2012-01-01

    Sediments of the Wilson Creek Formation surrounding Mono Lake preserve a high-resolution archive of glacial and pluvial responses along the eastern Sierra Nevada due to late Pleistocene climate change. An absolute chronology for the Wilson Creek stratigraphy is critical for correlating the paleoclimate record to other archives in the western U.S. and the North Atlantic region. However, multiple attempts to date the Wilson Creek stratigraphy using carbonates and tephras yield discordant results due to open-system effects and radiocarbon reservoir uncertainties as well as abundant xenocrysts. New ion microprobe 238U-230Th dating of the final increments of crystallization recorded by allanite and zircon autocrysts from juvenile pyroclasts yield ages that effectively date eruption of key tephra beds and delimit the timing of basal Wilson Creek sedimentation to the interval between 26.8±2.1 and 61.7±1.9 ka. Tephra (Ash 15) erupted during the geomagnetic excursion originally designated the Mono Lake excursion yields an age of 40.8±1.9 ka, indicating that the event is instead the Laschamp excursion. The new ages support a depositional chronology from magnetostratigraphy that indicates quasi-synchronous glacial and hydrologic responses in the Sierra Nevada and Mono Basin to regional climate change, with intervals of lake filling and glacial-snowpack melting that are in phase with peaks in spring insolation.

  18. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. 
The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
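The core event-tree arithmetic, multiplying conditional probabilities along a branch while carrying epistemic uncertainty as distributions over the node probabilities, can be sketched as follows. The nodes, Beta parameters, and threshold are invented for illustration and are not BET_VH_ST's actual structure or values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000  # epistemic samples

# Hypothetical event-tree nodes, each conditional on the previous;
# Beta distributions represent epistemic uncertainty about each node probability
p_unrest_to_eruption = rng.beta(2, 8, n)   # P(eruption | unrest)
p_vent_in_sector     = rng.beta(3, 7, n)   # P(vent opens in this sector | eruption)
p_load_exceeds       = rng.beta(1, 9, n)   # P(tephra load > threshold | vent, eruption)

# Absolute exceedance probability = product of conditionals along the branch
p_exceed = p_unrest_to_eruption * p_vent_in_sector * p_load_exceeds

mean_p = float(p_exceed.mean())
p10, p90 = np.quantile(p_exceed, [0.1, 0.9])  # epistemic spread, not just a point value
```

Reporting the distribution of `p_exceed` rather than a single number is what lets the hazard assessment separate the natural (aleatory) chance of exceedance from how well that chance is known.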

  20. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling these errors, because deposit models are the best-known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessments, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed at the surface; these will need to be relearned from covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. 
Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Grade and tonnage models and development of quantitative descriptive, economic, and deposit density models will help reduce the uncertainty of these new assessments.

  1. Quantum Probability -- A New Direction for Modeling in Cognitive Science

    NASA Astrophysics Data System (ADS)

    Roy, Sisir

    2014-07-01

    Human cognition, and how to model it appropriately, remains a puzzling research issue. Cognition depends on how the brain behaves at a particular instant, identifying and responding to a signal among the myriad noises present in the surroundings (external noise) as well as in the neurons themselves (internal noise). It is therefore not surprising to assume that this functionality involves various uncertainties, possibly a mixture of aleatory and epistemic uncertainties. It is also possible that a complicated pathway involving both types of uncertainty in continuum plays a major role in human cognition. For more than 200 years, mathematicians and philosophers have been using probability theory to describe human cognition. Recently, in several experiments with human subjects, violations of traditional probability theory have been clearly revealed in many cases. The literature clearly suggests that classical probability theory fails to model human cognition beyond a certain limit. While the Bayesian approach may seem a promising candidate for this problem, the complete success story of Bayesian methodology is yet to be written. The major problem seems to be the presence of epistemic uncertainty and its effect on cognition at any given time. Moreover, stochasticity in the model arises from the unknown path or trajectory (the definite state of mind at each time point) that a person is following. To this end, a generalized version of probability theory borrowing ideas from quantum mechanics may be a plausible approach. A superposition state in quantum theory permits a person to be in an indefinite state at each point in time. Such an indefinite state allows all the states the potential to be expressed at each moment. Thus a superposition state appears better able to represent the uncertainty, ambiguity or conflict experienced by a person at any moment, suggesting that mental states follow quantum mechanics during perception and cognition of ambiguous figures.
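The interference effect that distinguishes quantum from classical probability can be shown in a few lines. The amplitudes and relative phase below are arbitrary illustrative values, not parameters from any cognition experiment.

```python
import numpy as np

# Two "paths" to a decision D, via intermediate states A and B, carried as
# complex amplitudes. Classical probability requires
#   P(D) = P(D via A) + P(D via B),
# but squaring a summed amplitude adds an interference cross-term.
amp_A = 0.6 * np.exp(1j * 0.0)           # amplitude through state A
amp_B = 0.8 * np.exp(1j * np.pi / 3)     # amplitude through state B (relative phase pi/3)

p_classical = abs(amp_A) ** 2 + abs(amp_B) ** 2   # 0.36 + 0.64 = 1.0
p_quantum = abs(amp_A + amp_B) ** 2               # includes 2*Re(amp_A * conj(amp_B))
interference = p_quantum - p_classical            # 2*0.6*0.8*cos(pi/3) = 0.48
```

The nonzero `interference` term is the formal mechanism by which quantum models accommodate the violations of the classical law of total probability reported in human-judgment experiments.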

  2. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE PAGES

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.; ...

    2014-12-09

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from the thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m³ PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperature at key locations is monitored, as well as the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model can generally predict the temperature at given locations in the system, it is less successful at predicting the pressure response, and the two approaches for propagating uncertainty agree with each other. The importance of each input parameter for the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters, while for the pressure response the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.
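Latin hypercube sampling itself is easy to sketch in NumPy: divide each input's range into n equal-probability strata and draw exactly one point per stratum per dimension, shuffling the strata independently. The parameter names and ±10% ranges below are illustrative stand-ins, not the study's actual inputs.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample on [0, 1)^d: one point per stratum in each dimension."""
    rng = np.random.default_rng(rng)
    # Jittered stratum positions: row k lands in [k/n, (k+1)/n)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])  # decorrelate strata across dimensions
    return u

# e.g. 50 samples over 3 uncertain foam parameters, scaled to nominal +/- 10%
nominal = np.array([0.03, 45.0, 1500.0])  # conductivity, activation energy, specific heat (illustrative)
u = latin_hypercube(50, 3, rng=0)
samples = nominal * (0.9 + 0.2 * u)
```

Compared with plain Monte Carlo, the stratification guarantees full coverage of each parameter's range with far fewer model runs, which is why LHS is the usual choice when each run is an expensive finite element simulation.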

  3. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lark, Murray

    2015-04-01

    At BGS, expert elicitation has been used to evaluate the uncertainty of surveyed boundaries in several common geological scenarios. As a result, a 'collective' understanding of the issues surrounding each scenario has emerged. The work has provoked wider debate in three key areas: a) what can we do to resolve those scenarios where a 'consensus' of understanding cannot be achieved; b) what does it mean for survey practices and the subsequent use of maps in 3D models; and c) how do we communicate the 'collective' understanding of geological mapping (with or without consensus for specific scenarios). Previous work elicited expert judgement of uncertainty in six contrasting mapping scenarios. In five cases it was possible to arrive at a consensus model; in a sixth case, experts with different experience (length of service, academic background) took very different views of the nature of the mapping problem. The scenario concerned identification of the boundary between two contrasting tills (one, derived from Triassic source materials, red in colour; the other, derived from Jurassic materials, grey in colour). Initial debate during the elicitation identified that the colour contrast should provide some degree of confidence in locating the boundary via traditional auger-traverse survey methods. However, as the elicitation progressed, it became clear that the complexities of the relationship between the two tills were not uniformly understood across the experts, and the panel could not agree a consensus regarding the spatial uncertainty of the boundary. The elicitation process allowed a significant degree of structured knowledge exchange between experts of differing backgrounds and was successful in identifying a measure of uncertainty for what was considered a contentious scenario. However, the findings have significant implications for a boundary scenario that is widely mapped across the central regions of Great Britain. 
We will discuss our experience of the use of elicitation methodology and the implications of our results for further work at the BGS to quantify uncertainty in 2d and 3d products. In particular we will consider the impacts of surveyor 'experience' in how the elicitation process works.

  4. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m3 PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperature at key locations is monitored, as well as the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model generally can predict the temperature at given locations in the system, it is less successful at predicting the pressure response. The two approaches for propagating uncertainty also agree with each other. The importance of each input parameter on the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters. For the pressure response, the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.
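
    The contrast between the two propagation approaches can be sketched on a toy response. The snippet below is a minimal illustration, not the paper's foam model: the `pressure` function, its two inputs, and their uniform bounds are invented stand-ins. The mean value (MV) method uses a first-order Taylor expansion about the input means, while Latin hypercube sampling (LHS) propagates stratified samples through the model.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    # One sample per equal-probability stratum in each dimension,
    # with strata shuffled independently across dimensions.
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

# Hypothetical stand-in for the foam simulation: a smooth response in two
# uncertain inputs (an effective conductivity k and an activation energy E).
def pressure(k, E):
    return 100.0 * k + 0.002 * E**2

bounds = np.array([[0.1, 0.3],     # k, invented uniform bounds
                   [50.0, 70.0]])  # E, invented uniform bounds

# Mean value (MV) method: first-order Taylor expansion at the input means.
mu = bounds.mean(axis=1)
var_in = (bounds[:, 1] - bounds[:, 0]) ** 2 / 12.0  # variance of U(a, b)
h = 1e-6
grad = np.array([
    (pressure(mu[0] + h, mu[1]) - pressure(mu[0] - h, mu[1])) / (2 * h),
    (pressure(mu[0], mu[1] + h) - pressure(mu[0], mu[1] - h)) / (2 * h),
])
mv_mean = pressure(*mu)
mv_sd = np.sqrt(np.sum(grad**2 * var_in))

# LHS: map stratified uniform samples into the bounds and run the "model".
u = latin_hypercube(2000, 2, rng)
x = bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])
y = pressure(x[:, 0], x[:, 1])
lhs_mean, lhs_sd = y.mean(), y.std(ddof=1)

print(f"MV : mean={mv_mean:.2f} sd={mv_sd:.2f}")
print(f"LHS: mean={lhs_mean:.2f} sd={lhs_sd:.2f}")
```

    For this nearly linear toy response the two estimates agree closely, mirroring the paper's finding that MV and LHS results agree for the full model.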

  5. First results in terrain mapping for a roving planetary explorer

    NASA Technical Reports Server (NTRS)

    Krotkov, E.; Caillas, C.; Hebert, M.; Kweon, I. S.; Kanade, Takeo

    1989-01-01

    To perform planetary exploration without human supervision, a complete autonomous rover must be able to model its environment while exploring its surroundings. Researchers present a new algorithm to construct a geometric terrain representation from a single range image. The form of the representation is an elevation map that includes uncertainty, unknown areas, and local features. By virtue of working in spherical-polar space, the algorithm is independent of the desired map resolution and the orientation of the sensor, unlike other algorithms that work in Cartesian space. They also describe new methods to evaluate regions of the constructed elevation maps to support legged locomotion over rough terrain.
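
    The bookkeeping behind an elevation map with explicit unknown areas can be sketched with a naive Cartesian-binning baseline (the kind of approach the spherical-polar algorithm improves upon, since the real method is independent of map resolution). All measurements and grid values below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic range image in spherical-polar coordinates
# (azimuth, elevation angle, range); values are invented.
az = rng.uniform(-np.pi / 4, np.pi / 4, 5000)
el = rng.uniform(-np.pi / 3, -np.pi / 6, 5000)   # sensor looking down
r = rng.uniform(2.0, 10.0, 5000)                 # metres

# Spherical-polar -> Cartesian (sensor at origin, z up).
x = r * np.cos(el) * np.cos(az)
y = r * np.cos(el) * np.sin(az)
z = r * np.sin(el)

# Grid the ground plane; cells never hit by a ray stay NaN ("unknown areas").
res = 0.5  # map resolution, metres (a choice the spherical method avoids)
ix = ((x - x.min()) / res).astype(int)
iy = ((y - y.min()) / res).astype(int)
elev = np.full((ix.max() + 1, iy.max() + 1), np.nan)
count = np.zeros_like(elev)
# Keep the highest z per cell as the surface estimate (one simple choice).
for i, j, h in zip(ix, iy, z):
    elev[i, j] = h if np.isnan(elev[i, j]) else max(elev[i, j], h)
    count[i, j] += 1

known = ~np.isnan(elev)
print(f"{known.mean():.0%} of cells observed; the rest are marked unknown")
```

    Per-cell hit counts (`count`) are a crude proxy for the uncertainty layer the paper attaches to its elevation map.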

  6. Surveillance and medical therapy following endovascular treatment of chronic cerebrospinal venous insufficiency.

    PubMed

    Forbes, Thomas L; Harris, Jeremy R; Kribs, Stewart W

    2012-06-01

    The debate regarding the possible link between chronic cerebrospinal venous insufficiency and multiple sclerosis (MS) has become increasingly contentious due to the current lack of level 1 evidence from randomized trials. Regardless of this continued uncertainty surrounding the safety and efficacy of this therapy, MS patients from Canada and other jurisdictions are traveling abroad to receive central venous angioplasty and, unfortunately, some also receive venous stents. They often return home with few instructions regarding follow-up or medical therapy. In response, we propose some interim, practical recommendations for post-procedural surveillance and medical therapy, until further information is available.

  7. In the name of science: don't tamper with the deceptive truth...

    PubMed

    Reis, Helton J; Mukhamedyarov, Marat A; Rizvanov, Albert A; Palotás, András

    2009-12-01

    Werner Heisenberg (1901-1976) is one of the most controversial, most ambivalent and most important figures in the history of modern science. The debate surrounding him with respect to nuclear weapons and National Socialism appears unending. Even though Heisenberg's uncertainty principle of the quantum system and his involvement in the Nazi atomic bomb project have been thoroughly discussed in various journals over the past decades, no communication has ever offered a holistic account of his greatest, Nobel-prize-winning achievement in theoretical physics. To fill this gap, this piece explicitly communicates Heisenberg's paradox at all levels of science.

  8. Carpal tunnel syndrome: the role of occupational factors.

    PubMed

    Palmer, Keith T

    2011-02-01

    Carpal tunnel syndrome (CTS) is a fairly common condition in working-aged people, sometimes caused by physical occupational activities, such as repeated and forceful movements of the hand and wrist or use of hand-held, powered, vibratory tools. Symptoms may be prevented or alleviated by primary control measures at work, and some cases of disease are compensable. Following a general description of the disorder, its epidemiology and some of the difficulties surrounding diagnosis, this review focusses on the role of occupational factors in causation of CTS and factors that can mitigate risk. Areas of uncertainty, debate and research interest are emphasised where relevant. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Orbital Debris Shape and Orientation Effects on Ballistic Limits

    NASA Technical Reports Server (NTRS)

    Evans, Steven W.; Williamsen, Joel E.

    2005-01-01

    The SPHC hydrodynamic code was used to evaluate the effects of orbital debris particle shape and orientation on penetration of a typical spacecraft dual-wall shield. Impacts were simulated at near-normal obliquity at 12 km/sec. Debris cloud characteristics and damage potential are compared with those from impacts by spherical projectiles. Results of these simulations indicate the uncertainties in the predicted ballistic limits due to modeling uncertainty and to uncertainty in the impactor orientation.

  10. When Advisors' True Intentions Are in Question. How Do Bank Customers Cope with Uncertainty in Financial Consultancies?

    PubMed

    Mackinger, Barbara; Jonas, Eva; Mühlberger, Christina

    2017-01-01

    When making financial decisions bank customers are confronted with two types of uncertainty: first, return on investments is uncertain and there is a risk of losing money. Second, customers cannot be certain about their financial advisor's true intentions. This might decrease customers' willingness to cooperate with advisors. However, the uncertainty management model and fairness heuristic theory predict that in uncertain situations customers are willing to cooperate with financial advisors when they perceive fairness. In the current study, we investigated how perceived fairness in the twofold uncertain situations increased people's intended future cooperation with an advisor. We asked customers of financial consultancies about their experienced uncertainty regarding both the investment decision and the advisor's intentions. Moreover, we asked them about their perceived fairness, as well as their intention to cooperate with the advisor in the future. A three-way moderation analysis showed that customers who faced high uncertainty regarding the investment decision and high uncertainty regarding the advisor's true intentions indicated the lowest intended cooperation with the advisor but high fairness increased their cooperation. Interestingly, when people were only uncertain about the advisor's intentions (but certain about the decision) they indicated less cooperation than when they were only uncertain about the decision (but certain about the advisor's intentions). A mediated moderation analysis revealed that this relationship was explained by customers' lower trust in their advisors.

  11. Uncertainties in models of tropospheric ozone based on Monte Carlo analysis: Tropospheric ozone burdens, atmospheric lifetimes and surface distributions

    NASA Astrophysics Data System (ADS)

    Derwent, Richard G.; Parrish, David D.; Galbally, Ian E.; Stevenson, David S.; Doherty, Ruth M.; Naik, Vaishali; Young, Paul J.

    2018-05-01

    Recognising that global tropospheric ozone models have many uncertain input parameters, an attempt has been made to employ Monte Carlo sampling to quantify the uncertainties in model output that arise from global tropospheric ozone precursor emissions and from ozone production and destruction in a global Lagrangian chemistry-transport model. Ninety-eight quasi-randomly sampled Monte Carlo model runs were completed and the uncertainties were quantified in tropospheric burdens and lifetimes of ozone, carbon monoxide and methane, together with the surface distribution and seasonal cycle in ozone. The results have shown a satisfactory degree of convergence and provide a first estimate of the likely uncertainties in tropospheric ozone model outputs. There are likely to be diminishing returns in carrying out many more Monte Carlo runs in order to refine these outputs further. Uncertainties due to model formulation were separately addressed using the results from 14 Atmospheric Chemistry Coupled Climate Model Intercomparison Project (ACCMIP) chemistry-climate models. The 95% confidence ranges surrounding the ACCMIP model burdens and lifetimes for ozone, carbon monoxide and methane were somewhat smaller than for the Monte Carlo estimates. This reflected the situation where the ACCMIP models used harmonised emissions data and differed only in their meteorological data and model formulations, whereas a conscious effort was made to describe the uncertainties in the ozone precursor emissions and in the kinetic and photochemical data in the Monte Carlo runs. Attention was focussed on the model predictions of the ozone seasonal cycles at three marine boundary layer stations: Mace Head, Ireland; Trinidad Head, California; and Cape Grim, Tasmania. Despite comprehensively addressing the uncertainties due to global emissions and ozone sources and sinks, none of the Monte Carlo runs were able to generate seasonal cycles that matched the observations at all three MBL stations.
Although the observed seasonal cycles were found to fall within the confidence limits of the ACCMIP members, this was because the model seasonal cycles spanned extremely wide ranges and there was no single ACCMIP member that performed best for each station. Further work is required to examine the parameterisation of convective mixing in the models to see if this erodes the isolation of the marine boundary layer from the free troposphere and thus hides the models' real ability to reproduce ozone seasonal cycles over marine stations.
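
    The Monte Carlo machinery itself is simple to illustrate. The sketch below is a toy steady-state box model, not the Lagrangian chemistry-transport model: the lognormal uncertainty factors and the nominal production and loss numbers are invented. Only the pattern is shared: sample uncertain inputs, run the model once per draw, and read confidence ranges off the percentiles of the outputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 98  # same order as the study's 98 Monte Carlo runs

# Invented fractional uncertainties on precursor emissions and chemical loss.
emis_scale = rng.lognormal(mean=0.0, sigma=0.3, size=n_runs)
loss_scale = rng.lognormal(mean=0.0, sigma=0.2, size=n_runs)

base_production = 5000.0  # nominal ozone production, Tg/yr (illustrative)
base_loss_rate = 15.0     # nominal loss frequency, 1/yr (illustrative)

# Steady-state toy model: burden = production / loss, lifetime = 1 / loss.
burden = base_production * emis_scale / (base_loss_rate * loss_scale)  # Tg
lifetime = 365.25 / (base_loss_rate * loss_scale)                      # days

lo, hi = np.percentile(burden, [2.5, 97.5])
print(f"burden 95% range: {lo:.0f}-{hi:.0f} Tg, median {np.median(burden):.0f} Tg")
print(f"median lifetime: {np.median(lifetime):.1f} days")
```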

  12. Estimating the risks for adverse effects of total phosphorus in receiving streams with the Stochastic Empirical Loading and Dilution Model (SELDM)

    USGS Publications Warehouse

    Granato, Gregory E.; Jones, Susan C.

    2015-01-01

    Results of this study indicate the potential benefits of the multi-decade simulations that SELDM provides because these simulations quantify risks and uncertainties that affect decisions made with available data and statistics. Results of the SELDM simulations indicate that the WQABI criteria concentrations may be too stringent for evaluating the stormwater quality in receiving streams, highway runoff, and BMP discharges, especially with the substantial uncertainties inherent in selecting representative data.

  13. Adaptive management for soil ecosystem services.

    PubMed

    Birgé, Hannah E; Bevans, Rebecca A; Allen, Craig R; Angeler, David G; Baer, Sara G; Wall, Diana H

    2016-12-01

    Ecosystem services provided by soil include regulation of the atmosphere and climate, primary (including agricultural) production, waste processing, decomposition, nutrient conservation, water purification, erosion control, medical resources, pest control, and disease mitigation. The simultaneous production of these multiple services arises from complex interactions among diverse aboveground and belowground communities across multiple scales. When a system is mismanaged, non-linear and persistent losses in ecosystem services can arise. Adaptive management is an approach to management designed to reduce uncertainty as management proceeds. By developing alternative hypotheses, testing these hypotheses and adjusting management in response to outcomes, managers can probe dynamic mechanistic relationships among aboveground and belowground soil system components. In doing so, soil ecosystem services can be preserved and critical ecological thresholds avoided. Here, we present an adaptive management framework designed to reduce uncertainty surrounding the soil system, even when soil ecosystem services production is not the explicit management objective, so that managers can reach their management goals without undermining soil multifunctionality or contributing to an irreversible loss of soil ecosystem services. Copyright © 2016. Published by Elsevier Ltd.

  14. A comment on “temporal variation in survival and recovery rates of lesser scaup”

    USGS Publications Warehouse

    Lindberg, Mark S.; Boomer, G. Scott; Schmutz, Joel A.; Walker, Johann A.

    2017-01-01

    Concerns about declines in the abundance of lesser scaup (Aythya affinis) have promoted a number of analyses to understand reasons for this decline. Unfortunately, most of these analyses, including that of Arnold et al. (2016 Journal of Wildlife Management 80: 850–861), are based on observational studies leading to weak inference. Although we commend the efforts of Arnold et al. (2016 Journal of Wildlife Management 80: 850–861), we think their conclusions are over-stated given their retrospective analysis. Further, we note a number of inconsistencies in their reasoning and offer alternative conclusions that can be drawn from their analysis. Given the uncertainty still surrounding management of lesser scaup, we do not believe it is prudent to abandon or greatly modify adaptive management approaches designed specifically to make optimal decisions in the face of uncertainty. The current learning-based and recursive approach to management appears to be providing adequate guidance for harvest without punctuated changes to harvest levels, as Arnold et al. (2016 Journal of Wildlife Management 80: 850–861) recommend.

  15. New drug regulations in France: what are the impacts on market access? Part 2 – impacts on market access and impacts for the pharmaceutical industry

    PubMed Central

    Rémuzat, Cécile; Toumi, Mondher; Falissard, Bruno

    2013-01-01

    Access to the French drug market is being impacted by an ongoing dramatic shift in practice as well as by two laws that came into force in December 2011. This new environment has been described and analyzed in two separate articles. This second article analyzes how this new environment will actually impact the access to French drug market. French drug market access will be increasingly driven by comparative-effectiveness and cost-effectiveness data, and an increased role of postmarketing studies in the years to come. This access is evolving in a more complex environment for stakeholders due to the uncertainties surrounding these changes and it will be more complex and difficult for the pharmaceutical industry to address. The main issue faced by the pharmaceutical companies will be to minimize uncertainty at the time of a drug's launch to narrow the decision window. This is a major change of paradigm for the pharmaceutical business, in which pre- and postlaunch risks are directed toward the pharmaceutical industry. PMID:27226829

  16. Effects of model structural uncertainty on carbon cycle projections: biological nitrogen fixation as a case study

    NASA Astrophysics Data System (ADS)

    Wieder, William R.; Cleveland, Cory C.; Lawrence, David M.; Bonan, Gordon B.

    2015-04-01

    Uncertainties in terrestrial carbon (C) cycle projections increase uncertainty of potential climate feedbacks. Efforts to improve model performance often include increased representation of biogeochemical processes, such as coupled carbon-nitrogen (N) cycles. In doing so, models are becoming more complex, generating structural uncertainties in model form that reflect incomplete knowledge of how to represent underlying processes. Here, we explore structural uncertainties associated with biological nitrogen fixation (BNF) and quantify their effects on C cycle projections. We find that alternative plausible structures to represent BNF result in nearly equivalent terrestrial C fluxes and pools through the twentieth century, but the strength of the terrestrial C sink varies by nearly a third (50 Pg C) by the end of the twenty-first century under a business-as-usual climate change scenario (Representative Concentration Pathway 8.5). These results indicate that actual uncertainty in future C cycle projections may be larger than previously estimated, and this uncertainty will limit C cycle projections until model structures can be evaluated and refined.

  17. Effects of Parameter Uncertainty on Long-Term Simulations of Lake Alkalinity

    NASA Astrophysics Data System (ADS)

    Lee, Sijin; Georgakakos, Konstantine P.; Schnoor, Jerald L.

    1990-03-01

    A first-order second-moment uncertainty analysis has been applied to two lakes in the Adirondack Park, New York, to assess the long-term response of lakes to acid deposition. Uncertainty due to parameter error and initial condition error was considered. Because the enhanced trickle-down (ETD) model is calibrated with only 3 years of field data and is used to simulate a 50-year period, the uncertainty in the lake alkalinity prediction is relatively large. When a best estimate of parameter uncertainty is used, the annual average alkalinity is predicted to be -11 ± 28 μeq/L for Lake Woods and 142 ± 139 μeq/L for Lake Panther after 50 years. Hydrologic parameters and chemical weathering rate constants contributed most to the uncertainty of the simulations. Results indicate that the uncertainty in long-range predictions of lake alkalinity increased significantly over a 5- to 10-year period and then reached a steady state.
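
    The first-order second-moment (FOSM) calculation itself is compact. The sketch below is generic, not the ETD model: the `alk` response, its two parameters, and the covariance entries are invented. The output mean is approximated by the function evaluated at the parameter means, and the variance by g·Σ·g with g a finite-difference gradient.

```python
import numpy as np

def fosm(f, mu, cov, h=1e-6):
    """First-order second-moment: mean ~ f(mu), variance ~ g @ cov @ g,
    where g is the gradient of f at mu (central differences)."""
    mu = np.asarray(mu, dtype=float)
    g = np.empty(mu.size)
    for i in range(mu.size):
        e = np.zeros_like(mu)
        e[i] = h
        g[i] = (f(mu + e) - f(mu - e)) / (2 * h)
    return f(mu), g @ np.asarray(cov) @ g

# Invented linear alkalinity response to a weathering-rate and a runoff
# parameter (units and coefficients are purely illustrative).
alk = lambda p: 500.0 * p[0] - 30.0 * p[1]

mean, var = fosm(alk, mu=[0.3, 5.0], cov=[[0.01, 0.0], [0.0, 1.0]])
print(f"alkalinity ~ {mean:.0f} +/- {np.sqrt(var):.0f} (1 sd)")
```

    For a linear response FOSM is exact; for nonlinear models it is only a local approximation, which is one reason uncertainty can grow over long simulation horizons.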

  18. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.

  19. Examining Dark Triad traits in relation to sleep disturbances, anxiety sensitivity and intolerance of uncertainty in young adults.

    PubMed

    Sabouri, Sarah; Gerber, Markus; Lemola, Sakari; Becker, Stephen P; Shamsi, Mahin; Shakouri, Zeinab; Sadeghi Bahmani, Dena; Kalak, Nadeem; Holsboer-Trachsler, Edith; Brand, Serge

    2016-07-01

    The Dark Triad (DT) describes a set of three closely related personality traits, Machiavellianism, narcissism, and psychopathy. The aim of this study was to examine the associations between DT traits, sleep disturbances, anxiety sensitivity and intolerance of uncertainty. A total of 341 adults (M = 29 years) completed a series of questionnaires related to the DT traits, sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. A higher DT total score was associated with increased sleep disturbances, and higher scores for anxiety sensitivity and intolerance of uncertainty. In regression analyses, Machiavellianism and psychopathy were predictors of sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. Results indicate that specific DT traits, namely Machiavellianism and psychopathy, are associated with sleep disturbances, anxiety sensitivity and intolerance of uncertainty in young adults. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Effects of directional uncertainty on visually-guided joystick pointing.

    PubMed

    Berryhill, Marian; Kveraga, Kestutis; Hughes, Howard C

    2005-02-01

    Reaction times generally follow the predictions of Hick's law as stimulus-response uncertainty increases, although notable exceptions include the oculomotor system. Saccadic and smooth pursuit eye movement reaction times are independent of stimulus-response uncertainty. Previous research showed that joystick pointing to targets, a motor analog of saccadic eye movements, is only modestly affected by increased stimulus-response uncertainty; however, a no-uncertainty condition (simple reaction time to 1 possible target) was not included. Here, we re-evaluate manual joystick pointing including a no-uncertainty condition. Analysis indicated simple joystick pointing reaction times were significantly faster than choice reaction times. Choice reaction times (2, 4, or 8 possible target locations) only slightly increased as the number of possible targets increased. These data suggest that, as with joystick tracking (a motor analog of smooth pursuit eye movements), joystick pointing is more closely approximated by a simple/choice step function than the log function predicted by Hick's law.
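
    The two competing accounts are easy to state quantitatively. In the sketch below the Hick's-law constants and the simple/choice reaction-time levels are invented for illustration; only the contrast between the functional forms (logarithmic versus step) matters.

```python
import math

# Hick's law: RT grows with the log of stimulus-response uncertainty.
# Constants a (base time) and b (rate), in seconds, are illustrative.
def hicks_law_rt(n_alternatives, a=0.2, b=0.15):
    return a + b * math.log2(n_alternatives + 1)

# Step-function account suggested for joystick pointing: one fast simple-RT
# level for a single target, one nearly flat choice-RT level otherwise.
def step_rt(n_alternatives, simple=0.25, choice=0.35):
    return simple if n_alternatives == 1 else choice

for n in (1, 2, 4, 8):
    print(n, round(hicks_law_rt(n), 3), step_rt(n))
```

    Under Hick's law, RT keeps rising as alternatives double; under the step account, everything past one alternative sits on a single plateau, which is the pattern the joystick data approximate.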

  1. Quantifying uncertainties in precipitation measurement

    NASA Astrophysics Data System (ADS)

    Chen, H. Z. D.

    2017-12-01

    The scientific community has a long history of utilizing precipitation data for climate model design. However, precipitation records and the models built on them contain more uncertainty than their temperature counterparts. The literature has shown precipitation measurements to be highly influenced by the surrounding environment, and weather stations are traditionally situated in open areas and subject to various limitations. This restriction limits the ability of the scientific community to fully close the loop on the water cycle. Horizontal redistribution has been shown to be a major factor influencing precipitation measurements, and efforts have been made to reduce its effect on the monitoring apparatus. However, the factors contributing to this uncertainty are numerous and difficult to fully capture, so the noise level in precipitation data remains high. This study aims to quantify the uncertainties in precipitation data by factoring out horizontal redistribution, measuring it directly. The horizontal contribution of precipitation will be quantified by measuring precipitation at different heights, with one gauge directly shadowing the other. The upper collection represents traditional precipitation data, whereas the bottom measurements sum up the overall error term at a given location. Measurements will be recorded and correlated with the nearest available wind measurements to quantify their impact on the traditional precipitation record. Collections at different locations will also be compared to see whether this phenomenon is location-specific or whether a general trend can be derived. We aim to demonstrate a new way to isolate the noise component in traditional precipitation data via empirical measurements and, by doing so, improve the overall quality of the historic precipitation record and provide more accurate information for the design and calibration of large-scale climate models.
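
    The proposed two-gauge design reduces to a subtraction. The snippet below is a synthetic illustration of that logic, with all quantities invented: the upper gauge records precipitation plus wind-driven redistribution, the shadowed lower gauge records the redistribution term alone, and their difference recovers the signal.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data (all numbers invented for illustration).
n = 100
true_precip = rng.gamma(shape=2.0, scale=3.0, size=n)      # mm
wind = rng.uniform(0, 10, size=n)                          # m/s
redistribution = 0.3 * wind + rng.normal(0, 0.5, size=n)   # mm, wind-driven

upper_gauge = true_precip + redistribution  # traditional record
lower_gauge = redistribution                # shadowed gauge: error term only

corrected = upper_gauge - lower_gauge       # error term factored out
r = np.corrcoef(wind, lower_gauge)[0, 1]
print(f"wind vs. error-term correlation: r = {r:.2f}")
```

    The correlation check mirrors the planned comparison of the isolated error term against nearby wind records.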

  2. Wheels-Off Time Uncertainty Impact on Benefits of Early Call for Release Scheduling

    NASA Technical Reports Server (NTRS)

    Palopo, Kee; Chatterji, Gano B.; Almog, Noam

    2017-01-01

    Arrival traffic scenarios with 808 flights from 173 airports to Houston George Bush Intercontinental Airport are simulated to determine if Call for Release flights can receive a benefit, in terms of less delay than other flights, by being scheduled prior to gate pushback (look-ahead in time) as opposed to at gate pushback. Call for Release flights are departures that require approval from an Air Route Traffic Control Center prior to release. Realism is brought to the study by including gate departure delay and taxi-out delay uncertainties for the 77 major U.S. airports. Gate departure delay uncertainty is assumed to increase as a function of look-ahead time. Results show that Call for Release flights from an airport within the freeze horizon (a region surrounding the arrival airport) can get an advantage over other flights to a capacity-constrained airport by being scheduled prior to gate pushback, provided the wheels-off time uncertainty with respect to schedule is controlled to a small value, such as within a three-minute window. Another finding of the study is that system delay, measured as the sum of arrival delays, is smaller when flights are scheduled in the order of arrival compared to the order of departure. Because flights from airports within the freeze horizon are scheduled in the order of departure, an increase in the number of internal airports with a larger freeze horizon increases system delay. Delay in the baseline scenario was found to increase by 126% (from 13.8 hours to 31.2 hours) as the freeze horizon was increased from 30 minutes to 2 hours.

  3. Usage of ensemble geothermal models to consider geological uncertainties

    NASA Astrophysics Data System (ADS)

    Rühaak, Wolfram; Steiner, Sarah; Welsch, Bastian; Sass, Ingo

    2015-04-01

    The usage of geothermal energy, for instance by borehole heat exchangers (BHE), is a promising concept for a sustainable supply of heat for buildings. BHE are closed pipe systems in which a fluid is circulating. Heat from the surrounding rocks is transferred to the fluid purely by conduction, and the fluid carries the heat to the surface, where it can be utilized. Larger arrays of BHE typically require prior numerical modelling, motivated both by the design of the system (number and depth of the required BHE) and by regulatory requirements. Regulatory operating permissions, in particular, often demand maximally realistic models. Although such realistic models are possible in many cases with today's codes and computer resources, they are often expensive in terms of time and effort. A particular problem is knowing how accurate the achieved results are. An issue that is often neglected while dealing with highly complex models is the quantification of parameter uncertainties arising from the natural heterogeneity of the geological subsurface. Experience has shown that these heterogeneities can lead to wrong forecasts. Variations in the technical realization, and especially in the operational parameters (which are mainly a consequence of the regional climate), can also lead to strong variations in the simulation results. Instead of one very detailed forecast model, it can be preferable to run numerous simpler models. By varying parameters, the presumed subsurface uncertainties, but also the uncertainties in the presumed operational parameters, can be reflected. Finally, not a single result should be reported, but rather the range of possible solutions and their respective probabilities. In meteorology such an approach is well known as ensemble modeling. The concept is demonstrated on a real-world data set and discussed.
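
    The ensemble idea can be sketched in a few lines. The code below is not a BHE simulator: the crude per-metre temperature-change formula, its parameters, and the conductivity range are invented stand-ins. The point is only that many cheap model runs with sampled parameters yield a distribution of outcomes rather than a single forecast.

```python
import numpy as np

rng = np.random.default_rng(3)

# Crude steady-state estimate of the temperature change per metre of borehole
# (purely illustrative form; heat_rate_per_m in W/m, resistance in K.m/W).
def simple_bhe_model(thermal_conductivity, heat_rate_per_m=30.0, resistance=0.1):
    return heat_rate_per_m * (resistance + 1.0 / (2 * np.pi * thermal_conductivity))

# Subsurface uncertainty: conductivity sampled from a plausible range (W/m/K).
ensemble = simple_bhe_model(rng.uniform(1.5, 3.5, size=500))

lo, med, hi = np.percentile(ensemble, [5, 50, 95])
print(f"delta T per metre: 5-95% range {lo:.1f}-{hi:.1f} K (median {med:.1f} K)")
```

    Reporting the 5-95% range and median, rather than one number, is exactly the ensemble-style output advocated in the abstract.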

  4. Quantifying uncertainties in the structural response of SSME blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response has been evaluated in terms of three variables which are natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.

  5. The known unknowns: neural representation of second-order uncertainty, and ambiguity

    PubMed Central

    Bach, Dominik R.; Hulme, Oliver; Penny, William D.; Dolan, Raymond J.

    2011-01-01

    Predictions provided by action-outcome probabilities entail a degree of (first-order) uncertainty. However, these probabilities themselves can be imprecise and embody second-order uncertainty. Tracking second-order uncertainty is important for optimal decision making and reinforcement learning. Previous functional magnetic resonance imaging investigations of second-order uncertainty in humans have drawn on an economic concept of ambiguity, where action-outcome associations in a gamble are either known (unambiguous) or completely unknown (ambiguous). Here, we relaxed the constraints associated with a purely categorical concept of ambiguity and varied the second-order uncertainty of gambles continuously, quantified as entropy over second-order probabilities. We show that second-order uncertainty influences decisions in a pessimistic way by biasing second-order probabilities, and that second-order uncertainty is negatively correlated with posterior cingulate cortex activity. The category of ambiguous (compared to non-ambiguous) gambles also biased choice in a similar direction, but was associated with distinct activation of a posterior parietal cortical area; an activation that we show reflects a different computational mechanism. Our findings indicate that behavioural and neural responses to second-order uncertainty are distinct from those associated with ambiguity and may call for a reappraisal of previous data. PMID:21451019
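
    Quantifying second-order uncertainty as entropy over second-order probabilities can be made concrete with a discretized example (the grid and weights below are invented): a known outcome probability is a point mass with zero entropy, while full ambiguity is a uniform distribution over possible probabilities with maximal entropy.

```python
import numpy as np

def second_order_entropy(weights):
    """Shannon entropy (nats) of a discretized second-order distribution:
    weights[i] = probability that the outcome probability equals p_grid[i]."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    nz = w[w > 0]
    return float(-(nz * np.log(nz)).sum())

p_grid = np.linspace(0, 1, 11)  # candidate outcome probabilities

known = np.zeros(11)
known[5] = 1.0           # outcome probability known to be p = 0.5
ambiguous = np.ones(11)  # fully ambiguous: uniform over all p

print(second_order_entropy(known))      # no second-order uncertainty
print(second_order_entropy(ambiguous))  # maximal second-order uncertainty
```

    Varying the weights between these two extremes is how the study moves beyond the binary known/ambiguous distinction to a continuous measure.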

  6. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
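
    The variance decomposition step can be illustrated with a brute-force first-order sensitivity index. This is a generic Sobol-style estimator, not the paper's hierarchical geostatistical method, and the toy `model` with its coefficients is invented: S_i = Var(E[Y|X_i]) / Var(Y), estimated here by binning each input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the flow/transport simulator: the output depends more
# strongly on a boundary-condition parameter b than on a recharge term r.
def model(b, r):
    return 3.0 * b + 1.0 * r + 0.5 * b * r

n = 200_000
b = rng.uniform(0, 1, n)
r = rng.uniform(0, 1, n)
y = model(b, r)

def first_order_index(x, y, n_bins=50):
    """S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning X_i."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    cond_means = np.array([y[idx == k].mean() for k in range(n_bins)])
    return cond_means.var() / y.var()

s_boundary = first_order_index(b, y)
s_recharge = first_order_index(r, y)
print(f"S_boundary = {s_boundary:.2f}, S_recharge = {s_recharge:.2f}")
```

    In the toy case the boundary parameter dominates the output variance, the qualitative pattern the study reports for the simulated head field.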

  7. On the uncertainty of interdisciplinarity measurements due to incomplete bibliographic data.

    PubMed

    Calatrava Moreno, María Del Carmen; Auzinger, Thomas; Werthner, Hannes

    The accuracy of interdisciplinarity measurements is directly related to the quality of the underlying bibliographic data. Existing indicators of interdisciplinarity are not capable of reflecting the inaccuracies introduced by incorrect and incomplete records because correct and complete bibliographic data can rarely be obtained. This is the case for the Rao-Stirling index, which cannot handle references that are not categorized into disciplinary fields. We introduce a method that addresses this problem. It extends the Rao-Stirling index to acknowledge missing data by calculating its interval of uncertainty using computational optimization. The evaluation of our method indicates that the uncertainty interval is not only useful for estimating the inaccuracy of interdisciplinarity measurements, but it also delivers slightly more accurate aggregated interdisciplinarity measurements than the Rao-Stirling index.
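    The Rao-Stirling index is the sum over field pairs of p_i p_j d_ij, where p_i is the share of a paper's references in field i and d_ij a between-field distance. A minimal sketch of an uncertainty interval for uncategorized references follows, using brute-force enumeration rather than the paper's computational optimization; the distance matrix and reference counts are invented for illustration.

```python
import itertools
import numpy as np

def rao_stirling(counts, dist):
    """Rao-Stirling diversity: sum over field pairs of p_i * p_j * d_ij
    (dist has a zero diagonal, so p @ dist @ p covers i != j)."""
    p = counts / counts.sum()
    return float(p @ dist @ p)

# toy between-field distance matrix (symmetric, zero diagonal; illustrative)
dist = np.array([[0.0, 0.8, 0.9],
                 [0.8, 0.0, 0.5],
                 [0.9, 0.5, 0.0]])

known = np.array([5.0, 3.0, 0.0])  # categorized references per field
n_missing = 2                      # references without a field category

# crude uncertainty interval: enumerate every way of spreading the
# uncategorized references over the fields (feasible at this tiny scale;
# the paper uses optimization instead)
vals = []
for combo in itertools.combinations_with_replacement(range(3), n_missing):
    counts = known.copy()
    for field in combo:
        counts[field] += 1
    vals.append(rao_stirling(counts, dist))
lo, hi = min(vals), max(vals)
```

    The interval [lo, hi] brackets the index value that complete data would have produced, which is the role the paper's uncertainty interval plays.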

  8. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  9. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. 
Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  10. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.
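    Locating "hotspots of uncertainty" at the grid-cell level can be sketched as mapping where the model ensemble disagrees most, e.g. by thresholding the across-model spread. The projection stack, grid size and threshold below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy stack: 11 model projections of cropland fraction on a 4x4 grid (illustrative)
projections = rng.random((11, 4, 4))

spread = projections.std(axis=0)      # across-model disagreement, per cell
threshold = np.quantile(spread, 0.9)  # flag roughly the top 10% of cells
hotspots = spread >= threshold
```

    The resulting boolean map plays the role of the hotspot maps in the study; a real analysis would further attribute each cell's spread to input data, model structure and scenario, as the abstract describes.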

  11. Water resources in the twenty-first century; a study of the implications of climate uncertainty

    USGS Publications Warehouse

    Moss, Marshall E.; Lins, Harry F.

    1989-01-01

    The interactions of the water resources on and within the surface of the Earth with the atmosphere that surrounds it are exceedingly complex. Because of potential anthropogenic changes in the global climate system, increased uncertainty attaches to the availability of water of usable quality in the 21st century. For the U.S. Geological Survey to continue to fulfill its mission with respect to assessing the Nation's water resources, an expanded program to study the hydrologic implications of climate uncertainty will be required. The goal for this program is to develop knowledge and information concerning the potential water-resources implications for the United States of uncertainties in climate that may result from both anthropogenic and natural changes of the Earth's atmosphere. Like most past and current water-resources programs of the Geological Survey, the climate-uncertainty program should be composed of three elements: (1) research, (2) data collection, and (3) interpretive studies. However, unlike most other programs, the climate-uncertainty program necessarily will be dominated by its research component during its early years. Critical new concerns to be addressed by the research component are (1) areal estimates of evapotranspiration, (2) hydrologic resolution within atmospheric (climatic) models at the global scale and at mesoscales, (3) linkages between hydrology and climatology, and (4) methodology for the design of data networks that will help to track the impacts of climate change on water resources. Other ongoing activities in U.S. Geological Survey research programs will be enhanced to make them more compatible with climate-uncertainty research needs. The existing hydrologic data base of the Geological Survey serves as a key element in assessing hydrologic and climatologic change. 
However, this data base has evolved in response to other needs for hydrologic information and probably is not as sensitive to climate change as is desirable. Therefore, as measurement and network-design methodologies are improved to account for climate-change potential, new data-collection activities will be added to the existing programs. One particular area of data-collection concern pertains to the phenomenon of evapotranspiration. Interpretive studies of the hydrologic implications of climate uncertainty will be initiated by establishing several studies at the river-basin scale in diverse hydroclimatic and demographic settings. These studies will serve as tests of the existing methodologies for studying the impacts of climate change and also will help to define subsequent research priorities. A prototype for these studies was initiated in early 1988 in the Delaware River basin.

  12. The no-project alternative analysis: An early product of the Tahoe Decision Support System

    USGS Publications Warehouse

    Halsing, David L.; Hessenflow, Mark L.; Wein, Anne

    2005-01-01

    We report on the development of a No-project alternative analysis (NPAA) or “business as usual” scenario with respect to a 20-year projection of 21 indicators of environmental and socioeconomic conditions in the Lake Tahoe Basin for the Tahoe Regional Planning Agency (TRPA). Our effort was inspired by earlier work that investigated the tradeoffs between an environmental and an economic objective. The NPAA study has implications for a longer term goal of building a Tahoe Decision Support System (TDSS) to assist the TRPA and other Basin agencies in assessing the outcomes of management strategies. The NPAA assumes no major deviations from current management practices or from recent environmental or societal trends and planned Environmental Improvement Program (EIP) projects. Quantitative “scenario generation” tools were constructed to simulate site-specific land uses, various population categories, and associated vehicle miles traveled. Projections of each indicator’s attainment status were made by building visual conceptual models of the relevant natural and social processes, extrapolating trends, and using available models, research, and expert opinion. We present results of the NPAA, projected indicator status, key factors affecting the indicators, indicator functionality, and knowledge gaps. One important result is that current management practices may slow the loss or degradation of environmental qualities but not halt or reverse it. Our analysis also predicts an increase in recreation and commuting into and within the basin, primarily in private vehicles. Private vehicles, which are a critical mechanism by which the Basin population affects the surrounding environment, are a key determinant of air-quality indicators, a source of particulate matter affecting Secchi depth, a source of noise, and a factor in recreational and scenic quality, largely owing to congestion. 
Key uncertainties in the NPAA include climate change, EIP project effectiveness, and external population, economic activity, and air pollution.

  13. The carbon cycle revisited

    NASA Technical Reports Server (NTRS)

    Bolin, Bert; Fung, Inez

    1992-01-01

    Discussions during the Global Change Institute indicated a need to present, in some detail and as accurately as possible, our present knowledge about the carbon cycle, the uncertainties in this knowledge, and the reasons for these uncertainties. We discuss basic issues of internal consistency within the carbon cycle, and end by summarizing the key unknowns.

  14. Environmental Uncertainty and Communication Network Complexity: A Cross-System, Cross-Cultural Test.

    ERIC Educational Resources Information Center

    Danowski, James

    An infographic model is proposed to account for the operation of systems within their information environments. Infographics is a communication paradigm used to indicate the clustering of information processing variables in communication systems. Four propositions concerning environmental uncertainty and internal communication network complexity,…

  15. Effect of Uncertainty on Deterministic Runway Scheduling

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam; Malik, Waqar; Jung, Yoon C.

    2012-01-01

    Active runway scheduling involves scheduling departures for takeoffs and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler, comparing it against a first-come-first-serve (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison considers both system performance (throughput and system delay) and predictability, at varying levels of congestion. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than FCFS in both system performance and predictability.
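    The "freeze the sequence, adjust the times" step described above can be sketched as walking the fixed sequence and giving each aircraft the later of its ready time and the previous runway time plus separation. The ETAs, the single uniform separation value, and the "optimized" sequence are all illustrative.

```python
def schedule(sequence, eta, sep=60.0):
    """Assign runway-use times in the given order: each aircraft gets the
    later of its ready time (ETA) and the previous assigned time + separation."""
    times, prev = {}, None
    for ac in sequence:
        times[ac] = eta[ac] if prev is None else max(eta[ac], times[prev] + sep)
        prev = ac
    return times

# illustrative ETAs in seconds
eta = {"A": 0.0, "B": 10.0, "C": 100.0}

fcfs = schedule(sorted(eta, key=eta.get), eta)  # first-come-first-serve order
frozen = schedule(["B", "A", "C"], eta)         # a hypothetical frozen optimized sequence
```

    Comparing the two resulting timetables for throughput and delay, over many sampled perturbations of the ETAs, is the kind of evaluation the paper performs.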

  16. Ramp time synchronization. [for NASA Deep Space Network

    NASA Technical Reports Server (NTRS)

    Hietzke, W.

    1979-01-01

    A new method of intercontinental clock synchronization has been developed and proposed for possible use by NASA's Deep Space Network (DSN), using a two-way/three-way radio link with a spacecraft. Analysis of preliminary data indicates that the real-time method has an uncertainty of 0.6 microseconds, and further work is very likely to decrease that uncertainty. The method is also compatible with a variety of non-real-time analysis techniques, which may reduce the uncertainty to the tens-of-nanoseconds range.

  17. A design methodology for nonlinear systems containing parameter uncertainty

    NASA Technical Reports Server (NTRS)

    Young, G. E.; Auslander, D. M.

    1983-01-01

    In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values that maximizes the probability that the performance indices simultaneously satisfy the design criteria, despite the uncertainty due to the k nonadjustable parameters.
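    A minimal sketch of the idea: estimate, by sampling the nonadjustable parameter, the probability that a performance index meets its criterion, then search the adjustable parameter for the value that maximizes that probability. Plain random search stands in for the paper's adaptive strategy, and the performance index, distributions and criterion are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def performance_index(adjustable, nonadjustable):
    # illustrative index: smaller is better
    return np.abs(adjustable - nonadjustable)

def success_probability(adjustable, n=500):
    """Probability that the index meets its criterion (< 0.2), estimated by
    sampling the uncertain nonadjustable parameter."""
    nonadjustable = rng.normal(0.5, 0.1, n)
    return float(np.mean(performance_index(adjustable, nonadjustable) < 0.2))

# plain random search over the adjustable parameter (stand-in for the
# paper's adaptive random search strategy)
candidates = rng.uniform(0.0, 1.0, 50)
best = max(candidates, key=success_probability)
```

    Because the nonadjustable parameter is centered at 0.5, the search settles on an adjustable value near 0.5, the choice most robust to the nuisance uncertainty.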

  18. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A probabilistic methodology was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.
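    The finding that geometric scatter dominates the response variability can be illustrated with a toy Monte Carlo propagation. The response model (frequency proportional to sqrt(stiffness) times thickness) and the scatter levels are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

thickness = rng.normal(1.0, 0.05, n)  # normalized geometry, ~5% scatter (illustrative)
modulus = rng.normal(1.0, 0.01, n)    # normalized material stiffness, ~1% scatter

# toy response: normalized first bending frequency ~ sqrt(E) * t
freq = np.sqrt(modulus) * thickness

geom_only = (thickness * 1.0).std()        # scatter from geometry alone
matl_only = (np.sqrt(modulus) * 1.0).std() # scatter from material alone
```

    Varying one input at a time while freezing the other at its nominal value shows which source drives the response scatter, echoing the abstract's conclusion about geometry.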

  19. Uncertainty in Climate Change Research: An Integrated Approach

    NASA Astrophysics Data System (ADS)

    Mearns, L.

    2017-12-01

    Uncertainty has been a major theme in climate change research from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But, of course, there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern, with possible solutions to the climate change problem being addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great as or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued, including uncertainty in vulnerability and governance. Without more thoroughly exploring these uncertainties, we will likely underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy; rather, false certainty is the true danger.

  20. A spatial assessment framework for evaluating flood risk under extreme climates.

    PubMed

    Chen, Yun; Liu, Rui; Barrett, Damian; Gao, Lei; Zhou, Mingwei; Renzullo, Luigi; Emelyanova, Irina

    2015-12-15

    Australian coal mines have been facing a major challenge of increasing risk of flooding caused by intensive rainfall events in recent years. In light of growing climate change concerns and the predicted escalation of flooding, estimating flood inundation risk becomes essential for understanding sustainable mine water management in the Australian mining sector. This research develops a spatial multi-criteria decision making prototype for the evaluation of flooding risk at a regional scale, using the Bowen Basin and its surroundings in Queensland as a case study. Spatial gridded data, including climate, hydrology, topography, vegetation and soils, were collected and processed in ArcGIS. Several indices were derived based on time series of observations and spatial modeling, taking account of extreme rainfall, evapotranspiration, stream flow, potential soil water retention, elevation and slope generated from a digital elevation model (DEM), as well as drainage density and proximity extracted from a river network. These spatial indices were weighted using the analytical hierarchy process (AHP) and integrated in an AHP-based suitability assessment (AHP-SA) model under the spatial risk evaluation framework. A regional flooding risk map was delineated to represent likely impacts of criterion indices at different risk levels, and was verified using the maximum inundation extent detectable in a time series of remote sensing imagery. The result provides baseline information to help Bowen Basin coal mines identify and assess flooding risk when making adaptation strategies and implementing mitigation measures in the future. The framework and methodology developed in this research offer the Australian mining industry, and social and environmental studies around the world, an effective way to produce reliable assessments of flood risk for managing uncertainty in water availability under climate change. Copyright © 2015. Published by Elsevier B.V.
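    The AHP weighting step used here can be sketched as extracting the principal eigenvector of a pairwise comparison matrix and checking Saaty's consistency ratio. The comparison values below are illustrative, not those of the Bowen Basin study.

```python
import numpy as np

# pairwise comparison matrix for three criteria on Saaty's 1-9 scale
# (values are illustrative only)
A = np.array([[1.0,     3.0, 5.0],
              [1 / 3.0, 1.0, 2.0],
              [1 / 5.0, 0.5, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()  # AHP criterion weights (principal eigenvector, normalized)

# Saaty consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3
lambda_max = eigvals.real[k]
cr = (lambda_max - 3.0) / 2.0 / 0.58  # accept if below ~0.1
```

    The resulting weights would then multiply the spatial criterion indices before they are summed into the risk map.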

  1. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, which allows for experiencing climate change through comparison with well-known real-world conditions. Identifying climatic coincidence seems to be a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics, combined with too narrow uncertainty ranges, will identify little or no area as having a similar climate, while too few indicators and too wide uncertainty ranges will flag implausibly large regions as climatically similar. Similarity cannot be explored just by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, such as maxima, minima, variation magnitude and frequency of extreme events, the identification of appropriate similarity conditions is a crucial question to be solved. 
For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to support a useful Climate Twins region search. The Climate Twins tool currently compares future climate conditions of a source area in the Greater Alpine Region with current climate conditions across Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web crawling features for searching information about climate-related local adaptations observed today in the target region, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the current tool functionality and discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
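    The core matching step, flagging a location as a Climate Twin when every indicator of its current climate falls inside the uncertainty range around the source's projected future climate, can be sketched as below. The indicators, range widths and grid values are all illustrative; halving the ranges illustrates the abstract's point that narrower ranges shrink the matched area.

```python
import numpy as np

rng = np.random.default_rng(3)

# two indicators: annual mean temperature (degC) and annual precipitation (mm)
future_source = np.array([12.0, 700.0])  # projected future climate at the source
tolerance = np.array([1.0, 100.0])       # uncertainty range per indicator

# current climate of candidate target cells (toy values on a 1000-cell grid)
current_grid = np.column_stack([rng.uniform(5.0, 20.0, 1000),
                                rng.uniform(300.0, 1200.0, 1000)])

# a cell is a twin only if EVERY indicator lies within its tolerance
twins = np.all(np.abs(current_grid - future_source) <= tolerance, axis=1)
narrow = np.all(np.abs(current_grid - future_source) <= tolerance / 2, axis=1)
```

    Adding more indicators, or tightening the ranges, only ever removes cells from the match, which is exactly the balance problem the abstract describes.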

  2. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  3. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE PAGES

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; ...

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  4. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    NASA Astrophysics Data System (ADS)

    Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
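    A fast emulator like the one described, fit to a modest number of model runs and then queried in place of the expensive model, can be sketched with a least-squares polynomial surrogate. The toy "model" and its three parameters are illustrative stand-ins; the study itself uses 39 parameters sampled with Sobol' sequences and a more sophisticated emulator.

```python
import numpy as np

rng = np.random.default_rng(4)

def sea_ice_toy(x):
    # toy stand-in for an expensive sea ice model run; the output depends
    # mostly on the first two of the three inputs (all illustrative)
    return 3.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

X = rng.random((200, 3))  # training design (a Sobol' sequence in the paper)
y = sea_ice_toy(X)

# cheap emulator: quadratic polynomial features fit by least squares
def features(x):
    return np.column_stack([np.ones(len(x)), x, x ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# validate the emulator on held-out inputs
X_new = rng.random((50, 3))
err = float(np.max(np.abs(features(X_new) @ coef - sea_ice_toy(X_new))))
```

    Once the emulator is accurate, the Sobol' indices can be computed from millions of emulator evaluations at negligible cost, which is what makes the 39-dimensional analysis tractable.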

  5. Overall uncertainty study of the hydrological impacts of climate change for a Canadian watershed

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Poulin, Annie; Leconte, Robert

    2011-12-01

    General circulation models (GCMs) and greenhouse gas emissions scenarios (GGES) are generally considered to be the two major sources of uncertainty in quantifying the climate change impacts on hydrology. Other sources of uncertainty have been given less attention. This study considers overall uncertainty by combining results from an ensemble of two GGES, six GCMs, five GCM initial conditions, four downscaling techniques, three hydrological model structures, and 10 sets of hydrological model parameters. Each climate projection is equally weighted to predict the hydrology on a Canadian watershed for the 2081-2100 horizon. The results show that the choice of GCM is consistently a major contributor to uncertainty. However, other sources of uncertainty, such as the choice of a downscaling method and the GCM initial conditions, also have a comparable or even larger uncertainty for some hydrological variables. Uncertainties linked to GGES and the hydrological model structure are somewhat less than those related to GCMs and downscaling techniques. Uncertainty due to the hydrological model parameter selection has the least important contribution among all the variables considered. Overall, this research underlines the importance of adequately covering all sources of uncertainty. A failure to do so may result in moderately to severely biased climate change impact studies. Results further indicate that the major contributors to uncertainty vary depending on the hydrological variables selected, and that the methodology presented in this paper is successful at identifying the key sources of uncertainty to consider for a climate change impact study.
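    Comparing the contribution of each uncertainty source, as this study does across GCMs, downscaling methods and so on, can be sketched by decomposing ensemble variance into the variance of factor-level means. The runoff numbers below are invented for a balanced, purely additive toy design; a real analysis also handles interactions and many more factors.

```python
import numpy as np

# toy ensemble: projected runoff change (%) for 3 GCMs x 2 downscaling
# methods (all values illustrative)
runs = {("gcm1", "d1"): 5.0,  ("gcm1", "d2"): 7.0,
        ("gcm2", "d1"): 12.0, ("gcm2", "d2"): 14.0,
        ("gcm3", "d1"): 6.0,  ("gcm3", "d2"): 8.0}

vals = np.array(list(runs.values()))
total_var = vals.var()

def factor_share(pos):
    """Fraction of ensemble variance explained by one factor: the variance
    of the factor-level means over the total variance."""
    levels = {}
    for key, v in runs.items():
        levels.setdefault(key[pos], []).append(v)
    means = np.array([np.mean(g) for g in levels.values()])
    return float(means.var() / total_var)

share_gcm, share_downscaling = factor_share(0), factor_share(1)
```

    In this balanced additive toy case the shares sum to one exactly; the dominant share for the GCM factor mirrors the study's finding that GCM choice is consistently a major contributor.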

  6. Social network profiles as information sources for adolescents' offline relations.

    PubMed

    Courtois, Cédric; All, Anissa; Vanwynsberghe, Hadewijch

    2012-06-01

    This article presents the results of a study concerning the use of online profile pages by adolescents to know more about "offline" friends and acquaintances. Previous research has indicated that social networking sites (SNSs) are used to gather information on new online contacts. However, several studies have demonstrated a substantial overlap between offline and online social networks. Hence, we question whether online connections are meaningful in gathering information on offline friends and acquaintances. First, the results indicate that a combination of passive uncertainty reduction (monitoring a target's profile) and interactive uncertainty reduction (communication through the target's profile) explains a considerable amount of variance in the level of uncertainty about both friends and acquaintances. More specifically, adolescents generally get to know much more about their acquaintances. Second, the results of online uncertainty reduction positively affect the degree of self-disclosure, which is imperative in building a solid friend relation. Further, we find that uncertainty reduction strategies positively mediate the effect of social anxiety on the level of certainty about friends. This implies that socially anxious teenagers benefit from SNSs by getting the conditions right to build a more solid relation with their friends. Hence, we conclude that SNSs play a substantial role in today's adolescents' everyday interpersonal communication.

  7. Impact of Hydrogeological Uncertainty on Estimation of Environmental Risks Posed by Hydrocarbon Transportation Networks

    NASA Astrophysics Data System (ADS)

    Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.

    2017-11-01

    Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of temporal evolution of two key risk indices, volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify key flow and transport parameters affecting the risk indices, to elucidate texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the corresponding remediation costs.

  8. Cross-boundary management between national parks and surrounding lands: A review and discussion

    NASA Astrophysics Data System (ADS)

    Schonewald-Cox, Christine; Buechner, Marybeth; Sauvajot, Raymond; Wilcox, Bruce A.

    1992-03-01

    Protecting biodiversity on public lands is difficult, requiring the management of a complex array of factors. This is especially true when the ecosystems in question are affected by, or extend onto, lands outside the boundaries of the protected area. In this article we review recent developments in the cross-boundary management of protected natural resources, such as parks, wildlife reserves, and designated wilderness areas. Five ecological and 11 anthropic techniques have been suggested for use in cross-boundary management. The categories are not mutually exclusive, but each is a distinct and representative approach, suggested by various authors from academic, managerial, and legal professions. The ecological strategies stress the collection of basic data and documentation of trends. The anthropic techniques stress the usefulness of cooperative guidelines and the need to develop a local constituency which supports park goals. However, the situation is complex and the needed strategies are often difficult to implement. Diverse park resources are influenced by events in surrounding lands. The complexity and variability of sources, the ecological systems under protection, and the uncertainty of the effects combine to produce situations for which there are no simple answers. The solution to coexistence of the park and surrounding land depends upon creative techniques and recommendations, many still forthcoming. Ecological, sociological, legal, and economic disciplines as well as the managing agency should all contribute to these recommendations. Platforms for change include legislation, institutional policies, communication, education, management techniques, and ethics.

  9. Calculating salt loads to Great Salt Lake and the associated uncertainties for water year 2013; updating a 48 year old standard

    USGS Publications Warehouse

    Shope, Christopher L.; Angeroth, Cory E.

    2015-01-01

    Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids (TDS) loading into the Great Salt Lake (GSL) for water year 2013 with estimates from previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons, with uncertainty ranging from 2.8 to 46.3 million metric tons, which differs greatly from the previous regression estimate for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with narrower confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and the statistically derived uncertainty of these estimates, we have quantified the variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.

  10. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2014-09-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties in the end results. We estimate uncertainties in economic data, multi-pollutant emission statistics and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. The economic data have a relatively small impact on uncertainty at the global and national level, while much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production based emissions, since the largest uncertainties are due to metric and emissions which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±9-±27% using the global temperature potential with a 50 year time horizon, with metric uncertainties dominating. National level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9-±25%, with metric and emissions uncertainties contributing similarly. 
The absolute global temperature potential with a 50 year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.
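    The Monte Carlo propagation underlying such estimates can be sketched as follows. All numbers are invented for illustration (they are not the paper's data): uncertain emissions and an uncertain metric value are sampled jointly, and the spread of the temperature response is read off the percentiles of their product:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical inputs (illustrative only): sectoral emissions with a 10%
# (1-sigma) uncertainty and a GTP-50 metric value with a 20% (1-sigma)
# uncertainty, both normalized to 1 and modelled as normal distributions.
emissions = rng.normal(1.0, 0.10, n)
gtp50 = rng.normal(1.0, 0.20, n)

# Temperature response is proportional to emissions times the metric value;
# the larger metric uncertainty dominates the spread of the product.
temperature = emissions * gtp50

lo, mid, hi = np.percentile(temperature, [5, 50, 95])
spread = (hi - lo) / (2 * mid) * 100   # half-width of the 90% interval, in %
print(f"median {mid:.2f}, 90% interval [{lo:.2f}, {hi:.2f}], about ±{spread:.0f}%")
```

    Repeating such runs with one input held fixed at its mean isolates each source's contribution to the total spread, which is how metric versus emissions contributions can be compared.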

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas; Burns, Joseph R.

    The aftermath of the Tōhoku earthquake and the Fukushima accident has led to a global push to improve the safety of existing light water reactors. A key component of this initiative is the development of nuclear fuel and cladding materials with potentially enhanced accident tolerance, also known as accident-tolerant fuels (ATF). These materials are intended to improve core fuel and cladding integrity under beyond design basis accident conditions while maintaining or enhancing reactor performance and safety characteristics during normal operation. To complement research that has already been carried out to characterize ATF neutronics, the present study provides an initial investigation of the sensitivity and uncertainty of ATF system responses to nuclear cross section data. ATF concepts incorporate novel materials, including SiC and FeCrAl cladding and high density uranium silicide composite fuels, in turn introducing new cross section sensitivities and uncertainties which may behave differently from traditional fuel and cladding materials. In this paper, we conducted sensitivity and uncertainty analysis using the TSUNAMI-2D sequence of SCALE with infinite lattice models of ATF assemblies. Of all the ATF materials considered, radiative capture in 56Fe in FeCrAl cladding is the most significant contributor to eigenvalue uncertainty; this is by far the largest ATF-specific uncertainty found in these cases, exceeding even that of uranium. We found that while significant new sensitivities indeed arise, the general sensitivity behavior of ATF assemblies does not markedly differ from traditional UO2/zirconium-based fuel/cladding systems, especially with regard to uncertainties associated with uranium. We assessed the similarity of the IPEN/MB-01 reactor benchmark model to application models with FeCrAl cladding. 
We used TSUNAMI-IP to calculate similarity indices between the application models and the IPEN/MB-01 reactor benchmark model. This benchmark was selected for its use of SS304 as a cladding and structural material, with significant 56Fe content. The similarity indices suggest that while many differences in reactor physics arise from differences in design, the sensitivity to and behavior of 56Fe absorption is comparable between systems, indicating the potential for this benchmark to reduce uncertainties in 56Fe radiative capture cross sections.

  12. A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty

    NASA Astrophysics Data System (ADS)

    Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl

    2012-05-01

    The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and comparisons of experimental results with simulations performed in ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
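    A minimal non-intrusive polynomial chaos sketch, assuming a single standard-normal uncertain parameter and an invented scalar response (not a real vehicle model): the response is projected onto probabilists' Hermite polynomials by Gauss-Hermite quadrature, and the mean and variance follow directly from the coefficients at a fraction of the Monte Carlo cost:

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

# Invented scalar "vehicle response" driven by one uncertain parameter
# xi ~ N(0, 1); an illustration only, not a real vehicle dynamics model.
def response(xi):
    return np.exp(0.3 * xi) + 0.5 * xi**2

order = 6
nodes, weights = He.hermegauss(30)    # Gauss-Hermite rule, weight exp(-x^2/2)
norm = np.sqrt(2 * np.pi)             # the weights sum to sqrt(2*pi)

# Non-intrusive projection: c_k = E[f(xi) He_k(xi)] / k!, since the
# probabilists' Hermite polynomials satisfy E[He_j He_k] = k! delta_jk.
coeffs = []
for k in range(order + 1):
    He_k = He.hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * response(nodes) * He_k) / norm / factorial(k))

mean_pce = coeffs[0]
var_pce = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)

# Monte Carlo cross-check (far more model evaluations than the 30 above)
xi = np.random.default_rng(1).standard_normal(500_000)
print(f"mean: PCE {mean_pce:.4f} vs MC {response(xi).mean():.4f}")
print(f"var:  PCE {var_pce:.4f} vs MC {response(xi).var():.4f}")
```

    For this smooth response, seven coefficients obtained from 30 model evaluations match the half-million-sample Monte Carlo mean and variance to three decimals, which is the efficiency argument the abstract makes.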

  13. Uncertainty loops in travel-time tomography from nonlinear wave physics.

    PubMed

    Galetti, Erica; Curtis, Andrew; Meles, Giovanni Angelo; Baptie, Brian

    2015-04-10

    Estimating image uncertainty is fundamental to guiding the interpretation of geoscientific tomographic maps. We reveal novel uncertainty topologies (loops) which indicate that while the speeds of both low- and high-velocity anomalies may be well constrained, their locations tend to remain uncertain. The effect is widespread: loops dominate around a third of United Kingdom Love wave tomographic uncertainties, changing the nature of interpretation of the observed anomalies. Loops exist due to 2nd and higher order aspects of wave physics; hence, although such structures must exist in many tomographic studies in the physical sciences and medicine, they are unobservable using standard linearized methods. Higher order methods might fruitfully be adopted.

  14. Measuring high-density built environment for public health research: Uncertainty with respect to data, indicator design and spatial scale.

    PubMed

    Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu

    2018-05-07

    Uncertainty with respect to built environment (BE) data collection, measure conceptualization and spatial scales is evident in urban health research, but most findings are from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area, as limited research has been conducted on uncertainty in such areas. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' place of residence based on an internationally recognized BE framework. Variability of the measures was mapped and Spearman's rank correlations were calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of the methodological selection and spatial scales of the measures. The results suggest that more robust information for urban health research in high-density cities would emerge if greater consideration were given to BE data, design methods and the spatial scales of BE measures.

  15. VizieR Online Data Catalog: Spectral properties of 441 radio pulsars (Jankowski+, 2018)

    NASA Astrophysics Data System (ADS)

    Jankowski, F.; van Straten, W.; Keane, E. F.; Bailes, M.; Barr, E. D.; Johnston, S.; Kerr, M.

    2018-03-01

    We present spectral parameters for 441 radio pulsars. These were obtained from observations centred at 728, 1382 and 3100MHz using the 10-50cm and the 20cm multibeam receiver at the Parkes radio telescope. In particular, we list the pulsar names (J2000), the calibrated, band-integrated flux densities at 728, 1382 and 3100MHz, the spectral classifications, the frequency ranges the spectral classifications were performed over, the spectral indices for pulsars with simple power-law spectra and the robust modulation indices at all three centre frequencies for pulsars for which we have at least six measurement epochs. The flux density uncertainties include scintillation and a systematic contribution, in addition to the statistical uncertainty. Upper limits are reported at the 3σ level and all other uncertainties at the 1σ level. (1 data file).
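    For the pulsars classified as simple power laws, the spectral index is the slope of a weighted straight-line fit in log-log space, S(nu) ∝ nu^alpha. A sketch with invented flux densities at the three Parkes centre frequencies:

```python
import numpy as np

# Invented flux densities (mJy) at the three Parkes centre frequencies (MHz);
# the fitting recipe, not the numbers, is the point of this sketch.
freqs = np.array([728.0, 1382.0, 3100.0])
flux = np.array([12.0, 5.1, 1.4])
flux_err = np.array([1.2, 0.5, 0.2])   # 1-sigma uncertainties

# Linearize S = b * nu**alpha as log S = alpha * log nu + log b.
# The uncertainty of log S is sigma_S / S, so the fit weights are S / sigma_S.
alpha, log_b = np.polyfit(np.log(freqs), np.log(flux), 1, w=flux / flux_err)
print(f"spectral index alpha ≈ {alpha:.2f}")
```

    With only three bands the fit is tightly constrained only for genuinely power-law spectra; curvature over the fitted frequency range is what motivates the separate spectral classifications listed in the catalogue.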

  16. Beating Heart Motion Accurate Prediction Method Based on Interactive Multiple Model: An Information Fusion Approach

    PubMed Central

    Xie, Weihong; Yu, Yang

    2017-01-01

    Robot-assisted motion compensated beating heart surgery has the advantage over the conventional Coronary Artery Bypass Graft (CABG) in terms of reduced trauma to the surrounding structures that leads to shortened recovery time. The severe nonlinear and diverse nature of irregular heart rhythm causes enormous difficulty for the robot to meet clinical requirements, especially under arrhythmias. In this paper, we propose a fusion prediction framework based on an Interactive Multiple Model (IMM) estimator, allowing each model to cover a distinguishing feature of the heart motion in the underlying dynamics. We find that, in the normal state, the nonlinearity of the heart motion with slow time-variant changes dominates the beating process. When an arrhythmia occurs, the irregularity mode, i.e., fast uncertainties with random patterns, becomes the leading factor of the heart motion. We deal with the prediction problem in the case of arrhythmias by estimating the state with two behavior modes which can adaptively “switch” from one to the other. Also, we employ the signal quality index to adaptively determine the switch transition probability in the framework of the IMM. We conduct comparative experiments to evaluate the proposed approach with four distinct datasets. The test results indicate that the new proposed approach reduces prediction errors significantly. PMID:29124062

  17. Beating Heart Motion Accurate Prediction Method Based on Interactive Multiple Model: An Information Fusion Approach.

    PubMed

    Liang, Fan; Xie, Weihong; Yu, Yang

    2017-01-01

    Robot-assisted motion compensated beating heart surgery has the advantage over the conventional Coronary Artery Bypass Graft (CABG) in terms of reduced trauma to the surrounding structures that leads to shortened recovery time. The severe nonlinear and diverse nature of irregular heart rhythm causes enormous difficulty for the robot to meet clinical requirements, especially under arrhythmias. In this paper, we propose a fusion prediction framework based on an Interactive Multiple Model (IMM) estimator, allowing each model to cover a distinguishing feature of the heart motion in the underlying dynamics. We find that, in the normal state, the nonlinearity of the heart motion with slow time-variant changes dominates the beating process. When an arrhythmia occurs, the irregularity mode, i.e., fast uncertainties with random patterns, becomes the leading factor of the heart motion. We deal with the prediction problem in the case of arrhythmias by estimating the state with two behavior modes which can adaptively "switch" from one to the other. Also, we employ the signal quality index to adaptively determine the switch transition probability in the framework of the IMM. We conduct comparative experiments to evaluate the proposed approach with four distinct datasets. The test results indicate that the new proposed approach reduces prediction errors significantly.

  18. Starburst-driven Superwinds in Quasar Host Galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barthel, Peter; Podigachoski, Pece; Wilkes, Belinda

    2017-07-01

    During the past five decades astronomers have been puzzled by the presence of strong absorption features including metal lines, observed in the optical and ultraviolet spectra of quasars, signaling inflowing and outflowing gas winds with relative velocities up to several thousands of km s^-1. In particular, the location of these winds—close to the quasar, further out in its host galaxy, or in its direct environment—and the possible impact on their surroundings have been issues of intense discussion and uncertainty. Using our Herschel Space Observatory data, we report a tendency for this so-called associated metal absorption to occur along with prodigious star formation in the quasar host galaxy, indicating that the two phenomena are likely to be interrelated, that the gas winds likely occur on the kiloparsec scale and would then have a strong impact on the interstellar medium of the galaxy. This correlation moreover would imply that the unusually high cold dust luminosities in these quasars are connected with ongoing star formation. Given that we find no correlation with the AGN strength, the wind feedback that we establish in these radio-loud objects is most likely associated with their host star formation rather than with their black hole accretion.

  19. Influence of methane emissions and vehicle efficiency on the climate implications of heavy-duty natural gas trucks.

    PubMed

    Camuzeaux, Jonathan R; Alvarez, Ramón A; Brooks, Susanne A; Browne, Joshua B; Sterner, Thomas

    2015-06-02

    While natural gas produces lower carbon dioxide emissions than diesel during combustion, if enough methane is emitted across the fuel cycle, then switching a heavy-duty truck fleet from diesel to natural gas can produce net climate damages (more radiative forcing) for decades. Using the Technology Warming Potential methodology, we assess the climate implications of a diesel to natural gas switch in heavy-duty trucks. We consider spark ignition (SI) and high-pressure direct injection (HPDI) natural gas engines and compressed and liquefied natural gas. Given uncertainty surrounding several key assumptions and the potential for technology to evolve, results are evaluated for a range of inputs for well-to-pump natural gas loss rates, vehicle efficiency, and pump-to-wheels (in-use) methane emissions. Using reference case assumptions reflecting currently available data, we find that converting heavy-duty truck fleets leads to damages to the climate for several decades: around 70-90 years for the SI cases, and 50 years for the more efficient HPDI. Our range of results indicates that these fuel switches have the potential to produce climate benefits on all time frames, but combinations of significant well-to-wheels methane emissions reductions and natural gas vehicle efficiency improvements would be required.

  20. A Bayesian phylogenetic study of the Dravidian language family

    PubMed Central

    Kolipakam, Vishnupriya

    2018-01-01

    The Dravidian language family consists of about 80 varieties (Hammarström H. 2016 Glottolog 2.7) spoken by 220 million people across southern and central India and surrounding countries (Steever SB. 1998 In The Dravidian languages (ed. SB Steever), pp. 1–39: 1). Neither the geographical origin of the Dravidian language homeland nor its exact dispersal through time are known. The history of these languages is crucial for understanding prehistory in Eurasia, because despite their current restricted range, these languages played a significant role in influencing other language groups including Indo-Aryan (Indo-European) and Munda (Austroasiatic) speakers. Here, we report the results of a Bayesian phylogenetic analysis of cognate-coded lexical data, elicited first hand from native speakers, to investigate the subgrouping of the Dravidian language family, and provide dates for the major points of diversification. Our results indicate that the Dravidian language family is approximately 4500 years old, a finding that corresponds well with earlier linguistic and archaeological studies. The main branches of the Dravidian language family (North, Central, South I, South II) are recovered, although the placement of languages within these main branches diverges from previous classifications. We find considerable uncertainty with regard to the relationships between the main branches. PMID:29657761

  1. An experimental investigation devoted to determine heat transfer characteristics in a radiant ceiling heating system

    NASA Astrophysics Data System (ADS)

    Koca, Aliihsan; Acikgoz, Ozgen; Çebi, Alican; Çetin, Gürsel; Dalkilic, Ahmet Selim; Wongwises, Somchai

    2018-02-01

    Investigations of the heated-ceiling method are a comparatively new research area relative to the common wall heating-cooling and cooled-ceiling methods. In this work, the heat transfer characteristics of a heated radiant ceiling system were investigated experimentally. Different configurations of a single-room design were used to determine the convective and radiative heat transfer rates. The arrangement of the test chamber, hydraulic circuit and radiant panels, the measurement equipment, and the experimental method, including uncertainty analysis, are described in detail with reference to the relevant international standards. The total heat transfer from the panels was calculated as the sum of radiation to the unheated surfaces, convection to the air, and conduction heat loss from the backside of the panels. The integral expressions for the view factors were evaluated numerically using a Matlab code. Using this experimental chamber, the radiative, convective, and total heat-transfer coefficients, along with the heat fluxes from the ceiling to the unheated surrounding surfaces, were calculated. Moreover, the convective, radiative, and total heat flux and heat output results of 28 experimental case studies are tabulated for other researchers to validate their theoretical models and empirical correlations.

  2. Skeletal mineralogy of newly settling Acropora millepora (Scleractinia) coral recruits

    NASA Astrophysics Data System (ADS)

    Clode, P. L.; Lema, K.; Saunders, M.; Weiner, S.

    2011-03-01

    Knowledge of skeletogenesis in scleractinian corals is central to reconstructing past ocean and climate histories, assessing and counteracting future climate and ocean acidification impacts upon coral reefs, and determining the taxonomy and evolutionary path of the Scleractinia. To better understand skeletogenesis and mineralogy in extant scleractinian corals, we have investigated the nature of the initial calcium carbonate skeleton deposited by newly settling coral recruits. Settling Acropora millepora larvae were sampled daily for 10 days from initial attachment, and the carbonate mineralogy of their newly deposited skeletons was investigated. Bulk analyses using Raman and infrared spectroscopic methods revealed that the skeletons were predominantly comprised of aragonite, with no evidence of calcite or an amorphous precursor phase, although presence of the latter cannot be discounted. Sensitive selected area electron diffraction analyses of sub-micron areas of skeletal regions further consolidated these data. These findings help to address the uncertainty surrounding reported differences in carbonate mineralogy between larval and adult extant coral skeletons by indicating that skeletons of new coral recruits share the same aragonitic mineralogy as those of their mature counterparts. In this respect, we can expect that skeletogenesis in both larval and mature growth stages of scleractinian corals will be similarly affected by ocean acidification and predicted environmental changes.

  3. Matching cue size and task properties in exogenous attention.

    PubMed

    Burnett, Katherine E; d'Avossa, Giovanni; Sapir, Ayelet

    2013-01-01

    Exogenous attention is an involuntary, reflexive orienting response that results in enhanced processing at the attended location. The standard view is that this enhancement generalizes across visual properties of a stimulus. We test whether the size of an exogenous cue sets the attentional field and whether this leads to different effects on stimuli with different visual properties. In a dual task with a random-dot kinematogram (RDK) in each quadrant of the screen, participants discriminated the direction of moving dots in one RDK and localized one red dot. Precues were uninformative and consisted of either a large or a small luminance-change frame. The motion discrimination task showed attentional effects following both large and small exogenous cues. The red dot probe localization task showed attentional effects following a small cue, but not a large cue. Two additional experiments showed that the different effects on localization were not due to reduced spatial uncertainty or suppression of RDK dots in the surround. These results indicate that the effects of exogenous attention depend on the size of the cue and the properties of the task, suggesting the involvement of receptive fields with different sizes in different tasks. These attentional effects are likely to be driven by bottom-up mechanisms in early visual areas.

  4. Measurement uncertainty: Friend or foe?

    PubMed

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
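    The combination of uncertainties along the traceability chain follows the usual root-sum-of-squares rule for independent contributions. A sketch with invented values (in analyte units):

```python
import math

# Invented standard uncertainties for each step of a traceability chain,
# in analyte units; the contributions are assumed independent.
u_reference = 0.8     # higher-order reference material / method
u_calibrator = 1.1    # manufacturer's calibrator value assignment
u_laboratory = 1.5    # imprecision of the laboratory's measuring system

# Combined standard uncertainty (root sum of squares, per the GUM) and
# expanded uncertainty with coverage factor k = 2 (~95% level).
u_combined = math.sqrt(u_reference**2 + u_calibrator**2 + u_laboratory**2)
U_expanded = 2 * u_combined
print(f"combined u = {u_combined:.2f}, expanded U (k=2) = {U_expanded:.2f}")
```

    Because the terms add in quadrature, the largest contributor dominates: halving a small term barely changes the combined uncertainty, which is why the article asks where in the chain reduction effort is best spent.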

  5. Multi-scale landslide hazard assessment: Advances in global and regional methodologies

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang

    2010-05-01

    The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. 
Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.

  6. MODIS land cover uncertainty in regional climate simulations

    NASA Astrophysics Data System (ADS)

    Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.

    2017-12-01

    MODIS land cover datasets are used extensively across the climate modeling community, but their inherent uncertainties and the associated propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands, or between grasslands and barren land, in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined demonstrated impacts across the various regions, with latent heat flux affected most, by a magnitude of 4.32 W/m2 in the domain average. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both the biophysical characteristics and the soil moisture settings associated with land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included in MCD12Q1-fed climate modeling as a routine procedure.

  7. Using spatial uncertainty to manipulate the size of the attention focus.

    PubMed

    Huang, Dan; Xue, Linyan; Wang, Xin; Chen, Yao

    2016-09-01

    Preferentially processing behaviorally relevant information is vital for primate survival. In visuospatial attention studies, manipulating the spatial extent of attention focus is an important question. Although many studies have claimed to successfully adjust attention field size by either varying the uncertainty about the target location (spatial uncertainty) or adjusting the size of the cue orienting the attention focus, no systematic studies have assessed and compared the effectiveness of these methods. We used a multiple cue paradigm with 2.5° and 7.5° rings centered around a target position to measure the cue size effect, while the spatial uncertainty levels were manipulated by changing the number of cueing positions. We found that spatial uncertainty had a significant impact on reaction time during target detection, while the cue size effect was less robust. We also carefully varied the spatial scope of potential target locations within a small or large region and found that this amount of variation in spatial uncertainty can also significantly influence target detection speed. Our results indicate that adjusting spatial uncertainty is more effective than varying cue size when manipulating attention field size.

  8. Inertial acceleration as a measure of linear vection: An alternative to magnitude estimation. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Carpenter-Smith, Theodore R.; Futamura, Robert G.; Parker, Donald E.

    1995-01-01

    The present study focused on the development of a procedure to assess perceived self-motion induced by visual surround motion (vection). Using an apparatus that permitted independent control of visual and inertial stimuli, prone observers were translated along their head x-axis (fore/aft). The observers' task was to report the direction of self-motion during passive forward and backward translations of their bodies coupled with exposure to various visual surround conditions. The proportion of 'forward' responses was used to calculate each observer's point of subjective equality (PSE) for each surround condition. The results showed that the moving visual stimulus produced a significant shift in the PSE when data from the moving surround condition were compared with the stationary surround and no-vision conditions. Further, the results indicated that vection increased monotonically with surround velocities between 4°/s and 40°/s. It was concluded that linear vection can be measured in terms of changes in the amplitude of whole-body inertial acceleration required to elicit equivalent numbers of 'forward' and 'backward' self-motion reports.
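    The PSE computation described above can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code: the response proportions, acceleration levels, and the linear-interpolation approach are all hypothetical assumptions.

```python
def point_of_subjective_equality(levels, p_forward):
    """Estimate the PSE: the stimulus level at which 'forward' and
    'backward' reports are equally likely (p = 0.5), by linear
    interpolation between the two levels bracketing 0.5.
    `levels` must be ascending and p_forward overall increasing."""
    for (x0, p0), (x1, p1) in zip(zip(levels, p_forward),
                                  zip(levels[1:], p_forward[1:])):
        if p0 <= 0.5 <= p1:
            # Interpolate linearly within the bracketing interval.
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("0.5 is not bracketed by the data")

# Hypothetical data: inertial acceleration amplitudes (m/s^2, negative =
# net backward) and the proportion of 'forward' self-motion reports.
accel = [-0.4, -0.2, 0.0, 0.2, 0.4]
p_fwd = [0.05, 0.20, 0.65, 0.90, 0.98]
pse = point_of_subjective_equality(accel, p_fwd)
```

    A PSE shifted away from zero, as here, is how a moving visual surround would manifest: a nonzero inertial acceleration is needed before 'forward' and 'backward' reports balance.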

  9. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rates, and the intake/exhaust dry molar fractions, are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle, and also of engine performance metrics such as gross Integrated Mean Effective Pressure, Heat Release, and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility, and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.
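    The Monte Carlo propagation step mentioned in this record can be sketched generically: sample each measured input from a distribution whose spread equals its standard uncertainty, push the samples through the derived-quantity formula, and read off the output spread. The ideal-gas temperature model and every numerical value below are hypothetical illustrations, not values from the paper's engine facility.

```python
import random
import statistics

def mass_avg_temperature(p_kpa, v_l, n_mol, r=8.314):
    # Hypothetical ideal-gas estimate of the in-cylinder mass-average
    # temperature (K) from pressure (kPa), volume (L) and trapped moles.
    return (p_kpa * 1000.0) * (v_l / 1000.0) / (n_mol * r)

def propagate(n_samples=20000, seed=1):
    rng = random.Random(seed)
    temps = []
    for _ in range(n_samples):
        # Draw each measured input from a normal distribution whose
        # standard deviation is its standard uncertainty (made-up values).
        p = rng.gauss(150.0, 0.5)      # pressure at IVC, kPa
        v = rng.gauss(0.50, 0.002)     # cylinder volume at IVC, L
        n = rng.gauss(0.030, 0.0003)   # trapped moles of charge
        temps.append(mass_avg_temperature(p, v, n))
    # Sample mean and standard deviation of the derived quantity.
    return statistics.fmean(temps), statistics.stdev(temps)

mean_t, std_t = propagate()
```

    With these inputs the relative uncertainties combine roughly in quadrature, so the 1% uncertainty on the trapped moles dominates the temperature spread.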

  10. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rates, and the intake/exhaust dry molar fractions, are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle, and also of engine performance metrics such as gross Integrated Mean Effective Pressure, Heat Release, and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility, and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  11. Model Insensitive and Calibration Independent Method for Determination of the Downstream Neutral Hydrogen Density Through Ly-alpha Glow Observations

    NASA Technical Reports Server (NTRS)

    Gangopadhyay, P.; Judge, D. L.

    1996-01-01

    Our knowledge of the various heliospheric phenomena (location of the solar wind termination shock, heliopause configuration, and very local interstellar medium parameters) is limited by uncertainties in the available heliospheric plasma models and by calibration uncertainties in the observing instruments. There is, thus, a strong motivation to develop model-insensitive and calibration-independent methods to reduce the uncertainties in the relevant heliospheric parameters. We have developed such a method to constrain the downstream neutral hydrogen density inside the heliospheric tail. In our approach we have taken advantage of the relative insensitivity of the downstream neutral hydrogen density profile to the specific plasma model adopted. We have also used the fact that the presence of an asymmetric neutral hydrogen cavity surrounding the sun, characteristic of all neutral density models, results in a higher multiple scattering contribution to the observed glow in the downstream region than in the upstream region. This allows us to approximate the actual density profile with one that is spatially uniform for the purpose of calculating the downstream backscattered glow. Using different spatially constant density profiles, radiative transfer calculations are performed, and the radial dependence of the predicted glow is compared with the observed 1/R dependence of the Pioneer 10 UV data. Such a comparison bounds the large-distance heliospheric neutral hydrogen density in the downstream direction to a value between 0.05 and 0.1/cc.

  12. A revised ground-motion and intensity interpolation scheme for shakemap

    USGS Publications Warehouse

    Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.

    2010-01-01

    We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities, and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
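    An uncertainty-weighted combination of the kind this record describes is commonly realized as an inverse-variance weighted average. The sketch below is a generic illustration under that assumption, not the USGS ShakeMap implementation; the function name and the example numbers are hypothetical.

```python
def weighted_estimate(values, sigmas):
    """Combine several estimates of one quantity at a grid point,
    weighting each by the inverse of its variance, and return both the
    combined estimate and its combined standard deviation (the natural
    by-product 'total uncertainty' at that point)."""
    weights = [1.0 / (s * s) for s in sigmas]
    total_w = sum(weights)
    estimate = sum(w * v for w, v in zip(weights, values)) / total_w
    sigma_total = (1.0 / total_w) ** 0.5
    return estimate, sigma_total

# Hypothetical grid point: a nearby station measurement (small sigma),
# an observation converted from intensity (moderate sigma), and a
# prediction-equation estimate (large sigma), e.g. in ln(PGA) units.
est, sig = weighted_estimate([0.20, 0.35, 0.50], [0.10, 0.30, 0.60])
```

    The combined estimate is pulled toward the best-constrained input, and the combined uncertainty is never larger than that of the most certain input.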

  13. Comparative analysis of the labelling of nanotechnologies across four stakeholder groups

    NASA Astrophysics Data System (ADS)

    Capon, Adam; Gillespie, James; Rolfe, Margaret; Smith, Wayne

    2015-08-01

    Societies are constantly challenged to develop policies around the introduction of new technologies, which by their very nature involve great uncertainty. This uncertainty gives prominence to varying viewpoints that are value laden and have the ability to drastically shift policy. The issue of nanotechnologies is a prime example. The labelling of products that contain new technologies has been one policy tool governments have used to address concerns around uncertainty. Our study develops evidence regarding opinions on the labelling of products made by nanotechnologies. We undertook a computer-assisted telephone interviewing (CATI) survey of the Australian public and of those involved in nanotechnologies from the academic, business, and government sectors, using a standardised questionnaire. Analysis was undertaken using descriptive and logistic regression techniques. We explored reluctance to purchase as a result of labelling products containing manufactured nanomaterials, both generally and across five broad products (food, cosmetics/sunscreens, medicines, pesticides, tennis racquets/computers) that represent the broad categories of products regulated by different government agencies in Australia. We examined the relationship between reluctance to purchase and risk perception, trust, and familiarity. We found that, irrespective of stakeholder group, most respondents supported the labelling of products containing manufactured nanomaterials. Perception of risk was the main driver of reluctance to purchase, while trust and familiarity were likely to have an indirect effect through risk perception. Food is likely to be the product most affected by labelling. Risk perception surrounding nanotechnologies and label 'framing' on the product are key issues to be addressed in the implementation of a labelling scheme.

  14. Effects of physical and biogeochemical processes on aquatic ecosystems at the groundwater-surface water interface: An evaluation of a sulfate-impacted wild rice stream in Minnesota (USA)

    NASA Astrophysics Data System (ADS)

    Ng, G. H. C.; Yourd, A. R.; Myrbo, A.; Johnson, N.

    2015-12-01

    Significant uncertainty and variability in physical and biogeochemical processes at the groundwater-surface water interface complicate how surface water chemistry affects aquatic ecosystems. Questions surrounding a unique 10 mg/L sulfate standard for wild rice (Zizania sp.) waters in Minnesota are driving research to clarify conditions controlling the geochemistry of shallow sediment porewater in stream- and lake-beds. This issue raises the need and opportunity to carry out in-depth, process-based analysis into how water fluxes and coupled C, S, and Fe redox cycles interact to impact aquatic plants. Our study builds on a recent state-wide field campaign that showed that accumulation of porewater sulfide from sulfate reduction impairs wild rice, an annual grass that grows in shallow lakes and streams in the Great Lakes region of North America. Negative porewater sulfide correlations with organic C and Fe quantities also indicated that lower redox rates and greater mineral precipitation attenuate sulfide. Here, we focus on a stream in northern Minnesota that receives high sulfate loading from iron mining activity yet maintains wild rice stands. In addition to organic C and Fe effects, we evaluate the degree to which streambed hydrology, and in particular groundwater contributions, accounts for the active biogeochemistry. We collect field measurements, spanning the surrounding groundwater system to the stream, to constrain a reactive-transport model. Observations from seepage meters, temperature probes, and monitoring wells delineate upward flow that may lessen surface water impacts below the stream. Geochemical analyses of groundwater, porewater, and surface water samples and of sediment extractions reveal distinctions among the different domains and stream banks, which appear to jointly control conditions in the streambed. 
A model based on field conditions can be used to evaluate the relative importance and the spatiotemporal scales of diverse flux and geochemical factors affecting aquatic root zones.

  15. Constraining planetary atmospheric density: application of heuristic search algorithms to aerodynamic modeling of impact ejecta trajectories

    NASA Astrophysics Data System (ADS)

    Liu, Z. Y. C.; Shirzaei, M.

    2015-12-01

    Impact craters on the terrestrial planets are typically surrounded by a continuous ejecta blanket whose initial emplacement is via ballistic sedimentation. Following an impact event, a significant volume of material is ejected and falling debris surrounds the crater. Aerodynamics governs the flight paths and determines the spatial distribution of these ejecta. Thus, for planets with an atmosphere, the preserved ejecta deposit directly records the interaction of the ejecta and the atmosphere at the time of impact. In this study, we develop a new framework to establish links between the distribution of the ejecta, the age of the impact, and the properties of the local atmosphere. Given the radial distance of the continuous ejecta extent from the crater, an inverse aerodynamic modeling approach is employed to estimate the local atmospheric drag and density, as well as the lift forces, at the time of impact. Based on earlier studies, we incorporate reasonable value ranges for ejection angle, initial velocity, aerodynamic drag, and lift in the model. To solve the trajectory differential equations and obtain the best estimate of atmospheric density and the associated uncertainties, a genetic algorithm is applied. The method is validated using synthetic data sets as well as detailed maps of impact ejecta associated with five fresh martian and two lunar impact craters, with diameters of 20-50 m and 10-20 m, respectively. The estimated air density for the martian craters ranges from 0.014 to 0.028 kg/m3, consistent with recent surface atmospheric density measurements of 0.015-0.020 kg/m3. This consistency indicates the robustness of the presented methodology. The inversion results for the lunar craters yield air densities of 0.003-0.008 kg/m3, suggesting that the inversion results are accurate to the second decimal place. This framework will be applied to older martian craters with preserved ejecta blankets, which is expected to help constrain the long-term evolution of the martian atmosphere.
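    The forward-and-invert idea in this record can be sketched in miniature: integrate a ballistic trajectory with aerodynamic drag, then search for the atmospheric density that reproduces an observed ejecta extent. Everything below is a hypothetical toy (spherical particle, forward Euler integration, fixed launch conditions, and a brute-force grid search standing in for the paper's genetic algorithm).

```python
import math

def ejecta_range(v0, angle_deg, rho, cd=1.0, r=0.05,
                 rock_density=3000.0, g=3.71):
    """Downrange distance (m) of a spherical ejecta particle launched
    at speed v0 (m/s) and the given angle, under gravity g (default:
    Mars surface gravity) and quadratic aerodynamic drag in air of
    density rho (kg/m^3). Forward Euler integration."""
    m = rock_density * (4.0 / 3.0) * math.pi * r ** 3
    area = math.pi * r ** 2
    k = 0.5 * rho * cd * area / m          # drag coefficient per unit mass
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    dt = 0.01
    while y >= 0.0:
        v = math.hypot(vx, vy)
        x += vx * dt
        y += vy * dt
        vx += -k * v * vx * dt             # drag opposes velocity
        vy += (-g - k * v * vy) * dt
    return x

def invert_density(observed_range, v0=100.0, angle_deg=45.0):
    """Grid search for the density best reproducing the observed extent
    (a stand-in for the paper's genetic-algorithm inversion)."""
    best_rho, best_err = None, float("inf")
    for i in range(1, 61):
        rho = i * 0.001                    # candidates 0.001..0.060 kg/m^3
        err = abs(ejecta_range(v0, angle_deg, rho) - observed_range)
        if err < best_err:
            best_rho, best_err = rho, err
    return best_rho

synthetic = ejecta_range(100.0, 45.0, 0.020)  # "observed" extent, rho = 0.020
recovered = invert_density(synthetic)
```

    Because range decreases monotonically with density here, the search recovers the density used to generate the synthetic observation, mirroring the synthetic-data validation step the record describes.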

  16. Recognition of upper airway and surrounding structures at MRI in pediatric PCOS and OSAS

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, J. K.; Odhner, D.; Sin, Sanghun; Arens, Raanan

    2013-03-01

    Obstructive Sleep Apnea Syndrome (OSAS) is common in obese children, with the risk being 4.5-fold that of normal control subjects. Polycystic Ovary Syndrome (PCOS) has recently been shown to be associated with OSAS, which may further lead to significant cardiovascular and neuro-cognitive deficits. We are investigating image-based biomarkers to understand the architectural and dynamic changes in the upper airway and the surrounding hard and soft tissue structures via MRI in obese teenage children to study OSAS. At previous SPIE conferences, we presented methods underlying Fuzzy Object Models (FOMs) for Automatic Anatomy Recognition (AAR) based on CT images of the thorax and the abdomen. The purpose of this paper is to demonstrate that the AAR approach is applicable to a different body region and image modality combination, namely the study of upper airway structures via MRI. FOMs were built hierarchically, with smaller sub-objects forming the offspring of larger parent objects. FOMs encode the uncertainty and variability present in the form and relationships among the objects over a study population. In total, 11 basic objects (17 including composite objects) were modeled. Automatic recognition of the best pose of FOMs in a given image was implemented using four methods: a one-shot method that requires no search, and three search-based methods, namely Fisher Linear Discriminant (FLD) analysis, a b-scale energy optimization strategy, and an optimum-threshold recognition method. In all, 30 multi-fold cross-validation experiments based on 15 patient MRI data sets were carried out to assess the accuracy of recognition. The results indicate that the objects can be recognized with an average location error of less than 5 mm, or 2-3 voxels. The iterative relative fuzzy connectedness (IRFC) algorithm was then adopted for delineation of the target organs based on the recognition results. The delineation results showed overall FP and TP volume fractions of 0.02 and 0.93.

  17. Simulated impacts of SO2 emissions from the Miyake volcano on concentration and deposition of sulfur oxides in September and October of 2000

    NASA Astrophysics Data System (ADS)

    An, Junling; Ueda, Hiromasa; Matsuda, Kazuhide; Hasome, Hisashi; Iwata, Motokazu

    A regional air quality Eulerian model was run for 2 months (September and October of 2000), with and without SO2 emissions from the Miyake volcano, to investigate the effects of changes in the volcanic emissions on SO2 and sulfate concentrations and total sulfur deposition in the surrounding areas. Volcanic emissions were injected into different model layers in different proportions within the planetary boundary layer (PBL), whereas the other emissions were released in the first model layer above the ground. Meteorological fields four times per day were taken from the National Centers for Environmental Prediction (NCEP). Eight Japanese monitoring sites of EANET (Acid Deposition Monitoring Network in East Asia) were used for the model evaluation. Simulations indicate that emissions from the Miyake volcano increase SO2 and sulfate concentrations in the surrounding downwind areas in the PBL by up to 300% and 150%, respectively, and increase SO2 levels in an area ~390 km north of the Miyake site in the free troposphere (FTR) by up to 120%. Total sulfur deposition amounts per month are also increased by up to 300%. Daily SO2 concentrations in different model layers display strong variability (10-450%) at sites significantly influenced by the volcano. Comparison shows that the RAQM model predicts daily SO2 variations at relatively clean sites better than at inland sites closer to volcanoes, that the model captures well the timing of SO2 peaks caused by large changes in SO2 emissions from the Miyake volcano at most chosen sites, and that monthly simulated sulfate concentrations in rainwater agree quite well with observations, with differences within a factor of 2. Improvement in the spatial and temporal resolutions of the meteorological data and removal of the uncertainty in other volcanic emissions may further improve the simulations.

  18. Reassessing biases and other uncertainties in sea surface temperature observations measured in situ since 1850: 2. Biases and homogenization

    NASA Astrophysics Data System (ADS)

    Kennedy, J. J.; Rayner, N. A.; Smith, R. O.; Parker, D. E.; Saunby, M.

    2011-07-01

    Changes in instrumentation and data availability have caused time-varying biases in estimates of global and regional average sea surface temperature. The sizes of the biases arising from these changes are estimated and their uncertainties evaluated. The estimated biases and their associated uncertainties are largest during the period immediately following the Second World War, reflecting the rapid and incompletely documented changes in shipping and data availability at the time. Adjustments have been applied to reduce these effects in gridded data sets of sea surface temperature and the results are presented as a set of interchangeable realizations. Uncertainties of estimated trends in global and regional average sea surface temperature due to bias adjustments since the Second World War are found to be larger than uncertainties arising from the choice of analysis technique, indicating that this is an important source of uncertainty in analyses of historical sea surface temperatures. Despite this, trends over the twentieth century remain qualitatively consistent.

  19. Studying the effect of clinical uncertainty on physicians' decision-making using ILIAD.

    PubMed

    Anderson, J D; Jay, S J; Weng, H C; Anderson, M M

    1995-01-01

    The influence of uncertainty on physicians' practice behavior is not well understood. In this research, ILIAD, a diagnostic expert system, has been used to study physicians' responses to uncertainty and how their responses affected clinical performance. The simulation mode of ILIAD was used to standardize the presentation and scoring of two cases to 46 residents in emergency medicine, internal medicine, family practice and transitional medicine at Methodist Hospital of Indiana. A questionnaire was used to collect additional data on how physicians respond to clinical uncertainty. A structural equation model was developed, estimated, and tested. The results indicate that stress that physicians experience in dealing with clinical uncertainty has a negative effect on their clinical performance. Moreover, the way that physicians respond to uncertainty has positive and negative effects on their performance. Open discussions with patients about clinical decisions and the use of practice guidelines improves performance. However, when the physician's clinical decisions are influenced by patient demands or their peers, their performance scores decline.

  20. Ethical considerations in genomic testing for hematologic disorders.

    PubMed

    Marron, Jonathan M; Joffe, Steven

    2017-07-27

    As our technological capacities improve, genomic testing is increasingly integrating into patient care. The field of clinical hematology is no exception. Genomic testing carries great promise, but several ethical issues must be considered whenever such testing is performed. This review addresses these ethical considerations, including issues surrounding informed consent and the uncertainty of the results of genomic testing; the challenge of incidental findings; and possible inequities in access to and benefit from such testing. Genomic testing is likely to transform the practice of both benign and malignant hematology, but clinicians must carefully consider these core ethical issues in order to make the most of this exciting and evolving technology. © 2017 by The American Society of Hematology.

  1. Government health policy and the diffusion of new medical devices.

    PubMed Central

    Hillman, B J

    1986-01-01

    The combination of absent financial incentives, aspects of physicians' clinical training, and the uncertainty surrounding the appropriate application of expensive new medical devices has been the most significant factor in promoting their wasteful diffusion and use. This presentation summarizes the forces that have resulted in regulatory and reimbursement initiatives to make the acquisition and utilization of new medical devices more efficient. The case histories of computed tomography (CT) and magnetic resonance imaging (MRI) serve as a paradigm demonstrating why such initiatives have thus far proved ineffectual. A more effective approach would be to abandon distinctions between inpatient and outpatient reimbursement for the use of new medical devices and to improve the relationship between reimbursement and technology assessment. PMID:3818311

  2. Public policy action and CCC implementation: benefits and hurdles

    PubMed Central

    Daniel, Kelley; Gurian, Gary L.; Petherick, J. T.; Stockmyer, Chris; David, Annette M.; Miller, Sara E.

    2010-01-01

    Policy change continues to be an increasingly effective means of advancing the agenda of comprehensive cancer control. Efforts have moved progressively from describing how public policy can enhance the comprehensive cancer control agenda to implementation of public policy best practices at both the state and federal levels. The current political and economic contexts bring additional challenges and opportunities to the efforts surrounding comprehensive cancer control and policy. The purpose of this paper is to highlight recent policy successes, to illustrate the importance of policy as a means of advancing the comprehensive cancer control agenda, and to discuss continued policy action as we move forward in a time of healthcare reform and continuing economic uncertainty. PMID:21086034

  3. Accounting for susceptibility in risk assessment: The need for full disclosure.

    PubMed

    Fowle Iii, J R

    1997-12-01

    Many environmental laws create the unrealistic expectation that science can be used to determine 'safety'. The many uncertainties surrounding environmental risks, as well as individual, group, and societal differences about what is considered 'safe', make it inevitable that policy decisions must be made. It is appropriate that such decisions be shaped by politics and social issues, as well as informed by science and economics, but care should be taken to distinguish between policy and fact. Not much is known about the nature and magnitude of environmental susceptibilities. Credible environmental decisions require that scientists, risk assessors, and decision-makers acknowledge this, and that they take care to distinguish policy calls from scientific fact.

  4. Strong stellar winds.

    PubMed

    Conti, P S; McCray, R

    1980-04-04

    The hottest and most luminous stars lose a substantial fraction of their mass in strong stellar winds. These winds not only affect the evolution of the star, they also carve huge expanding cavities in the surrounding interstellar medium, possibly affecting star formation. The winds are probably driven by radiation pressure, but uncertainties persist in their theoretical description. Strong x-ray sources associated with a few of these hot stars may be used to probe the stellar winds. The nature of the weak x-ray sources recently observed to be associated with many of these stars is uncertain. It is suggested that roughly 10 percent of the luminous hot stars may have as companions neutron stars or black holes orbiting within the stellar winds.

  5. Using scenario analysis to determine managed care strategy.

    PubMed

    Krentz, S E; Gish, R S

    2000-09-01

    In today's volatile healthcare environment, traditional planning tools are inadequate to guide financial managers of provider organizations in developing managed care strategies. These tools often disregard the uncertainty surrounding market forces such as employee benefit structure, the future of Medicare managed care, and the impact of consumer behavior. Scenario analysis overcomes this limitation by acknowledging the uncertain healthcare environment and articulating a set of plausible alternative futures, thus supplying financial executives with the perspective to craft strategies that can improve the market position of their organizations. By being alert for trigger points that might signal the rise of a specific scenario, financial managers can increase their preparedness for changes in market forces.

  6. The evolution of financial incentives in the U.S. health care system.

    PubMed

    Darves-Bornoz, Annie L; Resnick, Matthew J

    2017-01-01

    The U.S. health care system continues to evolve toward value-based payment, rewarding providers based upon outcomes per dollar spent. To date, payment innovation has largely targeted primary care, with little consideration of the role of surgical specialists. As such, appropriate uncertainty remains surrounding the optimal role of the urologic oncologist in alternative payment models. This commentary summarizes the context of U.S. health care reform and offers insights into supply-side innovations, including accountable care organizations and bundled payments. We also discuss rising out-of-pocket health care expenditures, the health care consumerism to which they give rise, and the implications therein.

  7. Public policy action and CCC implementation: benefits and hurdles.

    PubMed

    Steger, Carter; Daniel, Kelley; Gurian, Gary L; Petherick, J T; Stockmyer, Chris; David, Annette M; Miller, Sara E

    2010-12-01

    Policy change continues to be an increasingly effective means of advancing the agenda of comprehensive cancer control. Efforts have moved progressively from describing how public policy can enhance the comprehensive cancer control agenda to implementation of public policy best practices at both the state and federal levels. The current political and economic contexts bring additional challenges and opportunities to the efforts surrounding comprehensive cancer control and policy. The purpose of this paper is to highlight recent policy successes, to illustrate the importance of policy as a means of advancing the comprehensive cancer control agenda, and to discuss continued policy action as we move forward in a time of healthcare reform and continuing economic uncertainty.

  8. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of these random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  9. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections. To reduce the large uncertainties in water availability projections, it is suggested that efforts focus on understanding and modelling both the large-scale processes and their teleconnections with Middle East climate, and the localized processes involved in orographic precipitation.

  10. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2015-05-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties along the entire causal chain. We estimate uncertainties in economic data, multi-pollutant emission statistics, and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocation, the mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. Based on our assumptions, which exclude correlations in the economic data, the uncertainty in the economic data appears to have a relatively small impact on uncertainty at the national level in comparison to emissions and metric uncertainty. Much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production-based emissions, since the largest uncertainties are due to the metric and emissions, which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to differences in pollutant composition. We find global sectoral consumption uncertainties in the range of ±10 to ±27 % using the Global Temperature Potential with a 50-year time horizon, with metric uncertainties dominating. National-level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. 
The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9 to ±25 %, with metric and emission uncertainties contributing similarly. The absolute global temperature potential (AGTP) with a 50-year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.
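    The Monte Carlo propagation described above can be sketched in miniature: draw emission and metric values from their uncertainty distributions, multiply, and summarize the resulting spread. The sketch below is illustrative only; the emission total, relative uncertainties, and GTP-like metric factor are invented numbers, not values from the study.

    ```python
    import random
    import statistics

    def mc_temperature_uncertainty(emission_gt, emission_rel_sd,
                                   gtp_factor, gtp_rel_sd,
                                   n=20_000, seed=1):
        """Monte Carlo propagation of emission and metric uncertainty
        into a temperature-change contribution (toy, multiplicative model)."""
        rng = random.Random(seed)
        samples = []
        for _ in range(n):
            e = emission_gt * (1 + rng.gauss(0, emission_rel_sd))  # emission draw
            m = gtp_factor * (1 + rng.gauss(0, gtp_rel_sd))        # metric draw
            samples.append(e * m)                                  # temperature term
        samples.sort()
        median = statistics.median(samples)
        lo, hi = samples[int(0.025 * n)], samples[int(0.975 * n)]  # 95% interval
        return median, lo, hi

    # hypothetical region: 10 Gt emissions, 10% emission sd, 20% metric sd
    med, lo, hi = mc_temperature_uncertainty(10.0, 0.10, 5e-4, 0.20)
    print(f"median={med:.4f} K, 95% interval=({lo:.4f}, {hi:.4f}) K")
    ```

    Reporting the interval as a ± percentage of the median reproduces the style of range quoted in the abstract.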

  11. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.

  12. Assessing uncertainty in ecological systems using global sensitivity analyses: a case example of simulated wolf reintroduction effects on elk

    USGS Publications Warehouse

    Fieberg, J.; Jenkins, Kurt J.

    2005-01-01

    Often landmark conservation decisions are made despite an incomplete knowledge of system behavior and inexact predictions of how complex ecosystems will respond to management actions. For example, predicting the feasibility and likely effects of restoring top-level carnivores such as the gray wolf (Canis lupus) to North American wilderness areas is hampered by incomplete knowledge of predator-prey system processes and properties. In such cases, global sensitivity measures, such as Sobol' indices, allow one to quantify the effect of these uncertainties on model predictions. Sobol' indices are calculated by decomposing the variance in model predictions (due to parameter uncertainty) into main effects of model parameters and their higher order interactions. Model parameters with large sensitivity indices can then be identified for further study in order to improve predictive capabilities. Here, we illustrate the use of Sobol' sensitivity indices to examine the effect of parameter uncertainty on the predicted decline of elk (Cervus elaphus) population sizes following a hypothetical reintroduction of wolves to Olympic National Park, Washington, USA. The strength of density dependence acting on survival of adult elk and the magnitude of predation were the most influential factors controlling elk population size following a simulated wolf reintroduction. In particular, the form of density dependence in natural survival rates and the per-capita predation rate together accounted for over 90% of variation in simulated elk population trends. Additional research on wolf predation rates on elk and natural compensations in prey populations is needed to reliably predict the outcome of predator-prey system behavior following wolf reintroductions.
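    The variance decomposition behind a first-order Sobol' index, S_i = Var(E[Y|X_i]) / Var(Y), can be estimated by brute force: fix the parameter of interest, average the model over the other parameters, and take the variance of those conditional means. The toy `elk_decline` function and its coefficients below are hypothetical stand-ins, not the authors' simulation model.

    ```python
    import random
    import statistics

    def elk_decline(density_dep, predation_rate):
        # hypothetical toy response: relative elk decline after reintroduction
        return 0.6 * density_dep + 1.4 * predation_rate \
            + 0.2 * density_dep * predation_rate

    def first_order_sobol(f, which, n_outer=200, n_inner=200, seed=2):
        """Brute-force first-order Sobol index S_i = Var(E[Y|X_i]) / Var(Y)
        for a function of two independent uniform(0,1) inputs."""
        rng = random.Random(seed)
        cond_means, all_y = [], []
        for _ in range(n_outer):
            xi = rng.random()                  # fix the parameter of interest
            ys = []
            for _ in range(n_inner):
                xj = rng.random()              # vary the other parameter
                args = (xi, xj) if which == 0 else (xj, xi)
                ys.append(f(*args))
            cond_means.append(statistics.fmean(ys))
            all_y.extend(ys)
        return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

    s_density = first_order_sobol(elk_decline, which=0)
    s_predation = first_order_sobol(elk_decline, which=1)
    print(f"S_density ~ {s_density:.2f}, S_predation ~ {s_predation:.2f}")
    ```

    With these invented coefficients the predation term carries most of the output variance; more efficient estimators (e.g. Saltelli sampling) are preferred in practice.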

  13. Assessing and mapping spatial associations among oral cancer mortality rates, concentrations of heavy metals in soil, and land use types based on multiple scale data.

    PubMed

    Lin, Wei-Chih; Lin, Yu-Pin; Wang, Yung-Chieh; Chang, Tsun-Kuo; Chiang, Li-Chi

    2014-02-21

    In this study, a deconvolution procedure was used to create a variogram of oral cancer (OC) rates. Based on the variogram, area-to-point (ATP) Poisson kriging and p-field simulation were used to downscale and simulate, respectively, the OC rate data for Taiwan from the district scale to a 1 km × 1 km grid scale. Local cluster analysis (LCA) of OC mortality rates was then performed to identify OC mortality rate hot spots based on the downscaled and the p-field-simulated OC mortality maps. The relationship between OC mortality and land use was studied by overlapping the maps of the downscaled OC mortality, the LCA results, and the land uses. One thousand simulations were performed to quantify local and spatial uncertainties in the LCA to identify OC mortality hot spots. The scatter plots and Spearman's rank correlation yielded the relationship between OC mortality and concentrations of the seven metals in the 1 km cell grid. The correlation analysis results for the 1 km scale revealed a weak correlation between OC mortality rate and concentrations of the seven studied heavy metals in soil. Accordingly, the heavy metal concentrations in soil are not major determinants of OC mortality rates at the 1 km scale at which soils were sampled. The LCA statistical results for local indicator of spatial association (LISA) revealed that the sites with high probability of high-high (high value surrounded by high values) OC mortality at the 1 km grid scale were clustered in southern, eastern, and mid-western Taiwan. The number of such sites was also significantly higher on agricultural land and in urban regions than on land with other uses. The proposed approach can be used to downscale and evaluate uncertainty in mortality data from a coarse scale to a fine scale at which useful additional information can be obtained for assessing and managing land use and risk.
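    Spearman's rank correlation, as used in the study above, is simply the Pearson correlation computed on ranks; a weak monotone association between metal concentration and mortality yields a rho near zero. A self-contained sketch, with invented grid-cell values, is:

    ```python
    def rank(values):
        """Average ranks (ties share the mean rank), 1-based."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        ranks = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            # extend over a run of tied values
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman_rho(x, y):
        """Spearman's rank correlation: Pearson correlation of the ranks."""
        rx, ry = rank(x), rank(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        vx = sum((a - mx) ** 2 for a in rx)
        vy = sum((b - my) ** 2 for b in ry)
        return cov / (vx * vy) ** 0.5

    # hypothetical 1 km cells: soil metal concentration vs. OC mortality rate
    metal = [1.2, 3.4, 2.2, 5.1, 4.0, 2.9]
    mortality = [0.8, 1.1, 1.0, 1.3, 0.9, 1.2]
    print(f"rho = {spearman_rho(metal, mortality):.2f}")
    ```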

  14. Weichselian permafrost depth in the Netherlands: a comprehensive uncertainty and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Govaerts, Joan; Beerten, Koen; ten Veen, Johan

    2016-11-01

    The Rupelian clay in the Netherlands is currently the subject of a feasibility study with respect to the storage of radioactive waste in the Netherlands (OPERA-project). Many features need to be considered in the assessment of the long-term evolution of the natural environment surrounding a geological waste disposal facility. One of these is permafrost development, as it may have an impact on various components of the disposal system, including the natural environment (hydrogeology), the natural barrier (clay) and the engineered barrier. Determining how deep permafrost might develop in the future is desirable in order to properly address the possible impact on the various components. It is expected that periglacial conditions will reappear at some point during the next several hundred thousand years, a typical time frame considered in geological waste disposal feasibility studies. In this study, the Weichselian glaciation is used as an analogue for future permafrost development. Permafrost depth modelling using a best estimate temperature curve of the Weichselian indicates that permafrost would reach depths between 155 and 195 m. Without imposing a climatic gradient over the country, the deepest permafrost is expected in the south due to the lower geothermal heat flux and higher average sand content of the post-Rupelian overburden. Accounting for various sources of uncertainty, such as type and impact of vegetation, snow cover, surface temperature gradients across the country, possible errors in palaeoclimate reconstructions, porosity, lithology and geothermal heat flux, stochastic calculations indicate that permafrost depth during the coldest stages of a glacial cycle such as the Weichselian, for any location in the Netherlands, would be 130-210 m at the 2σ level. In any case, permafrost would not reach depths greater than 270 m. 
The most sensitive parameters in permafrost development are the mean annual air temperatures and porosity, while the geothermal heat flux is the crucial parameter in permafrost degradation once temperatures start rising again.

  15. An exercise intervention to prevent falls in Parkinson’s: an economic evaluation

    PubMed Central

    2012-01-01

    Background People with Parkinson’s (PwP) experience frequent and recurrent falls. As these falls may have devastating consequences, there is an urgent need to identify cost-effective interventions with the potential to reduce falls in PwP. The purpose of this economic evaluation is to compare the costs and cost-effectiveness of a targeted exercise programme versus usual care for PwP who were at risk of falling. Methods One hundred and thirty participants were recruited through specialist clinics, primary care and Parkinson’s support groups and randomised to either an exercise intervention or usual care. Health and social care utilisation and health-related quality of life (EQ-5D) were assessed over the 20 weeks of the study (ten-week intervention period and ten-week follow-up period), and these data were complete for 93 participants. Incremental cost per quality adjusted life year (QALY) was estimated. The uncertainty around costs and QALYs was represented using cost-effectiveness acceptability curves. Results The mean cost of the intervention was £76 per participant. Although the differences favoured the exercise intervention, there were no statistically significant differences between groups in total healthcare costs (−£128, 95% CI: -734 to 478), combined health and social care costs (−£35, 95% CI: -817 to 746) or QALYs (0.03, 95% CI: -0.02 to 0.03) at 20 weeks. Nevertheless, exploration of the uncertainty surrounding these estimates suggests there is more than 80% probability that the exercise intervention is a cost-effective strategy relative to usual care. Conclusion Whilst we found no difference between groups in total healthcare costs, total social care costs and QALYs, analyses indicate that there is a high probability that the exercise intervention is cost-effective compared with usual care. These results require confirmation by larger trial-based economic evaluations and over the longer term. PMID:23176532
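    A cost-effectiveness acceptability curve of the kind used here reports, for each willingness-to-pay threshold λ, the share of simulated incremental (cost, QALY) pairs with positive net monetary benefit, NMB = λ·ΔQALY − Δcost. The sketch below is illustrative: the replicate means and spreads are invented, not the trial's bootstrap output.

    ```python
    import random

    def ceac(delta_costs, delta_qalys, thresholds):
        """Cost-effectiveness acceptability curve: for each willingness-to-pay
        threshold lam, the fraction of replicates with lam*dQALY - dCost > 0."""
        n = len(delta_costs)
        return [sum(lam * q - c > 0 for c, q in zip(delta_costs, delta_qalys)) / n
                for lam in thresholds]

    # hypothetical replicates: small cost saving, small QALY gain, wide spread
    rng = random.Random(0)
    d_cost = [rng.gauss(-35, 400) for _ in range(5000)]     # pounds
    d_qaly = [rng.gauss(0.005, 0.013) for _ in range(5000)]
    thresholds = [0, 20_000, 30_000]
    for lam, p in zip(thresholds, ceac(d_cost, d_qaly, thresholds)):
        print(f"lambda = £{lam:>6}: P(cost-effective) = {p:.2f}")
    ```

    A statement like "more than 80% probability of cost-effectiveness" is read off such a curve at the chosen threshold.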

  16. Structural Limitations of Model Reference Adaptive Controllers

    DTIC Science & Technology

    1989-04-01

    [Abstract garbled in the source scan; the recoverable fragments refer to a design rule, bounds under structured and unstructured uncertainty, and the output of a structured singular value analysis.]

  17. Scientific Uncertainty in News Coverage of Cancer Research: Effects of Hedging on Scientists' and Journalists' Credibility

    ERIC Educational Resources Information Center

    Jensen, Jakob D.

    2008-01-01

    News reports of scientific research are rarely hedged; in other words, the reports do not contain caveats, limitations, or other indicators of scientific uncertainty. Some have suggested that hedging may influence news consumers' perceptions of scientists' and journalists' credibility (perceptions that may be related to support for scientific…

  18. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

  19. Observational constraints indicate risk of drying in the Amazon basin.

    PubMed

    Shiogama, Hideo; Emori, Seita; Hanasaki, Naota; Abe, Manabu; Masutomi, Yuji; Takahashi, Kiyoshi; Nozawa, Toru

    2011-03-29

    Climate warming due to human activities will be accompanied by hydrological cycle changes. Economies, societies and ecosystems in South America are vulnerable to such water resource changes. Hence, water resource impact assessments for South America, and corresponding adaptation and mitigation policies, have attracted increased attention. However, substantial uncertainties remain in the current water resource assessments that are based on multiple coupled Atmosphere Ocean General Circulation models. This uncertainty varies from significant wetting to catastrophic drying. By applying a statistical method, we characterized the uncertainty and identified global-scale metrics for measuring the reliability of water resource assessments in South America. Here, we show that, although the ensemble mean assessment suggested wetting across most of South America, the observational constraints indicate a higher probability of drying in the Amazon basin. Thus, over-reliance on the consensus of models can lead to inappropriate decision making.

  20. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  1. Potential future land use threats to California's protected areas

    USGS Publications Warehouse

    Wilson, Tamara Sue; Sleeter, Benjamin Michael; Davis, Adam Wilkinson

    2015-01-01

    Increasing pressures from land use coupled with future changes in climate will present unique challenges for California’s protected areas. We assessed the potential for future land use conversion on land surrounding existing protected areas in California’s twelve ecoregions, utilizing annual, spatially explicit (250 m) scenario projections of land use for 2006–2100 based on the Intergovernmental Panel on Climate Change Special Report on Emission Scenarios to examine future changes in development, agriculture, and logging. We calculated a conversion threat index (CTI) for each unprotected pixel, combining land use conversion potential with proximity to protected area boundaries, in order to identify ecoregions and protected areas at greatest potential risk of proximal land conversion. Our results indicate that California’s Coast Range ecoregion had the highest CTI with competition for extractive logging placing the greatest demand on land in close proximity to existing protected areas. For more permanent land use conversions into agriculture and developed uses, our CTI results indicate that protected areas in the Central California Valley and Oak Woodlands are most vulnerable. Overall, the Eastern Cascades, Central California Valley, and Oak Woodlands ecoregions had the lowest areal percent of protected lands and highest conversion threat values. With limited resources and time, rapid, landscape-level analysis of potential land use threats can help quickly identify areas with higher conversion probability of future land use and potential changes to both habitat and potential ecosystem reserves. Given the broad range of future uncertainties, LULC projections are a useful tool allowing land managers to visualize alternative landscape futures, improve planning, and optimize management practices.

  2. Molecular Gas toward the Gemini OB1 Molecular Cloud Complex. II. CO Outflow Candidates with Possible WISE Associations

    NASA Astrophysics Data System (ADS)

    Li, Yingjie; Li, Fa-Cheng; Xu, Ye; Wang, Chen; Du, Xin-Yu; Yang, Wenjin; Yang, Ji

    2018-03-01

    We present a large-scale survey of CO outflows in the Gem OB1 molecular cloud complex and its surroundings, using the Purple Mountain Observatory Delingha 13.7 m telescope. A total of 198 outflow candidates were identified over a large area (∼58.5 square degrees), of which 193 are newly detected. Approximately 68% (134/198) are associated with the Gem OB1 molecular cloud complex, including clouds GGMC 1, GGMC 2, BFS 52, GGMC 3, and GGMC 4. Other regions studied are: the Local arm (Local Lynds, West Front), Swallow, Horn, and Remote cloud. Outflow candidates in GGMC 1, BFS 52, and Swallow are mainly located at ring-like or filamentary structures. To avoid excessive uncertainty in distant regions (≳3.8 kpc), we only estimated the physical parameters for clouds in the Gem OB1 molecular cloud complex and in the Local arm. In those clouds, the total kinetic energy and the energy injection rate of the identified outflow candidates are ≲1% and ≲3% of the turbulent energy and the turbulent dissipation rate of each cloud, indicating that the identified outflow candidates cannot provide enough energy to balance turbulence of their host cloud at the scale of the entire cloud (several to dozens of parsecs). The gravitational binding energy of each cloud is ≳135 times the total kinetic energy of the identified outflow candidates within the corresponding cloud, indicating that the identified outflow candidates cannot cause major disruptions to the integrity of their host cloud at the scale of the entire cloud.

  3. Physiological characteristics of elite soccer players.

    PubMed

    Tumilty, D

    1993-08-01

    Soccer is one of the most popular sports in the world. There is still much uncertainty and debate surrounding its physiological requirements because of an emphasis on skills to the neglect of fitness, conservative training methods, and the difficulty of studying the sport scientifically. The frequently found values for total distance covered in a game of about 10 km and an above-average, though not outstanding, maximum oxygen uptake of 60 ml/kg/min suggest a moderate overall aerobic demand. A comparison of top teams and players with less able participants indicates that the components of anaerobic fitness (speed, power, strength and the capacity of the lactic acid system) may differentiate better between the two groups. Generally, there is a reduction in the level of activity in the second half of games compared with the first. There is some evidence that increased aerobic fitness may help counteract this. Progressively lower muscle glycogen stores are one likely cause of the reduction in activity, and nutrition also appears to be a key factor in minimising performance deterioration, both in terms of overall diet and, more particularly, the ingestion of carbohydrates immediately before, during and after a game. There are evolutionary trends in the sport, such as greater frequency of games, changes in the roles of players, and new strategies and tactics, which are placing increasing demands on the all-round fitness of players. Many studies indicate scope for improvement in player fitness. The challenge for coaches and players is to meet these fitness requirements without sacrificing the skill work which makes the sport unique.

  4. Tweeting nano: how public discourses about nanotechnology develop in social media environments

    NASA Astrophysics Data System (ADS)

    Runge, Kristin K.; Yeo, Sara K.; Cacciatore, Michael; Scheufele, Dietram A.; Brossard, Dominique; Xenos, Michael; Anderson, Ashley; Choi, Doo-hun; Kim, Jiyoun; Li, Nan; Liang, Xuan; Stubbings, Maria; Su, Leona Yi-Fan

    2013-01-01

    The growing popularity of social media as a channel for distributing and debating scientific information raises questions about the types of discourse that surround emerging technologies, such as nanotechnology, in online environments, as well as the different forms of information that audiences encounter when they use these online tools of information sharing. This study maps the landscape surrounding social media traffic about nanotechnology. Specifically, we use computational linguistic software to analyze a census of all English-language nanotechnology-related tweets expressing opinions posted on Twitter between September 1, 2010 and August 31, 2011. Results show that 55 % of tweets expressed certainty and 45 % expressed uncertainty. Twenty-seven percent of tweets expressed optimistic outlooks, 32 % expressed neutral outlooks and 41 % expressed pessimistic outlooks. Tweets were mapped by U.S. state, and our data show that tweets are more likely to originate from states with a federally funded National Nanotechnology Initiative center or network. The trend toward certainty in opinion coupled with the distinct geographic origins of much of the social media traffic on Twitter for nanotechnology-related opinion has significant implications for understanding how key online influencers are debating and positioning the issue of nanotechnology for lay and policy audiences.

  5. Behavioural thermoregulation and the relative roles of convection and radiation in a basking butterfly.

    PubMed

    Barton, Madeleine; Porter, Warren; Kearney, Michael

    2014-04-01

    Poikilothermic animals are often reliant on behavioural thermoregulation to elevate core-body temperature above the temperature of their surroundings. Butterflies are able to do this by altering body posture and location while basking; however, the specific mechanisms that achieve such regulation vary among species. The role of the wings has been particularly difficult to describe, with uncertainty surrounding whether they are positioned to reduce convective heat loss or to maximise heat gained through radiation. Characterisation of the extent to which these processes affect core-body temperature will provide insights into the way in which a species' thermal sensitivity and morphological traits have evolved. We conducted field and laboratory measurements to assess how basking posture affects the core-body temperature of an Australian butterfly, the common brown (Heteronympha merope). We show that, with wings held open, heat lost through convection is reduced while heat gained through radiation is simultaneously maximised. These responses have been incorporated into a biophysical model that accurately predicts the core-body temperature of basking specimens in the field, providing a powerful tool to explore how climate constrains the distribution and abundance of basking butterflies.

  6. The Sensitivity of Orographic Precipitation to Flow Direction

    NASA Astrophysics Data System (ADS)

    Mass, C.; Picard, L.

    2015-12-01

    An area of substantial interest is the sensitivity of orographic precipitation to the characteristics of the incoming flow and to the surrounding environment. Some studies have suggested substantial sensitivity of precipitation within individual river drainages for relatively small directional or stability variations of incoming flow. A characterization of such flow sensitivity would be of great value for hydrometeorological prediction, the determination of Probable Maximum Precipitation statistics, and for quantifying the uncertainty in precipitation and hydrological forecasts. To gain insight into this problem, an idealized version of the Weather Research and Forecasting (WRF) modeling system was created in which simulations are driven by a single vertical sounding, with the assumption of thermal wind balance. The actual terrain is used and the full physics complement of the modeling system. The presentation will show how precipitation over the Olympic Mountains of Washington State varies as flow direction changes. This analysis will include both the aggregate precipitation over the barrier and the precipitation within individual drainages or areas. The role of surrounding terrain and the nearby coastline are also examined by removing these features from simulations. Finally, the impact of varying flow stability and speed on the precipitation over this orographic feature will be described.

  7. Statistical evaluation of the influence of the uncertainty budget on B-spline curve approximation

    NASA Astrophysics Data System (ADS)

    Zhao, Xin; Alkhatib, Hamza; Kargoll, Boris; Neumann, Ingo

    2017-12-01

    In the field of engineering geodesy, terrestrial laser scanning (TLS) has become a popular method for detecting deformations. This paper analyzes the influence of the uncertainty budget on free-form curves modeled by B-splines. Usually, free-form estimation is based on scanning points assumed to have equal accuracies, which is not realistic. Previous findings demonstrate that the residuals still contain random and systematic uncertainties caused by instrumental, object-related and atmospheric influences. In order to guarantee the quality of derived estimates, it is essential to be aware of all uncertainties and their impact on the estimation. In this paper, a more detailed uncertainty budget is considered, in the context of the "Guide to the Expression of Uncertainty in Measurement" (GUM), which leads to a refined, heteroskedastic variance-covariance matrix (VCM) of TLS measurements. Furthermore, the control points of B-spline curves approximating a measured bridge are estimated. Comparisons are made between the B-spline curves estimated using, on the one hand, a homoskedastic VCM and, on the other hand, the refined VCM. To assess the statistical significance of the differences displayed by the estimates for the two stochastic models, a nested model misspecification test and a non-nested model selection test are described and applied. The test decisions indicate that the homoskedastic VCM should be replaced by a heteroskedastic VCM in the direction of the suggested VCM. However, the tests also indicate that the considered VCM is still inadequate in light of the given data set and should therefore be improved.
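
    The contrast between a homoskedastic and a heteroskedastic stochastic model can be illustrated with a weighted least-squares B-spline fit. The sketch below uses SciPy's `make_lsq_spline` on synthetic data whose noise level grows with distance, standing in for a refined (diagonal) VCM; the data, noise model, and knot choices are invented for illustration only.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
sigma = 0.05 + 0.20 * (x / 10.0)          # assumed heteroskedastic noise model
y = np.sin(x) + rng.normal(0.0, sigma)

k = 3                                      # cubic B-spline
t_interior = np.linspace(1.0, 9.0, 8)      # interior knots (choice is ad hoc)
t = np.r_[[x[0]] * (k + 1), t_interior, [x[-1]] * (k + 1)]

# Homoskedastic model: all points weighted equally.
spl_homo = make_lsq_spline(x, y, t, k)

# Heteroskedastic model: weight each point by the reciprocal of its standard
# deviation (scipy squares the weights internally, giving inverse-variance
# weighting), mimicking a refined diagonal VCM of the measurements.
spl_het = make_lsq_spline(x, y, t, k, w=1.0 / sigma)

xx = np.linspace(0.0, 10.0, 50)
print(np.max(np.abs(spl_homo(xx) - spl_het(xx))))  # the estimates differ
```

    The paper's refined VCM is fully populated rather than diagonal, and the significance of the difference would be judged with the model tests described above, not by eye.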

  8. The health risk levels of different age groups of residents living in the vicinity of municipal solid waste incinerator posed by PCDD/Fs in atmosphere and soil.

    PubMed

    Li, Jiafu; Zhang, Ying; Sun, Tingting; Hao, Huawei; Wu, Hao; Wang, Lili; Chen, Yuxing; Xing, Limin; Niu, Zhiguang

    2018-08-01

    In our study, the health risk levels of different age groups of residents living in the vicinity of a municipal solid waste incinerator (MSWI) posed by polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) in atmosphere and soil were evaluated. The toxic equivalent concentrations of PCDD/Fs (TEQ) in the surrounding atmosphere and soil of the studied MSWI were 0.05-0.12 pg I-TEQ Nm^-3 and 7.622-15.450 ng I-TEQ kg^-1, respectively. The PCDFs/PCDDs (F/D) values of PCDD/Fs in the surrounding atmosphere of the studied MSWI ranged from 0.40 to 5.90 with a mean of 1.80, suggesting that the PCDD/Fs mainly came from combustion sources and that the studied MSWI could be a key source of PCDD/Fs in the surrounding atmosphere. The F/D ratios of PCDD/Fs in the surrounding soil ranged from 0.18 to 1.81 with a mean of 0.90, suggesting that combustion is not the main source of PCDD/Fs in the surrounding soil and that the studied MSWI may have limited influence on PCDD/Fs in the surrounding soil. O8CDD and 2,3,4,7,8-P5CDF could serve as indicators of total PCDD/Fs and of TEQ, respectively, in the surrounding atmosphere of the studied MSWI. The carcinogenic risk (CR) values of PCDD/Fs in the surrounding atmosphere and soil for children, teens and adults were 1.24E-06, 9.06E-07 and 4.41E-06, respectively, suggesting that a potential cancer risk occurred but that the risk was at acceptable levels for both children and adults (<1.00E-05), while the cancer risk for teens was negligible (<1.00E-06). The non-carcinogenic risk (non-CR) values of the three age groups were lower than 1, indicating that no obvious non-carcinogenic effects occurred. Inhalation of air was the largest contributor to health risk (both CR and non-CR) for all three age groups. In addition, a comparison of the health risk between PCDD/Fs and other emerging contaminants and traditional pollutants in soil and atmosphere was performed, which gives a clearer view of the health risk levels of PCDD/Fs in the surrounding environment of the MSWI. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models

    NASA Astrophysics Data System (ADS)

    Chang, Joseph C.

    This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of the dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (˜20 km in scale), in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data. This is because, despite the fact that the test site was a flat area, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. It was found that the two components, variability and uncertainty, were comparable for the DP26 field data, with variability more important than uncertainty closer to the source, and less important farther away from the source. Therefore, reducing data errors for input meteorology may not necessarily increase model accuracy due to random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well-measured. Another typical application of dispersion modeling is a forensic study where the data are usually quite scarce. An example would be the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization since the start of the 1981 Iran-Iraq War.
Therefore, the meteorological fields inside Iraq had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside of Iraq and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. It was found that, when comparing model predictions to observations in areas outside of Iraq, the predicted surface wind directions had errors of between 30 and 90 deg, but the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)

  10. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  11. Uncertainty Quantification of Turbulence Model Closure Coefficients for Transonic Wall-Bounded Flows

    NASA Technical Reports Server (NTRS)

    Schaefer, John; West, Thomas; Hosder, Serhat; Rumsey, Christopher; Carlson, Jan-Renee; Kleb, William

    2015-01-01

    The goal of this work was to quantify the uncertainty and sensitivity of commonly used turbulence models in Reynolds-Averaged Navier-Stokes codes due to uncertainty in the values of closure coefficients for transonic, wall-bounded flows and to rank the contribution of each coefficient to the uncertainty in various output flow quantities of interest. Specifically, uncertainty quantification of turbulence model closure coefficients was performed for transonic flow over an axisymmetric bump at zero degrees angle of attack and the RAE 2822 transonic airfoil at a lift coefficient of 0.744. Three turbulence models were considered: the Spalart-Allmaras Model, the Wilcox (2006) k-ω Model, and the Menter Shear-Stress Transport Model. The FUN3D code developed by NASA Langley Research Center was used as the flow solver. The uncertainty quantification analysis employed stochastic expansions based on non-intrusive polynomial chaos as an efficient means of uncertainty propagation. Several integrated and point quantities were considered as uncertain outputs for both CFD problems. All closure coefficients were treated as epistemic uncertain variables represented with intervals. Sobol indices were used to rank the relative contributions of each closure coefficient to the total uncertainty in the output quantities of interest. This study identified a number of closure coefficients for each turbulence model for which more information will significantly reduce the amount of uncertainty in the output for transonic, wall-bounded flows.
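
    First-order Sobol indices of the kind used for the ranking can be estimated with a Monte Carlo pick-and-freeze scheme. The sketch below implements a Saltelli-type estimator on a toy two-parameter function (not a turbulence model); the function, sample size, and uniform input ranges are illustrative assumptions.

```python
import numpy as np

def first_order_sobol(model, n_dim, n_samples=20000, seed=0):
    """Monte Carlo first-order Sobol indices (pick-and-freeze scheme).

    Uses the estimator S_i = mean(f(B) * (f(AB_i) - f(A))) / Var(f(A)),
    with inputs sampled uniformly on [0, 1]^n_dim.  AB_i is the A sample
    with column i replaced by the corresponding column of B.
    """
    rng = np.random.default_rng(seed)
    a = rng.random((n_samples, n_dim))
    b = rng.random((n_samples, n_dim))
    f_a = model(a)
    f_b = model(b)
    var = np.var(f_a)
    indices = []
    for i in range(n_dim):
        ab = a.copy()
        ab[:, i] = b[:, i]          # "freeze" all inputs except input i
        f_ab = model(ab)
        indices.append(np.mean(f_b * (f_ab - f_a)) / var)
    return np.array(indices)

# Toy "output quantity of interest": an additive function of two closure
# coefficients; the analytic indices are S1 = 0.2 and S2 = 0.8.
toy = lambda x: x[:, 0] + 2.0 * x[:, 1]
s_hat = first_order_sobol(toy, 2)
print(s_hat)
```

    For an expensive CFD output, the model evaluations would be replaced by a polynomial chaos surrogate, as in the study, rather than direct sampling.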

  12. Ion Composition of Comet 19P/Borrelly as Measured by the PEPE Ion Mass Spectrometer on DS1

    NASA Astrophysics Data System (ADS)

    Nordholt, J. E.; Reisenfeld, D. B.; Wiens, R. C.; Gary, P.

    2002-12-01

    Cometary compositions are of great interest because they hold important clues to the formation of the outer solar system, and to the sources of volatiles in the solar system, including the terrestrial planets. In order to understand the primordial compositions of cometary nuclei, it is important to also understand their evolution, as many of the comets most accessible to spacecraft are highly evolved. It is also important to understand the ion and neutral chemistry that occurs in the coma surrounding the nucleus if the coma ion composition is to be used to determine the original composition of the nucleus. Deep Space One (DS1) was only the second spacecraft, after Giotto, to use an ion mass-resolving instrument to explore cometary coma compositions in-situ, which it did during the flyby of Comet Borrelly on September 22, 2001. Borrelly is significantly more evolved than Halley. In addition, the encounter occurred at a significantly greater distance from the sun (1.36 AU vs 0.9 AU for Giotto at Halley). The Plasma Experiment for Planetary Exploration (PEPE) on board DS1 was capable of resolving electron and ion energy, angle of incidence, and ion mass composition. The PEPE ion data from the seven minutes surrounding closest approach (2171 km) have been extensively analyzed. The instrument response was modeled using SIMION and TRIM codes for all of the major species through 20 AMU plus CO (at its operating voltage PEPE was very insensitive to heavier molecules). Chi-squared minimization analysis is being carried out to determine the best fit and the uncertainties. Preliminary results for the predominant heavy ions are OH+ at (72 +/- 9)% of the total water-group ion density, H2O+ at (25 +/- 7)%, CH3+ at (5 +/- 3)%, and O+ at (4 +/- 5)%. Uncertainties are quoted at the 90% confidence level. Comparison with reported Halley compositions from Giotto shows that Borrelly clearly has a lower H3O+ abundance (< 9%), consistent with a more evolved comet. 
The presence of relatively high amounts of CH3+, proposed in the context of Halley to be produced by protonation of CH2+, is therefore somewhat surprising. Because the H3O+/H2O+ ratio is an indicator of the degree of protonation in the coma, a low H3O+/H2O+ ratio would predict a low CH3+/CH2+ ratio as well. However, this is not the case at Borrelly. The CH3+/H3O+ ratio will need further study in future comet models and observations.
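
    The chi-squared minimization used to extract species abundances can be illustrated, in simplified form, as a non-negative least-squares unmixing problem. The response matrix, mass channels, and fractions below are invented stand-ins, not PEPE calibration data; the real analysis models the full instrument response with SIMION and TRIM.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical instrument-response matrix: column j is the normalized
# signature of species j (OH+, H2O+, CH3+, O+) across five mass channels.
response = np.array([
    [0.80, 0.10, 0.00, 0.05],
    [0.10, 0.80, 0.05, 0.05],
    [0.05, 0.05, 0.90, 0.00],
    [0.05, 0.05, 0.05, 0.90],
    [0.00, 0.00, 0.00, 0.05],
])
true_frac = np.array([0.72, 0.25, 0.05, 0.04])   # invented abundances
observed = response @ true_frac                   # noiseless synthetic spectrum

# Non-negative least squares: a simplified stand-in for the chi-squared
# minimization over species abundances described in the abstract.
frac, resid = nnls(response, observed)
print(frac / frac.sum())
```

    With noiseless synthetic data the fit recovers the input fractions exactly; the quoted uncertainties in the abstract come from mapping out the chi-squared surface, which this sketch omits.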

  13. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessment of the impact of climate change on crop production while considering uncertainties is essential for properly identifying, and making decisions about, sustainable agricultural practices. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios, representing the climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model, representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed the increasing uncertainty in the yield simulation in the future periods.
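
    The variance decomposition described above can be sketched as a two-factor ANOVA over a yield ensemble indexed by climate projection and crop-parameter set. All effect sizes below are invented for illustration and are not the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)
n_gcm, n_par = 24, 100
# Hypothetical ensemble: climate projections shift yields strongly, crop
# parameter sets only weakly (illustrative numbers, not the study's data).
gcm_effect = rng.normal(0.0, 8.0, size=(n_gcm, 1))
par_effect = rng.normal(0.0, 2.0, size=(1, n_par))
yields = 100.0 + gcm_effect + par_effect          # (n_gcm, n_par) ensemble

# Two-factor ANOVA sums of squares for the main effects.
grand = yields.mean()
ss_gcm = n_par * np.sum((yields.mean(axis=1) - grand) ** 2)
ss_par = n_gcm * np.sum((yields.mean(axis=0) - grand) ** 2)
ss_tot = np.sum((yields - grand) ** 2)

frac_gcm = ss_gcm / ss_tot    # share of variance from climate projections
frac_par = ss_par / ss_tot    # share of variance from crop parameters
print(frac_gcm, frac_par)
```

    With these invented effect sizes the climate share dominates, mirroring the qualitative finding; an interaction term would absorb the remainder of the variance.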

  14. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on some specific data feature or analysis tool. Few have addressed the uncertainties across the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors which impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In the concluding part, the paper indicates that machine learning methods should be adapted to accommodate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and improves the scientific rigour of subsequent planning and decision-making.
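
    The sensitivity of the final classification to the choice of intervals can be demonstrated by comparing equal-interval and quantile breaks on the same skewed data. The suitability scores below are synthetic, and this sketch deliberately uses two simple schemes rather than a full Jenks/Natural Breaks implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical skewed suitability scores on a 0-100 scale.
scores = 100.0 * rng.beta(2.0, 5.0, size=1000)
n_classes = 5

# Equal-interval breaks: fixed-width bins spanning the data range.
equal_breaks = np.linspace(scores.min(), scores.max(), n_classes + 1)[1:-1]

# Quantile breaks: bins holding equal numbers of observations.
quantile_breaks = np.quantile(scores, np.linspace(0, 1, n_classes + 1)[1:-1])

cls_equal = np.digitize(scores, equal_breaks)
cls_quant = np.digitize(scores, quantile_breaks)

# On skewed data the two schemes disagree for a sizeable share of cells,
# which is exactly the classification uncertainty the paper discusses.
print(np.mean(cls_equal != cls_quant))
```

    Natural Breaks (Jenks) would instead place breaks to minimize within-class variance, giving yet another partition of the same scores.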

  15. Incorporating uncertainty and motion in Intensity Modulated Radiation Therapy treatment planning

    NASA Astrophysics Data System (ADS)

    Martin, Benjamin Charles

    In radiation therapy, one seeks to destroy a tumor while minimizing the damage to surrounding healthy tissue. Intensity Modulated Radiation Therapy (IMRT) uses overlapping beams of x-rays that add up to a high dose within the target and a lower dose in the surrounding healthy tissue. IMRT relies on optimization techniques to create high-quality treatments. Unfortunately, the possible conformality is limited by the need to ensure coverage even if there is organ movement or deformation. Currently, margins are added around the tumor to ensure coverage based on an assumed motion range. This approach does not ensure high-quality treatments. In the standard IMRT optimization problem, an objective function measures the deviation of the dose from the clinical goals. The optimization then finds the beamlet intensities that minimize the objective function. When modeling uncertainty, the dose delivered from a given set of beamlet intensities is a random variable. Thus the objective function is also a random variable. In our stochastic formulation we minimize the expected value of this objective function. We developed a problem formulation that is both flexible and fast enough for use on real clinical cases. While working on accelerating the stochastic optimization, we developed a technique of voxel sampling. Voxel sampling is a randomized-algorithms approach to a steepest-descent problem, based on estimating the gradient by calculating the dose to only a fraction of the voxels within the patient. When combined with an automatic sampling-rate adaptation technique, voxel sampling produced an order-of-magnitude speed-up in IMRT optimization. We also develop extensions of our results to Intensity Modulated Proton Therapy (IMPT). Due to the physics of proton beams, the stochastic formulation yields visibly different and better plans than normal optimization.
The results of our research have been incorporated into a software package OPT4D, which is an IMRT and IMPT optimization tool that we developed.
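
    The voxel-sampling idea, estimating the descent direction from a random subset of voxels, can be sketched on a toy least-squares dose objective. The dose-influence matrix, step size, and sampling fraction below are invented for illustration and do not reflect OPT4D or any clinical formulation.

```python
import numpy as np

rng = np.random.default_rng(5)
n_voxels, n_beamlets = 2000, 50
# Hypothetical dose-influence matrix: dose = D @ x for beamlet weights x.
D = rng.random((n_voxels, n_beamlets)) * 0.1
target = D @ rng.random(n_beamlets)       # synthetic achievable prescription

def grad_full(x):
    """Exact gradient of the mean squared dose deviation."""
    return 2.0 * D.T @ (D @ x - target) / n_voxels

def grad_sampled(x, frac=0.1):
    """Gradient estimated from a random fraction of the voxels only."""
    idx = rng.choice(n_voxels, size=int(frac * n_voxels), replace=False)
    Ds = D[idx]
    return 2.0 * Ds.T @ (Ds @ x - target[idx]) / len(idx)

x = np.zeros(n_beamlets)
step = 0.5
for _ in range(300):
    # Projected descent step using the cheap sampled gradient;
    # beamlet intensities must stay non-negative.
    x = np.maximum(x - step * grad_sampled(x), 0.0)

obj = np.mean((D @ x - target) ** 2)
print(obj)
```

    Each sampled step costs roughly a tenth of a full gradient evaluation; the thesis pairs this with automatic adaptation of the sampling rate, which is omitted here.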

  16. Expected geoneutrino signal at JUNO

    NASA Astrophysics Data System (ADS)

    Strati, Virginia; Baldoncini, Marica; Callegari, Ivan; Mantovani, Fabio; McDonough, William F.; Ricci, Barbara; Xhixha, Gerti

    2015-12-01

    Constraints on the Earth's composition and on its radiogenic energy budget come from the detection of geoneutrinos. The Kamioka Liquid scintillator Antineutrino Detector (KamLAND) and Borexino experiments recently reported the geoneutrino flux, which reflects the amount and distribution of U and Th inside the Earth. The Jiangmen Underground Neutrino Observatory (JUNO) neutrino experiment, designed as a 20 kton liquid scintillator detector, will be built in an underground laboratory in South China about 53 km from the Yangjiang and Taishan nuclear power plants, each one having a planned thermal power of approximately 18 GW. Given the large detector mass and the intense reactor antineutrino flux, JUNO aims not only to collect high-statistics antineutrino signals from reactors but also to address the challenge of discriminating the geoneutrino signal from the reactor background. The predicted geoneutrino signal at JUNO, expressed in terrestrial neutrino units (TNU), is based on the existing reference Earth model, with the dominant source of uncertainty coming from the modeling of the compositional variability in the local upper crust that surrounds the detector (out to approximately 500 km). A special focus is dedicated to the 6° × 4° local crust surrounding the detector, which is estimated to contribute 44% of the signal. On the basis of a worldwide reference model for reactor antineutrinos, the ratio between reactor antineutrino and geoneutrino signals in the geoneutrino energy window is estimated to be 0.7 considering reactors operating in year 2013, and reaches a value of 8.9 when the contribution of the future nuclear power plants is added.
In order to extract useful information about the mantle's composition, a refinement of the abundance and distribution of U and Th in the local crust is required, with particular attention to the geochemical characterization of the accessible upper crust, where 47% of the expected geoneutrino signal originates and which constitutes the major source of uncertainty.

  17. VizieR Online Data Catalog: Circumgalactic medium surrounding z~2 quasars (Prochaska+, 2014)

    NASA Astrophysics Data System (ADS)

    Prochaska, J. X.; Lau, M. W.; Hennawi, J. F.

    2017-08-01

    The sample of quasar pairs analyzed here is a subset of the sample studied in QPQ6 (Cantalupo et al. 2014Natur.506...63C) for H I Lyα absorption. Specifically, we have restricted the current study to those pairs where the signal-to-noise ratio (S/N) at H I Lyα exceeds 9.5 per rest-frame Å. This facilitates a more precise evaluation of H I Lyα and generally ensures sufficient S/N redward of Lyα for the metal-line analysis. Quasar emission redshifts are taken directly from QPQ6 (Cantalupo et al. 2014Natur.506...63C), following the methodology described in that manuscript. Briefly, we adopt a custom line-centering algorithm to centroid one or more far-UV emission lines and adopt the analysis of Shen et al. (2007, J/AJ/133/2222) to combine these measurements and assess systematic uncertainty in the final value. The median emission redshift of the 427 pairs is z_em,median = 2.35 and the median uncertainty in the redshift measurements is ~520 km/s. The impact parameters range from R{perp} = 39 kpc to 1 Mpc, with 52 pairs having R{perp} < 200 kpc. (3 data files).

  18. Provider judgments of patients in pain: seeking symptom certainty.

    PubMed

    Tait, Raymond C; Chibnall, John T; Kalauokalani, Donna

    2009-01-01

    Uncertainty often surrounds judgments of pain, especially when pain is chronic. In order to simplify their decisions, providers adduce information from a variety of sources. Unfortunately, an extensive literature suggests that the information that is brought to bear actually can bias pain judgments, resulting in judgments that consistently differ from patient reports, with a potential negative impact on treatment. This review examines the pain assessment literature from a social cognition perspective that emphasizes interpersonal and situational factors that can influence judgments. Consistent with that model, it organizes research findings into three broad domains that have been shown to systematically influence assessments of pain, involving patient, provider, and situational factors. A causal model for pain judgment is proposed, and its implications for clinical research and practice are explored. In order to minimize the uncertainty that can characterize symptoms such as chronic pain, practitioners bring information to bear on pain assessment that can lead to misjudgments. While intuitively appealing, much of the information that is considered often has little association with pain severity and/or adjustment. A more rational decision-making process can reduce the judgment errors common to pain assessment and treatment.

  19. A QUANTITATIVE TEST OF THE NO-HAIR THEOREM WITH Sgr A* USING STARS, PULSARS, AND THE EVENT HORIZON TELESCOPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Psaltis, Dimitrios; Wex, Norbert; Kramer, Michael

    The black hole in the center of the Milky Way, Sgr A*, has the largest mass-to-distance ratio among all known black holes in the universe. This property makes Sgr A* the optimal target for testing the gravitational no-hair theorem. In the near future, major developments in instrumentation will provide the tools for high-precision studies of its spacetime via observations of relativistic effects in stellar orbits, in the timing of pulsars, and in horizon-scale images of its accretion flow. We explore here the prospect of measuring the properties of the black hole spacetime using all of these three types of observations. We show that the correlated uncertainties in the measurements of the black hole spin and quadrupole moment using the orbits of stars and pulsars are nearly orthogonal to those obtained from measuring the shape and size of the shadow the black hole casts on the surrounding emission. Combining these three types of observations will therefore allow us to assess and quantify systematic biases and uncertainties in each measurement and lead to a highly accurate, quantitative test of the gravitational no-hair theorem.

  20. Vitamin D: Moving Forward to Address Emerging Science

    PubMed Central

    Sempos, Christopher T.; Davis, Cindy D.; Brannon, Patsy M.

    2017-01-01

    The science surrounding vitamin D presents both challenges and opportunities. Although many uncertainties surround the understanding of vitamin D, including its physiological function, the effects of excessive intake, and its role in health, it remains a topic of major interest in the research and health communities. The approach to evaluating and interpreting the available evidence about vitamin D should be founded on the quality of the data and on conclusions that take into account the totality of the evidence. In addition, these activities can be used to identify critical data gaps and to help structure future research. The Office of Dietary Supplements (ODS) at the National Institutes of Health has as part of its mission the goal of supporting research and dialogues for topics with uncertain data, including vitamin D. This review considers vitamin D in the context of systematically addressing the uncertainty and of identifying research needs through the filter of the work of ODS. The focus includes the role of systematic reviews, activities that encompass considerations of the totality of the evidence, and collaborative activities to clarify unknowns or to fix methodological problems, as well as a case study using the relationship between cancer and vitamin D. PMID:29194368

  1. Priorities for autism spectrum disorder risk communication and ethics.

    PubMed

    Yudell, Michael; Tabor, Holly K; Dawson, Geraldine; Rossi, John; Newschaffer, Craig

    2013-11-01

    Autism spectrum disorders are an issue of increasing public health significance. The incidence of autism spectrum disorders has been increasing in recent years, and they are associated with significant personal and financial impacts for affected persons and their families. In recent years, a large number of scientific studies have been undertaken, which investigate genetic and environmental risk factors for autism, with more studies underway. At present, much remains unknown regarding autism spectrum disorder risk factors, but the emerging picture of causation is in many cases complex, with multiple genes and gene-environment interactions being at play. The complexity and uncertainty surrounding autism spectrum disorder risk factors raise a number of questions regarding the ethical considerations that should be taken into account when undertaking autism spectrum disorder risk communication. At present, however, little has been written regarding autism spectrum disorder risk communication and ethics. This article summarizes the findings of a recent conference investigating ethical considerations and policy recommendations in autism spectrum disorder risk communication, which to the authors' knowledge is the first of its kind. Here, the authors discuss a number of issues, including uncertainty; comprehension; inadvertent harm; justice; and the appropriate roles of clinicians, scientists, and the media in autism spectrum disorder risk communication.

  2. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged the scientific community to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th deg. spatial resolution from two different downscaling procedures are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated under the representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of the GCMs against observations suggests that BMA provides a more accurate representation than any individual model. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate a significant increase in annual precipitation and temperature, with varying degrees of change across different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the dominant source of uncertainty among those considered. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer, and appears to be higher than scenario uncertainty for precipitation.
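
    A stripped-down illustration of model weighting in the spirit of BMA: ensemble members are weighted by an invented historical skill score and combined into a probabilistic projection. Real BMA derives weights as posterior model probabilities (typically via EM on the predictive distribution); this sketch and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical historical skill: RMSE of each downscaled GCM over the
# reference period (values invented for illustration).
rmse = np.array([1.2, 0.9, 1.5, 1.1, 0.8, 1.4, 1.0, 1.3, 0.95, 1.25])

# A simple likelihood-style weighting: w_k proportional to exp(-rmse_k^2 / 2s^2),
# so better-performing members receive larger weights.
s = rmse.mean()
w = np.exp(-0.5 * (rmse / s) ** 2)
w /= w.sum()

# Member projections of, say, annual precipitation change (%) -- invented.
proj = np.array([3.1, 5.2, 1.8, 4.0, 5.9, 2.2, 4.5, 2.9, 5.0, 3.6])

bma_mean = np.sum(w * proj)                       # ensemble central estimate
bma_var = np.sum(w * (proj - bma_mean) ** 2)      # between-member spread
print(bma_mean, np.sqrt(bma_var))
```

    A full BMA predictive distribution would also include each member's within-model variance, not only the between-member spread shown here.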

  3. Analysis of the uncertainty in the monetary valuation of ecosystem services--A case study at the river basin scale.

    PubMed

    Boithias, Laurie; Terrado, Marta; Corominas, Lluís; Ziv, Guy; Kumar, Vikas; Marqués, Montse; Schuhmacher, Marta; Acuña, Vicenç

    2016-02-01

    Ecosystem services provide multiple benefits to human wellbeing and are increasingly considered by policy-makers in environmental management. However, the uncertainty related with the monetary valuation of these benefits is not yet adequately defined or integrated by policy-makers. Given this background, our aim was to quantify different sources of uncertainty when performing monetary valuation of ecosystem services, in order to provide a series of guidelines to reduce them. With an example of 4 ecosystem services (i.e., water provisioning, waste treatment, erosion protection, and habitat for species) provided at the river basin scale, we quantified the uncertainty associated with the following sources: (1) the number of services considered, (2) the number of benefits considered for each service, (3) the valuation metrics (i.e. valuation methods) used to value benefits, and (4) the uncertainty of the parameters included in the valuation metrics. Results indicate that the highest uncertainty was caused by the number of services considered, as well as by the number of benefits considered for each service, whereas the parametric uncertainty was similar to the one related to the selection of valuation metric, thus suggesting that the parametric uncertainty, which is the only uncertainty type commonly considered, was less critical than the structural uncertainty, which is in turn mainly dependent on the decision-making context. Given the uncertainty associated to the valuation structure, special attention should be given to the selection of services, benefits and metrics according to a given context. Copyright © 2015 Elsevier B.V. All rights reserved.
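
    The parametric uncertainty examined above (source 4) can be propagated by Monte Carlo sampling of the parameters inside the valuation metrics. The two services, unit values, and distributions below are invented for illustration and are not the study's figures.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000
# Hypothetical biophysical quantities and unit values for two services,
# each carrying parametric uncertainty (all numbers invented).
water_m3 = rng.normal(5.0e6, 0.5e6, n)        # provisioned water (m^3/yr)
water_price = rng.normal(0.30, 0.05, n)       # EUR per m^3
waste_kgN = rng.normal(2.0e5, 0.3e5, n)       # nitrogen removed (kg/yr)
waste_price = rng.normal(4.0, 1.0, n)         # EUR per kg N

# Aggregate monetary value across the two services, per Monte Carlo draw.
total = water_m3 * water_price + waste_kgN * waste_price

# Parametric uncertainty of the aggregate value as a 90% interval.
low, high = np.percentile(total, [5, 95])
print(low, high)
```

    Structural uncertainty, which service or benefit to include and which metric to use, would be explored by rerunning this calculation under alternative model structures rather than by widening these intervals.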

  4. The critical role of uncertainty in projections of hydrological extremes

    NASA Astrophysics Data System (ADS)

    Meresa, Hadush K.; Romanowicz, Renata J.

    2017-08-01

    This paper aims to quantify the uncertainty in projections of future hydrological extremes in the Biala Tarnowska River at the Koszyce gauging station, southern Poland. The approach followed is based on several climate projections obtained from the EURO-CORDEX initiative, raw and bias-corrected realizations of catchment precipitation, and flow simulations derived using multiple hydrological model parameter sets. The projections cover the 21st century. Three sources of uncertainty are considered: the first related to the spread of the climate projection ensemble, the second to the uncertainty in hydrological model parameters, and the third to the error in fitting theoretical distribution models to annual extreme flow series. The uncertainty of projected extreme indices related to hydrological model parameters was conditioned on flow observations from the reference period using the generalized likelihood uncertainty estimation (GLUE) approach, with separate criteria for high- and low-flow extremes. Extreme (low and high) flow quantiles were estimated using the generalized extreme value (GEV) distribution at different return periods and were based on two different lengths of the flow time series. A sensitivity analysis based on the analysis of variance (ANOVA) shows that, for the low-flow extremes, the uncertainty introduced by the hydrological model parameters can be larger than the climate model variability and the distribution fit uncertainty, whilst for the high-flow extremes the climate models contribute more uncertainty than the hydrological parameters and the distribution fit. This implies that ignoring any one of the three uncertainty sources may pose a serious risk to adaptation planning for future hydrological extremes and to water resource planning and management.
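
    The return-level step can be sketched with the Gumbel distribution, the zero-shape special case of the GEV fitted in the paper. This is a simplified stand-in (method-of-moments fit on synthetic annual maxima, no GLUE conditioning), not the study's estimation procedure.

    ```python
    import numpy as np

    def gumbel_return_levels(annual_max, return_periods):
        """Method-of-moments Gumbel fit and return levels Q(T)."""
        scale = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
        loc = annual_max.mean() - 0.5772 * scale   # Euler-Mascheroni correction
        p = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
        return loc - scale * np.log(-np.log(p))    # Gumbel quantile function

    rng = np.random.default_rng(0)
    annual_max = rng.gumbel(loc=100.0, scale=20.0, size=200)  # synthetic flows
    levels = gumbel_return_levels(annual_max, [10, 50, 100])
    ```

    Return levels grow with the return period; repeating the fit over resampled or GLUE-weighted parameter sets would produce the uncertainty bounds discussed in the abstract.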

  5. Funding the unfundable: mechanisms for managing uncertainty in decisions on the introduction of new and innovative technologies into healthcare systems.

    PubMed

    Stafinski, Tania; McCabe, Christopher J; Menon, Devidas

    2010-01-01

    As tensions between payers, responsible for ensuring prudent and principled use of scarce resources, and both providers and patients, who legitimately want access to technologies from which they could benefit, continue to mount, interest in approaches to managing the uncertainty surrounding the introduction of new health technologies has heightened. The purpose of this project was to compile an inventory of various types of 'access with evidence development' (AED) schemes, examining characteristics of the technologies to which they have been applied, the uncertainty they sought to address, the terms of arrangements of each scheme, and the policy outcomes. It also aimed to identify issues related to such schemes, including advantages and disadvantages from the perspectives of various stakeholder groups. A comprehensive search, review and appraisal of peer-reviewed and 'grey' literature were performed, followed by a facilitated workshop of academics and decision makers with expertise in AED schemes. Information was extracted and compiled in tabular form to identify patterns or trends. To enhance the validity of interpretations made, member checking was performed. Although the concept of AED is not new, evaluative data are sparse. Despite varying opinions on the 'right' answers to some of the questions raised, there appears to be consensus on a 'way forward'--development of methodological guidelines. All stakeholders seemed to share the view that AEDs offer the potential to facilitate patient access to promising new technologies and encourage innovation while ensuring effective use of scarce healthcare resources. There is no agreement on what constitutes 'sufficient evidence', and it depends on the specific uncertainty in question. There is agreement on the need for 'best practice' guidelines around the implementation and evaluation of AED schemes. 
This is the first attempt at a comprehensive analysis of methods that have been used to address uncertainty concerning a new drug or other technology. The analysis reveals that, although various approaches have been experimented with, many of them have not achieved the ostensible goal of the approach. This article outlines challenges related to AED schemes and issues that remain unresolved.

  6. Advanced Stochastic Collocation Methods for Polynomial Chaos in RAVEN

    NASA Astrophysics Data System (ADS)

    Talbot, Paul W.

    As experiment complexity in fields such as nuclear engineering continually increases, so does the demand for robust computational methods to simulate them. In many simulations, input design parameters and intrinsic experiment properties are sources of uncertainty. Often small perturbations in uncertain parameters have a significant impact on the experiment outcome. For instance, in nuclear fuel performance, small changes in fuel thermal conductivity can greatly affect the maximum stress on the surrounding cladding. The difficulty of quantifying the impact of input uncertainty in such systems has grown with the complexity of numerical models. Traditionally, uncertainty quantification has been approached using random sampling methods like Monte Carlo. For some models, the input parameter space and corresponding response output space are sufficiently explored with a few low-cost calculations. For other models, it is computationally costly to obtain a good understanding of the output space. To combat the expense of random sampling, this research explores advanced methods in Stochastic Collocation for generalized Polynomial Chaos (SCgPC) as an alternative to traditional uncertainty quantification techniques such as Monte Carlo (MC) and Latin Hypercube Sampling (LHS) for applications in nuclear engineering. We consider traditional SCgPC construction strategies as well as truncated polynomial spaces using Total Degree and Hyperbolic Cross constructions. We also consider applying anisotropy (unequal treatment of different dimensions) to the polynomial space, and offer methods whereby optimal levels of anisotropy can be approximated. We contribute developments to existing adaptive polynomial construction strategies. Finally, we consider High-Dimensional Model Reduction (HDMR) expansions, using SCgPC representations for the subspace terms, and contribute new adaptive methods to construct them.
We apply these methods to a series of models of increasing complexity. We use analytic models of various levels of complexity, then demonstrate performance on two engineering-scale problems: a single-physics nuclear reactor neutronics problem and a multiphysics fuel cell problem coupling fuel performance and neutronics. Lastly, we demonstrate sensitivity analysis for a time-dependent fuel performance problem. We demonstrate the application of all the algorithms in RAVEN, a production-level uncertainty quantification framework.
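
The basic collocation mechanism can be sketched in one dimension: evaluate the model only at Gauss-Hermite quadrature nodes and recover the output moments. This illustration assumes a single standard normal input; the sparse, anisotropic, and adaptive multidimensional constructions discussed above go well beyond a few lines.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def collocation_moments(model, n_points=8):
    """Mean and variance of model(X), X ~ N(0, 1), via stochastic collocation."""
    nodes, weights = hermegauss(n_points)     # probabilists' Gauss-Hermite rule
    weights = weights / weights.sum()         # normalise against the N(0,1) density
    values = model(nodes)
    mean = np.sum(weights * values)
    var = np.sum(weights * values**2) - mean**2
    return mean, var
```

For `model(x) = x**2` this returns mean 1 and variance 2 exactly (up to round-off), using only eight model evaluations, which is the efficiency argument for collocation over random sampling.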

  7. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

    Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying different sources of uncertainty is of high importance and can help modeling agencies improve current models and scenarios. In this study, we have assessed the future changes in three climate variables (i.e. precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (i.e. BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period of 1970-2000 and the future period of 2010-2099. For the future projections, the two scenarios RCP4.5 and RCP8.5 were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of the BMA simulations over individual models. Increasing temperature and precipitation are projected at the annual timescale. However, the changes are not uniform across seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total uncertainty, especially in summer.

  8. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the framing approach it uses, following a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  9. Dynamic rating curve assessment for hydrometric stations and computation of the associated uncertainties: Quality and station management indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan

    2014-09-01

    A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements, and the constant changes in river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of gaugings, and the increase of the uncertainty of discharge estimates with the age of the rating curve, computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 km²) is used as an example throughout the paper. Other stations are used to illustrate certain points.
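
    The underlying stage-discharge fit can be sketched as a standard power-law rating curve, Q = a(h - h0)^c, estimated in log space. This is a static simplification: the cease-to-flow stage `h0` is assumed known, and the paper's dynamic re-estimation and variographic ageing of the uncertainty are not reproduced here.

    ```python
    import numpy as np

    def fit_rating_curve(stage, discharge, h0=0.0):
        """Fit Q = a * (h - h0)**c by linear regression in log space."""
        x = np.log(stage - h0)
        y = np.log(discharge)
        c, log_a = np.polyfit(x, y, 1)
        resid = y - (c * x + log_a)
        sigma = resid.std(ddof=2)   # log-space scatter ~ relative discharge uncertainty
        return np.exp(log_a), c, sigma
    ```

    `sigma` gives a first, static estimate of relative discharge uncertainty; the dynamic method of the paper would additionally grow this uncertainty with the age of the curve.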

  10. Evaluation of a Real-Time Monitoring System for River Quality--A Trade-off between Risk Attitudes, Costs, and Uncertainty.

    ERIC Educational Resources Information Center

    Varis, Olli; And Others

    1993-01-01

    Presents one approach to handling the trade-off between reducing uncertainty in environmental assessment and management and the additional expense involved. Uses the approach in the evaluation of three alternatives for a real-time river water quality forecasting system. Analysis of risk attitudes, costs and uncertainty indicated the levels of socioeconomic…

  11. Adolescents display distinctive tolerance to ambiguity and to uncertainty during risky decision making

    PubMed Central

    van den Bos, Wouter; Hertwig, Ralph

    2017-01-01

    Although actuarial data indicate that risk-taking behavior peaks in adolescence, laboratory evidence for this developmental spike remains scarce. One possible explanation for this incongruity is that in the real world adolescents often have only vague information about the potential consequences of their behavior and the likelihoods of those consequences, whereas in the lab these are often clearly stated. How do adolescents behave under such more realistic conditions of ambiguity and uncertainty? We asked 105 participants aged 8 to 22 years to make three types of choices: (1) choices between options whose possible outcomes and probabilities were fully described (choices under risk); (2) choices between options whose possible outcomes were described but whose probability information was incomplete (choices under ambiguity); and (3) choices between unknown options whose possible outcomes and probabilities could be explored (choices under uncertainty). Relative to children and adults, two adolescent-specific markers emerged. First, adolescents were more accepting of ambiguity; second, they were also more accepting of uncertainty (as indicated by shorter pre-decisional search). Furthermore, this tolerance of the unknown was associated with motivational, but not cognitive, factors. These findings offer novel insights into the psychology of adolescent risk taking. PMID:28098227

  12. Probabilistic Assessment of Above Zone Pressure Predictions at a Geologic Carbon Storage Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Namhata, Argha; Oladyshkin, Sergey; Dilmore, Robert M.

    Carbon dioxide (CO2) storage into geological formations is regarded as an important mitigation strategy for anthropogenic CO2 emissions to the atmosphere. This study first simulates the leakage of CO2 and brine from a storage reservoir through the caprock. Then, we estimate the resulting pressure changes at the zone overlying the caprock also known as Above Zone Monitoring Interval (AZMI). A data-driven approach of arbitrary Polynomial Chaos (aPC) Expansion is then used to quantify the uncertainty in the above zone pressure prediction based on the uncertainties in different geologic parameters. Finally, a global sensitivity analysis is performed with Sobol indices based on the aPC technique to determine the relative importance of different parameters on pressure prediction. The results indicate that there can be uncertainty in pressure prediction locally around the leakage zones. The degree of such uncertainty in prediction depends on the quality of site specific information available for analysis. The scientific results from this study provide substantial insight that there is a need for site-specific data for efficient predictions of risks associated with storage activities. The presented approach can provide a basis of optimized pressure based monitoring network design at carbon storage sites.

  13. Probabilistic Assessment of Above Zone Pressure Predictions at a Geologic Carbon Storage Site

    PubMed Central

    Namhata, Argha; Oladyshkin, Sergey; Dilmore, Robert M.; Zhang, Liwei; Nakles, David V.

    2016-01-01

    Carbon dioxide (CO2) storage into geological formations is regarded as an important mitigation strategy for anthropogenic CO2 emissions to the atmosphere. This study first simulates the leakage of CO2 and brine from a storage reservoir through the caprock. Then, we estimate the resulting pressure changes at the zone overlying the caprock also known as Above Zone Monitoring Interval (AZMI). A data-driven approach of arbitrary Polynomial Chaos (aPC) Expansion is then used to quantify the uncertainty in the above zone pressure prediction based on the uncertainties in different geologic parameters. Finally, a global sensitivity analysis is performed with Sobol indices based on the aPC technique to determine the relative importance of different parameters on pressure prediction. The results indicate that there can be uncertainty in pressure prediction locally around the leakage zones. The degree of such uncertainty in prediction depends on the quality of site specific information available for analysis. The scientific results from this study provide substantial insight that there is a need for site-specific data for efficient predictions of risks associated with storage activities. The presented approach can provide a basis of optimized pressure based monitoring network design at carbon storage sites. PMID:27996043

  14. Probabilistic Assessment of Above Zone Pressure Predictions at a Geologic Carbon Storage Site

    NASA Astrophysics Data System (ADS)

    Namhata, Argha; Oladyshkin, Sergey; Dilmore, Robert M.; Zhang, Liwei; Nakles, David V.

    2016-12-01

    Carbon dioxide (CO2) storage into geological formations is regarded as an important mitigation strategy for anthropogenic CO2 emissions to the atmosphere. This study first simulates the leakage of CO2 and brine from a storage reservoir through the caprock. Then, we estimate the resulting pressure changes at the zone overlying the caprock also known as Above Zone Monitoring Interval (AZMI). A data-driven approach of arbitrary Polynomial Chaos (aPC) Expansion is then used to quantify the uncertainty in the above zone pressure prediction based on the uncertainties in different geologic parameters. Finally, a global sensitivity analysis is performed with Sobol indices based on the aPC technique to determine the relative importance of different parameters on pressure prediction. The results indicate that there can be uncertainty in pressure prediction locally around the leakage zones. The degree of such uncertainty in prediction depends on the quality of site specific information available for analysis. The scientific results from this study provide substantial insight that there is a need for site-specific data for efficient predictions of risks associated with storage activities. The presented approach can provide a basis of optimized pressure based monitoring network design at carbon storage sites.
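
    The global sensitivity step can be sketched with a plain Monte Carlo "pick-freeze" estimator of first-order Sobol indices. This is illustrative only: the study computes Sobol indices directly from the aPC expansion coefficients, and the linear test function in the usage note is an arbitrary stand-in, not the reservoir model.

    ```python
    import numpy as np

    def sobol_first_order(model, dim, n=100_000, seed=0):
        """First-order Sobol indices for independent standard normal inputs."""
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((n, dim))
        B = rng.standard_normal((n, dim))
        yA = model(A)
        f0, varY = yA.mean(), yA.var()
        S = np.empty(dim)
        for i in range(dim):
            ABi = B.copy()
            ABi[:, i] = A[:, i]            # freeze input i, resample the rest
            S[i] = (np.mean(yA * model(ABi)) - f0**2) / varY
        return S
    ```

    For `model(X) = X[:, 0] + 2*X[:, 1]` the exact indices are 0.2 and 0.8, and the estimator recovers them to within Monte Carlo error; ranking parameters by these indices is what identifies the dominant geologic uncertainties.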

  15. Inclusion of Minority Patients in Breast Cancer Clinical Trials: The Role of the Clinical Trial Environment

    DTIC Science & Technology

    2008-05-01

    trials, site cultural competence, and outreach efforts. We will also examine the social and physical characteristics of the community surrounding the...clinical trial sites and those that address specific barriers associated with the social or physical environment. 2 Body This annual report...indicators. Data will be collected to characterize both the physical environment and the social environment surrounding clinical trials. The

  16. Application of Dynamic naïve Bayesian classifier to comprehensive drought assessment

    NASA Astrophysics Data System (ADS)

    Park, D. H.; Lee, J. Y.; Lee, J. H.; KIm, T. W.

    2017-12-01

    Drought monitoring has been studied extensively because of the widespread impacts and complex causes of drought. Its most important component is to estimate the characteristics and extent of drought by measuring them quantitatively. Drought assessment that considers the different aspects of complicated drought conditions and the uncertainty of drought indices is of great significance for accurate drought monitoring. This study used the dynamic Naïve Bayesian Classifier (DNBC), an extension of the Hidden Markov Model (HMM), to model and classify drought using various drought indices for integrated drought assessment. To provide a stable model for the combined use of multiple drought indices, the DNBC performs multi-index drought assessment by aggregating the effects of different types of drought and accounting for the inherent uncertainty. Drought classification was performed by the DNBC using several drought indices: the Standardized Precipitation Index (SPI), the Streamflow Drought Index (SDI), and the Normalized Vegetation Supply Water Index (NVSWI), which reflect meteorological, hydrological, and agricultural drought characteristics, respectively. Overall, results showed that, in comparison with single-index (SPI, SDI, and NVSWI) or multivariate (Composite Drought Index, CDI) drought assessment, the proposed DNBC was able to classify drought synthetically while accounting for uncertainty. The model thus provides a method for comprehensive drought assessment with the combined use of different drought indices.
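
    The probabilistic machinery behind a dynamic naive Bayes classifier can be sketched as an HMM forward filter whose emissions treat the drought indices as conditionally independent Gaussians given the hidden drought state. All parameter values in the usage note (two states, transition matrix, index means) are invented for illustration, not taken from the study.

    ```python
    import numpy as np

    def forward_filter(obs, trans, means, sds, prior):
        """Most probable hidden state at each step, given index observations.

        obs:   (T, K) array of K drought-index values per time step
        trans: (S, S) state transition matrix
        means, sds: (S, K) Gaussian emission parameters per state and index
        """
        alpha = prior.astype(float)
        states = []
        for o in obs:
            # naive-Bayes emission: product of per-index Gaussian densities
            like = np.prod(
                np.exp(-0.5 * ((o - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi)),
                axis=1,
            )
            alpha = like * (trans.T @ alpha)
            alpha /= alpha.sum()            # normalised filtering distribution
            states.append(int(alpha.argmax()))
        return np.array(states)
    ```

    With two hypothetical states (drought vs. normal) and two indices, a sequence of strongly negative index values followed by positive ones is classified as drought first and normal afterwards, with the transition matrix smoothing over noisy individual observations.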

  17. SU-F-T-316: A Model to Deal with Dosimetric and Delivery Uncertainties in Radiotherapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haering, P; Lang, C; Splinter, M

    2016-06-15

    Purpose: The conventional way of dealing with uncertainties resulting from dose calculation or beam delivery in IMRT is to perform verification measurements for the plan in question. Here we present an alternative based on recommendations given in the AAPM TG-142 report and treatment-specific parameters that model the uncertainties of the plan delivery. Methods: The basis of the model is the assignment of uncertainty parameters to all segment fields or control-point sequences of a plan. The given field shape is analyzed for complexity, dose rate, number of MU, field-size-related output, as well as factors for in-/out-of-field position and penumbra regions. Together with depth-related uncertainties, a 3D matrix is generated by a projection algorithm. Patient anatomy is included as an uncertainty CT data set as well. Object density is classified into 4 categories: close to water, lung, bone, and gradient regions, each with additional uncertainties. The result is then exported as a DICOM dose file by the software tool (written in IDL, Exelis), with the given resolution and target point. Results: Uncertainty matrices for several patient cases have been calculated and compared side by side in the planning system. The result is not always intuitive, but it clearly indicates high and low uncertainties related to OARs and target volumes as well as to measured gamma distributions. Conclusion: The imported uncertainty datasets may help the treatment planner understand the complexity of the treatment plan. The planner might then decide to change the plan to produce a better-suited uncertainty distribution; for example, by changing the beam angles the high-uncertainty spots can be influenced, or another treatment setup can be tried, resulting in a plan with lower uncertainties. A next step could be to include such a model in the optimization algorithm to add a new dose uncertainty constraint.
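
    The anatomy-classification step might look like the following sketch. Both the CT-number thresholds and the per-class uncertainty factors are entirely hypothetical stand-ins, since the abstract gives no values; only the four-class scheme (water-like, lung, bone, gradient) comes from the text.

    ```python
    import numpy as np

    # hypothetical per-class dose-uncertainty factors (not from the abstract)
    UNCERTAINTY = {"water": 0.01, "lung": 0.03, "bone": 0.02, "gradient": 0.05}

    def density_uncertainty(hu):
        """Map CT numbers (HU) to an assumed per-voxel uncertainty factor."""
        hu = np.asarray(hu)
        conditions = [
            np.abs(hu) < 50,              # close to water
            (hu >= -900) & (hu <= -400),  # lung
            hu > 300,                     # bone
        ]
        choices = [UNCERTAINTY["water"], UNCERTAINTY["lung"], UNCERTAINTY["bone"]]
        return np.select(conditions, choices, default=UNCERTAINTY["gradient"])
    ```

    Voxels outside the three explicit ranges fall into the gradient class, which the abstract treats as the most uncertain region.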

  18. Uncertainty in Bohr's response to the Heisenberg microscope

    NASA Astrophysics Data System (ADS)

    Tanona, Scott

    2004-09-01

    In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.

  19. Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, μ, computes a stability margin which directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The μ margins are robust margins which indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  20. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  1. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. 
Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides binary inundation patterns, IHAM generates probabilistic flood hazard maps that indicate the probability of raster cells being inundated. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows for the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and the randomness of dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that dike breach stochasticity has an increasing impact on hydrograph uncertainty in the downstream direction. Whereas in the upstream part of the reach the hydrograph uncertainty is mainly determined by the variability of the flood wave form, the dike failures strongly shape the uncertainty bounds in the downstream part of the reach. Finally, scenarios of polder deployment for the extreme floods with T = 200; 500; 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel. However, the capping of the flow peaks resulted in a considerable reduction of overtopping failures downstream of the polder, with a simultaneous slight increase in piping and slope micro-instability frequencies, explained by a more prolonged average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. 
With its major outputs in form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
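
    The fragility-based breach sampling inside the Monte Carlo loop can be sketched as follows. The logistic fragility curve, crest level, and all parameter values are invented for illustration; IHAM's actual fragility functions cover overtopping, piping and micro-instability, not just the single overtopping-style mechanism shown here.

    ```python
    import numpy as np

    def breach_probability(peak_levels, crest=5.0, frag_loc=0.3, frag_scale=0.1,
                           n=10_000, seed=1):
        """Monte Carlo estimate of dike breach probability from peak water levels."""
        rng = np.random.default_rng(seed)
        sampled = rng.choice(peak_levels, size=n)   # variability of flood waves
        head = sampled - crest                      # overtopping head
        # assumed logistic fragility: P(failure | head), zero without overtopping
        p_fail = np.where(head > 0.0,
                          1.0 / (1.0 + np.exp(-(head - frag_loc) / frag_scale)),
                          0.0)
        return float(np.mean(rng.random(n) < p_fail))
    ```

    Repeating this estimate per discretised dike section, over the ensemble of synthetic hydrographs, yields the per-section failure probabilities shown in the dike hazard maps.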

  2. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example when environmental models are employed in risk assessment. Insufficient data, inherent variability, and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models, or for models involving parametric variability and uncertainty of different natures. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, conflating vagueness with randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness and statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and a confidence interval given by α-cuts. An important property of this theory is its ability to merge the inexact data generated by the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled, with proper incorporation of uncertainty and variability. 
A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated to assess uncertainty propagation of parameter values for estimation of the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
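    The study's FLHS code is not reproduced here, but the plain LHS backbone, plus sampling a fuzzy parameter inside an assumed α-cut interval, can be sketched as follows (the distributions, the travel-time scale, and the α-cut bounds are illustrative assumptions, not values from the study):

    ```python
    import numpy as np

    def latin_hypercube(n_samples, n_vars, rng):
        """Stratified uniforms on [0, 1): one sample per stratum per variable,
        with the stratum order permuted independently for each variable."""
        strata = np.stack([rng.permutation(n_samples) for _ in range(n_vars)], axis=1)
        return (strata + rng.random((n_samples, n_vars))) / n_samples

    rng = np.random.default_rng(1)
    n = 200
    u = latin_hypercube(n, 2, rng)

    # Noncognitive (random) input: exponential travel time via the inverse-CDF
    # transform of the first LHS column (mean of 10, illustrative units).
    travel_time = -10.0 * np.log1p(-u[:, 0])

    # Cognitive (fuzzy) input: an alpha-cut of the membership function yields an
    # interval [lo, hi]; the second LHS column is spread across that interval.
    lo, hi = 0.2, 0.4          # assumed alpha-cut bounds for a retardation factor
    retardation = lo + (hi - lo) * u[:, 1]
    ```

    The stratification guarantees that every one of the n equal-probability strata is sampled exactly once per variable, which is what distinguishes LHS from plain Monte Carlo sampling.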

  3. Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation Measurement Doppler Lidar during XPIA: Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newsom, Rob

    2016-03-01

    In March and April of 2015, the ARM Doppler lidar that was formerly operated at the Tropical Western Pacific site in Darwin, Australia (S/N 0710-08) was deployed to the Boulder Atmospheric Observatory (BAO) for the eXperimental Planetary boundary-layer Instrument Assessment (XPIA) field campaign. The goal of the XPIA field campaign was to investigate methods of using multiple Doppler lidars to obtain high-resolution three-dimensional measurements of winds and turbulence in the atmospheric boundary layer, and to characterize the uncertainties in these measurements. The ARM Doppler lidar was one of many Doppler lidar systems that participated in this study. During XPIA the 300-m tower at the BAO site was instrumented with well-calibrated sonic anemometers at six levels. These sonic anemometers provided highly accurate reference measurements against which the lidars could be compared. Thus, the deployment of the ARM Doppler lidar during XPIA offered a rare opportunity for the ARM program to characterize the uncertainties in their lidar wind measurements. Results of the lidar-tower comparison indicate that the lidar wind speed measurements are essentially unbiased (~1 cm s-1), with a random error of approximately 50 cm s-1. Two methods of uncertainty estimation were tested. The first method was found to produce uncertainties that were too low. The second method produced estimates that were more accurate and better indicators of data quality. As of December 2015, the first method is being used by the ARM Doppler lidar wind value-added product (VAP). One outcome of this work will be to update this VAP to use the second method for uncertainty estimation.

  4. Unveiling the unicorn: a leader's guide to ACO preparation.

    PubMed

    Aslin, Paul

    2011-01-01

    The great uncertainty surrounding healthcare reform provides little incentive for action. However, as healthcare leaders wait for final rules and clarity about accountable care organizations (ACOs), inaction is the inappropriate response. Several central themes emerge from research about beginning the ACO process. Leaders should be able to understand and articulate ACO concepts. They should champion cultural change while partnering with physicians. An inventory of skills and capabilities should be taken to identify any gaps in what is required to implement an ACO. Finally, a plan should be formed by asking strategic questions about each platform needed, so that performance and strategic goals are at the forefront of decisions regarding the structure and function of an ACO. It takes a visionary leader to accept these challenges.

  5. Surrogate pregnancy: a guide for Canadian prenatal health care providers

    PubMed Central

    Reilly, Dan R.

    2007-01-01

    Providing health care for a woman with a surrogate pregnancy involves unique challenges. Although the ethical debate surrounding surrogacy continues, Canada has banned commercial, but not altruistic, surrogacy. In the event of a custody dispute between a surrogate mother and the individual(s) intending to parent the child, it is unclear how Canadian courts would rule. The prenatal health care provider must take extra care to protect the autonomy and privacy rights of the surrogate. There is limited evidence about the medical and psychological risks of surrogacy. Whether theoretical concerns about these risks are clinically relevant remains unknown. In the face of these uncertainties, the prenatal health care provider should have a low threshold for seeking obstetrical, social work, ethical and legal support. PMID:17296962

  6. Surrogate pregnancy: a guide for Canadian prenatal health care providers.

    PubMed

    Reilly, Dan R

    2007-02-13

    Providing health care for a woman with a surrogate pregnancy involves unique challenges. Although the ethical debate surrounding surrogacy continues, Canada has banned commercial, but not altruistic, surrogacy. In the event of a custody dispute between a surrogate mother and the individual(s) intending to parent the child, it is unclear how Canadian courts would rule. The prenatal health care provider must take extra care to protect the autonomy and privacy rights of the surrogate. There is limited evidence about the medical and psychological risks of surrogacy. Whether theoretical concerns about these risks are clinically relevant remains unknown. In the face of these uncertainties, the prenatal health care provider should have a low threshold for seeking obstetrical, social work, ethical and legal support.

  7. Price comparisons on the internet based on computational intelligence.

    PubMed

    Kim, Jun Woo; Ha, Sung Ho

    2014-01-01

    Information-intensive Web services such as price comparison sites have recently been gaining popularity. However, most users including novice shoppers have difficulty in browsing such sites because of the massive amount of information gathered and the uncertainty surrounding Web environments. Even conventional price comparison sites face various problems, which suggests the necessity of a new approach to address these problems. Therefore, for this study, an intelligent product search system was developed that enables price comparisons for online shoppers in a more effective manner. In particular, the developed system adopts linguistic price ratings based on fuzzy logic to accommodate user-defined price ranges, and personalizes product recommendations based on linguistic product clusters, which help online shoppers find desired items in a convenient manner.

  8. Price Comparisons on the Internet Based on Computational Intelligence

    PubMed Central

    Kim, Jun Woo; Ha, Sung Ho

    2014-01-01

    Information-intensive Web services such as price comparison sites have recently been gaining popularity. However, most users including novice shoppers have difficulty in browsing such sites because of the massive amount of information gathered and the uncertainty surrounding Web environments. Even conventional price comparison sites face various problems, which suggests the necessity of a new approach to address these problems. Therefore, for this study, an intelligent product search system was developed that enables price comparisons for online shoppers in a more effective manner. In particular, the developed system adopts linguistic price ratings based on fuzzy logic to accommodate user-defined price ranges, and personalizes product recommendations based on linguistic product clusters, which help online shoppers find desired items in a convenient manner. PMID:25268901
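    The paper's system itself is not reproduced here, but the core idea of linguistic price ratings via fuzzy logic can be sketched with triangular membership functions (the term names and breakpoints below are illustrative assumptions, not values from the paper):

    ```python
    def tri(x, a, b, c):
        """Triangular membership: rises from 0 at a to 1 at b, falls to 0 at c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Assumed linguistic terms for a product category priced in [0, 1000].
    TERMS = {
        "cheap":     lambda p: tri(p, -1.0, 0.0, 400.0),
        "moderate":  lambda p: tri(p, 200.0, 500.0, 800.0),
        "expensive": lambda p: tri(p, 600.0, 1000.0, 1601.0),
    }

    def rate(price):
        """Linguistic rating = the term with the highest membership at this price."""
        return max(TERMS, key=lambda t: TERMS[t](price))
    ```

    Because adjacent terms overlap, a user-defined price range can be matched by degree of membership rather than by hard cut-offs, which is the accommodation the abstract describes.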

  9. Progenitors of Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Hirschi, R.; Arnett, D.; Cristini, A.; Georgy, C.; Meakin, C.; Walkington, I.

    2017-02-01

    Massive stars have a strong impact on their surroundings, in particular when they produce a core-collapse supernova at the end of their evolution. In these proceedings, we review the general evolution of massive stars and their properties at collapse as well as the transition between massive and intermediate-mass stars. We also summarise the effects of metallicity and rotation. We then discuss some of the major uncertainties in the modelling of massive stars, with a particular emphasis on the treatment of convection in 1D stellar evolution codes. Finally, we present new 3D hydrodynamic simulations of convection in carbon burning and list key points to take from 3D hydrodynamic studies for the development of new prescriptions for convective boundary mixing in 1D stellar evolution codes.

  10. Beyond PSA: are new prostate cancer biomarkers of potential value to New Zealand doctors?

    PubMed

    Ng, Lance; Karunasinghe, Nishi; Benjamin, Challaraj S; Ferguson, Lynnette R

    2012-04-20

    The widespread introduction of prostate-specific antigen (PSA) screening has enhanced the early detection of prostate cancer within New Zealand. However, uncertainties associated with the test make it difficult to confidently differentiate low-risk patients from those who require a definitive diagnostic biopsy. In consequence, the decisions surrounding prostate cancer treatment become extremely difficult. A number of new tests have become available which might have the potential to complement current PSA screening. We review the best validated of these tests; although not currently available in clinical practice, some might have considerable potential to aid diagnosis, prognosis, and therapeutic decisions for men with prostate cancer in New Zealand.

  11. From medical invention to clinical practice: the reimbursement challenge facing new device procedures and technology--part 2: coverage.

    PubMed

    Raab, G Gregory; Parr, David H

    2006-10-01

    This paper, the second of 3 that discuss the reimbursement challenges facing new medical device technology in various issues of this journal, explains the key aspects of coverage that affect the adoption of medical devices. The process Medicare uses to make coverage determinations has become more timely and open over the past several years, but it still lacks the predictability that product innovators prefer. The continued uncertainty surrounding evidence requirements undermines the predictability needed for optimal product planning and innovation. Recent steps taken by the Centers for Medicare and Medicaid Services to provide coverage in return for evidence development should provide patients with access to promising new technologies and procedures while generating important evidence concerning their effectiveness.

  12. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described by probabilistic approaches than by deterministic ones. In practice, probabilistic hazard from PDCs is assessed by coupling deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling (MC), Polynomial Chaos Quadrature (PCQ), and Bayesian Linear Emulation (BLE)) are presented, and their advantages, limitations, and future potential are underlined. The key point in choosing a specific method rests on the balance between its computational cost, the physical reliability of the simulator, and the target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. 
In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
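    As a minimal illustration of coupling a deterministic simulator with Monte Carlo sampling, the sketch below uses a toy energy-cone runout model (horizontal reach = collapse height / tan(friction angle)); the input ranges and site distance are assumptions for illustration, not values from this work:

    ```python
    import math
    import random

    random.seed(0)

    def runout(collapse_height_m, friction_angle_deg):
        """Toy energy-cone runout: horizontal reach H / tan(phi)."""
        return collapse_height_m / math.tan(math.radians(friction_angle_deg))

    # Aleatory inputs (assumed ranges): collapse height 500-2000 m,
    # energy-cone friction angle 5-15 degrees.
    site_distance_m = 8000.0   # distance of the exposed site from the vent (assumed)
    n = 100_000
    hits = sum(
        runout(random.uniform(500.0, 2000.0), random.uniform(5.0, 15.0)) >= site_distance_m
        for _ in range(n)
    )
    p_hat = hits / n
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)   # Monte Carlo standard error
    print(f"P(PDC reaches site) ~ {p_hat:.3f} +/- {se:.3f}")
    ```

    The standard-error term makes the trade-off discussed above concrete: halving the Monte Carlo error requires four times as many simulator runs, which is why emulators such as BLE become attractive when each run is expensive.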

  13. Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced Biomass Feedstock Logistics Supply Chains in Kansas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long

    To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle due to the need to harvest, move, and preprocess biomass from multiple distances with variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size, and number of preprocessing depots constructed. 
However, this range can be minimized through optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.
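    A minimal sketch of the Monte Carlo approach for the transport contribution might look as follows; the emission factor, haul-distance distribution, and biomass energy content are illustrative assumptions, not the study's inputs:

    ```python
    import random
    import statistics

    random.seed(7)

    ENERGY_MJ_PER_TONNE = 16_000.0   # assumed energy content of densified biomass

    def transport_g_co2e_per_mj():
        """One Monte Carlo draw of the transport contribution to the footprint."""
        ef_g_per_tonne_km = random.uniform(60.0, 120.0)   # truck emission factor (assumed)
        haul_km = random.triangular(20.0, 160.0, 60.0)    # depot-to-biorefinery haul (assumed)
        return ef_g_per_tonne_km * haul_km / ENERGY_MJ_PER_TONNE

    draws = [transport_g_co2e_per_mj() for _ in range(50_000)]
    pct = statistics.quantiles(draws, n=100)
    lo, hi = pct[4], pct[94]          # 5th and 95th percentiles
    print(f"transport contribution, 5th-95th pct: {lo:.2f}-{hi:.2f} g CO2e/MJ")
    ```

    Propagating the spatially variable inputs this way yields an emissions interval rather than a point estimate, which is how the study expresses the 0.2-13 g CO2e/MJ transport range.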

  14. Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced Biomass Feedstock Logistics Supply Chains in Kansas

    DOE PAGES

    Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long; ...

    2014-11-04

    To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle due to the need to harvest, move, and preprocess biomass from multiple distances with variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size, and number of preprocessing depots constructed. 
However, this range can be minimized through optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.

  15. Geospatial decision support systems for societal decision making

    USGS Publications Warehouse

    Bernknopf, R.L.

    2005-01-01

    While science provides reliable information to describe and understand the earth and its natural processes, it can contribute more. There are many important societal issues in which scientific information can play a critical role. Science can add greatly to policy and management decisions to minimize loss of life and property from natural and man-made disasters, to manage water, biological, energy, and mineral resources, and in general, to enhance and protect our quality of life. However, the link between science and decision-making is often complicated and imperfect. Technical language and methods surround scientific research and the dissemination of its results. Scientific investigations often are conducted under different conditions, with different spatial boundaries, and in different timeframes than those needed to support specific policy and societal decisions. Uncertainty is not uniformly reported in scientific investigations. If society does not know that data exist, what the data mean, where to use the data, or how to include uncertainty when a decision has to be made, then science gets left out, or misused, in the decision-making process. This paper is about using Geospatial Decision Support Systems (GDSS) for quantitative policy analysis. Integrated natural and social science methods and tools in a Geographic Information System that respond to decision-making needs can be used to close the gap between science and society. The GDSS has been developed so that nonscientists can pose "what if" scenarios to evaluate hypothetical outcomes of policy and management choices. In this approach decision makers can evaluate the financial and geographic distribution of potential policy options and their societal implications. Actions, based on scientific information, can be taken to mitigate hazards, protect our air and water quality, preserve the planet's biodiversity, promote balanced land use planning, and judiciously exploit natural resources. 
Applications using the GDSS have demonstrated the benefits of utilizing science for policy decisions. Investment in science reduces decision-making uncertainty and reducing that uncertainty has economic value.

  16. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frothingham, David; Barker, Michelle; Buechi, Steve

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume. 
This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)

  17. Hydrology and phosphorus transport simulation in a lowland polder by a coupled modeling system.

    PubMed

    Yan, Renhua; Huang, Jiacong; Li, Lingling; Gao, Junfeng

    2017-08-01

    Modeling the rain-runoff processes and phosphorus transport processes in lowland polders is critical in finding reasonable measures to alleviate the eutrophication problem of downstream rivers and lakes. This study develops a lowland Polder Hydrology and Phosphorus modeling System (PHPS) by coupling the WALRUS-paddy model and an improved phosphorus module of a Phosphorus Dynamic model for lowland Polder systems (PDP). It considers some important hydrological characteristics, such as groundwater-unsaturated zone coupling, groundwater-surface water feedback, human-controlled irrigation and discharge, and detailed physical and biochemical cycles of phosphorus in surface water. The application of the model in the Jianwei polder shows that the simulated phosphorus matches well with the measured values. The high precision of this model, combined with its low input data requirement and efficient computation, makes it practical and easy to apply in the water resources management of Chinese polders. Parameter sensitivity analysis demonstrates that K_uptake, c_Q2, c_W1, and c_Q1 exert a significant effect on the modeled results, whereas K_resuspensionMax, K_settling, and K_mineralization have little effect on the modeled total phosphorus. Among the three types of uncertainties (i.e., parameter, initial condition, and forcing uncertainties), forcing uncertainty produces the strongest effect on the simulated phosphorus. The analysis of the annual phosphorus balance shows that, even considering the high import from irrigation and fertilization, lowland polders are capable of retaining phosphorus and reducing phosphorus export to surrounding aquatic ecosystems because of their special hydrological regulation regime. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Simulating the impacts of disturbances on forest carbon cycling in North America: Processes, data, models, and challenges

    USGS Publications Warehouse

    Liu, Shuguang; Bond-Lamberty, Ben; Hicke, Jeffrey A.; Vargas, Rodrigo; Zhao, Shuqing; Chen, Jing; Edburg, Steven L.; Hu, Yueming; Liu, Jinxun; McGuire, A. David; Xiao, Jingfeng; Keane, Robert; Yuan, Wenping; Tang, Jianwu; Luo, Yiqi; Potter, Christopher; Oeding, Jennifer

    2011-01-01

    Forest disturbances greatly alter the carbon cycle at various spatial and temporal scales. It is critical to understand disturbance regimes and their impacts to better quantify regional and global carbon dynamics. This review of the status and major challenges in representing the impacts of disturbances in modeling carbon dynamics across North America revealed some major advances and challenges. First, significant advances have been made in the representation, scaling, and characterization of disturbances that should be included in regional modeling efforts. Second, there is a need to develop effective and comprehensive process-based procedures and algorithms to quantify the immediate and long-term impacts of disturbances on ecosystem succession, soils, microclimate, and cycles of carbon, water, and nutrients. Third, our capability to simulate the occurrences and severity of disturbances is very limited. Fourth, scaling issues have rarely been addressed in continental scale model applications. It is not fully understood which finer scale processes and properties need to be scaled to coarser spatial and temporal scales. Fifth, there are inadequate databases on disturbances at the continental scale to support the quantification of their effects on the carbon balance in North America. Finally, procedures are needed to quantify the uncertainty of model inputs, model parameters, and model structures, and thus to estimate their impacts on overall model uncertainty. Working together, the scientific community interested in disturbance and its impacts can identify the most uncertain issues surrounding the role of disturbance in the North American carbon budget and develop working hypotheses to reduce the uncertainty.

  19. Decisional conflict in economically disadvantaged men with newly diagnosed prostate cancer: Results from a shared decision-making trial

    PubMed Central

    Kaplan, Alan L.; Crespi, Catherine M.; Saucedo, Josemanuel D.; Connor, Sarah E.; Litwin, Mark S.; Saigal, Christopher S.

    2015-01-01

    BACKGROUND Decisional conflict is a source of anxiety and stress for men diagnosed with prostate cancer given the uncertainty surrounding myriad treatment options. Few data exist to help clinicians identify which patients are at risk for decisional conflict. The purpose of this study was to examine factors associated with decisional conflict in economically disadvantaged men diagnosed with prostate cancer before any treatment choices were made. METHODS A total of 70 men with newly diagnosed localized prostate cancer, enrolled in a randomized trial testing a novel shared decision-making tool, were surveyed at a Veterans Administration clinic. Baseline demographic, clinical, and functional data were collected. Independent variables included age, race, education, comorbidity, relationship status, urinary/sexual dysfunction, and prostate cancer knowledge. Tested outcomes were the Decisional Conflict Scale, Uncertainty Subscale, and Perceived Effectiveness Subscale. Multiple linear regression modeling was used to identify factors associated with decisional conflict. RESULTS Mean age was 63 years, 49% were African American, and 70% reported an income less than $30,000. Poor prostate cancer knowledge was associated with increased decisional conflict and higher uncertainty (P < 0.001 and P = 0.001, respectively). Poor knowledge was also associated with lower perceived effectiveness (P = 0.003), whereas being in a relationship was associated with higher decisional conflict (P = 0.03). CONCLUSIONS Decreased patient knowledge about prostate cancer is associated with increased decisional conflict and lower perceived effective decision-making. Interventions to increase comprehension of prostate cancer and its treatments may reduce decisional conflict. Further work is needed to better characterize this relationship and identify effective targeted interventions. PMID:24816472

  20. Data Availability in Appliance Standards and Labeling Program Development and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romankiewicz, John; Khanna, Nina; Vine, Edward

    2013-05-01

    In this report, we describe the necessary data inputs for both standards development and program evaluation and perform an initial assessment of the availability and uncertainty of those data inputs in China. For standards development, we find that China and its standards and labeling program administrators currently have access to the basic market and technical data needed for conducting market and technology assessments and technological and economic analyses. Some data, such as shipments data, are readily available from the China Energy Label product registration database, while the availability of other data, including average unit energy consumption, prices, and design options, needs improvement. Unlike in some other countries such as the United States, most of the data necessary for conducting standards development analyses are not publicly available or compiled in a consolidated data source. In addition, improved data on design and efficiency options as well as cost data (e.g., manufacturing costs, mark-ups, production and product use-phase costs), key inputs to several technoeconomic analyses, are particularly needed given China's unconsolidated manufacturing industry. For program evaluation, we find that while China can conduct simple savings evaluations of its incentive programs with the data currently available from the Ministry of Finance (the program administrator), the savings estimates produced by such an evaluation will carry high uncertainty. As such, China could benefit from an increase in surveying and metering in the next one to three years to decrease the uncertainty surrounding key data points such as unit energy savings and free ridership.

  1. Lidar backscattering measurements of background stratospheric aerosols

    NASA Technical Reports Server (NTRS)

    Remsberg, E. E.; Northam, G. B.; Butler, C. F.

    1979-01-01

    A comparative lidar-dustsonde experiment was conducted in San Angelo, Texas, in May 1974 in order to estimate the uncertainties in stratospheric-aerosol backscatter for the NASA Langley 48-inch lidar system. The lidar calibration and data-analysis procedures are discussed. Results from the Texas experiment indicate random and systematic uncertainties of 35 and 63 percent, respectively, in backscatter from a background stratospheric-aerosol layer at 20 km.

  2. Uncertainty in solid precipitation and snow depth prediction for Siberia using the Noah and Noah-MP land surface models

    NASA Astrophysics Data System (ADS)

    Suzuki, Kazuyoshi; Zupanski, Milija

    2018-01-01

    In this study, we investigate the uncertainties associated with land surface processes in an ensemble prediction context. Specifically, we compare the uncertainties produced by a coupled atmosphere-land modeling system with two different land surface models, the Noah-MP land surface model (LSM) and the Noah LSM, by using the Maximum Likelihood Ensemble Filter (MLEF) data assimilation system as a platform for ensemble prediction. We carried out 24-hour prediction simulations in Siberia with 32 ensemble members beginning at 00:00 UTC on 5 March 2013. We then compared the model prediction uncertainty of snow depth and solid precipitation with observation-based research products and evaluated the standard deviation of the ensemble spread. The prediction skill and ensemble spread exhibited high positive correlation for both LSMs, indicating a realistic uncertainty estimation. The inclusion of a multiple snow-layer model in the Noah-MP LSM was beneficial for reducing the uncertainties of snow depth and snow depth change compared to the Noah LSM, but the uncertainty in daily solid precipitation showed minimal difference between the two LSMs. The impact of LSM choice in reducing temperature uncertainty was limited to surface layers of the atmosphere. In summary, we found that the more sophisticated Noah-MP LSM reduces uncertainties associated with land surface processes compared to the Noah LSM. Thus, using prediction models with improved skill implies improved predictability and greater certainty of prediction.
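The spread-skill relationship described above can be sketched numerically. All numbers below (ensemble size aside) are invented for illustration; the check is simply that grid points with larger ensemble spread also tend to have larger ensemble-mean error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 32-member ensemble of snow-depth predictions at 100 grid points,
# plus a synthetic "observed" field; names and numbers are illustrative only.
n_members, n_points = 32, 100
truth = rng.normal(50.0, 10.0, n_points)               # cm
# Each member = truth + an error whose magnitude varies by grid point
error_scale = rng.uniform(1.0, 8.0, n_points)
ensemble = truth + error_scale * rng.normal(size=(n_members, n_points))

ens_mean = ensemble.mean(axis=0)
spread = ensemble.std(axis=0, ddof=1)      # ensemble spread per grid point
skill = np.abs(ens_mean - truth)           # absolute error of the ensemble mean

# A realistic uncertainty estimate shows positive spread-skill correlation
corr = np.corrcoef(spread, skill)[0, 1]
print(f"spread-skill correlation: {corr:.2f}")
```

With uncertainty estimated realistically, the correlation comes out clearly positive, mirroring the diagnostic the abstract reports for both LSMs.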

  3. Essentialist beliefs, sexual identity uncertainty, internalized homonegativity and psychological wellbeing in gay men.

    PubMed

    Morandini, James S; Blaszczynski, Alexander; Ross, Michael W; Costa, Daniel S J; Dar-Nimrod, Ilan

    2015-07-01

    The present study examined essentialist beliefs about sexual orientation and their implications for sexual identity uncertainty, internalized homonegativity and psychological wellbeing in a sample of gay men. A combination of targeted sampling and snowball strategies was used to recruit 639 gay-identifying men for a cross-sectional online survey. Participants completed a questionnaire assessing sexual orientation beliefs, sexual identity uncertainty, internalized homonegativity, and psychological wellbeing outcomes. Structural equation modeling was used to test whether essentialist beliefs were associated with psychological wellbeing indirectly via their effect on sexual identity uncertainty and internalized homonegativity. A unique pattern of direct and indirect effects was observed in which facets of essentialism predicted sexual identity uncertainty, internalized homonegativity and psychological wellbeing. Of note, viewing sexual orientation as immutable/biologically based and as existing in discrete categories was associated with less sexual identity uncertainty. On the other hand, these beliefs had divergent relationships with internalized homonegativity, with immutability/biological beliefs associated with lower, and discreteness beliefs associated with greater, internalized homonegativity. Of interest, although sexual identity uncertainty was associated with poorer psychological wellbeing via its contribution to internalized homophobia, there was no direct relationship between identity uncertainty and psychological wellbeing. Findings indicate that essentializing sexual orientation has mixed implications for sexual identity uncertainty, internalized homonegativity and wellbeing in gay men. Those undertaking educational and clinical interventions with gay men should be aware of the benefits and caveats of essentialist theories of homosexuality for this population.

  4. Impact of measurement uncertainty from experimental load distribution factors on bridge load rating

    NASA Astrophysics Data System (ADS)

    Gangone, Michael V.; Whelan, Matthew J.

    2018-03-01

    Load rating and testing of highway bridges is important in determining the capacity of the structure. Experimental load rating utilizes strain transducers placed at critical locations of the superstructure to measure normal strains. These strains are then used in computing diagnostic performance measures (neutral axis of bending, load distribution factor) and ultimately a load rating. However, it has been shown that experimentally obtained strain measurements contain uncertainties associated with the accuracy and precision of the sensor and sensing system. These uncertainties propagate through to the diagnostic indicators that in turn transmit into the load rating calculation. This paper will analyze the effect that measurement uncertainties have on the experimental load rating results of a 3 span multi-girder/stringer steel and concrete bridge. The focus of this paper will be limited to the uncertainty associated with the experimental distribution factor estimate. For the testing discussed, strain readings were gathered at the midspan of each span of both exterior girders and the center girder. Test vehicles of known weight were positioned at specified locations on each span to generate maximum strain response for each of the five girders. The strain uncertainties were used in conjunction with a propagation formula developed by the authors to determine the standard uncertainty in the distribution factor estimates. This distribution factor uncertainty is then introduced into the load rating computation to determine the possible range of the load rating. The results show the importance of understanding measurement uncertainty in experimental load testing.
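The propagation step described above can be illustrated with a first-order (GUM-style) sketch for experimental distribution factors computed as DF_i = ε_i / Σ_j ε_j. The strain values and sensor uncertainty below are hypothetical, and this is a generic propagation formula, not necessarily the authors' exact expression:

```python
import numpy as np

# Illustrative midspan strains (microstrain) for five girders under a known
# test truck; the sensor standard uncertainty of 2 microstrain is assumed.
strain = np.array([120.0, 180.0, 210.0, 175.0, 115.0])
sigma = np.full_like(strain, 2.0)

S = strain.sum()
g = strain / S                      # experimental load distribution factors

# First-order propagation: dg_i/de_i = (S - e_i)/S^2, dg_i/de_j = -e_i/S^2
n = strain.size
J = -np.outer(strain, np.ones(n)) / S**2
J[np.diag_indices(n)] += 1.0 / S
u_g = np.sqrt((J**2 * sigma**2).sum(axis=1))   # standard uncertainty of each DF

for gi, ui in zip(g, u_g):
    print(f"DF = {gi:.3f} +/- {ui:.4f}")
```

The resulting standard uncertainties in the distribution factors can then be carried into the rating-factor equation to bound the load rating, as the paper describes.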

  5. Uncertainty in multispectral lidar signals caused by incidence angle effects

    PubMed Central

    Nevalainen, Olli; Hakala, Teemu; Kaasalainen, Mikko

    2018-01-01

    Multispectral terrestrial laser scanning (TLS) is an emerging technology. Several manufacturers already offer commercial dual or three wavelength airborne laser scanners, while multispectral TLS is still carried out mainly with research instruments. Many of these research efforts have focused on the study of vegetation. The aim of this paper is to study the uncertainty of the measurement of spectral indices of vegetation with multispectral lidar. Using two spectral indices as examples, we find that the uncertainty is due to systematic errors caused by the wavelength dependency of laser incidence angle effects. This finding is empirical, and the error cannot be removed by modelling or instrument modification. The discovery and study of these effects has been enabled by hyperspectral and multispectral TLS, and it has become a subject of active research within the past few years. We summarize the most recent studies on multi-wavelength incidence angle effects and present new results on the effect of specular reflection from the leaf surface, and the surface structure, which have been suggested to play a key role. We also discuss the consequences to the measurement of spectral indices with multispectral TLS, and a possible correction scheme using a synthetic laser footprint. PMID:29503718
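The wavelength-dependent incidence-angle effect can be illustrated with a toy model. The cosine-exponent falloff and all reflectance values below are assumptions for illustration, not the paper's measurements; the point is that if the angular falloff differs between channels, a normalized-difference index acquires a systematic, angle-dependent bias:

```python
import numpy as np

# Hypothetical two-wavelength TLS returns from a leaf at varying incidence
# angles (NIR ~800 nm, red ~680 nm); exponents chosen only to differ.
theta = np.deg2rad(np.arange(0, 61, 10))
rho_800, rho_680 = 0.50, 0.08            # assumed true leaf reflectances

# Wavelength-dependent angular falloff (assumed cosine-exponent model)
meas_800 = rho_800 * np.cos(theta) ** 0.5
meas_680 = rho_680 * np.cos(theta) ** 0.8

ndvi_true = (rho_800 - rho_680) / (rho_800 + rho_680)
ndvi_meas = (meas_800 - meas_680) / (meas_800 + meas_680)

for t, v in zip(np.rad2deg(theta), ndvi_meas):
    print(f"{t:4.0f} deg: index = {v:.3f} (true {ndvi_true:.3f})")
```

At normal incidence the measured index equals the true one, but it drifts systematically as the incidence angle grows, which is exactly why the error cannot be treated as random noise.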

  6. Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations

    NASA Technical Reports Server (NTRS)

    Pease, G. E.; Hendrickson, H. T.

    1980-01-01

    A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.

  7. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    NASA Astrophysics Data System (ADS)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation, the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity, most influences the predictive uncertainty. We also seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: Uncertainty in HETT is relatively small for early times and increases with transit times. Uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution. Introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias. Hydraulic head observations alone cannot constrain the uncertainty of HETT; however, an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. 
A complex model (`virtual reality') is then developed based on that conceptual model. This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.

  8. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    USGS Publications Warehouse

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. 
The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
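A stripped-down Monte Carlo sketch of the idea behind a tool like QUant (not its actual error model or defaults): perturb each discharge component by an assumed error distribution and read the total uncertainty off the simulated ensemble:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000  # Monte Carlo realizations

# Illustrative moving-boat ADCP measurement (m^3/s); component values and
# uncertainty levels below are assumptions for this sketch only.
q_measured = 80.0             # directly measured mid-section discharge
q_top, q_bottom = 12.0, 6.0   # extrapolated unmeasured top/bottom zones
q_edges = 2.0                 # estimated edge discharge

q_mc = (
    q_measured * rng.normal(1.0, 0.01, n)   # 1% random instrument error
    + q_top * rng.normal(1.0, 0.10, n)      # 10% top-extrapolation uncertainty
    + q_bottom * rng.normal(1.0, 0.10, n)   # 10% bottom-extrapolation uncertainty
    + q_edges * rng.normal(1.0, 0.30, n)    # edges are the least constrained
)

mean = q_mc.mean()
u95 = 1.96 * q_mc.std(ddof=1)
print(f"Q = {mean:.1f} m^3/s, 95% uncertainty = +/-{u95:.1f} m^3/s "
      f"({100 * u95 / mean:.1f}%)")
```

Even with the unmeasured zones contributing only a fifth of the total discharge, they dominate the simulated uncertainty budget, echoing the paper's finding about the importance of the unmeasured areas.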

  9. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. 
    Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. Figure: Schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecast.
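The MMFE assumption can be sketched as follows: a forecast of a fixed future inflow is revised each period by an independent zero-mean innovation, so the forecast sequence is a martingale and the error variance of a forecast issued k periods before realization grows linearly in k. All numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, horizon = 10_000, 8   # forecasts issued up to 8 periods ahead

# MMFE sketch (assumed additive form): independent zero-mean revisions
sigma_update = 5.0             # per-period revision std (m^3/s), assumed
initial_forecast = 100.0
updates = rng.normal(0.0, sigma_update, (n_paths, horizon))
forecasts = initial_forecast + np.cumsum(updates, axis=1)
realized = forecasts[:, -1]    # final revision equals the realization

# Under MMFE, the error of a forecast issued k periods before realization
# has variance k * sigma_update^2
for k in (1, 4, 8):
    f_k = initial_forecast if k == horizon else forecasts[:, horizon - k - 1]
    err = realized - f_k
    print(f"lead {k}: forecast-error std = {err.std(ddof=1):.2f} "
          f"(theory {sigma_update * np.sqrt(k):.2f})")
```

This linear growth of error variance with lead time is the "evolution of forecast uncertainty" that the study feeds into the DP, SDP, and SOP operation models.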

  10. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. 
The increased application of scientific data in the development of uncertainty factors for individual chemicals also has the benefit of increasing the transparency of occupational exposure limit derivation. Improved characterization of the scientific basis for uncertainty factors has led to increasing rigor and transparency in their application as part of the overall occupational exposure limit derivation process.
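The arithmetic of composite uncertainty factors is simple multiplication, followed by division of the point of departure. The values below are hypothetical, not any organization's defaults:

```python
# Sketch of composite uncertainty-factor arithmetic as commonly applied in
# occupational exposure limit (OEL) derivation; the point of departure and
# the factor values are invented for illustration.
point_of_departure = 50.0  # mg/m^3, e.g. a NOAEL from a subchronic animal study

uncertainty_factors = {
    "interspecies (animal-to-human)": 3,
    "intraspecies (human variability)": 10,
    "subchronic-to-chronic duration": 3,
    # LOAEL-to-NOAEL and database factors would be added here if needed
}

composite_uf = 1
for name, uf in uncertainty_factors.items():
    composite_uf *= uf

oel = point_of_departure / composite_uf
print(f"composite UF = {composite_uf}, OEL = {oel:.2f} mg/m^3")
```

Replacing a default factor with a chemical-specific adjustment factor, as the abstract describes, amounts to swapping one entry in the table for a data-derived value.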

  11. Uncertainty in recharge estimation: impact on groundwater vulnerability assessments for the Pearl Harbor Basin, O'ahu, Hawai'i, U.S.A.

    NASA Astrophysics Data System (ADS)

    Giambelluca, Thomas W.; Loague, Keith; Green, Richard E.; Nullet, Michael A.

    1996-06-01

    In this paper, uncertainty in recharge estimates is investigated relative to its impact on assessments of groundwater contamination vulnerability using a relatively simple pesticide mobility index, attenuation factor (AF). We employ a combination of first-order uncertainty analysis (FOUA) and sensitivity analysis to investigate recharge uncertainties for agricultural land on the island of O'ahu, Hawai'i, that is currently, or has been in the past, under sugarcane or pineapple cultivation. Uncertainty in recharge due to recharge component uncertainties is 49% of the mean for sugarcane and 58% of the mean for pineapple. The components contributing the largest amounts of uncertainty to the recharge estimate are irrigation in the case of sugarcane and precipitation in the case of pineapple. For a suite of pesticides formerly or currently used in the region, the contribution to AF uncertainty of recharge uncertainty was compared with the contributions of other AF components: retardation factor (RF), a measure of the effects of sorption; soil-water content at field capacity (ΘFC); and pesticide half-life (t1/2). Depending upon the pesticide, the contribution of recharge to uncertainty ranks second or third among the four AF components tested. The natural temporal variability of recharge is another source of uncertainty in AF, because the index is calculated using the time-averaged recharge rate. Relative to the mean, recharge variability is 10%, 44%, and 176% for the annual, monthly, and daily time scales, respectively, under sugarcane, and 31%, 112%, and 344%, respectively, under pineapple. In general, uncertainty in AF associated with temporal variability in recharge at all time scales exceeds AF. 
For chemicals such as atrazine or diuron under sugarcane, and atrazine or bromacil under pineapple, the range of AF uncertainty due to temporal variability in recharge encompasses significantly higher levels of leaching potential at some locations than that indicated by the AF estimate.
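A first-order uncertainty analysis (FOUA) of the attenuation factor can be sketched with the standard index form AF = exp(−0.693·d·RF·Θ_FC/(q·t_1/2)); all parameter values and relative uncertainties below are invented for illustration, not the paper's data:

```python
import numpy as np

# FOUA sketch: propagate parameter uncertainty through AF and rank the
# variance contributions of recharge (q), RF, theta_FC, and half-life.
def af(d, rf, theta, q, t_half):
    return np.exp(-0.693 * d * rf * theta / (q * t_half))

d = 1.0                                                     # m, assumed depth
vals = {"rf": 4.0, "theta": 0.3, "q": 0.5, "t_half": 0.2}   # q in m/yr, t in yr
cv = {"rf": 0.3, "theta": 0.1, "q": 0.5, "t_half": 0.4}     # assumed rel. unc.

base = af(d, **vals)
contrib = {}
for p in vals:
    # numerical partial derivative, central difference
    hi, lo = dict(vals), dict(vals)
    hi[p] *= 1.01
    lo[p] *= 0.99
    dAF_dp = (af(d, **hi) - af(d, **lo)) / (0.02 * vals[p])
    contrib[p] = (dAF_dp * cv[p] * vals[p]) ** 2   # first-order variance term

var = sum(contrib.values())
print(f"AF = {base:.3g} +/- {np.sqrt(var):.3g}")
for p, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"  {p}: {100 * c / var:.0f}% of variance")
```

Because AF is exponential in its arguments, the same machinery shows why a modest relative uncertainty in recharge can rank second or third among the AF components, as reported above.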

  12. GABA predicts visual intelligence.

    PubMed

    Cook, Emily; Hammett, Stephen T; Larsson, Jonas

    2016-10-06

    Early psychological researchers proposed a link between intelligence and low-level perceptual performance. It was recently suggested that this link is driven by individual variations in the ability to suppress irrelevant information, evidenced by the observation of strong correlations between perceptual surround suppression and cognitive performance. However, the neural mechanisms underlying such a link remain unclear. A candidate mechanism is neural inhibition by gamma-aminobutyric acid (GABA), but direct experimental support for GABA-mediated inhibition underlying suppression is inconsistent. Here we report evidence consistent with a global suppressive mechanism involving GABA underlying the link between sensory performance and intelligence. We measured visual cortical GABA concentration, visuo-spatial intelligence and visual surround suppression in a group of healthy adults. Levels of GABA were strongly predictive of both intelligence and surround suppression, with higher levels of intelligence associated with higher levels of GABA and stronger surround suppression. These results indicate that GABA-mediated neural inhibition may be a key factor determining cognitive performance and suggest a physiological mechanism linking surround suppression and intelligence.

  13. Statistics Analysis of the Uncertainties in Cloud Optical Depth Retrievals Caused by Three-Dimensional Radiative Effects

    NASA Technical Reports Server (NTRS)

    Varnai, Tamas; Marshak, Alexander

    2000-01-01

    This paper presents a simple approach to estimate the uncertainties that arise in satellite retrievals of cloud optical depth when the retrievals use one-dimensional radiative transfer theory for heterogeneous clouds that have variations in all three dimensions. For the first time, preliminary error bounds are set to estimate the uncertainty of cloud optical depth retrievals. These estimates can help us better understand the nature of uncertainties that three-dimensional effects can introduce into retrievals of this important product of the MODIS instrument. The probability distribution of resulting retrieval errors is examined through theoretical simulations of shortwave cloud reflection for a wide variety of cloud fields. The results are used to illustrate how retrieval uncertainties change with observable and known parameters, such as solar elevation or cloud brightness. Furthermore, the results indicate that a tendency observed in an earlier study, clouds appearing thicker for oblique sun, is indeed caused by three-dimensional radiative effects.

  14. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative, M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and the uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  15. Uncertainty and research needs for supplementing wild populations of anadromous Pacific salmon

    USGS Publications Warehouse

    Reisenbichler, R.R.

    2005-01-01

    Substantial disagreement and uncertainty attend the question of whether the benefits from supplementing wild populations of anadromous salmonids with hatchery fish outweigh the risks. Prudent decisions about supplementation are most likely when the suite of potential benefits and hazards and the various sources of uncertainty are explicitly identified. Models help by indicating the potential consequences of various levels of supplementation but perhaps are most valuable for showing the limitations of available data and helping design studies and monitoring to provide critical data. Information and understanding about the issue are deficient. I discuss various benefits, hazards, and associated uncertainties for supplementation, and implications for the design of monitoring and research. Several studies to reduce uncertainty and facilitate prudent supplementation are described and range from short-term reductionistic studies that help define the issue or help avoid deleterious consequences from supplementation to long-term studies (ca. 10 or more fish generations) that evaluate the net result of positive and negative genetic, behavioral, and ecological effects from supplementation.

  16. Multiobjective design of aquifer monitoring networks for optimal spatial prediction and geostatistical parameter estimation

    NASA Astrophysics Data System (ADS)

    Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.

    2013-06-01

    Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. 
Results indicate that the effect of uncertainties associated with the geostatistical parameters on the spatial prediction might be significantly alleviated (by up to 80% of the prior uncertainty in K and by 90% of the prior uncertainty in H) by sampling evenly distributed measurements with a spatial measurement density of more than 1 observation per 60 m × 60 m grid block. In addition, exploration of the interaction of objective functions indicates that the ability of head measurements to reduce the uncertainty associated with the correlation scale is comparable to the effect of hydraulic conductivity measurements.
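A minimal sketch of the ensemble Kalman filter update at the core of the DA algorithm, here with an invented two-component state (a log-conductivity and a head value) and a single head observation; the perturbed-observation form shown is one common variant, not necessarily the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented prior: 200 ensemble members of [log-K, head], correlated so that
# a head observation also informs log-K; all statistics are illustrative.
n_ens = 200
prior = rng.multivariate_normal([0.0, 10.0], [[1.0, 0.6], [0.6, 1.0]], n_ens).T

H = np.array([[0.0, 1.0]])     # we observe the head component only
obs, obs_err = 10.8, 0.2

# Kalman gain from the ensemble covariance
A = prior - prior.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)
S = H @ P @ H.T + obs_err**2
K = P @ H.T / S

# Perturbed-observation update: each member assimilates a noisy copy of obs
perturbed = obs + rng.normal(0.0, obs_err, n_ens)
posterior = prior + K @ (perturbed - H @ prior)

print("prior head std: ", prior[1].std(ddof=1).round(3))
print("post  head std: ", posterior[1].std(ddof=1).round(3))
print("log-K mean shift:", (posterior[0].mean() - prior[0].mean()).round(3))
```

The head uncertainty collapses toward the observation error, and the correlated log-K component shifts as well, which is how new measurements reduce uncertainty in both states and parameters in the framework described above.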

  17. Communication about scientific uncertainty in environmental nanoparticle research - a comparison of scientific literature and mass media

    NASA Astrophysics Data System (ADS)

    Heidmann, Ilona; Milde, Jutta

    2014-05-01

    Research on the fate and behavior of engineered nanoparticles in the environment is, despite their wide application, still at an early stage. That there is a high level of scientific uncertainty in nanoparticle research is often stated within the scientific community. Knowledge about these uncertainties might be of interest to other scientists, experts, and laymen. But how can these uncertainties be characterized, and are they communicated within the scientific literature and the mass media? To answer these questions, the current state of scientific knowledge about scientific uncertainty was characterized using the example of environmental nanoparticle research, and the communication of these uncertainties within the scientific literature was compared with media coverage in the field of nanotechnologies. The scientific uncertainty in the field of the environmental fate of nanoparticles is characterized by method uncertainties, by a general lack of data concerning the fate and effects of nanoparticles and their mechanisms in the environment, and by the uncertain transferability of results to environmental systems. In the scientific literature, scientific uncertainties, their sources, and their consequences are mentioned with different foci and to different extents. As expected, the authors of research papers focus on the certainty of specific results within their specific research questions, whereas in review papers the uncertainties due to a general lack of data are emphasized and the sources and consequences are discussed in a broader environmental context. In the mass media, nanotechnology is often framed as rather certain, and positive aspects and benefits are emphasized. Although they report on a new technology, only one-third of the media reports mention scientific uncertainties. Scientific uncertainties are most often mentioned together with risk, and they arise primarily from unknown harmful effects on human health. Environmental issues themselves are seldom mentioned. 
Scientific uncertainties, their sources, and their consequences are most widely discussed in the review papers. Research papers and the mass media tend to emphasize the certainty of scientific results or the benefits of nanotechnology applications. Neither the broad spectrum nor the specific character of these uncertainties has been communicated, indicating that there has been no effective dialogue over scientific uncertainty with the public so far.

  18. When, not if: The inescapability of an uncertain future

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Ballard, T.

    2014-12-01

    Uncertainty is an inherent feature of most scientific endeavours, and many political decisions must be made in the presence of scientific uncertainty. In the case of climate change, there is evidence that greater scientific uncertainty increases the risk associated with the impact of climate change. Scientific uncertainty thus provides an impetus for cutting emissions rather than delaying action. In contrast to those normative considerations, uncertainty is frequently cited in political and public discourse as a reason to delay mitigation. We examine ways in which this gap between public and scientific understanding of uncertainty can be bridged. In particular, we sought ways to communicate uncertainty that better calibrate people's risk perceptions with the projected impact of climate change. We report two behavioural experiments in which uncertainty about the future was expressed either as outcome uncertainty or as temporal uncertainty. The conventional presentation of uncertainty involves uncertainty about an outcome at a given time—for example, the range of possible sea level rise (say 50 cm +/- 20 cm) by a certain date. An alternative presentation of the same situation presents a certain outcome ("sea levels will rise by 50 cm") but places the uncertainty into the time of arrival ("this may occur as early as 2040 or as late as 2080"). We presented participants with a series of statements and graphs indicating projected increases in temperature, sea levels, and ocean acidification, and a decrease in Arctic sea ice. In the uncertain magnitude condition, the statements and graphs reported the upper and lower confidence bounds of the projected magnitude and the mean projected time of arrival. In the uncertain time of arrival condition, they reported the upper and lower confidence bounds of the projected time of arrival and the mean projected magnitude. 
The results show that when uncertainty was presented as an uncertain time of arrival rather than an uncertain outcome, people expressed greater concern about the projected outcomes. In a further experiment involving repeated "games" with a simulated economy, we similarly showed that people allocate more resources to mitigation when there is uncertainty about the timing of an adverse event rather than about the magnitude of its impact.
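The reframing described above can be sketched in a few lines. The numbers below are hypothetical, not the study's stimuli, and a simple linear rise starting in the year 2000 is assumed: the same rate uncertainty that produces "50 +/- 20 cm by 2060" is re-expressed as a time-of-arrival range for the fixed 50 cm outcome.

```python
# Hypothetical numbers, assuming a linear rise from the year 2000:
# re-express "50 +/- 20 cm by 2060" (outcome uncertainty) as an
# arrival-time range for the fixed 50 cm outcome (temporal uncertainty).
t0, t_ref = 2000, 2060
central, spread = 50.0, 20.0                 # cm projected by t_ref

r_lo = (central - spread) / (t_ref - t0)     # slowest rate, cm/yr
r_hi = (central + spread) / (t_ref - t0)     # fastest rate, cm/yr

arrive_early = t0 + central / r_hi           # fastest rate arrives first
arrive_late = t0 + central / r_lo

print(f"50 cm reached between {arrive_early:.0f} and {arrive_late:.0f}")
```

Note the asymmetry of the resulting interval: the arrival-time range is not symmetric around 2060, because arrival time varies as the reciprocal of the rate.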

  19. Assessment of composite index methods for agricultural vulnerability to climate change.

    PubMed

    Wiréhn, Lotten; Danielsson, Åsa; Neset, Tina-Simone S

    2015-06-01

    A common way of quantifying and communicating climate vulnerability is to calculate composite indices from indicators, visualizing these as maps. Inherent methodological uncertainties in vulnerability assessments, however, require greater attention. This study examines Swedish agricultural vulnerability to climate change, the aim being to review various indicator approaches for assessing agricultural vulnerability to climate change and to evaluate differences in climate vulnerability depending on the weighting and summarizing methods. The reviewed methods are evaluated by being tested at the municipal level. Three weighting and summarizing methods, representative of climate vulnerability indices in general, are analysed. The results indicate that 34 of 36 method combinations differ significantly from each other. We argue that representing agricultural vulnerability in a single composite index might be insufficient to guide climate adaptation. We emphasize the need for further research into how to measure and visualize agricultural vulnerability and into how to communicate uncertainties in both data and methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
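The sensitivity of composite indices to weighting and summarizing choices is easy to demonstrate. The indicator values below are invented for illustration (not the study's data): two hypothetical municipalities scored on three normalized indicators, aggregated with equal weights by two common rules.

```python
import numpy as np

# Two hypothetical municipalities, three normalized vulnerability
# indicators in [0, 1]; equal weights assumed.
X = np.array([
    [0.9, 0.1, 0.5],     # municipality A: one very low indicator
    [0.5, 0.5, 0.48],    # municipality B: uniformly moderate
])
w = np.array([1/3, 1/3, 1/3])

additive = X @ w                        # weighted arithmetic mean
geometric = np.prod(X ** w, axis=1)     # weighted geometric mean

# The two aggregation rules rank the municipalities differently:
# A appears more vulnerable under additive aggregation, but B under
# geometric aggregation, which penalizes the very low indicator of A.
print(additive, geometric)
```

This is the kind of method-driven divergence the study reports across 36 weighting-and-summarizing combinations.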

  20. Robust Bayesian linear regression with application to an analysis of the CODATA values for the Planck constant

    NASA Astrophysics Data System (ADS)

    Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens

    2018-02-01

    Weighted least-squares estimation is commonly applied in metrology to fit models to measurements that are accompanied by quoted uncertainties. The weights are chosen according to the quoted uncertainties. However, when data and model are inconsistent in view of the quoted uncertainties, this procedure does not yield adequate results. When it can be assumed that all uncertainties ought to be rescaled by a common factor, weighted least-squares estimation may still be used, provided that a simple correction is applied to the uncertainty obtained for the estimated model. We show that these uncertainties and credible intervals are robust, as they do not rely on the assumption of a Gaussian distribution of the data. Hence, common software for weighted least-squares estimation may still safely be employed in such a case, followed by a simple modification of the uncertainties obtained by that software. We also provide means of checking the assumptions of such an approach. The Bayesian regression procedure is applied to analyze the CODATA values for the Planck constant published over the past decades in terms of three different models: a constant model, a straight-line model, and a spline model. Our results indicate that the CODATA values may not have yet stabilized.
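The common-rescaling idea can be illustrated for the simplest case, a constant model (the weighted mean). The measurements below are invented, and the rescaling factor used is the Birge ratio, a standard choice in metrology for this situation; the paper's full Bayesian treatment is, of course, richer than this sketch.

```python
import numpy as np

# Hypothetical mutually inconsistent measurements with quoted
# standard uncertainties.
y = np.array([10.0, 10.8, 9.1, 11.2])
u = np.array([0.2, 0.3, 0.25, 0.3])

w = 1.0 / u**2
ybar = np.sum(w * y) / np.sum(w)         # weighted least-squares estimate
u_naive = np.sqrt(1.0 / np.sum(w))       # uncertainty if data were consistent

chi2 = np.sum(w * (y - ybar) ** 2)
birge = np.sqrt(chi2 / (len(y) - 1))     # > 1 flags inconsistency
u_corrected = birge * u_naive            # simple post-hoc rescaling

print(f"ybar = {ybar:.3f}, Birge ratio = {birge:.2f}, u = {u_corrected:.3f}")
```

When the Birge ratio is close to 1 the correction is negligible; here the data are deliberately inconsistent, so the corrected uncertainty is several times the naive one.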

  1. Techno-economic and uncertainty analysis of in situ and ex situ fast pyrolysis for biofuel production.

    PubMed

    Li, Boyan; Ou, Longwen; Dang, Qi; Meyer, Pimphan; Jones, Susanne; Brown, Robert; Wright, Mark

    2015-11-01

    This study evaluates the techno-economic uncertainty in cost estimates for two emerging technologies for biofuel production: in situ and ex situ catalytic pyrolysis. The probability distributions for the minimum fuel-selling price (MFSP) indicate that in situ catalytic pyrolysis has an expected MFSP of $1.11 per liter with a standard deviation of $0.29, while ex situ catalytic pyrolysis has a similar MFSP with a smaller deviation ($1.13 per liter and $0.21, respectively). These results suggest that a biorefinery based on ex situ catalytic pyrolysis could have a lower techno-economic uncertainty than one based on in situ pyrolysis, compensating for a slightly higher MFSP estimate. Analysis of how each parameter affects the NPV indicates that the internal rate of return, feedstock price, total project investment, electricity price, biochar yield, and bio-oil yield all have a substantial impact on the MFSP for both in situ and ex situ catalytic pyrolysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
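The kind of probability distribution reported above typically comes from Monte Carlo simulation over uncertain inputs. The sketch below uses an invented, linearized cost relation and assumed input distributions purely to show the mechanics; none of the parameter names, distributions, or coefficients are from the study.

```python
import numpy as np

# Toy Monte Carlo of MFSP uncertainty. The cost relation and input
# distributions are invented stand-ins, not the study's model.
rng = np.random.default_rng(42)
n = 100_000

feedstock = rng.triangular(60, 80, 110, n)       # $/tonne
oil_yield = rng.normal(0.55, 0.05, n)            # bio-oil mass fraction
capital = rng.normal(1.0, 0.15, n)               # normalized investment

# Hypothetical linearized relation between the inputs and MFSP ($/L).
mfsp = 0.4 * capital + 0.006 * feedstock + 0.25 / np.clip(oil_yield, 0.3, None)

print(f"expected MFSP ~ {mfsp.mean():.2f} $/L, sd {mfsp.std():.2f}")
```

The full distribution (not just mean and standard deviation) is what supports statements like "a lower techno-economic uncertainty than in situ pyrolysis".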

  2. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

    Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. The uncertainties and methods described in the paper have general relevance for many GIS applications.
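The general perturbation idea, independent of the paper's specific procedure, can be sketched as follows: randomly misclassify a small fraction of the input map and observe how a derived habitat summary responds. Everything here (the grid, the species locations, the "suitable" classes) is hypothetical, not the condor database.

```python
import numpy as np

# Hypothetical land-cover grid and species use-locations; classes
# 1 and 2 are treated as suitable habitat.
rng = np.random.default_rng(0)
landcover = rng.integers(0, 4, size=(100, 100))
sites = rng.integers(0, 100, size=(200, 2))

def fraction_suitable(grid):
    return np.mean([grid[r, c] in (1, 2) for r, c in sites])

base = fraction_suitable(landcover)

# Re-derive the summary under 5% random misclassification, 100 times.
samples = []
for _ in range(100):
    grid = landcover.copy()
    flip = rng.random(grid.shape) < 0.05
    grid[flip] = rng.integers(0, 4, size=int(flip.sum()))
    samples.append(fraction_suitable(grid))

print(f"baseline {base:.2f}, sd under perturbation {np.std(samples):.3f}")
```

A small spread of the perturbed summaries relative to the baseline is the signature of a robust habitat association.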

  3. A Department of Motor Vehicle-Based Intervention to Promote Organ Donor Registrations in New York State.

    PubMed

    Feeley, Thomas Hugh; Anker, Ashley E; Evans, Melanie; Reynolds-Tylus, Tobias

    2017-09-01

    This study examined the efficacy of educational training for motor vehicle representatives and the dissemination of promotional materials as means to promote organ donation enrollments in New York State, the goal being to increase the number of New York State residents who consent to donation through department of motor vehicle transactions during the project period. The setting comprised county-run motor vehicle offices across New York State; participants were customers who presented to New York Department of Motor Vehicle offices and the representatives who work at designated bureaus. The intervention consisted of point-of-decision materials, including promotional posters, brochures, and a website, together with training sessions for motor vehicle representatives. Measures included reasons for the enrollment decision, knowledge of and experience with donation, monthly consent rates, and enrollment in the state organ and tissue registry. Customers who elected not to register reported no reason or uncertainty surrounding enrollment. The representatives reported experience with donation, discussions with customers, and a need for additional education on organ donation. Enrollment cards were mailed to 799 project staff; counties whose offices participated in the intervention did not show significantly higher monthly enrollments when comparing pre- to post-intervention rates. Use of point-of-decision materials and enrollment cards proved an inexpensive method to register customers, with a 3.6% return rate. Customers reported a low (27%) enrollment rate and reticence to consent to donation. Educational training sessions with representatives did not yield significant enrollment increases when evaluating county-level enrollment data.

  4. The effects of time-varying observation errors on semi-empirical sea-level projections

    DOE PAGES

    Ruckert, Kelsey L.; Guan, Yawen; Bakker, Alexander M. R.; ...

    2016-11-30

    Sea-level rise is a key driver of projected flooding risks. The design of strategies to manage these risks often hinges on projections that inform decision-makers about the surrounding uncertainties. Producing semi-empirical sea-level projections is difficult, for example, due to the complexity of the error structure of the observations, such as time-varying (heteroskedastic) observation errors and autocorrelation of the data-model residuals. This raises the question of how neglecting the error structure impacts hindcasts and projections. Here, we quantify this effect on sea-level projections and parameter distributions by using a simple semi-empirical sea-level model. Specifically, we compare three model-fitting methods: a frequentist bootstrap as well as a Bayesian inversion with and without considering heteroskedastic residuals. All methods produce comparable hindcasts, but the parametric distributions and projections differ considerably based on methodological choices. In conclusion, our results show that the differences based on the methodological choices are enhanced in the upper tail projections. For example, the Bayesian inversion accounting for heteroskedasticity increases the sea-level anomaly with a 1% probability of being equaled or exceeded in the year 2050 by about 34% and about 40% in the year 2100 compared to a frequentist bootstrap. These results indicate that neglecting known properties of the observation errors and the data-model residuals can lead to low-biased sea-level projections.
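A minimal residual bootstrap of a semi-empirical rate model dS/dt = a(T - T0) conveys the flavor of the frequentist approach. The data below are synthetic, and resampling residuals i.i.d. deliberately ignores the heteroskedasticity and autocorrelation that the paper shows can matter, so this is the simplification under scrutiny, not a recommended procedure.

```python
import numpy as np

# Synthetic temperature anomalies (K) and sea-level rise rates (mm/yr).
rng = np.random.default_rng(1)
years = np.arange(1880, 2011)
T = 0.008 * (years - 1880) + rng.normal(0, 0.1, years.size)
dS = 1.5 * (T + 0.2) + rng.normal(0, 0.5, years.size)   # true a = 1.5

# Fit dS/dt = a*T + b (so T0 = -b/a) by ordinary least squares.
A = np.column_stack([T, np.ones_like(T)])
coef, *_ = np.linalg.lstsq(A, dS, rcond=None)
resid = dS - A @ coef

# Residual bootstrap of the sensitivity parameter a.
boot_a = []
for _ in range(1000):
    dS_star = A @ coef + rng.choice(resid, resid.size, replace=True)
    c, *_ = np.linalg.lstsq(A, dS_star, rcond=None)
    boot_a.append(c[0])

lo, hi = np.percentile(boot_a, [2.5, 97.5])
print(f"a = {coef[0]:.2f} mm/yr/K, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Replacing the i.i.d. resampling with a scheme that respects the residuals' variance structure is precisely where the methods compared in the paper diverge.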

  5. Impact of National Ambient Air Quality Standards Nonattainment Designations on Particulate Pollution and Health.

    PubMed

    Zigler, Corwin M; Choirat, Christine; Dominici, Francesca

    2018-03-01

    Despite dramatic air quality improvement in the United States over the past decades, recent years have brought renewed scrutiny and uncertainty surrounding the effectiveness of specific regulatory programs for continuing to improve air quality and public health outcomes. We employ causal inference methods and a spatial hierarchical regression model to characterize the extent to which a designation of "nonattainment" with the 1997 National Ambient Air Quality Standard for ambient fine particulate matter (PM2.5) in 2005 causally affected ambient PM2.5 and health outcomes among over 10 million Medicare beneficiaries in the Eastern United States in 2009-2012. We found that, on average across all retained study locations, reductions in ambient PM2.5 and Medicare health outcomes could not be conclusively attributed to the nonattainment designations against the backdrop of other regional strategies that impacted the entire Eastern United States. A more targeted principal stratification analysis indicates substantial health impacts of the nonattainment designations among the subset of areas where the designations are estimated to have actually reduced ambient PM2.5 beyond levels achieved by regional measures, with noteworthy reductions in all-cause mortality, chronic obstructive pulmonary disorder, heart failure, ischemic heart disease, and respiratory tract infections. These findings provide targeted evidence of the effectiveness of local control measures after nonattainment designations for the 1997 PM2.5 air quality standard.

  6. Human computer interactions in next-generation of aircraft smart navigation management systems: task analysis and architecture under an agent-oriented methodological approach.

    PubMed

    Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B

    2015-03-04

    The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic Systems (SATS) that relies on current technological advances. This challenge means a transition toward a new paradigm of navigation and air-traffic procedures, in which pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the cockpit. Our approach considers SATS as a large-scale distributed system operating under uncertainty in a dynamic environment; an approach based on multi-agent systems is therefore well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool for characterizing HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.

  7. Human Computer Interactions in Next-Generation of Aircraft Smart Navigation Management Systems: Task Analysis and Architecture under an Agent-Oriented Methodological Approach

    PubMed Central

    Canino-Rodríguez, José M.; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G.; Travieso-González, Carlos; Alonso-Hernández, Jesús B.

    2015-01-01

    The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic Systems (SATS) that relies on current technological advances. This challenge means a transition toward a new paradigm of navigation and air-traffic procedures, in which pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the cockpit. Our approach considers SATS as a large-scale distributed system operating under uncertainty in a dynamic environment; an approach based on multi-agent systems is therefore well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool for characterizing HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications. PMID:25746092

  8. The effects of time-varying observation errors on semi-empirical sea-level projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruckert, Kelsey L.; Guan, Yawen; Bakker, Alexander M. R.

    Sea-level rise is a key driver of projected flooding risks. The design of strategies to manage these risks often hinges on projections that inform decision-makers about the surrounding uncertainties. Producing semi-empirical sea-level projections is difficult, for example, due to the complexity of the error structure of the observations, such as time-varying (heteroskedastic) observation errors and autocorrelation of the data-model residuals. This raises the question of how neglecting the error structure impacts hindcasts and projections. Here, we quantify this effect on sea-level projections and parameter distributions by using a simple semi-empirical sea-level model. Specifically, we compare three model-fitting methods: a frequentist bootstrap as well as a Bayesian inversion with and without considering heteroskedastic residuals. All methods produce comparable hindcasts, but the parametric distributions and projections differ considerably based on methodological choices. In conclusion, our results show that the differences based on the methodological choices are enhanced in the upper tail projections. For example, the Bayesian inversion accounting for heteroskedasticity increases the sea-level anomaly with a 1% probability of being equaled or exceeded in the year 2050 by about 34% and about 40% in the year 2100 compared to a frequentist bootstrap. These results indicate that neglecting known properties of the observation errors and the data-model residuals can lead to low-biased sea-level projections.

  9. Reducing the Risk of Human Space Missions with INTEGRITY

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.

    2003-01-01

    The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
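The quantitative core of a PRA can be sketched: propagate uncertain subsystem failure probabilities through the system logic and report the uncertainty band around the system-level estimate. The subsystems, distributions, and magnitudes below are all hypothetical, and independence between subsystems is an explicit simplifying assumption.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Lognormal uncertainty on each subsystem's failure probability
# (hypothetical subsystems and magnitudes).
p_life_support = rng.lognormal(np.log(1e-3), 0.5, n)
p_propulsion = rng.lognormal(np.log(5e-4), 0.7, n)
p_avionics = rng.lognormal(np.log(2e-4), 0.4, n)

# Series logic: the mission fails if any subsystem fails
# (subsystem failures assumed independent).
p_mission = 1 - (1 - p_life_support) * (1 - p_propulsion) * (1 - p_avionics)

med, lo, hi = np.percentile(p_mission, [50, 5, 95])
print(f"P(loss of mission): median {med:.1e}, 90% band [{lo:.1e}, {hi:.1e}]")
```

The width of the 90% band is the "degree of uncertainty surrounding the probability" that the abstract refers to, displayed alongside the point estimate.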

  10. Method development for analysis of urban dust using scanning electron microscopy with energy dispersive x-ray spectrometry to detect the possible presence of world trade center dust constituents

    USGS Publications Warehouse

    Bern, A.M.; Lowers, H.A.; Meeker, G.P.; Rosati, J.A.

    2009-01-01

    The collapse of the World Trade Center Towers on September 11, 2001, sent dust and debris across much of Manhattan and the surrounding areas. Indoor and outdoor dust samples were collected and characterized by U.S. Geological Survey (USGS) scientists using scanning electron microscopy with energy-dispersive spectrometry (SEM/EDS). From this characterization, the U.S. Environmental Protection Agency and USGS developed a particulate screening method to determine the presence of residual World Trade Center dust in the indoor environment using slag wool as a primary "signature". The method describes a procedure that includes splitting, ashing, and sieving of collected dust. From one split, a 10 mg/mL dust/isopropanol suspension was prepared and 10-30 µL aliquots of the suspension were placed on an SEM substrate. Analyses were performed using SEM/EDS manual point counting for slag wool fibers. Poisson regression was used to identify some of the sources of uncertainty, which are directly related to the small number of fibers present on each sample stub. Preliminary results indicate that the procedure is promising for screening urban background dust for the presence of WTC dust. Consistent sample preparation of reference materials and samples must be performed by each laboratory wishing to use this method to obtain meaningful and accurate results. © 2009 American Chemical Society.
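The point that uncertainty is driven by the small number of fibers per stub follows directly from Poisson counting statistics; a quick check of the scaling:

```python
import math

# For a Poisson count N, the standard uncertainty is sqrt(N), so the
# relative uncertainty of a fiber count shrinks only as 1/sqrt(N).
for n in (4, 16, 100):
    rel = 1 / math.sqrt(n)
    print(f"{n:3d} fibers counted -> {100 * rel:.0f}% relative uncertainty")
```

Quadrupling the number of fibers counted therefore only halves the relative uncertainty, which is why consistent preparation (and enough counted fibers) matters for a usable screening result.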

  11. Stochastic integrated assessment of climate tipping points indicates the need for strict climate policy

    NASA Astrophysics Data System (ADS)

    Lontzek, Thomas S.; Cai, Yongyang; Judd, Kenneth L.; Lenton, Timothy M.

    2015-05-01

    Perhaps the most 'dangerous' aspect of future climate change is the possibility that human activities will push parts of the climate system past tipping points, leading to irreversible impacts. The likelihood of such large-scale singular events is expected to increase with global warming, but is fundamentally uncertain. A key question is how should the uncertainty surrounding tipping events affect climate policy? We address this using a stochastic integrated assessment model, based on the widely used deterministic DICE model. The temperature-dependent likelihood of tipping is calibrated using expert opinions, which we find to be internally consistent. The irreversible impacts of tipping events are assumed to accumulate steadily over time (rather than instantaneously), consistent with scientific understanding. Even with conservative assumptions about the rate and impacts of a stochastic tipping event, today's optimal carbon tax is increased by ~50%. For a plausibly rapid, high-impact tipping event, today's optimal carbon tax is increased by >200%. The additional carbon tax to delay climate tipping grows at only about half the rate of the baseline carbon tax. This implies that the effective discount rate for the costs of stochastic climate tipping is much lower than the discount rate for deterministic climate damages. Our results support recent suggestions that the costs of carbon emission used to inform policy are being underestimated, and that uncertain future climate damages should be discounted at a low rate.

  12. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. 
The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
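The scaling argument about correlated errors can be made concrete. For n data with equal standard uncertainty u and equal pairwise error correlation rho, the mean has u_mean = u*sqrt((1 - rho)/n + rho); the random part averages away as 1/sqrt(n), but the correlated part does not. The values below are illustrative only.

```python
import math

def u_of_mean(u, n, rho):
    # Uncertainty of the mean of n equally uncertain, equally
    # correlated data: Var(mean) = u^2 * ((1 - rho)/n + rho).
    return u * math.sqrt((1 - rho) / n + rho)

u = 0.5  # per-datum standard uncertainty, e.g. kelvin
for n in (10, 1_000, 100_000):
    print(n, round(u_of_mean(u, n, 0.0), 4), round(u_of_mean(u, n, 0.1), 4))
```

With rho = 0.1 the uncertainty of the mean plateaus near u*sqrt(rho) no matter how many data are averaged, which is why error effects that are negligible per datum can dominate a CDR on large space scales and long timescales.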

  13. Association of reticular cells with CD34+/Sca-1+ apoptotic cells in the hemopoietic organ of grasshopper, Euprepocnemis shirakii.

    PubMed

    Lim, Jong Yeon; Lee, Bong Hee; Kang, Seok Woo; Wago, Haruhisa; Han, Sung Sik

    2004-07-01

    Hemopoiesis in orthopteran insects occurs in a hemopoietic organ that is located bilaterally along the aorta. This organ is also known as a reticulo-hemopoietic organ because of the rich presence of reticular cells. This study was performed to further elucidate hemopoiesis in the reticulo-hemopoietic organ of an orthopteran, Euprepocnemis shirakii. We focused on the question of why reticular cells are so abundant (35% of cells in the hemopoietic organ). Interestingly, 21% of these reticular cells surrounded hemocytes with their reticular cytoplasm. The surrounded hemocytes were distinguished by their different size and darkly stained nuclei. These cells were characterized by immunostaining using antibodies against several types of hemocytes: 45% of the surrounded hemocytes were CD34+, and over 85% of these positive cells were double stained when immunostained with another hemopoietic pluripotent cell marker, Sca-1. Transmission electron microscopic analysis showed that reticular cells surrounded hemocytes containing large nuclei and poorly developed cytoplasmic organelles. This strongly suggests that the reticular cells surround hemopoietic stem cells. Additionally, the surrounded hemopoietic progenitor cells were undergoing apoptosis, as indicated by the TUNEL assay. The enclosed apoptotic cells are engulfed and then phagocytosed by reticular cells. Our results suggest that reticular cells are related to the differentiation and apoptosis of hemopoietic stem cells.

  14. Isohaline position as a habitat indicator for estuarine populations

    USGS Publications Warehouse

    Jassby, Alan D.; Kimmerer, W.J.; Monismith, Stephen G.; Armor, C.; Cloern, James E.; Powell, T.M.; Vedlinski, Timothy J.

    1995-01-01

    The striped bass survival data were also used to illustrate a related important point: incorporating additional explanatory variables may decrease the prediction error for a population or process, but it can increase the uncertainty in parameter estimates and management strategies based on these estimates. Even in cases where the uncertainty is currently too large to guide management decisions, an uncertainty analysis can identify the most practical direction for future data acquisition.
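The trade-off described above is easy to reproduce with synthetic data (not the striped bass data): adding a collinear explanatory variable lowers the residual prediction error while inflating the standard error of the original coefficient, the classic variance-inflation effect.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.5, size=n)   # collinear with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)     # both truly matter

def ols(X, y):
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    s2 = resid @ resid / (n - X.shape[1])
    return beta, np.sqrt(s2 * np.diag(XtX_inv)), np.sqrt(s2)

_, se1, rmse1 = ols(x1[:, None], y)                # x1 only
_, se2, rmse2 = ols(np.column_stack([x1, x2]), y)  # x1 and x2

# Residual (prediction) error drops, but the uncertainty of x1's
# coefficient grows because x1 and x2 are nearly redundant.
print(f"rmse {rmse1:.2f} -> {rmse2:.2f}; se(x1) {se1[0]:.3f} -> {se2[0]:.3f}")
```

Which side of this trade-off matters depends on the goal: prediction favors the richer model, while a management strategy keyed to an individual parameter may be better served by the simpler one.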

  15. Projecting future precipitation and temperature at sites with diverse climate through multiple statistical downscaling schemes

    NASA Astrophysics Data System (ADS)

    Vallam, P.; Qin, X. S.

    2017-10-01

    Anthropogenic-driven climate change would affect the global ecosystem and is becoming a world-wide concern. Numerous studies have been undertaken to determine the future trends of meteorological variables at different scales. Despite these studies, there remains significant uncertainty in the prediction of future climates. To examine the uncertainty arising from using different schemes to downscale the meteorological variables for the future horizons, projections from different statistical downscaling schemes were examined. These schemes included statistical downscaling method (SDSM), change factor incorporated with LARS-WG, and bias corrected disaggregation (BCD) method. Global circulation models (GCMs) based on CMIP3 (HadCM3) and CMIP5 (CanESM2) were utilized to perturb the changes in the future climate. Five study sites (i.e., Alice Springs, Edmonton, Frankfurt, Miami, and Singapore) with diverse climatic conditions were chosen for examining the spatial variability of applying various statistical downscaling schemes. The study results indicated that the regions experiencing heavy precipitation intensities were most likely to demonstrate the divergence between the predictions from various statistical downscaling methods. Also, the variance computed in projecting the weather extremes indicated the uncertainty derived from selection of downscaling tools and climate models. This study could help gain an improved understanding about the features of different downscaling approaches and the overall downscaling uncertainty.
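Of the downscaling families named above, the change-factor approach is the simplest to sketch: observed station values are scaled by the ratio of the GCM's future to historical climatology. The numbers below are invented for illustration.

```python
import numpy as np

obs = np.array([120.0, 95.0, 60.0, 30.0])       # observed monthly precip, mm
gcm_hist = np.array([100.0, 90.0, 70.0, 40.0])  # GCM historical climatology
gcm_fut = np.array([110.0, 99.0, 63.0, 30.0])   # GCM future climatology

change_factor = gcm_fut / gcm_hist              # multiplicative change factor
downscaled = obs * change_factor                # 132.0, 104.5, 54.0, 22.5 mm

print(downscaled)
```

Regression-based schemes such as SDSM instead model station variables as functions of large-scale predictors, which is one reason the study finds the methods diverging most where precipitation intensities are heavy.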

  16. A Measurement of the proton-proton inelastic scattering cross-section at center-of-mass energy = 7 TeV with the ATLAS detector at the LHC

    NASA Astrophysics Data System (ADS)

    Tompkins, Lauren Alexandra

    The first measurement of the inelastic cross-section for proton-proton collisions at a center-of-mass energy of 7 TeV using the ATLAS detector at the Large Hadron Collider is presented. From a dataset corresponding to an integrated luminosity of 20 inverse microbarns, events are selected by requiring activity in scintillation counters mounted in the forward region of the ATLAS detector. An inelastic cross-section of 60.1 +/- 2.1 millibarns is measured for the subset of events visible to the scintillation counters. The uncertainty includes the statistical and systematic uncertainty on the measurement. The visible events satisfy xi > 5 x 10^-6, where xi = M_X^2/s is calculated from the invariant mass, M_X, of hadrons selected using the largest rapidity gap in the event. For diffractive events this corresponds to requiring at least one of the dissociation masses to be larger than 15.7 GeV. Using an extrapolation dependent on the model for the differential diffractive mass distribution, an inelastic cross-section of 69.1 +/- 2.4 (exp) +/- 6.9 (extr) millibarns is determined, where (exp) indicates the experimental uncertainties and (extr) indicates the uncertainty due to the extrapolation from the limited xi range to the full inelastic cross-section.
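Two of the quoted numbers can be checked directly from the kinematic definition xi = M_X^2/s; the quadrature combination in the second step is a common convention, assumed here rather than stated in the record.

```python
import math

# xi = M_X^2 / s, so at sqrt(s) = 7 TeV the cut xi > 5e-6 corresponds
# to a minimum dissociation mass of sqrt(5e-6) * 7000 GeV.
m_x_min = math.sqrt(5e-6) * 7000.0
print(round(m_x_min, 1))            # 15.7 GeV, matching the quoted value

# Combining the experimental and extrapolation uncertainties on the
# extrapolated cross-section in quadrature (an assumed convention):
total = math.hypot(2.4, 6.9)        # mb
print(round(total, 1))              # 7.3 mb
```

The extrapolation term dominates the combined uncertainty, which is why the record quotes the two components separately.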

  17. Range of earth structure nonuniqueness implied by body wave observations.

    NASA Technical Reports Server (NTRS)

    Wiggins, R. A.; McMechan, G. A.; Toksoz, M. N.

    1973-01-01

    The Herglotz-Wiechert integral for the direct inversion of ray parameter versus distance curves can be manipulated to find the envelope of all possible models consistent with geometrical body wave observations (travel time and ray parameter versus distance). Such an extremal inversion approach has been used to find the uncertainty bounds for the velocity structure in the mantle and core. It is found, for example, that there is an uncertainty of plus or minus 40 km in the radius of the inner core boundary, plus or minus 18 km at the core-mantle boundary, and plus or minus 35 km at the 435-km transition zone. The velocity uncertainty is about plus or minus 0.08 km/sec for P and S waves in the lower mantle and about plus or minus 0.20 km/sec in the core. Experiments with various combinations of ray types in the core indicate that rather crude observations of SKKS-SKS travel times confine the range of possible models far more dramatically than do the most precise estimates of PmKP travel times. Comparisons of results from extremal inversion and linearized perturbation inversions indicate that body wave behavior is too strongly nonlinear for linearized schemes to be effective for predicting uncertainty.

  18. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed assuming high-level Energy Performance Indicator (EnPI) uncertainty as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of energy management policy and action plans; responsibility level for energy issues; employees’ training and motivation with respect to energy problems; absence/presence of adequate infrastructures for monitoring and sharing of energy information; level of standardization and integration of methods and procedures linked to energy activities; economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is carried out. The methodology, experimentally validated, allows developing useful considerations for effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology allows a reliable and resolved assessment of the EMS status, also in dynamic industrial contexts.

  19. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and the inter-annual climate variability due to year-to-year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number is small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil property uncertainty to another source of permafrost uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
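    For readers unfamiliar with the Stefan number invoked above, it is the ratio of sensible heat to latent heat of phase change, Ste = c dT / L; a small value means latent heat dominates. A sketch with generic textbook property values, not the study's site-specific parameters:

```python
# Illustrative Stefan number for thawing ground; property values are
# generic approximations, not those calibrated in the study.
c_p_ice = 2108.0        # specific heat of ice, J/(kg K) (approx.)
latent_fusion = 3.34e5  # latent heat of fusion of water, J/kg
delta_t = 5.0           # characteristic temperature difference, K

stefan = c_p_ice * delta_t / latent_fusion
print(round(stefan, 4))  # -> 0.0316, well below 1: latent heat dominates
```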

  20. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales and how it may change in the future under climate change is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in its physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation.
The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  1. Uncertainties in climate change projections for viticulture in Portugal

    NASA Astrophysics Data System (ADS)

    Fraga, Helder; Malheiro, Aureliano C.; Moutinho-Pereira, José; Pinto, Joaquim G.; Santos, João A.

    2013-04-01

    The assessment of climate change impacts on viticulture is often carried out using regional climate model (RCM) outputs. These studies rely on either multi-model ensembles or on single-model approaches. The RCM ensembles account for uncertainties inherent to the different models. In this study, using a 16-RCM ensemble under the IPCC A1B scenario, the climate change signal (future minus recent past, 2041-2070 minus 1961-2000) of 4 bioclimatic indices (Huglin Index - HI, Dryness Index - DI, Hydrothermal Index - HyI and Composite Index - CompI) over mainland Portugal is analysed. A normalized interquartile range (NIQR) of the 16-member ensemble for each bioclimatic index is assessed in order to quantify the ensemble uncertainty. The results show significant increases in the HI index over most of Portugal, with higher values in the Alentejo, Trás-os-Montes and Douro/Porto wine regions, also depicting very low uncertainty. Conversely, the decreases in the DI pattern throughout the country show large uncertainties, except in Minho (northwestern Portugal), where precipitation reaches the highest amounts in Portugal. The HyI shows significant decreases in northwestern Portugal, with relatively low uncertainty all across the country. The CompI depicts significant decreases over Alentejo and increases over Minho, though the decreases over Alentejo reveal high uncertainty, while the increases over Minho show low uncertainty. The assessment of the uncertainty in climate change projections is of great relevance for the wine industry. Quantifying this uncertainty is crucial, since different models may lead to quite different outcomes, and this spread may thereby be as crucial as climate change itself to the winemaking sector. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692.
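    The ensemble-spread metric used above can be sketched in a few lines. One plausible reading of a normalized interquartile range (the paper's exact normalization may differ) is the interquartile range of the 16 member signals divided by their median; the ensemble values below are hypothetical:

```python
import statistics

def niqr(values):
    # Interquartile range divided by the median -- one plausible
    # normalization; the paper's exact definition may differ.
    q1, _q2, q3 = statistics.quantiles(values, n=4)
    return (q3 - q1) / statistics.median(values)

# Hypothetical climate-change signal of a bioclimatic index from a
# 16-member RCM ensemble (arbitrary illustrative numbers).
ensemble_signal = [210, 213, 215, 217, 218, 219, 220, 221,
                   222, 223, 224, 225, 226, 228, 229, 230]
print(round(niqr(ensemble_signal), 3))  # -> 0.038, i.e. low spread
```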

  2. Quantifying geological uncertainty for flow and transport modeling in multi-modal heterogeneous formations

    NASA Astrophysics Data System (ADS)

    Feyen, Luc; Caers, Jef

    2006-06-01

    In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Hereby, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. 
Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.

  3. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabharwall, Piyush; Skifton, Richard; Stoots, Carl

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. Currently, this study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Then, each uncertainty source is mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
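    The cross-correlation step described above can be illustrated with a toy one-dimensional example; real PIV correlates 2-D interrogation windows, so this sketch is illustrative only:

```python
# Toy 1-D version of the PIV cross-correlation: the particle displacement
# between two frames is taken as the lag that maximizes the correlation
# of the two intensity signals.

def cross_correlation_peak(frame_a, frame_b, max_lag):
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(a * frame_b[i + lag]
                    for i, a in enumerate(frame_a)
                    if 0 <= i + lag < len(frame_b))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# frame_b is frame_a shifted right by 3 pixels (a known displacement).
frame_a = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0, 0, 0]
frame_b = [0, 0, 0, 0, 0, 1, 3, 7, 3, 1, 0, 0]
print(cross_correlation_peak(frame_a, frame_b, max_lag=5))  # -> 3
```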

  4. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.

  5. Examining the specific dimensions of distress tolerance that prospectively predict perceived stress.

    PubMed

    Bardeen, Joseph R; Fergus, Thomas A; Orcutt, Holly K

    2017-04-01

    We examined five dimensions of distress tolerance (i.e. uncertainty, ambiguity, frustration, negative emotion, physical discomfort) as prospective predictors of perceived stress. Undergraduate students (N = 135) completed self-report questionnaires over the course of two assessment sessions (T1 and T2). Results of a linear regression in which the five dimensions of distress tolerance and covariates (i.e. T1 perceived stress, duration between T1 and T2) served as predictor variables and T2 perceived stress served as the outcome variable showed that intolerance of uncertainty was the only dimension of distress tolerance to predict T2 perceived stress. To better understand this prospective association, we conducted a post hoc analysis simultaneously regressing two subdimensions of intolerance of uncertainty on T2 perceived stress. The subdimension representing beliefs that "uncertainty has negative behavioral and self-referent implications" significantly predicted T2 perceived stress, while the subdimension indicating that "uncertainty is unfair and spoils everything" did not. Results support a growing body of research suggesting intolerance of uncertainty as a risk factor for a wide variety of maladaptive psychological outcomes. Clinical implications will be discussed.

  6. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  7. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.

  8. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
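    The uncertainty-budget arithmetic described above follows the usual GUM-style root-sum-of-squares of sensitivity-weighted components; a sketch with hypothetical component names and magnitudes, not NASA Langley's actual budget:

```python
import math

# GUM-style uncertainty budget sketch: each component's standard
# uncertainty u_i enters through a sensitivity coefficient c_i, and the
# combined standard uncertainty is the root-sum-of-squares of c_i * u_i.
# Component names and magnitudes below are hypothetical.
budget = [
    # (component,                  c_i,  u_i in dB)
    ("microphone calibration",     1.0,  0.3),
    ("loudspeaker repeatability",  1.0,  0.4),
    ("door pressure fluctuations", 0.5,  0.2),
]

u_combined = math.sqrt(sum((c * u) ** 2 for _, c, u in budget))
print(round(u_combined, 2))  # -> 0.51
```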

  9. Maximum warming occurs about one decade after a carbon dioxide emission

    NASA Astrophysics Data System (ADS)

    Ricke, Katharine L.; Caldeira, Ken

    2014-12-01

    It is known that carbon dioxide emissions cause the Earth to warm, but no previous study has focused on examining how long it takes to reach maximum warming following a particular CO2 emission. Using conjoined results of carbon-cycle and physical-climate model intercomparison projects (Taylor et al 2012, Joos et al 2013), we find the median time between an emission and maximum warming is 10.1 years, with a 90% probability range of 6.6-30.7 years. We evaluate uncertainties in timing and amount of warming, partitioning them into three contributing factors: carbon cycle, climate sensitivity and ocean thermal inertia. If uncertainty in any one factor is reduced to zero without reducing uncertainty in the other factors, the majority of overall uncertainty remains. Thus, narrowing uncertainty in century-scale warming depends on narrowing uncertainty in all contributing factors. Our results indicate that the benefit of avoided climate damage from avoided CO2 emissions will be manifested within the lifetimes of the people who acted to avoid those emissions. While such avoidance could be expected to benefit future generations, there is potential for emissions avoidance to provide substantial benefit to current generations.

  10. Hydrologic indices for nontidal wetlands

    USGS Publications Warehouse

    Lent, Robert M.; Weiskel, Peter K.; Lyford, Forest P.; Armstrong, David S.

    1997-01-01

    Two sets of hydrologic indices were developed to characterize the water-budget components of nontidal wetlands. The first set consisted of six water-budget indices for input and output variables, and the second set consisted of two hydrologic interaction indices derived from the water-budget indices. The indices then were applied to 19 wetlands with previously published water-budget data. Two trilinear diagrams for each wetland were constructed, one for the three input indices and another for the three output indices. These two trilinear diagrams then were combined with a central quadrangle to form a Piper-type diagram, with data points from the trilinear diagrams projected onto the quadrangle. The quadrangle then was divided into nine fields that summarized the water-budget information. Two quantitative "interaction indices" were calculated from two of the six water-budget indices (precipitation and evapotranspiration). They also were obtained graphically from the water-budget indices, which were first projected to the central quadrangle of a Piper-type diagram from the flanking trilinear plots. The first interaction index (l) defines the strength of interaction between a wetland and the surrounding ground- and surface-water system. The second interaction index (S) defines the nature of the interaction between the wetland and the surrounding ground- and surface-water system (source versus sink). Evaluation of these indices using published wetland water-budget data illustrates the usefulness of the technique.

  11. Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates

    EPA Science Inventory

    Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...

  12. Capturing the complexity of uncertainty language to maximise its use.

    NASA Astrophysics Data System (ADS)

    Juanchich, Marie; Sirota, Miroslav

    2016-04-01

    Uncertainty is often communicated verbally, using uncertainty phrases such as 'there is a small risk of earthquake', 'flooding is possible' or 'it is very likely the sea level will rise'. Prior research has only examined a limited number of properties of uncertainty phrases: mainly the probability conveyed (e.g., 'a small chance' conveys a small probability whereas 'it is likely' conveys a high probability). We propose a new analytical framework that captures more of the complexity of uncertainty phrases by studying their semantic, pragmatic and syntactic properties. Further, we argue that the complexity of uncertainty phrases is functional and can be leveraged to best describe uncertain outcomes and achieve the goals of speakers. We will present findings from a corpus study and an experiment where we assessed the following properties of uncertainty phrases: probability conveyed, subjectivity, valence, nature of the subject, grammatical category of the uncertainty quantifier and whether the quantifier elicits a positive or a negative framing. Natural language processing techniques applied to corpus data showed that people use a very large variety of uncertainty phrases representing different configurations of the properties of uncertainty phrases (e.g., phrases that convey different levels of subjectivity, phrases with different grammatical constructions). In addition, the corpus analysis uncovered that the uncertainty phrases commonly studied in psychology are not the most commonly used in real life. In the experiment we manipulated the amount of evidence indicating that a fact was true and whether the participant was required to prove the fact was true or that it was false. Participants produced a phrase to communicate the likelihood that the fact was true (e.g., 'it is not sure…', 'I am convinced that…').
The analyses of the uncertainty phrases produced showed that participants leveraged the properties of uncertainty phrases to reflect the strength of evidence but also to achieve their personal goals. For example, participants aiming to prove that the fact was true chose words that conveyed a more positive polarity and a higher probability than participants aiming to prove that the fact was false. We discuss the utility of the framework for harnessing the properties of uncertainty phrases in geosciences.

  13. A geostatistical approach for quantification of contaminant mass discharge uncertainty using multilevel sampler measurements

    NASA Astrophysics Data System (ADS)

    Li, K. Betty; Goovaerts, Pierre; Abriola, Linda M.

    2007-06-01

    Contaminant mass discharge across a control plane downstream of a dense nonaqueous phase liquid (DNAPL) source zone has great potential to serve as a metric for the assessment of the effectiveness of source zone treatment technologies and for the development of risk-based source-plume remediation strategies. However, too often the uncertainty of mass discharge estimated in the field is not accounted for in the analysis. In this paper, a geostatistical approach is proposed to estimate mass discharge and to quantify its associated uncertainty using multilevel transect measurements of contaminant concentration (C) and hydraulic conductivity (K). The approach adapts the p-field simulation algorithm to propagate and upscale the uncertainty of mass discharge from the local uncertainty models of C and K. Application of this methodology to numerically simulated transects shows that, with a regular sampling pattern, geostatistics can provide an accurate model of uncertainty for the transects that are associated with low levels of source mass removal (i.e., transects that have a large percentage of contaminated area). For high levels of mass removal (i.e., transects with a few hot spots and large areas of near-zero concentration), a total sampling area equivalent to 6-7% of the transect is required to achieve accurate uncertainty modeling. A comparison of the results for different measurement supports indicates that samples taken with longer screen lengths may lead to less accurate models of mass discharge uncertainty. The quantification of mass discharge uncertainty, in the form of a probability distribution, will facilitate risk assessment associated with various remediation strategies.
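    The metric itself is simple even though its uncertainty model is not: mass discharge across the control plane is approximately the sum over sampled cells of concentration times Darcy flux times cell area. A sketch with hypothetical multilevel-sampler values, not the paper's data:

```python
# Hedged sketch of the mass-discharge metric (the paper's focus is its
# uncertainty): J ~= sum_i C_i * (K_i * i_h) * dA over sampled cells.
# All values below are hypothetical illustrative readings.
hydraulic_gradient = 0.005  # i_h, dimensionless
cell_area_m2 = 0.25         # dA, transect area per sampling point

# (concentration C in g/m^3, hydraulic conductivity K in m/d) per cell
samples = [(12.0, 8.0), (0.5, 15.0), (40.0, 3.0), (2.0, 10.0)]

mass_discharge = sum(c * k * hydraulic_gradient * cell_area_m2
                     for c, k in samples)  # grams per day
print(round(mass_discharge, 4))  # -> 0.3044
```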

  14. Predictive uncertainty in auditory sequence processing

    PubMed Central

    Hansen, Niels Chr.; Pearce, Marcus T.

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
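    The entropy computation underlying both the stimulus selection and the 'inferred uncertainty' measure above is standard Shannon entropy of a probability distribution over continuations; a minimal sketch with illustrative distributions:

```python
import math

def shannon_entropy(probs):
    # Shannon entropy in bits of a discrete probability distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative only: a "low-entropy" melodic context concentrates
# expectation on one continuation; a "high-entropy" context spreads it.
low_entropy_context = [0.85, 0.05, 0.05, 0.05]
high_entropy_context = [0.25, 0.25, 0.25, 0.25]

print(round(shannon_entropy(low_entropy_context), 2))   # -> 0.85
print(round(shannon_entropy(high_entropy_context), 2))  # -> 2.0
```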

  15. Effects of Environmental Conditions on an Urban Wetland's Methane Fluxes

    NASA Astrophysics Data System (ADS)

    Naor Azrieli, L.; Morin, T. H.; Bohrer, G.; Schafer, K. V.; Brooker, M.; Mitsch, W. J.

    2013-12-01

    Methane emissions from wetlands are the largest natural source of uncertainty in the global methane (CH4) budget. Wetlands are highly productive ecosystems with a large carbon sequestration potential. While wetlands are a net sink for carbon dioxide, they also release methane, a potent greenhouse gas. To develop effective wetland management techniques, it is important to properly calculate the carbon budget of wetlands by understanding the driving factors of methane fluxes. We constructed an eddy covariance flux system in the Olentangy River Wetland Research Park, a series of created and restored wetlands in Columbus, Ohio. Using high-frequency open-path infrared gas analyzer (IRGA) sensors, we have continuously monitored the wetland's methane fluxes since May 2011. To account for the heterogeneous landscape surrounding the tower, a footprint analysis was used to isolate data originating from within the wetland. Continuous measurements of meteorological and environmental conditions at the wetlands, coinciding with the flux measurements, allow the interactions between methane fluxes and climate and ecological forcing to be studied. In winter, the daily cycle of methane peaks around midday, a typical diurnal pattern for cold months. In summer, the peak shifts earlier, with a daily maximum occurring at approximately 10 AM. We believe this peak is associated with the onset of photosynthesis in Typha latifolia flushing methane from the plant's air-filled tissue. Methane fluxes correlate with latent heat flux, soil temperature, and incoming radiation. The connection to radiation may be further evidence of plant activity as a driver of methane fluxes, and higher methane fluxes on days with higher soil temperature indicate that warmer days stimulate the methanogenic consortium. Further analysis will focus on separating the methane fluxes into emissions from different terrain types within the wetland.
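
As a rough illustration of the footprint-based screening described above, the sketch below keeps only flux records whose wind direction places the source area over the wetland. The sector bounds and records are hypothetical, and a real footprint analysis also accounts for atmospheric stability, roughness, and measurement height; this shows only the simplest directional filter.

```python
def within_wetland(records, sector=(150.0, 260.0)):
    """Keep half-hourly flux records whose wind direction falls inside the
    sector that places the flux footprint over the wetland.
    The sector bounds are hypothetical and do not wrap through north."""
    lo, hi = sector
    return [r for r in records if lo <= r["wind_dir_deg"] <= hi]

records = [
    {"ch4_flux": 41.2, "wind_dir_deg": 200.0},  # footprint over the wetland
    {"ch4_flux": 12.7, "wind_dir_deg": 80.0},   # footprint over surrounding land
]
kept = within_wetland(records)  # only the first record survives the screen
```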

  16. A Web-based Tool to Aid the Identification of Chemicals Potentially Posing a Health Risk through Percutaneous Exposure.

    PubMed

    Gorman Ng, Melanie; Milon, Antoine; Vernez, David; Lavoué, Jérôme

    2016-04-01

    Occupational hygiene practitioners typically assess the risk posed by occupational exposure by comparing exposure measurements to regulatory occupational exposure limits (OELs). In most jurisdictions, OELs are available only for exposure by the inhalation pathway. Skin notations are used to indicate substances for which dermal exposure may lead to health effects. However, these notations are either present or absent and provide no indication of acceptable levels of exposure. Furthermore, the methodology and framework for assigning skin notations differ widely across jurisdictions, resulting in inconsistencies in the substances that carry notations. The UPERCUT tool was developed in response to these limitations. It helps occupational health stakeholders assess the hazard associated with dermal exposure to chemicals. UPERCUT integrates dermal quantitative structure-activity relationships (QSARs) and toxicological data to provide users with a skin hazard index, the dermal hazard ratio (DHR), for the substance and scenario of interest. The DHR is the ratio between the estimated 'received' dose and the 'acceptable' dose. The 'received' dose is estimated using physico-chemical data and information on the exposure scenario provided by the user (body parts exposed and exposure duration), and the 'acceptable' dose is estimated using inhalation OELs and toxicological data. The uncertainty surrounding the DHR is estimated with Monte Carlo simulation. Additional information on the selected substances includes the substance's intrinsic skin permeation potential and the existence of skin notations. UPERCUT is the only available tool that estimates the absorbed dose and compares it to an acceptable dose. In the absence of dermal OELs, it provides a systematic and simple approach for screening dermal exposure scenarios for 1686 substances. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
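
The DHR calculation described above can be sketched as a simple Monte Carlo simulation. The distributional form and every number below are illustrative assumptions, not UPERCUT's actual implementation, which derives the received dose from QSAR-based permeation estimates and the acceptable dose from inhalation OELs and toxicological data.

```python
import random

def simulate_dhr(received_dose_mean, received_dose_sd,
                 acceptable_dose, n=10_000, seed=1):
    """Monte Carlo sketch of a dermal hazard ratio (DHR): the estimated
    'received' dose over the 'acceptable' dose, with hypothetical Gaussian
    uncertainty on the received-dose estimate (doses truncated at zero)."""
    rng = random.Random(seed)
    ratios = [max(rng.gauss(received_dose_mean, received_dose_sd), 0.0)
              / acceptable_dose for _ in range(n)]
    ratios.sort()
    return {
        "median": ratios[n // 2],             # central DHR estimate
        "p95": ratios[int(0.95 * n)],         # upper tail of the uncertainty
        "fraction_above_1": sum(r > 1 for r in ratios) / n,
    }

# Hypothetical scenario: received dose 12 ± 4 units vs. an acceptable dose of 10.
summary = simulate_dhr(received_dose_mean=12.0, received_dose_sd=4.0,
                       acceptable_dose=10.0)
# A DHR distribution sitting largely above 1 would flag the scenario for review.
```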

  17. Biological Responses of the Coral Montastraea annularis to the Removal of Filamentous Turf Algae

    PubMed Central

    Cetz-Navarro, Neidy P.; Espinoza-Avalos, Julio; Hernández-Arana, Héctor A.; Carricart-Ganivet, Juan P.

    2013-01-01

    Coral reef degradation increases coral interactions with filamentous turf algae (FTA) and macroalgae, which may result in chronic stress for the corals. We evaluated the effects of short (2.5 month) and long (10 month) periods of FTA removal on tissue thickness (TT), zooxanthellae density (ZD), mitotic index (MI), and concentration of chlorophyll a (Chl a) in Montastraea annularis at the beginning and end of gametogenesis. Ramets (individual lobes within a colony) consistently surrounded by FTA and ramets surrounded by crustose coralline algae (CCA) were used as controls. FTA removal reduced coral stress, indicated by increased TT and ZD and lower MI. The measured effects were similar in magnitude for the short and long periods of algal removal. Ramets were more stressed at the end of gametogenesis than at the beginning, with lower ZD and Chl a cm−2 and higher MI. However, it was not possible to distinguish the stress caused by the presence of FTA from that caused by seasonal changes in seawater temperature. Ramets surrounded by CCA showed less stress than ramets surrounded by FTA, with higher TT, Chl a cm−2, and ZD, and lower MI values. Coral responses indicated that ramets with FTA suffered the most deleterious effects, contrasting with those measured in ramets surrounded by CCA. According to published studies and our observations, there may be at least six mechanisms by which FTA cause stress to M. annularis. Owing to the high cover of FTA (in contrast to macroalgae and CCA) in the Caribbean, and to the chronic stress, overgrowth, and mortality that this functional algal group can cause in the M. annularis species complex, a further decline of this important reef-building coral in the Caribbean is expected. PMID:23372774

  18. Degradation of Potassium Rock by Earthworms and Responses of Bacterial Communities in Its Gut and Surrounding Substrates after Being Fed with Mineral

    PubMed Central

    Liu, Dianfeng; Lian, Bin; Wang, Bin; Jiang, Guofang

    2011-01-01

    Background Earthworms are ecosystem engineers, contributing to a wide range of nutrient cycling and geochemical processes in ecosystems. Their activities can increase rates of silicate mineral weathering. Their intestinal microbes are generally thought to be one of the key drivers of earthworm-mediated mineral degradation, but the diversity of the intestinal microorganisms relevant to mineral weathering remains unclear. Methodology/Principal Findings In this report, we show earthworms' effect on silicate mineral weathering and the responses of bacterial communities in their gut and surrounding substrates after the earthworms were fed potassium-bearing rock powder (PBRP). Determination of water-soluble and HNO3-extractable elements indicated that elements such as Al, Fe, and Ca were significantly released from the mineral during digestion by the earthworms. The microbial communities in the earthworms' gut and the surrounding substrates were investigated by amplified ribosomal DNA restriction analysis (ARDRA); the results showed a higher bacterial diversity both in the guts of earthworms fed with PBRP and in the PBRP after being fed to earthworms. A UPGMA dendrogram with unweighted UniFrac analysis, which considers only the taxa present, revealed that the earthworms' gut and their surrounding substrate shared similar microbiota. A UPGMA dendrogram with weighted UniFrac, which considers the relative abundance of microbial lineages, showed that the two samples from the surrounding substrate and the two samples from the earthworms' gut each had similar microbial communities. Conclusions/Significance Our results indicated that earthworms can accelerate the degradation of silicate minerals. Earthworms play an important role in ecosystem processes, since they not only have positive effects on soil structure but also promote nutrient cycling by enhancing the weathering of minerals. PMID:22174903

  19. A Learning Model for L/M Specificity in Ganglion Cells

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J.

    2016-01-01

    An unsupervised learning model for developing L/M-specific wiring at the ganglion cell level would support the research indicating that such cone-type-specific wiring exists (Reid and Shapley, 2002). Removing the contributions to the surround from cells of the same cone type as the center improves the signal-to-noise ratio of the chromatic signals. The unsupervised learning model used is Hebbian associative learning, which strengthens the surround input connections according to the correlation of the output with the input. Since the surround units of the same cone type as the center are redundant with the center, their weights end up disappearing. This process can be thought of as a general mechanism for eliminating unnecessary cells in the nervous system.
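
The learning dynamics described above can be illustrated with a toy simulation. This is a minimal, hypothetical reading of the model, not the paper's implementation: it assumes an L-cone center, inhibitory surround weights bounded for stability, a purely chromatic stimulus, and illustrative noise levels and learning rate. Under the Hebbian update, the inhibitory weight from the redundant same-cone surround unit (positively correlated with the output) is pushed toward zero, while the opposite-cone inhibition is retained.

```python
import random

def train_surround_weights(n_steps=2000, eta=0.002, seed=0):
    """Hebbian sketch: an L-center ganglion cell with inhibitory surround
    inputs from one L cone and one M cone. Each weight is updated in
    proportion to the output-input correlation and clamped to [-1, 0]
    (inhibitory, bounded) purely as a stability assumption."""
    rng = random.Random(seed)
    w_same, w_opp = -0.5, -0.5            # initial inhibitory weights (L, M surround)
    for _ in range(n_steps):
        c = rng.gauss(0, 1)               # chromatic (L-M) component of the stimulus
        x_same = c + rng.gauss(0, 0.2)    # L surround input: redundant with the L center
        x_opp = -c + rng.gauss(0, 0.2)    # M surround input: anti-correlated chromatic signal
        y = c + w_same * x_same + w_opp * x_opp   # center drive plus surround inhibition
        w_same = min(0.0, max(-1.0, w_same + eta * y * x_same))
        w_opp = min(0.0, max(-1.0, w_opp + eta * y * x_opp))
    return w_same, w_opp

w_same, w_opp = train_surround_weights()
# the same-cone surround weight is driven toward 0 (it "disappears"),
# while the opposite-cone inhibitory weight is retained
```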

  20. Decision for disclosure: The experiences of Iranian infertile couples undergoing assisted reproductive donation procedures.

    PubMed

    Hadizadeh-Talasaz, Fatemeh; Roudsari, Robab Latifnejad; Simbar, Masoumeh

    2015-01-01

    Controversy surrounding disclosure among recipients of assisted reproductive donation procedures is escalating worldwide, but little research has been conducted on this topic. The purpose of this qualitative study was to explore the experiences of infertile couples undergoing assisted reproductive donation procedures. In this exploratory qualitative study, 32 patients (nine couples and 14 women) who were candidates to use donor eggs, donor embryos, or surrogacy, and five members of the infertility treatment team, including gynaecologists, midwives, and a psychologist (37 participants in total), were purposively selected from the Montaserieh Infertility Research Centre in Mashhad, Iran in 2012 and interviewed using semi-structured in-depth interviews. Data were analysed using conventional qualitative content analysis with MAXqda software. One overarching theme, entitled 'experiencing uncertainty surrounding the disclosure to others', was identified from the data. This theme contained two subthemes: 'Couples' decisions not to disclose to others' and 'Couples' decisions to disclose to others'. Five categories formed the first subtheme, and the second subtheme emerged from four categories, which are discussed in this paper. The main reason for secrecy was concern over society's negative views of assisted reproductive donation procedures. This worry deprived the couples of support from family and friends and, as a result, required them to tolerate psychological pressure when using such procedures.
