Sample records for addressing scientific uncertainties

  1. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  2. Public perception and communication of scientific uncertainty.

    PubMed

    Broomell, Stephen B; Kane, Patrick Bodilly

    2017-02-01

    Understanding how the public perceives uncertainty in scientific research is fundamental for effective communication about research and its inevitable uncertainty. Previous work found that scientific evidence differentially influenced beliefs from individuals with different political ideologies. Evidence that threatens an individual's political ideology is perceived as more uncertain than nonthreatening evidence. The authors present 3 studies examining perceptions of scientific uncertainty more broadly by including sciences that are not politically polarizing. Study 1 develops scales measuring perceptions of scientific uncertainty. It finds (a) 3 perceptual dimensions of scientific uncertainty, with the primary dimension representing a perception of precision; (b) the precision dimension of uncertainty is strongly associated with the perceived value of a research field; and (c) differences in perceived uncertainty across political affiliations. Study 2 manipulated these dimensions, finding that Republicans were more sensitive than Democrats to descriptions of uncertainty associated with a research field (e.g., psychology). Study 3 found that these views of a research field did not extend to the evaluation of individual results produced by the field. Together, these studies show that perceptions of scientific uncertainty associated with entire research fields are valid predictors of abstract perceptions of scientific quality, benefit, and allocation of funding. Yet, they do not inform judgments about individual results. Therefore, polarization in the acceptance of specific results is not likely due to individual differences in perceived scientific uncertainty. Further, the direction of influence potentially could be reversed, such that perceived quality of scientific results could be used to influence perceptions about scientific research fields. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  4. Communication about scientific uncertainty in environmental nanoparticle research - a comparison of scientific literature and mass media

    NASA Astrophysics Data System (ADS)

    Heidmann, Ilona; Milde, Jutta

    2014-05-01

    Research on the fate and behavior of engineered nanoparticles in the environment is, despite their wide applications, still in its early stages. 'There is a high level of scientific uncertainty in nanoparticle research' is a statement often heard in the scientific community. Knowledge about these uncertainties might be of interest to other scientists, experts, and laypeople. But how can these uncertainties be characterized, and are they communicated within the scientific literature and the mass media? To answer these questions, the current state of scientific knowledge about scientific uncertainty was characterized through the example of environmental nanoparticle research, and the communication of these uncertainties within the scientific literature was compared with media coverage in the field of nanotechnologies. The scientific uncertainty in the field of the environmental fate of nanoparticles is characterized by method uncertainties, by a general lack of data concerning the fate, effects, and mechanisms of nanoparticles in the environment, and by the uncertain transferability of results to the environmental system. In the scientific literature, scientific uncertainties, their sources, and their consequences are mentioned with different foci and to different extents. As expected, the authors of research papers focus on the certainty of specific results within their specific research question, whereas in review papers the uncertainties due to a general lack of data are emphasized and their sources and consequences are discussed in a broader environmental context. In the mass media, nanotechnology is often framed as rather certain, and positive aspects and benefits are emphasized. Although the reports cover a new technology, scientific uncertainties are mentioned in only one-third of them. Scientific uncertainties are most often mentioned together with risk, and they arise primarily from unknown harmful effects on human health. Environmental issues themselves are seldom mentioned…

  5. Addressing uncertainty in vulnerability assessments [Chapter 5]

    Treesearch

    Linda Joyce; Molly Cross; Evan Girvatz

    2011-01-01

    This chapter addresses issues and approaches for dealing with uncertainty specifically within the context of conducting climate change vulnerability assessments (i.e., uncertainties related to identifying and modeling the sensitivities, levels of exposure, and adaptive capacity of the assessment targets).

  6. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to…

  7. Scientific uncertainty in media content: Introduction to this special issue.

    PubMed

    Peters, Hans Peter; Dunwoody, Sharon

    2016-11-01

    This introduction sets the stage for the special issue on the public communication of scientific uncertainty that follows by sketching the wider landscape of issues related to the communication of uncertainty and showing how the individual contributions fit into that landscape. The first part of the introduction discusses the creation of media content as a process involving journalists, scientific sources, stakeholders, and the responsive audience. The second part then provides an overview of the perception of scientific uncertainty presented by the media and the consequences for the recipients' own assessments of uncertainty. Finally, we briefly describe the six research articles included in this special issue. © The Author(s) 2016.

  8. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  9. Not Normal: the uncertainties of scientific measurements

    PubMed Central

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student’s t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply. PMID:28280557

  10. Not Normal: the uncertainties of scientific measurements

    NASA Astrophysics Data System (ADS)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
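
    The scale of the heavy-tail effect reported above is easy to reproduce. The sketch below uses only standard-library Python, with the standard Cauchy distribution standing in for the near-Cauchy Student's t-distributions the study reports; it compares how often a deviation beyond 5σ occurs under a Normal versus a Cauchy model:

```python
import math

def normal_sf(x: float) -> float:
    """Survival function P(X > x) for a standard Normal."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def cauchy_sf(x: float) -> float:
    """Survival function P(X > x) for a standard Cauchy
    (the df -> 1 limit of Student's t)."""
    return 0.5 - math.atan(x) / math.pi

x = 5.0  # a "5-sigma"-scale deviation
p_normal = normal_sf(x)
p_cauchy = cauchy_sf(x)

print(f"Normal tail beyond {x}: {p_normal:.3e}")
print(f"Cauchy tail beyond {x}: {p_cauchy:.3e}")
print(f"Ratio: {p_cauchy / p_normal:.0f}x more frequent")
```

    With these formulas the ratio comes out near 2 × 10^5, i.e. roughly five orders of magnitude, consistent with the abstract's statement that 5σ disagreements can be that much more frequent than naively expected under a Gaussian model.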

  11. A Practical Approach to Address Uncertainty in Stakeholder Deliberations.

    PubMed

    Gregory, Robin; Keeney, Ralph L

    2017-03-01

    This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decisionmakers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
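
    The certainty-equivalent approach the authors recommend can be illustrated with a toy lottery. The exponential utility function, the risk-tolerance value, and the dollar figures below are illustrative assumptions, not taken from the article:

```python
import math

def certainty_equivalent(outcomes, probs, risk_tolerance):
    """Certainty equivalent of a discrete lottery under an
    exponential utility u(x) = -exp(-x / risk_tolerance)."""
    expected_utility = sum(p * -math.exp(-x / risk_tolerance)
                           for x, p in zip(outcomes, probs))
    # Invert the utility function at the expected utility.
    return -risk_tolerance * math.log(-expected_utility)

# Hypothetical consequence estimates for one alternative:
# a 50/50 lottery over a good and a poor outcome (in $M).
outcomes, probs = [100.0, 20.0], [0.5, 0.5]

ev = sum(p * x for x, p in zip(outcomes, probs))
ce = certainty_equivalent(outcomes, probs, risk_tolerance=50.0)
print(f"Expected value:       {ev:.1f}")
print(f"Certainty equivalent: {ce:.1f}")
```

    For a risk-averse stakeholder (positive risk tolerance) the certainty equivalent falls below the expected value; that single defensible number is what lets deliberations compare uncertain multiple-objective alternatives directly.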

  12. Scientific Uncertainty and Its Relevance to Science Education

    ERIC Educational Resources Information Center

    Ruggeri, Nancy Lee

    2011-01-01

    Uncertainty is inherent to scientific methods and practices, yet it is rarely explicitly discussed in science classrooms. Ironically, science is often equated with "certainty" in these contexts. Uncertainties that arise in science deserve special attention, as they are increasingly a part of public discussions and are susceptible to manipulation…

  13. Addressing uncertainty in atomistic machine learning.

    PubMed

    Peterson, Andrew A; Christensen, Rune; Khorshidi, Alireza

    2017-05-10

    Machine-learning regression has been demonstrated to precisely emulate the potential energy and forces that are output from more expensive electronic-structure calculations. However, to predict new regions of the potential energy surface, an assessment must be made of the credibility of the predictions. In this perspective, we address the types of errors that might arise in atomistic machine learning and the unique aspects of atomistic simulations that make machine learning challenging, and we highlight how uncertainty analysis can be used to assess the validity of machine-learning predictions. We suggest this will allow researchers to more fully use machine learning for the routine acceleration of large, high-accuracy, or extended-time simulations. In our demonstrations, we use a bootstrap ensemble of neural network-based calculators, and show that the width of the ensemble can provide an estimate of the uncertainty when the width is comparable to that in the training data. Intriguingly, we also show that the uncertainty can be localized to specific atoms in the simulation, which may offer hints for the generation of training data to strategically improve the machine-learned representation.
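
    The bootstrap-ensemble idea generalizes beyond neural-network calculators. The sketch below substitutes ordinary least-squares lines for the expensive models; the data and ensemble size are illustrative, but the mechanism is the same: the spread among members trained on resampled data serves as the uncertainty estimate, and it grows once predictions leave the region covered by the training data:

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

random.seed(0)
xs = [i / 10 for i in range(11)]                 # training region: [0, 1]
ys = [2 * x + random.gauss(0, 0.2) for x in xs]  # noisy linear "truth"

# Bootstrap ensemble: refit the cheap model on resampled training sets.
ensemble = []
for _ in range(200):
    idx = [random.randrange(len(xs)) for _ in xs]
    ensemble.append(fit_line([xs[i] for i in idx], [ys[i] for i in idx]))

def spread(x):
    """Ensemble standard deviation of predictions at x."""
    preds = [a + b * x for a, b in ensemble]
    m = sum(preds) / len(preds)
    return (sum((p - m) ** 2 for p in preds) / len(preds)) ** 0.5

print(f"uncertainty inside training data (x=0.5): {spread(0.5):.3f}")
print(f"uncertainty when extrapolating  (x=5.0): {spread(5.0):.3f}")
```

    The ensemble spread is much larger at x=5.0 than at x=0.5, mirroring the paper's point that predictions in unexplored regions of the potential energy surface deserve wider error bars.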

  14. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the "wobble" and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by four-part prompts consisting of a multiple-choice claim, an open-ended explanation, a Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of the task, such as the use of noisy data or the framing of…

  15. Risk, Scientific Uncertainty, and Policy Implications of Global Climate Change Models

    NASA Astrophysics Data System (ADS)

    Briggs, C.; Sahagian, D.

    2006-12-01

    The risks of global climate change to human populations and natural environments have received increasing attention in recent years. With high-profile events such as hurricane Katrina in the United States, rapid melting of the Greenland ice sheet, shifting precipitation patterns in Europe and elsewhere, more political attention has been given to the risks posed by anthropogenic changes in the earth's atmosphere. Yet despite increasing scientific evidence of such environmental risks, reactions from political sources have been far from consistent. While some states have adopted emissions regulations on greenhouse gases, other states or national governments have downplayed the existence of any significant risk. Explanations for why political actors or the public may appear unaware of scientific data relate to the nature of uncertainty in environmental risk models and decisions. Professional scientific methodologies must approach uncertainty in a far different manner than government agencies or members of the public, and these varying types of uncertainty create spaces for translation of scientific data into incompatible conclusions. Such conclusions depend not only upon the translation of scientific data, but also perception of the risks involved, differential local impacts of climate change, and available policy alternatives and resources. Scientists involved in climate research bear a particular responsibility for how their data are interpreted politically, but this requires awareness of the manners in which uncertainty is employed, the ethics of applying research to policy questions, and realization that risks will be perceived differently according to political cultures and geographic regions.

  16. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. The survey was conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers' data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open…

  17. Addressing submarine geohazards through scientific drilling

    NASA Astrophysics Data System (ADS)

    Camerlenghi, A.

    2009-04-01

    Natural submarine geohazards (earthquakes, volcanic eruptions, landslides, volcanic island flank collapses) are geological phenomena originating at or below the seafloor leading to a situation of risk for off-shore and on-shore structures and the coastal population. Addressing submarine geohazards means understanding their spatial and temporal variability, the pre-conditioning factors, their triggers, and the physical processes that control their evolution. Such scientific endeavour is nowadays considered by a large sector of the international scientific community as an obligation in order to contribute to the mitigation of the potentially destructive societal effects of submarine geohazards. The study of submarine geohazards requires a multi-disciplinary scientific approach: geohazards must be studied through their geological record; active processes must be monitored; geohazard evolution must be modelled. Ultimately, the information must be used for the assessment of vulnerability, risk analysis, and development of mitigation strategies. In contrast with the terrestrial environment, the oceanic environment is rather hostile to widespread and fast application of high-resolution remote sensing techniques, accessibility for visual inspection, sampling and installation of monitoring stations. Scientific Drilling through the IODP (including the related pre site-survey investigations, sampling, logging and in situ measurements capability, and as a platform for deployment of long term observatories at the surface and down-hole) can be viewed as the centre of gravity of an international, coordinated, multi-disciplinary scientific approach to address submarine geohazards. The IODP Initial Science Plan expiring in 2013 does not address openly geohazards among the program scientific objectives. Hazards are referred to mainly in relation to earthquakes and initiatives towards the understanding of seismogenesis. Notably, the only drilling initiative presently under way is the…

  18. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    NASA Astrophysics Data System (ADS)

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  19. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also has…
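
    The composite-factor arithmetic common to these assessments is simple division of a point of departure by the product of the applicable uncertainty factors. A minimal sketch, with wholly hypothetical point-of-departure and factor values:

```python
# Hypothetical derivation of an occupational exposure limit (OEL):
# divide a point of departure (POD) by the product of the applicable
# uncertainty factors (UFs). All numbers below are illustrative only.

pod_mg_m3 = 150.0  # e.g. a no-effect level from a subchronic animal study

uncertainty_factors = {
    "interspecies (animal -> human)": 3.0,
    "intraspecies (human variability)": 10.0,
    "subchronic -> chronic duration": 2.0,
}

composite_uf = 1.0
for reason, uf in uncertainty_factors.items():
    composite_uf *= uf

oel = pod_mg_m3 / composite_uf
print(f"Composite UF: {composite_uf:g}")
print(f"OEL: {oel:g} mg/m^3")
```

    As the abstract notes, real assessments increasingly replace such defaults with chemical-specific adjustment factors; the arithmetic stays the same, only the factor values become data-driven.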

  20. A review of contemporary methods for the presentation of scientific uncertainty.

    PubMed

    Makinson, K A; Hamby, D M; Edwards, J A

    2012-12-01

    Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots,robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.
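
    Of the standard statistical methods listed, the box-and-whisker plot compresses a sample into a five-number summary plus Tukey fences for flagging outliers. A minimal stdlib-Python sketch of that reduction (the sample values are made up):

```python
def five_number_summary(data):
    """Five-number summary plus Tukey-fence outliers, as used for
    a box-and-whisker display of uncertainty in a sample."""
    s = sorted(data)
    n = len(s)

    def quantile(q):
        # Linear interpolation between adjacent order statistics.
        pos = q * (n - 1)
        lo, frac = int(pos), pos - int(pos)
        return s[lo] if frac == 0 else s[lo] * (1 - frac) + s[lo + 1] * frac

    q1, med, q3 = quantile(0.25), quantile(0.5), quantile(0.75)
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [x for x in s if x < lo_fence or x > hi_fence]
    return {"min": s[0], "q1": q1, "median": med, "q3": q3,
            "max": s[-1], "outliers": outliers}

# Eight repeated measurements of the same quantity, one discrepant.
sample = [9.8, 10.1, 10.0, 9.9, 10.2, 10.1, 9.7, 14.0]
print(five_number_summary(sample))
```

    The discrepant measurement lands outside the upper fence and would be drawn as an individual point beyond the whisker, which is exactly the information a reader needs to judge whether reported uncertainties are trustworthy.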

  1. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.

  2. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality, not the uncertainties over how long the protective effects last. Advocates of colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would never become cancerous even if left in place. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter…

  3. Measuring the perceived uncertainty of scientific evidence and its relationship to engagement with science.

    PubMed

    Retzbach, Joachim; Otto, Lukas; Maier, Michaela

    2016-08-01

    Many scholars have argued for the need to communicate openly not only scientific successes to the public but also limitations, such as the tentativeness of research findings, in order to enhance public trust and engagement. Yet, it has not been quantitatively assessed how the perception of scientific uncertainties relates to engagement with science on an individual level. In this article, we report the development and testing of a new questionnaire in English and German measuring the perceived uncertainty of scientific evidence. Results indicate that the scale is reliable and valid in both language versions and that its two subscales are differentially related to measures of engagement: Science-friendly attitudes were positively related only to 'subjectively' perceived uncertainty, whereas interest in science as well as behavioural engagement actions and intentions were largely uncorrelated. We conclude that perceiving scientific knowledge to be uncertain is only weakly, but positively related to engagement with science. © The Author(s) 2015.

  4. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
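    The in-situ documentation mechanism proposed in this record can be pictured as attaching an uncertainty annotation to every model input at its point of declaration. The sketch below is purely illustrative (the `UncertainParam` class and its fields are hypothetical, not from the paper); it shows how inputs that lack a documented uncertainty can be flagged, since for those only sensitivities, not output uncertainties, can be reported.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class UncertainParam:
        """A model input documented, in situ, with its uncertainty and provenance."""
        name: str
        value: float
        sigma: Optional[float] = None   # one-sigma uncertainty; None = undocumented
        source: str = "unspecified"

        def known(self) -> bool:
            return self.sigma is not None

    # Inputs declared where they are used, each carrying its own documentation.
    viscosity = UncertainParam("viscosity", 1.8e-5, sigma=0.1e-5, source="handbook value")
    wall_temp = UncertainParam("wall_temp", 300.0)  # uncertainty never characterized

    inputs = [viscosity, wall_temp]
    undocumented = [p.name for p in inputs if not p.known()]
    # Without input uncertainties, only sensitivities can be reported for these:
    print(undocumented)
    ```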

  5. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  6. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information.

    PubMed

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople's understanding of the uncertainty of scientific information, with focus on the detection of tentativeness and perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, implicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to be contradictory to large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should

  7. Addressing Unconscious Bias: Steps toward an Inclusive Scientific Culture

    NASA Astrophysics Data System (ADS)

    Stewart, Abigail

    2011-01-01

    In this talk I will outline the nature of unconscious bias, as it operates to exclude or marginalize some participants in the scientific community. I will show how bias results from non-conscious expectations about certain groups of people, including scientists and astronomers. I will outline scientific research in psychology, sociology and economics that has identified the impact these expectations have on interpersonal judgments that are at the heart of assessment of individuals' qualifications. This research helps us understand not only how bias operates within a single instance of evaluation, but how evaluation bias can accumulate over a career if not checked, creating an appearance of confirmation of biased expectations. Some research has focused on how best to interrupt and mitigate unconscious bias, and many institutions--including the University of Michigan--have identified strategic interventions at key points of institutional decision-making (particularly hiring, annual review, and promotion) that can make a difference. The NSF ADVANCE Institutional Transformation program encouraged institutions to draw on the social science literature to create experimental approaches to addressing unconscious bias. I will outline four approaches to intervention that have arisen through the ADVANCE program: (1) systematic education that increases awareness among decisionmakers of how evaluation bias operates; (2) development of practices that mitigate the operation of bias even when it is out of conscious awareness; (3) creation of institutional policies that routinize and sanction these practices; and (4) holding leaders accountable for the implementation of these new practices and policies. Although I will focus on ways to address unconscious bias within scientific institutions (colleges and universities, laboratories and research centers, etc.), I will close by considering how scientific organizations can address unconscious bias and contribute to creating an

  8. Addressing Uncertainty in the ISCORS Multimedia Radiological Dose Assessment of Municipal Sewage Sludge and Ash

    NASA Astrophysics Data System (ADS)

    Chiu, W. A.; Bachmaier, J.; Bastian, R.; Hogan, R.; Lenhart, T.; Schmidt, D.; Wolbarst, A.; Wood, R.; Yu, C.

    2002-05-01

    Managing municipal wastewater at publicly owned treatment works (POTWs) leads to the production of considerable amounts of residual solid material, which is known as sewage sludge or biosolids. If the wastewater entering a POTW contains radioactive material, then the treatment process may concentrate radionuclides in the sludge, leading to possible exposure of the general public or the POTW workers. The Sewage Sludge Subcommittee of the Interagency Steering Committee on Radiation Standards (ISCORS), which consists of representatives from the Environmental Protection Agency, the Nuclear Regulatory Commission, the Department of Energy, and several other federal, state, and local agencies, is developing guidance for POTWs on the management of sewage sludge that may contain radioactive materials. As part of this effort, they are conducting an assessment of potential radiation exposures using the Department of Energy's RESidual RADioactivity (RESRAD) family of computer codes developed by Argonne National Laboratory. This poster describes several approaches used by the Subcommittee to address the uncertainties associated with their assessment. For instance, uncertainties in the source term are addressed through a combination of analytic and deterministic computer code calculations. Uncertainties in the exposure pathways are addressed through the specification of a number of hypothetical scenarios, some of which can be scaled to address changes in exposure parameters. In addition, uncertainties in some physical and behavioral parameters are addressed through probabilistic methods.
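    The probabilistic treatment of physical and behavioral parameters described above is typically a Monte Carlo propagation: sample each uncertain parameter from an assumed distribution and accumulate the distribution of the resulting dose. A minimal sketch, with a hypothetical three-factor pathway and illustrative lognormal spreads (none of the values are taken from the ISCORS assessment):

    ```python
    import random
    import statistics

    random.seed(42)

    def sample_dose() -> float:
        # Hypothetical exposure pathway: dose = concentration x intake x coefficient.
        # Lognormal draws stand in for the parameter uncertainty distributions.
        concentration = random.lognormvariate(0.0, 0.5)   # activity in sludge (illustrative)
        intake = random.lognormvariate(-1.0, 0.3)         # exposure/intake factor
        coefficient = random.lognormvariate(-2.0, 0.4)    # dose per unit intake
        return concentration * intake * coefficient

    doses = sorted(sample_dose() for _ in range(10_000))
    median = statistics.median(doses)
    p95 = doses[int(0.95 * len(doses))]
    print(f"median dose {median:.3g}, 95th percentile {p95:.3g} (arbitrary units)")
    ```

    Reporting a high percentile alongside the median conveys how much the parameter uncertainty widens the plausible dose range.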

  9. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  10. Exploring Scientific Information for Policy Making under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Forni, L.; Galaitsi, S.; Mehta, V. K.; Escobar, M.; Purkey, D. R.; Depsky, N. J.; Lima, N. A.

    2016-12-01

    Each actor evaluating potential management strategies brings her/his own distinct set of objectives to a complex decision space of system uncertainties. The diversity of these objectives requires detailed and rigorous analyses that respond to multifaceted challenges. However, the utility of this information depends on the accessibility of scientific information to decision makers. This paper demonstrates data visualization tools for presenting scientific results to decision makers in two case studies, La Paz/El Alto, Bolivia, and Yuba County, California. Visualization output from the case studies combines spatiotemporal, multivariate and multirun/multiscenario information to produce information corresponding to the objectives defined by key actors and stakeholders. These tools can manage complex data and distill scientific information into accessible formats. Using the visualizations, scientists and decision makers can navigate the decision space and potential objective trade-offs to facilitate discussion and consensus building. These efforts can support the identification of stable negotiated agreements between different stakeholders.

  11. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information

    PubMed Central

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople’s understanding of the uncertainty of scientific information, with focus on the detection of tentativeness and perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, implicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to be contradictory to large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should

  12. Provider Recommendations in the Face of Scientific Uncertainty: An Analysis of Audio-Recorded Discussions about Vitamin D.

    PubMed

    Tarn, Derjung M; Paterniti, Debora A; Wenger, Neil S

    2016-08-01

    Little is known about how providers communicate recommendations when scientific uncertainty exists. To compare provider recommendations to those in the scientific literature, with a focus on whether uncertainty was communicated. Qualitative (inductive systematic content analysis) and quantitative analysis of previously collected audio-recorded provider-patient office visits. Sixty-one providers and a socio-economically diverse convenience sample of 603 of their patients from outpatient community- and academic-based primary care, integrative medicine, and complementary and alternative medicine provider offices in Southern California. Comparison of provider information-giving about vitamin D to professional guidelines and scientific information for which conflicting recommendations or insufficient scientific evidence exists; certainty with which information was conveyed. Ninety-two (15.3 %) of 603 visit discussions touched upon issues related to vitamin D testing, management and benefits. Vitamin D deficiency screening was discussed with 23 (25 %) patients, the definition of vitamin D deficiency with 21 (22.8 %), the optimal range for vitamin D levels with 26 (28.3 %), vitamin D supplementation dosing with 50 (54.3 %), and benefits of supplementation with 46 (50 %). For each of the professional guidelines/scientific information examined, providers conveyed information that deviated from professional guidelines and the existing scientific evidence. Of 166 statements made about vitamin D in this study, providers conveyed 160 (96.4 %) with certainty, without mention of any equivocal or contradictory evidence in the scientific literature. No uncertainty was mentioned when vitamin D dosing was discussed, even when recommended dosing was higher than guideline recommendations. Providers convey the vast majority of information and recommendations about vitamin D with certainty, even though the scientific literature contains inconsistent recommendations and

  13. Addressing location uncertainties in GPS-based activity monitoring: A methodological framework

    PubMed Central

    Wan, Neng; Lin, Ge; Wilson, Gaines J.

    2016-01-01

    Location uncertainty has been a major barrier in information mining from location data. Although the development of electronic and telecommunication equipment has led to an increased amount and refined resolution of data about individuals’ spatio-temporal trajectories, the potential of such data, especially in the context of environmental health studies, has not been fully realized due to the lack of methodology that addresses location uncertainties. This article describes a methodological framework for deriving information about people’s continuous activities from individual-collected Global Positioning System (GPS) data, which is vital for a variety of environmental health studies. This framework is composed of two major methods that address critical issues at different stages of GPS data processing: (1) a fuzzy classification method for distinguishing activity patterns; and (2) a scale-adaptive method for refining activity locations and outdoor/indoor environments. Evaluation of this framework based on smartphone-collected GPS data indicates that it is robust to location errors and is able to generate useful information about individuals’ life trajectories. PMID:28943777
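    Fuzzy classification of activity patterns, as named in this abstract, commonly assigns each GPS fix a graded membership in several activity classes rather than a hard label. The sketch below is an assumption-laden illustration (trapezoidal membership functions over speed, with invented thresholds; the paper's actual rules and features differ):

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def classify_speed(speed_ms):
        # Hypothetical membership functions over speed in m/s.
        memberships = {
            "stationary": trapezoid(speed_ms, -1.0, -0.5, 0.3, 0.8),
            "walking":    trapezoid(speed_ms, 0.3, 0.8, 2.0, 3.0),
            "vehicle":    trapezoid(speed_ms, 2.0, 3.0, 40.0, 60.0),
        }
        # Return the best label plus the full membership vector; the graded
        # memberships are what make the classification robust to GPS noise.
        return max(memberships, key=memberships.get), memberships

    label, degrees = classify_speed(1.2)
    print(label, degrees)
    ```

    Because a noisy fix near a boundary receives partial membership in two classes, downstream smoothing can resolve it using neighboring fixes instead of committing to a wrong hard label.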

  14. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    How will NIOSH address uncertainty about dose levels? 82.19 Section 82.19 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER...

  15. Do systematic reviews address community healthcare professionals' wound care uncertainties? Results from evidence mapping in wound care.

    PubMed

    Christie, Janice; Gray, Trish A; Dumville, Jo C; Cullum, Nicky A

    2018-01-01

    Complex wounds such as leg and foot ulcers are common, resource intensive and have negative impacts on patients' wellbeing. Evidence-based decision-making, substantiated by high quality evidence such as from systematic reviews, is widely advocated for improving patient care and healthcare efficiency. Consequently, we set out to classify and map the extent to which up-to-date systematic reviews containing robust evidence exist for wound care uncertainties prioritised by community-based healthcare professionals. We asked healthcare professionals to prioritise uncertainties based on complex wound care decisions, and then classified 28 uncertainties according to the type and level of decision. For each uncertainty, we searched for relevant systematic reviews. Two independent reviewers screened abstracts and full texts of reviews against the following criteria: meeting an a priori definition of a systematic review, sufficiently addressing the uncertainty, published during or after 2012, and identifying high quality research evidence. The most common uncertainty type was 'interventions' 24/28 (85%); the majority concerned wound level decisions 15/28 (53%) however, service delivery level decisions (10/28) were given highest priority. Overall, we found 162 potentially relevant reviews of which 57 (35%) were not systematic reviews. Of 106 systematic reviews, only 28 were relevant to an uncertainty and 18 of these were published within the preceding five years; none identified high quality research evidence. Despite the growing volume of published primary research, healthcare professionals delivering wound care have important clinical uncertainties which are not addressed by up-to-date systematic reviews containing high certainty evidence. These are high priority topics requiring new research and systematic reviews which are regularly updated. 
To reduce clinical and research waste, we recommend systematic reviewers and researchers make greater efforts to ensure that research

  16. Socioeconomic Implications of Achieving 2.0 °C and 1.5 °C Climate Targets under Scientific Uncertainties

    NASA Astrophysics Data System (ADS)

    Su, X.; Takahashi, K.; Fujimori, S.; Hasegawa, T.; Tanaka, K.; Shiogama, H.; Emori, S.; LIU, J.; Hanasaki, N.; Hijioka, Y.; Masui, T.

    2017-12-01

    Large uncertainty exists in temperature projections, including contributions from the carbon cycle, the climate system and aerosols. For integrated assessment models (IAMs) such as DICE, FUND and PAGE, however, the scientific uncertainties rely mainly on the distribution of (equilibrium) climate sensitivity. This study aims at evaluating emission pathways that limit temperature increase below 2.0 °C or 1.5 °C after 2100 under scientific uncertainties, and at exploring how socioeconomic indicators are affected by such uncertainties. We use a stochastic version of SCM4OPT, with uncertainty measured by considering alternative ranges of key parameters. Three climate cases, namely i) the base case of SSP2, ii) limiting temperature increase below 2.0 °C after 2100 and iii) limiting temperature increase below 1.5 °C after 2100, and three types of probabilities, i) >66% probability or likely, ii) >50% probability or more likely than not and iii) the mean of the probability distribution, are considered in the study. The results show that: i) for the 2.0 °C case, the likely CO2 reduction rate in 2100 ranges from 75.5%-102.4%, with a mean value of 88.1%, and 93.0%-113.1% (mean 102.5%) for the 1.5 °C case; ii) a likely range of forcing effect is found for the 2.0 °C case (2.7-3.9 Wm-2) due to scientific uncertainty, and 1.9-3.1 Wm-2 for the 1.5 °C case; iii) carbon prices within the 50% confidence interval may differ by a factor of 3 for both the 2.0 °C and the 1.5 °C cases; iv) abatement costs within the 50% confidence interval may differ by a factor of 4 for both cases. Nine C4MIP carbon cycle models and nineteen CMIP3 AOGCMs are used to account for the scientific uncertainties, following MAGICC 6.0. These uncertainties result in a likely radiative forcing range of 6.1-7.5 Wm-2 and a likely temperature increase of 3.1-4.5 °C in 2100 for the base case of SSP2. If we evaluate the 2 °C target by limiting the
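    The way climate-sensitivity uncertainty produces "likely" (>66%) ranges can be illustrated with a small Monte Carlo: draw equilibrium climate sensitivity from an assumed distribution, convert a fixed forcing into a warming distribution, and read off the central 66% interval. All numbers below are illustrative stand-ins, not values from SCM4OPT or this study:

    ```python
    import random

    random.seed(0)

    F_2XCO2 = 3.7  # W m-2 forcing from a CO2 doubling (standard approximation)

    def sample_warming(forcing_wm2):
        # Equilibrium climate sensitivity drawn from an assumed lognormal whose
        # spread loosely mimics the canonical 1.5-4.5 degC "likely" range.
        ecs = random.lognormvariate(1.0, 0.35)    # degC per CO2 doubling
        return ecs * forcing_wm2 / F_2XCO2

    draws = sorted(sample_warming(F_2XCO2) for _ in range(20_000))
    lo, hi = draws[int(0.17 * len(draws))], draws[int(0.83 * len(draws))]
    print(f"likely (>66%) warming range: {lo:.1f}-{hi:.1f} degC")
    ```

    Widening the sensitivity distribution widens the likely interval directly, which is why the paper's carbon prices and abatement costs vary by factors of 3-4 across the 50% interval.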

  17. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS II: SENSITIVITY ANALYSIS IN THE CONUS DOMAIN

    EPA Science Inventory

    In this study, we present the response of model results to different scientific treatments in an effort to quantify the uncertainties caused by the incomplete understanding of mercury science and by model assumptions in atmospheric mercury models. Two sets of sensitivity simulati...

  18. Scientific Uncertainty in News Coverage of Cancer Research: Effects of Hedging on Scientists' and Journalists' Credibility

    ERIC Educational Resources Information Center

    Jensen, Jakob D.

    2008-01-01

    News reports of scientific research are rarely hedged; in other words, the reports do not contain caveats, limitations, or other indicators of scientific uncertainty. Some have suggested that hedging may influence news consumers' perceptions of scientists' and journalists' credibility (perceptions that may be related to support for scientific…

  19. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    PubMed

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economical processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through such investments, such as capacity building and specialized platforms for knowledge integration.

  20. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries

    NASA Astrophysics Data System (ADS)

    Sutton, Abigail M.; Rudd, Murray A.

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economical processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on `expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent `shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through such investments, such as capacity building and specialized platforms for knowledge integration.

  1. Policy decision-making under scientific uncertainty: radiological risk assessment and the role of expert advisory groups.

    PubMed

    Mossman, Kenneth L

    2009-08-01

    Standard-setting agencies such as the U.S. Nuclear Regulatory Commission and the U.S. Environmental Protection Agency depend on advice from external expert advisory groups on matters of public policy and standard-setting. Authoritative bodies including the National Research Council and the National Council on Radiation Protection and Measurements provide analyses and recommendations that ensure technical and scientific soundness in decision-making. In radiological protection the nature of the scientific evidence is such that risk assessment at radiation doses typically encountered in environmental and occupational settings is highly uncertain, and several policy alternatives are scientifically defensible. The link between science and policy is problematic. The fundamental issue is the failure to properly consider risk assessment, risk communication, and risk management and then consolidate them in a process that leads to sound policy. Authoritative bodies should serve as unbiased brokers of policy choices by providing balanced and objective scientific analyses. As long as the policy-decision environment is characterized by high scientific uncertainty and a lack of values consensus, advisory groups should present unbiased evaluations of all scientifically plausible alternatives and recommend selection criteria that decision makers can use in the policy-setting process. To do otherwise (e.g., by serving as single-position advocates) weakens decision-making by eliminating options and narrowing discussions of scientific perspectives. Understanding uncertainties and the limitations on available scientific information, and conveying such information to policy makers, remain key challenges for the technical and policy communities.

  2. Optimal regeneration planning for old-growth forest: addressing scientific uncertainty in endangered species recovery through adaptive management

    USGS Publications Warehouse

    Moore, C.T.; Conroy, M.J.

    2006-01-01

    Stochastic and structural uncertainties about forest dynamics present challenges in the management of ephemeral habitat conditions for endangered forest species. Maintaining critical foraging and breeding habitat for the endangered red-cockaded woodpecker (Picoides borealis) requires an uninterrupted supply of old-growth forest. We constructed and optimized a dynamic forest growth model for the Piedmont National Wildlife Refuge (Georgia, USA) with the objective of perpetuating a maximum stream of old-growth forest habitat. Our model accommodates stochastic disturbances and hardwood succession rates, and uncertainty about model structure. We produced a regeneration policy that was indexed by current forest state and by current weight of evidence among alternative model forms. We used adaptive stochastic dynamic programming, which anticipates that model probabilities, as well as forest states, may change through time, with consequent evolution of the optimal decision for any given forest state. In light of considerable uncertainty about forest dynamics, we analyzed a set of competing models incorporating extreme, but plausible, parameter values. Under any of these models, forest silviculture practices currently recommended for the creation of woodpecker habitat are suboptimal. We endorse fully adaptive approaches to the management of endangered species habitats in which predictive modeling, monitoring, and assessment are tightly linked.
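    Adaptive stochastic dynamic programming, as used in this study, optimizes a policy indexed both by the forest state and by the current weight of evidence among rival models. A toy sketch under invented dynamics (two hypothetical succession models, a coarse habitat state, and an illustrative regeneration cost; nothing here reproduces the Piedmont model):

    ```python
    # Toy adaptive dynamic program: the policy depends on the forest state and on
    # the belief weight placed on one of two rival succession models.
    STATES = range(5)    # coarse buckets of old-growth acreage
    ACTIONS = range(3)   # acres scheduled for regeneration (0, 1, 2)

    def transition(model, s, a):
        # Hypothetical deterministic dynamics per model; "fast_succession"
        # loses an extra bucket of old growth to hardwood encroachment.
        drift = -1 if model == "fast_succession" else 0
        return max(0, min(4, s + a - 1 + drift))

    def reward(s, a):
        return s - 0.4 * a  # habitat value minus an illustrative regeneration cost

    def value_iteration(belief_fast, horizon=20, gamma=0.95):
        V = {s: 0.0 for s in STATES}
        policy = {}
        for _ in range(horizon):
            newV = {}
            for s in STATES:
                best_q, best_a = None, None
                for a in ACTIONS:
                    # Expected continuation value averages over the rival models,
                    # weighted by the current evidence for each.
                    ev = (belief_fast * V[transition("fast_succession", s, a)]
                          + (1 - belief_fast) * V[transition("slow_succession", s, a)])
                    q = reward(s, a) + gamma * ev
                    if best_q is None or q > best_q:
                        best_q, best_a = q, a
                newV[s] = best_q
                policy[s] = best_a
            V = newV
        return policy

    # As monitoring shifts the model weights, the optimal action can change too.
    print(value_iteration(belief_fast=0.9))
    print(value_iteration(belief_fast=0.1))
    ```

    In the full adaptive scheme, monitoring data update the model weights through time, so the decision rule evolves along with the evidence, which is the sense in which modeling, monitoring, and assessment are "tightly linked."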

  3. Addressing uncertainty in adaptation planning for agriculture.

    PubMed

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  4. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  5. Addressing scientific literacy through content area reading and processes of scientific inquiry: What teachers report

    NASA Astrophysics Data System (ADS)

    Cooper, Susan J.

    The purpose of this study was to interpret the experiences of secondary science teachers in Florida as they address the scientific literacy of their students through teaching content reading strategies and student inquiry skills. Knowledge of the successful integration of content reading and inquiry skills by experienced classroom teachers would be useful to many educators as they plan instruction to achieve challenging state and national standards for reading as well as science. The problem was investigated using grounded theory methodology. Open-ended questions were asked in three focus groups and six individual interviews that included teachers from various Florida school districts. The constant comparative approach was used to analyze the data. Initial codes were collapsed into categories to determine the conceptual relationships among the data. From this, the five core categories were determined to be Influencers, Issues, Perceptions, Class Routines, and Future Needs. These relate to the central phenomenon, Instructional Modifications, because teachers often described pragmatic and philosophical changes in their teaching as they deliberated to meet state standards in both reading and science. Although Florida's secondary science teachers have been asked to incorporate content reading strategies into their science instruction for the past several years, there was limited evidence of using these strategies to further student understanding of scientific processes. Most teachers saw little connection between reading and inquiry, other than the fact that students must know how to read to follow directions in the lab. Scientific literacy, when it was addressed by teachers, was approached mainly through class discussions, not reading. Teachers realized that students cannot learn secondary science content unless they read science text with comprehension; therefore the focus of reading instruction was on learning science content, not scientific literacy or student

  6. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various times of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and to improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  7. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty rationales more frequently reflected their assessment of personal knowledge or abilities related to the tasks than a critical examination of the scientific evidence resulting from the models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  8. Scientific rationality, uncertainty and the governance of human genetics: an interview study with researchers at deCODE genetics.

    PubMed

    Hjörleifsson, Stefán; Schei, Edvin

    2006-07-01

    Technology development in human genetics is fraught with uncertainty, controversy and unresolved moral issues, and industry scientists are sometimes accused of neglecting the implications of their work. The present study was carried out to elicit industry scientists' reflections on the relationship between commercial, scientific and ethical dimensions of present day genetics and the resources needed for robust governance of new technologies. Interviewing scientists of the company deCODE genetics in Iceland, we found that in spite of optimism, the informants revealed ambiguity and uncertainty concerning the use of human genetic technologies for the prevention of common diseases. They concurred that uncritical marketing of scientific success might cause exaggerated public expectations of health benefits from genetics, with the risk of backfiring and causing resistance to genetics in the population. On the other hand, the scientists did not address dilemmas arising from the commercial nature of their own employer. Although the scientists tended to describe public fear as irrational, they identified issues where scepticism might be well founded and explored examples where they, despite expert knowledge, held ambiguous or tentative personal views on the use of predictive genetic technologies. The rationality of science was not seen as sufficient to ensure beneficial governance of new technologies. The reflexivity and suspension of judgement demonstrated in the interviews exemplify productive features of moral deliberation in complex situations. Scientists should take part in dialogues concerning the governance of genetic technologies, acknowledge any vested interests, and use their expertise to highlight, not conceal the technical and moral complexity involved.

  9. Painting the world REDD: addressing scientific barriers to monitoring emissions from tropical forests

    NASA Astrophysics Data System (ADS)

    Asner, Gregory P.

    2011-06-01

    project scale to program readiness is a big step for all involved, and many are finding that it is not easy. Current barriers to national monitoring of forest carbon stocks and emissions range from technical to scientific, and from institutional to operational. In fact, a recent analysis suggested that about 3% of tropical countries currently have the capacity to monitor and report on changes in forest cover and carbon stocks (Herold 2009). But until now, the scientific and policy-development communities have had little quantitative information on exactly which aspects of national-scale monitoring are most uncertain, and how that uncertainty will affect REDD+ performance reporting. A new and remarkable study by Pelletier, Ramankutty and Potvin (2011) uses an integrated, spatially-explicit modeling technique to explore and quantify sources of uncertainty in carbon emissions mapping throughout the Republic of Panama. Their findings are sobering: deforestation rates would need to be reduced by a full 50% in Panama in order to be detectable above the statistical uncertainty caused by several current major monitoring problems. The number one uncertainty, accounting for a sum total of about 77% of the error, rests in the spatial variation of aboveground carbon stocks in primary forests, secondary forests and on fallow land. The poor quality of and insufficient time interval between land-cover maps account for the remainder of the overall uncertainty. These findings are a show-stopper for REDD+ under prevailing science and technology conditions. The Pelletier et al study highlights the pressing need to improve the accuracy of forest carbon and land cover mapping assessments in order for REDD+ to become viable, but how can the uncertainties be overcome? First, with REDD+ nations required to report their emissions, and with verification organizations wanting to check on the reported numbers, there is a clear need for shared measurement and monitoring approaches. 
One of the major

  10. Frames of scientific evidence: How journalists represent the (un)certainty of molecular medicine in science television programs.

    PubMed

    Ruhrmann, Georg; Guenther, Lars; Kessler, Sabrina Heike; Milde, Jutta

    2015-08-01

    For laypeople, media coverage of science on television is a gateway to scientific issues. Defining scientific evidence is central to the field of science, but there are still questions if news coverage of science represents scientific research findings as certain or uncertain. The framing approach is a suitable framework to classify different media representations; it is applied here to investigate the frames of scientific evidence in film clips (n=207) taken from science television programs. Molecular medicine is the domain of interest for this analysis, due to its high proportion of uncertain and conflicting research findings and risks. The results indicate that television clips vary in their coverage of scientific evidence of molecular medicine. Four frames were found: Scientific Uncertainty and Controversy, Scientifically Certain Data, Everyday Medical Risks, and Conflicting Scientific Evidence. They differ in their way of framing scientific evidence and risks of molecular medicine. © The Author(s) 2013.

  11. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model-factor hypotheses. The results are discussed for the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, this method constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.
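As a rough illustration of the kind of Monte Carlo uncertainty propagation such a global analysis performs, the sketch below treats a cumulative-impact score as a weighted sum of pressure intensities per grid cell, with expert-elicited ranges on the uncertain sensitivity weights; all arrays and bounds are invented for illustration and are not from the AIR case study.

```python
import numpy as np

# Hypothetical CI model: score per cell = sum over pressures of
# intensity * sensitivity weight, with weights known only to within
# expert-elicited ranges. Monte Carlo sampling maps that uncertainty
# onto the spatial distribution of CI scores.
rng = np.random.default_rng(42)

intensity = np.array([[0.9, 0.1],    # pressure 1 in two grid cells
                      [0.4, 0.8]])   # pressure 2 in the same cells
weight_lo = np.array([0.5, 0.2])     # elicited lower bounds per pressure
weight_hi = np.array([1.0, 0.6])     # elicited upper bounds per pressure

n = 10_000
w = rng.uniform(weight_lo, weight_hi, size=(n, 2))  # sampled weight vectors
ci = w @ intensity                                  # CI score per draw, per cell

mean_ci = ci.mean(axis=0)  # central estimate per cell
spread = ci.std(axis=0)    # spatial variation of the uncertainty itself
```

Cells whose score interval stays above (or below) the others across all sampled weights are the "robustly identified" most or least impacted areas; overlapping intervals flag where conclusions depend on the model factors.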

  12. Uncertainty as Impetus for Climate Mitigation

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Oreskes, N.; Risbey, J.

    2015-12-01

    For decades, the scientific community has called for actions to be taken to mitigate the adverse consequences of climate change. To date, those calls have found little substantial traction, and politicians and the general public are instead engaged in a debate about the causes and effects of climate change that bears little resemblance to the state of scientific knowledge. Uncertainty plays a pivotal role in that public debate, and arguments against mitigation are frequently couched in terms of uncertainty. We show that the rhetorical uses of scientific uncertainty in public debate by some actors (often with vested interests or political agendas) contrast with the mathematical result that greater uncertainty about the extent of warming is virtually always associated with an increased risk: The expected damage costs increase as a function of uncertainty about future warming. We suggest ways in which the actual implications of scientific uncertainty can be better communicated and how scientific uncertainty should be understood as an impetus, rather than a barrier, for climate mitigation.
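The mathematical result mentioned, that wider uncertainty about warming implies larger expected damages, follows from Jensen's inequality whenever the damage function is convex in temperature. The sketch below checks this by Monte Carlo with an illustrative quadratic damage function (for T ~ Normal(mu, sigma), E[T**2] = mu**2 + sigma**2, so expected damage grows with the variance even at fixed mean warming).

```python
import numpy as np

# Monte Carlo check that expected damages grow with uncertainty when the
# damage function is convex (here an illustrative quadratic, D(T) = T**2).
rng = np.random.default_rng(0)
mu = 3.0  # mean projected warming in deg C (illustrative)

def expected_damage(sigma, n=200_000):
    """Average damage over draws of warming T ~ Normal(mu, sigma)."""
    T = rng.normal(mu, sigma, n)
    return float(np.mean(T ** 2))

low_uncertainty = expected_damage(0.5)   # analytically mu**2 + 0.25
high_uncertainty = expected_damage(1.5)  # analytically mu**2 + 2.25
# Greater sigma raises expected damage although mean warming is unchanged.
```

This is the sense in which uncertainty is an impetus rather than a barrier: under a convex damage function, "we are less sure" translates into "the expected cost is higher".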

  13. Implementation of Scientific Community Laboratories and Their Effect on Student Conceptual Learning, Attitudes, and Understanding of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lark, Adam

    Scientific Community Laboratories, developed by The University of Maryland, have shown initial promise as laboratories meant to emulate the practice of doing physics. These laboratories have been re-created by incorporating their design elements with the University of Toledo course structure and resources. The laboratories have been titled the Scientific Learning Community (SLC) Laboratories. A comparative study between these SLC laboratories and the University of Toledo physics department's traditional laboratories was executed during the fall 2012 semester on first-semester calculus-based physics students. Three tests were executed as pre-tests and post-tests to capture the change in students' concept knowledge, attitudes, and understanding of uncertainty. The Force Concept Inventory (FCI) was used to evaluate students' conceptual changes through the semester, and average normalized gains were compared between both traditional and SLC laboratories. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) was conducted to elucidate students' change in attitudes through the course of each laboratory. Finally, interviews regarding data analysis and uncertainty were transcribed and coded to track changes in the way students understand uncertainty and data analysis in experimental physics after their participation in both laboratory types. Students in the SLC laboratories showed a notable increase in conceptual knowledge and attitudes when compared to traditional laboratories. SLC students' understanding of uncertainty showed the most improvement, diverging completely from students in the traditional laboratories, who declined throughout the semester.
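The FCI comparison above relies on the average normalized gain, a standard statistic in physics education research (Hake's ⟨g⟩); a minimal reference implementation:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: the fraction of the possible improvement
    actually achieved, g = (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre)

# A class moving from 40% to 70% on the FCI achieves half of its
# possible improvement:
g = normalized_gain(pre=40.0, post=70.0)  # -> 0.5
```

Because the gain is normalized by the headroom above the pre-test score, it lets sections with different starting knowledge be compared on the same scale.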

  14. Commentary: ambiguity and uncertainty: neglected elements of medical education curricula?

    PubMed

    Luther, Vera P; Crandall, Sonia J

    2011-07-01

    Despite significant advances in scientific knowledge and technology, ambiguity and uncertainty are still intrinsic aspects of contemporary medicine. To practice confidently and competently, a physician must learn rational approaches to complex and ambiguous clinical scenarios and must possess a certain degree of tolerance of ambiguity. In this commentary, the authors discuss the role that ambiguity and uncertainty play in medicine and emphasize why openly addressing these topics in the formal medical education curriculum is critical. They discuss key points from original research by Wayne and colleagues and their implications for medical education. Finally, the authors offer recommendations for increasing medical student tolerance of ambiguity and uncertainty, including dedicating time to attend candidly to ambiguity and uncertainty as a formal part of every medical school curriculum.

  15. Quantifying and managing uncertainty in operational modal analysis

    NASA Astrophysics Data System (ADS)

    Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.

    2018-03-01

    Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data, and remain applicable in non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.

  16. Scientifically defensible fish conservation and recovery plans: Addressing diffuse threats and developing rigorous adaptive management plans

    USGS Publications Warehouse

    Maas-Hebner, Kathleen G.; Schreck, Carl B.; Hughes, Robert M.; Yeakley, Alan; Molina, Nancy

    2016-01-01

    We discuss the importance of addressing diffuse threats to long-term species and habitat viability in fish conservation and recovery planning. In the Pacific Northwest, USA, salmonid management plans have typically focused on degraded freshwater habitat, dams, fish passage, harvest rates, and hatchery releases. However, such plans inadequately address threats related to human population and economic growth, intra- and interspecific competition, and changes in climate, ocean, and estuarine conditions. Based on reviews conducted on eight conservation and/or recovery plans, we found that though threats resulting from such changes are difficult to model and/or predict, they are especially important for wide-ranging diadromous species. Adaptive management is also a critical but often inadequately constructed component of those plans. Adaptive management should be designed to respond to evolving knowledge about the fish and their supporting ecosystems; if done properly, it should help improve conservation efforts by decreasing uncertainty regarding known and diffuse threats. We conclude with a general call for environmental managers and planners to reinvigorate the adaptive management process in future management plans, including more explicitly identifying critical uncertainties, implementing monitoring programs to reduce those uncertainties, and explicitly stating what management actions will occur when pre-identified trigger points are reached.

  17. Addressing forecast uncertainty impact on CSP annual performance

    NASA Astrophysics Data System (ADS)

    Ferretti, Fabio; Hogendijk, Christopher; Aga, Vipluv; Ehrsam, Andreas

    2017-06-01

    This work analyzes the impact of weather forecast uncertainty on the annual performance of a Concentrated Solar Power (CSP) plant. Forecast time series has been produced by a commercial forecast provider using the technique of hindcasting for the full year 2011 in hourly resolution for Ouarzazate, Morocco. Impact of forecast uncertainty has been measured on three case studies, representing typical tariff schemes observed in recent CSP projects plus a spot market price scenario. The analysis has been carried out using an annual performance model and a standard dispatch optimization algorithm based on dynamic programming. The dispatch optimizer has been demonstrated to be a key requisite to maximize the annual revenues depending on the price scenario, harvesting the maximum potential out of the CSP plant. Forecasting uncertainty affects the revenue enhancement outcome of a dispatch optimizer depending on the error level and the price function. Results show that forecasting accuracy of direct solar irradiance (DNI) is important to make best use of an optimized dispatch but also that a higher number of calculation updates can partially compensate this uncertainty. Improvement in revenues can be significant depending on the price profile and the optimal operation strategy. Pathways to achieve better performance are presented by having more updates both by repeatedly generating new optimized trajectories but also more often updating weather forecasts. This study shows the importance of working on DNI weather forecasting for revenue enhancement as well as selecting weather services that can provide multiple updates a day and probabilistic forecast information.
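A dispatch optimizer of the kind described chooses, hour by hour, how much stored thermal energy to sell so as to maximize revenue under a price profile. The toy backward-induction sketch below (invented prices, collection profile, and storage limits, not the study's plant model) shows the core dynamic-programming recursion.

```python
from functools import lru_cache

# Toy CSP dispatch by backward induction (dynamic programming): choose how
# much stored thermal energy to sell each hour to maximize revenue.
# Prices, collection profile, and limits are invented for illustration.

prices = [30, 30, 80, 120]   # $/MWh tariff for each hour
collect = [2, 2, 1, 0]       # MWh of solar thermal energy collected per hour
MAX_STORE, MAX_SELL = 4, 2   # storage capacity and turbine limit (MWh)

@lru_cache(maxsize=None)
def best_revenue(t, stored):
    """Maximum revenue obtainable from hour t onward with `stored` MWh."""
    if t == len(prices):
        return 0
    energy = min(stored + collect[t], MAX_STORE)  # excess is curtailed
    return max(sell * prices[t] + best_revenue(t + 1, energy - sell)
               for sell in range(min(energy, MAX_SELL) + 1))

# The optimizer holds energy back from the cheap early hours so the
# expensive late hours can run the turbine at full output.
revenue = best_revenue(0, 0)
```

Forecast error enters this recursion through `collect`: when the predicted irradiance is wrong, the precomputed trajectory is suboptimal, which is why re-running the optimization as forecasts update recovers part of the lost revenue.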

  18. Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.

    PubMed

    Djulbegovic, Benjamin

    2011-10-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just 1 measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) is at the intersection between epistemology, decision making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision making depends both on analytical, deliberative processes embodied in scientific method (system II), and good human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.

  19. Uncertainty and Equipoise: At Interplay Between Epistemology, Decision-Making and Ethics

    PubMed Central

    Djulbegovic, Benjamin

    2011-01-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned since it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. Since equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this paper, I show how uncertainty (equipoise) is at the intersection between epistemology, decision-making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision-making depends both on analytical, deliberative processes embodied in scientific method (system II) and “good” human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors, and unavoidable injustice. PMID:21817885

  20. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  1. A framework to build scientific confidence in read across results. (SOT CE course presentation)

    EPA Science Inventory

    Read-across acceptance remains a major hurdle, primarily due to the lack of objectivity and clarity on how to practically address uncertainties. One avenue that can be exploited to build scientific confidence in the development and evaluation of read-across is by taking advant...

  2. Application of fuzzy system theory in addressing the presence of uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.

    In this paper, the combination of fuzzy system theory with the finite element method is presented and discussed as a way to deal with uncertainties. Uncertainties must be accounted for in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping; mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results than the conventional finite element method.

  3. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, the combination of fuzzy system theory with the finite element method is presented and discussed as a way to deal with uncertainties. Uncertainties must be accounted for in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping; mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results than the conventional finite element method.
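    The extension-principle mapping step that the abstract describes is commonly implemented numerically with alpha-cuts. A minimal sketch under illustrative assumptions (a triangular fuzzy input, a monotone linear response model standing in for the finite element solve; none of the values are from the paper):

```python
import math

def alpha_cut(a, m, b, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(f, a, m, b, levels=5):
    """Map each alpha-cut interval through f (assumed monotone increasing),
    which realizes the extension principle on a discretized membership grid."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(a, m, b, alpha)
        cuts.append((alpha, f(lo), f(hi)))
    return cuts

# Hypothetical example: fuzzy load (kN) mapped to displacement via a
# crisp linear stiffness model, in place of a full finite element run.
stiffness = 200.0  # kN/mm
result = propagate(lambda load: load / stiffness, 90.0, 100.0, 110.0)
for alpha, lo, hi in result:
    print(f"alpha={alpha:.1f}: displacement in [{lo:.4f}, {hi:.4f}] mm")
```

At alpha = 1 the interval collapses to the crisp answer; at alpha = 0 it spans the full support, which is why interval-based fuzzy results tend to be more conservative than a single crisp finite element solution.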

  4. A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty

    NASA Astrophysics Data System (ADS)

    Vezer, M. A.

    2009-12-01

    The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominantly to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4’s Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4’s Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. The investigation begins (§2) with an explanation of
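    The calibration the abstract refers to maps verbal likelihood terms to probability ranges. A sketch of that scheme, with cut-points taken from the published AR4 likelihood scale (the lookup function itself is illustrative, not from the paper):

```python
# AR4 likelihood scale: (term, lower bound, upper bound) on probability.
AR4_LIKELIHOOD = [
    ("virtually certain",        0.99, 1.00),
    ("very likely",              0.90, 1.00),
    ("likely",                   0.66, 1.00),
    ("about as likely as not",   0.33, 0.66),
    ("unlikely",                 0.00, 0.33),
    ("very unlikely",            0.00, 0.10),
    ("exceptionally unlikely",   0.00, 0.01),
]

def likelihood_term(p):
    """Return the most specific (narrowest) AR4 term whose range contains p."""
    matches = [(term, hi - lo) for term, lo, hi in AR4_LIKELIHOOD if lo <= p <= hi]
    return min(matches, key=lambda t: t[1])[0]

print(likelihood_term(0.95))   # a probability of 0.95 reads as "very likely"
```

The pragmatic point defended in the paper is visible here: the numeric ranges remove the vagueness of the bare words, at the cost of attaching subjective Bayesian probabilities to them.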

  5. Ethical challenges in FASD prevention: Scientific uncertainty, stigma, and respect for women's autonomy.

    PubMed

    Zizzo, Natalie; Racine, Eric

    2017-11-09

    Fetal alcohol spectrum disorder (FASD) is a leading form of neurodevelopmental delay in Canada, affecting an estimated 3000 babies per year. FASD involves a range of disabilities that entail significant costs to affected individuals, families, and society. Exposure to alcohol in utero is a necessary factor for FASD development, and this has led to FASD being described as "completely preventable". However, there are significant ethical challenges associated with FASD prevention. These challenges revolve around 1) what should be communicated about the risks of alcohol consumption during pregnancy, given some ongoing scientific uncertainty about the effects of prenatal alcohol exposure, and 2) how to communicate these risks, given the potential for stigma against women who give birth to children with FASD as well as against children and adults with FASD. In this paper, we share initial thoughts on how primary care physicians can tackle this complex challenge. First, we recommend honest disclosure of scientific evidence to women and the tailoring of information offered to pregnant women. Second, we propose a contextualized, patient-centred, compassionate approach to ensure that appropriate advice is given to patients in a supportive, non-stigmatizing way.

  6. Climate change risk perception and communication: addressing a critical moment?

    PubMed

    Pidgeon, Nick

    2012-06-01

    Climate change is an increasingly salient issue for societies and policy-makers worldwide. It now raises fundamental interdisciplinary issues of risk and uncertainty analysis and communication. The growing scientific consensus over the anthropogenic causes of climate change appears to sit at odds with the increasing use of risk discourses in policy: for example, to aid in climate adaptation decision making. All of this points to a need for a fundamental revision of our conceptualization of what it is to do climate risk communication. This Special Collection comprises seven papers stimulated by a workshop on "Climate Risk Perceptions and Communication" held at Cumberland Lodge Windsor in 2010. Topics addressed include climate uncertainties, images and the media, communication and public engagement, uncertainty transfer in climate communication, the role of emotions, localization of hazard impacts, and longitudinal analyses of climate perceptions. Climate change risk perceptions and communication work is critical for future climate policy and decisions. © 2012 Society for Risk Analysis.

  7. Introduction to the Special Issue on Climate Ethics: Uncertainty, Values and Policy.

    PubMed

    Roeser, Sabine

    2017-10-01

    Climate change is a pressing phenomenon with huge potential ethical, legal and social policy implications. Climate change gives rise to intricate moral and policy issues as it involves contested science, uncertainty and risk. In order to come to scientifically and morally justified, as well as feasible, policies, targeting climate change requires an interdisciplinary approach. This special issue will identify the main challenges that climate change poses from social, economic, methodological and ethical perspectives by focusing on the complex interrelations between uncertainty, values and policy in this context. This special issue brings together scholars from economics, social sciences and philosophy in order to address these challenges.

  8. ICYESS 2013: Understanding and Interpreting Uncertainty

    NASA Astrophysics Data System (ADS)

    Rauser, F.; Niederdrenk, L.; Schemann, V.; Schmidt, A.; Suesser, D.; Sonntag, S.

    2013-12-01

    We will report the outcomes and highlights of the Interdisciplinary Conference of Young Earth System Scientists (ICYESS) on Understanding and Interpreting Uncertainty in September 2013, Hamburg, Germany. This conference is aimed at early career scientists (Masters to Postdocs) from a large variety of scientific disciplines and backgrounds (natural, social and political sciences) and will enable 3 days of discussions on a variety of uncertainty-related aspects: 1) How do we deal with implicit and explicit uncertainty in our daily scientific work? What is uncertain for us, and for which reasons? 2) How can we communicate these uncertainties to other disciplines? E.g., is uncertainty in cloud parameterization and respectively equilibrium climate sensitivity a concept that is understood equally well in natural and social sciences that deal with Earth System questions? Or vice versa, is, e.g., normative uncertainty as in choosing a discount rate relevant for natural scientists? How can those uncertainties be reconciled? 3) How can science communicate this uncertainty to the public? Is it useful at all? How are the different possible measures of uncertainty understood in different realms of public discourse? Basically, we want to learn from all disciplines that work together in the broad Earth System Science community how to understand and interpret uncertainty - and then transfer this understanding to the problem of how to communicate with the public, or its different layers / agents. ICYESS is structured in a way that participation is only possible via presentation, so every participant will give their own professional input into how the respective disciplines deal with uncertainty. Additionally, a large focus is put onto communication techniques; there are no 'standard presentations' in ICYESS. Keynote lectures by renowned scientists and discussions will lead to a deeper interdisciplinary understanding of what we do not really know, and how to deal with it. Many

  9. The ends of uncertainty: Air quality science and planning in Central California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fine, James

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently

  10. Are models, uncertainty, and dispute resolution compatible?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  11. Educated Guesses and Other Ways to Address the Pharmacological Uncertainty of Designer Drugs

    PubMed Central

    Berning, Moritz

    2016-01-01

    This study examines how experimentation with designer drugs is mediated by the Internet. We selected a popular drug forum that presents reports on self-experimentation with little or even completely unexplored designer drugs to examine: (1) how participants report their “trying out” of new compounds and (2) how participants reduce the pharmacological uncertainty associated with using these substances. Our methods included passive observation online, engaging more actively with the online community using an avatar, and off-line interviews with key interlocutors to validate our online findings. This article reflects on how forum participants experiment with designer drugs, their trust in suppliers and the testimonials of others, the use of ethno-scientific techniques that involve numerical weighing, “allergy dosing,” and the use of standardized trip reports. We suggest that these techniques contribute to a sense of control in the face of the possible toxicity of unknown or little-known designer drugs. The online reporting of effects allows users to experience not only the thrill of a new kind of high but also connection with others in the self-experimenting drug community. PMID:27721526

  12. Rethinking Uncertainty: What Does the Public Need to Know?

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    The late Steven Schneider is often quoted as addressing the double-bind of science communication: that to be a good scientist one has to be cautious and acknowledge uncertainty, but to reach the media and the public one has to be bold, incautious, and even a bit dramatic. Here, I focus on a related but different double-bind: the double bind of responding to doubt. In our recent book, Merchants of Doubt, Erik M. Conway and I showed how doubt-mongers exploited scientific uncertainty as a political strategy to confuse the public and delay action on a range of environmental issues from the harms of tobacco to the reality of anthropogenic climate change. This strategy is effective because it appeals to lay people's, journalists', and even fellow scientists' sense of fair play—that it is right to hear "both sides" of an issue. Scientists are then caught in a double-bind: refusing to respond seems smug and elitist, but responding scientifically seems to confirm that there is in fact a scientific debate. Doubt-mongering is also hard to counter because our knowledge is, in fact, uncertain, so when we communicate in conventional scientific ways, acknowledging the uncertainties and limits in our understanding, we may end up reinforcing the uncertainty framework. The difficulty is exacerbated by the natural tendency of scientists to focus on novel and original results, rather than matters that are well established, lest we be accused of lacking originality or of taking credit for others' work. The net result is the impression among lay people that our knowledge is very likely to change and is therefore a weak basis for making public policy decisions. History of science, however, suggests a different picture: we know that a good deal of scientific knowledge has proved temporally robust and has provided a firm basis for effective public policy. Action on earlier environmental issues such as DDT and acid rain, guided by scientific knowledge, has worked to limit environmental damage

  13. Uncertainty As a Trigger for a Paradigm Change in Science Communication

    NASA Astrophysics Data System (ADS)

    Schneider, S.

    2014-12-01

    Over the last decade, the need to communicate uncertainty has increased. Climate sciences and environmental sciences have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit deep societal mistrust of uncertainty to allege unethical and intentional deception of decision makers and the public by scientists in their consultatory function. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically manipulated seeds have to face massive campaigns against their research, and sometimes against their persons and lives as well. Hence, new strategies to communicate uncertainty have to address the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind is well suited for practical decision making through its sensory structures. Therefore, many of the irrational conceptions of uncertainty are mitigated if data is presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominantly driven by preconceptions about terms such as uncertainty, vagueness or probabilities. Parallel to the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty in the focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Therefore, science communicators are in need of new and innovative ways to talk about scientific uncertainty.

  14. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and has already been listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of the trading ratio, and uncertainty analysis has rarely been linked to it. This paper presents a practical methodology for estimating an "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of the tradeoffs between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. Rigorous quantification of the trading ratio will enhance the scientific basis, and thus public perception, for more informed decisions in the overall watershed-based pollutant trading program. (c) IWA Publishing 2008.
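    The paper's ETR methodology is not reproduced in the record, but the qualitative link it states (more nonpoint-source uncertainty, larger required ratio) can be sketched with a simple, hypothetical model: require that an uncertain nonpoint-source (NPS) reduction still cover a point-source (PS) reduction at a chosen confidence level. All parameter names and values below are illustrative assumptions, not the paper's:

```python
from statistics import NormalDist

def trading_ratio(mean_delivery, sd_delivery, confidence=0.95):
    """NPS load reduction required per unit of PS reduction credited.

    Delivery efficiency (the fraction of an NPS reduction that actually
    reaches the receiving water) is modeled as Normal(mean, sd); the
    dependable delivery is taken at the (1 - confidence) percentile.
    """
    z = NormalDist().inv_cdf(1 - confidence)  # negative for confidence > 0.5
    dependable = mean_delivery + z * sd_delivery
    if dependable <= 0:
        raise ValueError("uncertainty too large for a finite trading ratio")
    return 1.0 / dependable

# Greater uncertainty in NPS effectiveness -> larger required ratio.
print(round(trading_ratio(0.6, 0.05), 2))  # -> 1.93
print(round(trading_ratio(0.6, 0.15), 2))  # -> 2.83
```

Note how a modest increase in the standard deviation pushes the ratio past 2, consistent with the abstract's observation that ratios above 2 are common buffers against uncertain traded reductions.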

  15. Communicating uncertainties in assessments of future sea level rise

    NASA Astrophysics Data System (ADS)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels more than 60 meters if completely melted. There has been much confusion from policymakers on how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has much focused on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between on the one hand the value-free ideal of science and on the other hand usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing

  16. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world respond to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remains highly idealized. For example, the efficacy of "no regrets" adaptation efforts or "mainstreaming" adaptation into decision-making are rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be done carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of "mal"-adaptation.

  17. Towards a common oil spill risk assessment framework – Adapting ISO 31000 and addressing uncertainties.

    PubMed

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio; Janeiro, Joao; Samaras, Achilleas; Zodiatis, George; De Dominicis, Michela

    2015-08-15

    Oil spills are a transnational problem, and establishing a common standard methodology for Oil Spill Risk Assessments (OSRAs) is thus paramount in order to protect marine environments and coastal communities. In this study we firstly identified the strengths and weaknesses of the OSRAs carried out in various parts of the globe. We then searched for a generic and recognized standard, i.e. ISO 31000, in order to design a method to perform OSRAs in a scientific and standard way. The new framework was tested for the Lebanon oil spill that occurred in 2006 employing ensemble oil spill modeling to quantify the risks and uncertainties due to unknown spill characteristics. The application of the framework generated valuable visual instruments for the transparent communication of the risks, replacing the use of risk tolerance levels, and thus highlighting the priority areas to protect in case of an oil spill. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Exploring uncertainty in the Earth Sciences - the potential field perspective

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Blakely, R. J.

    2013-12-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
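    One concrete instance of the non-uniqueness the abstract describes: the surface gravity anomaly of a buried sphere depends only on its excess mass and depth, so radius and density contrast trade off exactly and two physically different bodies produce identical profiles. A sketch with illustrative values (not from the talk):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly(x, depth, radius, density_contrast):
    """Vertical gravity anomaly (m/s^2) at surface offset x from a buried sphere."""
    mass = (4.0 / 3.0) * math.pi * radius**3 * density_contrast  # excess mass
    return G * mass * depth / (x**2 + depth**2) ** 1.5

# Two different bodies at 1 km depth with the same excess mass
# (radius^3 * density_contrast is equal), sampled along a 4 km profile:
xs = [i * 100.0 for i in range(-20, 21)]
a = [sphere_anomaly(x, 1000.0, 200.0, 800.0) for x in xs]  # small, dense sphere
b = [sphere_anomaly(x, 1000.0, 400.0, 100.0) for x in xs]  # large, light sphere
print(max(abs(u - v) for u, v in zip(a, b)))  # indistinguishable anomalies
```

This is the narrow, exactly solvable version of the ambiguity; the broader non-uniqueness (e.g. deep compact versus shallow distributed sources) is what independent geologic constraints and anomaly wavelength content are used to resolve.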

  19. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  20. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    PubMed

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.

  1. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004] which was further refined by lengthy discussions among many IPCC Lead Authors, for over a year, resulting in a short summary of a standard approach to be followed for that

  2. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and the Central Limit Theorem (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
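The KLE dimension-reduction idea described above can be illustrated with a toy example. Everything below (the exponential covariance, the input profile, and the power-law "retrieval") is a hypothetical stand-in for the MICROBASE inputs; the point is only how an eigendecomposition of the input covariance lets a handful of modes carry the perturbations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vertical grid and input covariance (exponential correlation)
n_levels = 50
z = np.linspace(0.0, 10.0, n_levels)          # height (km)
corr_len = 2.0
cov = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)

# Karhunen-Loeve expansion: eigendecomposition of the input covariance,
# keeping only the leading modes that explain 99% of the variance
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_modes = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.99)) + 1

# Perturb only the retained modes (dimension reduction: n_modes << n_levels)
n_samples = 2000
xi = rng.standard_normal((n_samples, n_modes))
modes = eigvecs[:, :n_modes] * np.sqrt(eigvals[:n_modes])
perturbations = xi @ modes.T                   # shape (n_samples, n_levels)

# Propagate through a hypothetical retrieval (power law for ice water content)
base_profile = 1.0 + 0.1 * z
inputs = np.clip(base_profile + 0.05 * perturbations, 1e-6, None)
iwc = 0.03 * inputs**1.5

# Vertically resolved uncertainty of the retrieved quantity
iwc_std = iwc.std(axis=0)
print(f"{n_modes} modes retained out of {n_levels} levels")
```

Sampling in the reduced mode space rather than level by level is what makes a full probability distribution at every level affordable.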

  3. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  4. Adaptation Planning for Water Resources Management in the Context of Scientific Uncertainty

    NASA Astrophysics Data System (ADS)

    Lowrey, J.; Kenney, D.

    2008-12-01

    Several municipalities are beginning to create policies and plans in order to adapt to potential impacts from climate change. A 2007 report from the Heinz Center for Science, Economics, and the Environment, 'A Survey of Climate Change Adaptation Planning,' surveyed fourteen cities or counties across the U.S. and Canada that have created or are working towards creating climate change adaptation plans. Informal interactions with water managers in the Intermountain West indicate an eagerness to learn from those who have already begun adapting to potential climate change. Many of those without plans do not feel comfortable making potentially expensive long-term policy decisions based on impacts derived from uncertain climate change projections. This research identifies how decision makers currently consider climate change in adaptation planning despite imperfect information about climate change impacts, particularly in the water sector. Insights are offered into how best to provide information on climate change projections to regional decision makers so that they can begin adaptation planning for a changing climate. This research analyzes how a subset of the fourteen municipalities justified adaptive planning in the face of scientific uncertainty, paying particular attention to water resource adaptation, using the adaptation approaches studied in the 2007 Heinz Center Report. Interviews will be conducted with decision makers to learn how policies will be implemented and evaluated, and to explore resulting changes in policy or planning. Adaptation strategies are not assessed, but are used to identify how the decision makers plan to evaluate their own adaptation policies. In addition to looking at information use in adaptation plans, we compare how the plans orient themselves (adapting to projected impacts vs. increasing resiliency to current climate variability), how they address barriers and opportunities for adaptation, and whether they follow some key steps for

  5. History Forum Addresses Creation/Evolution Controversy.

    ERIC Educational Resources Information Center

    Schweinsberg, John

    1997-01-01

    A series of programs entitled Creationism and Evolution: The History of a Controversy was presented at the University of Alabama in Huntsville. The controversy was addressed from an historical and sociological, rather than a scientific perspective. Speakers addressed the evolution of scientific creationism, ancient texts versus sedimentary rocks…

  6. On different types of uncertainties in the context of the precautionary principle.

    PubMed

    Aven, Terje

    2011-10-01

    Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.

  7. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. Two sources of uncertainty in geometric discretization are addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
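A grid-convergence test of this kind can be sketched as follows. The dose-vs-depth curve below is an arbitrary smooth stand-in, not an output of the radiation transport tools discussed in the paper:

```python
import numpy as np

def dose(depth):
    """Hypothetical dose-vs-depth curve (buildup then exponential decay)."""
    return (1.0 + 5.0 * depth) * np.exp(-2.0 * depth)

# Dense reference grid standing in for the "true" curve (depth in g/cm^2)
fine = np.linspace(0.0, 5.0, 2001)
reference = dose(fine)

def max_interp_error(n_thicknesses):
    """Max relative error from linear interpolation over a coarse grid."""
    grid = np.linspace(0.0, 5.0, n_thicknesses)
    approx = np.interp(fine, grid, dose(grid))
    return np.max(np.abs(approx - reference)) / np.max(reference)

# Convergence test: refine the thickness grid until the error is negligible
errors = {n: max_interp_error(n) for n in (5, 10, 20, 40, 80)}
for n, err in errors.items():
    print(f"{n:3d} thicknesses -> max relative error {err:.2e}")
```

Plotting error against grid spacing in this way is what lets one pick the coarsest thickness grid whose interpolation uncertainty is acceptably small.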

  8. Assessing and Addressing Students' Scientific Literacy Needs in Physical Geology

    NASA Astrophysics Data System (ADS)

    Campbell-Stone, E. A.; Myers, J. D.

    2005-12-01

    Exacting excellence equally from university students around the globe can be accomplished by providing all students with necessary background tools to achieve mastery of their courses, even if those tools are not part of normal content. As instructors we hope to see our students grasp the substance of our courses, make mental connections between course material and practical applications, and use this knowledge to make informed decisions as citizens. Yet many educators have found that students enter university-level introductory courses in mathematics, science and engineering without adequate academic preparation. As part of a FIPSE-funded project at the University of Wyoming, the instructors of the Physical Geology course have taken a new approach to tackling the problem of lack of scientific/mathematic skills in incoming students. Instead of assuming that students should already know or will learn these skills on their own, they assess students' needs and provide them the opportunity to master scientific literacies as they learn geologic content. In the introductory geology course, instructors identified two categories of literacies, or basic skills that are necessary for academic success and citizen participation. Fundamental literacies include performing simple quantitative calculations, making qualitative assessments, and reading and analyzing tables and graphs. Technical literacies are those specific to understanding geology, and comprise the ability to read maps, visualize changes through time, and conceptualize in three dimensions. Because these skills are most easily taught in lab, the in-house lab manual was rewritten to be both literacy- and content-based. Early labs include simple exercises addressing literacies in the context of geological science, and each subsequent lab repeats exposure to literacies, but at increasing levels of difficulty. Resources available to assist students with literacy mastery include individual instruction, a detailed

  9. A Reliability Comparison of Classical and Stochastic Thickness Margin Approaches to Address Material Property Uncertainties for the Orion Heat Shield

    NASA Technical Reports Server (NTRS)

    Sepka, Steven A.; McGuire, Mary Kathleen; Vander Kam, Jeremy C.

    2018-01-01

The Orion Thermal Protection System (TPS) margin process uses a root-sum-square approach with branches addressing trajectory, aerothermodynamics, and material response uncertainties in ablator thickness design. The material response branch applies a bondline temperature reduction between the Avcoat ablator and EA9394 adhesive by 60 C (108 F) from its peak allowed value of 260 C (500 F). This process is known as the Bond Line Temperature Material Margin (BTMM) and is intended to cover material property and performance uncertainties. The value of 60 C (108 F) is a constant, applied at any spacecraft body location and for any trajectory. By varying only material properties in a random (Monte Carlo) manner, the Perl-based script mcCHAR is used to investigate the confidence interval provided by the BTMM. In particular, this study will look at various locations on the Orion heat shield forebody for a guided and an abort (ballistic) trajectory.

  10. A Reliability Comparison of Classical and Stochastic Thickness Margin Approaches to Address Material Property Uncertainties for the Orion Heat Shield

    NASA Technical Reports Server (NTRS)

    Sepka, Steve; Vander Kam, Jeremy; McGuire, Kathy

    2018-01-01

The Orion Thermal Protection System (TPS) margin process uses a root-sum-square approach with branches addressing trajectory, aerothermodynamics, and material response uncertainties in ablator thickness design. The material response branch applies a bond line temperature reduction between the Avcoat ablator and EA9394 adhesive by 60 C (108 F) from its peak allowed value of 260 C (500 F). This process is known as the Bond Line Temperature Material Margin (BTMM) and is intended to cover material property and performance uncertainties. The value of 60 C (108 F) is a constant, applied at any spacecraft body location and for any trajectory. By varying only material properties in a random (Monte Carlo) manner, the Perl-based script mcCHAR is used to investigate the confidence interval provided by the BTMM. In particular, this study will look at various locations on the Orion heat shield forebody for a guided and an abort (ballistic) trajectory.
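The stochastic-margin mechanics can be sketched generically. The lumped response model and property spreads below are hypothetical stand-ins (mcCHAR and the actual Avcoat property distributions are not given in this record); the sketch only shows how a Monte Carlo over material properties converts a fixed margin into a confidence statement:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical relative uncertainties on ablator material properties
conductivity = rng.normal(1.0, 0.10, n)  # relative thermal conductivity
density      = rng.normal(1.0, 0.05, n)  # relative char density
heat_cap     = rng.normal(1.0, 0.07, n)  # relative specific heat

# Stand-in thermal response: peak bond line temperature rises with
# conductivity and falls with thermal mass (density * heat capacity)
nominal_peak_c = 200.0  # nominal prediction sized to the margined limit
peak_temp = nominal_peak_c * conductivity / np.sqrt(density * heat_cap)

# What confidence does the fixed 60 C margin provide? I.e., what fraction
# of property draws keeps the bond line below the 260 C allowable?
allowable_c = 260.0
confidence = np.mean(peak_temp <= allowable_c)
print(f"confidence provided by the 60 C margin: {confidence:.3f}")
```

Repeating this at different body locations and trajectories, as the study describes, reveals where a constant margin is conservative and where it is not.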

  11. Scientific networking to address the causes, timing, emplacement mechanisms, and consequences of the Messinian Salinity Crisis

    NASA Astrophysics Data System (ADS)

    Camerlenghi, Angelo; Lofi, Johanna; Aloisi, Vanni; Flecker, Rachel

    2017-04-01

The origin of the Mediterranean salt giant is linked to an extraordinary event in the geological history of the Mediterranean region, commonly referred to as the Messinian Salinity Crisis (MSC). After 45 years of intense yet disunited research efforts, the international scientific community at large faces a unique opportunity to access the Messinian depositional successions of the Mediterranean's deep and marginal basins through scientific drilling, namely through the International Ocean Discovery Program (IODP) and the International Continental Drilling Program (ICDP). Scientific activity to promote scientific drilling offshore and onshore is in progress under the broad umbrella of the 'Uncovering a Salt Giant' IODP Multi-Platform Drilling proposal, which has generated the Deep-Sea Records of the Messinian Salinity Crisis (DREAM) site-specific pre-proposal for riserless drilling on Messinian marginal basins and the related ICDP-IODP amphibious initiative Investigating Miocene Mediterranean-Atlantic gateway exchange (IMMAGE). Scientific networking has begun to establish a broad cross-disciplinary research community embracing geology, geophysics, geochemistry, microbiology, and paleoclimatology. Formal networking activities represent an opportunity for the scientific community to share objectives, data, expertise and tools with industry, since there is considerable interest in oil and gas exploration, and consequent hazards, targeting the Mediterranean's deep salt deposits. With the acronym MEDSALT, we have established two networks working in close cooperation: (1) COST Action CA15103 Uncovering the Mediterranean salt giant (MEDSALT) (https://medsalt.eu/) is a 4-year network established in May 2016 comprising scientific institutions from 28 states. This COST Action will provide an opportunity to develop further our knowledge of salt rock formation, addressing four overarching scientific questions: a) What are the causes, timing and emplacement mechanisms of the

  12. Possible Role of Green Chemistry in Addressing Environmental Plastic Debris: Scientific, Economic and Policy Issues

    NASA Astrophysics Data System (ADS)

    Bayha, K. M.

    2016-02-01

    Plastics have revolutionized modern life, replacing other raw materials in a vast array of products, owing to their ease of molding and shaping as well as their superior recalcitrance to wearing and aging. However, this functional benefit makes plastic one of the most problematic pollutants, since it accumulates as environmental debris for decades and possibly for centuries. Rightfully so, programs addressing plastic debris typically involve efforts to reduce consumption, reuse plastic products, and recycle them when their usefulness is complete. However, some of these options can be problematic for certain applications, as well as in countries that lack efficient municipal solid waste or recycling facilities. The principles of Green Chemistry were developed to help scientists design chemical products that reduce or eliminate the use or generation of hazardous substances. These principles have also been applied to developing sustainable, or greener, polymers for use in consumer plastics. For instance, the EPA's Green Chemistry Program awards the Presidential Green Chemistry Challenge Awards each year, with a large percentage of awards having gone to developments in greener polymers. Many of these advancements involve the development of sustainable bio-based, more degradable, or more recyclable polymers that deliver significant environmental benefits. This presentation addresses what role the development of truly greener polymers might have in addressing environmental plastic debris, in parallel with efforts to reduce, reuse and recycle. The intention is to evaluate the issues posed by traditional polymer types, address the ultimate goals of alternative polymer development, and evaluate research on current alternative polymer technologies, in order to objectively assess their usefulness in addressing environmental plastic debris accumulation. In addition, the scientific, policy and market issues that may be impeding accurate development, evaluation and implementation of

  13. A review of uncertainty research in impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions; the majority of these focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  14. Nature of Science, Scientific Inquiry, and Socio-Scientific Issues Arising from Genetics: A Pathway to Developing a Scientifically Literate Citizenry

    NASA Astrophysics Data System (ADS)

    Lederman, Norman G.; Antink, Allison; Bartos, Stephen

    2014-02-01

    The primary focus of this article is to illustrate how teachers can use contemporary socio-scientific issues to teach students about nature of scientific knowledge as well as address the science subject matter embedded in the issues. The article provides an initial discussion about the various aspects of nature of scientific knowledge that are addressed. It is important to remember that the aspects of nature of scientific knowledge are not considered to be a comprehensive list, but rather a set of important ideas for adolescent students to learn about scientific knowledge. These ideas have been advocated as important for secondary students by numerous reform documents internationally. Then, several examples are used to illustrate how genetically based socio-scientific issues can be used by teachers to improve students' understandings of the discussed aspects of nature of scientific knowledge.

  15. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the most important tasks are to analyze how the uncertainties arise and develop, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and the techniques of verification and validation, a framework for quantification of uncertainty (QU) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  16. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  17. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...

  18. Using Websites to Convey Scientific Uncertainties for Volcanic Processes and Potential Hazards

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Lowenstern, J. B.; Hill, D. P.

    2005-12-01

    The Yellowstone Volcano Observatory (YVO) and Long Valley Observatory (LVO) websites have greatly increased the public's awareness and access to information about scientific uncertainties for volcanic processes by communicating at multiple levels of understanding and varied levels of detail. Our websites serve a broad audience ranging from visitors unaware of the calderas, to lay volcano enthusiasts, to scientists, federal agencies, and emergency managers. Both Yellowstone and Long Valley are highly visited tourist attractions with histories of caldera-forming eruptions large enough to alter global climate temporarily. Although it is much more likely that future activity would be on a small scale at either volcano, we are constantly posed questions about low-probability, high-impact events such as the caldera-forming eruption depicted in the recent BBC/Discovery movie, "Supervolcano". YVO and LVO website objectives include: providing monitoring data, explaining the likelihood of future events, summarizing research results, helping media provide reliable information, and expanding on information presented by the media. Providing detailed current information is a crucial website component as the public often searches online to augment information gained from often cryptic pronouncements by the media. In May 2005, for example, YVO saw an order of magnitude increase in page requests on the day MSNBC ran the misleading headline, "Yellowstone eruption threat high." The headline referred not to current events but a general rating of Yellowstone as one of 37 "high threat" volcanoes in the USGS National Volcano Early Warning System report. As websites become a more dominant source of information, we continuously revise our communication plans to make the most of this evolving medium. Because the internet gives equal access to all information providers, we find ourselves competing with various "doomsday" websites that sensationalize and distort the current understanding of

  19. Trapped between two tails: trading off scientific uncertainties via climate targets

    NASA Astrophysics Data System (ADS)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  20. Trapped Between Two Tails: Trading Off Scientific Uncertainties via Climate Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemoine, Derek M.; McJeon, Haewon C.

    2013-08-20

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 ppm and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  1. Addressing contrasting cognitive models in scientific collaboration

    NASA Astrophysics Data System (ADS)

    Diviacco, P.

    2012-04-01

    While the social aspects of scientific communities and their internal dynamics are starting to be recognized and acknowledged in the everyday lives of scientists, it is rather difficult for them to find tools that could support their activities consistently with this perspective. Issues span from gathering researchers to mutual awareness, from information sharing to building meaning, with the last being particularly critical in research fields such as the geosciences, which deal with the reconstruction of unique, often non-reproducible, and contingent processes. Reasoning here is, in fact, mainly abductive, allowing multiple and concurrent explanations for the same phenomenon to coexist. Scientists favor one hypothesis over another not only on strictly logical but also on sociological grounds. Following a vision, scientists tend to evolve and isolate themselves from other scientists, creating communities characterized by different cognitive models, so that after some time these become incompatible and scientists stop understanding each other. We address these problems as a communication issue, so that the classic distinction into three levels (syntactic, semantic and pragmatic) can be used. At the syntactic level, we highlight non-technical obstacles that condition interoperability and data availability and transparency. At the semantic level, possible incompatibilities of cognitive models are particularly evident, so that, using ontologies, cross-domain reconciliation should be applied. This is a very difficult task to perform since the projection of knowledge by scientists, in the designated community, is political and thus can create a lot of tension. The strategy we propose to overcome these issues pertains to pragmatics, in the sense that it is intended to acknowledge the cultural and personal factors each partner brings into the collaboration and is based on the idea that meaning should remain a flexible and contingent representation of possibly divergent views

  2. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information, and the approach to quantifying uncertainty in metrology, are addressed. The paper then discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurement. The basics of the ISO Guide to the Expression of Uncertainty in Measurement, as well as of characteristic limits according to ISO 11929, are described, and the need for a revision of the latter standard is explained. Copyright © 2015 Elsevier Ltd. All rights reserved.
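
    The GUM's central recipe for uncorrelated input quantities is the root-sum-of-squares combination of sensitivity-weighted standard uncertainties. The sketch below is a generic illustration of that rule, not code from the paper; the function name and the numbers are ours:

```python
import math

def combined_standard_uncertainty(contributions):
    """GUM law of propagation for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)^2), where c_i is the sensitivity
    coefficient and u_i the standard uncertainty of input i."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# Hypothetical model y = f(x1, x2) with sensitivities 2.0 and 0.5
u_c = combined_standard_uncertainty([(2.0, 0.1), (0.5, 0.4)])
U = 2 * u_c  # expanded uncertainty with coverage factor k = 2 (~95 % coverage)
```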

  3. Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2008-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance like so, 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags), to provide a path to the nominal model parameter file. And, as shown in [1
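
    A mini-language of this shape can be recognized with a short pattern-based parser. The sketch below is a hypothetical reconstruction from the examples quoted in the abstract; the regex, the function name, and the numeric interpretation of the o (order-of-magnitude) suffix are our assumptions, not the DSL's actual grammar:

```python
import re

# Assumed pattern: value +/- tolerance, optional % or o suffix,
# optional quoted tag used for error propagation.
TOL_RE = re.compile(
    r"(?P<nominal>[-+]?\d*\.?\d+)\s*\+/-\s*"
    r"(?P<tol>\d*\.?\d+)(?P<unit>[%o]?)"
    r"(?:\s*'(?P<tag>[^']*)')?"
)

def parse_tolerance(text):
    """Parse one tolerance expression; return None if no match."""
    m = TOL_RE.search(text)
    if m is None:
        return None
    nominal = float(m.group("nominal"))
    tol = float(m.group("tol"))
    unit = m.group("unit")
    if unit == "%":        # percent of the nominal value
        half_width = abs(nominal) * tol / 100.0
    elif unit == "o":      # orders of magnitude (our interpretation)
        half_width = abs(nominal) * (10.0 ** tol - 1.0)
    else:                  # absolute tolerance
        half_width = tol
    return {"nominal": nominal, "half_width": half_width, "tag": m.group("tag")}

result = parse_tolerance("0.7 +/- 20% 'T_effective'")
```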

  4. Constructing (un-)certainty: An exploration of journalistic decision-making in the reporting of neuroscience.

    PubMed

    Lehmkuhl, Markus; Peters, Hans Peter

    2016-11-01

    Based on 21 individual case studies, this article inventories the ways journalism deals with scientific uncertainty. The study identifies the decisions that impact a journalist's perception of a truth claim as unambiguous or ambiguous and the strategies to deal with uncertainty that arise from this perception. Key for understanding journalistic action is the outcome of three evaluations: What is the story about? How shall the story be told? What type of story is it? We reconstructed the strategies to overcome journalistic decision-making uncertainty in those cases in which they perceived scientific contingency as a problem. Journalism deals with uncertainty by way of omission, by contrasting the conflicting messages or by acknowledging the problem via the structure or language. One finding deserves particular mention: The lack of focus on scientific uncertainty is not only a problem of how journalists perceive and communicate but also a problem of how science communicates. © The Author(s) 2016.

  5. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
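
    The DALY aggregation described above follows the standard decomposition DALY = YLD + YLL (years lived with disability plus years of life lost). The sketch below uses purely illustrative numbers and omits age weighting and discounting, refinements the full methodology may include:

```python
def dalys(cases, duration_years, disability_weight,
          deaths=0, life_years_lost_per_death=0.0):
    """Simplified DALY = YLD + YLL.
    YLD = cases * duration * disability weight (severity in [0, 1]);
    YLL = deaths * standard expected life years lost per death."""
    yld = cases * duration_years * disability_weight
    yll = deaths * life_years_lost_per_death
    return yld + yll

# Illustrative numbers only, not from the paper
burden = dalys(cases=10_000, duration_years=2.0, disability_weight=0.1,
               deaths=50, life_years_lost_per_death=30.0)
```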

  6. Uncertainty and risk in wildland fire management: A review

    Treesearch

    Matthew P. Thompson; Dave E. Calkin

    2011-01-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to...

  7. An Extended Chemical Plant Environmental Protection Game on Addressing Uncertainties of Human Adversaries.

    PubMed

    Zhu, Zhengqiu; Chen, Bin; Qiu, Sihang; Wang, Rongxiao; Chen, Feiran; Wang, Yiping; Qiu, Xiaogang

    2018-03-27

    Chemical production activities in industrial districts pose great threats to the surrounding atmospheric environment and human health. Developing appropriate and intelligent pollution-control strategies that let the management team monitor chemical production processes is therefore essential in a chemical industrial district. The literature shows that playing a chemical plant environmental protection (CPEP) game can force chemical plants to be more compliant with environmental protection authorities and reduce the potential risks of hazardous gas dispersion accidents. However, results in the current literature rely on several idealized assumptions that rarely hold in real-world domains, especially when dealing with human adversaries. To address bounded rationality and limited observability in human cognition, the CPEP game is extended to generate robust schedules of inspection resources for inspection agencies. The present paper makes the following contributions: (i) the CPEP model is extended by taking the observation frequency and observation cost of adversaries into account, and thus better reflects industrial reality; (ii) uncertainties such as attackers with bounded rationality, attackers with limited observation, and incomplete information (i.e., the attacker's parameters) are integrated into the extended CPEP model; (iii) learning curve theory is employed to determine the attacker's observability in the game solver. Results of the case study imply that this work improves the decision-making process for environmental protection authorities in practical settings by bringing more rewards to the inspection agencies and by acquiring more compliance from chemical plants.

  8. An Extended Chemical Plant Environmental Protection Game on Addressing Uncertainties of Human Adversaries

    PubMed Central

    Wang, Rongxiao; Chen, Feiran; Wang, Yiping; Qiu, Xiaogang

    2018-01-01

    Chemical production activities in industrial districts pose great threats to the surrounding atmospheric environment and human health. Developing appropriate and intelligent pollution-control strategies that let the management team monitor chemical production processes is therefore essential in a chemical industrial district. The literature shows that playing a chemical plant environmental protection (CPEP) game can force chemical plants to be more compliant with environmental protection authorities and reduce the potential risks of hazardous gas dispersion accidents. However, results in the current literature rely on several idealized assumptions that rarely hold in real-world domains, especially when dealing with human adversaries. To address bounded rationality and limited observability in human cognition, the CPEP game is extended to generate robust schedules of inspection resources for inspection agencies. The present paper makes the following contributions: (i) the CPEP model is extended by taking the observation frequency and observation cost of adversaries into account, and thus better reflects industrial reality; (ii) uncertainties such as attackers with bounded rationality, attackers with limited observation, and incomplete information (i.e., the attacker’s parameters) are integrated into the extended CPEP model; (iii) learning curve theory is employed to determine the attacker’s observability in the game solver. Results of the case study imply that this work improves the decision-making process for environmental protection authorities in practical settings by bringing more rewards to the inspection agencies and by acquiring more compliance from chemical plants. PMID:29584679

  9. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    PubMed

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  10. Believable Statements of Uncertainty and Believable Science

    PubMed Central

    Lindstrom, Richard M.

    2017-01-01

    Nearly fifty years ago, two landmark papers appeared that should have cured the problem of ambiguous uncertainty statements in published data. Eisenhart’s paper in Science called for statistically meaningful numbers, and Currie’s Analytical Chemistry paper revealed the wide range in common definitions of detection limit. Confusion and worse can result when uncertainties are misinterpreted or ignored. The recent stories of cold fusion, variable radioactive decay, and piezonuclear reactions provide cautionary examples in which prior probability has been neglected. We show examples from our laboratory and others to illustrate the fact that uncertainty depends on both statistical and scientific judgment. PMID:28584391

  11. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    NASA Astrophysics Data System (ADS)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.

  12. Addressing Criticisms of Large-Scale Marine Protected Areas.

    PubMed

    O'Leary, Bethan C; Ban, Natalie C; Fernandez, Miriam; Friedlander, Alan M; García-Borboroglu, Pablo; Golbuu, Yimnang; Guidetti, Paolo; Harris, Jean M; Hawkins, Julie P; Langlois, Tim; McCauley, Douglas J; Pikitch, Ellen K; Richmond, Robert H; Roberts, Callum M

    2018-05-01

    Designated large-scale marine protected areas (LSMPAs, 100,000 or more square kilometers) constitute over two-thirds of the approximately 6.6% of the ocean and approximately 14.5% of the exclusive economic zones within marine protected areas. Although LSMPAs have received support among scientists and conservation bodies for wilderness protection, regional ecological connectivity, and improving resilience to climate change, there are also concerns. We identified 10 common criticisms of LSMPAs along three themes: (1) placement, governance, and management; (2) political expediency; and (3) social-ecological value and cost. Through critical evaluation of scientific evidence, we discuss the value, achievements, challenges, and potential of LSMPAs in these arenas. We conclude that although some criticisms are valid and need addressing, none pertain exclusively to LSMPAs, and many involve challenges ubiquitous in management. We argue that LSMPAs are an important component of a diversified management portfolio that tempers potential losses, hedges against uncertainty, and enhances the probability of achieving sustainably managed oceans.

  13. Addressing Criticisms of Large-Scale Marine Protected Areas

    PubMed Central

    Ban, Natalie C; Fernandez, Miriam; Friedlander, Alan M; García-Borboroglu, Pablo; Golbuu, Yimnang; Guidetti, Paolo; Harris, Jean M; Hawkins, Julie P; Langlois, Tim; McCauley, Douglas J; Pikitch, Ellen K; Richmond, Robert H; Roberts, Callum M

    2018-01-01

    Abstract Designated large-scale marine protected areas (LSMPAs, 100,000 or more square kilometers) constitute over two-thirds of the approximately 6.6% of the ocean and approximately 14.5% of the exclusive economic zones within marine protected areas. Although LSMPAs have received support among scientists and conservation bodies for wilderness protection, regional ecological connectivity, and improving resilience to climate change, there are also concerns. We identified 10 common criticisms of LSMPAs along three themes: (1) placement, governance, and management; (2) political expediency; and (3) social–ecological value and cost. Through critical evaluation of scientific evidence, we discuss the value, achievements, challenges, and potential of LSMPAs in these arenas. We conclude that although some criticisms are valid and need addressing, none pertain exclusively to LSMPAs, and many involve challenges ubiquitous in management. We argue that LSMPAs are an important component of a diversified management portfolio that tempers potential losses, hedges against uncertainty, and enhances the probability of achieving sustainably managed oceans. PMID:29731514

  14. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation in burn-up calculations. One proposed approach is the Hybrid Method, in which uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence only their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures has not been carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.
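
    The one-group collapse that the Hybrid Method relies on is conventionally a flux-weighted average of the multi-group cross sections. The sketch below shows that standard collapse; it is not code from the paper, and the group values and fluxes are illustrative:

```python
def collapse_to_one_group(sigma_g, flux_g):
    """Flux-weighted collapse of multi-group cross sections:
    sigma_1g = sum_g(phi_g * sigma_g) / sum_g(phi_g)."""
    assert len(sigma_g) == len(flux_g), "one flux per energy group"
    weighted = sum(phi * sigma for phi, sigma in zip(flux_g, sigma_g))
    return weighted / sum(flux_g)

# Three-group toy example: cross sections (barns) and group fluxes
sigma_1g = collapse_to_one_group([10.0, 2.0, 0.5], [1.0, 3.0, 6.0])
```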

  15. Adapting to Uncertainty: Comparing Methodological Approaches to Climate Adaptation and Mitigation Policy

    NASA Astrophysics Data System (ADS)

    Huda, J.; Kauneckis, D. L.

    2013-12-01

    Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.

  16. Nanomedicines: addressing the scientific and regulatory gap.

    PubMed

    Tinkle, Sally; McNeil, Scott E; Mühlebach, Stefan; Bawa, Raj; Borchard, Gerrit; Barenholz, Yechezkel Chezy; Tamarkin, Lawrence; Desai, Neil

    2014-04-01

    Nanomedicine is the application of nanotechnology to the discipline of medicine: the use of nanoscale materials for the diagnosis, monitoring, control, prevention, and treatment of disease. Nanomedicine holds tremendous promise to revolutionize medicine across disciplines and specialties, but this promise has yet to be fully realized. Beyond the typical complications associated with drug development, the fundamentally different and novel physical and chemical properties of some nanomaterials compared to materials on a larger scale (i.e., their bulk counterparts) can create a unique set of opportunities as well as safety concerns, which have only begun to be explored. As the research community continues to investigate nanomedicines, their efficacy, and the associated safety issues, it is critical to work to close the scientific and regulatory gaps to assure that nanomedicine drives the next generation of biomedical innovation. © 2014 New York Academy of Sciences.

  17. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  18. Nature of Science, Scientific Inquiry, and Socio-Scientific Issues Arising from Genetics: A Pathway to Developing a Scientifically Literate Citizenry

    ERIC Educational Resources Information Center

    Lederman, Norman G.; Antink, Allison; Bartos, Stephen

    2014-01-01

    The primary focus of this article is to illustrate how teachers can use contemporary socio-scientific issues to teach students about nature of scientific knowledge as well as address the science subject matter embedded in the issues. The article provides an initial discussion about the various aspects of nature of scientific knowledge that are…

  19. Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates

    EPA Science Inventory

    Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...

  20. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Díez, C.J., E-mail: cj.diez@upm.es; Cabellos, O.; Instituto de Fusión Nuclear, Universidad Politécnica de Madrid, 28006 Madrid

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation in burn-up calculations. One proposed approach is the Hybrid Method, in which uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence only their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures has not been carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.

  1. Consistency of nature of science views across scientific and socio-scientific contexts

    NASA Astrophysics Data System (ADS)

    Khishfe, Rola

    2017-03-01

    The purpose of the investigation was to examine the consistency of NOS views among high school students across different scientific and socio-scientific contexts. A total of 261 high school students from eight different schools in Lebanon participated. The schools were selected based on different geographical areas in Lebanon and the principals' consent to participate in the study. The investigation used a qualitative design to compare the responses of students across different contexts/topics. All participants completed a five-item open-ended questionnaire covering five topics in scientific and socio-scientific contexts. The items of the questionnaire addressed the empirical, tentative, and subjective aspects of NOS. Quantitative and qualitative analyses were conducted to answer the research questions. Results showed that participants' views of the emphasised NOS aspects were mostly inconsistent. In addition, there was variance in participants' views of NOS between scientific and socio-scientific issues. The discussion of the results relates to differential developmental progression, contextual factors, the social constructivist perspective, different domains of knowledge, and students' individual differences.

  2. Cloud Feedbacks on Climate: A Challenging Scientific Problem

    ScienceCinema

    Norris, Joe

    2017-12-22

    One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. Contrastingly, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.

  3. Meeting Materials for the December 4-6, 2013 Scientific Advisory Panel

    EPA Pesticide Factsheets

    Meeting Materials for the December 4-6, 2013 Scientific Advisory Panel on Scientific Uncertainties Associated with Corn Rootworm Resistance Monitoring for Bt Corn Plant Incorporated Protectants (PIPs)

  4. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related
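
    As a toy illustration of the probabilistic machinery discussed above, the sketch below implements a minimal random-walk Metropolis sampler drawing from a posterior known only up to a normalizing constant. This is a generic textbook device, not the author's method, and the one-parameter Gaussian "inverse problem" is purely illustrative:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler for a scalar parameter.
    log_post: log posterior density up to an additive constant."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step)      # symmetric proposal
        lp_new = log_post(x_new)
        # accept with probability min(1, exp(lp_new - lp))
        if rng.random() < math.exp(min(0.0, lp_new - lp)):
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Toy posterior: Gaussian with mean 1.0 and standard deviation 0.3
samples = metropolis(lambda x: -0.5 * ((x - 1.0) / 0.3) ** 2,
                     x0=0.0, n_steps=5000)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

The spread of the retained samples is then a direct, if model-dependent, uncertainty estimate for the parameter.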

  5. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.

  6. Scientific millenarianism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinberg, A.M.

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO{sub 2} warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper.

  7. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of
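
    The root mean square error propagation named in the abstract combines independent percentage uncertainties as the square root of the sum of their squares. A minimal sketch of that combination; the category values below are illustrative, not the paper's:

```python
import math

def cumulative_uncertainty(category_uncertainties_pct):
    """Root mean square error propagation: combine the probable
    uncertainty (+/- %) of independent procedural categories into
    one cumulative probable uncertainty."""
    return math.sqrt(sum(u ** 2 for u in category_uncertainties_pct))

# Illustrative mid-range values for the four procedural categories:
# streamflow, sample collection, preservation/storage, lab analysis
total_pct = cumulative_uncertainty([10.0, 20.0, 8.0, 12.0])
```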

  8. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy.
The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
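The Bayesian inference and Markov Chain Monte Carlo methods covered in the workshop lectures can be illustrated with a minimal random-walk Metropolis sampler. The target posterior and tuning values here are toy assumptions for illustration, not anything from the workshop's software packages.

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D parameter.
    Accepts a proposal with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, samples = x0, []
    lp = log_post(x)
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)       # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop             # accept
        samples.append(x)                      # else keep current state
    return samples

# Toy target: a standard normal posterior for one climate parameter.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(samples) / len(samples)
```

In practice one would discard a burn-in period and monitor convergence diagnostics; dedicated packages (as used in the workshop exercises) handle this far more robustly than this sketch.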

  9. The future of human embryonic stem cell research: addressing ethical conflict with responsible scientific research.

    PubMed

    Gilbert, David M

    2004-05-01

Embryonic stem (ES) cells have almost unlimited regenerative capacity and can potentially generate any body tissue. Hence they hold great promise for the cure of degenerative human diseases. But their derivation and the potential for misuse have raised a number of ethical issues. These ethical issues threaten to paralyze public funding for ES cell research, leaving experimentation in the hands of the private sector and precluding the public's ability to monitor practices, research alternatives, and effectively address the very ethical issues that are cause for concern in the first place. With new technology being inevitable, and the potential for abuse high, government must stay involved if the public is to play a role in shaping the direction of research. In this essay, I will define levels of ethical conflict that can be delineated by the anticipated advances in technology. From the urgent need to derive new ES cell lines with existing technology, to the most far-reaching goal of deriving genetically identical tissues from an adult patient's cells, technology-specific ethical dilemmas can be defined and addressed. This staged approach provides a solid ethical framework for moving forward with ES cell research. Moreover, by anticipating the moral conflicts to come, one can predict the types of scientific advances that could overcome these conflicts, and appropriately direct federal funding toward these goals to offset potentially less responsible research directives that will inevitably go forward via private or foreign funding.

  10. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.

  11. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna

Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender's beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker's payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.

  12. Information theoretic quantification of diagnostic uncertainty.

    PubMed

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
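The Bayes' rule calculation the essay builds on can be written out directly; the sensitivity, specificity, and pre-test probability below are hypothetical. The surprisal function illustrates the information-theoretic view: the less probable a result, the more "surprising" (in bits) it is.

```python
import math

def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' rule for a dichotomous diagnostic test: probability of
    disease given a positive (or negative) result."""
    if positive:
        tp = sensitivity * pretest                 # true positives
        fp = (1 - specificity) * (1 - pretest)     # false positives
        return tp / (tp + fp)
    fn = (1 - sensitivity) * pretest               # false negatives
    tn = specificity * (1 - pretest)               # true negatives
    return fn / (fn + tn)

def surprisal_bits(p):
    """Information content (self-information) of an outcome with
    probability p, in bits."""
    return -math.log2(p)

# Hypothetical test: sensitivity 0.9, specificity 0.8, pre-test prob. 0.2.
p_pos = post_test_probability(0.2, 0.9, 0.8, positive=True)
```

Replacing the point pre-test estimate with a range and propagating it through `post_test_probability` yields a range of post-test probabilities, which is the uncertainty the essay places in an information theoretic framework.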

  13. Uncertainty prediction for PUB

    NASA Astrophysics Data System (ADS)

    Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.

    2003-04-01

IAHS’ initiative of Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models should be grasped using practical strategies of gauging like the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of Potiribu Project, which is a NCE layout at representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables with changing likelihood surfaces of experiments using Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates a multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) process. In this way, STS addresses uncertainty-bounds of model parameters, into an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidences from Potiribu NCE layout show novel pathways of uncertainty prediction under a PUB perspective in representative basins of world biomes.

  14. Encouraging Uncertainty in the "Scientific Method": Promoting Understanding in the Processes of Science with Preservice Teachers

    ERIC Educational Resources Information Center

    Melville, Wayne; Bartley, Anthony; Fazio, Xavier

    2012-01-01

    Teachers' feelings of uncertainty are an overlooked, though crucial, condition necessary for the promotion of educational change. This article investigates the feelings of uncertainty that preservice teachers have toward the conduct of science as inquiry and the extent to which methods courses can confront and embrace those uncertainties. Our work…

  15. Sources Sought for Innovative Scientific Instrumentation for Scientific Lunar Rovers

    NASA Technical Reports Server (NTRS)

    Meyer, C.

    1993-01-01

Lunar rovers should be designed as integrated scientific measurement systems that address scientific goals as their main objective. Scientific goals for lunar rovers are presented. Teleoperated robotic field geologists will allow the science team to make discoveries using a wide range of sensory data collected by electronic 'eyes' and sophisticated scientific instrumentation. Rovers need to operate in geologically interesting terrain (rock outcrops) and to identify and closely examine interesting rock samples. Enough flight-ready instruments are available to fly on the first mission, but additional instrument development based on emerging technology is desirable. Various instruments that need to be developed for later missions are described.

  16. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainty from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop.
The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix

  17. Communicating uncertainties in earth sciences in view of user needs

    NASA Astrophysics Data System (ADS)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

Uncertainties are inevitable in all results obtained in the earth sciences, regardless of whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply a "Progressive Disclosure of Information (PDI)" from non-technical information through more specialised information, according to the user needs. Generalized information is generally directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences to give insight into underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allow probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions due to

  18. One Strategy for Reducing Uncertainty in Climate Change Communications

    NASA Astrophysics Data System (ADS)

    Romm, J.

    2011-12-01

Future impacts of climate change are invariably presented with a very wide range of impacts reflecting two different sets of uncertainties. The first concerns our uncertainty about precisely how much greenhouse gas emissions humanity will emit into the atmosphere. The second concerns our uncertainty about precisely what impact those emissions will have on the climate. By failing to distinguish between these two types of uncertainties, climate scientists have not clearly explained to the public and policymakers what the scientific literature suggests is likely to happen if we don't substantially alter our current emissions path. Indeed, much of climate communications has been built around describing the range of impacts from emissions paths that are increasingly implausible given political and technological constraints, such as stabilization at 450 or 550 parts per million of atmospheric carbon dioxide. For the past decade, human emissions of greenhouse gases have trended near the worst-case scenarios of the Intergovernmental Panel on Climate Change, emissions paths that reach 800 ppm or even 1000 ppm. The current policies of the two biggest emitters, the United States and China, coupled with the ongoing failure of international negotiations to come to an agreement on restricting emissions, suggest that recent trends will continue for the foreseeable future. This in turn suggests that greater clarity in climate change communications could be achieved by more clearly explaining to the public what range of impacts the scientific literature suggests for our current high emissions path. This also suggests that more focus should be given in the scientific literature to better constraining the range of impacts from the high emissions scenarios.

  19. Cross-cultural perspectives of scientific misconduct.

    PubMed

    Momen, Hooman; Gollogly, Laragh

    2007-09-01

    The increasing globalization of scientific research lends urgency to the need for international agreement on the concepts of scientific misconduct. Universal spiritual and moral principles on which ethical standards are generally based indicate that it is possible to reach international agreement on the ethical principles underlying good scientific practice. Concordance on an operational definition of scientific misconduct that would allow independent observers to agree which behaviour constitutes misconduct is more problematic. Defining scientific misconduct to be universally recognized and universally sanctioned means addressing the broader question of ensuring that research is not only well-designed - and addresses a real need for better evidence - but that it is ethically conducted in different cultures. An instrument is needed to ensure that uneven ethical standards do not create unnecessary obstacles to research, particularly in developing countries.

  20. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2017-01-30

In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those

  1. Ensembles vs. information theory: supporting science under uncertainty

    NASA Astrophysics Data System (ADS)

    Nearing, Grey S.; Gupta, Hoshin V.

    2018-05-01

    Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.

  2. Science Teachers' Use of Mass Media to Address Socio-Scientific and Sustainability Issues

    ERIC Educational Resources Information Center

    Klosterman, Michelle L.; Sadler, Troy D.; Brown, Julie

    2012-01-01

    The currency, relevancy and changing nature of science makes it a natural topic of focus for mass media outlets. Science teachers and students can capitalize on this wealth of scientific information to explore socio-scientific and sustainability issues; however, without a lens on how those media are created and how representations of science are…

  3. Uncertainty and instream flow standards

    USGS Publications Warehouse

    Castleberry, D.; Cech, J.; Erman, D.; Hankin, D.; Healey, M.; Kondolf, M.; Mengel, M.; Mohr, M.; Moyle, P.; Nielsen, Jennifer L.; Speed, T.; Williams, J.

    1996-01-01

Several years ago, Science published an important essay (Ludwig et al. 1993) on the need to confront the scientific uncertainty associated with managing natural resources. The essay did not discuss instream flow standards explicitly, but its arguments apply. At an April 1995 workshop in Davis, California, all 12 participants agreed that currently no scientifically defensible method exists for defining the instream flows needed to protect particular species of fish or aquatic ecosystems (Williams, in press). We also agreed that acknowledging this fact is an essential step in dealing rationally and effectively with the problem. Practical necessity and the protection of fishery resources require that new instream flow standards be established and that existing standards be revised. However, if standards cannot be defined scientifically, how can this be done? We join others in recommending the approach of adaptive management. Applied to instream flow standards, this approach involves at least three elements.

  4. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  5. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis including the once-through cycle (OT), the DUPIC cycle, the MOX cycle and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much by using different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty range of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have lower FCC over OT. The reprocessing cost was also found to have a major impact on FCC.
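A minimal sketch of how the time value of money enters such a fuel cycle cost comparison, assuming hypothetical cash flows (the cost figures, timings, and cycle labels below are illustrative placeholders, not values from the paper):

```python
def present_value(cash_flows, discount_rate):
    """Discount a list of (year, cost) cash flows to present value."""
    return sum(c / (1 + discount_rate) ** t for t, c in cash_flows)

# Hypothetical cash flows (arbitrary cost units, illustrative only):
# front-loaded vs. back-loaded spending profiles for two cycles.
once_through = [(0, 1000.0), (5, 400.0)]   # e.g., fuel purchase, later disposal
closed_cycle = [(0, 700.0), (3, 900.0)]    # e.g., fuel, later reprocessing

for rate in (0.0, 0.05, 0.10):
    pv_ot = present_value(once_through, rate)
    pv_cc = present_value(closed_cycle, rate)
    print(f"rate={rate:.2f}  OT={pv_ot:8.1f}  closed={pv_cc:8.1f}  "
          f"ratio={pv_cc / pv_ot:.3f}")
```

Because discounting shrinks later costs more than earlier ones, the ratio between two cycles shifts with the discount rate; the paper's observation that relative FCC ratios were insensitive to the modeling approach is an empirical finding, not a general property.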

  6. Uncertainty in weather and climate prediction

    PubMed Central

    Slingo, Julia; Palmer, Tim

    2011-01-01

    Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation. PMID:22042896

  7. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: Application to a smoking cessation trial

    PubMed Central

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald

    2014-01-01

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
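The idea of drawing imputations from a distribution of models rather than a single model can be sketched for a binary outcome. The log-odds offsets ("deltas") standing in for mechanism assumptions are hypothetical, and the pooling shown is a simple Rubin-style combination (mean estimate plus between-imputation variance), not the paper's nested multiple imputation rules.

```python
import math
import random
import statistics

def impute_and_pool(observed, n_missing, deltas, n_imp=20, seed=1):
    """Impute missing binary values under a distribution of assumed
    mechanisms (log-odds offsets 'deltas'), then pool the estimates."""
    rng = random.Random(seed)
    p_obs = sum(observed) / len(observed)
    estimates = []
    for _ in range(n_imp):
        delta = rng.choice(deltas)  # draw one mechanism assumption
        logit = math.log(p_obs / (1.0 - p_obs)) + delta
        p_mis = 1.0 / (1.0 + math.exp(-logit))
        imputed = [1 if rng.random() < p_mis else 0
                   for _ in range(n_missing)]
        est = (sum(observed) + sum(imputed)) / (len(observed) + n_missing)
        estimates.append(est)
    return statistics.mean(estimates), statistics.variance(estimates)

# Hypothetical data: 30% abstinent among 100 observed, 50 values missing,
# with mechanisms ranging from ignorable (0.0) to strongly nonignorable (1.0).
pooled, between_var = impute_and_pool([1] * 30 + [0] * 70, 50,
                                      deltas=[0.0, 0.5, 1.0])
```

The between-imputation variance now reflects both ordinary sampling variability and disagreement among the assumed mechanisms, which is how incorporating mechanism uncertainty widens the intervals.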

  8. Uncertainty in hydrological signatures for gauged and ungauged catchments

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
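The Monte Carlo sampling of feasible rating curves can be sketched as follows. The power-law curve form and parameter ranges are illustrative assumptions, and the signature computed is simply mean flow rather than the study's full signature set.

```python
import random

def rating_curve(h, a, b, h0=0.0):
    """Power-law rating curve: discharge from stage (a common form,
    used here as a hypothetical example)."""
    return a * max(h - h0, 0.0) ** b

def signature_distribution(stages, n_curves=500, seed=42):
    """Sample feasible rating-curve parameters; for each sampled curve,
    compute a discharge series and the mean-flow signature."""
    rng = random.Random(seed)
    signatures = []
    for _ in range(n_curves):
        a = rng.uniform(4.0, 6.0)   # feasible ranges are illustrative only
        b = rng.uniform(1.4, 1.8)
        discharge = [rating_curve(h, a, b) for h in stages]
        signatures.append(sum(discharge) / len(discharge))
    return signatures

stages = [0.2, 0.5, 1.0, 1.5, 0.8]  # synthetic stage record (m)
sigs = signature_distribution(stages)
```

Sorting `sigs` and reading off percentiles gives the gauged signature uncertainty distribution; feeding those distributions, rather than point values, into the regionalization is what prevents the overconditioning the authors warn about.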

  9. Quantifying uncertainties in precipitation measurement

    NASA Astrophysics Data System (ADS)

    Chen, H. Z. D.

    2017-12-01

The scientific community has a long history of using precipitation data in climate model design. However, the precipitation record and its models contain more uncertainty than their temperature counterparts. The literature shows that precipitation measurements are highly influenced by their surrounding environment, yet weather stations are traditionally situated in open areas and subject to various limitations. This restriction limits the ability of the scientific community to fully close the loop on the water cycle. Horizontal redistribution has been shown to be a major factor influencing precipitation measurements, and efforts have been made to reduce its effect on the monitoring apparatus. However, the factors contributing to this uncertainty are numerous and difficult to fully capture, so the noise in precipitation data remains high. This study aims to quantify the uncertainties in precipitation data by measuring the contribution of horizontal redistribution directly. Precipitation will be measured at two heights, with one gauge directly shadowing the other: the upper collection represents traditional precipitation data, whereas the lower measurement sums the overall error term at the given location. Measurements will be recorded and correlated with the nearest available wind measurements to quantify wind's impact on the traditional precipitation record. Collections at different locations will also be compared to determine whether this phenomenon is location specific or whether a general trend can be derived. We aim to demonstrate a new way to isolate the noise component in traditional precipitation data via empirical measurements and, by doing so, to improve the overall quality of the historic precipitation record and provide more accurate information for the design and calibration of large-scale climate models.

  10. "So a Frackademic and an Environmentalist Walk into an Error Bar...": Communicating Uncertainty Amidst Controversy

    NASA Astrophysics Data System (ADS)

    Kroepsch, A.

    2013-12-01

    In striving to separate 'signal' from 'noise' in the public discourse, we have experimented with literary devices (metaphor and narrative), pedagogical tools (the 'what we know, what we don't know, and what we hope to learn' format), journalistic practices (the humanizing profile), and, perhaps most importantly, disarming delivery techniques (humor). In describing these methods, and their effectiveness at addressing scientific uncertainty, the author will be sure to acknowledge the uncertainties inherent therein.

  11. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, E; Sisterson, Douglas

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. In addition, the ARM Facility provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements; this study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement
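
    The report proposes a total uncertainty built from instrument, field, and retrieval terms but does not prescribe the combination rule. One common convention, shown below as a sketch only, is to combine independent 1-sigma components in quadrature (root-sum-square); correlated errors would require covariance terms.

    ```python
    import math

    def total_uncertainty(u_instrument: float, u_field: float,
                          u_retrieval: float) -> float:
        """Root-sum-square of independent 1-sigma uncertainty components.

        Assumes the three components are uncorrelated; this is one plausible
        convention, not the rule prescribed by the ARM report itself.
        """
        return math.sqrt(u_instrument**2 + u_field**2 + u_retrieval**2)

    # Example: 0.3 (calibration) and 0.4 (environment), no retrieval term.
    u_total = total_uncertainty(0.3, 0.4, 0.0)  # -> 0.5
    ```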

  12. Visualising uncertainty: interpreting quantified geoscientific inversion outputs for a diverse user community.

    NASA Astrophysics Data System (ADS)

    Reading, A. M.; Morse, P. E.; Staal, T.

    2017-12-01

    Geoscientific inversion outputs, such as seismic tomography contour images, are finding increasing use amongst scientific user communities that have limited knowledge of the impact of output parameter uncertainty on subsequent interpretations made from such images. We make use of a newly written computer application which enables seismic tomography images to be displayed in a performant 3D graphics environment. This facilitates the mapping of colour scales to the human visual sensorium for the interactive interpretation of contoured inversion results incorporating parameter uncertainty. Two case examples of seismic tomography inversions or contoured compilations are compared from the southern hemisphere continents of Australia and Antarctica. The Australian example is based on the AuSREM contoured seismic wavespeed model while the Antarctic example is a valuable but less well constrained result. Through adjusting the multiple colour gradients, layer separations, opacity, illumination, shadowing and background effects, we can optimise the insights obtained from the 3D structure in the inversion compilation or result. Importantly, we can also limit the display to show information in a way that is mapped to the uncertainty in the 3D result. Through this practical application, we demonstrate that the uncertainty in the result can be handled through a well-posed mapping of the parameter values to displayed colours in the knowledge of what is perceived visually by a typical human. We found that this approach maximises the chance of a useful tectonic interpretation by a diverse scientific user community. In general, we develop the idea that quantified inversion uncertainty can be used to tailor the way that the output is presented to the analyst for scientific interpretation.

  13. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    PubMed

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
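
    The target-setting idea (keep the probability of exceeding a mortality limit below a stated threshold) can be sketched with a Monte Carlo search. The distributions, rate limit, and search loop below are illustrative assumptions, not the authors' actual framework or data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def safe_target(pop_draws, rate_limit, kill_sd, p_max, n_sim=20_000):
        """Largest planned kill keeping P(mortality > limit) <= p_max.

        pop_draws : samples of population size (biological uncertainty)
        kill_sd   : sd of realized vs. planned kills (outcome uncertainty)
        """
        start = int(pop_draws.mean() * rate_limit)
        for target in range(start, -1, -1):
            pops = rng.choice(pop_draws, n_sim)
            realized = target + rng.normal(0.0, kill_sd, n_sim)
            if np.mean(realized > rate_limit * pops) <= p_max:
                return target
        return 0

    pop_draws = rng.normal(500.0, 80.0, 5_000)  # hypothetical posterior
    target = safe_target(pop_draws, rate_limit=0.06, kill_sd=4.0, p_max=0.05)
    ```

    Note how the safe target falls well below the naive value (mean population times the rate limit), mirroring the paper's finding that honoring uncertainty requires substantial target reductions.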

  14. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    USDA-ARS?s Scientific Manuscript database

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...
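
    Parameter uncertainty of this kind is often propagated by Monte Carlo sampling. The toy model and distributions below are purely illustrative stand-ins, not APLE's actual equations or parameter values.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy P-loss model: loss = runoff * dissolved-P concentration * unit factor.
    # All distributions and units are hypothetical.
    n = 10_000
    runoff = rng.normal(250.0, 40.0, n)        # mm/yr (illustrative)
    conc = rng.lognormal(np.log(0.1), 0.3, n)  # mg/L (illustrative)
    loss = runoff * conc * 0.01

    # Report the prediction as a median with a 95% uncertainty interval
    # rather than a single deterministic number.
    median = np.median(loss)
    lo, hi = np.percentile(loss, [2.5, 97.5])
    ```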

  15. Communication and perception of uncertainty via graphics in disciplinary and interdisciplinary climate change research

    NASA Astrophysics Data System (ADS)

    Lackner, Bettina C.; Kirchengast, Gottfried

    2015-04-01

    Besides written and spoken language, graphical displays play an important role in communicating scientific findings or explaining scientific methods, both within one and between various disciplines. Uncertainties and probabilities are generally difficult to communicate, especially via graphics. Graphics including uncertainty sometimes need detailed written or oral descriptions to be understood. "Good" graphics should ease scientific communication, especially amongst different disciplines. One key objective of the Doctoral Programme "Climate Change: Uncertainties, Thresholds and Coping Strategies" (http://dk-climate-change.uni-graz.at/en/), located at the University of Graz, is to reach a better understanding of climate change uncertainties by bridging research in multiple disciplines, including physical climate sciences, geosciences, systems and sustainability sciences, environmental economics, and climate ethics. This calls for efforts toward the formulation of a "common language", not only in words but also in graphics. This work focuses on two questions: (1) What kinds of uncertainties (e.g., data uncertainty, model uncertainty) are included in the graphics of the recent IPCC reports of all three working groups (WGs), and in what ways do these uncertainties get illustrated? (2) How are these graphically displayed uncertainties perceived by researchers of a similar research discipline and by researchers of different disciplines than the authors of the graphics? To answer the first question, the IPCC graphics including uncertainties are grouped and analyzed with respect to different kinds of uncertainties to filter out the most commonly used types of displays. The graphics will also be analyzed with respect to their WG origin, as we assume that graphics from researchers rooted in, e.g., physical climate sciences and geosciences (mainly IPCC WG 1) differ from those of researchers rooted in, e.g., economics or system sciences (mainly WG 3). In a

  16. Public Perception of Uncertainties Within Climate Change Science.

    PubMed

    Visschers, Vivianne H M

    2018-01-01

    Climate change is a complex, multifaceted problem involving various interacting systems and actors. Therefore, the intensities, locations, and timeframes of the consequences of climate change are hard to predict and cause uncertainties. Relatively little is known about how the public perceives this scientific uncertainty and how this relates to their concern about climate change. In this article, an online survey among 306 Swiss people is reported that investigated whether people differentiate between different types of uncertainty in climate change research. Also examined was the way in which the perception of uncertainty is related to people's concern about climate change, their trust in science, their knowledge about climate change, and their political attitude. The results of a principal component analysis showed that respondents differentiated between perceived ambiguity in climate research, measurement uncertainty, and uncertainty about the future impact of climate change. Using structural equation modeling, it was found that only perceived ambiguity was directly related to concern about climate change, whereas measurement uncertainty and future uncertainty were not. Trust in climate science was strongly associated with each type of uncertainty perception and was indirectly associated with concern about climate change. Also, more knowledge about climate change was related to less strong perceptions of each type of climate science uncertainty. Hence, it is suggested that to increase public concern about climate change, it may be especially important to consider the perceived ambiguity about climate research. Efforts that foster trust in climate science also appear highly worthwhile. © 2017 Society for Risk Analysis.

  17. Lost in Translation: Piloting a Novel Framework to Assess the Challenges in Translating Scientific Uncertainty From Empirical Findings to WHO Policy Statements

    PubMed Central

    Benmarhnia, Tarik; Huang, Jonathan Y.; Jones, Catherine M.

    2017-01-01

    Background: Calls for evidence-informed public health policy, with implicit promises of greater program effectiveness, have intensified recently. The methods to produce such policies are not self-evident, requiring a conciliation of values and norms between policy-makers and evidence producers. In particular, the translation of uncertainty from empirical research findings, particularly issues of statistical variability and generalizability, is a persistent challenge because of the incremental nature of research and the iterative cycle of advancing knowledge and implementation. This paper aims to assess how the concept of uncertainty is considered and acknowledged in World Health Organization (WHO) policy recommendations and guidelines. Methods: We selected four WHO policy statements published between 2008-2013 regarding maternal and child nutrient supplementation, infant feeding, heat action plans, and malaria control to represent topics with a spectrum of available evidence bases. Each of these four statements was analyzed using a novel framework to assess the treatment of statistical variability and generalizability. Results: WHO currently provides substantial guidance on addressing statistical variability through GRADE (Grading of Recommendations Assessment, Development, and Evaluation) ratings for precision and consistency in their guideline documents. Accordingly, our analysis showed that policy-informing questions were addressed by systematic reviews and representations of statistical variability (eg, with numeric confidence intervals). In contrast, the presentation of contextual or "background" evidence regarding etiology or disease burden showed little consideration for this variability. Moreover, generalizability or "indirectness" was uniformly neglected, with little explicit consideration of study settings or subgroups. Conclusion: In this paper, we found that non-uniform treatment of statistical variability and generalizability factors that may

  18. Sensitivity and uncertainty analysis for the annual phosphorus loss estimator model

    USDA-ARS?s Scientific Manuscript database

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  19. 50 CFR 21.23 - Scientific collecting permits.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...

  20. 50 CFR 21.23 - Scientific collecting permits.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...

  1. 50 CFR 21.23 - Scientific collecting permits.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... take, transport, or possess migratory birds, their parts, nests, or eggs for scientific research or... project involved; (4) Name and address of the public, scientific, or educational institution to which all... scientific, or educational institution designated in the permit application within 60 days following the date...

  2. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  3. Progression in Ethical Reasoning When Addressing Socio-Scientific Issues in Biotechnology

    ERIC Educational Resources Information Center

    Berne, Birgitta

    2014-01-01

    This article reports on the outcomes of an intervention in a Swedish school in which the author, a teacher-researcher, sought to develop students' (14-15 years old) ethical reasoning in science through the use of peer discussions about socio-scientific issues. Prior to the student discussions various prompts were used to highlight different…

  4. Soliciting scientific information and beliefs in predictive modeling and adaptive management

    NASA Astrophysics Data System (ADS)

    Glynn, P. D.; Voinov, A. A.; Shapiro, C. D.

    2015-12-01

    Post-normal science requires public engagement and adaptive corrections in addressing issues with high complexity and uncertainty. An adaptive management framework is presented for the improved management of natural resources and environments through a public participation process. The framework solicits the gathering and transformation and/or modeling of scientific information but also explicitly solicits the expression of participant beliefs. Beliefs and information are compared, explicitly discussed for alignments or misalignments, and ultimately melded back together as a "knowledge" basis for making decisions. An effort is made to recognize the human or participant biases that may affect the information base and the potential decisions. In a separate step, an attempt is made to recognize and predict the potential "winners" and "losers" (perceived or real) of any decision or action. These "winners" and "losers" include present human communities with different spatial, demographic or socio-economic characteristics as well as more dispersed or more diffusely characterized regional or global communities. "Winners" and "losers" may also include future human communities as well as communities of other biotic species. As in any adaptive management framework, assessment of predictions, iterative follow-through and adaptation of policies or actions is essential, and commonly very difficult or impossible to achieve. Recognizing beforehand the limits of adaptive management is essential. More generally, knowledge of the behavioral and economic sciences and of ethics and sociology will be key to a successful implementation of this adaptive management framework. Knowledge of biogeophysical processes will also be essential, but by definition of the issues being addressed, will always be incomplete and highly uncertain. The human dimensions of the issues addressed and the participatory processes used carry their own complexities and uncertainties. 
Some ideas and principles are

  5. Mode-of-Action Uncertainty for Dual-Mode Carcinogens:Lower Bounds for Naphthalene-Induced Nasal Tumors in Rats Implied byPBPK and 2-Stage Stochastic Cancer Risk Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogen, K T

    2007-01-30

    "uncertainty" factor <1 appropriate to apply to estimates of naphthalene risk obtained by linear extrapolation under a default genotoxic MOA assumption. This procedure is proposed as a scientifically credible method to address MOA uncertainty for DMOA carcinogens.

  6. Unquestioned answers or unanswered questions: beliefs about science guide responses to uncertainty in climate change risk communication.

    PubMed

    Rabinovich, Anna; Morton, Thomas A

    2012-06-01

    In two experimental studies we investigated the effect of beliefs about the nature and purpose of science (classical vs. Kuhnian models of science) on responses to uncertainty in scientific messages about climate change risk. The results revealed a significant interaction between both measured (Study 1) and manipulated (Study 2) beliefs about science and the level of communicated uncertainty on willingness to act in line with the message. Specifically, messages that communicated high uncertainty were more persuasive for participants who shared an understanding of science as debate than for those who believed that science is a search for absolute truth. In addition, participants who had a concept of science as debate were more motivated by higher (rather than lower) uncertainty in climate change messages. The results suggest that achieving alignment between the general public's beliefs about science and the style of the scientific messages is crucial for successful risk communication in science. Accordingly, rather than uncertainty always undermining the effectiveness of science communication, uncertainty can enhance message effects when it fits the audience's understanding of what science is. © 2012 Society for Risk Analysis.

  7. Identification and evaluation of scientific uncertainties related to fish and aquatic resources in the Colorado River, Grand Canyon - summary and interpretation of an expert-elicitation questionnaire

    USGS Publications Warehouse

    Kennedy, Theodore A.

    2013-01-01

    Identifying areas of scientific uncertainty is a critical step in the adaptive management process (Walters, 1986; Runge, Converse, and Lyons, 2011). To identify key areas of scientific uncertainty regarding biologic resources of importance to the Glen Canyon Dam Adaptive Management Program, the Grand Canyon Monitoring and Research Center (GCMRC) convened Knowledge Assessment Workshops in May and July 2005. One of the products of these workshops was a set of strategic science questions that highlighted key areas of scientific uncertainty. These questions were intended to frame and guide the research and monitoring activities conducted by the GCMRC in subsequent years. Questions were developed collaboratively by scientists and managers. The questions were not all of equal importance or merit—some questions were large scale and others were small scale. Nevertheless, these questions were adopted and have guided the research and monitoring efforts conducted by the GCMRC since 2005. A new round of Knowledge Assessment Workshops was convened by the GCMRC in June and October 2011 and January 2012 to determine whether the research and monitoring activities conducted since 2005 had successfully answered some of the strategic science questions. Oral presentations by scientists highlighting research findings were a centerpiece of all three of the 2011–12 workshops. Each presenter was also asked to provide an answer to the strategic science questions that were specific to the presenter’s research area. One limitation of this approach is that these answers represented the views of the handful of scientists who developed the presentations, and, as such, they did not incorporate other perspectives. Thus, the answers provided by presenters at the Knowledge Assessment Workshops may not have accurately captured the sentiments of the broader group of scientists involved in research and monitoring of the Colorado River in Glen and Grand Canyons. Yet a fundamental ingredient of

  8. The Crossroads between Biology and Mathematics: The Scientific Method as the Basics of Scientific Literacy

    ERIC Educational Resources Information Center

    Karsai, Istvan; Kampis, George

    2010-01-01

    Biology is changing and becoming more quantitative. Research is creating new challenges that need to be addressed in education as well. New educational initiatives focus on combining laboratory procedures with mathematical skills, yet it seems that most curricula center on a single relationship between scientific knowledge and scientific method:…

  9. Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2012-12-01

    What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid

  10. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  11. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
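
    As a toy illustration of a numerical routine that returns an uncertainty alongside its answer (much simpler than the probabilistic-inference treatment the authors develop), plain Monte Carlo integration reports its own standard error:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def mc_integrate(f, n=100_000):
        """Estimate the integral of f over [0, 1] plus a 1-sigma error bar."""
        y = f(rng.random(n))
        return y.mean(), y.std(ddof=1) / np.sqrt(n)

    # The integral of x^2 on [0, 1] is exactly 1/3; the returned error bar
    # tells the caller how far the estimate is likely to be from that value.
    est, err = mc_integrate(lambda x: x**2)
    ```

    Probabilistic numerical methods generalize this pattern: instead of a point answer, the routine returns a quantified statement about its own numerical error, which downstream computations can then propagate.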

  12. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods : algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  13. Optimal infrastructure maintenance scheduling problem under budget uncertainty.

    DOT National Transportation Integrated Search

    2010-05-01

    This research addresses a general class of infrastructure asset management problems. Infrastructure : agencies usually face budget uncertainties that will eventually lead to suboptimal planning if : maintenance decisions are made without taking the u...

  14. Conflicting stories about public scientific controversies: Effects of news convergence and divergence on scientists' credibility.

    PubMed

    Jensen, Jakob D; Hurley, Ryan J

    2012-08-01

    Surveys suggest that approximately one third of news consumers have encountered conflicting reports of the same information. News coverage of science is especially prone to conflict, but how news consumers perceive this situation is currently unknown. College students (N = 242) participated in a lab experiment where they were exposed to news coverage about one of two scientific controversies in the United States: dioxin in sewage sludge or the reintroduction of gray wolves to populated areas. Participants received (a) one news article (control), (b) two news articles that were consistent (convergent), or (c) two news articles that conflicted (divergent). The effects of divergence-induced uncertainty differed by news story. Greater uncertainty was associated with increased scientists' credibility ratings for those reading dioxin regulation articles and decreased scientists' credibility ratings for those reading wolf reintroduction articles. Unlike other manifestations of uncertainty in scientific discourse, conflicting stories seem to generate effects that vary significantly by topic. Consistent with uncertainty management theory, uncertainty is embraced or rejected depending on the situation.

  15. Addressing Risk in the Valuation of Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Hammerstrom, Donald J.; Woodward, James T.

    2017-06-26

    Valuation is a mechanism by which the potential worth of a transaction between two or more parties can be evaluated. Examples include the valuation of transactive energy systems such as electric power systems and building energy systems. Uncertainties can manifest while exercising a valuation methodology in the form of a lack of knowledge, or can be inherently embedded in the valuation process. Uncertainty can also exist in the temporal dimension while planning for long-term growth. This paper discusses risk considerations associated with valuation studies in support of decision-making in the presence of such uncertainties. It is often important to have foresight of uncertain entities that can impact real-world deployments, such as the comparison or ranking of two valuation studies to determine cost-benefit impacts to multiple stakeholders. The research proposes to address this challenge through simulation and sensitivity analyses to support ‘what-if’ analysis of well-defined future scenarios. This paper describes the foundational value of diagrammatic representation techniques such as the unified modeling language for understanding the implications of not addressing some of the risk elements encountered during the valuation process. The paper includes examples from generation resource adequacy assessment studies (e.g. loss of load) to illustrate the principles of risk in valuation.

  16. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    NASA Astrophysics Data System (ADS)

    Datta, D.

    2010-10-01

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of releases of radionuclides from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of risk assessment. The paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is adopted because of the lack of information on the parameters of the corresponding models. Soft computing in this domain chiefly means using fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
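    The triangular membership functions mentioned in the abstract are simple to state in code. A hedged sketch follows; the parameter values are invented for illustration and are not taken from the paper.

```python
def triangular_membership(x, low, mode, high):
    """Degree of membership of x in the triangular fuzzy number (low, mode, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

def alpha_cut(low, mode, high, alpha):
    """Interval of values whose membership is at least alpha; alpha-cuts are
    how fuzzy (epistemic) uncertainty is propagated through a model."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# A hypothetical dispersion-model parameter known only through expert bounds:
print(triangular_membership(1.5, 1.0, 2.0, 3.0))  # 0.5
print(alpha_cut(1.0, 2.0, 3.0, 0.5))              # (1.5, 2.5)
```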

  17. Multi-model ensembles for assessment of flood losses and associated uncertainty

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Schröter, Kai; Weiss-Motz, Alexander; Martina, Mario L. V.; Kreibich, Heidi

    2018-05-01

    Flood loss modelling is a crucial part of risk assessments. However, it is subject to large uncertainty that is often neglected. Most models available in the literature are deterministic, providing only single point estimates of flood loss, and large disparities tend to exist among them. Adopting any one such model in a risk assessment context is likely to lead to inaccurate loss estimates and sub-optimal decision-making. In this paper, we propose the use of multi-model ensembles to address these issues. This approach, which has been applied successfully in other scientific fields, is based on the combination of different model outputs with the aim of improving the skill and usefulness of predictions. We first propose a model rating framework to support ensemble construction, based on a probability tree of model properties, which establishes relative degrees of belief between candidate models. Using 20 flood loss models in two test cases, we then construct numerous multi-model ensembles, based both on the rating framework and on a stochastic method, differing in terms of participating members, ensemble size and model weights. We evaluate the performance of ensemble means, as well as their probabilistic skill and reliability. Our results demonstrate that well-designed multi-model ensembles represent a pragmatic approach to consistently obtain more accurate flood loss estimates and reliable probability distributions of model uncertainty.
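    As a rough sketch of the ensemble idea (not the authors' rating framework), a weighted multi-model mean with the member spread as an uncertainty band might look like this; all numbers and names below are invented for illustration:

```python
import statistics

def ensemble_loss(estimates, weights=None):
    """Combine per-model flood-loss estimates into a weighted ensemble mean,
    with the unweighted member spread as a rough uncertainty band."""
    if weights is None:
        weights = [1.0] * len(estimates)   # equal belief in every model
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    spread = statistics.pstdev(estimates)
    return mean, spread

# Hypothetical losses (million EUR) from five candidate models, with weights
# expressing relative degrees of belief between them:
mean, spread = ensemble_loss([12.0, 18.0, 15.0, 30.0, 14.0],
                             weights=[0.3, 0.2, 0.2, 0.1, 0.2])
```

    The paper's point is that such a combination, if well designed, is more reliable than adopting any single deterministic model and discarding the disparity among them.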

  18. 48 CFR 435.010 - Scientific and technical reports.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CATEGORIES OF CONTRACTING RESEARCH AND DEVELOPMENT CONTRACTING 435.010 Scientific and technical reports... all scientific and technical reports to the National Technical Information Service at the address... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Scientific and technical...

  19. Geometric state space uncertainty as a new type of uncertainty addressing disparity in 'emergent properties' between real and modeled systems

    NASA Astrophysics Data System (ADS)

    Montero, J. T.; Lintz, H. E.; Sharp, D.

    2013-12-01

    Do emergent properties that result from models of complex systems match emergent properties from real systems? This question targets a type of uncertainty that we argue requires more attention in system modeling and validation efforts. We define an 'emergent property' to be an attribute or behavior of a modeled or real system that can be surprising or unpredictable and result from complex interactions among the components of a system. For example, thresholds are common across diverse systems and scales and can represent emergent system behavior that is difficult to predict. Thresholds or other types of emergent system behavior can be characterized by their geometry in state space (where state space is the space containing the set of all states of a dynamic system). One way to expedite our growing mechanistic understanding of how emergent properties emerge from complex systems is to compare the geometry of surfaces in state space between real and modeled systems. Here, we present an index (threshold strength) that can quantify a geometric attribute of a surface in state space. We operationally define threshold strength as how strongly a surface in state space resembles a step or an abrupt transition between two system states. First, we validated the index for application in greater than three dimensions of state space using simulated data. Then, we demonstrated application of the index in measuring geometric state space uncertainty between a real system and a deterministic, modeled system. In particular, we looked at geometric state space uncertainty between climate behavior in the 20th century and modeled climate behavior simulated by global climate models (GCMs) in the Coupled Model Intercomparison Project phase 5 (CMIP5). Surfaces from the climate models came from running the models over the same domain as the real data. We also created response surfaces from real climate data based on an empirical model that produces a geometric surface of predicted values in state
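    The paper's threshold-strength index is defined for surfaces in state space; as a loose, hypothetical 1-D analogue (our own construction, not the published index), one can score how strongly a series resembles a step between two states:

```python
import statistics

def threshold_strength_1d(values):
    """Hypothetical 1-D analogue of a threshold-strength index.

    For each candidate split point, compare the gap between the means of
    the two resulting 'plateaus' against the variation within them.
    Scores near 1.0 for an abrupt step, lower for a gradual ramp.
    Illustration only; not the index defined in the paper.
    """
    best = 0.0
    for k in range(1, len(values)):
        left, right = values[:k], values[k:]
        gap = abs(statistics.fmean(right) - statistics.fmean(left))
        within = max(statistics.pstdev(left), statistics.pstdev(right))
        best = max(best, gap / (gap + within + 1e-12))
    return best

step = [0, 0, 0, 0, 1, 1, 1, 1]   # abrupt transition between two states
ramp = [0, 1, 2, 3, 4, 5, 6, 7]   # gradual change, no threshold
```

    Comparing such an index between real and modeled response surfaces is the kind of check on 'emergent' behavior the abstract advocates.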

  20. 76 FR 54197 - Census Scientific Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ...-SAC). The Committee will address policy, research, and technical issues relating to a full range of... scientific and technical expertise, as appropriate, to address Census Bureau program needs and objectives...

  1. Guide to Scientific Instruments

    ERIC Educational Resources Information Center

    Sommer, Richard G.; Scherago, Earl J.

    1977-01-01

    Provides a list of scientific laboratory instruments and equipment and the names and addresses of their manufacturers. Instruments appear in alphabetical order with the names of manufacturers listed below each. (SL)

  2. To be or not to be: How do we speak about uncertainty in public?

    NASA Astrophysics Data System (ADS)

    Todesco, Micol; Lolli, Barbara; Sheldrake, Tom; Odbert, Henry

    2016-04-01

    One of the challenges related to hazard communication concerns the public perception and understanding of scientific uncertainties, and of their implications in terms of hazard assessment and mitigation. Often science is perceived as an effective dispenser of resolving answers to the main issues posed by the complexities of life and nature. In this perspective, uncertainty is seen as a pernicious lack of knowledge that hinders our ability to face complex problems. From a scientific perspective, however, the definition of uncertainty is the only valuable tool we have to handle errors affecting our data and propagating through the increasingly complex models we develop to describe reality. Through uncertainty, scientists acknowledge the great variability that characterises natural systems and account for it in their assessment of possible scenarios. From this point of view, uncertainty is not ignorance; rather, it provides a great deal of information that is needed to inform decision making. To find effective ways to bridge the gap between these different meanings of uncertainty, we asked high-school students for assistance. With their help, we gathered definitions of the term 'uncertainty' by interviewing different categories of people, including schoolmates and professors, neighbours, families and friends. These definitions will be compared with those provided by scientists, to find differences and similarities. To understand the role of uncertainty in judgment, a hands-on experiment is performed where students have to estimate the exact time of explosion of party poppers subjected to a variable degree of pull. At the end of the project, the students will express their own understanding of uncertainty in a video, which will be made available for sharing. Materials collected during all the activities will contribute to our understanding of how uncertainty is portrayed and can be better expressed to improve our hazard communication.

  3. The Scientific Competitiveness of Nations.

    PubMed

    Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco

    2014-01-01

    We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation-that is, the competitiveness of its research system-and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research system as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of "markers" of national scientific competitiveness are not necessarily those with high technological requirements, but rather those addressing the most "sophisticated" needs of the society.

  4. The Scientific Competitiveness of Nations

    PubMed Central

    Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco

    2014-01-01

    We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation—that is, the competitiveness of its research system—and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research system as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of “markers” of national scientific competitiveness are not necessarily those with high technological requirements, but rather those addressing the most “sophisticated” needs of the society. PMID:25493626

  5. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  7. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  8. Addressing the Dynamics of Science in Curricular Reform for Scientific Literacy: The Case of Genomics

    ERIC Educational Resources Information Center

    van Eijck, Michiel

    2010-01-01

    Science education reform must anticipate the scientific literacy required by the next generation of citizens. Particularly, this counts for rapidly emerging and evolving scientific disciplines such as genomics. Taking this discipline as a case, such anticipation is becoming increasingly problematic in today's knowledge societies in which the…

  9. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

    In previous work, the determination of uncertainty models via minimum norm model validation was based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this will lead to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result will lead to less conservative and more realistic uncertainty models for use in robust control.

  10. Uncertainty in Climate Change Research: An Integrated Approach

    NASA Astrophysics Data System (ADS)

    Mearns, L.

    2017-12-01

    Uncertainty has been a major theme in climate change research from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was uncertainty in the future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But of course there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great as or greater than that in the climate system. Still, there remain other areas of uncertainty that are underexplored and/or undervalued, including uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we will likely underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy; rather, false certainty is the true danger.

  11. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-03-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target-measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models as well as greenhouse gas scenarios are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure Adequate Human livelihood conditions for wEll-being And Development (AHEAD). Based on a transdisciplinary sample of influential concepts addressing human well-being, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows identifying and differentiating uncertainty of climate and impact model projections. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that in many countries today, livelihood conditions are compromised by water scarcity. However, more often, AHEAD fulfilment is limited through other elements. Moreover, the analysis shows that for 44 out of 111 countries, the water-specific uncertainty ranges are

  12. How Navigating Uncertainty Motivates Trust in Medicine.

    PubMed

    Imber, Jonathan B

    2017-04-01

    Three significant factors in the shaping of modern medicine contribute to broad perceptions about trust in the patient-physician relationship: moral, professional, and epidemiological uncertainty. Trusting a physician depends first on trusting a person, then trusting a person's skills and training, and finally trusting the science that underwrites those skills. This essay, in part based on my book, Trusting Doctors: The Decline of Moral Authority in American Medicine (Princeton University Press, 2008), will address the forms of uncertainty that contribute to the nature of difficult encounters in the patient-physician relationship. © 2017 American Medical Association. All Rights Reserved.

  13. Ocean state and uncertainty forecasts using HYCOM with the Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, Mozheng; Hogan, Pat; Rowley, Clark; Smedstad, Ole-Martin; Wallcraft, Alan; Penny, Steve

    2017-04-01

    An ensemble forecast system based on the US Navy's operational HYCOM using Local Ensemble Transform Kalman Filter (LETKF) technology has been developed for ocean state and uncertainty forecasts. One advantage is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates the operational observations using an ensemble method. The background covariance during this assimilation process is supplied by the ensemble, which avoids the difficulty of developing tangent linear and adjoint models for 4D-VAR from the complicated hybrid isopycnal vertical coordinate in HYCOM. Another advantage is that the ensemble system provides a valuable uncertainty estimate corresponding to every state forecast from HYCOM. Uncertainty forecasts have proven critical for downstream users and managers seeking to make more scientifically sound decisions in the numerical prediction community. In addition, the ensemble mean is generally more accurate and skilful than a single traditional deterministic forecast at the same resolution. We will introduce the ensemble system design and setup, present some results from a 30-member ensemble experiment, and discuss scientific, technical and computational issues and challenges, such as covariance localization, inflation, model-related uncertainties and sensitivity to the ensemble size.
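    The full LETKF is well beyond a snippet, but its analysis step can be caricatured in one dimension: the ensemble mean is pulled toward the observation by the Kalman gain, and the anomalies shrink so the posterior spread reflects the reduced uncertainty. A toy sketch under our own simplifications (scalar state, single observation; not the operational system):

```python
import math
import statistics

def etkf_update_scalar(ensemble, obs, obs_err_sd):
    """Deterministic (transform-style) ensemble Kalman update in 1-D.

    The gain weights the observation by the ratio of ensemble variance
    to total variance; the anomaly shrink factor keeps the posterior
    spread consistent with the analysis uncertainty.
    """
    mean = statistics.fmean(ensemble)
    var = statistics.variance(ensemble)
    gain = var / (var + obs_err_sd ** 2)
    new_mean = mean + gain * (obs - mean)
    shrink = math.sqrt(1.0 - gain)
    return [new_mean + shrink * (x - mean) for x in ensemble]

prior = [14.2, 15.1, 14.8, 15.6, 14.4]   # e.g. SST members (deg C), invented
posterior = etkf_update_scalar(prior, obs=15.0, obs_err_sd=0.2)
```

    After the update, the ensemble mean is the state estimate and the (reduced) spread is the uncertainty forecast the abstract emphasizes.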

  14. Progression in Ethical Reasoning When Addressing Socio-scientific Issues in Biotechnology

    NASA Astrophysics Data System (ADS)

    Berne, Birgitta

    2014-11-01

    This article reports on the outcomes of an intervention in a Swedish school in which the author, a teacher-researcher, sought to develop students' (14-15 years old) ethical reasoning in science through the use of peer discussions about socio-scientific issues. Prior to the student discussions various prompts were used to highlight different aspects of the issues. In addition, students were given time to search for further information themselves. Analysis of students' written arguments, from the beginning of the intervention and afterwards, suggests that many students seem to be moving away from their use of everyday language towards using scientific concepts in their arguments. In addition, they moved from considering cloning and 'designer babies' solely in terms of the present to considering them in terms of the future. Furthermore, the students started to approach the issues in additional ways using not only consequentialism but also the approaches of virtue ethics, and rights and duties. Students' progression in ethical reasoning could be related to the characteristics of the interactions in peer discussions as students who critically and constructively argued with each other's ideas, and challenged each other's claims, made progress in more aspects of ethical reasoning than students merely using cumulative talk. As such, the work provides valuable indications for the importance of introducing peer discussions and debates about SSIs in connection to biotechnology into the teaching of science in schools.

  15. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a given traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought about by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  16. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. The atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000
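    The "log-linear equations with deterministic and random components ... implemented using a Monte-Carlo method" can be sketched generically. Every coefficient below is invented for illustration and is not from CCRAF, MAGICC or FUND:

```python
import math
import random
import statistics

def simulate_warming(n=5000, seed=42):
    """Monte-Carlo sketch of one log-linear risk equation: warming grows
    log-linearly with cumulative emissions, plus a random climate-response
    term. Coefficients are invented; CCRAF couples many such equations
    across population, energy, emissions, climate and impacts."""
    rng = random.Random(seed)
    runs = []
    for _ in range(n):
        emissions = rng.uniform(1000.0, 2000.0)   # cumulative GtCO2, scenario spread
        eps = rng.gauss(0.0, 0.15)                # random component
        # log-linear form: log(dT) = a + b * log(E) + eps, with invented a, b
        runs.append(math.exp(-5.0 + 0.8 * math.log(emissions) + eps))
    return runs

runs = simulate_warming()
median = statistics.median(runs)
qs = statistics.quantiles(runs, n=20)
p05, p95 = qs[0], qs[-1]   # a 5-95% uncertainty band for decision analysis
```

    Reporting quantile bands rather than single trajectories is what makes such a framework usable for the risk-management questions the abstract describes.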

  17. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface are introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  18. Uncertainty in Bohr's response to the Heisenberg microscope

    NASA Astrophysics Data System (ADS)

    Tanona, Scott

    2004-09-01

    In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.

  19. Presentation of uncertainties on web platforms for climate change information

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Wrobel, Markus; Reusser, Dominik

    2014-05-01

    Climate research has a long tradition; however, there is still uncertainty about the specific effects of climate change. One of the key tasks is - beyond discussing climate change and its impacts in specialist groups - to present these to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups need easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented such that it is commonly understood, while the complexity of the science behind it is still conveyed. In particular, this requires the explicit representation of spatial and temporal uncertainty information to lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various web-based climate service platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach to communicating uncertainty, based on a typology that distinguishes between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including our own solutions within the web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are perceived by untrained users.

  20. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-10-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models, as well as greenhouse gas scenarios, are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure what is referred to here as AHEAD (Adequate Human livelihood conditions for wEll-being And Development). Based on a trans-disciplinary sample of concepts addressing human well-being and livelihoods, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows for the uncertainty of climate and impact model projections to be identified and differentiated. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that livelihood conditions are compromised by water scarcity in 34 countries. However, more often, AHEAD fulfilment is limited through other elements. The analysis shows that the water-specific uncertainty ranges of the

  1. I Am Sure There May Be a Planet There: Student Articulation of Uncertainty in Argumentation Tasks

    ERIC Educational Resources Information Center

    Buck, Zoë E.; Lee, Hee-Sun; Flores, Joanna

    2014-01-01

    We investigated how students articulate uncertainty when they are engaged in structured scientific argumentation tasks where they generate, examine, and interpret data to determine the existence of exoplanets. In this study, 302 high school students completed 4 structured scientific arguments that followed a series of computer-model-based…

  2. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data and assessment of sites using EPA and state protocols.

  3. EPA scientific integrity policy draft

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-08-01

    The U.S. Environmental Protection Agency (EPA) issued its draft scientific integrity policy on 5 August. The draft policy addresses scientific ethical standards, communications with the public, the use of advisory committees and peer review, and professional development. The draft policy was developed by an ad hoc group of EPA senior staff and scientists in response to a December 2010 memorandum on scientific integrity from the White House Office of Science and Technology Policy. The agency is accepting public comments on the draft through 6 September; comments should be sent to osa.staff@epa.gov. For more information, see http://www.epa.gov/stpc/pdfs/draft-scientific-integrity-policy-aug2011.pdf.

  4. A taxonomy of medical uncertainties in clinical genome sequencing.

    PubMed

    Han, Paul K J; Umstead, Kendall L; Bernhardt, Barbara A; Green, Robert C; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B; Biesecker, Leslie G; Biesecker, Barbara B

    2017-08-01

    Clinical next-generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of an unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts and themes were extracted in order to expand on a previously published three-dimensional taxonomy of medical uncertainty. In parallel, we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. The proposed taxonomy divides uncertainty along three axes (source, issue, and locus) and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. Genet Med advance online publication 19 January 2017.

  5. A Taxonomy of Medical Uncertainties in Clinical Genome Sequencing

    PubMed Central

    Han, Paul K. J.; Umstead, Kendall L.; Bernhardt, Barbara A.; Green, Robert C.; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B.; Biesecker, Leslie G.; Biesecker, Barbara B.

    2017-01-01

    Purpose Clinical next generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Methods Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts, and themes were extracted in order to expand upon a previously published three-dimensional taxonomy of medical uncertainty. In parallel we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. Results The proposed taxonomy divides uncertainty along three axes: source, issue, and locus, and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. Conclusion The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. PMID:28102863

  6. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and a thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns, derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way, scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics, or too narrow uncertainty ranges, will identify little or no area as having similar climate, while too few indicators or too wide uncertainty ranges will flag overly large regions as climatically similar, which may not be correct. Similarity cannot be explored simply by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, like maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges, otherwise the results will be too vague conducting a
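
    The balance of indicators and uncertainty ranges discussed above can be sketched as a simple tolerance-band matching rule. All indicator names, values and tolerances below are hypothetical, not taken from the Climate Twins tool:

```python
# Hypothetical sketch of the matching idea: a candidate location counts as a
# "climate twin" if every climate indicator of its CURRENT climate lies within
# an uncertainty range of the target location's FUTURE climate.
future_target = {"t_mean": 12.5, "t_max": 33.0, "precip": 620.0}   # target's future climate
tolerance = {"t_mean": 0.5, "t_max": 1.5, "precip": 50.0}          # uncertainty ranges

candidates = {
    "A": {"t_mean": 12.2, "t_max": 32.1, "precip": 600.0},
    "B": {"t_mean": 12.4, "t_max": 35.4, "precip": 640.0},
    "C": {"t_mean": 14.1, "t_max": 33.5, "precip": 610.0},
}

def is_twin(current):
    # every indicator must fall inside its tolerance band
    return all(abs(current[k] - future_target[k]) <= tolerance[k]
               for k in future_target)

twins = sorted(name for name, clim in candidates.items() if is_twin(clim))
print("climate twins:", twins)
```

    Widening the tolerances admits more candidates, narrowing them admits fewer, which is exactly the trade-off the abstract describes.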

  7. Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence

    PubMed Central

    Han, Paul K. J.

    2014-01-01

    The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891

  8. Mars 2001 Mission: Addressing Scientific Questions Regarding the Characteristics and Origin of Local Bedrock and Soil

    NASA Technical Reports Server (NTRS)

    Saunders, R. S.; Arvidson, R. E.; Weitz, C. M.; Marshall, J.; Squyres, S. W.; Christensen, P. R.; Meloy, T.; Smith, P.

    1999-01-01

    The Mars Surveyor Program 2001 Mission will carry instruments on the orbiter, lander and rover that will support synergistic observations and experiments to address important scientific questions regarding the local bedrock and soils. The martian surface is covered in varying degrees by fine materials less than a few mm in size. Viking and Pathfinder images of the surface indicate that soils at those sites are composed of fine particles. Wheel tracks from the Sojourner rover suggest that soil deposits are composed of particles <40 microns. Viking images show that dunes are common in many areas on Mars, and new MOC images indicate that dunes occur nearly everywhere. Dunes on Mars are thought to be composed of 250-500 micron particles based upon Viking IRTM data and Mars wind tunnel experiments. If martian dunes are composed of sand particles >100 microns and soils are dominated by <10 micron particles, then where are the intermediate grain sizes? Have they been worn away through prolonged transport over the eons? Were they never generated to begin with? Or are they simply less easy to identify because they do not form distinctive geomorphic features, such as dunes or uniform mantles, that tend to assume superposition in the soil structure?

  9. Scientific Integrity and Professional Ethics at AGU - The Establishment and Evolution of an Ethics Program at a Large Scientific Society

    NASA Astrophysics Data System (ADS)

    McPhaden, Michael; Leinen, Margaret; McEntee, Christine; Townsend, Randy; Williams, Billy

    2016-04-01

    The American Geophysical Union, a scientific society of 62,000 members worldwide, has established a set of scientific integrity and professional ethics guidelines for the actions of its members, for the governance of the union in its internal activities, and for the operations and participation in its publications and scientific meetings. This presentation will provide an overview of the Ethics program at AGU, highlighting the reasons for its establishment, the process for dealing with ethical breaches, the number and types of cases considered, how AGU helps educate its members on ethics issues, and the rapidly evolving efforts at AGU to address issues related to the emerging field of GeoEthics. The presentation will also cover the most recent AGU Ethics program focus on the role for AGU and other scientific societies in addressing sexual harassment, and AGU's work to provide additional program strength in this area.

  10. Synthesis and Control of Flexible Systems with Component-Level Uncertainties

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Lim, Kyong B.

    2009-01-01

    An efficient and computationally robust method for synthesis of component dynamics is developed. The method defines the interface forces/moments as feasible vectors in transformed coordinates to ensure that connectivity requirements of the combined structure are met. The synthesized system is then defined in a transformed set of feasible coordinates. The simplicity of form is exploited to effectively deal with modeling parametric and non-parametric uncertainties at the substructure level. Uncertainty models of reasonable size and complexity are synthesized for the combined structure from those in the substructure models. In particular, we address frequency and damping uncertainties at the component level. The approach first considers the robustness of synthesized flexible systems. It is then extended to deal with non-synthesized dynamic models with component-level uncertainties by projecting uncertainties to the system level. A numerical example is given to demonstrate the feasibility of the proposed approach.

  11. Technical Evaluation Report for Symposium AVT-147: Computational Uncertainty in Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Radespiel, Rolf; Hemsch, Michael J.

    2007-01-01

    The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem designs, makes high-fidelity, physics-based simulation an attractive alternative for design and development. The predictive ability of such simulations, including computational fluid dynamics (CFD) and computational structural mechanics (CSM), has matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT-147 Symposium has been established to compile state-of-the-art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high-fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from either the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples of the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.

  12. Essentiality, toxicity, and uncertainty in the risk assessment of manganese.

    PubMed

    Boyes, William K

    2010-01-01

    Risk assessments of manganese by inhalation or oral routes of exposure typically acknowledge the duality of manganese as an essential element at low doses and a toxic metal at high doses. Previously, however, risk assessors were unable to describe manganese pharmacokinetics quantitatively across dose levels and routes of exposure, to account for mass balance, and to incorporate this information into a quantitative risk assessment. In addition, the prior risk assessment of inhaled manganese conducted by the U.S. Environmental Protection Agency (EPA) identified a number of specific factors that contributed to uncertainty in the risk assessment. In response to a petition regarding the use of a fuel additive containing manganese, methylcyclopentadienyl manganese tricarbonyl (MMT), the U.S. EPA developed a test rule under the U.S. Clean Air Act that required, among other things, the generation of pharmacokinetic information. This information was intended not only to aid in the design of health outcome studies, but also to help address uncertainties in the risk assessment of manganese. To date, the work conducted in response to the test rule has yielded substantial pharmacokinetic data. This information will enable the generation of physiologically based pharmacokinetic (PBPK) models capable of making quantitative predictions of tissue manganese concentrations following inhalation and oral exposure, across dose levels, and accounting for factors such as duration of exposure, different species of manganese, and changes of age, gender, and reproductive status. The work accomplished in response to the test rule, in combination with other scientific evidence, will enable future manganese risk assessments to consider tissue dosimetry more comprehensively than was previously possible.

  13. Characterization of the energy-dependent uncertainty and correlation in silicon neutron displacement damage metrics

    NASA Astrophysics Data System (ADS)

    Griffin, Patrick; Rochman, Dimitri; Koning, Arjan

    2017-09-01

    A rigorous treatment of the uncertainty in the underlying nuclear data on silicon displacement damage metrics is presented. The uncertainty in the cross sections and recoil atom spectra are propagated into the energy-dependent uncertainty contribution in the silicon displacement kerma and damage energy using a Total Monte Carlo treatment. An energy-dependent covariance matrix is used to characterize the resulting uncertainty. A strong correlation between different reaction channels is observed in the high energy neutron contributions to the displacement damage metrics which supports the necessity of using a Monte Carlo based method to address the nonlinear nature of the uncertainty propagation.
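
    The Total Monte Carlo summary described above, where many sampled model runs are condensed into an energy-dependent covariance/correlation matrix, can be sketched with a toy loop. The energy grid, the perturbation model and the damage metric below are all invented for illustration; only the workflow is the point:

```python
import math
import random

random.seed(7)

energies = [0.1, 1.0, 5.0, 14.0]   # MeV, illustrative grid

def damage_metric(scale, tilt):
    # toy metric: a common scale factor plus an energy-dependent tilt,
    # which induces correlation between energy bins
    return [scale * (1.0 + tilt * math.log(e)) for e in energies]

# sample perturbed "nuclear data" many times and evaluate the metric each time
samples = [damage_metric(random.gauss(1.0, 0.05), random.gauss(0.0, 0.02))
           for _ in range(2000)]

n = len(energies)
mean = [sum(s[i] for s in samples) / len(samples) for i in range(n)]
cov = [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / (len(samples) - 1)
        for j in range(n)] for i in range(n)]
corr = [[cov[i][j] / math.sqrt(cov[i][i] * cov[j][j]) for j in range(n)] for i in range(n)]

for row in corr:
    print("  ".join(f"{c:5.2f}" for c in row))
```

    The off-diagonal entries of the resulting matrix are where correlations between energy bins (or, in the record above, between reaction channels) show up, which a variance-only treatment would miss.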

  14. A Commentary on Innovation and Emerging Scientific Careers: Is Social Work Prepared to Compete in Today's Scientific Marketplace?

    ERIC Educational Resources Information Center

    Craddock, Jaih B.

    2017-01-01

    The aim of this article is to address some of the questions Dr. Paula S. Nurius presents in her article, "Innovation and Emerging Scientific Careers: Is Social Work Prepared to Compete in Today's Scientific Marketplace?" Specifically, this article will focus on what we can do to better prepare our emerging research scholars to be…

  15. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT 1.1 alpha)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2012-04-01

    A wide variety of different plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. Parameter uncertainty has been widely addressed by calibrating models at data-rich ocean sites. However, relatively little attention has been given to quantifying uncertainty in the physical fields required by the plankton models at these sites, and tendencies in the biogeochemical properties due to the effects of horizontal processes are often neglected. Here we use model twin experiments, in which synthetic data are assimilated to estimate a system's known "true" parameters, to investigate the impact of error in a plankton model's environmental input data. The experiments are supported by a new software tool, the Marine Model Optimization Testbed, designed for rigorous analysis of plankton models in a multi-site 1-D framework. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergence tendencies of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error variance over an annual cycle, indicating variation in the significance attributable to individual model-data differences. An inverse scheme using ensemble-based estimates of the

  16. Balancing Certainty and Uncertainty in Clinical Practice

    ERIC Educational Resources Information Center

    Kamhi, Alan G.

    2011-01-01

    Purpose: In this epilogue, I respond to each of the five commentaries, discussing in some depth a central issue raised in each commentary. In the final section, I discuss how my thinking about certainty and uncertainty in clinical practice has evolved since I wrote the initial article. Method: Topics addressed include the similarities/differences…

  17. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in
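
    The propagation of input-quantity uncertainty discussed in this record can be sketched with a minimal Monte Carlo evaluation, in the spirit of the GUM's probabilistic extensions. The measurement model R = V/I and all numbers are invented; V is given a normal distribution and I a rectangular one, as a Type A / Type B illustration:

```python
import math
import random
import statistics

random.seed(3)

def resistance(v, i):
    return v / i   # measurement model: R = V / I

# assign distributions to the input quantities and propagate by sampling
samples = [resistance(random.gauss(5.0, 0.02),       # V: normal (Type A style)
                      random.uniform(0.995, 1.005))  # I: rectangular (Type B style)
           for _ in range(100_000)]

r = statistics.mean(samples)
u = statistics.stdev(samples)
print(f"R = {r:.4f} ohm, standard uncertainty u(R) = {u:.4f} ohm")
```

    The Monte Carlo route needs no linearisation of the model and returns a full output distribution, from which a coverage interval can be read off directly instead of being built from a combined standard uncertainty.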

  18. Uncertainty law in ambient modal identification-Part I: Theory

    NASA Astrophysics Data System (ADS)

    Au, Siu-Kui

    2014-10-01

    Ambient vibration test has gained increasing popularity in practice as it provides an economical means for modal identification without artificial loading. Since the signal-to-noise ratio cannot be directly controlled, the uncertainty associated with the identified modal parameters is a primary concern. From a scientific point of view, it is of interest to know on what factors the uncertainty depends and what the relationship is. For planning or specification purposes, it is desirable to have an assessment of the test configuration required to achieve a specified accuracy in the modal parameters. For example, what is the minimum data duration to achieve a 30% coefficient of variation (c.o.v.) in the damping ratio? To address these questions, this work investigates the leading order behavior of the ‘posterior uncertainties’ (i.e., given data) of the modal parameters in a Bayesian identification framework. In the context of well-separated modes, small damping and sufficient data, it is shown rigorously that, among other results, the posterior c.o.v. of the natural frequency and damping ratio are asymptotically equal to [ζ/(2πNcBf)]^(1/2) and [2πζNcBζ]^(-1/2), respectively; where ζ is the damping ratio; Nc is the data length as a multiple of the natural period; Bf and Bζ are data length factors that depend only on the bandwidth utilized for identification, for which explicit expressions have been derived. As the Bayesian approach allows full use of information contained in the data, the results are fundamental characteristics of the ambient modal identification problem. This paper develops the main theory. The companion paper investigates the implication of the results and verification with field test data.
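
    The planning question posed above (minimum data duration for a 30% c.o.v. in the damping ratio) can be sketched under an assumed asymptotic scaling c.o.v.(ζ) ≈ [2πζNcBζ]^(-1/2); both this scaling and the bandwidth factor value used here are assumptions for illustration, not quantities taken from the paper:

```python
import math

# Assumed scaling: cov_zeta = (2*pi*zeta*Nc*Bzeta) ** -0.5.
# Inverting for Nc gives the minimum data length in natural periods.
def min_data_length(zeta, target_cov, b_zeta):
    return 1.0 / (2.0 * math.pi * zeta * b_zeta * target_cov ** 2)

zeta = 0.01       # 1% damping ratio
b_zeta = 1.0      # illustrative bandwidth factor
nc = min_data_length(zeta, 0.30, b_zeta)
print(f"need ~{nc:.0f} natural periods for a 30% c.o.v. in damping")
```

    Note the 1/ζ dependence: the smaller the damping, the longer the record needed for the same relative accuracy in the damping estimate, which is why lightly damped structures demand long ambient tests.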

  19. Generally Recognized as Safe: Uncertainty Surrounding E-Cigarette Flavoring Safety.

    PubMed

    Sears, Clara G; Hart, Joy L; Walker, Kandi L; Robertson, Rose Marie

    2017-10-23

    Despite scientific uncertainty regarding the relative safety of inhaling e-cigarette aerosol and flavorings, some consumers regard the U.S. Food and Drug Administration's "generally recognized as safe" (GRAS) designation as evidence of flavoring safety. In this study, we assessed how college students' perceptions of e-cigarette flavoring safety are related to understanding of the GRAS designation. During spring 2017, an online questionnaire was administered to college students. Chi-square p-values and multivariable logistic regression were employed to compare perceptions among participants who considered e-cigarette flavorings safe and those who considered them unsafe. The total sample size was 567 participants. Only 22% knew that GRAS designation meant that a product is safe to ingest, not inhale, inject, or use topically. Of participants who considered flavorings to be GRAS, the majority recognized that the designation meant a product is safe to ingest but also considered it safe to inhale. Although scientific uncertainty on the overall safety of flavorings in e-cigarettes remains, health messaging can educate the public about the GRAS designation and its irrelevance to e-cigarette safety.

  20. Quantifying geological uncertainty for flow and transport modeling in multi-modal heterogeneous formations

    NASA Astrophysics Data System (ADS)

    Feyen, Luc; Caers, Jef

    2006-06-01

    In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Hereby, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport
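    The spatial bootstrap mentioned above resamples in a way that respects spatial correlation; as a much-simplified illustration of the underlying idea, a plain bootstrap (hypothetical data, spatial dependence ignored) quantifies the uncertainty of an estimated global statistic:

    ```python
    import random
    import statistics

    def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.10, seed=7):
        """Plain bootstrap confidence interval for a statistic. A real spatial
        bootstrap resamples spatial patterns to respect correlation; this
        sketch resamples points independently."""
        rng = random.Random(seed)
        reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
        return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

    # e.g. uncertainty in the mean log-conductivity estimated from 8 point values
    lo, hi = bootstrap_ci([2.1, 2.5, 1.9, 2.8, 2.2, 2.4, 2.0, 2.6])
    ```

    The interval (lo, hi) then feeds into the stochastic model as parameter uncertainty rather than a single fixed estimate.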

  1. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    NASA Astrophysics Data System (ADS)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.

  2. Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishma

    2010-01-01

    In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
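    A multi-model ensemble treats the spread across model outputs as a proxy for structural uncertainty. A minimal sketch, with made-up flux values for a single grid cell (the model names and numbers are illustrative, not TOPS output):

    ```python
    import statistics

    # Hypothetical daily GPP predictions (gC m^-2 d^-1) for one grid cell
    # from three ecosystem models in a multi-model experiment.
    predictions = {"model_A": 5.2, "model_B": 4.1, "model_C": 6.0}

    values = list(predictions.values())
    ensemble_mean = statistics.mean(values)
    # Sample standard deviation across models as a proxy for structural uncertainty
    structural_spread = statistics.stdev(values)
    ```

    As the abstract notes, a handful of models only begins to sample the space of possible structures, so such a spread is a lower bound on the true structural uncertainty.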

  3. Tutorial examples for uncertainty quantification methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
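    As a flavor of the kind of tutorial example described, a self-contained Monte Carlo integration sketch (not taken from the report's user manual) attaches a standard-error uncertainty to the integral estimate:

    ```python
    import math
    import random

    def mc_integrate(f, a, b, n, seed=0):
        """Monte Carlo estimate of the integral of f over [a, b], returning
        the estimate together with its standard-error uncertainty."""
        rng = random.Random(seed)
        samples = [f(rng.uniform(a, b)) for _ in range(n)]
        mean = sum(samples) / n
        var = sum((s - mean) ** 2 for s in samples) / (n - 1)
        return (b - a) * mean, (b - a) * math.sqrt(var / n)

    # Integral of sin(x) over [0, pi] is exactly 2; the standard error tells
    # us how far the estimate can plausibly be from that.
    estimate, stderr = mc_integrate(math.sin, 0.0, math.pi, 20_000)
    ```

    The pairing of an estimate with a quantified error bar is the essence of what UQ tutorials try to convey.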

  4. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Treesearch

    Frank H. Koch; Denys Yemshanov

    2015-01-01

    Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model's numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  5. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  6. A review of uncertainty visualization within the IPCC reports

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Reusser, Dominik; Wrobel, Markus

    2015-04-01

    Results derived from climate model simulations confront non-expert users with a variety of uncertainties. This gives rise to the challenge that the scientific information must be communicated such that it can be easily understood while the complexity of the science behind it is still conveyed. With respect to the assessment reports of the IPCC, the situation is even more complicated, because heterogeneous sources and multiple types of uncertainties need to be compiled together. Within this work, we systematically (1) analyzed the visual representation of uncertainties in the IPCC AR4 and AR5 reports, and (2) conducted a questionnaire to evaluate how different user groups such as decision-makers and teachers understand these uncertainty visualizations. Within the first step, we classified visual uncertainty metaphors for spatial, temporal and abstract representations. As a result, we clearly identified a high complexity of the IPCC visualizations compared to standard presentation graphics, sometimes even integrating two or more uncertainty classes / measures together with the "certain" (mean) information. Further, we identified complex written uncertainty explanations within image captions even within the "summary reports for policy makers". In the second step, based on these observations, we designed a questionnaire to investigate how non-climate experts understand these visual representations of uncertainties, how visual uncertainty coding might hinder the perception of the "non-uncertain" data, and if alternatives for certain IPCC visualizations exist. Within the talk/poster, we will present first results from this questionnaire. Summarizing, we identified a clear trend towards complex images within the latest IPCC reports, with a tendency to incorporate as much as possible information into the visual representations, resulting in proprietary, non-standard graphic representations that are not necessarily easy to comprehend at a glance. We conclude that

  7. Addressing model uncertainty through stochastic parameter perturbations within the High Resolution Rapid Refresh (HRRR) ensemble

    NASA Astrophysics Data System (ADS)

    Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.

    2016-12-01

    It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system for addressing the deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of all three. The focus of this study is to assess the model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were employed in the Rapid Update Cycle (RUC) land surface model and Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective. Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support
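    The SPP idea can be caricatured in a few lines: each ensemble member runs the same physics but with a randomly scaled parameter value. The parameter, spread, and bounds below are illustrative assumptions, not HRRR settings:

    ```python
    import random

    def perturb_parameter(base, n_members, sd=0.2, lo=0.5, hi=2.0, seed=1):
        """SPP sketch: give each ensemble member the same physics parameter
        scaled by a random lognormal factor, clipped to plausible bounds.
        The spread and bounds here are illustrative, not operational values."""
        rng = random.Random(seed)
        return [base * min(max(rng.lognormvariate(0.0, sd), lo), hi)
                for _ in range(n_members)]

    # e.g. a hypothetical roughness-length parameter of 0.1 m across 8 members
    members = perturb_parameter(0.1, 8)
    ```

    Running the same model with each perturbed value spreads the ensemble without maintaining multiple dynamic cores or physics suites, which is the sustainability argument made above.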

  8. Error Analysis of CM Data Products Sources of Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protections decisions are supported by defensible analysis.

  9. Shaping scientific attitude of biology education students through research-based teaching

    NASA Astrophysics Data System (ADS)

    Firdaus, Darmadi

    2017-08-01

    A scientific attitude is a need of today's society for the peaceful and meaningful living of every person in a multicultural world. A case study was conducted at the Faculty of Teacher Training and Education, University of Riau, Pekanbaru in order to describe the scientific attitude shaped by research-based teaching (RBT). Eighteen students of the English for Biology bilingual program were selected from 88 regular students as the subjects of the study. The RBT design consists of 9 steps: 1) field observations, 2) developing research proposals, 3) research proposal seminar, 4) field data collecting, 5) data analyzing & illustrating, 6) writing research papers, 7) preparing power point slides, 8) creating a scientific poster, 9) seminar & poster session. Data were collected using checklist observation instruments during 14 weeks (course sessions), then analyzed using a descriptive-quantitative method. The results showed that RBT was able to shape critical-mindedness, suspended judgment, respect for evidence, honesty, objectivity, and a questioning attitude, as well as tolerance of uncertainty. The attitudes shaped varied with each step of the learning activities. The preparation of scientific posters and the research seminar appeared particularly effective in shaping critical-mindedness, suspended judgment, respect for evidence, honesty, objectivity, and a questioning attitude, as well as tolerance of uncertainty. In conclusion, the application of research-based teaching through the English for Biology courses could shape the students' scientific attitudes. However, the consistency with which a scientific attitude appears at every stage of the Biology-based RBT learning process needs more intensive and critical assessment.

  10. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  11. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  12. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.

  13. Effectively Communicating the Uncertainties Surrounding Ebola Virus Transmission.

    PubMed

    Kilianski, Andy; Evans, Nicholas G

    2015-10-01

    The current Ebola virus outbreak has highlighted the uncertainties surrounding many aspects of Ebola virus virology, including routes of transmission. The scientific community played a leading role during the outbreak-potentially, the largest of its kind-as many of the questions surrounding ebolaviruses have only been interrogated in the laboratory. Scientists provided an invaluable resource for clinicians, public health officials, policy makers, and the lay public in understanding the progress of Ebola virus disease and the continuing outbreak. Not all of the scientific communication, however, was accurate or effective. There were multiple instances of published articles during the height of the outbreak containing potentially misleading scientific language that spurred media overreaction and potentially jeopardized preparedness and policy decisions at critical points. Here, we use articles declaring the potential for airborne transmission of Ebola virus as a case study in the inaccurate reporting of basic science, and we provide recommendations for improving the communication about unknown aspects of disease during public health crises.

  14. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt different from the previous modeling efforts. The previous ones focused on addressing uncertainty in physical parameters (i.e. soil porosity) while this one aims to deal with uncertainty in mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (i.e. only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  15. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
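    A toy version of the constrained discrete selection, with invented costs and variance-reduction factors and a crude geometric model of diminishing returns (none of this is from the paper), can be solved by brute force:

    ```python
    from itertools import product

    # Invented numbers: cost of each test type and an assumed fractional
    # reduction in prediction variance per test of that type.
    costs = {"cal_A": 3.0, "cal_B": 2.0, "val_C": 4.0}
    var_reduction = {"cal_A": 0.10, "cal_B": 0.06, "val_C": 0.15}
    budget = 12.0

    def best_selection(max_each=4):
        """Enumerate test counts within budget; pick the combination that
        leaves the smallest remaining prediction variance."""
        names = list(costs)
        best_plan, best_var = None, float("inf")
        for counts in product(range(max_each + 1), repeat=len(names)):
            cost = sum(n * costs[k] for n, k in zip(counts, names))
            if cost > budget:
                continue
            var = 1.0
            for n, k in zip(counts, names):
                var *= (1.0 - var_reduction[k]) ** n
            if var < best_var:
                best_plan, best_var = dict(zip(names, counts)), var
        return best_plan, best_var

    plan, remaining_var = best_selection()
    # With these numbers, the budget is best spent on three validation tests.
    ```

    Real formulations replace the geometric variance model with the actual impact of each test on the calibrated posterior, but the budget-constrained enumeration has the same shape.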

  16. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. This Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
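    The Monte Carlo idea is easy to sketch: treat the choice of damage function as random and propagate a given water depth through the library. The three stage-damage functions below are invented stand-ins for the paper's 272-function library:

    ```python
    import random

    # Three invented stage-damage functions (water depth in m -> damage
    # fraction), standing in for a library drawn from several damage models.
    damage_functions = [
        lambda d: min(1.0, 0.50 * d),
        lambda d: min(1.0, d / (d + 1.0)),
        lambda d: min(1.0, 0.30 * d ** 0.8),
    ]

    def damage_range(depth, n_draws=10_000, seed=42):
        """Monte Carlo over the library: each draw picks a random damage
        model, so the spread of outcomes reflects model-choice uncertainty."""
        rng = random.Random(seed)
        draws = [rng.choice(damage_functions)(depth) for _ in range(n_draws)]
        return min(draws), max(draws)

    low, high = damage_range(0.5)
    ```

    Even this tiny library disagrees by roughly a factor of 2 at a 0.5 m depth, echoing the paper's finding that relative uncertainty is largest for small water depths.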

  17. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. As input the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.

  18. The optimisation approach of ALARA in nuclear practice: an early application of the precautionary principle. Scientific uncertainty versus legal uncertainty.

    PubMed

    Lierman, S; Veuchelen, L

    2005-01-01

    The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other view thinks the effects are underestimated. Both views have good scientific arguments in favour of them. The nuclear field, both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach to keep the effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because of these stochastic effects, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law and what lessons can be learned from the field of radiation protection? 2. What can ALARA contribute to the discussion of the Precautionary Principle, and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law.

  19. Demand for command: responding to technological risks and scientific uncertainties.

    PubMed

    Stokes, Elen

    2013-01-01

    This article seeks to add to current theories of new governance by highlighting the predicament facing regulators and regulatees when dealing with new technologies. Using nanotechnologies as a study, it shows that new modes of governance (as opposed to traditional coercive, or command and control regulation) offer promising solutions to highly complex, uncertain, and contested problems of risk, such as those associated with new technologies. In this regard, nanotechnologies provide a useful test bed for the ambitions of newer, better modes of governance because there are not yet any fixed ideas about the appropriate course of action. The article suggests, however, that examples of new governance are less prominent than perhaps expected. Drawing on empirical data, it argues that, when faced with considerable epistemological, political, economic, and ethical uncertainties, regulatory stakeholders often exhibit a preference for more conventional command methods of regulation. That is not to say that new governance is entirely absent from regulatory policies on nanotechnologies, but that new governance is emerging in perhaps more subtle ways than the scholarly and policy literature predicted.

  20. Representing uncertainty in objective functions: extension to include the influence of serial correlation

    NASA Astrophysics Data System (ADS)

    Croke, B. F.

    2008-12-01

    The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If the model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends that previous work, addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
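    The abstract does not give the extended objective function, but one standard way to fold AR(1) serial correlation into a weighted least-squares objective is to whiten the residuals before summing. The sketch below assumes a constant marginal residual standard deviation sigma; the function name and this constant-sigma simplification are illustrative, not the paper's formulation.

```python
import math

def ar1_weighted_ssr(obs, sim, sigma, rho):
    """Sum of squared residuals, whitened for AR(1) serial correlation.

    Residuals e_t = obs_t - sim_t are assumed to follow an AR(1) error
    model with lag-1 correlation `rho` and marginal standard deviation
    `sigma` (observation and model-output uncertainty combined).  The
    transform u_1 = e_1, u_t = (e_t - rho*e_{t-1}) / sqrt(1 - rho^2)
    decorrelates the series, so sum(u_t^2) / sigma^2 is an ordinary
    weighted objective again.
    """
    e = [o - s for o, s in zip(obs, sim)]
    u = [e[0]]
    scale = math.sqrt(1.0 - rho * rho)
    for t in range(1, len(e)):
        u.append((e[t] - rho * e[t - 1]) / scale)
    return sum(v * v for v in u) / (sigma * sigma)
```

    With rho = 0 the function reduces to the familiar sum of squared residuals divided by the error variance, which is a useful sanity check.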

  1. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  2. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
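    As a minimal illustration of the Bayesian-updating step described above (a sketch under my own assumptions, not the authors' code), a scalar model parameter can be updated on a discrete grid with a Gaussian likelihood. In the paper's adaptive scheme, only observations from segments where the model was judged valid would be passed in.

```python
import math

def bayes_update(prior, model, observations, sigma):
    """Grid-based Bayesian update of a scalar model parameter.

    prior: dict {theta: probability}; model(theta) -> predicted value;
    sigma: observation-error standard deviation.  Returns the
    normalized posterior over the same grid.
    """
    post = {}
    for theta, p in prior.items():
        like = 1.0
        for y in observations:
            z = (y - model(theta)) / sigma
            like *= math.exp(-0.5 * z * z)  # Gaussian likelihood
        post[theta] = p * like
    total = sum(post.values())
    return {th: v / total for th, v in post.items()}
```

    Feeding in only validated observation segments, as the paper proposes, keeps this update from being dominated by regions where the model misrepresents the physics.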

  3. Uncertainty Calculations in the First Introductory Physics Laboratory

    NASA Astrophysics Data System (ADS)

    Rahman, Shafiqur

    2005-03-01

    Uncertainty in a measured quantity is an integral part of reporting any experimental data. Consequently, Introductory Physics laboratories at many institutions require that students report the values of the quantities being measured as well as their uncertainties. Unfortunately, given that there are three main ways of calculating uncertainty, each suitable for particular situations (which is usually not explained in the lab manual), this is also an area that students feel highly confused about. It frequently generates a large number of complaints in the end-of-semester course evaluations. Students at some institutions are not asked to calculate uncertainty at all, which gives them a false sense of the nature of experimental data. Taking advantage of the increased sophistication in the use of computers and spreadsheets that students are coming to college with, we have completely restructured our first Introductory Physics Lab to address this problem. Always in the context of a typical lab, we now systematically and sequentially introduce the various ways of calculating uncertainty, including a theoretical understanding as opposed to a cookbook approach, all within the context of six three-hour labs. Complaints about the lab in student evaluations have dropped by 80%. * Supported by a grant from the A. V. Davis Foundation.
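    One of the standard calculation methods such a lab sequence builds toward is propagation of independent uncertainties in quadrature. The classic pendulum determination of g makes a compact example; the function name and the choice of experiment are mine, not from the abstract.

```python
import math

def pendulum_g(L, dL, T, dT):
    """g from a simple pendulum, g = 4*pi^2*L/T^2, with the relative
    uncertainty propagated in quadrature:
        dg/g = sqrt((dL/L)^2 + (2*dT/T)^2)
    The factor 2 on dT/T comes from T appearing squared."""
    g = 4.0 * math.pi ** 2 * L / T ** 2
    dg = g * math.sqrt((dL / L) ** 2 + (2.0 * dT / T) ** 2)
    return g, dg
```

    For L = 1.000 ± 0.010 m and T = 2.000 ± 0.010 s this gives g ≈ 9.87 ± 0.14 m/s², showing how the period's uncertainty is doubled in relative terms before combining.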

  4. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster which has far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect in the definition of uncertainty and the interpretation of the delivered forecast between various stakeholders. The definition and use of uncertainty differs especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which itself is a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further contribute to causing the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. Review of

  5. Scientific impact: opportunity and necessity.

    PubMed

    Cohen, Marlene Z; Alexander, Gregory L; Wyman, Jean F; Fahrenwald, Nancy L; Porock, Davina; Wurzbach, Mary E; Rawl, Susan M; Conn, Vicki S

    2010-08-01

    Recent National Institutes of Health changes have focused attention on the potential scientific impact of research projects. Research with the excellent potential to change subsequent science or health care practice may have high scientific impact. Only rigorous studies that address highly significant problems can generate change. Studies with high impact may stimulate new research approaches by changing understanding of a phenomenon, informing theory development, or creating new research methods that allow a field of science to move forward. Research with high impact can transition health care to more effective and efficient approaches. Studies with high impact may propel new policy developments. Research with high scientific impact typically has both immediate and sustained influence on the field of study. The article includes ideas to articulate potential scientific impact in grant applications as well as possible dissemination strategies to enlarge the impact of completed projects.

  6. Making framing of uncertainty in water management practice explicit by using a participant-structured approach.

    PubMed

    Isendahl, Nicola; Dewulf, Art; Pahl-Wostl, Claudia

    2010-01-01

    By now, the need to address uncertainty in the management of water resources is widely recognized, yet there is little expertise and experience in how to deal with uncertainty effectively in practice. Uncertainties in water management practice are so far mostly dealt with intuitively or based on experience. That way decisions can be taken quickly, but analytic processes of deliberate reasoning are bypassed. To meet practitioners' desire for better guidance and tools for dealing with uncertainty, more practice-oriented systematic approaches are needed. For that purpose we consider it important to understand how practitioners frame uncertainties. In this paper we present an approach in which water managers developed criteria of relevance for understanding and addressing uncertainties. The empirical research took place in the Doñana region of the Guadalquivir estuary in southern Spain, making use of the method of card sorting. Through the card sorting exercise, a broad range of criteria for making sense of and describing uncertainties was produced by different subgroups, which were then merged into a shared list of criteria. That way, framing differences were made explicit and communication on uncertainty and on framing differences was enhanced. As such, the present approach constitutes a first step towards enabling reframing and overcoming framing differences, which are important features on the way to robust decision-making. Moreover, the elaborated criteria build a basis for the development of more structured approaches to dealing with uncertainties in water management practice. Copyright 2009 Elsevier Ltd. All rights reserved.

  7. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    NASA Astrophysics Data System (ADS)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
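    The MCMC workhorse behind schemes like the one described is typically a random-walk Metropolis sampler. The toy sketch below (my own illustration; BOSS itself targets far higher-dimensional posteriors over process-rate parameters) shows the core accept/reject loop against a standard-normal posterior.

```python
import math
import random

def metropolis(logpost, theta0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter.

    logpost: function returning the log posterior density (up to a
    constant).  Proposals are Gaussian perturbations of width `step`;
    a proposal is accepted with probability min(1, exp(lp_prop - lp)).
    """
    rng = random.Random(seed)
    theta, lp = theta0, logpost(theta0)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop  # accept
        chain.append(theta)
    return chain

# Toy posterior: standard normal, log density -theta^2/2 up to a constant.
chain = metropolis(lambda t: -0.5 * t * t, 0.0, 5000)
```

    In a real application `logpost` would compare parametrized process rates against observations, and the chain would explore structural variants as well as parameter values.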

  8. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, C. J.

    2017-12-01

    demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher level products for a wide range of essential climate variables.

  9. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distributing scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites, and the magnitude of data from climate model output, are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while

  10. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  11. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  12. Media portrayal of prenatal and postpartum marijuana use in an era of scientific uncertainty.

    PubMed

    Jarlenski, Marian; Koma, Jonathan W; Zank, Jennifer; Bodnar, Lisa M; Tarr, Jill A; Chang, Judy C

    2018-06-01

    Objectives were to characterize how scientific information about prenatal and postpartum marijuana use was presented in online media content, and to assess how media portrayed risks and benefits of such marijuana use. We analyzed online media items (n = 316) from March 2015 to January 2017. A codebook was developed to measure media content in 4 domains: scientific studies, information about health and well-being, mode of ingestion, and portrayal of risks and benefits. Content analysis was performed by two authors, with high inter-rater reliability (mean κ = 0.82). Descriptive statistics were used to characterize content, and regression analyses were used to test for predictors of media portrayal of the risk-benefit ratio of prenatal and postpartum marijuana use. 51% of the media items mentioned health risks of prenatal and postpartum marijuana use. Nearly one-third (28%) mentioned marijuana use for treatment of nausea and vomiting in pregnancy. Most media items mentioned a specific research study. More than half of media (59%) portrayed prenatal or postpartum marijuana risks > benefits, 10% portrayed benefits > risks, and the remainder were neutral. While mention of a scientific study was not predictive of the portrayal of the risk-benefit ratio of marijuana use in pregnancy or postpartum, discussion of health risks and health benefits predicted portrayals of the risk-benefit ratio. Online media content about prenatal and postpartum marijuana use presented health risks consistent with evidence, and discussed a health benefit of marijuana use for nausea and vomiting in pregnancy. Portrayal of risks and benefits was somewhat equivocal, consistent with current scientific debate. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Medical Humanities: The Rx for Uncertainty?

    PubMed

    Ofri, Danielle

    2017-12-01

    While medical students often fear the avalanche of knowledge they are required to learn during training, it is learning to translate that knowledge into wisdom that is the greatest challenge of becoming a doctor. Part of that challenge is learning to tolerate ambiguity and uncertainty, a difficult feat for doctors who are taught to question anything that is not evidence based or peer reviewed. The medical humanities specialize in this ambiguity and uncertainty, which are hallmarks of actual clinical practice but rarely addressed in medical education. The humanities also force reflection and contemplation, skills that are crucial to thoughtful decision making and to personal wellness. Beyond that, the humanities add a dose of joy and beauty to a training process that is notoriously frugal in these departments. Well integrated, the humanities can be the key to transforming medical knowledge into clinical wisdom.

  14. Challenges in studying the effects of scientific societies on research integrity.

    PubMed

    Levine, Felice J; Iutcovich, Joyce M

    2003-04-01

    Beyond impressionistic observations, little is known about the role and influence of scientific societies on research conduct. Acknowledging that the influence of scientific societies is not easily disentangled from other factors that shape norms and practices, this article addresses how best to study the promotion of research integrity generally as well as the role and impact of scientific societies as part of that process. In setting forth the parameters of a research agenda, the article addresses four issues: (1) how to conceptualize research on scientific societies and research integrity; (2) challenges and complexities in undertaking basic research; (3) strategies for undertaking basic research that is attentive to individual, situational, organizational, and environmental levels of analysis; and (4) the need for evaluation research as integral to programmatic change and to assessment of the impact of activities by scientific societies.

  15. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for phase uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
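    The abstract does not specify how the data are dispersed within asymmetric bounds, but one plausible sketch is sampling from a two-piece ("split") normal distribution, which applies different standard deviations above and below the nominal value. The function name and side-selection rule here are my assumptions, not the paper's method.

```python
import random

def disperse_asymmetric(value, sigma_minus, sigma_plus, rng):
    """Draw a perturbed value from a two-piece (split) normal so the
    dispersion respects asymmetric uncertainty bounds: sigma_plus
    applies above the nominal value and sigma_minus below it.

    The side is chosen with probability proportional to its sigma,
    which keeps the density continuous at the nominal value.
    """
    p_plus = sigma_plus / (sigma_plus + sigma_minus)
    sigma = sigma_plus if rng.random() < p_plus else -sigma_minus
    return value + abs(rng.gauss(0.0, 1.0)) * sigma
```

    With sigma_plus three times sigma_minus, about three quarters of the Monte Carlo draws land above the nominal value, reflecting the skewed bounds.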

  16. Fifth International Conference on Squeezed States and Uncertainty Relations

    NASA Technical Reports Server (NTRS)

    Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)

    1998-01-01

    The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The original scientific purpose of the series was to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and the foundations of quantum mechanics. Quantum optics will continue to play the pivotal role, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by the IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999; the meeting will take place in Ravello near Naples.

  17. Eliciting climate experts' knowledge to address model uncertainties in regional climate projections: a case study of Guanacaste, Northwest Costa Rica

    NASA Astrophysics Data System (ADS)

    Grossmann, I.; Steyn, D. G.

    2014-12-01

    Global general circulation models typically cannot provide the detailed and accurate regional climate information required by stakeholders for climate adaptation efforts, given their limited capacity to resolve the regional topography and changes in local sea surface temperature, wind and circulation patterns. The study region in Northwest Costa Rica has a tropical wet-dry climate with a double-peak wet season. During the dry season the central Costa Rican mountains prevent tropical Atlantic moisture from reaching the region. Most of the annual precipitation is received following the northward migration of the ITCZ in May that allows the region to benefit from moist southwesterly flow from the tropical Pacific. The wet season begins with a short period of "early rains" and is interrupted by the mid-summer drought associated with the intensification and westward expansion of the North Atlantic subtropical high in late June. Model projections for the 21st century indicate a lengthening and intensification of the mid-summer drought and a weakening of the early rains on which current crop cultivation practices rely. We developed an expert elicitation to systematically address uncertainties in the available model projections of changes in the seasonal precipitation pattern. Our approach extends an elicitation approach developed previously at Carnegie Mellon University. Experts in the climate of the study region or Central American climate were asked to assess the mechanisms driving precipitation during each part of the season, uncertainties regarding these mechanisms, expected changes in each mechanism in a warming climate, and the capacity of current models to reproduce these processes. To avoid overconfidence bias, a step-by-step procedure was followed to estimate changes in the timing and intensity of precipitation during each part of the season. The questions drew upon interviews conducted with the region's stakeholders to assess their climate information needs.

  18. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference case and many alternative cases representing various groups of FEPs were defined, and individual numerical simulations were performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters and a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.
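    The two parameter-uncertainty treatments the abstract contrasts, high/low point estimates versus a probabilistic analysis, can be sketched with a stand-in performance measure. The `dose` function and its parameter ranges below are purely hypothetical; a real performance-assessment model chains many transport and engineered-barrier submodels.

```python
import random
import statistics

def dose(k_sorption, flow):
    """Hypothetical stand-in performance measure: a release scaled by
    groundwater flow and attenuated by sorption."""
    return flow / (1.0 + k_sorption)

# Point estimates: conservative low/high parameter combinations.
low_case = dose(k_sorption=10.0, flow=0.1)
high_case = dose(k_sorption=1.0, flow=2.0)

# Probabilistic analysis: sample the assumed parameter distributions.
rng = random.Random(0)
samples = [dose(rng.uniform(1.0, 10.0), rng.uniform(0.1, 2.0))
           for _ in range(10000)]
p95 = statistics.quantiles(samples, n=20)[-1]  # 95th percentile
```

    The probabilistic percentile lands between the bounding point estimates, which is the usual motivation for supplementing bounding cases with a sampled distribution: the bounds alone can be much wider than the credible range.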

  19. Uncertainties in hydrological extremes projections and its effects on decision-making processes in an Amazonian sub-basin.

    NASA Astrophysics Data System (ADS)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lazaro Siqueira Junior, Jose

    2013-04-01

    Uncertainties in climate change projections are irreducible, owing to the limitations of knowledge, the chaotic nature of the climate system and the human decision-making process. Such uncertainties affect impact studies, complicating the decision-making processes aimed at mitigation and adaptation. However, these uncertainties also open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Through such analyses it is possible to identify critical issues which must be studied more deeply. For this study we used several future projections from General Circulation Models to feed a hydrological model applied to the Amazonian sub-basin of Ji-Paraná. Hydrological model integrations are performed for the present historical period (1970-1990) and for the future period (2010-2100). Extreme value analyses are performed on each simulated time series and the results are compared with extreme events in the present climate. A simple approach to identifying potential vulnerabilities consists of evaluating the hydrologic system response to climate variability and extreme events observed in the past, comparing them with the conditions projected for the future. Thus it is possible to identify critical issues that need attention and more detailed study. To this end, we used socio-economic data from the Brazilian Institute of Geography and Statistics, the Operator of the National Electric System and the Brazilian National Water Agency, as well as published scientific and press information. This information is used to characterize impacts associated with extreme hydrological events in the basin during the present historical period and to evaluate potential impacts in the future under the different hydrological projections. Results show that inter-model variability leads to a broad dispersion in projected extreme values. The impact of such dispersion differs across aspects of the socio-economic and natural systems and must be carefully
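    The extreme value analysis step can be illustrated with a Gumbel fit to annual maxima, a common choice for hydrological extremes. The abstract does not name the distribution used, so this method-of-moments Gumbel sketch is an assumption; the function names are mine.

```python
import math
import statistics

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit(annual_maxima):
    """Method-of-moments Gumbel fit: mean = mu + gamma*beta and
    std = pi*beta/sqrt(6) give the location mu and scale beta."""
    beta = statistics.stdev(annual_maxima) * math.sqrt(6.0) / math.pi
    mu = statistics.mean(annual_maxima) - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T years (T-year event)."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

    Fitting the same estimator to each GCM-driven simulated series and comparing, say, 100-year return levels against the present-climate fit is one way to expose the inter-model dispersion the abstract reports.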

  20. Matching experimental and three dimensional numerical models for structural vibration problems with uncertainties

    NASA Astrophysics Data System (ADS)

    Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.

    2018-03-01

    A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random variables in the finite element model in order to explore the effects of uncertainty on the quality of the model outputs, i.e. the natural frequencies. The accuracy of the model's output predictions is compared with experimental results. To this end, a non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that geometrical uncertainties have more influence on the natural frequencies than material uncertainties, their effect being about two times higher. This gives valuable insight for improving the finite element model across the various parameter ranges required in a modeling process involving uncertainty.
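    The dominance of geometric over material scatter can be illustrated with a hedged sketch for an Euler-Bernoulli cantilever beam, whose first bending frequency scales as (h/L²)·sqrt(E/ρ): with equal relative scatter on all parameters, the geometric terms move the frequency more because they enter with higher exponents. The nominal values and the 1% scatter are assumptions for illustration, not the paper's measured values.

```python
import math
import random

def natural_frequency(E, rho, L, h):
    """First bending frequency of a cantilever beam of rectangular
    cross-section (Euler-Bernoulli): f ~ (h / L^2) * sqrt(E / rho)."""
    lam = 1.875104  # first eigenvalue of the cantilever mode shape
    return (lam ** 2 / (2 * math.pi)) * (h / L ** 2) * math.sqrt(E / (12 * rho))

random.seed(0)
# Hypothetical nominal steel beam: E (Pa), density (kg/m3), length, height (m)
E0, rho0, L0, h0 = 210e9, 7850.0, 0.5, 0.01

def frequency_cv(vary_geometry, vary_material, n=20000):
    """Coefficient of variation of the frequency under 1% parameter scatter."""
    freqs = []
    for _ in range(n):
        E = random.gauss(E0, 0.01 * E0) if vary_material else E0
        rho = random.gauss(rho0, 0.01 * rho0) if vary_material else rho0
        L = random.gauss(L0, 0.01 * L0) if vary_geometry else L0
        h = random.gauss(h0, 0.01 * h0) if vary_geometry else h0
        freqs.append(natural_frequency(E, rho, L, h))
    m = sum(freqs) / n
    s = math.sqrt(sum((f - m) ** 2 for f in freqs) / (n - 1))
    return s / m

cv_geom = frequency_cv(True, False)
cv_mat = frequency_cv(False, True)
print(f"geometry-induced CV: {cv_geom:.3%}, material-induced CV: {cv_mat:.3%}")
```

    In this toy model the geometric scatter dominates because h and L enter with exponents 1 and 2, while E and ρ enter only under a square root.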

  1. Harnessing the uncertainty monster: Putting quantitative constraints on the intergenerational social discount rate

    NASA Astrophysics Data System (ADS)

    Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.

    2017-09-01

    There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end, while the spot (horizon) discount rate drops below 2% by 2100 and falls below previous estimates by 2070.
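    A certainty-equivalent declining discount rate can be sketched in a few lines: when the future discount rate is uncertain, one averages the discount factors (not the rates) across scenarios, which makes the implied spot rate fall toward the lowest scenario rate at long horizons. The scenario rates below are hypothetical, not the paper's calibrated values.

```python
import math

# Hypothetical equally likely constant-rate scenarios (per year), standing
# in for the paper's scientific/social/political/ethical spread of views
rates = [0.001, 0.01, 0.02, 0.04, 0.07]

def spot_rate(t):
    """Certainty-equivalent (spot) discount rate at horizon t years:
    average the discount factors across scenarios, then invert."""
    factor = sum(math.exp(-r * t) for r in rates) / len(rates)
    return -math.log(factor) / t

for t in (10, 50, 100, 200):
    print(f"t = {t:3d} y  spot rate = {spot_rate(t):.3%}")
```

    The decline arises because high-rate scenarios contribute almost nothing to the average discount factor at long horizons, so the low-rate scenarios dominate.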

  2. Assessing measurement uncertainty in meteorology in urban environments

    NASA Astrophysics Data System (ADS)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects, which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operational automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis of an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is described and preliminary results of its application to air temperature are discussed; these allowed an upper limit of 1 °C to be set for the added measurement uncertainty at the top of the urban canopy layer.

  3. Final Scientific/Technical Report: National Institute for Climatic Change Research Coastal Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tornqvist, Torbjorn; Chambers, Jeffrey

    It is widely recognized that coastal environments are under particular threat from changes associated with climate change. Accelerated sea-level rise, in some regions augmented by land subsidence, plus the possibility of a changing storm climate, renders low-lying coastal landscapes and their ecosystems vulnerable to future change. This is a pressing problem, because these ecosystems commonly rank as some of the most valuable on the planet. The objective of the NICCR Coastal Center was to support basic research aimed at reducing uncertainty about ecosystem changes during the next century, carried out along the U.S. coastlines. The NICCR Coastal Center has funded 20 projects nationwide (carried out at 27 institutions) that addressed numerous aspects of the problems outlined above. The research has led to a variety of new insights, a significant number of which have been published in elite scientific journals. It is anticipated that the dissemination of this work in the scientific literature will continue for several more years, given that a number of projects have only recently reached their end date. In addition, NICCR funds have been used to support research at Tulane University. The lion’s share of these funds has been invested in the development of unique facilities for experimental research in coastal ecosystems. This aspect of the work could have a lasting impact in the future.

  4. Assessment of SFR Wire Wrap Simulation Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical

  5. Improving uncertainty estimates: Inter-annual variability in Ireland

    NASA Astrophysics Data System (ADS)

    Pullinger, D.; Zhang, M.; Hill, N.; Crutchley, T.

    2017-11-01

    This paper addresses the uncertainty associated with inter-annual variability used within wind resource assessments for Ireland, in order to more accurately represent the uncertainties within wind resource and energy yield assessments. The study was undertaken using a total of 16 ground stations (Met Eireann) and corresponding reanalysis datasets, providing an update to previous work on this topic undertaken nearly 20 years ago. The results demonstrate that the previously reported 5.4% wind speed inter-annual variability remains appropriate; guidance is given on how to provide a robust assessment of IAV using available sources of data, including ground stations, MERRA-2 and ERA-Interim.

  6. [The treatment of scientific knowledge in the framework of CITES].

    PubMed

    Lanfranchi, Marie-Pierre

    2014-03-01

    Access to scientific knowledge in the context of CITES is a crucial issue. The effectiveness of the text is indeed largely based on adequate scientific knowledge of CITES species. This is a major challenge: more than 30,000 species and 178 member states are involved. The issue of expertise, however, is not really addressed by the Convention. The question was left to the consideration of the COP. Therefore, the COP has created two ad hoc scientific committees: the Plants Committee and the Animals Committee, conferring upon them an ambitious mandate. The article addresses some important issues at stake which are linked to institutional questions, as well as the mixed record after twenty-five years of practice.

  7. Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness

    NASA Astrophysics Data System (ADS)

    Irias, X.; Cicala, D.

    2013-12-01

    Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is
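    The info-gap idea sketched above can be made concrete with a toy example, assuming a simple fractional-error uncertainty model around a best-estimate demand: the robustness of a design is the largest estimation error it can absorb while still meeting its performance requirement. All names and numbers below are hypothetical, not EBMUD figures.

```python
def robustness(capacity, demand_estimate):
    """Info-gap robustness: the largest fractional error alpha in the
    demand estimate for which the design still satisfies
    capacity >= demand_estimate * (1 + alpha)."""
    return max(0.0, capacity / demand_estimate - 1.0)

# Hypothetical designs trading construction cost against robustness
demand_estimate = 100.0  # best-estimate seismic demand (arbitrary units)
designs = {"minimal": 105.0, "moderate": 130.0, "robust": 170.0}
for name, capacity in designs.items():
    h = robustness(capacity, demand_estimate)
    print(f"{name:8s} capacity {capacity:5.0f} -> robust to {h:.0%} demand error")
```

    Rather than optimizing for the best-estimate demand, the planner reads off how much the estimate can be wrong before each design fails, and weighs that robustness against cost.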

  8. Scientific Opinion on Risk Assessment of Synthetic Biology.

    PubMed

    Epstein, Michelle M; Vermeire, Theo

    2016-08-01

    In 2013, three Scientific Committees of the European Commission (EC) drafted Scientific Opinions on synthetic biology that provide an operational definition and address risk assessment methodology, safety aspects, environmental risks, knowledge gaps, and research priorities. These Opinions contribute to the international discussions on the risk governance for synthetic biology developments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. The influence of weight-of-evidence strategies on audience perceptions of (un)certainty when media cover contested science.

    PubMed

    Kohl, Patrice Ann; Kim, Soo Yun; Peng, Yilang; Akin, Heather; Koh, Eun Jeong; Howell, Allison; Dunwoody, Sharon

    2016-11-01

    Controversy in science news accounts attracts audiences and draws attention to important science issues. But sometimes covering multiple sides of a science issue does the audience a disservice. Counterbalancing a truth claim backed by strong scientific support with a poorly backed argument can unnecessarily heighten audience perceptions of uncertainty. At the same time, journalistic norms often constrain reporters to "get both sides of the story" even when there is little debate in the scientific community about which truth claim is most valid. In this study, we look at whether highlighting the way in which experts are arrayed across truth claims-a strategy we label "weight-of-evidence reporting"-can attenuate heightened perceptions of uncertainty that can result from coverage of conflicting claims. The results of our study suggest weight-of-evidence strategies can indeed play a role in reducing some of the uncertainty audiences may perceive when encountering lop-sided truth claims. © The Author(s) 2015.

  10. An Ethnomethodological Perspective on How Middle School Students Addressed a Water Quality Problem

    ERIC Educational Resources Information Center

    Belland, Brian R.; Gu, Jiangyue; Kim, Nam Ju; Turner, David J.

    2016-01-01

    Science educators increasingly call for students to address authentic scientific problems in science class. One form of authentic science problem--socioscientific issue--requires that students engage in complex reasoning by considering both scientific and social implications of problems. Computer-based scaffolding can support this process by…

  11. Model averaging techniques for quantifying conceptual model uncertainty.

    PubMed

    Singh, Abhishek; Mishra, Srikanta; Ruskauff, Greg

    2010-01-01

    In recent years a growing understanding has emerged regarding the need to expand the modeling paradigm to include conceptual model uncertainty for groundwater models. Conceptual model uncertainty is typically addressed by formulating alternative model conceptualizations and assessing their relative likelihoods using statistical model averaging approaches. Several model averaging techniques and likelihood measures have been proposed in the recent literature for this purpose with two broad categories--Monte Carlo-based techniques such as Generalized Likelihood Uncertainty Estimation or GLUE (Beven and Binley 1992) and criterion-based techniques that use metrics such as the Bayesian and Kashyap Information Criteria (e.g., the Maximum Likelihood Bayesian Model Averaging or MLBMA approach proposed by Neuman 2003) and Akaike Information Criterion-based model averaging (AICMA) (Poeter and Anderson 2005). These different techniques can often lead to significantly different relative model weights and ranks because of differences in the underlying statistical assumptions about the nature of model uncertainty. This paper provides a comparative assessment of the four model averaging techniques (GLUE, MLBMA with KIC, MLBMA with BIC, and AIC-based model averaging) mentioned above for the purpose of quantifying the impacts of model uncertainty on groundwater model predictions. Pros and cons of each model averaging technique are examined from a practitioner's perspective using two groundwater modeling case studies. Recommendations are provided regarding the use of these techniques in groundwater modeling practice.
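    As a hedged illustration of the criterion-based averaging mentioned above (the AIC variant only), model weights can be computed from information-criterion scores and used to blend predictions. The AIC scores and predictions below are invented for the example, not taken from the case studies.

```python
import math

def akaike_weights(aic_values):
    """Convert AIC scores of alternative conceptual models into
    normalized model weights (smaller AIC -> larger weight)."""
    best = min(aic_values)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for four alternative groundwater conceptualizations
aic = {"M1": 212.3, "M2": 214.1, "M3": 219.8, "M4": 225.0}
weights = akaike_weights(list(aic.values()))
for name, w in zip(aic, weights):
    print(f"{name}: weight {w:.3f}")

# A model-averaged prediction is the weighted sum of individual predictions
preds = [3.1, 2.8, 3.6, 4.0]  # hypothetical predictions from each model
avg = sum(w * p for w, p in zip(weights, preds))
print(f"model-averaged prediction: {avg:.2f}")
```

    GLUE and the MLBMA variants follow the same blend-by-weight pattern but derive the weights from likelihoods or from KIC/BIC scores, which is why the techniques can rank the same models quite differently.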

  12. [Scientific journals of medical students in Latin-America].

    PubMed

    Cabrera-Samith, Ignacio; Oróstegui-Pinilla, Diana; Angulo-Bazán, Yolanda; Mayta-Tristán, Percy; Rodríguez-Morales, Alfonso J

    2010-11-01

    This article deals with the history and evolution of students' scientific journals in Latin America: their beginnings, how many still exist and what their future prospects are. Relevant events show the growth of students' scientific journals in Latin America and how they are working together to improve their quality. This article is addressed not only to Latin American readers but also to readers worldwide. Latin American medical students are consistently working together to publish scientific research whose quality is constantly improving.

  13. MANAGING UNCERTAINTIES ASSOCIATED WITH RADIOACTIVE WASTE DISPOSAL: TASK GROUP 4 OF THE IAEA PRISM PROJECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seitz, R.

    2011-03-02

    It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussions of the current results of work on the task on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed and some initial proposals for approaches that can be used to manage different types of uncertainties.

  14. Uncertainty quantification for environmental models

    USGS Publications Warehouse

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    ]. There are also bootstrapping and cross-validation approaches. Sometimes analyses are conducted using surrogate models [12]. The availability of so many options can be confusing. Categorizing methods based on fundamental questions assists in communicating the essential results of uncertainty analyses to stakeholders. Such questions can focus on model adequacy (e.g., How well does the model reproduce observed system characteristics and dynamics?) and sensitivity analysis (e.g., What parameters can be estimated with available data? What observations are important to parameters and predictions? What parameters are important to predictions?), as well as on uncertainty quantification (e.g., How accurate and precise are the predictions?). The methods can also be classified by the number of model runs required: few (10s to 1000s) or many (10,000s to 1,000,000s). Of the methods listed above, the most computationally frugal are generally those based on local derivatives; MCMC methods tend to be among the most computationally demanding. Surrogate models (emulators) do not necessarily produce computational frugality, because many runs of the full model are generally needed to create a meaningful surrogate model. With this categorization we can, in general, address all the fundamental questions mentioned above using either computationally frugal or demanding methods. Model development and analysis can thus be conducted consistently using either computationally frugal or demanding methods; alternatively, different fundamental questions can be addressed using methods that require different levels of effort. Based on this perspective, we pose the question: Can computationally frugal methods be useful companions to computationally demanding methods? The reliability of computationally frugal methods generally depends on the model being reasonably linear, which usually means smooth nonlinearities and the assumption of Gaussian errors; both tend to be more valid with more linear

  15. 75 FR 51239 - Census Scientific Advisory Committee; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-19

    ... Advisory Committee (C-SAC). The Committee will address policy, research, and technical issues relating to a.... The Committee provides scientific and technical expertise, as appropriate, to address Census Bureau...

  16. Psychological defense, ideological hideaway, or rational reckoning? The role of uncertainty in local adaptation to climate change

    NASA Astrophysics Data System (ADS)

    Moser, S. C.

    2011-12-01

    As adaptation planning rises rapidly on the agenda of decision-makers, the need for adequate information to inform those decisions is growing. Locally relevant climate change information (as well as related impacts and vulnerability information), however, is difficult to obtain, and that which can be obtained carries the burden of significant scientific uncertainty. This paper aims to assess how important such uncertainty is in adaptation planning, decision-making, and related stakeholder engagement. Does uncertainty actually hinder adaptation planning? Is scientific uncertainty used to postpone decisions, reflecting ideological agendas? Or is it a convenient defense against cognitive and affective engagement with the emerging and projected - and in some cases daunting - climate change risks? To whom does such uncertainty matter, and how important is it relative to other challenges decision-makers and stakeholders face? The paper draws on four sources of information to answer these questions: (1) a statewide survey of California coastal managers conducted in summer 2011; (2) years of continual engagement with, and observation of, decision-makers in local adaptation efforts; (3) findings from focus groups with lay individuals in coastal California; and (4) a review of relevant adaptation literature to guide and contextualize the empirical research. The findings entail some "inconvenient truths" for those who claim uncertainty has critical technical or political importance. Rather, the insights suggest that some uncertainties matter more than others; they matter at certain times, but not at others; and they matter to some decision-makers, but not to others. Implications for scientists communicating and engaging with communities are discussed.

  17. Quantifying uncertainty in read-across assessment – an algorithmic approach - (SOT)

    EPA Science Inventory

    Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...

  18. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.

  19. Scientific Reporting: Raising the Standards

    ERIC Educational Resources Information Center

    McLeroy, Kenneth R.; Garney, Whitney; Mayo-Wilson, Evan; Grant, Sean

    2016-01-01

    This article is based on a presentation that was made at the 2014 annual meeting of the editorial board of "Health Education & Behavior." The article addresses critical issues related to standards of scientific reporting in journals, including concerns about external and internal validity and reporting bias. It reviews current…

  20. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    decision making under uncertainty methods from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.

  1. The fact of uncertainty, the uncertainty of facts and the cultural resonance of doubt.

    PubMed

    Oreskes, Naomi

    2015-11-28

    Sixty years after industry executives first decided to fight the facts of tobacco, the exploitation of doubt and uncertainty as a defensive tactic has spread to a diverse set of industries and issues with an interest in challenging scientific evidence. However, one can find examples of doubt-mongering before tobacco. One involves the early history of electricity generation in the USA. In the 1920s, the American National Electric Light Association ran a major propaganda campaign against public sector electricity generation, focused on the insistence that privately generated electricity was cheaper and that public power generation was socialistic and therefore un-American. This campaign included advertisements, editorials (generally ghost-written), the rewriting of textbooks and the development of high school and college curricula designed to cast doubt on the cost-effectiveness of public electricity generation and extol the virtues of laissez-faire capitalism. It worked in large part by finding, cultivating and paying experts to endorse the industry's claims in the mass media and the public debate, and to legitimize the alterations to textbooks and curricula. The similarities between the electric industry strategy and the defence of tobacco, lead paint and fossil fuels suggest that these strategies work for reasons that are not specific to the particular technical claims under consideration. This paper argues that a reason for the cultural persistence of doubt is what we may label the 'fact of uncertainty'. Uncertainty is intrinsic to science, and this creates vulnerabilities that interested parties may, and commonly do, exploit, both by attempting to challenge the specific conclusions of technical experts and by implying that those conclusions threaten other social values. © 2015 The Author(s).

  2. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    NASA Astrophysics Data System (ADS)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

    Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty in this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions in which the bimolecular diffusion parameters were measured, we apply a problem-agnostic Bayesian framework to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to the temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate, with uncertainty due to bimolecular diffusion, of the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.

  3. Uncertainty and psychological adjustment in patients with lung cancer

    PubMed Central

    Kurita, Keiko; Garon, Edward B.; Stanton, Annette L.; Meyerowitz, Beth E.

    2014-01-01

    Background For many patients with lung cancer, disease progression occurs without notice or with vague symptoms, and unfortunately, most treatments are not curative. Given this unpredictability, we hypothesized the following: (1) poorer psychological adjustment (specifically, more depressive symptoms, higher perceptions of stress, and poorer emotional well-being) would be associated with higher intolerance for uncertainty, higher perceived illness-related ambiguity, and their interaction; and (2) greater avoidance would mediate associations between higher intolerance of uncertainty and poorer psychological adjustment. Methods Participants (N = 49) diagnosed with lung cancer at least 6 months prior to enrollment completed the Center for Epidemiologic Studies – Depression Scale, the Functional Assessment of Cancer Therapy – Lung Emotional Well-being subscale, the Perceived Stress scale, the Intolerance of Uncertainty scale, the Mishel Uncertainty in Illness Scale Ambiguity subscale, the Impact of Event – Revised Avoidance subscale, and the Short-scale Eysenck Personality Questionnaire – Revised Neuroticism subscale. Mean age was 64.2 years (standard deviation [SD] = 11.0), mean years of education was 15.6 (SD = 3.1), and 71.4% were female. Hypotheses were tested with regression analyses, adjusted for neuroticism. Results Higher perceptions of stress and poorer emotional well-being were associated with higher levels of intolerance of uncertainty and higher perceived illness-related ambiguity. Non-somatic depressive symptoms were associated with higher levels of intolerance of uncertainty. Avoidance was found to mediate relations of intolerance of uncertainty with non-somatic depressive symptoms and emotional well-being only. Conclusions Findings suggest that interventions to address avoidance and intolerance of uncertainty in individuals with lung cancer may help improve psychological adjustment. PMID:22887017

  4. Decision analysis of shoreline protection under climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

    If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties? (2) What is the economic loss if climate change uncertainty is ignored? (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.

  5. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; de Moel, H.

    2016-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage functions and maximum damages can have large effects on flood damage estimates. This explanation is then used to quantify the uncertainty in the damage estimates with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from seven different flood damage models. The paper shows that the resulting uncertainties in estimated damages are on the order of a factor of 2 to 5. The uncertainty is typically larger for flood events with small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
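    The Monte Carlo idea in this record can be sketched as follows. Everything here is illustrative: the linear damage-function shapes, parameter ranges, and the five-function "library" are invented stand-ins for the paper's 272-function library.

```python
import random

# Hypothetical sketch of a Monte Carlo over flood damage model assumptions:
# sample a depth-damage function from a library and a maximum-damage value,
# then estimate damage for a given water depth. All numbers are illustrative.

def make_damage_function(slope):
    # Simple linear depth-damage fraction, capped at 1.0 (illustrative shape).
    return lambda depth: min(1.0, slope * depth)

random.seed(42)
library = [make_damage_function(s) for s in (0.1, 0.2, 0.3, 0.4, 0.5)]

def sample_damage(depth_m, runs=10_000):
    estimates = []
    for _ in range(runs):
        f = random.choice(library)                 # damage-function uncertainty
        max_damage = random.uniform(100e3, 300e3)  # assumed max damage (EUR)
        estimates.append(f(depth_m) * max_damage)
    return estimates

est = sample_damage(depth_m=1.0)
spread = max(est) / min(est)  # spread of estimates across the library
```

With a library drawn from real depth-damage models, the spread of the resulting estimates plays the role of the factor-of-2-to-5 uncertainty the paper reports.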

  6. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the
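    The record's point that error correlation dominates on large space and time scales can be made concrete with a toy calculation (all numbers invented, not from any real CDR product). Each datum carries a random (uncorrelated) component and a systematic (fully correlated) component.

```python
import math

# Toy illustration of why error correlation matters when averaging a climate
# data record to large scales. Uncorrelated errors average down as 1/sqrt(n);
# fully correlated (systematic) errors do not.

n = 10_000          # number of measurements averaged into one large-scale value
u_random = 0.5      # per-datum random standard uncertainty (K), assumed
u_systematic = 0.1  # per-datum systematic standard uncertainty (K), assumed

u_random_avg = u_random / math.sqrt(n)                   # shrinks with n
u_total_avg = math.sqrt(u_random_avg**2 + u_systematic**2)  # systematic dominates
```

Although the systematic component is five times smaller per datum, it dominates the uncertainty of the large-scale average, exactly the situation the review describes.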

  7. Professionalism, scientific freedom and dissent: individual and institutional roles and responsibilities in geoethics

    NASA Astrophysics Data System (ADS)

    Bilham, Nic

    2015-04-01

    Debate and dissent are at the heart of scientific endeavour. A diversity of perspectives, alternative interpretations of evidence and the robust defence of competing theories and models drive the advancement of scientific knowledge. Just as importantly, legitimate dissent and diversity of views should not be covered up when offering scientific advice to policy-makers and providing evidence to inform public debate - indeed, they should be valued. We should offer what Andy Stirling has termed 'plural and conditional' scientific advice, not just for the sake of democratic legitimacy, but because it supports better informed and more effective policy-making. 'Monocultures' of scientific advice may have a superficial appeal to policy-makers, but they devalue the contribution of scientists, undermine the resilience of regulatory structures, are often misleading, and can lead to catastrophic policy failure. Furthermore, many of the great societal challenges now facing us require interdisciplinary approaches, across the natural sciences and more widely still, which bring to the fore the need for humility, recognition that we do not have all the answers, and mutual respect for the views of others. In contentious areas such as climate change, extraction of shale gas and radioactive waste disposal, however, such open dialogue may make researchers and practitioners vulnerable to advocates and campaigners who cherry-pick the evidence, misinterpret it, or seek to present scientific uncertainty and debate as mere ignorance. Nor are scientists themselves always above such unethical tactics. The apparent authority conferred on unscrupulous 'campaigning scientists' by their academic and professional credentials may make it all but impossible to distinguish them from those who legitimately make the case for a minority scientific view (and may be marginalised by the mainstream of their discipline in doing so). There is a risk that real scientific debate may be thwarted. Individual

  8. End of life care interventions for people with dementia in care homes: addressing uncertainty within a framework for service delivery and evaluation.

    PubMed

    Goodman, Claire; Froggatt, Katherine; Amador, Sarah; Mathie, Elspeth; Mayrhofer, Andrea

    2015-09-17

    There has been an increase in research on improving end of life (EoL) care for older people with dementia in care homes. Findings consistently demonstrate improvements in practitioner confidence and knowledge, but comparisons are either with usual care or not made at all. This paper draws on findings from three studies to develop a framework for understanding the essential dimensions of end of life care delivery in long-term care settings for people with dementia. The data from three studies on EoL care in care homes: (i) EVIDEM EoL, (ii) EPOCH, and (iii) TTT EoL were used to inform the development of the framework. All used mixed method designs and two had an intervention designed to improve how care home staff provided end of life care. The EVIDEM EoL and EPOCH studies tracked the care of older people in care homes over a period of 12 months. The TTT study collected resource use data of care home residents for three months, and surveyed decedents' notes for ten months. Across the three studies, 29 care homes, 528 residents, 205 care home staff, and 44 visiting health care professionals participated. Analysis showed that end of life interventions for people with dementia were characterised by uncertainty in three key areas: what treatment is the 'right' treatment, who should do what and when, and in which setting EoL care should be delivered and by whom? These uncertainties are conceptualised as Treatment uncertainty, Relational uncertainty and Service uncertainty. This paper proposes an emergent framework to inform the development and evaluation of EoL care interventions in care homes. For people with dementia living and dying in care homes, EoL interventions need to provide strategies that can accommodate or "hold" the inevitable and often unresolvable uncertainties of providing and receiving care in these settings.

  9. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
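    As a toy illustration of pulling a real parameter variation out into a delta block (not the paper's method), consider the scalar system x' = -a(1 + w·δ)x + u with real |δ| ≤ 1. Isolating δ gives M11(s) = -a·w/(s + a), and the robust stability margin is 1 / sup_ω |M11(jω)|. All values below are hypothetical:

```python
import math

# Scalar M-delta sketch: x' = -a*(1 + w*delta)*x + u, real |delta| <= 1.
# Pulling delta out yields M11(s) = -a*w/(s+a); the system remains stable
# for |delta| < 1 / sup_omega |M11(j*omega)|.

a, w = 2.0, 0.5  # assumed nominal pole and uncertainty weight

def M11_mag(omega):
    # |M11(j*omega)| = a*w / |j*omega + a|
    return a * w / math.hypot(omega, a)

peak = max(M11_mag(o * 0.01) for o in range(10_000))  # crude frequency sweep
margin = 1 / peak  # stable for |delta| < margin
```

The peak occurs at ω = 0, giving margin 1/w = 2, which agrees with the direct check: the pole -a(1 + wδ) only reaches the imaginary axis at δ = -1/w.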

  10. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.

  11. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data to be multi-valued data since they consist of a collection of values about a single variable. Thus, a multi-valued data set represents both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice is presented vertically above the slice, forming a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness provides another useful piece of summary information for multimodal distributions. The uncertainty of the multi
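    The per-cell ingredients of such a "PDF wall" can be sketched as follows: build a smooth Gaussian kernel density estimate from one grid cell's samples and count its peaks (the "roughness"). The samples, bandwidth, and grid below are made up for illustration.

```python
import math

# Illustrative sketch of one cell of a PDF wall: a hand-rolled Gaussian kernel
# density estimate plus a peak count ("roughness"). Data are invented.

def gaussian_kde(samples, xs, bandwidth=0.5):
    # Kernel density estimate evaluated at each point in xs.
    norm = len(samples) * bandwidth * math.sqrt(2 * math.pi)
    return [sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples) / norm
            for x in xs]

def roughness(density):
    # Number of local maxima (modes) in the estimated PDF.
    return sum(1 for i in range(1, len(density) - 1)
               if density[i] > density[i - 1] and density[i] > density[i + 1])

# A bimodal cell: two clusters of possible outcomes at one map location.
samples = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
xs = [i * 0.1 for i in range(80)]
pdf = gaussian_kde(samples, xs)
```

Stacking such per-cell PDFs along a row, column, or transect gives the wall; cells with roughness greater than one flag multimodal, and hence especially uncertain, locations.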

  12. 75 FR 53325 - Proposed Scientific Integrity Policy of the Department of the Interior

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... September 20, 2010. ADDRESSES: Send comments to: [email protected]ios.doi.gov . FOR FURTHER INFORMATION... scientific products, or on documents compiled and translated from scientific products, to ensure that agency... involving inventorying, monitoring, experimentation, study, research, modeling, and scientific assessment...

  13. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  14. Uncertainties in modelling the climate impact of irrigation

    NASA Astrophysics Data System (ADS)

    de Vrese, Philipp; Hagemann, Stefan

    2017-11-01

    Irrigation-based agriculture constitutes an essential factor for food security as well as fresh water resources and has a distinct impact on regional and global climate. Many issues related to irrigation's climate impact are addressed in studies that apply a wide range of models. These involve substantial uncertainties related to differences in the models' structure and parametrizations on the one hand and the need for simplifying assumptions in the representation of irrigation on the other hand. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System Model, into which a simple irrigation scheme was implemented. In order to estimate possible uncertainties with regard to the model's more general structure, we compared the climate impact of irrigation between three simulations that use different schemes for the land-surface-atmosphere coupling. Here, it can be shown that the choice of coupling scheme affects not only the magnitude of possible impacts but even their direction. For example, when using a scheme that does not explicitly resolve spatial subgrid-scale heterogeneity at the surface, irrigation reduces the atmospheric water content, even in heavily irrigated regions. In contrast, in simulations that use a coupling scheme that resolves heterogeneity at the surface or even within the lowest layers of the atmosphere, irrigation increases the average atmospheric specific humidity. A second experiment targeted possible uncertainties related to the representation of irrigation characteristics. Here, in four simulations the irrigation effectiveness (controlled by the target soil moisture and the non-vegetated fraction of the grid box that receives irrigation) and the timing of delivery were varied. The second experiment shows that uncertainties related to the modelled irrigation characteristics, especially the irrigation effectiveness, are also substantial. In general the impact of irrigation on the state of the land

  15. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies for and outcomes of managing uncertainty, and the factors that influence them. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  16. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on some specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from index system establishment to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors which impact the accuracy of the final classification, such as the selection of class numbers, intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that machine learning methods should be adapted to reflect the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and improves the scientific basis of subsequent planning and decision-making.
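    For comparison with the "Natural Breaks" method discussed in this record, two simpler classification schemes that planners also apply to suitability rasters can be sketched as follows (the scores and class count are invented):

```python
# Two common alternatives to "Natural Breaks" for classifying a suitability
# raster into k classes. Scores below are made-up illustrative values.
scores = [0.1, 0.15, 0.2, 0.22, 0.6, 0.62, 0.65, 0.9, 0.92, 0.95]
k = 3

def equal_interval_breaks(values, k):
    # Split the data range into k equal-width intervals.
    lo, hi = min(values), max(values)
    step = (hi - lo) / k
    return [lo + step * i for i in range(1, k)]

def quantile_breaks(values, k):
    # Place breaks so each class holds roughly the same number of cells.
    s = sorted(values)
    return [s[len(s) * i // k] for i in range(1, k)]
```

Where the data cluster (as above), the two schemes place breaks very differently; Natural Breaks instead seeks breaks that minimize within-class variance, which is why the choice of method and class number matters for the final map.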

  17. THE ROLE OF RISK ASSESSMENT IN ADDRESSING HAZARDOUS WASTE ISSUES

    EPA Science Inventory

    Risk assessment plays many important roles in addressing hazardous waste issues. In addition to providing a scientific framework and common health metric to evaluate risks. Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA or "Superfund") risk assessm...

  18. Using high-throughput literature mining to support read-across predictions of toxicity (SOT)

    EPA Science Inventory

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...

  19. High-throughput literature mining to support read-across predictions of toxicity (ASCCT meeting)

    EPA Science Inventory

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...

  20. Analogy and Intersubjectivity: Political Oratory, Scholarly Argument and Scientific Reports.

    ERIC Educational Resources Information Center

    Gross, Alan G.

    1983-01-01

    Focuses on the different ways political oratory, scholarly argument, and scientific reports use analogy. Specifically, analyzes intersubjective agreement in Franklin D. Roosevelt's First Inaugural address, the scholarly argument between Sir Karl Popper and Thomas S. Kuhn, and the scientific reports of various mathematicians and scientists. (PD)

  1. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, such as integrated assessment modelling, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty will also be attributed to the different input parameters using a variance based sensitivity analysis.
Assessing and visualizing the
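    A minimal sketch of such Monte Carlo propagation with a crude variance-based attribution, using entirely invented distributions and a deliberately simplified risk formula, might look like:

```python
import random

# Toy Monte Carlo propagation through a monetary flood risk estimate, plus a
# crude one-at-a-time variance decomposition. All distributions are invented.

random.seed(0)
N = 20_000

def sample_risk(depth=None, max_damage=None):
    # Expected annual damage = P(flood) * damage fraction(depth) * max damage.
    d = depth if depth is not None else random.uniform(0.5, 2.0)            # m
    m = max_damage if max_damage is not None else random.uniform(1e6, 3e6)  # EUR
    p_flood = 0.01                       # fixed exceedance probability (assumed)
    damage_fraction = min(1.0, 0.4 * d)  # assumed depth-damage relation
    return p_flood * damage_fraction * m

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

total_var = variance([sample_risk() for _ in range(N)])
# Freeze one input at its mean to see how much output variance it was driving.
var_depth_fixed = variance([sample_risk(depth=1.25) for _ in range(N)])
share_depth = 1 - var_depth_fixed / total_var  # rough first-order share of depth
```

Repeating the freeze for each input gives a rough ranking of which uncertain parameter contributes most to the uncertainty of the final risk estimate; proper variance-based methods (e.g. Sobol' indices) formalize the same idea.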

  2. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.
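    A commonly reviewed quadratic GUP form (sign and factor conventions for the deformation parameter vary between authors) can be written as:

```latex
% One standard quadratic GUP form (conventions vary by author):
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta \, (\Delta p)^2\right),
\qquad
\beta = \frac{\beta_0 \, \ell_{\mathrm{Pl}}^2}{\hbar^2},
```

Minimizing the right-hand bound over Δp (the minimum lies at Δp = 1/√β) yields the minimal measurable length Δx_min = ħ√β, which is the origin of the minimal-length phenomenology the review surveys.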

  3. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-05-23

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  4. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Garbil, Roger

    2018-04-16

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France)T. Belgya (IKI KFKI, Hungary)E. Gonzalez (CIEMAT, Spain)F. Gunsing (CEA, France)F.-J. Hambsch (IRMM, Belgium)A. Junghans (FZD, Germany)R. Nolte (PTB, Germany)S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis; Workshop Assistant: Geraldine Jean

  5. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-05-24

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  6. A Physics MOSAIC: Scientific Skills and Explorations for Students

    NASA Astrophysics Data System (ADS)

    May, S.; Clements, C.; Erickson, P. J.; Rogers, A.

    2010-12-01

    A 21st century education needs to teach students how to manage information in an ever more digital age. High school students (like all of us) are inundated with information, and informed citizenship increasingly depends on the ability to be a critical consumer of data. In the scientific community, experimental data from remote, high quality systems are becoming increasingly available in real time. The same networks providing data also allow scientists to use the ubiquity of internet access to enlist citizen scientists to help with research. As a means of addressing and leveraging these trends, we describe a classroom unit developed as part of the NSF Research Experience for Teachers (RET) program at MIT Haystack Observatory in the summer of 2010. The unit uses accessible, real-time science data to teach high school physics students about the nature and process of scientific research, with the goal of teaching how to be an informed citizen, regardless of eventual vocation. The opportunity to study the atmosphere provides increased engagement in the classroom, and students have an authentic experience of asking and answering scientific questions when the answer cannot simply be found on the Web. MOSAIC (Mesospheric Ozone System for Atmospheric Investigations in the Classroom) is a relatively inexpensive tool for measuring mesospheric ozone by taking advantage of the sensitivity of commercially produced satellite TV dishes to the 11.072545 GHz rotational transition of ozone. Because the signal from ozone in the lower atmosphere is pressure-broadened, the system is able to isolate the signal from the 1% of Earth’s ozone that comes from the mesosphere. Our teaching unit takes advantage of measurements collected since 2008 from six East Coast observing sites at high schools and colleges. Data are available online within a day of their collection, and an easy to use web interface allows students to track mesospheric ozone in frequency, time of day, or day of year. The

  7. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports from findings on climate and climate change. Because the findings are uncertain in many respects, producing the reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors may present either an assigned level of confidence or a quantified measure of likelihood. But these two metrics embody distinct, and potentially conflicting, methodologies for aggregating assessments of uncertainty. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality that have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  8. Cued uncertainty modulates later recognition of emotional pictures: An ERP study.

    PubMed

    Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua

    2017-06-01

    Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and this modulation is affected by whether there are cues (i.e., cued uncertainty) or not (i.e., uncued uncertainty) prior to the encoding of the uncertain event. Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how the ERP effects of recognition are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events with respect to feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.

  9. Mythical thinking, scientific discourses and research dissemination.

    PubMed

    Hroar Klempe, Sven

    2011-06-01

    This article focuses on some principles for understanding. Taking Anna Mikulak's article "Mismatches between 'scientific' and 'non-scientific' ways of knowing and their contributions to public understanding of science" (IPBS 2011) as a point of departure, it addresses the idea of demarcation criteria for scientific and non-scientific discourses. This is juxtaposed with mythical thinking, which is supposed to be the most salient trait of non-scientific discourses. The author demonstrates how the most widespread demarcation criterion, the criterion of verification, is self-contradictory, not only logically, but also in its aim of isolating the natural sciences from other forms of knowledge. According to Aristotle, induction is a rhetorical device, and insofar as scientific statements are based on inductive inferences, they rely on the humanities, of which rhetoric is a part. Yet induction also has an empirical component in being based on sense-impressions, which belongs not to rhetoric but to psychology. Myths, likewise, are understood in a rhetorical (Lévi-Strauss) and a psychological (Cassirer) perspective. It is thus argued that both scientific and non-scientific discourses can be mythical.

  10. A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.

    Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system's state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
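    The aleatory/epistemic distinction described above can be illustrated with a toy probability box. This is only a hedged sketch, not the paper's framework: the Normal payoff family, the parameter interval, and all numbers below are invented for illustration.

```python
import math

# Aleatory uncertainty: if the payoff distribution were fully known, a single
# CDF would describe it. Epistemic uncertainty: here the defender only knows
# the payoff mean lies somewhere in [8, 12], so every Normal(mu, 2) with mu
# in that interval is plausible. A probability box brackets the whole family,
# yielding interval-valued rather than point probabilities. (All values are
# illustrative, not from the paper.)
MU_LO, MU_HI, SIGMA = 8.0, 12.0, 2.0

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_bounds(x):
    """Lower/upper envelope of P(payoff <= x) over mu in [MU_LO, MU_HI]."""
    # The Normal CDF decreases as mu grows, so MU_HI gives the lower bound.
    return normal_cdf(x, MU_HI, SIGMA), normal_cdf(x, MU_LO, SIGMA)

lo, hi = pbox_bounds(10.0)
print(f"P(payoff <= 10) lies in [{lo:.3f}, {hi:.3f}]")  # wide: epistemic
```

    The width of the resulting interval, [0.159, 0.841] here, is exactly the part of the uncertainty that a single probability distribution cannot express.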

  11. Scientific Integrity and Consensus in the Intergovernmental Panel on Climate Change Assessment Process

    NASA Astrophysics Data System (ADS)

    Barrett, K.

    2017-12-01

    Scientific integrity is the hallmark of any assessment and is a paramount consideration in the Intergovernmental Panel on Climate Change (IPCC) assessment process. Procedures are in place for rigorous scientific review and for quantifying confidence levels and uncertainty in the communication of key findings. However, the IPCC is unique in that its reports are formally accepted by governments through consensus agreement. This presentation will describe the unique requirements of the IPCC intergovernmental assessment and discuss the advantages and challenges of its approach.

  12. Space and radiation protection: scientific requirements for space research

    NASA Technical Reports Server (NTRS)

    Schimmerling, W.

    1995-01-01

    Ionizing radiation poses a significant risk to humans living and working in space. The major sources of radiation are solar disturbances and galactic cosmic rays. The components of this radiation are energetic charged particles: protons, as well as fully ionized nuclei of all elements. The biological effects of these particles cannot be extrapolated in a straightforward manner from available data on x-rays and gamma-rays. A radiation protection program that meets the needs of spacefaring nations must have a solid scientific basis, capable not only of predicting biological effects, but also of making reliable estimates of the uncertainty in these predictions. A strategy leading to such predictions is proposed, and scientific requirements arising from this strategy are discussed.

  13. 76 FR 4705 - Tobacco Products Scientific Advisory Committee; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ...] Tobacco Products Scientific Advisory Committee; Notice of Meeting AGENCY: Food and Drug Administration.... Name of Committee: Tobacco Products Scientific Advisory Committee. General Function of the Committee... nature of the evidence or arguments they wish to present, the names and addresses of proposed...

  14. 75 FR 56547 - Tobacco Products Scientific Advisory Committee; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-16

    ...] Tobacco Products Scientific Advisory Committee; Notice of Meeting AGENCY: Food and Drug Administration... Products Scientific Advisory Committee. General Function of the Committee: To provide advice and... evidence or arguments they wish to present, the names and addresses of proposed participants, and an...

  15. Uncertainty in prostate cancer. Ethnic and family patterns.

    PubMed

    Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J

    1998-01-01

    Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family care providers, and for whites and African-Americans? A sample of white and African-American men and their family care providers (N = 403) was drawn from an ongoing study testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product-moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all care providers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their

  16. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    NASA Technical Reports Server (NTRS)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
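    The root-sum-square combination described above can be sketched in a few lines. The component names follow the abstract, but every magnitude below is a hypothetical placeholder, not a value from the study:

```python
import math

# Hypothetical 1-sigma uncertainty components for a helium leak detector,
# all expressed in the same units as the reported leak rate (e.g. std-cc/s).
components = {
    "resolution":    1.0e-10,
    "repeatability": 2.5e-10,
    "hysteresis":    1.5e-10,
    "drift":         2.0e-10,
    "cal_standard":  3.0e-10,  # calibration-standard (bias) contribution
}

# Root-sum-square combination of independent error contributions.
total = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined standard uncertainty: {total:.2e}")
```

    The quadrature sum is appropriate only when the contributions are independent; correlated terms would have to be added linearly or via a covariance term.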

  17. Using cost-benefit concepts in design floods improves communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi

    2017-04-01

    Flood frequency analysis, i.e. the study of the relationship between the magnitude and the rarity of high flows in a river, is the usual procedure for assessing flood hazard, preliminary to the planning/design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded at gauging stations, so the final estimates over a region are affected by uncertainty due to limited sample availability and to the possible alternatives in terms of probabilistic model and parameter estimation method. In the last decade, the scientific community addressed this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not yet been demonstrated to be useful for design purposes: they tend to disorient decision makers, because the design flood is no longer univocally defined, leaving the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014), which allows one to select meaningful design flood values that account for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method provides an explicit multiplication factor that corrects the traditional (uncertainty-free) design flood estimate to incorporate the effects of uncertainty at the same safety level. Even though the UNCODE method was developed for design purposes, it is also a powerful and robust tool for clarifying the effects of uncertainty in statistical estimation. As the procedure produces increased design flood estimates, it demonstrates how uncertainty leads to more expensive flood protection measures, or to insufficiency of current defenses.
Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs
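    The idea of an explicit multiplier on a traditional design flood can be illustrated crudely. This is not the UNCODE cost-benefit formulation of Botto et al. (2014); it simply bootstraps a method-of-moments Gumbel fit on synthetic data and forms an illustrative factor from an upper quantile of the estimator:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual-maximum discharges (m^3/s); all values are invented.
sample = rng.gumbel(loc=500.0, scale=120.0, size=40)

def gumbel_quantile(data, T):
    """Method-of-moments Gumbel fit; return the T-year return level."""
    scale = np.std(data, ddof=1) * np.sqrt(6.0) / np.pi
    loc = np.mean(data) - 0.5772 * scale
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

q100 = gumbel_quantile(sample, 100)  # traditional design flood estimate

# Bootstrap the estimator to expose its sampling uncertainty, then form an
# illustrative safety multiplier from an upper quantile of the draws.
boot = np.array([gumbel_quantile(rng.choice(sample, sample.size), 100)
                 for _ in range(2000)])
factor = np.quantile(boot, 0.90) / q100
print(f"design flood {q100:.0f} m^3/s, illustrative factor {factor:.2f}")
```

    A factor above 1 inflates the design value, mirroring the abstract's point that accounting for uncertainty leads to larger, more expensive protection measures.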

  18. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  19. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: either they provide generic guidance, or they express with statistical measures the quantifiable fraction of the uncertainty. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' in that indicator. To meet users' requirements for effective communication of uncertainties, their feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  20. Quantifying allometric model uncertainty for plot-level live tree biomass stocks with a data-driven, hierarchical framework

    Treesearch

    Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall

    2016-01-01

    Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States’ NGHGI do not specifically incorporate methods to address error in tree-scale biomass...

  1. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Dyk, J; Palta, J; Bortfeld, T

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as "accurate as reasonably achievable, technical and biological factors being taken into account". Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is "how do we implement this in clinical practice"? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  2. Scientific Assistant Virtual Laboratory (SAVL)

    NASA Astrophysics Data System (ADS)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school students' interest, insight, and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical-thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL:
    * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations.
    * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software.
    * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network.
    We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.

  3. Resource Materials on Scientific Integrity Issues.

    ERIC Educational Resources Information Center

    Macrina, Francis L., Ed.; Munro, Cindy L., Ed.

    1993-01-01

    The annotated bibliography contains 26 citations of books, monographs, and articles that may be useful to faculty and students in courses on scientific integrity. Topics addressed include ethical and legal considerations, fraud, technical writing and publication, intellectual property, notetaking, case study approach, conflict of interest, and…

  4. Approaches for describing and communicating overall uncertainty in toxicity characterizations: U.S. Environmental Protection Agency's Integrated Risk Information System (IRIS) as a case study.

    PubMed

    Beck, Nancy B; Becker, Richard A; Erraguntla, Neeraja; Farland, William H; Grant, Roberta L; Gray, George; Kirman, Christopher; LaKind, Judy S; Jeffrey Lewis, R; Nance, Patricia; Pottenger, Lynn H; Santos, Susan L; Shirley, Stephanie; Simon, Ted; Dourson, Michael L

    2016-01-01

    Single point estimates of human health hazard/toxicity values such as a reference dose (RfD) are generally used in chemical hazard and risk assessment programs for assessing potential risks associated with site- or use-specific exposures. The resulting point estimates are often used by risk managers for regulatory decision-making, including standard setting, determination of emission controls, and mitigation of exposures to chemical substances. Risk managers, as well as stakeholders (interested and affected parties), often have limited information regarding assumptions and uncertainty factors in numerical estimates of both hazards and risks. Further, the use of different approaches for addressing uncertainty, which vary in transparency, can lead to a lack of confidence in the scientific underpinning of regulatory decision-making. The overarching goal of this paper, which was developed from an invited participant workshop, is to offer five approaches for presenting toxicity values in a transparent manner in order to improve the understanding, consideration, and informed use of uncertainty by risk assessors, risk managers, and stakeholders. The five approaches for improving the presentation and communication of uncertainty are described using U.S. Environmental Protection Agency's (EPA's) Integrated Risk Information System (IRIS) as a case study. These approaches will ensure transparency in the documentation, development, and use of toxicity values at EPA, the Agency for Toxic Substances and Disease Registry (ATSDR), and other similar assessment programs in the public and private sector. Further empirical testing will help to inform the approaches that will work best for specific audiences and situations. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals

    PubMed Central

    Severtson, Dolores J.

    2015-01-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings. PMID:26412960

  6. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals.

    PubMed

    Severtson, Dolores J

    2015-02-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings.

  7. "Maybe the Algae Was from the Filter": Maybe and Similar Modifiers as Mediational Tools and Indicators of Uncertainty and Possibility in Children's Science Talk

    ERIC Educational Resources Information Center

    Kirch, Susan A.; Siry, Christina A.

    2012-01-01

    Uncertainty is an essential component of scientific inquiry and it also permeates our daily lives. Understanding how to identify, evaluate, resolve and live in the presence of uncertainty is important for decision-making strategies and engaging in transformative actions. In contrast, confidence and certainty are prized in elementary school…

  8. Uncertainty and extreme events in future climate and hydrologic projections for the Pacific Northwest: providing a basis for vulnerability and core/corridor assessments

    USGS Publications Warehouse

    Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.

    2014-01-01

    The purpose of this project was to (1) provide an internally consistent set of downscaled projections across the western U.S., (2) include information about projection uncertainty, and (3) assess projected changes in hydrologic extremes. These objectives were designed to address decision-support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections, in particular for extreme events, is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and a comparison with dynamical downscaling), and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts (in a rigorous and physically based way) for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: 1) extreme statistics for surface hydrology (e.g., frequency of soil moisture and summer water deficit) and streamflow (e.g., the 100-year flood and extreme 7-day low flows with a 10-year recurrence interval); 2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and 3) uncertainty analyses for multiple climate

  9. Uncertainty analysis in ecological studies: an overview

    Treesearch

    Harbin Li; Jianguo Wu

    2006-01-01

    Large-scale simulation models are essential tools for scientific research and environmental decision-making because they can be used to synthesize knowledge, predict consequences of potential scenarios, and develop optimal solutions (Clark et al. 2001, Berk et al. 2002, Katz 2002). Modeling is often the only means of addressing complex environmental problems that occur...

  10. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
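
    The parameter-uncertainty part of such an analysis is commonly operationalized as probabilistic sensitivity analysis: sample each uncertain input from an assumed distribution and propagate every draw through the model. The sketch below illustrates the idea on a deliberately toy decision model; the net-benefit form, the distributions, and all numbers are hypothetical, not taken from the guide itself.

```python
import random

random.seed(1)

def net_benefit(effectiveness, cost, wtp=20_000):
    # Toy two-strategy model: net monetary benefit of a new option
    # vs. standard care at willingness-to-pay `wtp` per unit effect.
    return wtp * effectiveness - cost

# Parameter uncertainty: sample the inputs from assumed distributions
# and push each draw through the model (probabilistic sensitivity
# analysis).
draws = []
for _ in range(10_000):
    eff = random.gauss(0.12, 0.03)    # incremental effect (assumed)
    cost = random.gauss(1_500, 300)   # incremental cost (assumed)
    draws.append(net_benefit(eff, cost))

mean_nb = sum(draws) / len(draws)
p_positive = sum(d > 0 for d in draws) / len(draws)
print(f"mean net benefit: {mean_nb:.0f}")
print(f"P(net benefit > 0): {p_positive:.2f}")
```

    Structural or methodological uncertainty would be explored on top of this, e.g. by repeating the analysis under alternative model forms.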

  11. The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-11-01

    NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than the scope of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example to include molecular biology, greenhouse gases and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable to reduce data from interlaboratory studies, to combine measurement results for the same measurand obtained by different methods, and to characterize the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, which are web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.

  12. Quantifying Groundwater Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E.; Foglia, L.

    2007-12-01

    Groundwater models are characterized by the (a) processes simulated, (b) boundary conditions, (c) initial conditions, (d) method of solving the equation, (e) parameterization, and (f) parameter values. Models are related to the system of concern using data, some of which form the basis of observations used most directly, through objective functions, to estimate parameter values. Here we consider situations in which parameter values are determined by minimizing an objective function. Other methods of model development are not considered because their ad hoc nature generally prohibits clear quantification of uncertainty. Quantifying prediction uncertainty ideally includes contributions from (a) to (f). The parameter values of (f) tend to be continuous with respect to both the simulated equivalents of the observations and the predictions, while many aspects of (a) through (e) are discrete. This fundamental difference means that there are options for evaluating the uncertainty related to parameter values that generally do not exist for other aspects of a model. While the methods available for (a) to (e) can be used for the parameter values (f), the inferential methods uniquely available for (f) generally are less computationally intensive and often can be used to considerable advantage. However, inferential approaches require calculation of sensitivities. Whether the numerical accuracy and stability of the model solution required for accurate sensitivities is more broadly important to other model uses is an issue that needs to be addressed. Alternative global methods can require 100 or even 1,000 times the number of runs needed by inferential methods, though methods of reducing the number of needed runs are being developed and tested. Here we present three approaches for quantifying model uncertainty and investigate their strengths and weaknesses. (1) Represent more aspects as parameters so that the computationally efficient methods can be broadly applied. This
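
    The inferential route described above can be illustrated on a model that is linear in its parameters, where the sensitivity (Jacobian) matrix is exact: least-squares estimation gives the parameter values, and the linearized parameter covariance is s²(JᵀJ)⁻¹. The synthetic data and two-parameter model below are illustrative only, not the authors' groundwater model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": a quantity linear in log-time, so the
# model is linear in its two parameters and sensitivities are exact.
t = np.linspace(1.0, 10.0, 20)
obs = 2.0 + 0.5 * np.log(t) + rng.normal(0.0, 0.05, t.size)

# Sensitivity (Jacobian) matrix of simulated values w.r.t. parameters.
J = np.column_stack([np.ones_like(t), np.log(t)])

# Estimate parameters by minimizing the sum-of-squares objective.
p_hat, *_ = np.linalg.lstsq(J, obs, rcond=None)
resid = obs - J @ p_hat
s2 = resid @ resid / (t.size - p_hat.size)   # error variance estimate

# Linearized (inferential) parameter covariance: s^2 (J^T J)^-1.
# Cheap compared with global sampling, but it relies on accurate
# sensitivities, as the abstract notes.
cov = s2 * np.linalg.inv(J.T @ J)
se = np.sqrt(np.diag(cov))
print("estimates:", p_hat, "standard errors:", se)
```
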

  13. Propagating uncertainty from hydrology into human health risk assessment

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2013-12-01

    Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk assessment (both cancer and non-cancer) to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever, as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions, such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time-dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles modeled with sorption as 1) a linear equilibrium assumption and 2) first-order kinetics. Results show that the upscaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.
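
    The core TDRA idea, propagating an ensemble of uncertain transport parameters into a distribution of risk at each point in time, can be sketched in a few lines. The concentration curve, slope factor, and exposure numbers below are hypothetical stand-ins, not the authors' model.

```python
import math
import random

random.seed(7)

SLOPE = 0.05        # hypothetical cancer slope factor (per mg/kg-day)
INTAKE = 2.0 / 70   # L/day per kg body weight (assumed exposure factors)

def conc(t, k):
    # Illustrative time-varying plume concentration (mg/L) whose shape
    # depends on an uncertain sorption/reaction rate k.
    return math.exp(-k * (t - 10.0) ** 2)

# Uncertainty in the geochemistry enters as an ensemble of rates k.
ensemble = [random.uniform(0.05, 0.2) for _ in range(1000)]

# Time-dependent risk: at each time, propagate the ensemble into a
# distribution of incremental risk rather than a single number.
results = {}
for t in range(0, 21, 5):
    risks = sorted(SLOPE * INTAKE * conc(t, k) for k in ensemble)
    results[t] = (risks[len(risks) // 2], risks[int(0.95 * len(risks))])
    print(f"t={t:2d}  median={results[t][0]:.2e}  95th pct={results[t][1]:.2e}")
```

    The spread between the median and the 95th percentile at each time is the quantity a traditional time-independent assessment collapses into one number.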

  14. Beyond dose assessment: using risk with full disclosure of uncertainty in public and scientific communication.

    PubMed

    Hoffman, F Owen; Kocher, David C; Apostoaei, A Iulian

    2011-11-01

    Evaluations of radiation exposures of workers and the public traditionally focus on assessments of radiation dose, especially annual dose, without explicitly evaluating the health risk associated with those exposures, principally the risk of radiation-induced cancer. When dose is the endpoint of an assessment, opportunities to communicate the significance of exposures are limited to comparisons with dose criteria in regulations, doses due to natural background or medical x-rays, and doses above which a statistically significant increase of disease has been observed in epidemiologic studies. Risk assessment generally addresses the chance (probability) that specific diseases might be induced by past, present, or future exposure. The risk of cancer per unit dose will vary depending on gender, age, exposure type (acute or chronic), and radiation type. It is not uncommon to find that two individuals with the same effective dose will have substantially different risks. Risk assessment has shown, for example, that: (a) medical exposures to computed tomography scans have become a leading source of future risk to the general population, and that the risk would be increased above recently published estimates if the incidence of skin cancer and the increased risk from exposure to x-rays compared with high-energy photons were taken into account; (b) indoor radon is a significant contributor to the baseline risk of lung cancer, particularly among people who have never smoked; and (c) members of the public who were exposed in childhood to ¹³¹I in fallout from atmospheric nuclear weapons tests and were diagnosed with thyroid cancer later in life would frequently meet criteria established for federal compensation of cancers experienced by energy workers and military participants at atmospheric weapons tests. Risk estimation also enables comparisons of impacts of exposures to radiation and chemical carcinogens and other hazards to life and health. Communication of risk with

  15. PARTNERING TO IMPROVE HUMAN EXPOSURE METHODS

    EPA Science Inventory

    Methods development research is an application-driven scientific area that addresses programmatic needs. The goals are to reduce measurement uncertainties, address data gaps, and improve existing analytical procedures for estimating human exposures. Partnerships have been develop...

  16. On the role of model-based monitoring for adaptive planning under uncertainty

    NASA Astrophysics Data System (ADS)

    Raso, Luciano; Kwakkel, Jan; Timmermans, Jos; Haasnoot, Marjolijn

    2016-04-01

    Adaptive plans, designed to anticipate and respond to an unfolding uncertain future, have found a fertile application domain in the planning of deltas that are exposed to rapid socioeconomic development and climate change. Adaptive planning, under the moniker of adaptive delta management, is used in the Dutch Delta Program for developing a nation-wide plan to prepare for uncertain climate change and socio-economic developments. Scientifically, adaptive delta management relies heavily on Dynamic Adaptive Policy Pathways. Currently, in the Netherlands the focus is shifting towards implementing the adaptive delta plan. This shift is especially relevant because the efficacy of adaptive plans hinges on monitoring on-going developments and ensuring that actions are indeed taken if and when necessary. In the design of an effective monitoring system for an adaptive plan, three challenges have to be confronted:
    • Shadow of the past: The development of adaptive plans and the design of their monitoring system rely heavily on current knowledge of the system and current beliefs about plausible future developments. A static monitoring system is therefore exposed to the exact same uncertainties one tries to address through adaptive planning.
    • Inhibition of learning: Recent applications of adaptive planning tend to overlook the importance of learning and new information, and fail to account for this explicitly in the design of adaptive plans.
    • Challenge of surprise: Adaptive policies are designed in light of the currently foreseen uncertainties. However, developments not considered plausible during the design phase could still substantially affect the performance of adaptive policies.
    The shadow of the past, the inhibition of learning, and the challenge of surprise taken together suggest that there is a need for redesigning the concepts of monitoring and evaluation to support the implementation of adaptive plans. Innovations from control theory

  17. Poor reporting of scientific leadership information in clinical trial registers.

    PubMed

    Sekeres, Melanie; Gold, Jennifer L; Chan, An-Wen; Lexchin, Joel; Moher, David; Van Laethem, Marleen L P; Maskalyk, James; Ferris, Lorraine; Taback, Nathan; Rochon, Paula A

    2008-02-20

    In September 2004, the International Committee of Medical Journal Editors (ICMJE) issued a Statement requiring that all clinical trials be registered at inception in a public register in order to be considered for publication. The World Health Organization (WHO) and ICMJE have identified 20 items that should be provided before a trial is considered registered, including contact information. Identifying those scientifically responsible for trial conduct increases accountability. The objective is to examine the proportion of registered clinical trials providing valid scientific leadership information. We reviewed clinical trial entries listing Canadian investigators in the two largest international and public trial registers, the International Standard Randomized Controlled Trial Number (ISRCTN) register, and ClinicalTrials.gov. The main outcome measures were the proportion of clinical trials reporting valid contact information for the trials' Principal Investigator (PI)/Co-ordinating Investigator/Study Chair/Site PI, and trial e-mail contact address, stratified by funding source, recruiting status, and register. A total of 1388 entries (142 from ISRCTN and 1246 from ClinicalTrials.gov) comprised our sample. We found non-compliance with mandatory registration requirements regarding scientific leadership and trial contact information. Non-industry and partial industry funded trials were significantly more likely to identify the individual responsible for scientific leadership (OR = 259, 95% CI: 95-701) and to provide a contact e-mail address (OR = 9.6, 95% CI: 6.6-14) than were solely industry funded trials. Despite the requirements set by WHO and ICMJE, data on scientific leadership and contact e-mail addresses are frequently omitted from clinical trials registered in the two leading public clinical trial registers. To promote accountability and transparency in clinical trials research, public clinical trials registers should ensure adequate monitoring of trial

  18. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-06-20

    The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  19. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Lantz, Mattias; Neudecker, Denise

    2018-05-25

    Part 5 of The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  20. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Wilson, J.N.

    2018-05-24

    Part 7 of The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  1. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbil, Roger

    2010-11-09

    The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  2. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, J.N.

    2010-11-09

    Part 7 of The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  3. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lantz, Mattias; Neudecker, Denise

    2010-11-09

    Part 5 of The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  4. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlachoudis, Vasilis

    2010-11-09

    Part 8 of The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  5. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-11-09

    The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  6. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-11-09

    The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  7. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2017-12-09

    The Final Scientific EFNUDAT Workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean

  8. From Comparison Between Scientists to Gaining Cultural Scientific Knowledge. Leonardo and Galileo

    NASA Astrophysics Data System (ADS)

    Galili, Igal

    2016-03-01

    Physics textbooks often present items of disciplinary knowledge in a sequential order of topics of the theory under instruction. Such presentation is usually univocal, that is, isolated from alternative claims and contributions regarding the subject matter in the pertinent scientific discourse. We argue that comparing and contrasting the contributions of scientists addressing similar or the same subjects could not only enrich the picture of the scientific enterprise, but also possess a special appealing power promoting genuine understanding of the concept considered. This approach draws on the historical tradition from Plutarch in the distant past and Koyré in the recent history and philosophy of science. It gains new support in the discipline-culture structuring of the physics curriculum, which seeks cultural content knowledge (CCK) of the subject matter. Here, we address two prominent individuals of the Italian Renaissance, Leonardo and Galileo, in their dealing with issues relevant to introductory science courses. Although both figures addressed similar subjects of scientific content, their products were essentially different. Considering this difference is educationally valuable, illustrating the meaning of what students presently learn in the content knowledge of mechanics, optics and astronomy, as well as the nature of science and scientific knowledge.

  9. Social behavioural epistemology and the scientific community.

    PubMed

    Watve, Milind

    2017-07-01

    The progress of science is influenced substantially by social behaviour of and social interactions within the scientific community. Similar to innovations in primate groups, the social acceptance of an innovation depends not only upon the relevance of the innovation but also on the social dominance and connectedness of the innovator. There are a number of parallels between many well-known phenomena in behavioural evolution and various behavioural traits observed in the scientific community. It would be useful, therefore, to use principles of behavioural evolution as hypotheses to study the social behaviour of the scientific community. I argue in this paper that a systematic study of social behavioural epistemology is likely to boost the progress of science by addressing several prevalent biases and other problems in scientific communication and by facilitating appropriate acceptance/rejection of novel concepts.

  10. Assessing what to address in science communication.

    PubMed

    Bruine de Bruin, Wändi; Bostrom, Ann

    2013-08-20

    As members of a democratic society, individuals face complex decisions about whether to support climate change mitigation, vaccinations, genetically modified food, nanotechnology, geoengineering, and so on. To inform people's decisions and public debate, scientific experts at government agencies, nongovernmental organizations, and other organizations aim to provide understandable and scientifically accurate communication materials. Such communications aim to improve people's understanding of the decision-relevant issues, and if needed, promote behavior change. Unfortunately, existing communications sometimes fail when scientific experts lack information about what people need to know to make more informed decisions or what wording people use to describe relevant concepts. We provide an introduction for scientific experts about how to use mental models research with intended audience members to inform their communication efforts. Specifically, we describe how to conduct interviews to characterize people's decision-relevant beliefs or mental models of the topic under consideration, identify gaps and misconceptions in their knowledge, and reveal their preferred wording. We also describe methods for designing follow-up surveys with larger samples to examine the prevalence of beliefs as well as the relationships of beliefs with behaviors. Finally, we discuss how findings from these interviews and surveys can be used to design communications that effectively address gaps and misconceptions in people's mental models in wording that they understand. We present applications to different scientific domains, showing that this approach leads to communications that improve recipients' understanding and ability to make informed decisions.

  11. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    PubMed

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
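
    For a certified concentration of product/quotient form, c = m·P/V, the GUM law of propagation (for uncorrelated inputs) reduces to adding the relative standard uncertainties of the contributors in quadrature. A minimal sketch follows; the purity, mass, and volume values and their uncertainties are illustrative, not taken from any certificate.

```python
import math

# Concentration of a standard prepared from a neat material:
# c = (mass * purity) / volume. All numbers below are illustrative.
m, u_m = 10.00, 0.02     # mg; balance / weighing-technique uncertainty
P, u_P = 0.998, 0.002    # mass-fraction purity (residual water, solvent,
                         # inorganic content contribute here)
V, u_V = 10.00, 0.01     # mL; solvent addition / solution density

c = m * P / V  # mg/mL

# Product/quotient model: relative standard uncertainties add in
# quadrature (GUM propagation, uncorrelated inputs assumed).
u_rel = math.sqrt((u_m / m) ** 2 + (u_P / P) ** 2 + (u_V / V) ** 2)
u_c = c * u_rel
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2

print(f"c = {c:.4f} mg/mL, u = {u_c:.4f}, U(k=2) = {U:.4f}")
```

    Knowing which of these terms a vendor actually included is exactly what a Certificate of Analysis needs to make clear.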

  12. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
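
    One common way to take uncertainty into account in compliance assessment is a decision rule built on the expanded uncertainty U = k·u. The function below is a generic sketch of such a rule, not a prescription from ISO/IEC 17025 or any particular standard.

```python
def assess_compliance(result, u, limit, k=2.0):
    """Generic decision rule using the expanded uncertainty U = k * u.

    Declare a clear outcome only when the whole interval result +/- U
    lies on one side of the specification limit; otherwise the
    measurement uncertainty leaves the decision open.
    """
    U = k * u
    if result + U <= limit:
        return "compliant"
    if result - U > limit:
        return "non-compliant"
    return "inconclusive (limit lies within result +/- U)"

print(assess_compliance(45.0, 2.0, 50.0))  # whole interval below the limit
print(assess_compliance(49.0, 2.0, 50.0))  # limit falls inside the interval
print(assess_compliance(55.0, 2.0, 50.0))  # whole interval above the limit
```
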

  13. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    PubMed

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
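
    The averaging step at the heart of such a BMA-based design check can be sketched in a few lines: design reliability is computed under each competing geological architecture and weighted by its model probability, rather than taken from a single model. The weights and reliabilities below are invented for illustration.

```python
# Posterior model weights and per-model design reliabilities for three
# hypothetical hydrostratigraphic architectures (numbers invented).
models = {
    "architecture_A": {"weight": 0.5, "reliability": 0.97},
    "architecture_B": {"weight": 0.3, "reliability": 0.88},
    "architecture_C": {"weight": 0.2, "reliability": 0.80},
}

# Model weights must form a probability distribution.
assert abs(sum(m["weight"] for m in models.values()) - 1.0) < 1e-12

# BMA reliability: average the chance-constraint reliability over the
# competing geological structure models.
bma_reliability = sum(m["weight"] * m["reliability"]
                      for m in models.values())

# A single-model analysis built on architecture_A alone would report
# 0.97, overestimating reliability relative to the averaged value.
print(f"BMA design reliability: {bma_reliability:.3f}")
```
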

  14. 78 FR 2370 - New England Fishery Management Council (NEFMC); Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... p.m. to address employment matters. Tuesday, January 29, 2013 Following introductions and any... catch based on Scientific and Statistical Committee advice, management uncertainty, optimum yield and a...'s Scientific and Statistical Committee will report on its acceptable biological catch...

  15. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    NASA Astrophysics Data System (ADS)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  16. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    DOE PAGES

    Di Vittorio, A. V.; Mao, J.; Shi, X.; ...

    2018-01-03

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. In this paper, we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850–2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. Finally, we conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  17. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Vittorio, A. V.; Mao, J.; Shi, X.

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. In this paper, we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850–2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. Finally, we conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  18. Advances in Scientific Investigation and Automation.

    ERIC Educational Resources Information Center

    Abt, Jeffrey; And Others

    1987-01-01

    Six articles address: (1) the impact of science on the physical examination and treatment of books; (2) equipment for physical examination of books; (3) research using the cyclotron for historical analysis; (4) scientific analysis of paper and ink in early maps; (5) recent advances in automation; and (6) cataloging standards. (MES)

  19. Quantifying the uncertainties in life cycle greenhouse gas emissions for UK wheat ethanol

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyu; Boies, Adam M.

    2013-03-01

    Biofuels are increasingly promoted worldwide as a means for reducing greenhouse gas (GHG) emissions from transport. However, current regulatory frameworks and most academic life cycle analyses adopt a deterministic approach in determining the GHG intensities of biofuels and thus ignore the inherent risk associated with biofuel production. This study aims to develop a transparent stochastic method for evaluating UK biofuels that determines both the magnitude and uncertainty of GHG intensity on the basis of current industry practices. Using wheat ethanol as a case study, we show that the GHG intensity could span a range of 40-110 gCO2e MJ-1 when land use change (LUC) emissions and various sources of uncertainty are taken into account, as compared with a regulatory default value of 44 gCO2e MJ-1. This suggests that the current deterministic regulatory framework underestimates wheat ethanol GHG intensity and thus may not be effective in evaluating transport fuels. Uncertainties in determining the GHG intensity of UK wheat ethanol include limitations of available data at a localized scale, and significant scientific uncertainty of parameters such as soil N2O and LUC emissions. Biofuel policies should be robust enough to incorporate the currently irreducible uncertainties and flexible enough to be readily revised when better science is available.
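    The stochastic approach the study advocates amounts to propagating input distributions through the life-cycle calculation by Monte Carlo sampling, yielding an interval rather than a single number. The sketch below shows the mechanics only; the component breakdown, distributions, and parameter values are illustrative assumptions, not the study's data.

```python
# Hedged Monte Carlo sketch of a stochastic GHG-intensity estimate.
# All distributions and numbers below are invented for illustration.
import random

random.seed(42)

def sample_ghg_intensity() -> float:
    """One Monte Carlo draw of GHG intensity (gCO2e/MJ) from assumed inputs."""
    cultivation = random.gauss(30.0, 5.0)   # farming inputs, incl. soil N2O
    processing = random.gauss(15.0, 2.0)    # ethanol conversion energy
    luc = random.uniform(0.0, 40.0)         # land use change, highly uncertain
    return cultivation + processing + luc

draws = sorted(sample_ghg_intensity() for _ in range(10_000))
lo, hi = draws[250], draws[-251]            # ~95% interval (2.5th-97.5th pct.)
print(f"95% interval: {lo:.0f}-{hi:.0f} gCO2e/MJ")
```

    Reporting the interval instead of a point value is what lets a regulator see whether a deterministic default (such as the 44 gCO2e MJ-1 cited above) sits near the low end of the plausible range.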

  20. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    USGS Publications Warehouse

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decisionmakers can use these results to better evaluate environmental risk as future metal concentrations with a limited range of possibilities, based on a scientific evaluation of uncertainty.

  1. Teaching Argumentation and Scientific Discourse Using the Ribosomal Peptidyl Transferase Reaction

    ERIC Educational Resources Information Center

    Johnson, R. Jeremy

    2011-01-01

    Argumentation and discourse are two integral parts of scientific investigation that are often overlooked in undergraduate science education. To address this limitation, the story of peptide bond formation by the ribosome can be used to illustrate the importance of evidence, claims, arguments, and counterarguments in scientific discourse. With the…

  2. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.

  3. Controlling quantum memory-assisted entropic uncertainty in non-Markovian environments

    NASA Astrophysics Data System (ADS)

    Zhang, Yanliang; Fang, Maofa; Kang, Guodong; Zhou, Qingping

    2018-03-01

    The quantum-memory-assisted entropic uncertainty relation (QMA EUR) shows that the lower bound of Maassen and Uffink's entropic uncertainty relation (which assumes no quantum memory) can be beaten. In this paper, we investigate the dynamical features of the QMA EUR in Markovian and non-Markovian dissipative environments. We find that the dynamics of the QMA EUR are oscillatory in a non-Markovian environment, and that strong interaction is favorable for suppressing the amount of entropic uncertainty. Furthermore, we present two schemes, based on prior weak measurement and posterior weak measurement reversal, to control the amount of entropic uncertainty of Pauli observables in dissipative environments. The numerical results show that prior weak measurement can effectively reduce the peak values of the QMA EUR dynamics in a non-Markovian environment over long periods of time, but is ineffectual on the minima of the dynamics. The posterior weak measurement reversal has the opposite effect on the dynamics. Moreover, the success probability depends entirely on the quantum measurement strength. We hope that our proposal can be verified experimentally and may find applications in quantum information processing.
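    For reference, the two bounds being compared can be written as follows; this is the standard form of the memory-assisted relation due to Berta et al., with c the maximal overlap between the two measurement bases:

```latex
% Maassen-Uffink bound (no quantum memory):
S(Q) + S(R) \;\ge\; \log_2 \frac{1}{c},
\qquad c = \max_{i,j} \left| \langle q_i | r_j \rangle \right|^2

% Memory-assisted bound, with B holding the quantum memory:
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B)
```

    Because the conditional entropy S(A|B) is negative for entangled states, the memory-assisted lower bound can fall below log2(1/c); this is the precise sense in which the no-memory bound "can be beaten."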

  4. Differentiating intolerance of uncertainty from three related but distinct constructs.

    PubMed

    Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel

    2014-01-01

    Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.

  5. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  6. Big data uncertainties.

    PubMed

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases, we observe that the uncertainty of conclusions obtained by statistical methods increases when they are used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
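    The bias pitfall described in this abstract can be demonstrated with a toy simulation: a very large but non-representative sample yields a precise estimate of the wrong quantity, while a much smaller random sample is noisier but unbiased. The population and the selection mechanism below are invented for illustration.

```python
# Hedged toy simulation of the "bias beats variance" pitfall of big data.
# Numbers and the selection rule are illustrative assumptions.
import random

random.seed(0)
population = [random.gauss(50.0, 10.0) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# "Big data": an enormous sample, but drawn only from the upper half
# of the population (a crude model of selection bias).
biased = [x for x in population if x > 50.0]
biased_mean = sum(biased) / len(biased)

# A small simple random sample: higher variance, but no systematic error.
srs = random.sample(population, 500)
srs_mean = sum(srs) / len(srs)

print(f"true {true_mean:.1f}, biased big-n {biased_mean:.1f}, "
      f"small SRS {srs_mean:.1f}")
```

    No amount of additional biased data shrinks the systematic error; only understanding the sampling mechanism does, which is the paper's point about articulating the new risks.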

  7. A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders

    NASA Astrophysics Data System (ADS)

    Malik, Mashkoor; Lurton, Xavier; Mayer, Larry

    2018-06-01

    Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally and hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation and full compensation) for seafloor insonified area, transmission losses and random fluctuations were modeled to estimate their uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, at least 20 samples are recommended to be used while computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
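    The recommendation of at least 20 samples to bring random fluctuations below 1 dB follows from averaging intensities before converting to decibels: for Rayleigh-distributed echo amplitude, intensity is exponentially distributed, and the dB-scale spread of an N-sample mean shrinks roughly as 4.34/sqrt(N). The sketch below assumes unit-mean exponential intensity, which is an idealization of the abstract's Rayleigh case.

```python
# Hedged sketch: dB-scale fluctuation of backscatter vs. number of averaged
# samples, assuming unit-mean exponential intensity (Rayleigh amplitude).
import math
import random
import statistics

random.seed(7)

def mean_intensity_db(n_samples: int) -> float:
    """10*log10 of the mean of n exponential intensity samples (unit mean)."""
    mean_i = sum(random.expovariate(1.0) for _ in range(n_samples)) / n_samples
    return 10.0 * math.log10(mean_i)

# Single-sample fluctuation is ~5.6 dB; averaging 20 samples brings it
# under ~1 dB, consistent with the recommendation above.
for n in (1, 20):
    trials = [mean_intensity_db(n) for _ in range(2_000)]
    print(f"N={n:2d}: std = {statistics.stdev(trials):.1f} dB")
```

    Averaging must happen in the linear (intensity) domain; averaging dB values directly would introduce a bias in addition to the fluctuation.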

  8. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    PubMed

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
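    The adapted Shannon-Wiener measure mentioned in this abstract amounts to computing the entropy of the distribution of deaths over cause categories: a register that piles deaths into one vague category carries less information than one spread over well-specified causes. A minimal sketch, with counts invented for illustration:

```python
# Hedged sketch of a Shannon entropy measure over cause-of-death counts.
# The count vectors below are illustrative, not Chilean or Brazilian data.
import math

def shannon_entropy(counts: list[int]) -> float:
    """Shannon entropy in bits of a distribution given as raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

well_coded = [250, 240, 260, 250]    # deaths spread over specific causes
poorly_coded = [940, 20, 20, 20]     # most deaths in one vague category

print(f"well coded: {shannon_entropy(well_coded):.2f} bits; "
      f"poorly coded: {shannon_entropy(poorly_coded):.2f} bits")
```

    On this reading, the entropy value itself becomes a metrological property of the vital statistics, which is the sense in which the authors argue that measurement requirements matter as much as the measurements.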

  9. Growing the Seeds of Scientific Enquiry

    ERIC Educational Resources Information Center

    Deller, Clarysly

    2017-01-01

    As plants and seed dispersal are common themes in primary science, the author thought that she would share an enquiry challenge activity that addresses many of the "working scientifically" objectives of the National Curriculum for England. Year 3 and 4 children had a whole afternoon made up firstly of "playing", planning and…

  10. National Institutes of Health addresses the science of diversity.

    PubMed

    Valantine, Hannah A; Collins, Francis S

    2015-10-06

    The US biomedical research workforce does not currently mirror the nation's population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation's health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity's impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce.

  11. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    PubMed

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty, with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were below the midpoint. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  12. A bootstrap method for estimating uncertainty of water quality trends

    USGS Publications Warehouse

    Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura

    2015-01-01

    Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS-estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
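    The general idea of bootstrap uncertainty for a trend estimate can be sketched with a plain pairs bootstrap on synthetic data: resample the record with replacement, refit the slope each time, and read confidence limits off the percentiles. Note this is only the generic idea; WBT itself is a specialized block bootstrap built around WRTDS and implemented in the EGRETci R package, and the data below are invented.

```python
# Hedged sketch: percentile-bootstrap uncertainty for a trend slope.
# Synthetic data and a pairs bootstrap stand in for the WRTDS-specific WBT.
import random

random.seed(3)

def slope(xy: list[tuple[float, float]]) -> float:
    """Ordinary least-squares slope of y on x."""
    n = len(xy)
    mx = sum(x for x, _ in xy) / n
    my = sum(y for _, y in xy) / n
    num = sum((x - mx) * (y - my) for x, y in xy)
    den = sum((x - mx) ** 2 for x, _ in xy)
    return num / den

# Synthetic 20-year concentration record: weak downward trend plus noise.
data = [(t, 5.0 - 0.05 * t + random.gauss(0, 0.4)) for t in range(20)]

boot_slopes = sorted(
    slope([random.choice(data) for _ in data]) for _ in range(2_000)
)
lo, hi = boot_slopes[50], boot_slopes[-51]   # 95% percentile interval
print(f"slope 95% CI: [{lo:.3f}, {hi:.3f}] per year")
```

    If the interval excludes zero, the trend direction is resolved at that confidence level; if it straddles zero, the record is too short or too noisy to call, which is exactly the uncertainty WBT is designed to communicate.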

  13. Calibration/Validation Error Budgets, Uncertainties, Traceability and Their Importance to Imaging Spectrometry

    NASA Technical Reports Server (NTRS)

    Thome, K.

    2016-01-01

    Knowledge of uncertainties and errors is essential for comparisons of remote sensing data across time, space, and spectral domains. Vicarious radiometric calibration is used to demonstrate the need for uncertainty knowledge and to provide an example error budget. The sample error budget serves as an example of the questions and issues that need to be addressed by the calibration/validation community as accuracy requirements for imaging spectroscopy data continue to become more stringent. Error budgets will also be critical to ensure consistency across the range of imaging spectrometers expected to be launched in the next five years.

  14. Scientific assessment of the quality of OSIRIS images

    NASA Astrophysics Data System (ADS)

    Tubiana, C.; Güttler, C.; Kovacs, G.; Bertini, I.; Bodewits, D.; Fornasier, S.; Lara, L.; La Forgia, F.; Magrin, S.; Pajola, M.; Sierks, H.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Agarwal, J.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Boudreault, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; De Cecco, M.; El-Maarry, M. R.; Fulle, M.; Groussin, O.; Gutiérrez-Marques, P.; Gutiérrez, P. J.; Hoekzema, N.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lazzarin, M.; Lopez Moreno, J. J.; Marzari, F.; Massironi, M.; Michalik, H.; Moissl, R.; Naletto, G.; Oklay, N.; Scholten, F.; Shi, X.; Thomas, N.; Vincent, J.-B.

    2015-11-01

    Context. OSIRIS, the scientific imaging system onboard the ESA Rosetta spacecraft, has been imaging the nucleus of comet 67P/Churyumov-Gerasimenko and its dust and gas environment since March 2014. The images serve different scientific goals, from morphology and composition studies of the nucleus surface, to the motion and trajectories of dust grains, the general structure of the dust coma, the morphology and intensity of jets, gas distribution, mass loss, and dust and gas production rates. Aims: We present the calibration of the raw images taken by OSIRIS and address the accuracy that we can expect in our scientific results based on the accuracy of the calibration steps that we have performed. Methods: We describe the pipeline that has been developed to automatically calibrate the OSIRIS images. Through a series of steps, radiometrically calibrated and distortion corrected images are produced and can be used for scientific studies. Calibration campaigns were run on the ground before launch and throughout the years in flight to determine the parameters that are used to calibrate the images and to verify their evolution with time. We describe how these parameters were determined and we address their accuracy. Results: We provide a guideline to the level of trust that can be put into the various studies performed with OSIRIS images, based on the accuracy of the image calibration.

  15. The Development of Scientific Thinking in Elementary School: A Comprehensive Inventory

    ERIC Educational Resources Information Center

    Koerber, Susanne; Mayer, Daniela; Osterhaus, Christopher; Schwippert, Knut; Sodian, Beate

    2015-01-01

    The development of scientific thinking was assessed in 1,581 second, third, and fourth graders (8-, 9-, 10-year-olds) based on a conceptual model that posits developmental progression from naïve to more advanced conceptions. Using a 66-item scale, five components of scientific thinking were addressed, including experimental design, data…

  16. FIFRA Scientific Advisory Panel Minutes No. 21015-04. A set of scientific issues being considered by the Environmental Protection Agency regarding integrated endocrine bioactivity and exposure-based prioritization & screening

    USDA-ARS?s Scientific Manuscript database

    On December 2-4, 2014, the US Environmental Protection Agency convened a public meeting of the FIFRA Scientific Advisory Panel (SAP) to address scientific issues associated with the agency’s “Integrated Endocrine Bioactivity and Exposure-Based Prioritization and Screening” methods. EPA is proposing ...

  17. Communicating Uncertain Science to the Public: How Amount and Source of Uncertainty Impact Fatalism, Backlash, and Overload

    PubMed Central

    Jensen, Jakob D.; Pokharel, Manusheela; Scherr, Courtney L.; King, Andy J.; Brown, Natasha; Jones, Christina

    2016-01-01

    Public dissemination of scientific research often focuses on the finding (e.g., nanobombs kill lung cancer) rather than the uncertainty/limitations (e.g., in mice). Adults (N = 880) participated in an experiment where they read a manipulated news report about cancer research (a) that contained either low or high uncertainty (b) that was attributed to the scientists responsible for the research (disclosure condition) or an unaffiliated scientist (dueling condition). Compared to the dueling condition, the disclosure condition triggered less prevention-focused cancer fatalism and nutritional backlash. PMID:26973157

  18. Communicating Uncertain Science to the Public: How Amount and Source of Uncertainty Impact Fatalism, Backlash, and Overload.

    PubMed

    Jensen, Jakob D; Pokharel, Manusheela; Scherr, Courtney L; King, Andy J; Brown, Natasha; Jones, Christina

    2017-01-01

    Public dissemination of scientific research often focuses on the finding (e.g., nanobombs kill lung cancer) rather than the uncertainty/limitations (e.g., in mice). Adults (n = 880) participated in an experiment where they read a manipulated news report about cancer research (a) that contained either low or high uncertainty (b) that was attributed to the scientists responsible for the research (disclosure condition) or an unaffiliated scientist (dueling condition). Compared to the dueling condition, the disclosure condition triggered less prevention-focused cancer fatalism and nutritional backlash. © 2016 Society for Risk Analysis.

  19. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent/effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
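    Point iv), choosing how many rainfall events to sample for a target uncertainty, can be sketched with the standard-error formula for a mean. The sketch assumes independent events with a known coefficient of variation, a simplification of the paper's fuller treatment; the numbers are illustrative.

```python
# Hedged sketch: number of sampled events needed so that the relative
# uncertainty of the annual mean (e.g., a pollutant load) meets a target.
# Assumes independent events; relative standard error of a mean of n
# events is cv / sqrt(n).
import math

def events_needed(cv: float, target_rel_uncertainty: float,
                  z: float = 1.96) -> int:
    """Smallest n with z * cv / sqrt(n) <= target (95% confidence default)."""
    return math.ceil((z * cv / target_rel_uncertainty) ** 2)

# Illustrative case: event loads varying with cv = 0.5, target of +/-20%
# relative uncertainty at 95% confidence.
print(events_needed(0.5, 0.20))  # -> 25
```

    Halving the target uncertainty roughly quadruples the required number of events, which is why the paper stresses sampling effort as the first lever for reducing the uncertainty of annual efficiency estimates.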

  20. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef M.; Coles, T.; Spantini, A.

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves.

  1. Scientific Integrity: The Need for Government Standards

    NASA Astrophysics Data System (ADS)

    McPhaden, Michael J.

    2010-11-01

    The U.S. government makes substantial investments in scientific research that address the nation’s need for accurate and authoritative information to guide federal policy decisions. Therefore, there is a lot at stake in having a consistent and explicit federal policy on scientific integrity to increase transparency and build trust in government science. Scientific integrity is an issue that applies not only to individual scientists working within the federal system but also to government agencies in how they use scientific information to formulate policy. The White House issued a memorandum on scientific integrity in March 2009, and it is regrettable that it has taken so much longer than the 120 days stipulated in the president's memo for the release of recommendations by the Office of Science and Technology Policy (OSTP) (see related news item in this issue). While it is also understandable given the welter of different agencies and organizations that make up the executive branch of the government, AGU urges that these recommendations be finalized and published as soon as possible.

  2. Simple uncertainty propagation for early design phase aircraft sizing

    NASA Astrophysics Data System (ADS)

    Lenz, Annelise

    Many designers and systems analysts are aware of the uncertainty inherent in their aircraft sizing studies; however, few incorporate methods to address and quantify this uncertainty. Many aircraft design studies use semi-empirical predictors based on a historical database and contain uncertainty -- a portion of which can be measured and quantified. In cases where historical information is not available, surrogate models built from higher-fidelity analyses often provide predictors for design studies where the computational cost of directly using the high-fidelity analyses is prohibitive. These surrogate models contain uncertainty, some of which is quantifiable. However, rather than quantifying this uncertainty, many designers merely include a safety factor or design margin in the constraints to account for the variability between the predicted and actual results. This can become problematic if a designer does not estimate the amount of variability correctly, which then can result in either an "over-designed" or "under-designed" aircraft. "Under-designed" and some "over-designed" aircraft will likely require design changes late in the process and will ultimately require more time and money to create; other "over-designed" aircraft concepts may not require design changes, but could end up being more costly than necessary. Including and propagating uncertainty early in the design phase so designers can quantify some of the errors in the predictors could help mitigate the extent of this additional cost. The method proposed here seeks to provide a systematic approach for characterizing a portion of the uncertainties that designers are aware of and propagating it throughout the design process in a procedure that is easy to understand and implement. 
Using Monte Carlo simulations that sample from quantified distributions will allow a systems analyst to use a carpet plot-like approach to make statements like: "The aircraft is 'P'% likely to weigh 'X' lbs or less, given the
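    The Monte Carlo procedure described above can be sketched in a few lines; every coefficient and distribution below is invented for illustration, not drawn from any real aircraft database:

```python
import random

def predicted_weight(wing_area_ft2, predictor_error_lb):
    # Hypothetical semi-empirical gross-weight predictor (lb): a linear
    # regression term plus its quantified error. Coefficients are
    # illustrative only.
    return 15.0 * wing_area_ft2 + 2000.0 + predictor_error_lb

def monte_carlo_weight(n=20000, seed=1):
    # Sample the quantified input and predictor uncertainties and
    # return the resulting gross-weight distribution.
    rng = random.Random(seed)
    return [predicted_weight(rng.gauss(1000.0, 25.0),   # design input, ft^2
                             rng.gauss(0.0, 500.0))     # quantified predictor error, lb
            for _ in range(n)]

def prob_at_most(samples, limit_lb):
    # Supports the "'P'% likely to weigh 'X' lbs or less" statement.
    return sum(1 for w in samples if w <= limit_lb) / len(samples)
```

    With these assumed numbers, the analyst could report, for instance, the probability that the concept weighs 18,000 lb or less, instead of hiding the predictor scatter inside a fixed design margin.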

  3. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
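    The Monte Carlo approach described above can be illustrated for one signature, the runoff ratio; the multiplicative error magnitudes below are assumptions for the sketch, not the uncertainty estimates of the study:

```python
import random

def runoff_ratio(flow_mm, rain_mm):
    # Signature: fraction of catchment rainfall leaving as streamflow.
    return sum(flow_mm) / sum(rain_mm)

def signature_interval(flow_mm, rain_mm, n=10000, seed=7,
                       rating_err=0.10, rain_err=0.08):
    # Monte Carlo sketch of data uncertainty in a signature: perturb the
    # series with multiplicative errors standing in for rating-curve
    # uncertainty in discharge and rainfall measurement/interpolation
    # uncertainty, then report a 95% interval for the signature.
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        f = rng.gauss(1.0, rating_err)   # discharge multiplier
        r = rng.gauss(1.0, rain_err)     # rainfall multiplier
        samples.append(runoff_ratio([q * f for q in flow_mm],
                                    [p * r for p in rain_mm]))
    samples.sort()
    return samples[int(0.025 * n)], samples[int(0.975 * n)]
```

    Comparing such intervals across signatures and catchments is what allows the resulting uncertainties to be contrasted, as the study does.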

  4. Administrative automation in a scientific environment

    NASA Technical Reports Server (NTRS)

    Jarrett, J. R.

    1984-01-01

    Although the scientific personnel at GSFC were advanced in the development and use of hardware and software for scientific applications, resistance was widespread to the use of automation and to the purchase of terminals, software, and services specifically for administrative functions. The approach used to address problems and constraints and plans for administrative automation within the Space and Earth Sciences Directorate are delineated. Accomplishments thus far include reduction of paperwork and manual effort; improved communications through telemail and committees; additional support staff; increased awareness at all levels of ergonomic concerns and the need for training; better equipment; improved ADP skills through experience; management commitment; and an overall strategy for automating.

  5. An Evaluation of Curriculum Materials Based Upon the Socio-Scientific Reasoning Model.

    ERIC Educational Resources Information Center

    Henkin, Gayle; And Others

    To address the need to develop a scientifically literate citizenry, the socio-scientific reasoning model was created to guide curriculum development. Goals of this developmental approach include increasing: (1) students' skills in dealing with problems containing multiple interacting variables; (2) students' decision-making skills incorporating a…

  6. “Wrong, but Useful”: Negotiating Uncertainty in Infectious Disease Modelling

    PubMed Central

    Christley, Robert M.; Mort, Maggie; Wynne, Brian; Wastling, Jonathan M.; Heathwaite, A. Louise; Pickup, Roger; Austin, Zoë; Latham, Sophia M.

    2013-01-01

    For infectious disease dynamical models to inform policy for containment of infectious diseases the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to evaluate sufficiently model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be ‘used’. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in negotiating model credibility. We argue that usability and stability of a model is an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that renders uncertainty in infectious disease modelling a plastic quality that is amenable to ‘interpretive flexibility’. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts. PMID:24146851

  7. Demystifying Scientific Data - Using Earth Science to Teach the Scientific Method

    NASA Astrophysics Data System (ADS)

    Nassiff, P. J.; Santos, E. A.; Erickson, P. J.; Niell, A. E.

    2006-12-01

    The collection of large quantities of data and their subsequent analyses are important components of any scientific process, particularly at research institutes such as MIT's Haystack Observatory, where the collection and analysis of data are crucial to research efforts. Likewise, a recent study on science education concluded that students should be introduced to analyzing evidence and hypotheses, to critical thinking - including appropriate skepticism, to quantitative reasoning and the ability to make reasonable estimates, and to the role of uncertainty and error in science. In order to achieve this goal with grades 9-12 students and their instructors, we developed lesson plans and activities based on atmospheric science and geodetic research at Haystack Observatory. From the complex steps of experimental design, measurement, and data analysis, students and teachers will gain insight into the scientific research processes as they exist today. The use of these space weather and geodesy activities in classrooms will be discussed. Space Weather: After decades of data collection with multiple variables, space weather is about as complex an area of investigation as possible. Far from the passive relationship between the Sun and Earth often taught in the early grades, or the beautiful auroras discussed in high school, there are complex and powerful interactions between the Sun and Earth. In spite of these complexities, high school students can learn about space weather and the repercussions on our communication and power technologies. Starting from lessons on the basic method of observing space weather with incoherent scatter radar, and progressing to the use of simplified data sets, students will discover how space weather affects Earth over solar cycles and how severe solar activity is measured and affects the Earth over shorter time spans. They will see that even from complex, seemingly ambiguous data with many variables and unknowns, scientists can gain valuable

  8. Scientific Approaches | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    CPTAC employs two complementary scientific approaches, a "Targeting Genome to Proteome" (Targeting G2P) approach and a "Mapping Proteome to Genome" (Mapping P2G) approach, in order to address biological questions from data generated on a sample.

  9. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840

  10. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
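    A non-intrusive propagation method of the kind reviewed treats the finite element solver as a black box that is simply called repeatedly with sampled inputs; a toy sketch, with an analytic cantilever-beam formula standing in for the solver and an assumed 10% scatter on the Young's modulus (all values illustrative):

```python
import random

def beam_tip_deflection(force_n, length_m, youngs_modulus_pa, inertia_m4):
    # Stand-in for a black-box FE solver: analytic tip deflection of a
    # cantilever beam, delta = F * L^3 / (3 * E * I).
    return force_n * length_m**3 / (3.0 * youngs_modulus_pa * inertia_m4)

def propagate(n=5000, seed=3):
    # Non-intrusive Monte Carlo propagation: the solver is called
    # repeatedly with sampled inputs and never modified internally.
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        E = rng.gauss(200e9, 20e9)   # uncertain material stiffness, Pa
        out.append(beam_tip_deflection(1000.0, 1.0, E, 1e-6))
    mean = sum(out) / n
    var = sum((x - mean) ** 2 for x in out) / (n - 1)
    return mean, var ** 0.5
```

    An intrusive method would instead reformulate the solver's equations (e.g. as a polynomial chaos expansion), which is more efficient but requires access to the model internals.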

  11. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  12. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models.
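    The info-gap idea can be illustrated with the CAPM mentioned above. This sketch assumes a simple interval info-gap model on the market-return forecast; the numbers and the specific uncertainty model are assumptions for illustration, not the report's implementation:

```python
def capm_return(r_free, beta, r_market):
    # Capital Asset Pricing Model: expected return of an asset.
    return r_free + beta * (r_market - r_free)

def worst_case_return(r_free, beta, r_market_nominal, horizon):
    # Interval info-gap model: the market return is only known to lie
    # within `horizon` of its nominal forecast. For beta > 0 the worst
    # case sits at the low end of that interval.
    return capm_return(r_free, beta, r_market_nominal - horizon)

def robustness(r_free, beta, r_market_nominal, required):
    # Largest uncertainty horizon at which the worst-case return still
    # meets the requirement; solves worst_case(h) = nominal - beta*h
    # = required for h.
    nominal = capm_return(r_free, beta, r_market_nominal)
    return max(0.0, (nominal - required) / beta)
```

    Plotting worst-case return against the horizon yields the trade-off curve the report describes: the more uncertainty the analyst wants to accommodate, the lower the performance that can be guaranteed.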

  13. Environmental transmission of generalized anxiety disorder from parents to children: worries, experiential avoidance, and intolerance of uncertainty

    PubMed Central

    Aktar, Evin; Nikolić, Milica; Bögels, Susan M.

    2017-01-01

    Generalized anxiety disorder (GAD) runs in families. Building on recent theoretical approaches, this review focuses on potential environmental pathways for parent-to-child transmission of GAD. First, we address child acquisition of a generalized pattern of fearful/anxious and avoidant responding to potential threat from parents via verbal information and via modeling. Next, we address how parenting behaviors may contribute to maintenance of fearful/anxious and avoidant reactions in children. Finally, we consider intergenerational transmission of worries as a way of coping with experiential avoidance of strong negative emotions and with intolerance of uncertainty. We conclude that parents with GAD may bias their children's processing of potential threats in the environment by conveying the message that the world is not safe, that uncertainty is intolerable, that strong emotions should be avoided, and that worry helps to cope with uncertainty, thereby transmitting cognitive styles that characterize GAD. Our review highlights the need for research on specific pathways for parent-to-child transmission of GAD. PMID:28867938

  14. Environmental transmission of generalized anxiety disorder from parents to children: worries, experiential avoidance, and intolerance of uncertainty.

    PubMed

    Aktar, Evin; Nikolić, Milica; Bögels, Susan M

    2017-06-01

    Generalized anxiety disorder (GAD) runs in families. Building on recent theoretical approaches, this review focuses on potential environmental pathways for parent-to-child transmission of GAD. First, we address child acquisition of a generalized pattern of fearful/anxious and avoidant responding to potential threat from parents via verbal information and via modeling. Next, we address how parenting behaviors may contribute to maintenance of fearful/anxious and avoidant reactions in children. Finally, we consider intergenerational transmission of worries as a way of coping with experiential avoidance of strong negative emotions and with intolerance of uncertainty. We conclude that parents with GAD may bias their children's processing of potential threats in the environment by conveying the message that the world is not safe, that uncertainty is intolerable, that strong emotions should be avoided, and that worry helps to cope with uncertainty, thereby transmitting cognitive styles that characterize GAD. Our review highlights the need for research on specific pathways for parent-to-child transmission of GAD.

  15. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
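    One common way to keep aleatoric and epistemic sources segregated while treating the solver as a black box is a nested (second-order) sampling loop: epistemic parameters are swept over an interval (no distribution is claimed for them) while aleatoric inputs are sampled from distributions. A toy sketch, with an invented scalar function standing in for the flow solver and all numbers chosen for illustration:

```python
import random

def black_box_qoi(mach, turb_constant):
    # Stand-in for the flow solver: a hypothetical scalar quantity of
    # interest depending on an aleatoric inflow condition (mach) and an
    # epistemic turbulence-model constant.
    return 1.0 + 0.5 * mach + 0.2 * turb_constant

def nested_uq(n_epistemic=20, n_aleatory=500, seed=11):
    # Outer loop: sweep the epistemic constant over an interval.
    # Inner loop: Monte Carlo over the aleatoric input.
    # Returns the interval of mean QoI induced by the epistemic sweep.
    rng = random.Random(seed)
    means = []
    for i in range(n_epistemic):
        c = 0.8 + 0.4 * i / (n_epistemic - 1)   # epistemic interval [0.8, 1.2]
        qois = [black_box_qoi(rng.gauss(2.0, 0.05), c)
                for _ in range(n_aleatory)]
        means.append(sum(qois) / n_aleatory)
    return min(means), max(means)
```

    The width of the returned interval reflects lack of knowledge (e.g. turbulence model-form uncertainty), while the inner scatter reflects irreducible randomness, which is the delineation the effort aims at.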

  16. Fleck and the social constitution of scientific objectivity.

    PubMed

    Fagan, Melinda B

    2009-12-01

    Ludwik Fleck's theory of thought-styles has been hailed as a pioneer of constructivist science studies and sociology of scientific knowledge. But this consensus ignores an important feature of Fleck's epistemology. At the core of his account is the ideal of 'objective truth, clarity, and accuracy'. I begin with Fleck's account of modern natural science, locating the ideal of scientific objectivity within his general social epistemology. I then draw on Fleck's view of scientific objectivity to improve upon reflexive accounts of the origin and development of the theory of thought-styles, and reply to objections that Fleck's epistemological stance is self-undermining or inconsistent. Explicating the role of scientific objectivity in Fleck's epistemology reveals his view to be an internally consistent alternative to recent social accounts of scientific objectivity by Harding, Daston and Galison. I use these contrasts to indicate the strengths and weaknesses of Fleck's innovative social epistemology, and propose modifications to address the latter. The result is a renewed version of Fleck's social epistemology, which reconciles commitment to scientific objectivity with integrated sociology, history and philosophy of science.

  17. Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment

    NASA Astrophysics Data System (ADS)

    Tesfamichael, A. A.; Jagath, K. J.

    2004-12-01

    Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Due to limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect in risk estimates. The methodology will be demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue to estimate the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be twofold to fourfold higher than those predicted with the current approach, while intermediate risks may be twofold to threefold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology showed a lower intermediate risk. A closed-form solution of propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and the sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provided good insight into the importance of addressing uncertainty of observed water quality data and the need to predict likely maximum residues in risk assessment by considering propagation of uncertainties.
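    The abstract does not reproduce the closed-form solution, but the textbook analogue for choosing a sampling frequency conveys the idea: the number of independent samples needed to estimate a mean residue to a given half-width at ~95% confidence. This is a generic sketch, not the paper's formula:

```python
import math

def samples_needed(sigma, margin, z=1.96):
    # Classical sample-size relation: n = (z * sigma / margin)^2,
    # where sigma is the residue standard deviation and margin is the
    # desired half-width of the confidence interval on the mean.
    return math.ceil((z * sigma / margin) ** 2)
```

    For example, with a residue standard deviation of 2.0 ppb, pinning the mean down to +/- 0.5 ppb requires far more than the four samples per year collected under the SDWA.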

  18. A scientific assessment of a new technology orbital telescope

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As part of a program designed to test the Alpha chemical laser weapons system in space, the Ballistic Missile Defense Organization (BMDO) developed components of an agile, lightweight, 4-meter telescope, equipped with an advanced active-optics system. BMDO had proposed to make space available in the telescope's focal plane for instrumentation optimized for scientific applications in astrophysics and planetary astronomy for a potential flight mission. Such a flight mission could be undertaken if new or additional sponsorship can be found. Despite this uncertainty, BMDO requested assistance in defining the instrumentation and other design aspects necessary to enhance the scientific value of a pointing and tracking mission. In response to this request, the Space Studies Board established the Task Group on BMDO New Technology Orbital Observatory (TGBNTOO) and charged it to: (1) provide instrumentation, data management, and science-operations advice to BMDO to optimize the scientific value of a 4-meter mission; and (2) support a Space Studies Board assessment of the relative scientific merit of the program. This report deals with the first of these tasks, assessing the Advanced Technology Demonstrator (ATD) program's scientific potential. Given the potential scientific aspects of the 4-meter telescope, this project is referred to as the New Technology Orbital Telescope (NTOT), or as the ATD/NTOT, to emphasize its dual-use character. The task group's basic conclusion is that the ATD/NTOT mission does have the potential for contributing in a major way to astronomical goals.

  19. Patterns in Students' Argumentation Confronted with a Risk-focused Socio-scientific Issue

    NASA Astrophysics Data System (ADS)

    Kolstø, Stein Dankert

    2006-11-01

    This paper reports a qualitative study on students’ informal reasoning on a controversial socio-scientific issue. Twenty-two students from four science classes in Norway were interviewed about the local construction of new power lines and the possible increased risk of childhood leukaemia. The focus in the study is on what arguments the students employ when asked about their decision-making and the interplay between knowledge and personal values. Five different types of main arguments are identified: the relative risk argument, the precautionary argument, the uncertainty argument, the small risk argument, and the pros and cons argument. These arguments are presented through case studies, and crucial information and values are identified for each argument. The students made use of a range of both scientific and non-scientific knowledge. The findings are discussed in relation to possible consequences for teaching models aimed at increasing students’ ability to make thoughtful decisions on socio-scientific issues.

  20. Using structured decision making with landowners to address private forest management and parcelization: balancing multiple objectives and incorporating uncertainty

    Treesearch

    Paige F. B. Ferguson; Michael J. Conroy; John F. Chamblee; Jeffrey Hepinstall-Cymerman

    2015-01-01

    Parcelization and forest fragmentation are of concern for ecological, economic, and social reasons. Efforts to keep large, private forests intact may be supported by a decision-making process that incorporates landowners’ objectives and uncertainty. We used structured decision making (SDM) with owners of large, private forests in Macon County, North Carolina....

  1. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Kaiser, Maria; Špačková, Olga; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2018-05-01

    Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.
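    Risk-based comparison of protection options, as opposed to planning to a fixed return-period criterion, can be caricatured as minimizing expected total cost under posterior uncertainty in the flood exceedance probability. Everything below (option names, costs, damages, posterior samples) is invented for illustration and is not the study's model:

```python
def expected_total_cost(option, horizon_years=50):
    # Expected damage averages over posterior samples of the annual
    # exceedance probability, a crude stand-in for the Bayesian
    # treatment of climate and modeling uncertainty.
    probs = option["posterior_exceed_probs"]
    mean_p = sum(probs) / len(probs)
    return option["build_cost"] + mean_p * option["damage_if_exceeded"] * horizon_years

def best_option(options, horizon_years=50):
    # Risk-based choice: minimize expected total cost rather than
    # merely satisfying a protection criterion.
    return min(options, key=lambda o: expected_total_cost(o, horizon_years))

# Hypothetical alternatives for a catchment:
opts = [
    {"name": "100yr_dike", "build_cost": 10e6, "damage_if_exceeded": 50e6,
     "posterior_exceed_probs": [0.010, 0.020]},
    {"name": "300yr_dike", "build_cost": 18e6, "damage_if_exceeded": 50e6,
     "posterior_exceed_probs": [0.002, 0.004]},
]
choice = best_option(opts)
```

    A robust recommendation in the study's sense is one where the same option minimizes expected cost across the plausible range of posterior exceedance probabilities.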

  2. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation also requires integration with scholarly publication and indexing mechanisms, to assign credit, ensure discoverability, and provide provenance for software.

  3. Introducing Risk Analysis and Calculation of Profitability under Uncertainty in Engineering Design

    ERIC Educational Resources Information Center

    Kosmopoulou, Georgia; Freeman, Margaret; Papavassiliou, Dimitrios V.

    2011-01-01

    A major challenge that chemical engineering graduates face at the modern workplace is the management and operation of plants under conditions of uncertainty. Developments in the fields of industrial organization and microeconomics offer tools to address this challenge with rather well developed concepts, such as decision theory and financial risk…

  4. International survey for good practices in forecasting uncertainty assessment and communication

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

Achieving technically sound flood forecasts is a crucial objective for forecasters, but forecasts remain of little use if users do not properly understand their significance and apply them in decision making. One common way to make a forecast's limitations explicit is to communicate information about its uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. The French FFS plans to publish graphical streamflow and water-level forecasts, along with uncertainty assessments, on its public website in the near future. In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared in order to standardize the analysis of the surveyed institutions' practices. The survey was conducted by gathering information from technical reports and the scientific literature, as well as through interviews by phone, email discussions, or meetings. The questionnaire helped in exploring practices in uncertainty assessment, evaluation, and communication. In the analysis drawn from the raw results, attention was paid to the particular context within which each institution works. Results show that most services interviewed assess their forecasts' uncertainty; however, practices can differ significantly from one country to another. Ensemble approaches are popular, as they allow several sources of uncertainty to be taken into account. Statistical analysis of past forecasts (such as quantile regression) is also commonly used. Contrary to what was expected, only a few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty

  5. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions of (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes to the overall uncertainty in streamflow projections, using an analysis of variance (ANOVA) approach. Most impact assessment studies are carried out with hydrologic model parameters held fixed in the future; it is, however, necessary to address nonstationarity in model parameters under changing land use and climate. A regression-based methodology is presented to obtain hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future, and the flows are also sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as the dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
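The ANOVA-style partitioning described above can be sketched with synthetic numbers (this is a generic two-factor variance decomposition, not the study's data or code; the GCM and scenario effect sizes are invented):

```python
import numpy as np

# Two-factor ANOVA partitioning of projection spread into GCM,
# emission-scenario, and interaction/residual components.
rng = np.random.default_rng(0)
n_gcm, n_scen = 4, 3
gcm_effect = np.array([-20.0, -5.0, 5.0, 20.0])   # strong (assumed) GCM signal
scen_effect = np.array([-3.0, 0.0, 3.0])          # weaker scenario signal
proj = (gcm_effect[:, None] + scen_effect[None, :]
        + rng.normal(0.0, 1.0, (n_gcm, n_scen)))  # interaction/internal noise

grand = proj.mean()
ss_gcm = n_scen * ((proj.mean(axis=1) - grand) ** 2).sum()
ss_scen = n_gcm * ((proj.mean(axis=0) - grand) ** 2).sum()
ss_tot = ((proj - grand) ** 2).sum()
ss_int = ss_tot - ss_gcm - ss_scen                # interaction + residual

fractions = {k: v / ss_tot for k, v in
             [("GCM", ss_gcm), ("scenario", ss_scen), ("interaction", ss_int)]}
print(fractions)
```

With these invented effects the GCM term dominates the variance budget, mirroring the qualitative finding that GCMs are a dominant uncertainty source.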

  6. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and, to a lesser degree, 'project cancellation' are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  7. When can scientific studies promote consensus among conflicting stakeholders?

    PubMed

    Small, Mitchell J; Güvenç, Ümit; DeKay, Michael L

    2014-11-01

    While scientific studies may help conflicting stakeholders come to agreement on a best management option or policy, often they do not. We review the factors affecting trust in the efficacy and objectivity of scientific studies in an analytical-deliberative process where conflict is present, and show how they may be incorporated in an extension to the traditional Bayesian decision model. The extended framework considers stakeholders who differ in their prior beliefs regarding the probability of possible outcomes (in particular, whether a proposed technology is hazardous), differ in their valuations of these outcomes, and differ in their assessment of the ability of a proposed study to resolve the uncertainty in the outcomes and their hazards--as measured by their perceived false positive and false negative rates for the study. The Bayesian model predicts stakeholder-specific preposterior probabilities of consensus, as well as pathways for increasing these probabilities, providing important insights into the value of scientific information in an analytic-deliberative decision process where agreement is sought. It also helps to identify the interactions among perceived risk and benefit allocations, scientific beliefs, and trust in proposed scientific studies when determining whether a consensus can be achieved. The article provides examples to illustrate the method, including an adaptation of a recent decision analysis for managing the health risks of electromagnetic fields from high voltage transmission lines. © 2014 Society for Risk Analysis.
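The core Bayesian mechanism described above can be sketched in a few lines. This is a simplified single-update illustration (not the authors' full preposterior model): a stakeholder's prior belief that a technology is hazardous is updated by a binary study outcome, weighted by that stakeholder's own perceived false-positive and false-negative rates for the study. All numbers are hypothetical.

```python
def posterior_hazard(prior, fp_rate, fn_rate, study_says_hazard):
    """Bayes update of P(hazardous) after a binary study outcome."""
    if study_says_hazard:
        like_h = 1.0 - fn_rate    # P(positive result | hazardous)
        like_not = fp_rate        # P(positive result | safe)
    else:
        like_h = fn_rate          # P(negative result | hazardous)
        like_not = 1.0 - fp_rate  # P(negative result | safe)
    num = like_h * prior
    return num / (num + like_not * (1.0 - prior))

# Two stakeholders share a prior of 0.8 but differ in trust in the study;
# the same "no hazard" finding moves their beliefs very differently.
skeptic = posterior_hazard(prior=0.8, fp_rate=0.3, fn_rate=0.3, study_says_hazard=False)
truster = posterior_hazard(prior=0.8, fp_rate=0.05, fn_rate=0.05, study_says_hazard=False)
print(skeptic, truster)
```

This illustrates why consensus can fail even after a well-run study: stakeholders who perceive the study as error-prone barely update, regardless of the result.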

  8. Enhancing Scientific Foundations to Ensure Reproducibility: A New Paradigm.

    PubMed

    Hsieh, Terry; Vaickus, Max H; Remick, Daniel G

    2018-01-01

    Progress in science is dependent on a strong foundation of reliable results. The publish or perish paradigm in research, coupled with an increase in retracted articles from the peer-reviewed literature, is beginning to erode the trust of both the scientific community and the public. The NIH is combating errors by requiring investigators to follow new guidelines addressing scientific premise, experimental design, biological variables, and authentication of reagents. Herein, we discuss how implementation of NIH guidelines will help investigators proactively address pitfalls of experimental design and methods. Careful consideration of the variables contributing to reproducibility helps ensure robust results. The NIH, investigators, and journals must collaborate to ensure that quality science is funded, explored, and published. Copyright © 2018 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  9. Optimization Control of the Color-Coating Production Process for Model Uncertainty

    PubMed Central

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563

  11. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

This study was conducted to investigate the correlations among uncertainty, mastery, and appraisal of uncertainty in mothers of hospitalized children. Self-report questionnaires were used to measure the three variables. In the data analysis, the SPSS WIN 12.0 program was used for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p < .001) and with the danger appraisal of uncertainty (r = -.514, p < .001). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and the danger appraisal of uncertainty in these mothers. Therefore, nursing interventions that improve mastery should be developed for mothers of hospitalized children.

  12. Scientific Synergy between LSST and Euclid

    NASA Astrophysics Data System (ADS)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja

    2017-12-01

    Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  13. Investments in energy technological change under uncertainty

    NASA Astrophysics Data System (ADS)

    Shittu, Ekundayo

    2009-12-01

This dissertation addresses the crucial problem of how environmental policy uncertainty influences investments in energy technological change. The rising level of carbon emissions due to increasing global energy consumption calls for a policy shift. In order to stem the negative consequences for the climate, policymakers are concerned with crafting an optimal regulation that will encourage technology investments. However, decision makers face uncertainties surrounding future environmental policy. The first part considers the treatment of technological change in theoretical models. This part has two purposes: (1) to show, through illustrative examples, that technological change can lead to quite different, and surprising, impacts on the marginal costs of pollution abatement; we demonstrate an intriguing and uncommon result that technological change can increase the marginal costs of pollution abatement over some range of abatement; and (2) to show the impact of this uncommon observation on policy. We find that under the assumption of technical change that can increase the marginal cost of pollution abatement over some range, the ranking of policy instruments is affected. The second part builds on the first by considering the impact of uncertainty in the carbon tax on investments in a portfolio of technologies. We determine the response of energy R&D investments as the carbon tax increases, both overall and for specific technologies, and we determine the impact of risk in the carbon tax on the portfolio. We find that the response of the optimal investment in a portfolio of technologies to an increasing carbon tax depends on the relative costs of the programs and the elasticity of substitution between fossil and non-fossil energy inputs. In the third part, we zoom in on the portfolio model above to consider how uncertainty in the magnitude and timing of a carbon tax influences investments. Under a two-stage continuous-time optimal control model, we

  14. The Difference between Uncertainty and Information, and Why This Matters

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2016-12-01

Earth science investigation and arbitration (for decision making) is very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain rather historic logical errors. Instead, I propose that we should be asking questions about information. I argue here that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: (1) coherent and non-subjective hypothesis tests for complex systems models; (2) process-level diagnostics for complex systems models; (3) methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics; and (4) asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that hydrology and the other Earth systems sciences can immediately capitalize on this to address some of our most difficult and persistent problems.

  15. Precision pointing of scientific instruments on space station: The LFGGREC perspective

    NASA Technical Reports Server (NTRS)

    Blackwell, C. C.; Sirlin, S. W.; Laskin, R. A.

    1988-01-01

    An application of Lyapunov function-gradient-generated robustness-enhancing control (LFGGREC) is explored. The attention is directed to a reduced-complexity representation of the pointing problem presented by the system composed of the Space Infrared Telescope Facility gimbaled to a space station configuration. Uncertainties include disturbance forces applied in the crew compartment area and control moments applied to adjacent scientific payloads (modeled as disturbance moments). Also included are uncertainties in gimbal friction and in the structural component of the system, as reflected in the inertia matrix, the damping matrix, and the stiffness matrix, and the effect of the ignored vibrational dynamics of the structure. The emphasis is on the adaptation of LFGGREC to this particular configuration and on the robustness analysis.

  16. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
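Constraint (a) above follows from the convexity of damages in warming, and can be checked numerically. The quadratic damage function and the distribution parameters below are illustrative assumptions, not the authors' calibration: widening the distribution of warming while holding its mean fixed raises expected damage.

```python
import numpy as np

# Jensen's-inequality illustration: with a convex damage function
# (here damage ~ warming**2, an assumed form), more uncertainty about
# warming at the same mean implies higher expected damage.
rng = np.random.default_rng(1)
mean_warming = 3.0
low_u = rng.normal(mean_warming, 0.5, 100_000)   # narrow uncertainty
high_u = rng.normal(mean_warming, 1.5, 100_000)  # wide uncertainty

def damage(t):
    return np.maximum(t, 0.0) ** 2

print(damage(low_u).mean(), damage(high_u).mean())
```

The ordering holds for any convex damage function, which is why the constraint is robust to the specific functional form.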

  17. Fourth International Conference on Squeezed States and Uncertainty Relations

    NASA Technical Reports Server (NTRS)

    Han, D. (Editor); Peng, Kunchi (Editor); Kim, Y. S. (Editor); Manko, V. I. (Editor)

    1996-01-01

The fourth International Conference on Squeezed States and Uncertainty Relations was held at Shanxi University, Taiyuan, Shanxi, China, on June 5 - 9, 1995. This conference was jointly organized by Shanxi University, the University of Maryland (U.S.A.), and the Lebedev Physical Institute (Russia). The first meeting of this series was called the Workshop on Squeezed States and Uncertainty Relations, and was held in 1991 at College Park, Maryland. The second and third meetings in this series were hosted in 1992 by the Lebedev Institute in Moscow, and in 1993 by the University of Maryland Baltimore County, respectively. The scientific purpose of this series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including, of course, quantum optics and the foundations of quantum mechanics. Quantum optics will continue to play the pivotal role, but future meetings will include all branches of physics in which squeeze transformations are a basic transformation. This transition took place at the fourth meeting of this series, held at Shanxi University in 1995. The fifth meeting in this series will be held in Budapest (Hungary) in 1997, and the principal organizer will be Jozsef Janszky of the Laboratory of Crystal Physics, P.O. Box 132, H-1052 Budapest, Hungary.

  18. National Institutes of Health addresses the science of diversity

    PubMed Central

    Valantine, Hannah A.; Collins, Francis S.

    2015-01-01

    The US biomedical research workforce does not currently mirror the nation’s population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation’s health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity’s impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce. PMID:26392553

  19. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program

  20. Analyzing Uncertainty and Risk in the Management of Water Resources in the State Of Texas

    NASA Astrophysics Data System (ADS)

    Singh, A.; Hauffpauir, R.; Mishra, S.; Lavenue, M.

    2010-12-01

The State of Texas updates its state water plan every five years to determine the water demand required to meet its growing population. The plan compiles forecasts of water deficits from state-wide regional water planning groups as well as the water supply strategies to address these deficits. To date, the plan has adopted a deterministic framework, where reference values (e.g., best estimates, worst-case scenarios) are used for key factors such as population growth, demand for water, severity of drought, and water availability. These key factors can, however, be affected by multiple sources of uncertainty, such as the impact of climate on surface water and groundwater availability, uncertainty in population projections, changes in the sectoral composition of the economy, variability in water usage, feasibility of the permitting process, and cost of implementation. The objective of this study was to develop a generalized and scalable methodology for addressing uncertainty and risk in water resources management at both the regional and the local water planning level. The study proposes a framework defining the elements of an end-to-end system model that captures the key components of demand, supply, and planning modules along with their associated uncertainties. The framework preserves the fundamental elements of the well-established planning process in the State of Texas, promoting an incremental and stakeholder-driven approach to adding different levels of uncertainty (and risk) into the decision-making environment. The uncertainty in the water planning process is broken down into two primary categories: demand uncertainty and supply uncertainty. Uncertainty in demand is related to the uncertainty in population projections and per-capita usage rates. Uncertainty in supply, in turn, is dominated by the uncertainty in future climate conditions. Climate is represented in terms of time series of precipitation, temperature and/or surface evaporation flux for some

  1. Facebook use during relationship termination: uncertainty reduction and surveillance.

    PubMed

    Tong, Stephanie Tom

    2013-11-01

    Many studies document how individuals use Facebook to meet partners or develop and maintain relationships. Less is known about information-seeking behaviors during the stages of relationship termination. Relational dissolution is a socially embedded activity, and affordances of social network sites offer many advantages in reducing uncertainty after a breakup. A survey collected responses from 110 individuals who use Facebook to gather information about their romantic ex-partners. Results indicated that after breakup, partners may take advantage of the system's information visibility and the relative invisibility of movement depending on relational factors (initiator role and breakup uncertainty), social factors (perceived network approval of Facebook surveillance), and individual privacy concerns. This investigation addresses questions such as: What types of information-seeking foci do individuals employ? How do individuals use Facebook as a form of surveillance? What factors motivate surveillance behavior?

  2. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
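
The propagation rule described above (uncertainty contribution = sensitivity × measurement uncertainty for each property) can be sketched as follows, assuming the contributions combine in quadrature; the sensitivities and measurement uncertainties are illustrative, not the paper's values:

```python
import math

# Hypothetical sensitivities (change in DRF per unit change in each
# property) and typical measurement uncertainties; DRF in W m^-2.
components = {
    # name: (sensitivity dDRF/dx, measurement uncertainty in x)
    "aerosol_optical_depth": (-25.0, 0.01),
    "single_scattering_albedo": (30.0, 0.03),
    "asymmetry_parameter": (10.0, 0.02),
    "surface_albedo": (8.0, 0.02),
}

# Contribution of each property, and the quadrature-combined total.
contributions = {k: s * u for k, (s, u) in components.items()}
total = math.sqrt(sum(c * c for c in contributions.values()))

for name, c in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:26s} {c:+.3f} W m^-2")
print(f"combined uncertainty: {total:.3f} W m^-2")
```

Sorting the contributions immediately identifies the property that most limits accuracy, which with these placeholder numbers is the single scattering albedo, as in the abstract.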

  3. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  4. Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops

    NASA Astrophysics Data System (ADS)

    Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said

    2017-11-01

    Installations of solar panels on Australian rooftops have been on the rise over the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, whose variability is described by a probability distribution, with special attention paid to Australia through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the actual data variation and the model.
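
A hedged sketch of the sampling step: quasi Monte Carlo (Sobol') samples of a clearness index drawn from an assumed Beta distribution, pushed through a generic Erbs-type diffuse-fraction correlation as a stand-in for the paper's Australia-specific best fit:

```python
import numpy as np
from scipy.stats import qmc, beta

# Low-discrepancy uniforms -> clearness-index samples via the inverse CDF.
# The Beta(4, 2.5) parameters are illustrative, not the paper's fit.
sampler = qmc.Sobol(d=1, scramble=True, seed=1)
u = sampler.random(2**12).ravel()
kt = beta.ppf(u, a=4.0, b=2.5)

def diffuse_fraction(kt):
    """Erbs et al. (1982) piecewise correlation (generic stand-in)."""
    return np.where(
        kt <= 0.22, 1.0 - 0.09 * kt,
        np.where(kt <= 0.80,
                 0.9511 - 0.1604*kt + 4.388*kt**2 - 16.638*kt**3 + 12.336*kt**4,
                 0.165))

kd = diffuse_fraction(kt)
print(f"mean clearness index: {kt.mean():.3f}")
print(f"mean diffuse fraction: {kd.mean():.3f}")
```

The power-of-two sample size (2^12) preserves the balance properties of the Sobol' sequence.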

  5. Risk intelligence: making profit from uncertainty in data processing system.

    PubMed

    Zheng, Si; Liao, Xiangke; Liu, Xiaodong

    2014-01-01

    In extreme scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes the norm rather than the exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address this issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions while taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities: with a careful design, we can benefit from such uncertainties. We implement the idea in Hadoop 0.21.0 and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput.
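
One way to make speculative execution risk-aware, in the spirit of (but not identical to) RiskI, is to launch a backup copy of a task only when the expected time saved, given a probabilistic prediction of the task's remaining time, exceeds the cost of duplication. A minimal sketch under a normal-prediction assumption:

```python
import math

def should_speculate(mean_remaining, std_remaining, backup_time, cost=0.0):
    """Decide whether launching a backup copy is worthwhile.

    Under a normal model of remaining time X ~ N(mean, std), the expected
    saving from a backup finishing in `backup_time` seconds is
    E[max(X - backup_time, 0)] = std * (z * Phi(z) + phi(z)),
    with z = (mean - backup_time) / std.
    """
    z = (mean_remaining - backup_time) / std_remaining
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)   # normal pdf
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))          # normal cdf
    expected_saving = std_remaining * (z * Phi + phi)
    return expected_saving > cost, expected_saving

# A slow, highly uncertain task is worth duplicating; a near-done one is not.
print(should_speculate(120, 40, backup_time=60, cost=10))
print(should_speculate(15, 5, backup_time=60, cost=10))
```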

  6. Risk Intelligence: Making Profit from Uncertainty in Data Processing System

    PubMed Central

    Liao, Xiangke; Liu, Xiaodong

    2014-01-01

    In extreme scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes the norm rather than the exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address this issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions while taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities: with a careful design, we can benefit from such uncertainties. We implement the idea in Hadoop 0.21.0 and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput. PMID:24883392

  7. Addressing spatial scales and new mechanisms in climate impact ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.

    2015-12-01

    Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or with process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution and large uncertainty remains in a variety of fundamental physical processes. To address these issues, here we present two DGVM-based case studies where (i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical, forest gap model, LPJ-GUESS, and (ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only additional computing and code reorganization, but also consideration of how spatial scale affects vegetation dynamics and stochasticity, as well as disturbance and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves, and their interactions in controlling source-sink dynamics and water potentials. Improving DGVM approaches by addressing spatial scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant for land management, and possibly reduce uncertainty by representing physical processes in a way more directly comparable to experimental and observational evidence.

  8. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
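
An uncertainty budget of the kind described can be assembled as a list of components, each with a standard uncertainty and a sensitivity coefficient, combined into a single standard uncertainty in the GUM manner; the entries below are hypothetical, not Langley's actual budget:

```python
import math

# Minimal GUM-style uncertainty budget for a measured sonic boom level.
# Each entry: (standard uncertainty of the input, sensitivity coefficient).
# All values are illustrative placeholders, in dB.
budget = {
    "microphone calibration": (0.30, 1.0),
    "loudspeaker repeatability": (0.25, 1.0),
    "position in simulator": (0.40, 1.0),
    "door-opening pressure fluctuations": (0.15, 1.0),
}

combined = math.sqrt(sum((u * c) ** 2 for u, c in budget.values()))
expanded = 2.0 * combined   # coverage factor k = 2 (~95 % coverage)

print(f"combined standard uncertainty: {combined:.2f} dB")
print(f"expanded uncertainty (k=2):    {expanded:.2f} dB")
```

Reporting the combined standard uncertainty alongside each measured level is exactly the practice the abstract describes.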

  9. The US-DOE ARM/ASR Effort in Quantifying Uncertainty in Ground-Based Cloud Property Retrievals (Invited)

    NASA Astrophysics Data System (ADS)

    Xie, S.; Protat, A.; Zhao, C.

    2013-12-01

    One primary goal of the US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program is to obtain and retrieve cloud microphysical properties from detailed cloud observations using ground-based active and passive remote sensors. However, there is large uncertainty in the retrieved cloud property products. Studies have shown that the uncertainty could arise from instrument limitations, measurement errors, sampling errors, and deficiencies in retrieval algorithm assumptions, as well as inconsistent input data and constraints used by different algorithms. To quantify the uncertainty in cloud retrievals, a scientific focus group, Quantification of Uncertainties In Cloud Retrievals (QUICR), was recently created by the DOE Atmospheric System Research (ASR) program. This talk will provide an overview of the recent research activities conducted within QUICR and discuss its current collaborations with the European cloud retrieval community and future plans. The goal of QUICR is to develop a methodology for characterizing and quantifying uncertainties in current and future ARM cloud retrievals. The work at LLNL was performed under the auspices of the U.S. Department of Energy (DOE), Office of Science, Office of Biological and Environmental Research by Lawrence Livermore National Laboratory under contract No. DE-AC52-07NA27344. LLNL-ABS-641258.

  10. Keynote Address: Science Since the Medicean Stars and the Beagle

    NASA Astrophysics Data System (ADS)

    Partridge, B.; Hillenbrand, L. A.; Grinspoon, D.

    2010-08-01

    In 2009, the world celebrates both the International Year of Astronomy (IYA), commemorating the 400th anniversary of Galileo's first observations of the heavens with his telescope, and the 200th anniversary of the birth of Charles Darwin and the 150th anniversary of the publication of his Origin of Species, a key impetus for the 2009 Year of Science. In this keynote address, the three presenters (distinguished scientists themselves) will reflect on how these recent centuries of astronomical and scientific discovery have changed our perspectives about the universe, the natural world, and ourselves—and underpin our education and public outreach efforts to help ensure continued scientific advance in the future.

  11. Managing the uncertainties of the streamflow data produced by the French national hydrological services

    NASA Astrophysics Data System (ADS)

    Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allowing them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary to the data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO 748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, which is the software used by the French NHS for managing streamgauging measurements.
A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level
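
The ISO 748-style propagation for a velocity-area gauging can be sketched roughly as follows (a simplification; the Q+ method cited above refines this treatment, and all default values are illustrative):

```python
import math

def gauging_uncertainty(m, u_b=0.01, u_d=0.01, u_v=0.05, u_m=0.03, u_s=0.01):
    """Relative combined standard uncertainty of a velocity-area discharge.

    Q = sum_i b_i * d_i * v_i over m verticals, with relative uncertainties
    u_b, u_d, u_v in width, depth and mean velocity per vertical (assumed
    uncorrelated between verticals, so they average down as 1/m), u_m for
    the limited number of verticals, and u_s for systematic effects.
    A simplified sketch, not the full ISO 748 formula.
    """
    random_term = (u_b**2 + u_d**2 + u_v**2) / m
    return math.sqrt(u_s**2 + u_m**2 + random_term)

for m in (10, 20, 30):
    print(f"{m} verticals -> u(Q) = {100 * gauging_uncertainty(m):.2f} %")
```

The sketch makes the operational trade-off visible: adding verticals shrinks only the random term, while the systematic terms set a floor on the achievable uncertainty.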

  12. Evaluation of the Uncertainty in JP-7 Kinetics Models Applied to Scramjets

    NASA Technical Reports Server (NTRS)

    Norris, A. T.

    2017-01-01

    One of the challenges of designing and flying a scramjet-powered vehicle is the difficulty of preflight testing. Ground tests at realistic flight conditions introduce several sources of uncertainty to the flow that must be addressed. For example, the scales of the available facilities limit the size of vehicles that can be tested and so performance metrics for larger flight vehicles must be extrapolated from ground tests at smaller scales. To create the correct flow enthalpy for higher Mach number flows, most tunnels use a heater that introduces vitiates into the flow. At these conditions, the effects of the vitiates on the combustion process is of particular interest to the engine designer, where the ground test results must be extrapolated to flight conditions. In this paper, the uncertainty of the cracked JP-7 chemical kinetics used in the modeling of a hydrocarbon-fueled scramjet was investigated. The factors that were identified as contributing to uncertainty in the combustion process were the level of flow vitiation, the uncertainty of the kinetic model coefficients and the variation of flow properties between ground testing and flight. The method employed was to run simulations of small, unit problems and identify which variables were the principal sources of uncertainty for the mixture temperature. Then using this resulting subset of all the variables, the effects of the uncertainty caused by the chemical kinetics on a representative scramjet flow-path for both vitiated (ground) and nonvitiated (flight) flows were investigated. The simulations showed that only a few of the kinetic rate equations contribute to the uncertainty in the unit problem results, and when applied to the representative scramjet flowpath, the resulting temperature variability was on the order of 100 K. 
Both the vitiated and clean air results showed very similar levels of uncertainty, and the difference between the mean properties were generally within the range of uncertainty predicted.

  13. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
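
The first step of such an analysis, propagating uncertainty in the semivariogram's structural parameters (range, sill, nugget) drawn from rescaled beta distributions, might look like this sketch for an exponential semivariogram (bounds and parameters are hypothetical, not the Chicot aquifer values):

```python
import numpy as np

rng = np.random.default_rng(7)

# Structural parameters sampled from beta distributions rescaled to
# plausible bounds (all values hypothetical).
n = 5_000
range_ = 500 + 1500 * rng.beta(2, 2, n)    # correlation range, m
sill = 0.5 + 1.5 * rng.beta(2, 5, n)       # partial sill
nugget = 0.05 + 0.25 * rng.beta(2, 8, n)   # nugget

h = np.linspace(0, 2000, 101)              # lag distances, m

# gamma(h) = nugget + sill * (1 - exp(-3h / range)), one curve per draw
gamma = nugget[:, None] + sill[:, None] * (1 - np.exp(-3 * h / range_[:, None]))

lo, med, hi = np.percentile(gamma, [5, 50, 95], axis=0)
print(f"gamma at h=1000 m: {med[50]:.3f} (90% band {lo[50]:.3f} to {hi[50]:.3f})")
```

Feeding each sampled semivariogram through the kriging and flow model, rather than using a single fitted one, is what exposes the extra capture zone uncertainty the study reports.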

  14. Characterizing bias correction uncertainty in wheat yield predictions

    NASA Astrophysics Data System (ADS)

    Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam

    2017-04-01

    Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too uncertain for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to be more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop. The objective of our work is to compare the resulting simulation-driven hindcasted wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield
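
One of the bias correction methods compared in such studies, empirical quantile-quantile mapping, can be sketched generically as follows (synthetic data, not the EURO-CORDEX setup):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Map future model values through historical model -> observed quantiles."""
    quantiles = np.linspace(0, 1, 101)
    mq = np.quantile(model_hist, quantiles)   # model's historical quantiles
    oq = np.quantile(obs_hist, quantiles)     # observed quantiles
    # position of each future value in the historical model distribution
    p = np.interp(model_future, mq, quantiles)
    # read off the observed value at the same quantile
    return np.interp(p, quantiles, oq)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, 3000)       # "observed" daily precipitation
model = rng.gamma(2.0, 3.0, 3000)     # biased (too wet) model climate
future = rng.gamma(2.0, 3.3, 3000)    # future simulation

corrected = quantile_map(model, obs, future)
print(f"model mean {model.mean():.2f} vs obs mean {obs.mean():.2f}")
print(f"future raw {future.mean():.2f} -> corrected {corrected.mean():.2f}")
```

Linear scaling, variance scaling and power transformation replace the quantile lookup with progressively simpler parametric adjustments; comparing their corrected outputs is what quantifies the bias correction contribution to the uncertainty cascade.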

  15. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  16. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kappeler, Franz

    2010-11-09

    F. Kappeler speaks about EFNUDAT synergies in astrophysics in this second session of the Final Scientific EFNUDAT Workshop. The workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, was held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: data evaluation, cross section measurements, experimental techniques, uncertainties and covariances, fission properties, and current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  17. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th deg. spatial resolution from two different downscaling procedures are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of GCMs and observation suggests a more accurate representation for BMA than individual models. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate significant increases in annual precipitation and temperature, with varying degrees of change across different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the main source of uncertainty among those considered. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer. On the contrary, downscaling uncertainty appears to be higher than scenario uncertainty for precipitation.
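
A minimal sketch of the BMA idea, weighting ensemble members by their likelihood against observations (full BMA, e.g. as in Raftery et al. 2005, also fits per-model variances via EM; the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "observations" and a four-member ensemble with different biases.
obs = 10 + rng.normal(0.0, 1.0, 30)
models = np.stack([obs + rng.normal(bias, 1.0, 30)
                   for bias in (0.2, -0.5, 1.5, 3.0)])

# Likelihood-based weights: better-fitting members receive more mass.
sse = ((models - obs) ** 2).sum(axis=1)
loglik = -0.5 * sse
w = np.exp(loglik - loglik.max())   # subtract max for numerical stability
w /= w.sum()

# Probabilistic projection: the BMA mean of the ensemble.
bma_mean = (w[:, None] * models).sum(axis=0)
print("weights:", np.round(w, 3))
print(f"BMA mean of first value: {bma_mean[0]:.2f}")
```

The weights concentrate on the members with the smallest historical misfit, which is why the BMA combination tends to represent long-term attributes better than most individual models.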

  18. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems.
A key software product of the MIT QUEST effort is

  19. Uncertainty in mixing models: a blessing in disguise?

    NASA Astrophysics Data System (ADS)

    Delsman, J. R.; Oude Essink, G. H. P.

    2012-04-01

    Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few studies have addressed the uncertainty associated with these studies in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing model analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km2 agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for a more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework for applying the end-member mixing analysis not only quantified the uncertainty associated with the analysis, the analysis of the posterior parameter set also identified the existence of catchment processes otherwise overlooked.
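
The Monte Carlo mixing analysis with a GLUE-like acceptance step can be sketched for the simplest case of two end-members and one tracer; the chloride-like concentrations below are illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(11)

# Two-end-member, one-tracer mixing with uncertain end-member and sample
# compositions (all concentrations hypothetical, e.g. mg Cl/L).
n = 100_000
c_seep = rng.normal(2000, 300, n)     # brackish seepage end-member
c_fresh = rng.normal(50, 15, n)       # fresh flushing-water end-member
c_outlet = rng.normal(700, 80, n)     # sampled catchment outlet

# Mass balance: c_outlet = f * c_seep + (1 - f) * c_fresh
f_seep = (c_outlet - c_fresh) / (c_seep - c_fresh)

# Behavioural (GLUE-like) filter: keep physically meaningful mixtures only.
behavioural = f_seep[(f_seep >= 0) & (f_seep <= 1)]
lo, med, hi = np.percentile(behavioural, [5, 50, 95])

print(f"accepted {behavioural.size}/{n} samples")
print(f"seepage fraction: {med:.2f} (90% interval {lo:.2f} to {hi:.2f})")
```

The width of the posterior interval, not just its median, is the payoff: it shows how much of the apparent seepage contribution is attributable to end-member variability rather than to the mixing signal itself.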

  20. Uncertainty Quantification in Geomagnetic Field Modeling

    NASA Astrophysics Data System (ADS)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  1. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth and institutional changes pose a major challenge to the planning of water supply systems. Planners seek not only optimal portfolios of supply and demand management schemes but also optimal timing for activating assets, whilst considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e. how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios, among other factors. The approach and results are compared with a previous study in which only the portfolio problem was addressed (without scheduling).

  2. Effects of Uncertainty on ERPs to Emotional Pictures Depend on Emotional Valence

    PubMed Central

    Lin, Huiyan; Jin, Hua; Liang, Jiafeng; Yin, Ruru; Liu, Ting; Wang, Yiwen

    2015-01-01

    Uncertainty about the emotional content of an upcoming event has been found to modulate neural activity before the event occurs. However, it is still under debate whether uncertainty effects also occur after the event. To address this issue, participants were asked to view emotional pictures presented shortly after a cue, which either did or did not indicate the emotion of the upcoming picture. Both certain and uncertain cues were neutral symbols. The anticipatory phase (i.e., inter-trial interval, ITI) between the cue and the picture was kept short to enhance the effects of uncertainty. In addition, we used positive and negative pictures that differed only in valence, not in arousal, to investigate whether the uncertainty effect depends on emotional valence. Electroencephalography (EEG) was recorded during presentation of the pictures. Event-related potential (ERP) results showed that negative pictures evoked smaller P2 and late LPP but larger N2 in the uncertain as compared to the certain condition, whereas no uncertainty effect was found in the early LPP. For positive pictures, the early LPP was larger in the uncertain as compared to the certain condition, but there were no uncertainty effects in the other ERP components (P2, N2, and late LPP). The findings suggest that uncertainty modulates neural activity to emotional pictures and that this modulation depends on the valence of the pictures, indicating that individuals allocate attentional resources toward uncertain emotional pictures differently depending on valence. PMID:26733916

  3. Reducing patients' anxiety and uncertainty, and improving recall in bad news consultations.

    PubMed

    van Osch, Mara; Sep, Milou; van Vliet, Liesbeth M; van Dulmen, Sandra; Bensing, Jozien M

    2014-11-01

    Patients' recall of provided information during bad news consultations is poor. According to the attentional narrowing hypothesis, the emotional arousal caused by the bad news might be responsible for this hampered information processing. Because affective communication has proven effective in tempering patients' emotional reactions, the current study used an experimental design to explore whether the physician's affective communication in bad news consultations decreases patients' anxiety and uncertainty and improves information recall. Two scripted video-vignettes of a bad news consultation were used in which the physician's verbal communication was manipulated (standard vs. affective condition). Fifty healthy women (i.e., analogue patients) randomly watched 1 of the 2 videos. The effect of communication on participants' anxiety, uncertainty, and recall was assessed by self-report questionnaires. Additionally, a moderator analysis was performed. Affective communication reduced anxiety (p = .01) and uncertainty (p = .04), and improved recall (p = .05), especially for information about prognosis (p = .04) and, to some extent, for treatment options (p = .07). The moderating effect of (reduced) anxiety and uncertainty on recall could not be confirmed, although there was a trend for uncertainty. Physicians' affective communication can temper patients' anxiety and uncertainty during bad news consultations, and enhance their ability to recall medical information. The reduction of anxiety and uncertainty could not explain patients' enhanced recall, which leaves the underlying mechanism unspecified. Our findings underline the importance of addressing patients' emotions and provide empirical support for incorporating this in clinical guidelines and recommendations. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  4. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

    Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, developing well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate, instead employing them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, this vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
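    The post-optimization screening step described above can be sketched as follows: candidate decision pathways are scored under several alternative scenario-probability assumptions, and the pathway whose expected performance varies least across those assumptions is preferred. All pathway names, scenario performances, and probabilities below are hypothetical.

```python
# Hypothetical performance (arbitrary benefit units) of candidate decision
# pathways under three climate scenarios.
pathways = {
    "build_dam_2025":  {"dry": 40.0, "median": 70.0, "wet": 90.0},
    "expand_irrig":    {"dry": 55.0, "median": 65.0, "wet": 75.0},
    "defer_and_adapt": {"dry": 50.0, "median": 68.0, "wet": 80.0},
}
# Alternative probability assignments over the scenarios (assumed, not derived).
prob_sets = [
    {"dry": 0.2, "median": 0.5, "wet": 0.3},
    {"dry": 0.4, "median": 0.4, "wet": 0.2},
    {"dry": 0.1, "median": 0.4, "wet": 0.5},
]

def expected(perf, probs):
    """Probability-weighted expected performance of one pathway."""
    return sum(perf[s] * p for s, p in probs.items())

def least_sensitive(pathways, prob_sets):
    """Pick the pathway whose expected performance varies least across
    the alternative probability assumptions."""
    spreads = {}
    for name, perf in pathways.items():
        scores = [expected(perf, p) for p in prob_sets]
        spreads[name] = max(scores) - min(scores)
    return min(spreads, key=spreads.get)

print(least_sensitive(pathways, prob_sets))  # expand_irrig: flattest profile
```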

  5. Selecting and implementing scientific objectives. [for Voyager 1 and 2 planetary encounters

    NASA Technical Reports Server (NTRS)

    Miner, E. D.; Stembridge, C. H.; Doms, P. E.

    1985-01-01

    The procedures used to select and implement scientific objectives for the Voyager 1 and 2 planetary encounters are described. Attention is given to the scientific tradeoffs and engineering considerations that must be addressed at various stages of the mission planning process, including the limitations of ground and spacecraft communications systems, ageing of instruments in flight, and instrument calibration over long distances. The contribution of planetary science workshops to the definition of scientific objectives for deep space missions is emphasized.

  6. Communicating Uncertainty about Climate Change for Application to Security Risk Management

    NASA Astrophysics Data System (ADS)

    Gulledge, J. M.

    2011-12-01

    The science of climate change has convincingly demonstrated that human activities, including the release of greenhouse gases, land-surface changes, particle emissions, and redistribution of water, are changing global and regional climates. Consequently, key institutions are now concerned about the potential social impacts of climate change. For example, the 2010 Quadrennial Defense Review Report from the U.S. Department of Defense states that "climate change, energy security, and economic stability are inextricably linked." Meanwhile, insured losses from climate and weather-related natural disasters have risen dramatically over the past thirty years. Although these losses stem largely from socioeconomic trends, insurers are concerned that climate change could exacerbate this trend and render certain types of climate risk non-diversifiable. At the same time, the climate science community-broadly defined as physical, biological, and social scientists focused on some aspect of climate change-remains largely focused on scholarly activities that are valued in the academy but not especially useful to decision makers. On the other hand, climate scientists who engage in policy discussions have generally permitted vested interests who support or oppose climate policies to frame the discussion of climate science within the policy arena. Such discussions focus on whether scientific uncertainties are sufficiently resolved to justify policy, and the vested interests overstate or understate key uncertainties to support their own agendas. Consequently, the scientific community has become absorbed in defending scientific findings to the near exclusion of developing novel tools to aid in risk-based decision-making. For example, the Intergovernmental Panel on Climate Change (IPCC), established expressly for the purpose of informing governments, has largely been engaged in attempts to reduce unavoidable uncertainties rather than helping the world's governments define a science-based risk

  7. Robustness of Reconstructed Ancestral Protein Functions to Statistical Uncertainty.

    PubMed

    Eick, Geeta N; Bridgham, Jamie T; Anderson, Douglas P; Harms, Michael J; Thornton, Joseph W

    2017-02-01

    Hypotheses about the functions of ancient proteins and the effects of historical mutations on them are often tested using ancestral protein reconstruction (APR)-phylogenetic inference of ancestral sequences followed by synthesis and experimental characterization. Usually, some sequence sites are ambiguously reconstructed, with two or more statistically plausible states. The extent to which the inferred functions and mutational effects are robust to uncertainty about the ancestral sequence has not been studied systematically. To address this issue, we reconstructed ancestral proteins in three domain families that have different functions, architectures, and degrees of uncertainty; we then experimentally characterized the functional robustness of these proteins when uncertainty was incorporated using several approaches, including sampling amino acid states from the posterior distribution at each site and incorporating the alternative amino acid state at every ambiguous site in the sequence into a single "worst plausible case" protein. In every case, qualitative conclusions about the ancestral proteins' functions and the effects of key historical mutations were robust to sequence uncertainty, with similar functions observed even when scores of alternate amino acids were incorporated. There was some variation in quantitative descriptors of function among plausible sequences, suggesting that experimentally characterizing robustness is particularly important when quantitative estimates of ancient biochemical parameters are desired. The worst plausible case method appears to provide an efficient strategy for characterizing the functional robustness of ancestral proteins to large amounts of sequence uncertainty. Sampling from the posterior distribution sometimes produced artifactually nonfunctional proteins for sequences reconstructed with substantial ambiguity. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and
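    The "worst plausible case" construction can be sketched as follows, using invented posterior probabilities rather than the paper's reconstructions: at each site the maximum a posteriori state is kept unless an alternative state exceeds a plausibility cutoff, in which case the alternative is swapped in, yielding a single maximally altered but still plausible sequence.

```python
# Invented per-site posterior probabilities over amino acid states
# (not the paper's data); the cutoff value is likewise an assumption.
posteriors = [
    {"A": 0.99, "G": 0.01},             # unambiguous site
    {"L": 0.70, "M": 0.28, "V": 0.02},  # ambiguous: M is a plausible alternative
    {"K": 0.55, "R": 0.45},             # ambiguous: R is a plausible alternative
]
PLAUSIBLE = 0.20  # cutoff for an alternative state to count as plausible

def ml_sequence(sites):
    """Maximum a posteriori sequence: the best-supported state at every site."""
    return "".join(max(p, key=p.get) for p in sites)

def worst_plausible_case(sites, cutoff=PLAUSIBLE):
    """Swap in the best-supported *alternative* state wherever it is plausible."""
    seq = []
    for p in sites:
        ranked = sorted(p, key=p.get, reverse=True)
        if len(ranked) > 1 and p[ranked[1]] >= cutoff:
            seq.append(ranked[1])  # ambiguous site: take the alternative state
        else:
            seq.append(ranked[0])  # unambiguous site: keep the MAP state
    return "".join(seq)

print(ml_sequence(posteriors))           # ALK
print(worst_plausible_case(posteriors))  # AMR
```

    Experimentally characterizing both sequences brackets the functional consequences of the reconstruction's ambiguity in a single extra experiment, rather than one per sampled sequence.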

  8. Robust guaranteed cost tracking control of quadrotor UAV with uncertainties.

    PubMed

    Xu, Zhiwei; Nian, Xiaohong; Wang, Haibo; Chen, Yinsheng

    2017-07-01

    In this paper, a robust guaranteed cost controller (RGCC) is proposed for a quadrotor UAV system with uncertainties to address the set-point tracking problem. A sufficient condition for the existence of the RGCC is derived via the Lyapunov stability theorem. The designed RGCC not only guarantees that the whole closed-loop system is asymptotically stable but also gives the quadratic performance level built for the closed-loop system an upper bound irrespective of all admissible parameter uncertainties. Then, an optimal robust guaranteed cost controller is developed to minimize the upper bound of the performance level. Simulation results verify that the presented control algorithms possess small overshoot and short settling time, with which the quadrotor is able to perform the set-point tracking task well. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Scientometric Analysis and Mapping of Scientific Articles on Diabetic Retinopathy.

    PubMed

    Ramin, Shahrokh; Gharebaghi, Reza; Heidary, Fatemeh

    2015-01-01

    Diabetic retinopathy (DR) is the major cause of blindness among the working-age population globally. No systematic research has previously been performed to analyze the research published on DR, despite the need for it. This study aimed to analyze the scientific production on DR to draw an overall roadmap for strategic planning of future research in this field. A bibliometric method was used to obtain a view of the scientific production on DR from data extracted from the Institute for Scientific Information (ISI). Articles about DR published in 1993-2013 were analyzed to obtain a view of the topic's structure and history, and to document relationships. The trends in the most influential publications and authors were analyzed. Most highly cited articles addressed epidemiologic and translational research topics in this field. During the past 3 years, there has been a trend toward biomarker discovery and more molecular translational research. Areas such as gene therapy and micro-RNAs are also among the recent hot topics. Through analyzing the characteristics of papers and the trends in scientific production, we performed the first scientometric report on DR. The most influential articles have addressed epidemiology and translational research subjects in this field, reflecting that earlier diagnosis and treatment of this devastating disease still has the highest global priority.

  10. Honoring Our Ethical Origins: Scientific Integrity and Geoethics, Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Gundersen, L. C.

    2017-12-01

    Current ethics policy owes much of its origins to Aristotle and his writings on virtue - including the idea that if we understand and rationally practice virtue and excellence, we will be our best selves. From this humble beginning emerged a number of more complex, ever evolving, ethical theories. The Hippocratic Oath and the atrocities of World War II resulted in the roots of scientific integrity through the Nuremberg Code and the Belmont Report, which set ethical rules for human experimentation, including respect, beneficence, and justice. These roots produced bioethics, medical ethics, environmental ethics, and geoethics. Geoethics has its origins in Europe and is being embraced in the U.S.A. It needs a respected place in the geoscience curriculum, especially as we face the global challenges of climate change and sustainability. Modern scientific integrity in the U.S.A., where research misconduct is defined as fabrication, falsification, and plagiarism, was derived from efforts of the 1980s through 1990s by the Nat'l Institutes of Health and Nat'l Academy of Sciences (NAS). This definition of misconduct has remained an immovable standard, excluding anything not of the scientific process, such as personal behaviors within the research environment. Modern scientific integrity codes and reports such as the Singapore Statement, the NAS' Fostering Integrity in Research, and current federal agency policies, provide standards of behavior to aspire to, and acknowledge the deleterious effects of certain behaviors and practices, but still hesitate to include them in formal definitions of research misconduct. Modern media is holding a mirror to what is happening in the research environment. There are conflicts of interest, misrepresentations of data and uncertainty, discrimination, harassment, bullying, misuse of funds, withholding of data and code, intellectual theft, and a host of others, that are having a serious detrimental effect on science. 
For science to have its best

  11. A Comparative Study on Scientific Misconduct between Korean and Japanese Science Gifted Students

    ERIC Educational Resources Information Center

    Lee, Jiwon; Kim, Jung Bog; Isozaki, Tetsuo

    2017-01-01

    The scientific integrity, perceptions of scientific misconduct, and students' needs in the research ethics education of Korean and Japanese gifted students were analyzed to address three questions. First, how well do students practice research ethics in their research? Second, how do students perceive scientists' misconduct? Third, do students…

  12. A Scientific World in a Grain of Sand

    ERIC Educational Resources Information Center

    Clary, Renee; Wandersee, James

    2011-01-01

    Students investigate local sand samples on a shoestring budget. This investigation reveals a fascinating Earth history that can address various interdisciplinary scientific topics, provide rich inquiry experiences, and move beyond the science classroom to integrate history, culture, and art. (Contains 3 figures and 14 online resources.)

  13. Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'

    NASA Astrophysics Data System (ADS)

    Casola, J. H.; Huber, D.

    2013-12-01

    Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision

  14. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews were drawn from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range, 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  15. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable

  16. On the uncertainty of interdisciplinarity measurements due to incomplete bibliographic data.

    PubMed

    Calatrava Moreno, María Del Carmen; Auzinger, Thomas; Werthner, Hannes

    The accuracy of interdisciplinarity measurements is directly related to the quality of the underlying bibliographic data. Existing indicators of interdisciplinarity are not capable of reflecting the inaccuracies introduced by incorrect and incomplete records because correct and complete bibliographic data can rarely be obtained. This is the case for the Rao-Stirling index, which cannot handle references that are not categorized into disciplinary fields. We introduce a method that addresses this problem. It extends the Rao-Stirling index to acknowledge missing data by calculating its interval of uncertainty using computational optimization. The evaluation of our method indicates that the uncertainty interval is not only useful for estimating the inaccuracy of interdisciplinarity measurements, but it also delivers slightly more accurate aggregated interdisciplinarity measurements than the Rao-Stirling index.
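    The idea of bounding the index over all possible categorizations of the missing references can be sketched as follows. The fields, distances, and counts are invented, and the exhaustive enumeration here stands in for the computational optimization used in the paper (enumeration is only feasible for a handful of uncategorized references).

```python
from itertools import product

# Rao-Stirling diversity: 2 * sum over unordered field pairs of p_i * p_j * d_ij,
# where p_i is the share of references in field i and d_ij a field distance.
# Invented fields and distances (not a real taxonomy):
fields = ["bio", "cs", "math"]
dist = {("bio", "cs"): 0.8, ("bio", "math"): 0.9, ("cs", "math"): 0.4}

def rao_stirling(counts):
    total = sum(counts.values())
    return 2 * sum((counts[i] / total) * (counts[j] / total) * d
                   for (i, j), d in dist.items())

def uncertainty_interval(counts, n_missing):
    """Bound the index over every possible assignment of uncategorized refs."""
    scores = []
    for assignment in product(fields, repeat=n_missing):
        c = dict(counts)
        for f in assignment:
            c[f] += 1
        scores.append(rao_stirling(c))
    return min(scores), max(scores)

known = {"bio": 5, "cs": 3, "math": 2}  # categorized references
lo, hi = uncertainty_interval(known, n_missing=2)
print(f"Rao-Stirling interval: [{lo:.3f}, {hi:.3f}]")
```

    The width of the resulting interval indicates how much the missing bibliographic data could, at worst, distort the interdisciplinarity measurement.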

  17. Niches, models, and climate change: Assessing the assumptions and uncertainties

    PubMed Central

    Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.

    2009-01-01

    As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750

  18. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    PubMed

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  19. The contamination of scientific literature: looking for an antidote

    NASA Astrophysics Data System (ADS)

    Liotta, Marcello

    2017-04-01

    Science may have very strong implications for society. Knowledge of the processes occurring around society represents a good opportunity to take responsible decisions. This is particularly true in the field of geosciences. Earthquakes, volcanic eruptions, landslides, climate change and many other natural phenomena still need to be further investigated. The role of the scientific community is to increase this knowledge. Each member can share his own ideas and data, thus allowing the entire scientific community to receive a precious contribution, one that often derives from research activities that are expensive in terms of time and resources. Nowadays the sharing of scientific results occurs through publication in scientific journals. Reading the available scientific literature thus represents a unique opportunity to define the state of the art on a specific topic and to direct research activities towards something new. When published results are obtained through a rigorous scientific process, they constitute a solid background to which each member can add his ideas and evidence. By contrast, published results affected by scientific misconduct constitute a labyrinth in which scientists lose their time in the attempt to truly understand natural processes. The normal scientific dialectic should unmask such results, thus avoiding literature contamination and making the scientific framework more stimulating. The scientific community should look for the best practice to reduce the risk of literature contamination.

  20. Conditional uncertainty principle

    NASA Astrophysics Data System (ADS)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, provide a thorough characterization of it in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We then compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
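    For context, the best-known entropic counterpart of such memory-assisted relations is the Berta et al. uncertainty relation (stated here from general knowledge, not from this abstract), which bounds Bob's total uncertainty about two incompatible measurements X and Z on Alice's system A given his quantum memory B:

```latex
H(X|B) + H(Z|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B),
\qquad c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2,
```

    where H(.|.) is the conditional von Neumann entropy and c measures the maximal overlap of the two measurement bases; H(A|B) can be negative for entangled states, which lowers the bound.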

  1. Scientific Synergy between LSST and Euclid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  2. Scientific Synergy between LSST and Euclid

    DOE PAGES

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; ...

    2017-12-07

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We also provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  3. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed by comparing these maps with a reference database intended to represent the "real" land cover, with the comparison reported to end users through thematic accuracy measures derived from confusion matrices. However, these reference databases are themselves representations of reality and contain errors arising from human uncertainty in assigning the land cover class that best characterizes a given area, which biases the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that integrates the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that this uncertainty may have on the thematic accuracy measures reported to end users. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we study the utility of fuzzy set theory, more precisely fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases and of its impact on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values, transformed into fuzzy intervals that address the uncertainty in the elaboration of reference databases, were used to compute fuzzy confusion matrices. The proposed methodology is illustrated with a case study assessing the accuracy of a land cover map of Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) imagery. The results demonstrate that including human uncertainty in reference databases provides much more information about the quality of land cover maps than the traditional approach to accuracy assessment.
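    The interval idea behind the fuzzy confusion matrices can be sketched numerically. Below is a minimal, hypothetical illustration (not from the dissertation): each confusion-matrix cell carries an interval of plausible counts, standing in for a linguistic confidence value, and overall accuracy is propagated as an interval. The function, class layout, and counts are all invented.

    ```python
    # Hypothetical illustration: interval-valued overall accuracy from a
    # "fuzzy" confusion matrix. Each cell is a (lo, hi) interval of counts
    # standing in for a linguistic confidence value; all numbers invented.

    def interval_accuracy(matrix):
        """matrix[i][j] = (lo, hi) count of samples mapped to class i with
        reference class j. Returns a (lo, hi) interval for overall accuracy."""
        n = len(matrix)
        diag_lo = sum(matrix[i][i][0] for i in range(n))
        diag_hi = sum(matrix[i][i][1] for i in range(n))
        off_lo = sum(matrix[i][j][0] for i in range(n) for j in range(n) if i != j)
        off_hi = sum(matrix[i][j][1] for i in range(n) for j in range(n) if i != j)
        # Accuracy rises with diagonal counts and falls with off-diagonal ones,
        # so the extremes pair diag_lo with off_hi and diag_hi with off_lo.
        return diag_lo / (diag_lo + off_hi), diag_hi / (diag_hi + off_lo)

    # Two-class example: confident cells get tight intervals, hesitant ones wide.
    m = [[(40, 50), (3, 8)],
         [(2, 6), (35, 45)]]
    lo, hi = interval_accuracy(m)
    print(f"overall accuracy in [{lo:.3f}, {hi:.3f}]")
    ```

    A wide output interval signals that interpreter hesitation, not just map error, limits how precisely accuracy can be reported.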

  4. A Formula for Fixing Troubled Projects: The Scientific Method Meets Leadership

    NASA Technical Reports Server (NTRS)

    Wagner, Sandra

    2006-01-01

    This presentation focuses on project management, specifically addressing project issues using the scientific method of problem-solving. Two sample projects where this methodology has been applied are provided.

  5. Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking

    NASA Astrophysics Data System (ADS)

    Groves, D. G.; Lempert, R.

    2008-12-01

    Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options is so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by using screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM can provide a useful way of addressing climate change uncertainty and identifying robust adaptation strategies.
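    The core RDM screening step, scoring each strategy against many plausible futures and preferring the one with the smallest worst-case regret, can be sketched in a few lines. The strategies, scenarios, and costs below are invented placeholders, not results from the Inland Empire study.

    ```python
    # RDM-style screening: pick the strategy minimizing worst-case regret
    # across scenarios. Strategy names, scenarios, and costs are invented.

    def minimax_regret(costs):
        """costs[strategy] = list of costs, one per scenario. Returns the
        strategy whose worst regret (cost above the scenario's best) is smallest."""
        n_scenarios = len(next(iter(costs.values())))
        best = [min(costs[s][k] for s in costs) for k in range(n_scenarios)]
        regret = {s: max(costs[s][k] - best[k] for k in range(n_scenarios))
                  for s in costs}
        return min(regret, key=regret.get)

    costs = {
        "status quo":   [10, 40, 90],   # cheap unless dry futures hit
        "recycle+bank": [25, 30, 35],   # moderate cost in every future
        "big pipeline": [60, 55, 50],   # expensive insurance
    }
    print(minimax_regret(costs))        # → recycle+bank
    ```

    The point of the exercise is the same as in the talk: a strategy that is never optimal can still be the robust choice once deep uncertainty is taken seriously.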

  6. 78 FR 6056 - Smokeless Tobacco Product Warning Statements; Request for Comments and Scientific Evidence

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... by scientific evidence, regarding what changes to the smokeless tobacco product warnings, if any... scientific evidence, regarding what changes, if any, to the smokeless tobacco product warnings would promote... supporting evidence should address how any changes in the warnings would affect both users' and nonusers...

  7. 77 FR 21158 - VA Directive 0005 on Scientific Integrity: Availability for Review and Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-09

    ... the Director, Office of Science and Technology Policy's Memorandum of December 17, 2010, on scientific integrity. It addresses how VA ensures quality science in its methods, review, policy application, and...: Background The Presidential Memorandum on Scientific Integrity and the Office of Science and Technology...

  8. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goal of the effort is the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum with each component of the inner sum (an individual CCDF
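    The propagation-of-uncertainty step, c) above, can be sketched in its simplest Monte Carlo form: draw parameters from their pdfs, run the model, and summarize the spread of the prediction. The one-line "model" below is a deliberately trivial stand-in (assumed lognormal conductivity and Gaussian gradient), not the Hanford site-wide model.

    ```python
    # Monte Carlo propagation of parameter uncertainty through a toy model.
    # Parameter distributions and the model itself are illustrative only.

    import random
    import statistics

    random.seed(1)

    def model(conductivity, gradient):
        return conductivity * gradient          # toy Darcy-flux stand-in

    draws = [model(random.lognormvariate(0.0, 0.5),   # uncertain K
                   random.gauss(0.01, 0.002))          # uncertain gradient
             for _ in range(20000)]

    print(round(statistics.mean(draws), 4),
          round(statistics.stdev(draws), 4))
    ```

    The sample mean and standard deviation of `draws` estimate the center and spread of the prediction; in practice one would report the full empirical distribution (e.g., a CCDF) rather than two moments.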

  9. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  10. Searching for scientific literacy and critical pedagogy in socioscientific curricula: A critical discourse analysis

    NASA Astrophysics Data System (ADS)

    Cummings, Kristina M.

    The omnipresence of science and technology in our society requires the development of a critical and scientifically literate citizenry. However, the inclusion of socioscientific issues, which are open-ended controversial issues informed by both science and societal factors such as politics, economics, and ethics, does not guarantee the development of these skills. The purpose of this critical discourse analysis is to identify and analyze the discursive strategies used in intermediate science texts and curricula that address socioscientific topics, and the extent to which the discourses are designed to promote or suppress the development of scientific literacy and a critical pedagogy. Three curricula that address the issue of energy and climate change were analyzed using Gee's (2011) building tasks and inquiry tools. The curricula were written by an education organization entitled PreSEES, a corporate-sponsored group called NEED, and a non-profit organization named Oxfam. The analysis found that the PreSEES and Oxfam curricula elevated the significance of climate change, while the NEED curriculum deemphasized the issue. The PreSEES and Oxfam curricula promoted the development of scientific literacy, while the NEED curriculum suppressed its development. The PreSEES and Oxfam curricula both promoted the development of critical pedagogy; however, only the Oxfam curriculum provided authentic opportunities to enact sociopolitical change. The NEED curriculum suppressed the development of critical pedagogy. From these findings, the following conclusions were drawn. When socioscientific issues are presented with the development of scientific literacy and critical pedagogy, the curricula allow students to develop fact-based opinions about the issue. However, curricula that address socioscientific issues without the inclusion of these skills minimize the significance of the issue and normalize the hegemonic worldview promoted by the curricula's authors. Based on these findings

  11. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are based either on a Doppler or on a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
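    The qualitative scaling described in the abstract can be written compactly. This is a paraphrase, with our own notation: sigma_v for the velocity uncertainty, v for the velocity, and P_s for the scattered light power; the proportionality constant depends on the optical setup.

    ```latex
    % Fundamental velocity-uncertainty limit, qualitative scaling only
    \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}
    ```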

  12. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.
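    A minimal sketch of how such contributions are typically combined: relative standard uncertainties added in quadrature, with counting statistics following Poisson behavior. The numeric values are placeholders, not values from the paper.

    ```python
    # Combining counting and gas-handling uncertainty contributions in
    # quadrature. All numbers below are illustrative placeholders.

    import math

    counts = 250_000                       # gross counts in the live time
    u_counting = 1 / math.sqrt(counts)     # relative Poisson uncertainty
    u_spectrum = 0.004                     # spectrum analysis, relative
    u_electronics = 0.002                  # electronics, relative
    u_gas_handling = 0.006                 # gas quantities / dilution, relative

    u_total = math.sqrt(u_counting**2 + u_spectrum**2
                        + u_electronics**2 + u_gas_handling**2)
    print(f"combined relative standard uncertainty: {u_total:.4f}")
    ```

    Note that with a quarter-million counts the Poisson term (0.2 %) is already smaller than the assumed gas-handling term, which is why sample preparation dominates the budget in this sketch.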

  13. Competing Discourses of Scientific Identity among Postdoctoral Scholars in the Biomedical Sciences.

    PubMed

    Price, Rebecca M; Kantrowitz-Gordon, Ira; Gordon, Sharona E

    2018-06-01

    The postdoctoral period is generally one of low pay, long hours, and uncertainty about future career options. To better understand how postdocs conceive of their present and future goals, we asked researchers about their scientific identities while they were in their postdoctoral appointments. We used discourse analysis to analyze interviews with 30 scholars from a research-intensive university or nearby research institutions to better understand how their scientific identities influenced their career goals. We identified two primary discourses: bench scientist and principal investigator (PI). The bench scientist discourse is characterized by implementing other people's scientific visions through work in the laboratory and expertise in experimental design and troubleshooting. The PI discourse is characterized by a focus on formulating scientific visions, obtaining funding, and disseminating results through publishing papers and at invited talks. Because these discourses represent beliefs, they can-and do-limit postdocs' understandings of what career opportunities exist and the transferability of skills to different careers. Understanding the bench scientist and PI discourses, and how they interact, is essential for developing and implementing better professional development programs for postdocs.

  14. Web-Based Water Accounting Scenario Platform to Address Uncertainties in Water Resources Management in the Mekong : A Case Study in Ca River Basin, Vietnam

    NASA Astrophysics Data System (ADS)

    Apirumanekul, C.; Purkey, D. R.; Pudashine, J.; Seifollahi-Aghmiuni, S.; Wang, D.; Ate, P.; Meechaiya, C.

    2017-12-01

    Rapid economic development in the Mekong Region is placing pressure on environmental resources. Uncertain changes in land use, increasing urbanization, infrastructure development, shifting migration patterns, and climate risks, combined with scarce water resources, are increasing water demand in various sectors. More appropriate policies, strategies and planning for sustainable water resource management are urgently needed. Over the last five years, Vietnam has experienced more frequent and intense droughts affecting agricultural and domestic water use during the dry season. The Ca River Basin is the third largest river basin in Vietnam, with 35% of its area located in Lao PDR. The delta landscape comprises natural vegetation, forest, paddy fields, farming and urban areas. The Ca River Basin is experiencing ongoing water scarcity that impacts crop production, farming livelihoods and household water consumption. Water scarcity is exacerbated by uncertainties in policy changes (e.g. changes in land use, crop types), basin development (e.g. reservoir construction, urban expansion), and climate change (e.g. changes in rainfall patterns and onset of monsoon). The Water Evaluation And Planning (WEAP) model, with inputs from satellite-based information and institutional data, is used to estimate water supply, water use and water allocation in various sectors (e.g. household, crops, irrigation and flood control) under a wide range of plausible future scenarios in the Ca River Basin. The Web-Based Water Allocation Scenario Platform is an online implementation of the WEAP model structured in terms of a gaming experience. The online game, as an educational tool, helps key agencies relevant to water resources management understand and explore the complexity of the integrated river basin system under a wide range of scenarios. Performance of the different water resources strategies in Ca River Basin (e.g. change of dam operation to address needs in various sectors, construction of dams, changes

  15. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
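    For readers unfamiliar with global sensitivity analysis, the first-order index it rests on, S_i = Var(E[Y|X_i]) / Var(Y), can be estimated crudely by binning on each input, as in this toy sketch. The three-input linear "model" is invented; real scramjet studies use far costlier solvers and more sophisticated estimators.

    ```python
    # Crude first-order variance-based sensitivity indices via binning:
    # S_i ~= Var(E[Y|X_i]) / Var(Y). Toy model, uniform inputs on [0, 1).

    import random
    import statistics

    random.seed(0)

    def model(x1, x2, x3):
        return 4 * x1 + x2 + 0.1 * x3       # x1 should dominate

    N, BINS = 60_000, 40
    xs = [(random.random(), random.random(), random.random()) for _ in range(N)]
    ys = [model(*x) for x in xs]
    var_y = statistics.pvariance(ys)

    def first_order(i):
        bins = [[] for _ in range(BINS)]
        for x, y in zip(xs, ys):
            bins[min(int(x[i] * BINS), BINS - 1)].append(y)
        means = [statistics.fmean(b) for b in bins if b]
        return statistics.pvariance(means) / var_y

    print([round(first_order(i), 2) for i in range(3)])
    ```

    Inputs whose index is near zero (here x3) can be frozen at nominal values, which is exactly the dimension-reduction role sensitivity analysis plays in the paper.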

  16. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  17. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  18. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  19. Full uncertainty quantification of N2O and NO emissions using the biogeochemical model LandscapeDNDC on site and regional scale

    NASA Astrophysics Data System (ADS)

    Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus

    2017-04-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable under various conditions and at various spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practices data. For the MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessment. For the overall uncertainty quantification we assessed the model prediction probability, followed by sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem. Further, we have been able to perform a fully
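    One crude way to contrast such uncertainty sources, assuming flat lists of simulated fluxes are available per source, is to compare the ensemble variances directly. The numbers below are synthetic placeholders, not the study's results, and summing the per-source variances ignores interactions.

    ```python
    # Apportioning spread among uncertainty sources (MU, PU, DU) by
    # comparing per-source ensemble variances. Synthetic numbers only.

    import statistics

    ensembles = {
        "MU (model structure)": [2.1, 2.9],                  # two modules
        "PU (parameters)":      [1.8, 2.2, 2.6, 3.0, 2.4],
        "DU (input data)":      [2.3, 2.5, 2.4, 2.7],
    }

    variances = {k: statistics.pvariance(v) for k, v in ensembles.items()}
    total = sum(variances.values())
    for source, var in variances.items():
        print(f"{source}: {100 * var / total:.0f}% of summed variance")
    ```

    In this sketch the model-structure and parameter terms dominate, which mirrors the kind of conclusion such a partition is meant to support.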

  20. Addressing chronic operational issues at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    Nordin, Tom; Matsuda, Richard

    2016-07-01

    The W. M. Keck Observatory (WMKO) has a good track record of addressing large critical faults which impact observing. Our performance in tracking and correcting chronic minor faults has been mixed, yet this class of problems has a significant negative impact on scientific productivity and staff effectiveness. We have taken steps to address this shortcoming. This paper outlines the creation of a program to identify, categorize and rank these chronic operational issues, track them over time, and develop management options for their resolution. The success of the program at identifying these chronic operational issues and the advantages of dedicating observatory resources to this endeavor are presented.

  1. 76 FR 69712 - U.S. Air Force Scientific Advisory Board; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ...: Meeting notice. SUMMARY: Due to difficulties, beyond the control of the U.S. Air Force Scientific Advisory... Scientific Advisory Board should submit a written statement in accordance with 41 CFR 102-3.140(c) and.... Written statements can be submitted to the Designated Federal Officer at the address detailed below at any...

  2. Theory of Multiple Intelligences: Is It a Scientific Theory?

    ERIC Educational Resources Information Center

    Chen, Jie-Qi

    2004-01-01

    This essay discusses the status of multiple intelligences (MI) theory as a scientific theory by addressing three issues: the empirical evidence Gardner used to establish MI theory, the methodology he employed to validate MI theory, and the purpose or function of MI theory.

  3. The Influence of Group Dynamics on Collaborative Scientific Argumentation

    ERIC Educational Resources Information Center

    Ryu, Suna; Sandoval, William A.

    2015-01-01

    Research has addressed what instructional conditions may inhibit or promote scientific argumentation. Little research, however, has paid attention to interpersonal factors that influence collaborative argumentation. The present study examines the ways interpersonal factors affected group dynamics, which influence the features of collaborative…

  4. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostadin, Damevski

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    PubMed

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
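    A minimal sketch of the measurement uncertainty requirement in practice: an illustrative (invented) budget of relative standard uncertainties combined in quadrature per the GUM approach, then expanded with a coverage factor k = 2 for roughly 95 % coverage.

    ```python
    # Minimal ISO/IEC 17025-style uncertainty budget: combine relative
    # standard uncertainties in quadrature, then expand with k = 2.
    # Contribution names and values are illustrative placeholders.

    import math

    budget_rel = {                 # relative standard uncertainties
        "calibration standard": 0.008,
        "repeatability":        0.005,
        "recovery":             0.010,
        "volume/dilution":      0.003,
    }

    u_c = math.sqrt(sum(u**2 for u in budget_rel.values()))
    U = 2 * u_c                    # expanded uncertainty, k = 2

    result = 42.0                  # measured value, arbitrary units
    print(f"{result} ± {result * U:.1f} (k=2)")
    ```

    Reporting the result together with the expanded uncertainty and the coverage factor is exactly the "fit for purpose" disclosure the standard asks laboratories to make.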

  6. African Scientific Network: A model to enhance scientific research in developing countries

    NASA Astrophysics Data System (ADS)

    Kebede, Abebe

    2002-03-01

    Africa has over 350 higher education institutions with a variety of experiences and priorities. The primary objectives of these institutions are to produce white-collar workers, teachers, and the work force for the mining, textile, and agricultural industries. The state of higher education and scientific research in Africa has been discussed in several conferences. The proposals generated by these conferences advocate structural changes in higher education, North-South institutional linkages, mobilization of the African Diaspora, and funding. We propose a model African Scientific Network that would facilitate and enhance international scientific partnerships between African scientists and their counterparts elsewhere. A recent article by James Lamout (Financial Times, August 2, 2001) indicates that emigration from South Africa alone costs $8.9 billion in lost human resources. The article also stated that every year 23,000 graduates leave Africa for opportunities overseas, mainly in Europe, leaving only 20,000 scientists and engineers to serve over 600 million people. The International Organization for Migration states that the brain drain of highly skilled professionals from Africa is making economic growth and poverty alleviation impossible across the continent. In our model we focus on a possible networking mechanism in which the African Diaspora would play a major role in addressing the financial and human resource needs of higher education in Africa.

  7. Some reflections on the role of the Scientific Advisory Panel to the Marshall Islands nationwide radiological study.

    PubMed

    McEwan, A C; Simon, S L; Baverstock, K F; Trott, K R; Sankaranarayanan, K; Paretzke, H G

    1997-07-01

    As a consequence of the U.S. Atomic Weapons Testing Program in the Trust Territory of the Pacific, now the Republic of the Marshall Islands, numerous scientists have advised the Marshallese on matters of radiation and radioactive contamination. Some of this earlier advice appeared to vary or conflict, creating uncertainty for the people. In a new initiative in 1989, the RMI Government engaged a five-member, multidisciplinary Scientific Advisory Panel to oversee the assessment of, and to advise on, the radiological status of the entire nation. The formation of the Panel was accompanied by the establishment of a Resident Scientist position, and ultimately a small scientific team and laboratory on Majuro. The nationwide radiological study was conducted using ground survey methods over the period 1990-1994. Tasks undertaken by the Panel included formulating reasonable objectives for the study, attempting to establish effective communication and understanding of issues with political leaders, RMI Government agencies, and the people, and advising on and monitoring the scientific integrity of the study itself. The Panel also attempted to initiate investigations addressing matters of concern as they emerged. It faced the problem of providing not only technical guidance on radioactivity and radiation measurements but also explanations of the significance of measured values and of concepts such as risk and probability of health effects, to a diverse but nontechnical audience and generally across cultural and language barriers. The experience of the Panel in providing advice and guidance to the Republic of the Marshall Islands, while unique in many ways, parallels the difficulties experienced elsewhere in communicating information about risks from radiation exposure.

  8. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  9. Protecting Traditional Knowledge Related to Biological Resources: Is Scientific Research Going to Become More Bureaucratized?

    PubMed Central

    Reddy, Prashant; Lakshmikumaran, Malathi

    2015-01-01

    For the past several decades, there has been a world debate on the need for protecting traditional knowledge. A global treaty appears to be a distant reality. Of more immediate concern are the steps taken by the global community to protect access to biological resources in the name of protecting traditional knowledge. The Indian experience with implementing the Convention on Biological Diversity has created substantial legal uncertainty in collaborative scientific research between Indians and foreigners apart from bureaucratizing the entire process of scientific research, especially with regard to filing of applications for intellectual property rights. The issue therefore is whether the world needs to better balance the needs of the scientific community with the rights of those who have access to traditional knowledge. PMID:26101205

  10. Addressing Emerging Risks: Scientific and Regulatory Challenges Associated with Environmentally Persistent Free Radicals.

    PubMed

    Dugas, Tammy R; Lomnicki, Slawomir; Cormier, Stephania A; Dellinger, Barry; Reams, Margaret

    2016-06-08

    Airborne fine and ultrafine particulate matter (PM) is often generated through widely used thermal processes such as the combustion of fuels or the thermal decomposition of waste. Residents near Superfund sites are exposed to PM through the inhalation of windblown dust, ingestion of soil and sediments, and inhalation of emissions from the on-site thermal treatment of contaminated soils. Epidemiological evidence supports a link between exposure to airborne PM and an increased risk of cardiovascular and pulmonary diseases. It is well known that incomplete combustion can lead to the production of organic pollutants that adsorb to the surface of PM. Recent studies have demonstrated that the interaction of these pollutants with metal centers can generate a surface-stabilized metal-radical complex capable of redox cycling to produce reactive oxygen species (ROS). Moreover, these free radicals can persist in the environment, hence their designation as Environmentally Persistent Free Radicals (EPFRs). EPFRs have been demonstrated both in ambient air PM2.5 (diameter < 2.5 µm) and in PM from a variety of combustion sources. Thus, low-temperature thermal treatment of soils can potentially increase the concentration of EPFRs in areas in and around Superfund sites. In this review, we outline the evidence to date supporting EPFR formation and its environmental significance. Furthermore, we address the lack of methodologies for assessing the risks of EPFRs and the challenges associated with regulating this new, emerging contaminant.

  11. Addressing Emerging Risks: Scientific and Regulatory Challenges Associated with Environmentally Persistent Free Radicals

    PubMed Central

    Dugas, Tammy R.; Lomnicki, Slawomir; Cormier, Stephania A.; Dellinger, Barry; Reams, Margaret

    2016-01-01

    Airborne fine and ultrafine particulate matter (PM) is often generated through widely used thermal processes such as the combustion of fuels or the thermal decomposition of waste. Residents near Superfund sites are exposed to PM through the inhalation of windblown dust, ingestion of soil and sediments, and inhalation of emissions from the on-site thermal treatment of contaminated soils. Epidemiological evidence supports a link between exposure to airborne PM and an increased risk of cardiovascular and pulmonary diseases. It is well known that incomplete combustion can lead to the production of organic pollutants that adsorb to the surface of PM. Recent studies have demonstrated that the interaction of these pollutants with metal centers can generate a surface-stabilized metal-radical complex capable of redox cycling to produce reactive oxygen species (ROS). Moreover, these free radicals can persist in the environment, hence their designation as Environmentally Persistent Free Radicals (EPFRs). EPFRs have been demonstrated both in ambient air PM2.5 (diameter < 2.5 µm) and in PM from a variety of combustion sources. Thus, low-temperature thermal treatment of soils can potentially increase the concentration of EPFRs in areas in and around Superfund sites. In this review, we outline the evidence to date supporting EPFR formation and its environmental significance. Furthermore, we address the lack of methodologies for assessing the risks of EPFRs and the challenges associated with regulating this new, emerging contaminant. PMID:27338429

  12. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    PubMed

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  13. Information transduction capacity reduces the uncertainties in annotation-free isoform discovery and quantification

    PubMed Central

    Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong

    2017-01-01

    The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in inferring complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties caused by missing information in a biological system. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of this process. The experimental results obtained from analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. An open-source implementation of our algorithm, MaxInfo, is available. PMID:28911101
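
The channel view described above can be made concrete with a toy calculation: treat isoform identity as the channel input and the observed read class as the output, and measure the mutual information between them. The two-isoform setup and all numbers below are invented for illustration; this is not the MaxInfo algorithm itself.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits for a joint probability matrix p(x, y)."""
    joint = joint / joint.sum()                       # normalize, just in case
    px = joint.sum(axis=1, keepdims=True)             # marginal over isoforms
    py = joint.sum(axis=0, keepdims=True)             # marginal over read classes
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return float(np.nansum(terms))                    # 0 * log 0 treated as 0

# Two isoforms sharing some reads: the ambiguous read class (middle column)
# could come from either isoform, so it carries no information about which one.
joint = np.array([[0.40, 0.10, 0.00],
                  [0.00, 0.10, 0.40]])
print(mutual_information(joint))   # 0.8 bits, vs 1.0 for fully unique reads
```

The gap between this value and the 1 bit of a perfectly distinguishing read set quantifies the uncertainty that ambiguous reads introduce into isoform inference.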

  14. Cosmic Dust Collection Facility: Scientific objectives and programmatic relations

    NASA Technical Reports Server (NTRS)

    Hoerz, Fred (Editor); Brownlee, D. E.; Bunch, T. E.; Grounds, D.; Grun, E.; Rummel, Y.; Quaide, W. L.; Walker, R. M.

    1990-01-01

    The science objectives are summarized for the Cosmic Dust Collection Facility (CDCF) on Space Station Freedom and these objectives are related to ongoing science programs and mission planning within NASA. The purpose is to illustrate the potential of the CDCF project within the broad context of early solar system sciences that emphasize the study of primitive objects in state-of-the-art analytical and experimental laboratories on Earth. Current knowledge about the sources of cosmic dust and their associated orbital dynamics is examined, and the results are reviewed of modern microanalytical investigations of extraterrestrial dust particles collected on Earth. Major areas of scientific inquiry and uncertainty are identified and it is shown how CDCF will contribute to their solution. General facility and instrument concepts that need to be pursued are introduced, and the major development tasks that are needed to attain the scientific objectives of the CDCF project are identified.

  15. West Nile Virus workshop: scientific considerations for tissue donors.

    PubMed

    Brubaker, Scott A; Robert Rigney, P

    2012-08-01

    This report contains selected excerpts, presented as a summary, from a public workshop sponsored by the American Association of Tissue Banks (AATB) held to discuss West Nile Virus (WNV) and scientific considerations for tissue donors. The daylong workshop was held 9 July 2010 at the Ritz-Carlton Hotel at Tyson's Corner in McLean, Virginia, United States (U.S.). The workshop was designed to determine and discuss scientific information that is known, and what is not known, regarding WNV infection and transmission. The goal is to determine how to fill gaps in knowledge of WNV and tissue donation and transplantation by pursuing relevant scientific studies. This information should ultimately support decisions leading to appropriate tissue donor screening and testing considerations. Discussion topics were related to identifying these gaps and determining possible solutions. Workshop participants included subject-matter experts from the U.S. Food and Drug Administration, the Centers for Disease Control and Prevention, U.S. Department of Health and Human Services, Health Canada, the Public Health Agency of Canada, AATB-accredited tissue banks including reproductive tissue banks, accredited eye banks of the Eye Bank Association of America, testing laboratories, and infectious disease and organ transplantation professionals. After all presentations concluded, a panel addressed this question: "What are the scientific considerations for tissue donors and what research could be performed to address those considerations?" The slide presentations from the workshop are available at: http://www.aatb.org/2010-West-Nile-Virus-Workshop-Presentations.

  16. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty for multi-part datasets, and provides no direct way of associating the uncertainty information - metadata in relation to a dataset - with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and integrates UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again. The API's 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follows the UncertML model it can be automatically interpreted and may also support automatic uncertainty propagation. Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty
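
The linking pattern the abstract describes can be sketched with plain Python tuples standing in for an RDF store. The `ex:`/`un:` names and the dataset are hypothetical stand-ins, not the actual UncertProv vocabulary; in practice the navigation below would be a SPARQL query over the published graph.

```python
# RDF-like triples: a dataset entity with PROV-style provenance and an
# attached UncertML-style uncertainty description (all names illustrative).
triples = {
    ("ex:rainfall2014", "rdf:type", "prov:Entity"),
    ("ex:rainfall2014", "prov:wasGeneratedBy", "ex:gaugeNetworkRun"),
    ("ex:rainfall2014", "un:hasUncertainty", "ex:u1"),
    ("ex:u1", "rdf:type", "un:NormalDistribution"),
    ("ex:u1", "un:mean", "0.0"),
    ("ex:u1", "un:standardDeviation", "1.3"),
}

def objects(subject, predicate):
    """All objects o such that (subject, predicate, o) is asserted."""
    return [o for s, p, o in triples if s == subject and p == predicate]

def uncertainty_of(dataset):
    """Navigate from a dataset object to its uncertainty description."""
    result = {}
    for node in objects(dataset, "un:hasUncertainty"):
        for s, p, o in triples:
            if s == node:
                result[p] = o
    return result

print(uncertainty_of("ex:rainfall2014")["un:standardDeviation"])  # "1.3"
```

Because the uncertainty node is itself addressable, a consumer can follow the same links mechanically for every dataset in a provenance chain, which is what makes automated propagation feasible.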

  17. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    PubMed

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m³ min⁻¹ (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min⁻¹ (0.6 CFM), and the EPA TSP Sampler at nominal volumetric flow rates of 1.1 and 1.7 m³ min⁻¹ (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1×10⁻⁶ g m⁻³ to 18.0×10⁻⁶ g m⁻³, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing, and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically. This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate
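
The propagation-of-uncertainty technique applied in the paper can be sketched for the basic gravimetric relation C = Δm/(Q·t), i.e. filter mass gain over sampled air volume. The input values and uncertainties below are invented for illustration and do not reproduce the paper's uncertainty budget.

```python
import math

def tsp_concentration(delta_m, Q, t):
    """Concentration in g/m^3 from mass gain (g), flow (m^3/min), time (min)."""
    return delta_m / (Q * t)

def propagated_uncertainty(delta_m, u_m, Q, u_Q, t, u_t):
    """First-order (GUM-style) combined standard uncertainty of C,
    assuming independent inputs: u_C^2 = sum((dC/dx_i * u_i)^2)."""
    # partial derivatives of C = delta_m / (Q * t)
    dC_dm = 1.0 / (Q * t)
    dC_dQ = -delta_m / (Q**2 * t)
    dC_dt = -delta_m / (Q * t**2)
    return math.sqrt((dC_dm * u_m) ** 2 + (dC_dQ * u_Q) ** 2 + (dC_dt * u_t) ** 2)

# Hypothetical 1-hour high-volume sample: 24.5 mg gain at 1.42 m^3/min.
C = tsp_concentration(0.0245, 1.42, 60.0)
u = propagated_uncertainty(0.0245, 1e-4, 1.42, 0.02, 60.0, 0.1)
print(f"C = {C:.3e} g/m^3, u(C) = {u:.1e} g/m^3")
```

An uncertainty budget then simply tabulates each squared term as a fraction of u²(C), which is how the dominant contributors (here the flow-related terms) are identified.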

  18. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves the description of the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
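
The Bayesian fitting idea can be sketched with a plain Metropolis sampler and a single copula family; MvCAT itself uses a hybrid-evolution MCMC and many copula families, so everything below (synthetic data, Clayton copula, proposal scale, likelihood form) is a simplified stand-in, not the toolbox's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic pseudo-observations with positive dependence (illustration only;
# real inputs would be ranked observations of, e.g., flood peak and volume).
n = 300
x = rng.normal(size=n)
y = 0.8 * x + 0.6 * rng.normal(size=n)
u = (np.argsort(np.argsort(x)) + 1) / (n + 1)   # ranks mapped into (0, 1)
v = (np.argsort(np.argsort(y)) + 1) / (n + 1)

def clayton_cdf(u, v, theta):
    """Clayton copula CDF; theta > 0 gives lower-tail dependence."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

# Empirical copula evaluated at each observed pair.
emp = np.array([np.mean((u <= ui) & (v <= vi)) for ui, vi in zip(u, v)])

def log_post(theta):
    """Residual-based Gaussian log-likelihood with the noise scale
    integrated out (flat prior on theta > 0)."""
    if theta <= 0:
        return -np.inf
    sse = np.sum((emp - clayton_cdf(u, v, theta)) ** 2)
    return -0.5 * n * np.log(sse)

# Plain random-walk Metropolis over the copula parameter.
theta, lp = 1.0, log_post(1.0)
chain = []
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

samples = np.array(chain[1000:])                 # drop burn-in
print(f"theta ~ {samples.mean():.2f} "
      f"(95% CI {np.percentile(samples, 2.5):.2f}-{np.percentile(samples, 97.5):.2f})")
```

The spread of the retained samples is the fitting uncertainty the abstract refers to: with a short record the credible interval widens, flagging that several dependence structures are consistent with the data.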

  19. HIT or miss: the application of health care information technology to managing uncertainty in clinical decision making.

    PubMed

    Kazandjian, Vahé A; Lipitz-Snyderman, Allison

    2011-12-01

    To discuss the usefulness of health care information technology (HIT) in assisting care providers to minimize uncertainty while simultaneously increasing the efficiency of the care provided. The study design is an ongoing examination of HIT, performance measurement (clinical and production efficiency), and their implications for the payment for care. Since 2006, all Maryland hospitals have embarked on a multi-faceted study of performance measures and HIT adoption surveys, which will shape the health care payment model in Maryland, the last of the all-payor states, in 2011. This paper focuses on the HIT component of the Maryland care payment initiative. While the payment model is still under review and discussion, 'appropriateness' of care has been discussed as an important dimension of measurement. Within this dimension, the concept of 'uncertainty' has been identified as associated with variation in care practices. Hence, the methods of this paper define how HIT can assist care providers in addressing the concept of uncertainty, and then provide findings from the first HIT survey in Maryland to infer the readiness of Maryland hospitals to address uncertainty of care, in part through the use of HIT. Maryland hospitals show noteworthy variation in their adoption and use of HIT. While computerized, electronic patient records are not commonly used among and across Maryland hospitals, many of the internal uses of HIT in each hospital could significantly assist in better communication about better practices to minimize uncertainty of care and enhance the efficiency of its production. © 2010 Blackwell Publishing Ltd.

  20. Dealing with Uncertainty in Water Management: Finding the Right Balance Between Risk and Opportunity to Build Trust and Create Value

    NASA Astrophysics Data System (ADS)

    Islam, S.; Susskind, L.

    2012-12-01

    Most difficulties in water management are the product of rigid assumptions about how water ought to be allocated in the face of ever-increasing demand and growing uncertainty. When stakeholders face contending water claims, one of the biggest obstacles to reaching agreement is uncertainty. Specifically, there are three types of uncertainty that need to be addressed: uncertainty of information, uncertainty of action, and uncertainty of perception. All three shape water management decisions. Contrary to traditional approaches, we argue that management of uncertainty needs to include both risks and opportunities. When parties treat water as a flexible rather than a fixed resource, opportunities to create value can be invented. When they use the right processes and mechanisms to enhance trust, even parties in conflict can reach agreements that satisfy their competing water needs and interests simultaneously. Using examples from several boundary-crossing water cases, we show how this balance between risks and opportunities can be found to manage water resources for an uncertain future.