Science.gov

Sample records for address scientific uncertainties

  1. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  2. Optimal regeneration planning for old-growth forest: addressing scientific uncertainty in endangered species recovery through adaptive management

    USGS Publications Warehouse

    Moore, C.T.; Conroy, M.J.

    2006-01-01

    Stochastic and structural uncertainties about forest dynamics present challenges in the management of ephemeral habitat conditions for endangered forest species. Maintaining critical foraging and breeding habitat for the endangered red-cockaded woodpecker (Picoides borealis) requires an uninterrupted supply of old-growth forest. We constructed and optimized a dynamic forest growth model for the Piedmont National Wildlife Refuge (Georgia, USA) with the objective of perpetuating a maximum stream of old-growth forest habitat. Our model accommodates stochastic disturbances and hardwood succession rates, and uncertainty about model structure. We produced a regeneration policy that was indexed by current forest state and by current weight of evidence among alternative model forms. We used adaptive stochastic dynamic programming, which anticipates that model probabilities, as well as forest states, may change through time, with consequent evolution of the optimal decision for any given forest state. In light of considerable uncertainty about forest dynamics, we analyzed a set of competing models incorporating extreme, but plausible, parameter values. Under any of these models, forest silviculture practices currently recommended for the creation of woodpecker habitat are suboptimal. We endorse fully adaptive approaches to the management of endangered species habitats in which predictive modeling, monitoring, and assessment are tightly linked.
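
    The record above describes a regeneration policy indexed by both the forest state and the current weight of evidence among alternative models. As a rough illustration of that adaptive idea (not the authors' model; the two candidate persistence probabilities and the observation sequence below are hypothetical), this sketch shows how Bayesian weights on competing forest-dynamics models shift as stand transitions are observed:

```python
# Two competing models of old-growth persistence (hypothetical numbers):
# each assigns a different probability that an old-growth stand persists
# from one period to the next.
p_stay = {"model_A": 0.90, "model_B": 0.70}

# Current "weight of evidence" on each model, updated as stands are observed.
weight = {"model_A": 0.5, "model_B": 0.5}

def update_weights(weight, persisted):
    """Bayes update of model weights after observing one stand transition."""
    like = {m: (p if persisted else 1.0 - p) for m, p in p_stay.items()}
    total = sum(weight[m] * like[m] for m in weight)
    return {m: weight[m] * like[m] / total for m in weight}

# Five periods of persistence followed by one loss of old-growth condition.
for persisted in [True, True, True, True, True, False]:
    weight = update_weights(weight, persisted)
    print({m: round(w, 3) for m, w in weight.items()})

# An adaptive policy is indexed by (forest state, weight), so the optimal
# regeneration decision can change as these weights evolve through time.
```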

  3. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  4. Programmatic methods for addressing contaminated volume uncertainties.

    SciTech Connect

    DURHAM, L.A.; JOHNSON, R.L.; RIEMAN, C.R.; SPECTOR, H.L.; Environmental Science Division; U.S. ARMY CORPS OF ENGINEERS BUFFALO DISTRICT

    2007-01-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District.

  5. Programmatic methods for addressing contaminated volume uncertainties

    SciTech Connect

    Rieman, C.R.; Spector, H.L.; Durham, L.A.; Johnson, R.L.

    2007-07-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The U.S. Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District. (authors)
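
    One way to make the "explicitly estimating the uncertainty associated with volume estimates" step concrete is simple Monte Carlo propagation. The sketch below is a minimal illustration under assumed distributions; the footprint and depth values are invented, not FUSRAP data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain pre-design characterization (illustrative values only): the
# contaminated footprint and the average excavation depth are distributions,
# not point values.
area_m2 = rng.lognormal(mean=np.log(5_000), sigma=0.3, size=n)   # m^2
depth_m = rng.triangular(left=0.5, mode=1.0, right=2.5, size=n)  # m

volume_m3 = area_m2 * depth_m
lo, med, hi = np.percentile(volume_m3, [5, 50, 95])
print(f"volume: {med:,.0f} m^3 (90% interval {lo:,.0f}-{hi:,.0f} m^3)")

# Budgeting and contracting against an upper percentile, rather than a point
# estimate, guards against the underestimation pattern described above.
```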

  6. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685
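
    A stylized calculation shows the mechanism the experiments isolate: a known threshold creates a sharp payoff discontinuity worth coordinating on, while uncertainty about the threshold's location smooths expected losses and revives free-riding. All numbers below are illustrative, not the paper's experimental parameters:

```python
import numpy as np

# Countries jointly choose total abatement Q at unit cost c; catastrophe
# costing L occurs if Q falls short of a threshold T (all values invented).
L, c = 100.0, 1.0

def p_catastrophe(Q, T_low, T_high):
    """P(Q < T) when T is uniform on [T_low, T_high] (degenerate if equal)."""
    if T_high == T_low:
        return 1.0 if Q < T_low else 0.0
    return float(np.clip((T_high - Q) / (T_high - T_low), 0.0, 1.0))

for Q in [40, 50, 60, 70, 80]:
    known = p_catastrophe(Q, 60, 60) * L + c * Q      # threshold known: T = 60
    unknown = p_catastrophe(Q, 40, 80) * L + c * Q    # location uncertain
    print(f"Q={Q}: expected cost known={known:.0f}, uncertain={unknown:.0f}")

# With T known, total cost drops discontinuously at Q = 60, so meeting the
# threshold is a focal point worth coordinating on. With T uniform on
# [40, 80], each abatement unit buys only L/40 = 2.5 in expected avoided
# loss, shared across all countries -- so individual free-riding returns.
```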

  7. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases, on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa, show how the relative utility of capacity vs. impact approaches to adaptation planning differs with the level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  8. Addressing submarine geohazards through scientific drilling

    NASA Astrophysics Data System (ADS)

    Camerlenghi, A.

    2009-04-01

    Natural submarine geohazards (earthquakes, volcanic eruptions, landslides, volcanic island flank collapses) are geological phenomena originating at or below the seafloor leading to a situation of risk for off-shore and on-shore structures and the coastal population. Addressing submarine geohazards means understanding their spatial and temporal variability, the pre-conditioning factors, their triggers, and the physical processes that control their evolution. Such scientific endeavour is nowadays considered by a large sector of the international scientific community to be an obligation, in order to contribute to the mitigation of the potentially destructive societal effects of submarine geohazards. The study of submarine geohazards requires a multi-disciplinary scientific approach: geohazards must be studied through their geological record; active processes must be monitored; geohazard evolution must be modelled. Ultimately, the information must be used for the assessment of vulnerability, risk analysis, and development of mitigation strategies. In contrast with the terrestrial environment, the oceanic environment is rather hostile to the widespread and rapid application of high-resolution remote sensing techniques, to visual inspection, and to sampling and the installation of monitoring stations. Scientific drilling through the IODP (including the related pre-site-survey investigations, sampling, logging and in situ measurement capabilities, and its role as a platform for the deployment of long-term observatories at the surface and down-hole) can be viewed as the centre of gravity of an international, coordinated, multi-disciplinary scientific approach to address submarine geohazards. The IODP Initial Science Plan, expiring in 2013, does not openly address geohazards among the program's scientific objectives. Hazards are referred to mainly in relation to earthquakes and initiatives towards the understanding of seismogenesis. Notably, the only drilling initiative presently under way is the…

  9. Presidential address: Experimenting with the scientific past.

    PubMed

    Radick, Gregory

    2016-06-01

    When it comes to knowledge about the scientific pasts that might have been - the so-called 'counterfactual' history of science - historians can either debate its possibility or get on with the job. Taking the latter course means re-engaging with some of the most general questions about science. It can also lead to fresh insights into why particular episodes unfolded as they did and not otherwise. Drawing on recent research into the controversy over Mendelism in the early twentieth century, this address reports and reflects on a novel teaching experiment conducted in order to find out what biology and its students might be like now had the controversy gone differently. The results suggest a number of new options: for the collection of evidence about the counterfactual scientific past, for the development of collaborations between historians of science and science educators, for the cultivation of more productive relationships between scientists and their forebears, and for heightened self-awareness about the curiously counterfactual business of being historical. PMID:27353945

  10. Addressing contrasting cognitive models in scientific collaboration

    NASA Astrophysics Data System (ADS)

    Diviacco, P.

    2012-04-01

    Although the social aspects of scientific communities and their internal dynamics are starting to be recognized and acknowledged in the everyday lives of scientists, it is rather difficult for them to find tools that support their activities consistently with this perspective. Issues span from gathering researchers to mutual awareness, and from information sharing to building meaning, with the last being particularly critical in research fields such as the geosciences, which deal with the reconstruction of unique, often non-reproducible, and contingent processes. Reasoning here is, in fact, mainly abductive, allowing multiple and concurrent explanations for the same phenomenon to coexist. Scientists favour one hypothesis over another not only on strictly logical but also on sociological grounds. Following a vision, scientists tend to evolve and isolate themselves from other scientists, creating communities characterized by different cognitive models, so that after some time these become incompatible and scientists stop understanding each other. We address these problems as a communication issue, so that the classic distinction into three levels (syntactic, semantic and pragmatic) can be used. At the syntactic level, we highlight non-technical obstacles that condition interoperability and data availability and transparency. At the semantic level, possible incompatibilities of cognitive models are particularly evident, so that, using ontologies, cross-domain reconciliation should be applied. This is a very difficult task to perform, since the projection of knowledge by scientists in the designated community is political and thus can create a lot of tension. The strategy we propose to overcome these issues pertains to pragmatics, in the sense that it is intended to acknowledge the cultural and personal factors each partner brings into the collaboration, and is based on the idea that meaning should remain a flexible and contingent representation of possibly divergent views…

  11. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic, stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic, stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual-continuum rock properties.
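
    A minimal sketch of the simulation machinery described above (a Gaussian random field for matrix porosity plus a regression mapping to hydraulic conductivity) is given below. The covariance model, transect, and regression coefficients are invented for illustration and are not the Yucca Mountain values:

```python
import numpy as np

rng = np.random.default_rng(0)

# One realization of matrix porosity along a vertical transect, modeled as a
# Gaussian field with exponential covariance (all parameters invented).
z = np.arange(0.0, 100.0, 2.0)            # depth points (m)
corr_len, mean_phi, sd_phi = 15.0, 0.12, 0.03
C = sd_phi**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
L = np.linalg.cholesky(C + 1e-12 * np.eye(z.size))
porosity = mean_phi + L @ rng.standard_normal(z.size)

# Hypothetical regression linking porosity to log10 hydraulic conductivity,
# with residual scatter so conductivity is not a deterministic function.
a, b, sd_res = 20.0, -9.0, 0.4
log10_K = a * porosity + b + sd_res * rng.standard_normal(z.size)

print(porosity[:4].round(3), log10_K[:4].round(2))

# Repeating this gives an ensemble of property realizations whose spread
# expresses spatial uncertainty; conditioning on borehole data and on the
# deterministic stratigraphic model would constrain the realizations further.
```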

  12. Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects

    SciTech Connect

    Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart

    2005-03-01

    Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, and wave energies, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is admitting that these uncertainties exist, and addressing as many of them as possible with planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project reaches its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties. Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration…

  13. Scientific Uncertainty and Its Relevance to Science Education

    ERIC Educational Resources Information Center

    Ruggeri, Nancy Lee

    2011-01-01

    Uncertainty is inherent to scientific methods and practices, yet it is rarely explicitly discussed in science classrooms. Ironically, science is often equated with "certainty" in these contexts. Uncertainties that arise in science deserve special attention, as they are increasingly a part of public discussions and are susceptible to manipulation.…

  14. Communication about scientific uncertainty in environmental nanoparticle research - a comparison of scientific literature and mass media

    NASA Astrophysics Data System (ADS)

    Heidmann, Ilona; Milde, Jutta

    2014-05-01

    Research on the fate and behavior of engineered nanoparticles in the environment is, despite their wide applications, still in its early stages. 'There is a high level of scientific uncertainty in nanoparticle research' is often stated in the scientific community. Knowledge about these uncertainties might be of interest to other scientists, experts and laymen. But how can these uncertainties be characterized, and are they communicated within the scientific literature and the mass media? To answer these questions, the current state of scientific knowledge about scientific uncertainty was characterized through the example of environmental nanoparticle research, and the communication of these uncertainties within the scientific literature was compared with its media coverage in the field of nanotechnologies. The scientific uncertainty within the field of the environmental fate of nanoparticles is characterized by method uncertainties, by a general lack of data concerning the fate and effects of nanoparticles and their mechanisms in the environment, and by the uncertain transferability of results to the environmental system. In the scientific literature, scientific uncertainties, their sources, and consequences are mentioned with different foci and to a different extent. As expected, the authors of research papers focus on the certainty of specific results within their specific research question, whereas in review papers the uncertainties due to a general lack of data are emphasized and the sources and consequences are discussed in a broader environmental context. In the mass media, nanotechnology is often framed as rather certain, and positive aspects and benefits are emphasized. Although the reports concern a new technology, scientific uncertainties are mentioned in only one-third of them. Scientific uncertainties are most often mentioned together with risk, and they arise primarily from unknown harmful effects to human health. Environmental issues themselves are seldom mentioned…

  15. Addressing Unconscious Bias: Steps toward an Inclusive Scientific Culture

    NASA Astrophysics Data System (ADS)

    Stewart, Abigail

    2011-01-01

    In this talk I will outline the nature of unconscious bias, as it operates to exclude or marginalize some participants in the scientific community. I will show how bias results from non-conscious expectations about certain groups of people, including scientists and astronomers. I will outline scientific research in psychology, sociology and economics that has identified the impact these expectations have on the interpersonal judgments that are at the heart of assessing individuals' qualifications. This research helps us understand not only how bias operates within a single instance of evaluation, but how evaluation bias can accumulate over a career if not checked, creating an appearance of confirmation of biased expectations. Some research has focused on how best to interrupt and mitigate unconscious bias, and many institutions--including the University of Michigan--have identified strategic interventions at key points of institutional decision-making (particularly hiring, annual review, and promotion) that can make a difference. The NSF ADVANCE Institutional Transformation program encouraged institutions to draw on the social science literature to create experimental approaches to addressing unconscious bias. I will outline four approaches to intervention that have arisen through the ADVANCE program: (1) systematic education that increases awareness among decision makers of how evaluation bias operates; (2) development of practices that mitigate the operation of bias even when it is out of conscious awareness; (3) creation of institutional policies that routinize and sanction these practices; and (4) holding leaders accountable for the implementation of these new practices and policies. Although I will focus on ways to address unconscious bias within scientific institutions (colleges and universities, laboratories and research centers, etc.), I will close by considering how scientific organizations can address unconscious bias and contribute to creating an…

  16. 'Scientific uncertainty' scuttles new acid rain standard

    SciTech Connect

    Renner, R.

    1995-10-01

    An EPA report to Congress due this month addresses the controversial question of whether the Clean Air Act Amendments of 1990 (CAAA) adequately protect sensitive areas of the United States from acid rain, and recommends against establishing a new 'acid deposition standard' to protect those areas. Rebecca Renner reports on the scientific issues underlying that decision and the efforts of one state to overturn it. The CAAA required the Agency to report on the feasibility of setting an acid deposition standard to protect sensitive areas. EPA missed the original 1993 deadline and is under court order to issue the final report by October 15. The draft report identifies the lakes and streams of the Appalachian mountains as sensitive resources that receive damaging concentrations of acidic deposition. Three areas where sensitive water resources have been well studied were selected as providing the best available data for modeling case studies: the Adirondacks; the mid-Appalachian region, including parts of Pennsylvania, West Virginia, Maryland, and Virginia; and the Southern Blue Ridge in Tennessee, North Carolina, and Georgia. Results are discussed. 6 refs.

  17. Flood Risk, Uncertainty, and Scientific Information for Decision Making: Lessons from an Interdisciplinary Project.

    NASA Astrophysics Data System (ADS)

    Morss, Rebecca E.; Wilhelmi, Olga V.; Downton, Mary W.; Gruntfest, Eve

    2005-11-01

    The magnitude of flood damage in the United States, combined with the uncertainty in current estimates of flood risk, suggest that society could benefit from improved scientific information about flood risk. To help address this perceived need, a group of researchers initiated an interdisciplinary study of climate variability, scientific uncertainty, and hydrometeorological information for flood-risk decision making, focused on Colorado's Rocky Mountain Front Range urban corridor. We began by investigating scientific research directions that were likely to benefit flood-risk estimation and management, through consultation with climatologists, hydrologists, engineers, and planners. In doing so, we identified several challenges involved in generating new scientific information to aid flood management in the presence of significant scientific and societal uncertainty. This essay presents lessons learned from this study, along with our observations on the complex interactions among scientific information, uncertainty, and societal decision making. It closes by proposing a modification to the "end to end" approach to conducting societally relevant scientific research. Although we illustrate points using examples from flood management, the concepts may be applicable to other arenas, such as global climate change.


  18. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    SciTech Connect

    Cooke, Roger; MacDonell, Margaret

    2007-07-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
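
    The newsboy (newsvendor) logic translates directly into a critical-quantile rule. The sketch below works through the arithmetic with invented costs and an invented elicited distribution for the truly protective level; it illustrates the balancing principle, not the authors' elicitation results:

```python
import numpy as np

# Let T be the uncertain "truly protective" residual concentration, with a
# distribution from structured expert judgment (invented here). Setting the
# limit x below T over-cleans at cost c_over per unit of margin; setting it
# above T under-protects at regret cost c_under per unit.
c_over, c_under = 1.0, 9.0

rng = np.random.default_rng(1)
T = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=200_000)

# Minimizing E[c_under*(x-T)+ + c_over*(T-x)+] places the optimal limit at
# the c_over/(c_over+c_under) quantile of T -- the 10th percentile here,
# i.e. a conservative limit because under-protection is the costlier error.
x_star = np.quantile(T, c_over / (c_over + c_under))
print(f"optimal cleanup limit: {x_star:.1f} (10th percentile of T)")
```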

  20. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    NASA Astrophysics Data System (ADS)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on…

  1. Addressing uncertainty in fecal indicator bacteria dark inactivation rates.

    PubMed

    Gronewold, Andrew D; Myers, Luke; Swall, Jenise L; Noble, Rachel T

    2011-01-01

    Assessing the potential threat of fecal contamination in surface water often depends on model forecasts which assume that fecal indicator bacteria (FIB, a proxy for the concentration of pathogens found in fecal contamination from warm-blooded animals) are lost or removed from the water column at a certain rate (often referred to as an "inactivation" rate). In efforts to reduce human health risks in these water bodies, regulators enforce limits on easily-measured FIB concentrations, commonly reported as most probable number (MPN) and colony forming unit (CFU) values. Accurate assessment of the potential threat of fecal contamination, therefore, depends on propagating uncertainty surrounding "true" FIB concentrations into MPN and CFU values, inactivation rates, model forecasts, and management decisions. Here, we explore how empirical relationships between FIB inactivation rates and extrinsic factors might vary depending on how uncertainty in MPN values is expressed. Using water samples collected from the Neuse River Estuary (NRE) in eastern North Carolina, we compare Escherichia coli (EC) and Enterococcus (ENT) dark inactivation rates derived from two statistical models of first-order loss; a conventional model employing ordinary least-squares (OLS) regression with MPN values, and a novel Bayesian model utilizing the pattern of positive wells in an IDEXX Quanti-Tray®/2000 test. While our results suggest that EC dark inactivation rates tend to decrease as initial EC concentrations decrease and that ENT dark inactivation rates are relatively consistent across different ENT concentrations, we find these relationships depend upon model selection and model calibration procedures. We also find that our proposed Bayesian model provides a more defensible approach to quantifying uncertainty in microbiological assessments of water quality than the conventional MPN-based model, and that our proposed model represents a new strategy for developing robust relationships between
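
    For reference, the conventional OLS approach mentioned above amounts to fitting first-order decay on log-transformed MPN values. The sketch below uses made-up dark-incubation data, not the Neuse River Estuary measurements:

```python
import numpy as np

# Hypothetical dark-incubation time series of E. coli MPN values.
t_days = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
mpn = np.array([2400.0, 980.0, 410.0, 160.0, 75.0])  # MPN per 100 mL

# First-order loss: ln C(t) = ln C0 - k*t, fitted by ordinary least squares.
slope, intercept = np.polyfit(t_days, np.log(mpn), deg=1)
k = -slope
print(f"dark inactivation rate k = {k:.2f} per day")

# The Bayesian alternative in the study would instead model the pattern of
# positive wells in the Quanti-Tray test directly, so that uncertainty in
# the "true" concentration propagates into the posterior for k.
```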

  2. Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates

    EPA Science Inventory

    Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...

  3. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... calculating probability of causation estimates at 42 CFR 81. In this way, claimants will receive the benefit... § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will...

  4. Addressing the Uncertainty in Prescribing High Flows for River Restoration

    NASA Astrophysics Data System (ADS)

    Downs, P. W.; Sklar, L.; Braudrick, C. A.

    2002-12-01

    Flow prescriptions for environmental benefit in regulated rivers are commonly focused on the provision of minimum flow depths to achieve fish passage and holding habitat objectives. Assessment of these flows can be achieved readily and with reasonable confidence by using low-flow hydrological records and channel morphology data in combination with one-dimensional hydraulic modeling. More recently, as understanding has increased of the critical role played by high flows in maintaining a wide range of habitats for instream and riparian flora and fauna, attention has turned to prescribing high flows to invoke the geomorphic processes that maintain suitable habitat niches. Prediction of the effects of these flows may require high-flow discharge and sediment transport data, high-resolution topographic data, hydraulic and sediment transport modeling (often in two or three spatial dimensions), knowledge of the watershed historical context, and an understanding of the thresholds for channel morphological change. Not surprisingly, the associated level of uncertainty in this analysis increases tremendously. High flows are defined by a combination of magnitude, frequency, timing and duration parameters, and their impact varies according to antecedent events. High-flow bedload sediment transport records are rare, sediment transport equations are usually reliable to only an order of magnitude, practical applications of two- and three-dimensional sediment transport models are in their infancy, the watershed historical record may be patchy with the link between cause and effect difficult to ascertain, and thresholds of channel morphological change are poorly understood. As the first step in reducing uncertainty, it is essential to state precisely the ecological target objectives of prescribed high flows, and to link these objectives to the hydraulic and geomorphic thresholds to be achieved or exceeded. Such thresholds provide the basis for a systematic classification of high flows…

  5. Addressing sources of uncertainty in a global terrestrial carbon model

    NASA Astrophysics Data System (ADS)

    Exbrayat, J.; Pitman, A. J.; Zhang, Q.; Abramowitz, G.; Wang, Y.

    2013-12-01

    Several sources of uncertainty exist in the parameterization of the land carbon cycle in current Earth System Models (ESMs). For example, recently implemented interactions between the carbon (C), nitrogen (N) and phosphorus (P) cycles lead to diverse changes in land-atmosphere C fluxes simulated by different models. Further, although soil organic matter decomposition is commonly parameterized as a first-order decay process, the formulation of the microbial response to changes in soil moisture and soil temperature varies tremendously between models. Here, we examine the sensitivity of historical land-atmosphere C fluxes simulated by an ESM to these two major sources of uncertainty. We implement three soil moisture (SMRF) and three soil temperature (STRF) respiration functions in the CABLE-CASA-CNP land biogeochemical component of the coarse-resolution CSIRO Mk3L climate model. Simulations are undertaken using three degrees of biogeochemical nutrient limitation: C only; C and N; and C, N and P. We first bring all 27 possible combinations of a SMRF with a STRF and a biogeochemical mode to a steady state in their biogeochemical pools. Then, transient historical (1850-2005) simulations are driven by prescribed atmospheric CO2 concentrations used in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Similar to some previously published results, representing N and P limitation on primary production reduces the global land carbon sink, while some regions become net C sources over the historical period (1850-2005). However, the uncertainty due to the SMRFs and STRFs does not decrease relative to the inter-annual variability in net uptake when N and P limitations are added. Differences in the SMRFs and STRFs and their effect on the soil C balance can also change the sign of some regional sinks. We show that this response is mostly driven by the pool size achieved at the end of the spin-up procedure. Further, there exists a six-fold range in the level…
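
    The sensitivity at issue can be seen with generic response functions: heterotrophic decay is scaled by a soil temperature response and a soil moisture response, and swapping one plausible moisture function for another changes the effective rate. The forms and parameters below are textbook illustrations, not CABLE-CASA-CNP's actual SMRFs and STRFs:

```python
import numpy as np

def f_temperature(T_c, q10=2.0, T_ref=25.0):
    """Generic Q10 soil temperature response (illustrative, not CABLE's)."""
    return q10 ** ((T_c - T_ref) / 10.0)

def g_moisture_linear(W):
    """Respiration proportional to relative wetness W in [0, 1]."""
    return np.clip(W, 0.0, 1.0)

def g_moisture_parabolic(W, W_opt=0.6):
    """Respiration peaking at an optimum wetness, suppressed when saturated."""
    return np.clip(1.0 - ((W - W_opt) / W_opt) ** 2, 0.0, 1.0)

k_base = 0.02          # per day: base decay rate of a fast soil C pool
T_c, W = 15.0, 0.3     # cool, fairly dry conditions (invented)

for g in (g_moisture_linear, g_moisture_parabolic):
    k_eff = k_base * f_temperature(T_c) * g(W)
    print(f"{g.__name__}: k_eff = {k_eff:.4f} per day")

# Two defensible moisture functions give effective decay rates differing by
# a factor of 2.5 here, hence different steady-state pools after spin-up --
# the kind of structural spread the simulations above quantify.
```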

  6. Assessing and Addressing Students' Scientific Literacy Needs in Physical Geology

    NASA Astrophysics Data System (ADS)

    Campbell-Stone, E. A.; Myers, J. D.

    2005-12-01

    Exacting excellence equally from university students around the globe can be accomplished by providing all students with the necessary background tools to achieve mastery of their courses, even if those tools are not part of the normal course content. As instructors we hope to see our students grasp the substance of our courses, make mental connections between course material and practical applications, and use this knowledge to make informed decisions as citizens. Yet many educators have found that students enter university-level introductory courses in mathematics, science and engineering without adequate academic preparation. As part of a FIPSE-funded project at the University of Wyoming, the instructors of the Physical Geology course have taken a new approach to tackling the problem of lack of scientific/mathematical skills in incoming students. Instead of assuming that students should already know or will learn these skills on their own, they assess students' needs and provide them the opportunity to master scientific literacies as they learn geologic content. In the introductory geology course, instructors identified two categories of literacies, or basic skills, that are necessary for academic success and citizen participation. Fundamental literacies include performing simple quantitative calculations, making qualitative assessments, and reading and analyzing tables and graphs. Technical literacies are those specific to understanding geology, and comprise the ability to read maps, visualize changes through time, and conceptualize in three dimensions. Because these skills are most easily taught in lab, the in-house lab manual was rewritten to be both literacy- and content-based. Early labs include simple exercises addressing literacies in the context of geological science, and each subsequent lab repeats exposure to literacies, but at increasing levels of difficulty. Resources available to assist students with literacy mastery include individual instruction, a detailed…

  7. Addressing scientific literacy through content area reading and processes of scientific inquiry: What teachers report

    NASA Astrophysics Data System (ADS)

    Cooper, Susan J.

    The purpose of this study was to interpret the experiences of secondary science teachers in Florida as they address the scientific literacy of their students through teaching content reading strategies and student inquiry skills. Knowledge of the successful integration of content reading and inquiry skills by experienced classroom teachers would be useful to many educators as they plan instruction to achieve challenging state and national standards for reading as well as science. The problem was investigated using grounded theory methodology. Open-ended questions were asked in three focus groups and six individual interviews that included teachers from various Florida school districts. The constant comparative approach was used to analyze the data. Initial codes were collapsed into categories to determine the conceptual relationships among the data. From this, the five core categories were determined to be Influencers, Issues, Perceptions, Class Routines, and Future Needs. These relate to the central phenomenon, Instructional Modifications, because teachers often described pragmatic and philosophical changes in their teaching as they deliberated to meet state standards in both reading and science. Although Florida's secondary science teachers have been asked to incorporate content reading strategies into their science instruction for the past several years, there was limited evidence of using these strategies to further student understanding of scientific processes. Most teachers saw little connection between reading and inquiry, other than the fact that students must know how to read to follow directions in the lab. Scientific literacy, when it was addressed by teachers, was approached mainly through class discussions, not reading. Teachers realized that students cannot learn secondary science content unless they read science text with comprehension; therefore the focus of reading instruction was on learning science content, not scientific literacy or student

  8. Addressing STEM Retention through a Scientific Thought and Methods Course

    ERIC Educational Resources Information Center

    Koenig, Kathleen; Schen, Melissa; Edwards, Michael; Bao, Lei

    2012-01-01

    Retention of majors in science, technology, engineering, and mathematics (STEM) is a national problem that continues to be the focus of bridging and first-year experience programs. This article presents an innovative course, Scientific Thought and Methods, that targets students with low math placement scores. These students are not eligible for…

  9. PAPERS ADDRESSING SCIENTIFIC ISSUES IN THE RISK ASSESSMENT OF METALS

    EPA Science Inventory

    EPA has recognized the need for consistent application of methods and data to metals risk assessment in consideration of the unique properties of metals. To inform the consideration of metals properties, and to engage the external scientific community, the Agency commissioned ext...

  10. Addressing Uncertainty in the ISCORS Multimedia Radiological Dose Assessment of Municipal Sewage Sludge and Ash

    NASA Astrophysics Data System (ADS)

    Chiu, W. A.; Bachmaier, J.; Bastian, R.; Hogan, R.; Lenhart, T.; Schmidt, D.; Wolbarst, A.; Wood, R.; Yu, C.

    2002-05-01

    Managing municipal wastewater at publicly owned treatment works (POTWs) leads to the production of considerable amounts of residual solid material, which is known as sewage sludge or biosolids. If the wastewater entering a POTW contains radioactive material, then the treatment process may concentrate radionuclides in the sludge, leading to possible exposure of the general public or the POTW workers. The Sewage Sludge Subcommittee of the Interagency Steering Committee on Radiation Standards (ISCORS), which consists of representatives from the Environmental Protection Agency, the Nuclear Regulatory Commission, the Department of Energy, and several other federal, state, and local agencies, is developing guidance for POTWs on the management of sewage sludge that may contain radioactive materials. As part of this effort, they are conducting an assessment of potential radiation exposures using the Department of Energy's RESidual RADioactivity (RESRAD) family of computer codes developed by Argonne National Laboratory. This poster describes several approaches used by the Subcommittee to address the uncertainties associated with their assessment. For instance, uncertainties in the source term are addressed through a combination of analytic and deterministic computer code calculations. Uncertainties in the exposure pathways are addressed through the specification of a number of hypothetical scenarios, some of which can be scaled to address changes in exposure parameters. In addition, the uncertainty in some physical and behavioral parameters are addressed through probabilistic methods.
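
    The probabilistic treatment of exposure parameters can be illustrated with a toy screening-level dose calculation; RESRAD's pathway models are far more detailed, and every value below is invented:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Toy ingestion-pathway dose: activity * intake * exposure * dose coefficient.
# Every number is illustrative; RESRAD models many pathways and nuclides.
conc_bq_per_g = 0.5                                      # sludge activity
intake_g_per_day = rng.triangular(10, 50, 200, size=n)   # uncertain intake
exposure_days = rng.uniform(50, 250, size=n)             # uncertain duration
dcf_sv_per_bq = 2.8e-10                                  # assumed coefficient

dose_sv = conc_bq_per_g * intake_g_per_day * exposure_days * dcf_sv_per_bq
print(f"median {np.median(dose_sv) * 1e6:.2f} uSv/y; "
      f"95th percentile {np.percentile(dose_sv, 95) * 1e6:.2f} uSv/y")

# Reporting a percentile range rather than a single number is the
# probabilistic counterpart of the scenario-scaling described above.
```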

  11. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. PMID:26268067

  12. Painting the world REDD: addressing scientific barriers to monitoring emissions from tropical forests

    NASA Astrophysics Data System (ADS)

    Asner, Gregory P.

    2011-06-01

    …project scale to program readiness is a big step for all involved, and many are finding that it is not easy. Current barriers to national monitoring of forest carbon stocks and emissions range from technical to scientific, and from institutional to operational. In fact, a recent analysis suggested that about 3% of tropical countries currently have the capacity to monitor and report on changes in forest cover and carbon stocks (Herold 2009). But until now, the scientific and policy-development communities have had little quantitative information on exactly which aspects of national-scale monitoring are most uncertain, and how that uncertainty will affect REDD+ performance reporting. A new and remarkable study by Pelletier, Ramankutty and Potvin (2011) uses an integrated, spatially explicit modeling technique to explore and quantify sources of uncertainty in carbon emissions mapping throughout the Republic of Panama. Their findings are sobering: deforestation rates would need to be reduced by a full 50% in Panama in order to be detectable above the statistical uncertainty caused by several current major monitoring problems. The number one uncertainty, accounting for about 77% of the error, rests in the spatial variation of aboveground carbon stocks in primary forests, secondary forests and on fallow land. The poor quality of, and insufficient time interval between, land-cover maps account for the remainder of the overall uncertainty. These findings are a show-stopper for REDD+ under prevailing science and technology conditions. The Pelletier et al. study highlights the pressing need to improve the accuracy of forest carbon and land cover mapping assessments in order for REDD+ to become viable, but how can the uncertainties be overcome? First, with REDD+ nations required to report their emissions, and with verification organizations wanting to check on the reported numbers, there is a clear need for shared measurement and monitoring approaches. One of the major…
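
    A back-of-envelope error budget shows why carbon-stock variability dominates detectability. With invented (not Panama-specific) uncertainty levels, combining fractional errors in quadrature gives the minimum emissions reduction distinguishable from noise:

```python
import numpy as np

# Emissions E = A * C: deforested area A times aboveground carbon density C.
# Assume independent fractional errors that combine in quadrature (toy model).
cv_C, cv_A = 0.15, 0.05          # invented carbon-stock and area uncertainties
cv_E = np.sqrt(cv_C**2 + cv_A**2)
print(f"emissions estimate CV ~ {cv_E:.2f}")

# Comparing two reporting periods with independent errors, a relative
# reduction r is detectable at roughly 2 sigma when r > 2*sqrt(2)*cv_E.
r_min = 2 * np.sqrt(2) * cv_E
print(f"minimum detectable reduction ~ {r_min:.0%}")

# Even modest per-component uncertainties demand deep emission cuts before
# the change rises above the noise -- the study's 50%-for-Panama result is
# the rigorous, spatially explicit version of this arithmetic.
```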

  13. Teaching Scientific Measurement and Uncertainty in Elementary School

    ERIC Educational Resources Information Center

    Munier, Valérie; Merle, Hélène; Brehelin, Danie

    2013-01-01

    The concept of measurement is fundamental in science. In order to be meaningful, the value of a measurement must be given with a certain level of uncertainty. In this paper we try to identify and develop the reasoning of young French pupils about measurement variability. In France, official instructions for elementary school thus argue for having…

  14. Teaching Scientific Measurement and Uncertainty in Elementary School

    NASA Astrophysics Data System (ADS)

    Munier, Valérie; Merle, Hélène; Brehelin, Danie

    2013-11-01

    The concept of measurement is fundamental in science. In order to be meaningful, the value of a measurement must be given with a certain level of uncertainty. In this paper we try to identify and develop the reasoning of young French pupils about measurement variability. In France, official instructions for elementary school argue for having students carry out measurement activities, followed by treatment and analysis of the data. The notion of measurement 'uncertainty' appears in fourth and fifth grades. A similar approach is proposed in the USA. We present a teaching sequence divided into two parts: the first part in grade 4, the second in grade 5 the following year, with the same students. The main sources of data were field notes, videotapes, the intermediate written traces produced, individual written tests given each year, and clinical interviews. We showed that the pupils were capable of entertaining all three possible causes of uncertainty (the quantity being measured, the measuring instrument, and the measurer). Concerning data organization and handling, we found that after teaching, most of them were able to construct a frequency table and a bar chart from a list of N measures of the same quantity. When interpreting this type of chart, some of them were able to argue in terms of a confidence interval. We have also shown that the proposed instructional units allowed pupils to become aware of the need to repeat measurements.
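
    The data handling the pupils practiced (frequency table, bar chart, interval reasoning) is easy to mirror in code. The measurements below are made up:

```python
from collections import Counter
import numpy as np

# Fifteen pupils measure the same length (values in cm, made up).
measurements = [12.1, 12.3, 12.2, 12.2, 12.4, 12.1, 12.2, 12.3,
                12.2, 12.5, 12.0, 12.2, 12.3, 12.1, 12.2]

# Frequency table and a text bar chart, as the pupils constructed.
freq = Counter(measurements)
for value in sorted(freq):
    print(f"{value:.1f} cm | {'#' * freq[value]}")

# Summarize the spread: the best estimate plus an approximate 95% interval.
x = np.array(measurements)
mean = x.mean()
sem = x.std(ddof=1) / np.sqrt(x.size)
print(f"best estimate: {mean:.2f} cm +/- {2 * sem:.2f} cm")
```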

  15. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" in which students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system, such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by four-part prompts consisting of a multiple-choice claim, an open-ended explanation, a Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of the explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of the uncertainty rationale. We found that (1) only 18% of the students' uncertainty rationales involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationales reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of the task, such as the use of noisy data or the framing of…

  16. Scientific Uncertainty in News Coverage of Cancer Research: Effects of Hedging on Scientists' and Journalists' Credibility

    ERIC Educational Resources Information Center

    Jensen, Jakob D.

    2008-01-01

    News reports of scientific research are rarely hedged; in other words, the reports do not contain caveats, limitations, or other indicators of scientific uncertainty. Some have suggested that hedging may influence news consumers' perceptions of scientists' and journalists' credibility (perceptions that may be related to support for scientific…

  17. Measuring the perceived uncertainty of scientific evidence and its relationship to engagement with science.

    PubMed

    Retzbach, Joachim; Otto, Lukas; Maier, Michaela

    2016-08-01

    Many scholars have argued for the need to communicate openly not only scientific successes to the public but also limitations, such as the tentativeness of research findings, in order to enhance public trust and engagement. Yet, it has not been quantitatively assessed how the perception of scientific uncertainties relates to engagement with science on an individual level. In this article, we report the development and testing of a new questionnaire in English and German measuring the perceived uncertainty of scientific evidence. Results indicate that the scale is reliable and valid in both language versions and that its two subscales are differentially related to measures of engagement: Science-friendly attitudes were positively related only to 'subjectively' perceived uncertainty, whereas interest in science as well as behavioural engagement actions and intentions were largely uncorrelated. We conclude that perceiving scientific knowledge to be uncertain is only weakly, but positively related to engagement with science. PMID:25814513
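
    Scale reliability of the kind reported here is commonly summarized with Cronbach's alpha; the following short sketch (with invented Likert responses, not the authors' data) shows the standard computation for one subscale.

      import numpy as np

      def cronbach_alpha(items: np.ndarray) -> float:
          """items: respondents x items matrix of Likert scores."""
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_variances / total_variance)

      # Hypothetical responses: 6 respondents x 4 items of one subscale.
      scores = np.array([[4, 5, 4, 4],
                         [2, 2, 3, 2],
                         [5, 4, 5, 5],
                         [3, 3, 2, 3],
                         [4, 4, 4, 5],
                         [1, 2, 1, 2]])
      print(f"alpha = {cronbach_alpha(scores):.2f}")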

  18. Towards a common oil spill risk assessment framework – Adapting ISO 31000 and addressing uncertainties.

    PubMed

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio; Janeiro, Joao; Samaras, Achilleas; Zodiatis, George; De Dominicis, Michela

    2015-08-15

    Oil spills are a transnational problem, and establishing a common standard methodology for Oil Spill Risk Assessments (OSRAs) is thus paramount in order to protect marine environments and coastal communities. In this study we firstly identified the strengths and weaknesses of the OSRAs carried out in various parts of the globe. We then searched for a generic and recognized standard, i.e. ISO 31000, in order to design a method to perform OSRAs in a scientific and standard way. The new framework was tested for the Lebanon oil spill that occurred in 2006 employing ensemble oil spill modeling to quantify the risks and uncertainties due to unknown spill characteristics. The application of the framework generated valuable visual instruments for the transparent communication of the risks, replacing the use of risk tolerance levels, and thus highlighting the priority areas to protect in case of an oil spill. PMID:26067897

  19. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data life cycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data depends on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks that arise from software dependence at various stages of the data life cycle have been identified. Identifying these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  20. Demand for command: responding to technological risks and scientific uncertainties.

    PubMed

    Stokes, Elen

    2013-01-01

    This article seeks to add to current theories of new governance by highlighting the predicament facing regulators and regulatees when dealing with new technologies. Using nanotechnologies as a study, it shows that new modes of governance (as opposed to traditional coercive, or command and control regulation) offer promising solutions to highly complex, uncertain, and contested problems of risk, such as those associated with new technologies. In this regard, nanotechnologies provide a useful test bed for the ambitions of newer, better modes of governance because there are not yet any fixed ideas about the appropriate course of action. The article suggests, however, that examples of new governance are less prominent than perhaps expected. Drawing on empirical data, it argues that, when faced with considerable epistemological, political, economic, and ethical uncertainties, regulatory stakeholders often exhibit a preference for more conventional command methods of regulation. That is not to say that new governance is entirely absent from regulatory policies on nanotechnologies, but that new governance is emerging in perhaps more subtle ways than the scholarly and policy literature predicted. PMID:23329016

  1. Science Teachers' Use of Mass Media to Address Socio-Scientific and Sustainability Issues

    NASA Astrophysics Data System (ADS)

    Klosterman, Michelle L.; Sadler, Troy D.; Brown, Julie

    2012-01-01

    The currency, relevancy and changing nature of science make it a natural topic of focus for mass media outlets. Science teachers and students can capitalize on this wealth of scientific information to explore socio-scientific and sustainability issues; however, without a lens on how those media are created and how representations of science are constructed through media, the use of mass media in the science classroom may be risky. Limited research has explored how science teachers naturally use mass media to explore scientific issues in the classroom or how mass media are used to address potential overlaps between socio-scientific-issue based instruction and education for sustainability. This naturalistic study investigated the reported and actual classroom uses of mass media by secondary science teachers to explore socio-scientific and sustainability issues, as well as the extent to which their instructional approaches did or did not overlap with frameworks for SSI-based instruction, education for sustainability, and media literacy education. The results of this study suggest that secondary science teachers use mass media to explore socio-scientific and sustainability issues, but their use of frameworks aligned with SSI-based instruction, education for sustainability, and media literacy education was limited. This paper provides suggestions for how we, as science educators and researchers, can advance a teaching and learning agenda for encouraging instruction that more fully utilizes the potential of mass media to explore socio-scientific issues in line with perspectives from education for sustainability.

  2. Painting the world REDD: addressing scientific barriers to monitoring emissions from tropical forests

    NASA Astrophysics Data System (ADS)

    Asner, Gregory P.

    2011-06-01

    project scale to program readiness is a big step for all involved, and many are finding that it is not easy. Current barriers to national monitoring of forest carbon stocks and emissions range from technical to scientific, and from institutional to operational. In fact, a recent analysis suggested that about 3% of tropical countries currently have the capacity to monitor and report on changes in forest cover and carbon stocks (Herold 2009). But until now, the scientific and policy-development communities have had little quantitative information on exactly which aspects of national-scale monitoring are most uncertain, and how that uncertainty will affect REDD+ performance reporting. A new and remarkable study by Pelletier, Ramankutty and Potvin (2011) uses an integrated, spatially-explicit modeling technique to explore and quantify sources of uncertainty in carbon emissions mapping throughout the Republic of Panama. Their findings are sobering: deforestation rates would need to be reduced by a full 50% in Panama in order to be detectable above the statistical uncertainty caused by several current major monitoring problems. The number one uncertainty, accounting for a sum total of about 77% of the error, rests in the spatial variation of aboveground carbon stocks in primary forests, secondary forests and on fallow land. The poor quality of and insufficient time interval between land-cover maps account for the remainder of the overall uncertainty. These findings are a show-stopper for REDD+ under prevailing science and technology conditions. The Pelletier et al study highlights the pressing need to improve the accuracy of forest carbon and land cover mapping assessments in order for REDD+ to become viable, but how can the uncertainties be overcome? First, with REDD+ nations required to report their emissions, and with verification organizations wanting to check on the reported numbers, there is a clear need for shared measurement and monitoring approaches. One of the major

  3. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties-typically related to exposure scenarios or accounting for the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also has
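
    The arithmetic of the approach is simple: a point of departure is divided by the product of the applicable uncertainty factors (and any modifying factor). The sketch below is a generic illustration with invented values, not a derivation from the paper.

      # Hypothetical point of departure: a NOAEL from a subchronic animal study.
      pod_mg_m3 = 50.0

      uncertainty_factors = {
          "interspecies (animal -> human)": 10,
          "intraspecies (human variability)": 10,
          "subchronic -> chronic exposure": 3,
          "database insufficiency": 1,    # assumed adequate here
      }
      modifying_factor = 1                # no residual concerns assumed

      composite_uf = modifying_factor
      for factor in uncertainty_factors.values():
          composite_uf *= factor          # factors combine multiplicatively

      oel = pod_mg_m3 / composite_uf
      print(f"composite UF = {composite_uf}, OEL = {oel:.3f} mg/m3")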

  4. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS II: SENSITIVITY ANALYSIS IN THE CONUS DOMAIN

    EPA Science Inventory

    In this study, we present the response of model results to different scientific treatments in an effort to quantify the uncertainties caused by the incomplete understanding of mercury science and by model assumptions in atmospheric mercury models. Two sets of sensitivity simulati...

  5. Children's Understanding of Scientific Inquiry: Their Conceptualization of Uncertainty in Investigations of Their Own Design

    ERIC Educational Resources Information Center

    Metz, Kathleen E.

    2004-01-01

    The study examined children's understanding of scientific inquiry, through the lens of their conceptualization of uncertainty in investigations they had designed and implemented with a partner. These largely student-regulated investigations followed a unit about animal behavior that emphasized the scaffolding of independent inquiry. Participants…

  6. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits

    PubMed Central

    Dankovic, D. A.; Naumann, B. D.; Maier, A.; Dourson, M. L.; Levy, L. S.

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties—typically related to exposure scenarios or accounting for the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also

  7. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open

  8. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, the combination of fuzzy system theory with finite element methods is presented and discussed as a way of dealing with uncertainties. Uncertainties need to be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping, where mapping means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results than the conventional finite element method.

  9. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, the combination of fuzzy system theory with finite element methods is presented and discussed as a way of dealing with uncertainties. Uncertainties need to be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping, where mapping means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results than the conventional finite element method.
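
    As a rough illustration of the fuzzification-mapping-defuzzification chain described above (a sketch under simplifying assumptions, not the authors' code), the following propagates a triangular fuzzy input through a monotone placeholder model using alpha-cut interval arithmetic, a standard numerical form of the extension principle, and then defuzzifies the result.

      import numpy as np

      def alpha_cut(tri, alpha):
          """Interval of a triangular fuzzy number (a, m, b) at level alpha."""
          a, m, b = tri
          return a + alpha * (m - a), b - alpha * (b - m)

      def model(x):
          return 2.0 * x + 1.0   # placeholder for a finite element response

      load = (90.0, 100.0, 110.0)    # epistemic uncertainty about an input
      alphas = np.linspace(0.0, 1.0, 11)

      # For a monotone map, pushing each alpha-cut's endpoints through the
      # model realizes the extension principle numerically.
      cuts = [(model(lo), model(hi))
              for lo, hi in (alpha_cut(load, a) for a in alphas)]

      # Centroid-style defuzzification: average the interval midpoints.
      crisp = float(np.mean([(lo + hi) / 2 for lo, hi in cuts]))
      print(f"defuzzified response = {crisp:.1f}")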

  10. Scientific rationality, uncertainty and the governance of human genetics: an interview study with researchers at deCODE genetics.

    PubMed

    Hjörleifsson, Stefán; Schei, Edvin

    2006-07-01

    Technology development in human genetics is fraught with uncertainty, controversy and unresolved moral issues, and industry scientists are sometimes accused of neglecting the implications of their work. The present study was carried out to elicit industry scientists' reflections on the relationship between commercial, scientific and ethical dimensions of present day genetics and the resources needed for robust governance of new technologies. Interviewing scientists of the company deCODE genetics in Iceland, we found that in spite of optimism, the informants revealed ambiguity and uncertainty concerning the use of human genetic technologies for the prevention of common diseases. They concurred that uncritical marketing of scientific success might cause exaggerated public expectations of health benefits from genetics, with the risk of backfiring and causing resistance to genetics in the population. On the other hand, the scientists did not address dilemmas arising from the commercial nature of their own employer. Although the scientists tended to describe public fear as irrational, they identified issues where scepticism might be well founded and explored examples where they, despite expert knowledge, held ambiguous or tentative personal views on the use of predictive genetic technologies. The rationality of science was not seen as sufficient to ensure beneficial governance of new technologies. The reflexivity and suspension of judgement demonstrated in the interviews exemplify productive features of moral deliberation in complex situations. Scientists should take part in dialogues concerning the governance of genetic technologies, acknowledge any vested interests, and use their expertise to highlight, not conceal the technical and moral complexity involved. PMID:16622446

  11. Addressing Uncertainty in Contaminant Transport in Groundwater Using the Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Mohanty, B. P.

    2011-12-01

    Nitrate in groundwater shows significant uncertainty which arises from sparse data and interaction among multiple geophysical factors such as source availability (land use), thickness and composition of the vadose zone, types of aquifers (confined or unconfined), aquifer heterogeneity (geological and alluvial), precipitation characteristics, etc. This work presents the fusion of the ensemble Kalman filter (EnKF) with the numerical groundwater flow model MODFLOW and the solute transport model MT3DMS. The EnKF is a sequential data assimilation approach, which is applied to quantify and reduce the uncertainty of groundwater flow and solute transport models. We conducted numerical simulation experiments for the period January 1990 to December 2005 with MODFLOW and MT3DMS models for variably saturated groundwater flow in various aquifers across Texas. The EnKF was used to update the model parameters, hydraulic conductivity, hydraulic head and solute concentration. Results indicate that the EnKF method notably improves the estimation of the hydraulic conductivity distribution and solute transport prediction by assimilating piezometric head measurements with a known nitrate initial condition. A better estimation of hydraulic conductivity and assimilation of continuous measurements of solute concentrations resulted in reduced uncertainty in MODFLOW and MT3DMS models. It was found that the observation locations and locations in spatial proximity were appropriately corrected by the EnKF. The knowledge of nitrate plume evolution provided an insight into model structure, parameters, and sources of uncertainty.
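
    For readers unfamiliar with the EnKF, the analysis step that performs this kind of update can be sketched in a few lines; the dimensions, observation operator, and perturbed-observation variant below are illustrative assumptions, not the authors' MODFLOW/MT3DMS configuration.

      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_obs, n_ens = 5, 2, 50

      X = rng.normal(size=(n_state, n_ens))    # forecast ensemble of states
      H = np.zeros((n_obs, n_state))           # observation operator
      H[0, 0] = H[1, 3] = 1.0                  # two states observed directly
      R = 0.1 * np.eye(n_obs)                  # observation error covariance
      y = np.array([0.5, -0.2])                # e.g. measured heads

      A = X - X.mean(axis=1, keepdims=True)    # ensemble anomalies
      P = A @ A.T / (n_ens - 1)                # sample forecast covariance
      K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain

      # Perturbed-observation update: each member assimilates noisy data.
      Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
      Xa = X + K @ (Y - H @ X)                 # analysis ensemble
      print("analysis spread:", Xa.std(axis=1).round(3))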

  12. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... characterized with a probability distribution that accounts for the uncertainty of the estimate. This information will be used by DOL in the calculation of probability of causation, under HHS guidelines for calculating probability of causation estimates at 42 CFR 81. In this way, claimants will receive the...
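
    A minimal Monte Carlo sketch of the idea, with an invented lognormal dose distribution and risk model (not NIOSH's actual dose reconstruction or probability-of-causation calculation), shows how a dose probability distribution propagates into a distribution for the probability of causation PC = ERR/(1 + ERR).

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical dose estimate: uncertain, so drawn from a distribution.
      doses = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=100_000)  # mSv

      err = 0.002 * doses        # assumed linear excess relative risk model
      pc = err / (1.0 + err)     # probability of causation for each draw

      print(f"median PC = {np.median(pc):.3f}")
      print(f"99th-percentile PC = {np.percentile(pc, 99):.3f}")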

  13. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1 992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...

  14. The future of human embryonic stem cell research: addressing ethical conflict with responsible scientific research.

    PubMed

    Gilbert, David M

    2004-05-01

    Embryonic stem (ES) cells have almost unlimited regenerative capacity and can potentially generate any body tissue. Hence they hold great promise for the cure of degenerative human diseases. But their derivation and the potential for misuse have raised a number of ethical issues. These ethical issues threaten to paralyze public funding for ES cell research, leaving experimentation in the hands of the private sector and precluding the public's ability to monitor practices, research alternatives, and effectively address the very ethical issues that are cause for concern in the first place. With new technology being inevitable, and the potential for abuse high, government must stay involved if the public is to play a role in shaping the direction of research. In this essay, I will define levels of ethical conflict that can be delineated by the anticipated advances in technology. From the urgent need to derive new ES cell lines with existing technology, to the most far-reaching goal of deriving genetically identical tissues from an adult patient's cells, technology-specific ethical dilemmas can be defined and addressed. This staged approach provides a solid ethical framework for moving forward with ES cell research. Moreover, by anticipating the moral conflicts to come, one can predict the types of scientific advances that could overcome these conflicts, and appropriately direct federal funding toward these goals to offset potentially less responsible research directives that will inevitably go forward via private or foreign funding. PMID:15114283

  15. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information.

    PubMed

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople's understanding of the uncertainty of scientific information, with a focus on the detection of tentativeness and the perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, indicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to contradict large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should

  16. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information

    PubMed Central

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople’s understanding of the uncertainty of scientific information, with a focus on the detection of tentativeness and the perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, indicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to contradict large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should

  17. Addressing the unique safety and design concerns for operating tower-based scientific field campaigns.

    NASA Astrophysics Data System (ADS)

    Steele, A. C.

    2006-12-01

    Scientific field campaigns often require specialized technical infrastructure for data collection. NASA's LBA- ECO Science Team needed a network of towers, up to 65 meters in height, to be constructed in the Amazon forest to serve as platforms for instrumentation used to estimate carbon dioxide and trace gas fluxes between the forest and the atmosphere. The design, construction, and operation of these scientific towers represented unique challenges to the construction crews, the logistics support staff, and the scientists due to operational requirements beyond tower site norms. These included selection of safe sites at remote locations within a dense forest; building towers without damaging the natural environment; locating diesel generators so that exhaust would not contaminate the measurement area; performing maintenance on continuously energized towers so as not to interrupt data collection; training inexperienced climbers needing safe access to towers; and addressing unique safety concerns (e.g. venomous animal response, chainsaw safety, off road driving). To meet the challenges of the complex field site, a comprehensive safety and site operation model was designed to ensure that NASA field safety standards were met, even under extreme conditions in the remote forests of the Amazon. The model includes all phases of field site safety and operation, including site design, construction, operational practices and policies, and personnel safety training. This operational model was employed over eight years, supporting a team of nearly 400 scientists, making several thousand site visits, without loss of life or major injury. The presentation will explore these concerns and present a model for comprehensive safety plans for NASA field missions.

  18. Mars 2001 Mission: Addressing Scientific Questions Regarding the Characteristics and Origin of Local Bedrock and Soil

    NASA Technical Reports Server (NTRS)

    Saunders, R. S.; Arvidson, R. E.; Weitz, C. M.; Marshall, J.; Squyres, S. W.; Christensen, P. R.; Meloy, T.; Smith, P.

    1999-01-01

    The Mars Surveyor Program 2001 Mission will carry instruments on the orbiter, lander and rover that will support synergistic observations and experiments to address important scientific questions regarding the local bedrock and soils. The martian surface is covered in varying degrees by fine materials less than a few mm in size. Viking and Pathfinder images of the surface indicate that soils at those sites are composed of fine particles. Wheel tracks from the Sojourner rover suggest that soil deposits are composed of particles <40 mm. Viking images show that dunes are common in many areas on Mars and new MOC images indicate that dunes occur nearly everywhere. Dunes on Mars are thought to be composed of 250-500 micron particles based upon Viking IRTM data and Mars wind tunnel experiments. If martian dunes are composed of sand particles >100 microns and soils are dominated by <10 micron particles, then where are the intermediate grain sizes? Have they been worn away through prolonged transport over the eons? Were they never generated to begin with? Or are they simply less easy to identify because they do not form distinctive geomorphic features such as dunes or uniform mantles that tend to assume superposition in the soil structure?

  19. Scientific English: a program for addressing linguistic barriers of international research trainees in the United States.

    PubMed

    Cameron, Carrie; Chang, Shine; Pagel, Walter

    2011-03-01

    Within the international research environment, English is indisputably the lingua franca, and thus the majority of the world's scientists must adapt to a second language. Linguistic barriers in science affect not only researchers' career paths but institutional productivity and efficiency as well. To address these barriers, we designed and piloted a specialized course, Scientific English. The pedagogical approach is based on English for specific purposes methodology, in which curriculum and content are driven by the types of daily language used and the interactions which occur in the participants' occupation, in this case, cancer research. The 11-week program was organized into three sections: presentation skills, meeting and discussion skills, and writing skills. Effectiveness of the course was measured by the number of participants able to produce presentations and written products with a score of at least 75 of 100 possible points. From January to December 2008, participant scores averaged 90.4 for presentations and 86.8 for written products. The authors provide insights and recommendations on the development and delivery of the program. PMID:20623348

  20. Adapting to climate change despite scientific uncertainty: A case study of coastal protection from sea-level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.

    2013-12-01

    Climate change adaptation is an increasing focus of international aid. At recent meetings of the parties to the United Nations Framework Convention on Climate Change (UNFCCC), the developed world agreed to rapidly increase international assistance to help developing countries, like the low-lying island nation of Kiribati, respond to the impacts of climate change. These emerging adaptation efforts must proceed despite the large and partially irreducible scientific uncertainty about the magnitude of those future climate impacts. In this study, we use the example of efforts to adapt to sea-level rise in Kiribati to document the challenges facing such internationally-funded climate change adaptation projects given the scientific uncertainty about climate impacts. Drawing on field and document research, we describe the scientific uncertainty about projected sea-level rise in Tarawa, the capital of Kiribati, how that uncertainty can create trade-offs between adaptation measures, and the social, political and economic context in which adaptation decisions must be made. The analysis shows there is no 'silver bullet' adaptation strategy in countries like Kiribati, given the long-term scientific uncertainty about sea-level rise and the environment of climate change aid. The existence of irreducible scientific uncertainty does not preclude effective climate change adaptation, but instead requires adaptation programs that embrace multiple strategies and planning horizons, and continually build on and re-adjust previous investments. This work highlights the importance of sustained international climate change financing, as proposed in UNFCCC negotiations.

  1. Science Teachers' Use of Mass Media to Address Socio-Scientific and Sustainability Issues

    ERIC Educational Resources Information Center

    Klosterman, Michelle L.; Sadler, Troy D.; Brown, Julie

    2012-01-01

    The currency, relevancy and changing nature of science make it a natural topic of focus for mass media outlets. Science teachers and students can capitalize on this wealth of scientific information to explore socio-scientific and sustainability issues; however, without a lens on how those media are created and how representations of science are…

  2. Addressing the Dynamics of Science in Curricular Reform for Scientific Literacy: The Case of Genomics

    ERIC Educational Resources Information Center

    van Eijck, Michiel

    2010-01-01

    Science education reform must anticipate the scientific literacy required by the next generation of citizens. Particularly, this counts for rapidly emerging and evolving scientific disciplines such as genomics. Taking this discipline as a case, such anticipation is becoming increasingly problematic in today's knowledge societies in which the…

  3. Advanced Test Reactor National Scientific User Facility: Addressing advanced nuclear materials research

    SciTech Connect

    John Jackson; Todd Allen; Frances Marshall; Jim Cole

    2013-03-01

    The Advanced Test Reactor National Scientific User Facility (ATR NSUF), based at the Idaho National Laboratory in the United States, is supporting Department of Energy and industry research efforts to ensure the properties of materials in light water reactors are well understood. The ATR NSUF is providing this support through three main efforts: establishing unique infrastructure necessary to conduct research on highly radioactive materials, conducting research in conjunction with industry partners on life extension relevant topics, and providing training courses to encourage more U.S. researchers to understand and address LWR materials issues. In 2010 and 2011, several advanced instruments with capability focused on resolving nuclear material performance issues through analysis on the micro (10⁻⁶ m) to atomic (10⁻¹⁰ m) scales were installed, primarily at the Center for Advanced Energy Studies (CAES) in Idaho Falls, Idaho. These instruments included a local electrode atom probe (LEAP), a field-emission gun scanning transmission electron microscope (FEG-STEM), a focused ion beam (FIB) system, a Raman spectrometer, and a nanoindenter/atomic force microscope. Ongoing capability enhancements intended to support industry efforts include completion of two shielded, irradiation assisted stress corrosion cracking (IASCC) test loops, the first of which will come online in early calendar year 2013, a pressurized and controlled chemistry water loop for the ATR center flux trap, and a dedicated facility intended to house post irradiation examination equipment. In addition to capability enhancements at the main site in Idaho, the ATR NSUF also welcomed two new partner facilities in 2011 and two new partner facilities in 2012; the Oak Ridge National Laboratory High Flux Isotope Reactor (HFIR) and associated hot cells, and the University of California, Berkeley capabilities in irradiated materials analysis were added in 2011. In 2012, Purdue University’s Interaction of Materials

  4. Novel developments in benthic modelling to address scientific and policy challenges

    NASA Astrophysics Data System (ADS)

    Lessin, Gennadi; Artioli, Yuri; Bruggeman, Jorn; Aldridge, John; Blackford, Jerry

    2016-04-01

    Understanding the role of benthic systems in supporting, regulating and providing marine ecosystem services requires better understanding of their functioning and their response and resilience to stressors. Novel observational methods for the investigation of dynamics of benthic-pelagic coupling in shelf seas are being developed and new data are being collected. There is therefore an increasing demand for robust representation of benthic processes in marine biogeochemical and ecosystem models, which would improve our understanding of whole systems and benthic-pelagic coupling, rather than act as mere closure terms for pelagic models. However, for several decades the development of benthic models has lagged behind that of their pelagic counterparts. To address contemporary scientific, policy and societal challenges, the biogeochemical and ecological model ERSEM (European Regional Seas Ecosystem Model), including its benthic sub-model, was recently recoded in a scalable and modular format adopting the approach of FABM (Framework for Aquatic Biogeochemical Models). Within the Shelf Sea Biogeochemistry research programme, a series of additional processes have been included, such as a sedimentary carbonate system, a resuspendable fluff layer, and the simulation of advective sediments. It was shown that the inclusion of these processes changes the dynamics of benthic-pelagic fluxes and modifies the benthic food web. Comparison of model results with in-situ data demonstrated a general improvement of model performance and highlighted the importance of the benthic system in overall ecosystem dynamics. As an example, our simulations have shown that inclusion of a resuspendable fluff layer facilitates regeneration of inorganic nutrients in the water column due to degradation of resuspended organic material by pelagic bacteria. Moreover, the composition of fluff was found to be important for trophic interactions, and therefore indirectly affects benthic community composition. Where

  5. A Concept Space Approach to Addressing the Vocabulary Problem in Scientific Information Retrieval: An Experiment on the Worm Community System.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Ng, Tobun D.; Martinez, Joanne; Schatz, Bruce R.

    1997-01-01

    Presents an algorithmic approach to addressing the vocabulary problem in scientific information retrieval and information sharing, using the molecular biology domain as an example. A cognitive study and a follow-up document retrieval study were conducted using first a conjoined fly-worm thesaurus and then an actual worm database and the conjoined…

  6. Progression in Ethical Reasoning When Addressing Socio-Scientific Issues in Biotechnology

    ERIC Educational Resources Information Center

    Berne, Birgitta

    2014-01-01

    This article reports on the outcomes of an intervention in a Swedish school in which the author, a teacher-researcher, sought to develop students' (14-15 years old) ethical reasoning in science through the use of peer discussions about socio-scientific issues. Prior to the student discussions various prompts were used to highlight different…

  7. Scientific problems addressed by the Spektr-UV space project (world space Observatory—Ultraviolet)

    NASA Astrophysics Data System (ADS)

    Boyarchuk, A. A.; Shustov, B. M.; Savanov, I. S.; Sachkov, M. E.; Bisikalo, D. V.; Mashonkina, L. I.; Wiebe, D. Z.; Shematovich, V. I.; Shchekinov, Yu. A.; Ryabchikova, T. A.; Chugai, N. N.; Ivanov, P. B.; Voshchinnikov, N. V.; Gomez de Castro, A. I.; Lamzin, S. A.; Piskunov, N.; Ayres, T.; Strassmeier, K. G.; Jeffrey, S.; Zwintz, S. K.; Shulyak, D.; Gérard, J.-C.; Hubert, B.; Fossati, L.; Lammer, H.; Werner, K.; Zhilkin, A. G.; Kaigorodov, P. V.; Sichevskii, S. G.; Ustamuich, S.; Kanev, E. N.; Kil'pio, E. Yu.

    2016-01-01

    The article presents a review of scientific problems and methods of ultraviolet astronomy, focusing on promising scientific problems (directions) whose solution requires UV space observatories. These include reionization and the history of star formation in the Universe, searches for dark baryonic matter, physical and chemical processes in the interstellar medium and protoplanetary disks, the physics of accretion and outflows in astrophysical objects, from Active Galactic Nuclei to close binary stars, stellar activity (for both low-mass and high-mass stars), and processes occurring in the atmospheres of both planets in the solar system and exoplanets. Technological progress in UV astronomy achieved in recent years is also considered. The well advanced, international, Russian-led Spektr-UV (World Space Observatory—Ultraviolet) project is described in more detail. This project is directed at creating a major space observatory operational in the ultraviolet (115-310 nm). This observatory will provide an effective, and possibly the only, powerful means of observing in this spectral range over the next ten years, and will be a powerful tool for resolving many topical scientific problems.

  8. Encouraging Uncertainty in the "Scientific Method": Promoting Understanding in the Processes of Science with Preservice Teachers

    ERIC Educational Resources Information Center

    Melville, Wayne; Bartley, Anthony; Fazio, Xavier

    2012-01-01

    Teachers' feelings of uncertainty are an overlooked, though crucial, condition necessary for the promotion of educational change. This article investigates the feelings of uncertainty that preservice teachers have toward the conduct of science as inquiry and the extent to which methods courses can confront and embrace those uncertainties. Our work…

  9. Progression in Ethical Reasoning When Addressing Socio-scientific Issues in Biotechnology

    NASA Astrophysics Data System (ADS)

    Berne, Birgitta

    2014-11-01

    This article reports on the outcomes of an intervention in a Swedish school in which the author, a teacher-researcher, sought to develop students' (14-15 years old) ethical reasoning in science through the use of peer discussions about socio-scientific issues. Prior to the student discussions various prompts were used to highlight different aspects of the issues. In addition, students were given time to search for further information themselves. Analysis of students' written arguments, from the beginning of the intervention and afterwards, suggests that many students seem to be moving away from their use of everyday language towards using scientific concepts in their arguments. In addition, they moved from considering cloning and 'designer babies' solely in terms of the present to considering them in terms of the future. Furthermore, the students started to approach the issues in additional ways using not only consequentialism but also the approaches of virtue ethics, and rights and duties. Students' progression in ethical reasoning could be related to the characteristics of the interactions in peer discussions as students who critically and constructively argued with each other's ideas, and challenged each other's claims, made progress in more aspects of ethical reasoning than students merely using cumulative talk. As such, the work provides valuable indications for the importance of introducing peer discussions and debates about SSIs in connection to biotechnology into the teaching of science in schools.

  10. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108
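
    The issue's central claim, that greater uncertainty increases climate risk, can be made concrete with a short Jensen's-inequality argument: if damages are a convex function of warming, widening the spread of possible warming raises expected damages even when the mean is unchanged. The sketch below uses an illustrative quadratic damage function, not a calibrated economic model.

      import numpy as np

      rng = np.random.default_rng(2)

      def damage(t):
          return t ** 2                    # convex damages vs. warming (°C)

      for sigma in (0.5, 1.0, 1.5):        # increasing scientific uncertainty
          warming = rng.normal(3.0, sigma, 1_000_000)   # same mean warming
          print(f"sigma = {sigma}: E[damage] = {damage(warming).mean():.2f}")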

  11. Using Websites to Convey Scientific Uncertainties for Volcanic Processes and Potential Hazards

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Lowenstern, J. B.; Hill, D. P.

    2005-12-01

    The Yellowstone Volcano Observatory (YVO) and Long Valley Observatory (LVO) websites have greatly increased the public's awareness and access to information about scientific uncertainties for volcanic processes by communicating at multiple levels of understanding and varied levels of detail. Our websites serve a broad audience ranging from visitors unaware of the calderas, to lay volcano enthusiasts, to scientists, federal agencies, and emergency managers. Both Yellowstone and Long Valley are highly visited tourist attractions with histories of caldera-forming eruptions large enough to alter global climate temporarily. Although it is much more likely that future activity would be on a small scale at either volcano, we are constantly posed questions about low-probability, high-impact events such as the caldera-forming eruption depicted in the recent BBC/Discovery movie, "Supervolcano". YVO and LVO website objectives include: providing monitoring data, explaining the likelihood of future events, summarizing research results, helping media provide reliable information, and expanding on information presented by the media. Providing detailed current information is a crucial website component as the public often searches online to augment information gained from often cryptic pronouncements by the media. In May 2005, for example, YVO saw an order of magnitude increase in page requests on the day MSNBC ran the misleading headline, "Yellowstone eruption threat high." The headline referred not to current events but a general rating of Yellowstone as one of 37 "high threat" volcanoes in the USGS National Volcano Early Warning System report. As websites become a more dominant source of information, we continuously revise our communication plans to make the most of this evolving medium. Because the internet gives equal access to all information providers, we find ourselves competing with various "doomsday" websites that sensationalize and distort the current understanding of

  12. Addressing Emerging Risks: Scientific and Regulatory Challenges Associated with Environmentally Persistent Free Radicals

    PubMed Central

    Dugas, Tammy R.; Lomnicki, Slawomir; Cormier, Stephania A.; Dellinger, Barry; Reams, Margaret

    2016-01-01

    Airborne fine and ultrafine particulate matter (PM) are often generated through widely-used thermal processes such as the combustion of fuels or the thermal decomposition of waste. Residents near Superfund sites are exposed to PM through the inhalation of windblown dust, ingestion of soil and sediments, and inhalation of emissions from the on-site thermal treatment of contaminated soils. Epidemiological evidence supports a link between exposure to airborne PM and an increased risk of cardiovascular and pulmonary diseases. It is well-known that during combustion processes, incomplete combustion can lead to the production of organic pollutants that can adsorb to the surface of PM. Recent studies have demonstrated that their interaction with metal centers can lead to the generation of a surface stabilized metal-radical complex capable of redox cycling to produce ROS. Moreover, these free radicals can persist in the environment, hence their designation as Environmentally Persistent Free Radicals (EPFR). EPFR has been demonstrated in both ambient air PM2.5 (diameter < 2.5 µm) and in PM from a variety of combustion sources. Thus, low-temperature, thermal treatment of soils can potentially increase the concentration of EPFR in areas in and around Superfund sites. In this review, we will outline the evidence to date supporting EPFR formation and its environmental significance. Furthermore, we will address the lack of methodologies for specifically addressing its risk assessment and challenges associated with regulating this new, emerging contaminant. PMID:27338429

  13. Addressing Emerging Risks: Scientific and Regulatory Challenges Associated with Environmentally Persistent Free Radicals.

    PubMed

    Dugas, Tammy R; Lomnicki, Slawomir; Cormier, Stephania A; Dellinger, Barry; Reams, Margaret

    2016-01-01

    Airborne fine and ultrafine particulate matter (PM) are often generated through widely-used thermal processes such as the combustion of fuels or the thermal decomposition of waste. Residents near Superfund sites are exposed to PM through the inhalation of windblown dust, ingestion of soil and sediments, and inhalation of emissions from the on-site thermal treatment of contaminated soils. Epidemiological evidence supports a link between exposure to airborne PM and an increased risk of cardiovascular and pulmonary diseases. It is well-known that during combustion processes, incomplete combustion can lead to the production of organic pollutants that can adsorb to the surface of PM. Recent studies have demonstrated that their interaction with metal centers can lead to the generation of a surface stabilized metal-radical complex capable of redox cycling to produce ROS. Moreover, these free radicals can persist in the environment, hence their designation as Environmentally Persistent Free Radicals (EPFR). EPFR has been demonstrated in both ambient air PM2.5 (diameter < 2.5 µm) and in PM from a variety of combustion sources. Thus, low-temperature, thermal treatment of soils can potentially increase the concentration of EPFR in areas in and around Superfund sites. In this review, we will outline the evidence to date supporting EPFR formation and its environmental significance. Furthermore, we will address the lack of methodologies for specifically addressing its risk assessment and challenges associated with regulating this new, emerging contaminant. PMID:27338429

  14. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.
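
    To give a flavour of the contract-based data flow idea (a purely hypothetical sketch; the class and field names are invented, not the dissertation's API), each filter in a pipeline amends a contract describing what it needs, negotiated from sink back to source, so the reader can load only the required data.

      from dataclasses import dataclass, field

      @dataclass
      class Contract:
          variables: set = field(default_factory=set)
          ghost_layers: int = 0     # halo cells required by stencil filters

      class SliceFilter:
          def amend(self, c: Contract) -> Contract:
              c.variables.add("pressure")     # needs one field to slice
              return c

      class ExternalFacesFilter:
          def amend(self, c: Contract) -> Contract:
              c.ghost_layers = max(c.ghost_layers, 1)  # needs neighbours
              return c

      pipeline = [SliceFilter(), ExternalFacesFilter()]
      contract = Contract()
      for f in reversed(pipeline):    # negotiate from sink back to source
          contract = f.amend(contract)
      print(contract)                 # the reader honours the final contract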

  15. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    SciTech Connect

    Vierow, Karen; Aldemir, Tunc

    2009-09-10

    The project entitled, “Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors”, was conducted as a DOE NERI project collaboration between Texas A&M University and The Ohio State University between March 2006 and June 2009. The overall goal of the proposed project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  16. Using Next Generation Science Standards (NGSS) Practices to Address Scientific Misunderstandings Around Complex Environmental Issues

    NASA Astrophysics Data System (ADS)

    Turrin, M.; Kenna, T. C.

    2014-12-01

    The new NGSS provide an important opportunity for scientists to develop curricula that link the practice of science to research-based data in order to improve understanding in areas of science that are both complex and confusing. Our curriculum focuses in particular on the fate and transport of anthropogenic radionuclides. Radioactivity, both naturally occurring and anthropogenic, is highly debated and, for large sections of the population, a source of scientific misunderstanding. Developed as part of the international GEOTRACES project, which focuses on identifying ocean processes and quantifying fluxes that control the distributions of selected trace elements and isotopes in the ocean, and on establishing the sensitivity of these distributions to changing environmental conditions, the curriculum topic fits nicely into the applied focus of NGSS, with both environmental and topical relevance. Our curriculum design focuses on small-group discussion driven by questions, yet unlike more traditional curriculum pieces these are not questions posed to the students; rather, they are questions posed by the students to facilitate their deeper understanding. Our curriculum design challenges the traditional question/answer memorization approach to instruction as we strive to develop an educational approach that supports the practice of science as well as the NGSS Cross Cutting Concepts and the Science & Engineering Practices. Our goal is for students to develop a methodology they can employ when faced with a complex scientific issue. Through background readings and team discussions they identify what type of information is important for them to know and where to find a reliable source for that information. Framing their discovery around key questions such as "What type of radioactive decay are we dealing with?", "What is the potential half-life of the isotope?", and "What are the pathways of transport of radioactivity?" allows students to evaluate a
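
    The half-life question above leads directly to a short, checkable calculation. The sketch below (Python) evaluates the exponential-decay law for the fraction of a radionuclide remaining after a given time; the 30.1-year value is roughly the half-life of Cs-137 and is used purely for illustration, not taken from the curriculum itself.

      # Worked example of the "half-life" question: what fraction of a
      # radionuclide remains after t years? Values are illustrative only.
      import math

      half_life_years = 30.1          # roughly the half-life of Cs-137

      def fraction_remaining(t_years, half_life):
          # N(t)/N0 = (1/2)^(t / t_half) = exp(-ln(2) * t / t_half)
          return math.exp(-math.log(2.0) * t_years / half_life)

      for t in (10, 30, 100):
          print(f"after {t:>3} years: {fraction_remaining(t, half_life_years):.1%} remains")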

  17. Building non-traditional collaborations to innovatively address climate-related scientific and management needs

    NASA Astrophysics Data System (ADS)

    Bamzai, A.; Mcpherson, R. A.

    2014-12-01

    The South Central Climate Science Center (SC-CSC) is one of eight regional centers formed by the U.S. Department of the Interior in order to provide decision makers with the science, tools, and information they need to address the impacts of climate variability and change on their areas of responsibility. The SC-CSC is operated through the U.S. Geological Survey, in partnership with a consortium led by the University of Oklahoma that also includes Texas Tech University, Oklahoma State University, Louisiana State University, the Chickasaw Nation, the Choctaw Nation of Oklahoma, and NOAA's Geophysical Fluid Dynamics Lab (GFDL). The SC-CSC is distinct from all other CSCs in that we have strategically included non-traditional collaborators directly within our governing consortium. The SC-CSC is the only CSC to include any Tribal nations amongst our consortium (the Chickasaw Nation and the Choctaw Nation of Oklahoma) and to employ a full-time tribal liaison. As a result and in partnership with Tribes, we are able to identify the unique challenges that the almost 70 federally recognized Tribes within our region face. We also can develop culturally sensitive research projects or outreach efforts that bridge western science and traditional knowledge to address their needs. In addition, the SC-CSC is the only CSC to include another federal institution (GFDL) amongst our consortium membership. GFDL is a world-leader in climate modeling and model interpretation. Partnering GFDL's expertise in the evaluation of climate models and downscaling methods with the SC-CSC's stakeholder-driven approach allows for the generation and dissemination of guidance documents and training to accompany the high quality datasets already in development. This presentation will highlight the success stories and co-benefits of the SC-CSC's collaborations with Tribal nations and with GFDL, as well as include information on how other partners can connect to our ongoing efforts.

  18. How can present and future satellite missions support scientific studies that address ocean acidification?

    USGS Publications Warehouse

    Salisbury, Joseph; Vandemark, Douglas; Jonsson, Bror; Balch, William; Chakraborty, Sumit; Lohrenz, Steven; Chapron, Bertrand; Hales, Burke; Mannino, Antonio; Mathis, Jeremy T.; Reul, Nicolas; Signorini, Sergio; Wanninkhof, Rik; Yates, Kimberly K.

    2016-01-01

    Space-based observations offer unique capabilities for studying spatial and temporal dynamics of the upper ocean inorganic carbon cycle and, in turn, supporting research tied to ocean acidification (OA). Satellite sensors measuring sea surface temperature, color, salinity, wind, waves, currents, and sea level enable a fuller understanding of a range of physical, chemical, and biological phenomena that drive regional OA dynamics as well as the potentially varied impacts of carbon cycle change on a broad range of ecosystems. Here, we update and expand on previous work that addresses the benefits of space-based assets for OA and carbonate system studies. Carbonate chemistry and the key processes controlling surface ocean OA variability are reviewed. Synthesis of present satellite data streams and their utility in this arena are discussed, as are opportunities on the horizon for using new satellite sensors with increased spectral, temporal, and/or spatial resolution. We outline applications that include the ability to track the biochemically dynamic nature of water masses, to map coral reefs at higher resolution, to discern functional phytoplankton groups and their relationships to acid perturbations, and to track processes that contribute to acid variation near the land-ocean interface.

  19. Eliciting climate experts' knowledge to address model uncertainties in regional climate projections: a case study of Guanacaste, Northwest Costa Rica

    NASA Astrophysics Data System (ADS)

    Grossmann, I.; Steyn, D. G.

    2014-12-01

    Global general circulation models typically cannot provide the detailed and accurate regional climate information required by stakeholders for climate adaptation efforts, given their limited capacity to resolve the regional topography and changes in local sea surface temperature, wind and circulation patterns. The study region in Northwest Costa Rica has a tropical wet-dry climate with a double-peak wet season. During the dry season the central Costa Rican mountains prevent tropical Atlantic moisture from reaching the region. Most of the annual precipitation is received following the northward migration of the ITCZ in May, which allows the region to benefit from moist southwesterly flow from the tropical Pacific. The wet season begins with a short period of "early rains" and is interrupted by the mid-summer drought associated with the intensification and westward expansion of the North Atlantic subtropical high in late June. Model projections for the 21st century indicate a lengthening and intensification of the mid-summer drought and a weakening of the early rains on which current crop cultivation practices rely. We developed an expert elicitation to systematically address uncertainties in the available model projections of changes in the seasonal precipitation pattern. Our protocol extends an elicitation approach developed previously at Carnegie Mellon University. Experts in the climate of the study region or Central American climate were asked to assess the mechanisms driving precipitation during each part of the season, uncertainties regarding these mechanisms, expected changes in each mechanism in a warming climate, and the capacity of current models to reproduce these processes. To avoid overconfidence bias, a step-by-step procedure was followed to estimate changes in the timing and intensity of precipitation during each part of the season. The questions drew upon interviews conducted with the region's stakeholders to assess their climate information needs. This
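
    To make the aggregation step concrete, the sketch below shows one generic way elicited quantiles from several experts could be pooled into a single distribution (a weighted linear opinion pool with normal fits to the elicited percentiles). This is not the Carnegie Mellon protocol or the authors' scheme; the experts, weights, and quantiles are invented.

      # Illustrative aggregation of elicited expert judgments via a linear
      # opinion pool; experts, weights, and quantiles are invented.
      import numpy as np

      # Each expert's elicited 5th/50th/95th percentiles for the change in
      # mid-summer-drought length (days), hypothetical numbers:
      experts = {
          "expert_A": (2.0, 8.0, 15.0),
          "expert_B": (0.0, 5.0, 12.0),
          "expert_C": (4.0, 10.0, 20.0),
      }
      weights = {"expert_A": 0.4, "expert_B": 0.3, "expert_C": 0.3}

      rng = np.random.default_rng(0)

      def sample_expert(p5, p50, p95, n):
          # Crude normal fit to the elicited quantiles: the 5th and 95th
          # percentiles sit about 1.645 sigma from the median.
          sigma = (p95 - p5) / (2 * 1.645)
          return rng.normal(p50, sigma, n)

      n = 100_000
      pooled = np.concatenate([
          sample_expert(*q, int(weights[name] * n)) for name, q in experts.items()
      ])
      print("pooled median: %.1f days" % np.median(pooled))
      print("pooled 90%% interval: %.1f to %.1f days"
            % tuple(np.percentile(pooled, [5, 95])))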

  20. Trends in scientific activity addressing transmissible spongiform encephalopathies: a bibliometric study covering the period 1973–2002

    PubMed Central

    Sanz-Casado, Elías; Ramírez-de Santa Pau, Margarita; Suárez-Balseiro, Carlos A; Iribarren-Maestro, Isabel; de Pedro-Cuesta, Jesús

    2006-01-01

    Background The purpose of this study is to analyse the trends in scientific research on transmissible spongiform encephalopathies by applying bibliometric tools to the scientific literature published between 1973 and 2002. Methods The data for the study were obtained from the Medline database, in order to determine the volume of scientific output in the above period, the countries involved, the type of document and the trends in the subject matters addressed. The period 1973–2002 was divided into three sub-periods. Results We observed a significant growth in scientific production: output increased by 871.7% between 1973 and 2002. Growth is more evident from 1991 onwards, particularly in the 1996–2001 period. The countries found to have the highest output were the United States, the United Kingdom, Japan, France and Germany. The evolution in the subject matters was almost constant across the three sub-periods into which the study was divided. In the first and second sub-periods, the subject matters of greatest interest were more general, i.e., Nervous system or Nervous system diseases, Creutzfeldt-Jakob disease, Scrapie, and Chemicals and Drugs, but in the last sub-period some changes were observed, because the Prion-related matters had the greatest presence. Collaboration among authors was limited from 1973 to 1992 but increased notably in the third sub-period, as did the number of authors and clusters formed. Some of the authors, like Gajdusek or Prusiner, appear throughout the whole period. Conclusion The study reveals a very high increase in scientific production. This growth is related to the beginnings of research on bovine spongiform encephalopathy and variant Creutzfeldt-Jakob disease, to the establishment of progressively closer collaboration, and to public health concerns about this problem. PMID:17026743

  1. The Uncertainty of Coastal Water Colour Products of S3: Implications for Scientific Applications and Monitoring

    NASA Astrophysics Data System (ADS)

    Doerffer, Roland; Brockmann, Carsten; Krasemann, Hajo; Muller, Dagmar

    2015-12-01

    This paper presents neural network based procedures to identify reflectance spectra, which are out of scope of the retrieval algorithm, and to determine uncertainties of OLCI products of optically complex coastal waters. It discusses the limited information content of reflectance spectra and presents examples how to improve the utilisation of these products by indicating their limitations and uncertainties for different types of waters.
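
    The paper's neural-network screening is only summarized here, so the sketch below illustrates the general idea of out-of-scope detection with a much simpler stand-in: flag a reflectance spectrum whose distance to the retrieval algorithm's training set exceeds a calibrated threshold. All data, dimensions, and thresholds are synthetic, and this is not the OLCI scheme itself.

      # Generic novelty-detection stand-in for out-of-scope spectra:
      # flag inputs far from the algorithm's training set.
      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical training set: 500 spectra at 10 bands (reflectance fractions).
      train = rng.uniform(0.0, 0.1, size=(500, 10))

      def out_of_scope(spectrum, train, k=10, threshold=None):
          # Distance to the k-th nearest training spectrum as a novelty score.
          score = np.sort(np.linalg.norm(train - spectrum, axis=1))[k - 1]
          if threshold is None:
              # Calibrate: 99th percentile of the same score within the
              # training data (index k skips each spectrum's zero self-distance).
              scores = [np.sort(np.linalg.norm(train - x, axis=1))[k]
                        for x in train[:100]]
              threshold = np.percentile(scores, 99)
          return score > threshold, score

      flag, score = out_of_scope(rng.uniform(0.3, 0.5, 10), train)
      print("out of scope:", flag, "score:", round(float(score), 3))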

  2. Implementation of Scientific Community Laboratories and Their Effect on Student Conceptual Learning, Attitudes, and Understanding of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lark, Adam

    Scientific Community Laboratories, developed by The University of Maryland, have shown initial promise as laboratories meant to emulate the practice of doing physics. These laboratories have been re-created by incorporating their design elements with the University of Toledo course structure and resources. The laboratories have been titled the Scientific Learning Community (SLC) Laboratories. A comparative study between these SLC laboratories and the University of Toledo physics department's traditional laboratories was executed during the fall 2012 semester on first-semester calculus-based physics students. Three tests were administered as pre-tests and post-tests to capture the change in students' concept knowledge, attitudes, and understanding of uncertainty. The Force Concept Inventory (FCI) was used to evaluate students' conceptual changes through the semester, and average normalized gains were compared between the traditional and SLC laboratories. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) was conducted to elucidate students' change in attitudes through the course of each laboratory. Finally, interviews regarding data analysis and uncertainty were transcribed and coded to track changes in the way students understand uncertainty and data analysis in experimental physics after their participation in both laboratory types. Students in the SLC laboratories showed a notable increase in conceptual knowledge and attitudes when compared with students in the traditional laboratories. SLC students' understanding of uncertainty showed the most improvement, diverging completely from that of students in the traditional laboratories, whose understanding declined throughout the semester.

  3. Beyond Sound Bites and News Quotes: NSIDC's Arctic Sea Ice and News Analysis Web blog and Scientific Uncertainty

    NASA Astrophysics Data System (ADS)

    Beitler, J.; Vizcarra, N.; Scambos, T. A.; Meier, W.

    2013-12-01

    The public, the media, and policymakers often turn to scientists for definite answers about Earth's changing climate. Researchers, in turn, offer observations with the caveat that there is always an amount of uncertainty in the data and the analysis. More or better data could modify trends. Random processes in Earth's most complex systems, like clouds, could contradict current theories about how they evolve and develop into weather systems that could put lives in danger. Often, these careful answers are reduced to a sound bite on TV or a quote in a newspaper report, and the public gets an answer, but the message of scientific uncertainty is almost always omitted. Arctic sea ice has long been recognized as a sensitive climate indicator, and has undergone a dramatic decline over the past thirty years. The National Snow and Ice Data Center has evolved into a top source for sea ice data and analysis. How does NSIDC respond to increasing questions about Arctic sea ice and climate, and how does it communicate scientific uncertainty? This poster examines the ways that NSIDC's Arctic Sea Ice News and Analysis (ASINA) Web blog offers the public a transparent view of sea ice analysis, with discussions of scientific uncertainty that are lost in news reportage. It highlights the Web blog's interactive sea ice graph called ChArctic, which the public can use to explore sea ice data by year and make their own observations. It also discusses issues around the replacement of NSIDC's 20-year baseline with a new 30-year baseline for analyzing sea ice.

  4. Trapped Between Two Tails: Trading Off Scientific Uncertainties via Climate Targets

    SciTech Connect

    Lemoine, Derek M.; McJeon, Haewon C.

    2013-08-20

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 ppm and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  5. Trapped between two tails: trading off scientific uncertainties via climate targets

    NASA Astrophysics Data System (ADS)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  6. The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary

    ERIC Educational Resources Information Center

    Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul

    2012-01-01

    Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…

  7. MEETING IN TUCSON: MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental dec...

  8. MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental de...

  9. Quantifying Carbon Financial Risk in the International Greenhouse Gas Market: An Application Using Remotely-Sensed Data to Align Scientific Uncertainty with Financial Decisions

    NASA Astrophysics Data System (ADS)

    Hultman, N. E.

    2002-12-01

    A common complaint about environmental policy is that regulations inadequately reflect scientific uncertainty and scientific consensus. While the causes of this phenomenon are complex and hard to discern, we know that corporations are the primary implementers of environmental regulations; therefore, focusing on how policy relates scientific knowledge to corporate decisions can provide valuable insights. Within the context of the developing international market for greenhouse gas emissions, I examine how corporations would apply finance theory to their investment decisions for carbon abatement projects. Using remotely sensed, ecosystem-scale carbon flux measurements, I show how to determine how much of carbon's financial risk is diversifiable. I also discuss alternative, scientifically sound methods for hedging the non-diversifiable risks in carbon abatement projects. By providing a quantitative common language for scientific and corporate uncertainties, the concept of carbon financial risk offers an opportunity for expanding communication between these elements essential to successful climate policy.
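
    The finance-theory step can be illustrated with a standard one-factor variance decomposition: the portion of a project's return variance explained by a market factor is non-diversifiable, and the residual is diversifiable. The sketch below uses synthetic returns and an ordinary-least-squares beta; it is a textbook stand-in, not the author's remote-sensing-based analysis.

      # Split the variance of project carbon returns into a market-correlated
      # (non-diversifiable) part and a residual (diversifiable) part.
      # All data are synthetic.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 1000
      market = rng.normal(0.0, 1.0, n)                        # market factor returns
      beta_true = 0.3
      project = beta_true * market + rng.normal(0.0, 0.8, n)  # project returns

      cov = np.cov(project, market)           # consistent sample (co)variances
      beta = cov[0, 1] / cov[1, 1]            # OLS beta on the market factor
      systematic_var = beta**2 * cov[1, 1]
      total_var = cov[0, 0]
      diversifiable_var = total_var - systematic_var

      print(f"systematic (priced) share: {systematic_var / total_var:.1%}")
      print(f"diversifiable share:       {diversifiable_var / total_var:.1%}")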

  10. Effectiveness and Tradeoffs between Portfolios of Adaptation Strategies Addressing Future Climate and Socioeconomic Uncertainties in California's Central Valley

    NASA Astrophysics Data System (ADS)

    Tansey, M. K.; Van Lienden, B.; Das, T.; Munevar, A.; Young, C. A.; Flores-Lopez, F.; Huntington, J. L.

    2013-12-01

    The Central Valley of California is one of the major agricultural areas in the United States. The Central Valley Project (CVP) is operated by the Bureau of Reclamation to serve multiple purposes including generating approximately 4.3 million gigawatt hours of hydropower and providing, on average, 5 million acre-feet of water per year to irrigate approximately 3 million acres of land in the Sacramento, San Joaquin, and Tulare Lake basins, 600,000 acre-feet per year of water for urban users, and 800,000 acre-feet of annual supplies for environmental purposes. The development of effective adaptation and mitigation strategies requires assessing multiple risks including potential climate changes as well as uncertainties in future socioeconomic conditions. In this study, a scenario-based analytical approach was employed by combining three potential 21st century socioeconomic futures with six representative climate and sea level change projections developed using a transient hybrid delta ensemble method from an archive of 112 bias-corrected, spatially downscaled CMIP3 global climate model simulations to form 18 future socioeconomic-climate scenarios. To better simulate the effects of climate changes on agricultural water demands, analyses of historical agricultural meteorological station records were employed to develop estimates of future changes in solar radiation and atmospheric humidity from the GCM-simulated temperature and precipitation. Projected changes in atmospheric carbon dioxide were computed directly by weighting SRES emissions scenarios included in each representative climate projection. These results were used as inputs to a calibrated crop water use, growth and yield model to simulate the effects of climate changes on the evapotranspiration and yields of major crops grown in the Central Valley. Existing hydrologic, reservoir operations, water quality, hydropower, greenhouse gas (GHG) emissions and both urban and agricultural economic models were integrated

  11. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes

    PubMed Central

    Curtis, Janelle M.R.

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along

  12. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    PubMed

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along
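
    For readers unfamiliar with global (as opposed to one-at-a-time) sensitivity analysis, the sketch below computes first-order Sobol indices with the standard Saltelli pick-and-freeze estimator on a toy three-parameter surrogate. The surrogate and parameter names are invented for illustration; GRIP 2.0 itself operates on full coupled SDM-population dynamics models.

      # Variance-based global sensitivity sketch: first-order Sobol indices
      # via the Saltelli pick-and-freeze estimator, on a toy model.
      import numpy as np

      def model(x):
          # Toy extinction-risk surrogate: habitat amount, adult survival,
          # disease pressure (all scaled to [0, 1]). Invented, not GRIP 2.0.
          habitat, survival, disease = x[:, 0], x[:, 1], x[:, 2]
          return 1.0 - habitat * survival * (1.0 - 0.5 * disease)

      rng = np.random.default_rng(7)
      N, d = 100_000, 3
      A = rng.uniform(size=(N, d))
      B = rng.uniform(size=(N, d))
      fA, fB = model(A), model(B)
      total_var = np.var(np.concatenate([fA, fB]))

      for i, name in enumerate(["habitat", "survival", "disease"]):
          ABi = A.copy()
          ABi[:, i] = B[:, i]          # "freeze" all columns except i
          S_i = np.mean(fB * (model(ABi) - fA)) / total_var
          print(f"first-order index, {name}: {S_i:.2f}")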

  13. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
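
    Kleb's first proposal, documenting input-parameter uncertainties in situ, could plausibly look like the following: a value object that carries its uncertainty and provenance wherever the parameter is used. This is a hypothetical sketch of the idea, not the mechanism proposed in the paper.

      # One possible (hypothetical) realization of documenting input-parameter
      # uncertainty "in situ": the uncertainty travels with the value in code.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class UncertainParam:
          name: str
          value: float
          plus_minus: float     # documented (e.g., 1-sigma) uncertainty
          source: str           # provenance: citation, calibration report, ...

          def __str__(self):
              return f"{self.name} = {self.value} ± {self.plus_minus} [{self.source}]"

      # A model input declares its uncertainty where it is defined:
      sutherland_c1 = UncertainParam(
          "sutherland_c1", 1.458e-6, 0.005e-6, "assumed; cite measurement here")

      print(sutherland_c1)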

  14. Scientific uncertainty and the role of expert advice: the case of health checks for coronary heart disease prevention by general practitioners in the UK.

    PubMed

    Florin, D

    1999-11-01

    This paper examines some of the ways in which scientific evidence influenced the development of the policy for the payment of general practitioners for coronary heart disease (CHD) prevention in the UK, in particular the introduction of 'health checks'. The specific policy events examined are the 1990 and 1993 contracts for health promotion by general practitioners. Data for this paper were provided by oral history interviews with key informants including general practitioners, public health doctors, civil servants and academics. The study shows the way in which complex scientific evidence interacted with other, professional and political, factors to produce a policy for which there was variable scientific evidence. The relationship between science and policy was complicated and tortuous, but two aspects were particularly salient: the way in which scientific uncertainty influenced the content of the policy, and the contribution of expert advice to policy making. The existence of social and technical uncertainty about the effectiveness of health checks allowed different players to hold different views depending on their professional affiliation or other agendas. Scientific advice reached policy makers primarily through medical civil servants and through informal contacts and networks. There was no independent, systematic, formal mechanism, such as an expert committee, to assess and disseminate scientific advice to policy makers. These factors in turn allowed policy makers to ignore or misinterpret scientific evidence according to other policy imperatives. PMID:10501646

  15. Addressing solar modulation and long-term uncertainties in scaling secondary cosmic rays for in situ cosmogenic nuclide applications [rapid communication

    NASA Astrophysics Data System (ADS)

    Lifton, Nathaniel A.; Bieber, John W.; Clem, John M.; Duldig, Marc L.; Evenson, Paul; Humble, John E.; Pyle, Roger

    2005-10-01

    Solar modulation affects the secondary cosmic rays responsible for in situ cosmogenic nuclide (CN) production the most at the high geomagnetic latitudes to which CN production rates are traditionally referenced. While this has long been recognized (e.g., D. Lal, B. Peters, Cosmic ray produced radioactivity on the Earth, in: K. Sitte (Ed.), Handbuch Der Physik XLVI/2, Springer-Verlag, Berlin, 1967, pp. 551-612 and D. Lal, Theoretically expected variations in the terrestrial cosmic ray production rates of isotopes, in: G.C. Castagnoli (Ed.), Proceedings of the Enrico Fermi International School of Physics 95, Italian Physical Society, Varenna 1988, pp. 216-233), these variations can lead to potentially significant scaling model uncertainties that have not been addressed in detail. These uncertainties include the long-term (millennial-scale) average solar modulation level to which secondary cosmic rays should be referenced, and short-term fluctuations in cosmic ray intensity measurements used to derive published secondary cosmic ray scaling models. We have developed new scaling models for spallogenic nucleons, slow-muon capture and fast-muon interactions that specifically address these uncertainties. Our spallogenic nucleon scaling model, which includes data from portions of 5 solar cycles, explicitly incorporates a measure of solar modulation ( S), and our fast- and slow-muon scaling models (based on more limited data) account for solar modulation effects through increased uncertainties. These models improve on previously published models by better sampling the observed variability in measured cosmic ray intensities as a function of geomagnetic latitude, altitude, and solar activity. Furthermore, placing the spallogenic nucleon data in a consistent time-space framework allows for a more realistic assessment of uncertainties in our model than in earlier ones. We demonstrate here that our models reasonably account for the effects of solar modulation on measured

  16. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated groundwater at Beale Air Force Base in California

    SciTech Connect

    Bogen, K T; Daniels, J I; Hall, L C

    1999-09-01

    This study was designed to accomplish two objectives. The first was to provide the US Air Force and the regulatory community with quantitative procedures they might consider using to address uncertainty and variability in exposure and thereby better characterize potential health risk. Such methods could be used at sites where populations may, now or in the future, be faced with using groundwater contaminated with low concentrations of the chemical trichloroethylene (TCE). The second was to illustrate and explain the application of these procedures to available data for TCE in groundwater beneath an inactive landfill site undergoing remediation at Beale Air Force Base in California. The results from this illustration provide more detail than traditional conservative, deterministic, screening-level calculations of risk, which were also computed for comparison. Application of the procedures described in this report can lead to more reasonable and equitable risk-acceptability criteria for potentially exposed populations at specific sites.

  17. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated ground water at Beale Air Force Base in California

    SciTech Connect

    Daniels, J I; Bogen, K T; Hall, L C

    1999-10-05

    Conservative, deterministic, screening-level calculations of exposure and risk are commonly used in quantitative assessments of potential human-health consequences from contaminants in environmental media. However, these calculations are generally based on multiple upper-bound point estimates of input parameters, particularly for exposure attributes, and can therefore produce results for decision makers that actually overstate the need for costly remediation. Alternatively, a more informative and quantitative characterization of health risk can be obtained by quantifying uncertainty and variability in exposure. This process is illustrated in this report for a hypothetical population at a specific site at Beale Air Force Base in California, where there is trichloroethylene (TCE)-contaminated ground water and a potential for future residential use. When uncertainty and variability in exposure were addressed jointly for this case, the 95th-percentile upper-bound value of individual excess lifetime cancer risk was nearly a factor of 10 lower than the most conservative deterministic estimate. Additionally, the probability of more than zero additional cases of cancer can be estimated, and in this case it is less than 0.5 for a hypothetical future residential population of up to 26,900 individuals present for any 7.6-y interval of a 70-y time period. Clearly, the results from application of this probabilistic approach can provide reasonable and equitable risk-acceptability criteria for a contaminated site.
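
    The joint treatment of uncertainty and variability described above is commonly implemented as a two-dimensional (nested) Monte Carlo, sketched below: an outer loop samples uncertain quantities, an inner loop samples interindividual variability, and percentiles are taken in each dimension. The distributions and numbers here are illustrative stand-ins, not the Beale AFB inputs.

      # Two-dimensional Monte Carlo sketch: outer loop = uncertainty
      # (e.g., toxicological potency), inner loop = variability
      # (e.g., individual water intake). Numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(3)
      n_outer, n_inner = 500, 2000

      upper_bounds = []
      for _ in range(n_outer):                      # uncertainty loop
          potency = rng.lognormal(mean=np.log(1e-6), sigma=0.5)  # risk per unit dose
          intake = rng.lognormal(mean=np.log(1.4), sigma=0.3, size=n_inner)  # L/day
          conc = 5.0                                # ug/L, fixed scenario value
          dose = conc * intake / 70.0               # ug/kg-day for a 70-kg adult
          risk = potency * dose                     # individual lifetime risk
          upper_bounds.append(np.percentile(risk, 95))   # variability percentile

      # Uncertainty about the 95th-percentile individual's risk:
      print("95th-percentile risk, median estimate: %.2e" % np.median(upper_bounds))
      print("95th-percentile risk, 95%% upper bound: %.2e"
            % np.percentile(upper_bounds, 95))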

  18. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk from Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California:Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K T

    2001-05-24

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability within a systematic probabilistic framework to integrate the joint effects on risk of distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such a framework was used to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_c) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_c based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and <10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and >10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The systematic probabilistic framework illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  19. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K.T.

    1999-09-29

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_c) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_c based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and <10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and >10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  20. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2011-08-01

    A wide variety of different marine plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. The Marine Model Optimization Testbed is a new software tool designed for rigorous analysis of plankton models in a multi-site 1-D framework, in particular to address uncertainty issues in model assessment. A flexible user interface ensures its suitability to more general inter-comparison, sensitivity and uncertainty analyses, including model comparison at the level of individual processes, and to state estimation for specific locations. The principal features of MarMOT are described and its application to model calibration is demonstrated by way of a set of twin experiments, in which synthetic observations are assimilated in an attempt to recover the true parameter values of a known system. The experimental aim is to investigate the effect of different misfit weighting schemes on parameter recovery in the presence of error in the plankton model's environmental input data. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergences of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error over an annual cycle, indicating

  1. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT 1.1 alpha)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2012-04-01

    A wide variety of different plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. Parameter uncertainty has been widely addressed by calibrating models at data-rich ocean sites. However, relatively little attention has been given to quantifying uncertainty in the physical fields required by the plankton models at these sites, and tendencies in the biogeochemical properties due to the effects of horizontal processes are often neglected. Here we use model twin experiments, in which synthetic data are assimilated to estimate a system's known "true" parameters, to investigate the impact of error in a plankton model's environmental input data. The experiments are supported by a new software tool, the Marine Model Optimization Testbed, designed for rigorous analysis of plankton models in a multi-site 1-D framework. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergence tendencies of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error variance over an annual cycle, indicating variation in the significance attributable to individual model-data differences. An inverse scheme using ensemble-based estimates of the
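
    A twin experiment of the kind described can be reduced to a few lines: simulate a known "true" system, add observation noise, and check whether calibration recovers the true parameters. The sketch below uses a one-variable growth model as a stand-in for a plankton model; it is an illustration of the twin-experiment idea, not MarMOT code.

      # Minimal twin-experiment sketch: recover known parameters from
      # noisy synthetic observations. The model is a toy stand-in.
      import numpy as np
      from scipy.optimize import least_squares

      def simulate(params, t):
          growth, mortality = params
          p = np.empty(len(t), dtype=float)
          p[0] = 0.1
          for i in range(1, len(t)):            # simple Euler step, dt = 1 day
              p[i] = p[i-1] + growth * p[i-1] * (1 - p[i-1]) - mortality * p[i-1]
          return p

      rng = np.random.default_rng(5)
      t = np.arange(120)
      true_params = (0.3, 0.05)
      obs = simulate(true_params, t) + rng.normal(0, 0.02, len(t))  # synthetic data

      fit = least_squares(lambda p: simulate(p, t) - obs, x0=[0.1, 0.1],
                          bounds=([0, 0], [1, 1]))
      print("true:", true_params, "recovered:", np.round(fit.x, 3))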

  2. Science and Theatre Education: A Cross-disciplinary Approach of Scientific Ideas Addressed to Student Teachers of Early Childhood Education

    NASA Astrophysics Data System (ADS)

    Tselfes, Vasilis; Paroussi, Antigoni

    2009-09-01

    There is, in Greece, an ongoing attempt to breach the boundaries established between the different teaching-learning subjects of compulsory education. In this context, we are interested in exploring to what degree the teaching and learning of ideas from the sciences’ “internal life” (Hacking, in: Pickering (ed) Science as practice and culture, 1992) benefits from creatively coming into contact with theatrical education as part of the corresponding curriculum subject. To this end, 57 students of the Early Childhood Education Department of the University of Athens were asked to study extracts from Galileo’s Dialogue Concerning the Two Chief World Systems, Ptolemaic and Copernican, to focus on a subject that the Dialogue’s “interlocutors” forcefully disagree about, and to represent theatrically (using shadow theatre techniques) what they considered to be the central idea of this clash of opinions. The results indicate that this attempt leads to a satisfactory understanding of ideas relating to the content and methodology of the natural sciences. At the same time, theatrical education avails itself of the representation of scientific ideas while avoiding the clichés and hackneyed techniques towards which the (often) simplistic choices available in early childhood education tend. The basic reasons for both facets of this success are: (a) Genuine scientific texts force the students to approach them with seriousness, and all the more so if these recount the manner in which scientific ideas are produced and are embedded in the historical and social context of the age that created them; (b) The theatrical framework, which essentially guides the students’ activities, allows (if not obliges) them to approach scientific issues creatively; in other words, it allows them to create something related to science and recognize it as theirs; and, (c) Both the narrative texts describing processes of “science making” (Bruner, J Sci Educ

  3. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  4. Scientific uncertainty as a moderator of the relationship between descriptive norm and intentions to engage in cancer risk-reducing behaviors.

    PubMed

    Kim, Hye Kyung; Kim, Sooyeon; Niederdeppe, Jeff

    2015-04-01

    This study examined motivational factors underlying six behaviors with varying levels of scientific uncertainty with regard to their effectiveness in reducing cancer risk. Making use of considerable within-subjects variation, the authors examined the moderating role of the degree of scientific uncertainty about the effectiveness of cancer risk-reducing behaviors in shaping relationships between constructs in the Integrative Model of Behavioral Prediction (Fishbein & Yzer, 2003 ). In cross-sectional data (n = 601), the descriptive norm-intention relationship was stronger for scientifically uncertain behaviors such as avoiding BPA plastics and using a hands-free mobile phone headset than for established behaviors (e.g., avoiding smoking, fruit and vegetable intake, exercise, and applying sunscreen). This pattern was partially explained by the mediating role of injunctive norms between descriptive norm and intentions, as predicted by the extended Theory of Normative Social Behavior (Rimal, 2008 ). For behaviors more clearly established as an effective means to reduce the risk of cancer, self-efficacy was significantly more predictive of intentions to perform such behaviors. The authors discuss practical implications of these findings and theoretical insights into better understanding the role of normative components in the adaptation of risk-reduction behaviors. PMID:25730742

  5. Identification and evaluation of scientific uncertainties related to fish and aquatic resources in the Colorado River, Grand Canyon - summary and interpretation of an expert-elicitation questionnaire

    USGS Publications Warehouse

    Kennedy, Theodore A.

    2013-01-01

    Identifying areas of scientific uncertainty is a critical step in the adaptive management process (Walters, 1986; Runge, Converse, and Lyons, 2011). To identify key areas of scientific uncertainty regarding biologic resources of importance to the Glen Canyon Dam Adaptive Management Program, the Grand Canyon Monitoring and Research Center (GCMRC) convened Knowledge Assessment Workshops in May and July 2005. One of the products of these workshops was a set of strategic science questions that highlighted key areas of scientific uncertainty. These questions were intended to frame and guide the research and monitoring activities conducted by the GCMRC in subsequent years. Questions were developed collaboratively by scientists and managers. The questions were not all of equal importance or merit—some questions were large scale and others were small scale. Nevertheless, these questions were adopted and have guided the research and monitoring efforts conducted by the GCMRC since 2005. A new round of Knowledge Assessment Workshops was convened by the GCMRC in June and October 2011 and January 2012 to determine whether the research and monitoring activities conducted since 2005 had successfully answered some of the strategic science questions. Oral presentations by scientists highlighting research findings were a centerpiece of all three of the 2011–12 workshops. Each presenter was also asked to provide an answer to the strategic science questions that were specific to the presenter’s research area. One limitation of this approach is that these answers represented the views of the handful of scientists who developed the presentations, and, as such, they did not incorporate other perspectives. Thus, the answers provided by presenters at the Knowledge Assessment Workshops may not have accurately captured the sentiments of the broader group of scientists involved in research and monitoring of the Colorado River in Glen and Grand Canyons. Yet a fundamental ingredient of

  6. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, and thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and to reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter

  7. Development of a physiologically-based pharmacokinetic model of 2-phenoxyethanol and its metabolite phenoxyacetic acid in rats and humans to address toxicokinetic uncertainty in risk assessment.

    PubMed

    Troutman, John A; Rick, David L; Stuard, Sharon B; Fisher, Jeffrey; Bartels, Michael J

    2015-11-01

    2-Phenoxyethanol (PhE) has been shown to induce hepatotoxicity, renal toxicity, and hemolysis at dosages ≥ 400 mg/kg/day in subchronic and chronic studies in multiple species. To reduce uncertainty associated with interspecies extrapolations and to evaluate the margin of exposure (MOE) for use of PhE in cosmetics and baby products, a physiologically-based pharmacokinetic (PBPK) model of PhE and its metabolite 2-phenoxyacetic acid (PhAA) was developed. The PBPK model incorporated key kinetic processes describing the absorption, distribution, metabolism and excretion of PhE and PhAA following oral and dermal exposures. Simulations of repeat-dose rat studies facilitated the selection of systemic AUC (area under the concentration-time curve) as the appropriate dose metric for evaluating internal exposures to PhE and PhAA in rats and humans. Use of the PBPK model resulted in refinement of the total default uncertainty factor (UF) for extrapolation of the animal data to humans from 100 to 25. Based on very conservative assumptions for product composition and aggregate product use, model-predicted exposures to PhE and PhAA resulting from adult and infant use of cosmetic products are significantly below the internal dose of PhE observed at the NOAEL in rats. Calculated MOEs for all exposure scenarios were above the PBPK-refined UF of 25. PMID:26188115
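
    A PBPK model ultimately reduces to coupled kinetic equations. The heavily simplified sketch below integrates a one-compartment parent (PhE) to metabolite (PhAA) chain and computes the AUC dose metric referred to above. The rate constants and dose are invented; a real PBPK model would resolve multiple tissues, exposure routes, and species-specific parameters.

      # Greatly simplified kinetic sketch in the spirit of a PBPK model:
      # one compartment each for parent and metabolite, AUC as dose metric.
      # Rate constants and dose are hypothetical.
      import numpy as np
      from scipy.integrate import solve_ivp

      k_met = 2.0      # 1/h, parent -> metabolite (invented)
      k_elim = 0.3     # 1/h, metabolite elimination (invented)

      def rhs(t, y):
          parent, metabolite = y
          return [-k_met * parent,
                  k_met * parent - k_elim * metabolite]

      dose = 10.0      # mg/kg bolus (illustrative)
      sol = solve_ivp(rhs, (0.0, 48.0), [dose, 0.0],
                      t_eval=np.linspace(0.0, 48.0, 2000))

      auc_parent = np.trapz(sol.y[0], sol.t)   # the dose metric discussed above
      auc_metab = np.trapz(sol.y[1], sol.t)
      print(f"AUC parent: {auc_parent:.2f}  AUC metabolite: {auc_metab:.2f} (mg*h/kg)")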

  8. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert-scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty ratings diminished as the dynamic climate models became increasingly complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources more frequently reflected their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from the models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  9. Opening address

    NASA Astrophysics Data System (ADS)

    Castagnoli, C.

    1994-01-01

    Ladies and Gentlemen, my cordial thanks to you for participating in our workshop and to all those who have sponsored it. When in 1957 I attended the International Congress on Fundamental Constants held in Turin on the occasion of the first centenary of the death of Amedeo Avogadro, I did not expect that about thirty-five years later a small but representative number of distinguished scientists would meet here again, to discuss how to go beyond the sixth decimal figure of the Avogadro constant. At that time, the uncertainty of the value of this constant was linked to the fourth decimal figure, as reported in the book by DuMond and Cohen. The progress made in the meantime is universally acknowledged to be due to the discovery of x-ray interferometry. We are honoured that one of the two founding fathers, Prof. Ulrich Bonse, is here with us, but we regret that the other, Prof. Michael Hart, is not present. After Bonse and Hart's discovery, the x-ray crystal density method triggered, as in a chain reaction, the investigation of two other quantities related to the Avogadro constant—density and molar mass. Scientists became, so to speak, resonant and since then have directed their efforts, just to mention a few examples, to producing near-perfect silicon spheres and determining their density, to calibrating, with increasing accuracy, mass spectrometers, and to studying the degree of homogeneity of silicon specimens. Obviously, I do not need to explain to you why the Avogadro constant is important. I wish, however, to underline that it is not only because of its position among fundamental constants, as we all know very well its direct links with the fine structure constant, the Boltzmann and Faraday constants, the h/e ratio, but also because when a new value of NA is obtained, the whole structure of the fundamental constants is shaken to a greater or lesser extent. Let me also remind you that the second part of the title of this workshop concerns the silicon

  10. Opening Address

    NASA Astrophysics Data System (ADS)

    Yamada, T.

    2014-12-01

    related fields such as nuclear astrophysics, hypernuclear physics, hadron physics, condensed matter physics, and so on. In fact, in this workshop we also discuss clustering aspects in these related fields. Thus, I expect that in this workshop we can grasp the present status of nuclear cluster physics and demonstrate its perspectives in the near future. This workshop is sponsored by several institutes and organizations. In particular, I would like to express our thanks for financial support to the Research Center for Nuclear Physics (RCNP), Osaka University; the Center for Nuclear Study (CNS), University of Tokyo; the Joint Institute for Computational Fundamental Science (JICFuS); and the RIKEN Nishina Center for Accelerator-Based Science. They are cohosting this workshop. I would also like to thank my university, Kanto Gakuin University, which has offered this fine venue for one week and helped us to hold this workshop smoothly and conveniently. Today, the president of my university, Prof. Kiku, is here to present a welcome address. Thank you very much. Finally, with many of the participants leading this field both in theory and in experiment, we hope this workshop offers an opportunity to stimulate communication not only during the workshop but also in the future. In addition, we hope you enjoy exploring the city of Yokohama and the surrounding area, as well as the scientific discussions. Thank you very much for your attention.

  11. Coping with Uncertainty.

    ERIC Educational Resources Information Center

    Wargo, John

    1985-01-01

    Draws conclusions on the scientific uncertainty surrounding most chemical use regulatory decisions, examining the evolution of law and science, benefit analysis, and improving information. Suggests: (1) rapid development of knowledge of chemical risks and (2) a regulatory system which is flexible to new scientific knowledge. (DH)

  12. Uncertainty and global climate change research

    SciTech Connect

    Tonn, B.E.; Weiher, R.

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of uncertainty surrounding global climate change has made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments using decision analytic techniques as a foundation are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified, prioritized, and their uncertainty characterized to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value of information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.
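
    The value-of-information methodologies recommended here reduce, in the simplest case, to an expected value of perfect information (EVPI) calculation; a minimal sketch, with hypothetical payoffs and outcome probabilities:

        # Minimal expected value of perfect information (EVPI) sketch for a
        # two-action climate policy choice; payoffs and probabilities assumed.
        import numpy as np

        p = np.array([0.3, 0.7])            # P(severe), P(mild) climate outcome
        payoff = np.array([[-10.0, -4.0],   # act now: severe, mild
                           [-30.0, -1.0]])  # delay:   severe, mild

        ev_no_info = (payoff @ p).max()     # best single action without info
        ev_perfect = (payoff.max(axis=0) * p).sum()  # best action per outcome
        print(f"EVPI = {ev_perfect - ev_no_info:.2f}")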

  13. Scientific millenarianism

    SciTech Connect

    Weinberg, A.M.

    1997-12-01

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO2 warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper.

  14. Addressing healthcare.

    PubMed

    Daly, Rich

    2013-02-11

    Though President Barack Obama has rarely made healthcare references in his State of the Union addresses, health policy experts are hoping he changes that strategy this year. "The question is: Will he say anything? You would hope that he would, given that that was the major issue he started his presidency with," says Dr. James Weinstein of the Dartmouth-Hitchcock health system. PMID:23487896

  15. Uncertainty in Measured Data and Model Predictions: Essential Components for Mobilizing Environmental Data and Modeling

    NASA Astrophysics Data System (ADS)

    Harmel, D.

    2014-12-01

    In spite of pleas for uncertainty analysis - such as Beven's (2006) "Should it not be required that every paper in both field and modeling studies attempt to evaluate the uncertainty in the results?" - the uncertainty associated with hydrology and water quality data is rarely quantified and rarely considered in model evaluation. This oversight, justified in the past mainly by tenuous philosophical concerns, diminishes the value of measured data and ignores the environmental and socio-economic benefits of improved decisions and policies based on data with estimated uncertainty. This oversight extends to researchers, who typically fail to estimate uncertainty in measured discharge and water quality data because of the additional effort required, a lack of adequate scientific understanding of the subject, and fear of negative perception if data with "high" uncertainty are reported; however, the benefits are certain. Furthermore, researchers have a responsibility for scientific integrity in reporting what is known and what is unknown, including the quality of measured data. In response, we produced an uncertainty estimation framework and the first cumulative uncertainty estimates for measured water quality data (Harmel et al., 2006). From that framework, DUET-H/WQ was developed (Harmel et al., 2009). Application to several real-world data sets indicated that substantial uncertainty can be contributed by each data collection procedural category and that uncertainties typically increase in the order discharge < sediment < dissolved N and P < total N and P. Similarly, modelers address certain aspects of model uncertainty but ignore others, such as the impact of uncertainty in discharge and water quality data. Thus, we developed methods to incorporate prediction uncertainty as well as calibration/validation data uncertainty into model goodness-of-fit evaluation (Harmel and Smith, 2007; Harmel et al., 2010). These enhance model evaluation by: appropriately sharing burden with "data
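
    In the spirit of the cumulative uncertainty framework of Harmel et al. (2006), per-category uncertainties can be combined in quadrature (root-sum-square); the category percentages below are hypothetical placeholders, not values from the cited papers.

        # Root-sum-square combination of procedural uncertainties, in the
        # spirit of the cumulative framework of Harmel et al. (2006).
        # The category percentages below are hypothetical placeholders.
        import math

        sources = {
            "discharge measurement": 6.0,   # percent
            "sample collection":     10.0,
            "sample preservation":   5.0,
            "laboratory analysis":   8.0,
        }
        cumulative = math.sqrt(sum(u**2 for u in sources.values()))
        print(f"cumulative uncertainty ~ {cumulative:.1f}%")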

  16. Benefits of dealing with uncertainty in greenhouse gas inventories: introduction

    SciTech Connect

    Jonas, Matthias; Winiwarter, Wilfried; Marland, Gregg; White, Thomas; Nahorski, Zbigniew; Bun, Rostyslav

    2010-01-01

    The assessment of greenhouse gases emitted to and removed from the atmosphere is high on the international political and scientific agendas. Growing international concern and cooperation regarding the climate change problem have increased the need for policy-oriented solutions to the issue of uncertainty in, and related to, inventories of greenhouse gas (GHG) emissions. The approaches to addressing uncertainty discussed in this Special Issue reflect attempts to improve national inventories, not only for their own sake but also from a wider, systems-analytical perspective, one that seeks to strengthen the usefulness of national inventories under a compliance and/or global monitoring and reporting framework. These approaches demonstrate the benefits of including inventory uncertainty in policy analyses. The authors of the contributed papers show that considering uncertainty helps avoid situations that can, for example, create a false sense of certainty or lead to invalid views of subsystems. This may eventually prevent related errors from showing up in analyses. However, considering uncertainty does not come for free. Proper treatment of uncertainty is costly and demanding because it forces us to make the step from 'simple to complex' and only then to discuss potential simplifications. Finally, comprehensive treatment of uncertainty does not offer policymakers quick and easy solutions. The authors of the papers in this Special Issue do, however, agree that uncertainty analysis must be a key component of national GHG inventory analysis. Uncertainty analysis helps to provide a greater understanding and better science helps us to reduce and deal with uncertainty. By recognizing the importance of identifying and quantifying uncertainties, great strides can be made in ongoing discussions regarding GHG inventories and accounting for climate change. The 17 papers in this Special Issue deal with many aspects of analyzing and dealing with uncertainty in emissions

  17. État des connaissances et incertitudes sur le changement climatique induit par les activités humaines (Scientific basis and uncertainties of human-induced climate change)

    NASA Astrophysics Data System (ADS)

    Duplessy, Jean-Claude

    2001-12-01

    During the 20th century, the mean air temperature at ground level increased by 0.6±0.2 °C, and the warmest air temperatures occurred after 1980. These were significantly warmer than those of the last millennium. Simultaneously, the frequencies of rain, drought, and cold and heat waves have changed, mountain glaciers have retreated, and the sea level has risen by about 10 cm. This warming was at least in part induced by human activities and will continue during the coming decades. Its amplitude will depend on the rate of greenhouse gas and sulphate aerosol emissions, i.e., on energy scenarios. Remaining scientific uncertainties include cloud variations and the interactions between the physical components of the climate system, biogeochemical cycles, and the biosphere.

  18. Inaugural address

    NASA Astrophysics Data System (ADS)

    Joshi, P. S.

    2014-03-01

    From jets to cosmos to cosmic censorship P S Joshi Tata Institute of Fundamental Research, Homi Bhabha Road, Colaba, Mumbai 400005, India E-mail: psj@tifr.res.in 1. Introduction At the outset, I should like to acknowledge that part of the title above, which tries to capture the main flavour of this meeting, has been borrowed from one of the plenary talks at the conference. When we set out to make the programme for the conference, we thought of beginning with observations on the Universe, but then we certainly wanted to go further and address deeper questions, which were at the very foundations of our inquiry and understanding of the nature and structure of the Universe. I believe we succeeded to a good extent, and it is all here for you in the form of these Conference Proceedings, which have been aptly titled 'Vishwa Mimansa', which could possibly be translated as 'Analysis of the Universe'! It is my great pleasure and privilege to welcome you all to the ICGC-2011 meeting at Goa. The International Conference on Gravitation and Cosmology (ICGC) series of meetings is organized by the Indian Association for General Relativity and Gravitation (IAGRG); the first such meeting was planned and conducted in Goa in 1987, with subsequent meetings taking place at intervals of about four years at various locations in India. So it was thought appropriate to return to Goa to celebrate 25 years of the ICGC meetings. The recollections from that first meeting have been recorded elsewhere in these Proceedings. Research and teaching on gravitation and cosmology were initiated quite early in India, by V V Narlikar at the Banares Hindu University, and by N R Sen in Kolkata in the 1930s. In the course of time, this activity grew and gained momentum, and in early 1969, at the felicitation held for the 60 years of V V Narlikar at a conference in Ahmedabad, P C Vaidya proposed the formation of the IAGRG society, with V V Narlikar being the first President. This

  19. Opening addresses.

    PubMed

    Chukudebelu, W O; Lucas, A O; Ransome-kuti, O; Akinla, O; Obayi, G U

    1988-01-01

    The theme of the 3rd International Conference of the Society of Gynecology and Obstetrics of Nigeria (SOGON) held October 26, 1986 in Enugu was maternal morbidity and mortality in Africa. The opening addresses emphasize the high maternal mortality rate in Africa and SOGON's dedication to promoting women's health and welfare. In order to reduce maternal mortality, the scope of this problem must be made evident by gathering accurate mortality rates through maternity care monitoring and auditing. Governments, health professionals, educators, behavioral scientists, and communication specialists have a responsibility to improve maternal health services in this country. By making the population aware of this problem through education, measures can be taken to reduce the presently high maternal mortality rates. Nigerian women are physically unprepared for childbirth; therefore, balanced diets and disease prevention should be promoted. Since about 40% of deliveries are unmanaged, training for traditional birth attendants should be provided. Furthermore, family planning programs should discourage teenage pregnancies, encourage birth spacing and small families, and promote the use of family planning techniques among men. The problem of child bearing and rearing accompanied by hard work should also be investigated. For practices to change so that maternal mortality rates can be reduced, attitudes must be changed such that the current rates are viewed as unacceptable. PMID:12179275

  20. Presidential address.

    PubMed

    Vohra, U

    1993-07-01

    The Secretary of India's Ministry of Health and Family Welfare serves as Chair of the Executive Council of the International Institute for Population Sciences in Bombay. She addressed its 35th convocation in 1993. Global population stands at 5.43 billion and increases by about 90 million people each year. 84 million of these new people are born in developing countries. India contributes 17 million new people annually. The annual population growth rate in India is about 2%. Its population size will probably surpass 1 billion by the year 2000. High population growth rates are a leading obstacle to socioeconomic development in developing countries. Governments of many developing countries recognize this problem and have expanded their family planning programs to stabilize population growth. Asian countries that have done so and have completed the fertility transition include China, Japan, Singapore, South Korea, and Thailand. Burma, Malaysia, North Korea, Sri Lanka, and Vietnam have not yet completed the transition. Afghanistan, Bangladesh, Iran, Nepal, and Pakistan are half-way through the transition. High population growth rates put pressure on land by fragmenting finite land resources, increasing the number of landless laborers and unemployment, and by causing considerable rural-urban migration. All these factors bring about social stress and burden civic services. India has reduced its total fertility rate from 5.2 to 3.9 between 1971 and 1991. Some Indian states have already achieved replacement fertility. Considerable disparity in socioeconomic development exists among states and districts. For example, the states of Bihar, Madhya Pradesh, Rajasthan, and Uttar Pradesh have female literacy rates lower than 27%, while that for Kerala is 87%. Overall, infant mortality has fallen from 110 to 80 between 1981 and 1990. In Uttar Pradesh, it has fallen from 150 to 98, while it is at 17 in Kerala. India needs innovative approaches to increase contraceptive prevalence rates

  1. Welcome Address

    NASA Astrophysics Data System (ADS)

    Kiku, H.

    2014-12-01

    Ladies and Gentlemen, It is an honor for me to present my welcome address at the 3rd International Workshop on "State of the Art in Nuclear Cluster Physics" (SOTANCP3), as the president of Kanto Gakuin University. Particularly to those from abroad, from more than 17 countries, I am very grateful for your participation after long trips from your homes to Yokohama. On behalf of Kanto Gakuin University, we warmly welcome your visit to our university and your stay in Yokohama. First I would like to introduce Kanto Gakuin University briefly. Kanto Gakuin University, which is called KGU, traces its roots back to the Yokohama Baptist Seminary founded in 1884 in Yamate, Yokohama. The seminary's founder was Albert Arnold Bennett, an alumnus of Brown University, who came to Japan from the United States to establish a theological seminary for cultivating and training Japanese missionaries. Now KGU is a major member of the Kanto Gakuin School Corporation, which is composed of two kindergartens, two primary schools, two junior high schools, and two senior high schools, as well as KGU. In this university, we have eight faculties with graduate schools, including Humanities, Economics, Law, Sciences and Engineering, Architecture and Environmental Design, Human and Environmental Studies, Nursing, and the Law School. Over eleven thousand students are currently studying at our university. By the way, my own field is geotechnical engineering, and I belong to the faculty of Sciences and Engineering. Prof. T. Yamada, here, is my colleague in the same faculty. I know that nuclear physics is one of the most active academic fields in the world. In fact, about half of the participants, namely more than 50 scientists, come from abroad to this conference. Moreover, I know that nuclear physics is related not only to other areas of fundamental physics, such as elementary particle physics and astrophysics, but also to chemistry, medical science, medical care, and radiation metrology

  2. Are models, uncertainty, and dispute resolution compatible?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessments may once have possessed becomes biased (or more biased), as each party chooses to exaggerate either the goodness of a model or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  3. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when more than one realization can represent the available information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
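
    One common way to pose a linear program over discrete uncertain realizations, in the general spirit of the work described (the dissertation's specific models are not reproduced here), is to optimize the worst case via an epigraph variable; a minimal sketch with hypothetical data:

        # Minimal worst-case LP sketch: choose production x to maximize the
        # minimum profit over discrete profit realizations (epigraph trick).
        # All data are hypothetical.
        import numpy as np
        from scipy.optimize import linprog

        profits = np.array([[3.0, 5.0],    # realization 1: per-unit profits
                            [4.0, 2.0]])   # realization 2
        A_res = np.array([[1.0, 2.0]])     # one shared resource constraint
        b_res = np.array([10.0])

        # Variables z = [x1, x2, t]; maximize t  <=>  minimize -t.
        c = np.array([0.0, 0.0, -1.0])
        # t - profits_s . x <= 0 for every realization s, plus the resource row
        A_ub = np.vstack([np.column_stack([-profits, np.ones(len(profits))]),
                          np.column_stack([A_res, np.zeros((1, 1))])])
        b_ub = np.concatenate([np.zeros(len(profits)), b_res])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None), (0, None), (None, None)])
        print(res.x, "worst-case profit:", -res.fun)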

  4. Some Aspects of uncertainty in computational fluid dynamics results

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.

    1991-01-01

    Uncertainties are inherent in computational fluid dynamics (CFD). These uncertainties need to be systematically addressed and managed. Sources of these uncertainties are discussed. Some recommendations are made for the quantification of CFD uncertainties. A practical method of uncertainty analysis is based on sensitivity analysis. When CFD is used to design fluid dynamic systems, sensitivity-uncertainty analysis is essential.
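
    A minimal sketch of the sensitivity-based uncertainty analysis the abstract recommends: finite-difference sensitivities of a code output, multiplied by input uncertainties and combined in quadrature. The "solver" is a stand-in function, and all numbers are assumed.

        # Sensitivity-based uncertainty estimate: finite-difference derivatives
        # of a code output with respect to uncertain inputs, combined in
        # quadrature (first order). The "solver" here is a stand-in function.
        import numpy as np

        def solver(mach, alpha):                  # stand-in for a CFD code
            return 0.1 * mach**2 + 0.05 * alpha   # e.g., a drag-like output

        x0 = np.array([0.8, 2.0])   # nominal Mach, angle of attack (assumed)
        u_x = np.array([0.01, 0.1]) # 1-sigma input uncertainties (assumed)

        grad = np.empty_like(x0)
        for i in range(len(x0)):
            dx = 1e-4 * max(1.0, abs(x0[i]))
            xp, xm = x0.copy(), x0.copy()
            xp[i] += dx
            xm[i] -= dx
            grad[i] = (solver(*xp) - solver(*xm)) / (2 * dx)

        u_out = np.sqrt(np.sum((grad * u_x)**2))
        print(f"output = {solver(*x0):.4f} +/- {u_out:.4f} (1 sigma)")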

  5. Uncertainty and Equipoise: At Interplay Between Epistemology, Decision-Making and Ethics

    PubMed Central

    Djulbegovic, Benjamin

    2011-01-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned since it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. Since equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this paper, I show how uncertainty (equipoise) is at the intersection between epistemology, decision-making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision-making depends both on analytical, deliberative processes embodied in scientific method (system II) and “good” human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors, and unavoidable injustice. PMID:21817885

  6. History Forum Addresses Creation/Evolution Controversy.

    ERIC Educational Resources Information Center

    Schweinsberg, John

    1997-01-01

    A series of programs entitled Creationism and Evolution: The History of a Controversy was presented at the University of Alabama in Huntsville. The controversy was addressed from an historical and sociological, rather than a scientific perspective. Speakers addressed the evolution of scientific creationism, ancient texts versus sedimentary rocks…

  7. Hydrology, society, change and uncertainty

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who declared that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial, and decision making under certainty is mostly trivial too. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a growing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, even delusional ones, is compounded by a misconception within the scientific community that confuses science with the elimination of uncertainty. However, recognizing that uncertainty is inevitable and tightly connected with change will help us appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  8. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  9. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information, and the approach to quantifying uncertainty in metrology, are addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described, and the need for a revision of the latter standard is explained. PMID:26688360
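
    The GUM's law of propagation of uncertainty, referenced above, can be shown in a few lines for a simple measurement model R = V/I, with an expanded uncertainty U = k·u_c at k = 2; the input values are illustrative.

        # GUM-style law of propagation for R = V / I, with expanded
        # uncertainty U = k * u_c (k = 2 for ~95 % coverage). Values assumed.
        import math

        V, u_V = 5.020, 0.004    # volts
        I, u_I = 0.100, 0.0002   # amperes
        R = V / I
        # sensitivity coefficients: dR/dV = 1/I, dR/dI = -V/I**2
        u_R = math.sqrt((u_V / I)**2 + (V * u_I / I**2)**2)
        print(f"R = {R:.2f} ohm, u_c = {u_R:.3f} ohm, U(k=2) = {2*u_R:.3f} ohm")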

  10. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    NASA Astrophysics Data System (ADS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-03-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found.
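
    For concreteness, the classic entropic relation of Maassen and Uffink (not the paper's new measure) can be checked numerically for qubit X and Z measurements, where the bound is H(X) + H(Z) ≥ 1 bit:

        # Numerical check of the Maassen-Uffink entropic uncertainty relation
        # H(X) + H(Z) >= -log2 max|<x|z>|^2 = 1 bit for qubit X and Z
        # measurements (a classic relation, not the paper's new measure).
        import numpy as np

        def shannon(p):
            p = p[p > 1e-12]
            return -(p * np.log2(p)).sum()

        for theta in np.linspace(0.0, np.pi / 2, 5):
            psi = np.array([np.cos(theta), np.sin(theta)])   # real state
            pz = psi**2                                  # Z-basis probabilities
            plus = np.array([1.0, 1.0]) / np.sqrt(2)
            minus = np.array([1.0, -1.0]) / np.sqrt(2)
            px = np.array([(plus @ psi)**2, (minus @ psi)**2])
            print(f"theta={theta:.2f}  H(X)+H(Z)={shannon(px)+shannon(pz):.3f} >= 1")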

  11. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, network planning problems with traffic requirements varying over time were studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning, which considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework for more general uncertainty conditions that allows a more systematic way to solve these problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  12. Uncertainties in large space systems

    NASA Technical Reports Server (NTRS)

    Fuh, Jon-Shen

    1988-01-01

    Uncertainties of a large space system (LSS) can be deterministic or stochastic in nature. The former may result in, for example, an energy spillover problem by which the interaction between unmodeled modes and controls may cause system instability. The stochastic uncertainties are responsible for mode localization and estimation errors, etc. We will address the effects of uncertainties on structural model formulation, use of available test data to verify and modify analytical models before orbiting, and how the system model can be further improved in the on-orbit environment.

  13. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and Central Limit Theorems (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty source and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
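
    The KLE step described above can be sketched in a few lines: eigendecompose the input covariance, keep the leading modes, and perturb only along them. The "profiles" below are synthetic stand-ins for the retrieval inputs, and the truncation level is an arbitrary illustration of the reported dimension reduction.

        # Karhunen-Loeve sketch: keep the leading eigenmodes of the input
        # covariance and perturb only along them, as in the KLE/CLT approach.
        # The "profiles" here are synthetic stand-ins for remote-sensing inputs.
        import numpy as np

        rng = np.random.default_rng(0)
        profiles = rng.standard_normal((200, 50)) @ np.diag(np.linspace(2, 0.1, 50))
        cov = np.cov(profiles, rowvar=False)

        vals, vecs = np.linalg.eigh(cov)
        order = np.argsort(vals)[::-1]
        k = 5                                  # leading modes (factor ~10 reduction)
        modes = vecs[:, order[:k]] * np.sqrt(vals[order[:k]])

        xi = rng.standard_normal((1000, k))    # independent standard normals
        perturbations = xi @ modes.T           # correlated input perturbations
        print(perturbations.shape)             # (1000, 50)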

  14. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
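
    For reference, the Latin Hypercube Sampling method evaluated above takes one draw per equal-probability stratum in each dimension and shuffles the pairings across dimensions; a minimal sketch:

        # Minimal Latin Hypercube Sampling: one draw per equal-probability
        # stratum in each dimension, with independently shuffled pairings.
        import numpy as np

        def lhs(n_samples, n_dims, seed=0):
            rng = np.random.default_rng(seed)
            u = (np.arange(n_samples)[:, None]
                 + rng.random((n_samples, n_dims))) / n_samples
            for j in range(n_dims):
                u[:, j] = rng.permutation(u[:, j])   # decouple the dimensions
            return u                                 # uniform on (0, 1)

        samples = lhs(100, 3)
        print(samples.mean(axis=0))                  # ~0.5 in every dimension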

  15. Is Current Hydrogeologic Research Addressing Long-Term Predictions?

    SciTech Connect

    Tsang, Chin-Fu

    2004-09-10

    Hydrogeology is a field closely related to the needs of society. Many problems of current national and local interest require predictions of hydrogeological system behavior, and, in a number of important cases, the period of prediction is tens to hundreds of thousands of years. It is argued that the demand for such long-term hydrogeological predictions casts a new light on the future needs of hydrogeological research. Key scientific issues are no longer concerned only with simple processes or narrowly focused modeling or testing methods, but also with assessment of prediction uncertainties and confidence, couplings among multiple physico-chemical processes occurring simultaneously at a site, and the interplay between site characterization and predictive modeling. These considerations also have significant implications for hydrogeological education. With this view, it is asserted that hydrogeological directions and education need to be reexamined and possibly refocused to address specific needs for long-term predictions.

  16. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
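
    A minimal sketch of the non-intrusive construction: project a model output onto probabilists' Hermite polynomials by Gauss-Hermite quadrature for a standard-normal input. The exponential "model" is a stand-in for the deterministic reacting-flow solver.

        # Non-intrusive polynomial chaos sketch: project a model output onto
        # probabilists' Hermite polynomials by Gauss-Hermite quadrature
        # (standard-normal input xi). The model here is a simple stand-in.
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval
        from math import factorial, sqrt, pi

        def model(xi):                            # stand-in for the full solver
            return np.exp(0.3 * xi)

        order = 6
        x, w = hermegauss(30)              # weight exp(-x^2/2), sum(w)=sqrt(2*pi)
        coeffs = np.zeros(order + 1)
        for n in range(order + 1):
            basis = hermeval(x, [0] * n + [1])    # He_n(x)
            # c_n = E[f * He_n] / n!  with E[.] = quadrature / sqrt(2*pi)
            coeffs[n] = (w * model(x) * basis).sum() / (sqrt(2 * pi) * factorial(n))

        xi = np.linspace(-2, 2, 5)
        print(np.abs(hermeval(xi, coeffs) - model(xi)).max())  # small residual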

  17. The ends of uncertainty: Air quality science and planning in Central California

    SciTech Connect

    Fine, James

    2003-09-01

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but not purely through their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders rather than managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process, these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed

  18. Awards and Addresses Summary

    PubMed Central

    2008-01-01

    Each year at the annual ASHG meeting, addresses are given in honor of the society and a number of award winners. A summary of each of these addresses is given below. On the next pages, we have printed the Presidential Address and the addresses for the William Allan Award. The other addresses, accompanied by pictures of the speakers, can be found at www.ashg.org.

  19. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as ‘easy’ and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has implications for the end-user community, varying from limitations to the accuracy and reliability of nuclear-based analytical techniques to the fundamental question of whether half-lives are invariable or not. This paper addresses some of these issues from the viewpoints of the user community and of the decay data provider. It covers the propagation of the half-life uncertainty in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties, and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from the literature.
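
    The propagation discussed above is easy to make concrete: for a decay correction A0 = A·exp(ln 2·t/T), the relative uncertainty contributed by the half-life grows linearly with t/T². The half-life and its uncertainty below are illustrative numbers, not evaluated nuclear data.

        # Propagating half-life uncertainty into a decay correction
        # A0 = A * exp(ln2 * t / T). Relative uncertainty grows with t / T**2.
        # Numbers are illustrative, not evaluated nuclear data.
        import numpy as np

        T, u_T = 30.0, 0.1          # half-life and its 1-sigma uncertainty (years)
        t = np.array([1.0, 10.0, 30.0, 60.0])   # decay time (years)

        # |d(A0)/A0| = ln(2) * t / T**2 * u(T)   (from A0 = A * exp(ln2 * t / T))
        rel_u = np.log(2) * t / T**2 * u_T
        for ti, ui in zip(t, rel_u):
            print(f"t = {ti:5.1f} y -> u(A0)/A0 from T alone = {100*ui:.3f}%")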

  20. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of the probability of interruption (PI), thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated-variable dependence in the equation for PI.
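
    A minimal Monte Carlo sketch of the aleatory part: sample per-layer detection probabilities and delays, and count the runs in which the response force can arrive before the remaining delay is exhausted. The three-layer structure, the distributions, and the response time are all illustrative assumptions, not the report's model.

        # Monte Carlo treatment of aleatory uncertainty in P(interruption):
        # sample layer detection probabilities and delay times, and count runs
        # where guards can respond before the remaining delay runs out.
        # All numbers and the three-layer structure are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        n_runs, response_time = 20_000, 60.0          # response time (s), assumed
        interrupted = 0
        for _ in range(n_runs):
            p_det = rng.beta([8, 5, 9], [2, 5, 1])        # per-layer detection prob.
            delays = rng.lognormal([3.0, 3.5, 2.5], 0.4)  # per-layer delay (s)
            for layer in range(3):
                if rng.random() < p_det[layer]:
                    # guards must arrive before the remaining delay is exhausted
                    if delays[layer:].sum() > response_time:
                        interrupted += 1
                    break
        print(f"P(interruption) ~ {interrupted / n_runs:.3f}")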

  1. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and has already been listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set high (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies do not provide an approach that explicitly addresses the determination of the trading ratio, and uncertainty analysis has rarely been linked to its determination. This paper presents a practical methodology for estimating an "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of the tradeoffs between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. Rigorous quantification of the trading ratio will enhance the scientific basis, and thus public confidence, needed for more informed decisions in an overall watershed-based pollutant trading program. PMID:18653943
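
    The link between uncertainty and the trading ratio can be illustrated with a Monte Carlo sketch (not the paper's ETR methodology): if nonpoint-source effectiveness per unit of nominal reduction is uncertain, a ratio follows from the dependable (e.g., 5th-percentile) effectiveness. The lognormal effectiveness model is an assumption for illustration only.

        # Illustrative Monte Carlo view of a trading ratio: how many kg of
        # (uncertain) nonpoint-source reduction are needed so that, with 95 %
        # confidence, at least 1 kg of effective load reduction is delivered.
        # The lognormal effectiveness model is an illustrative assumption.
        import numpy as np

        rng = np.random.default_rng(42)
        effectiveness = rng.lognormal(mean=np.log(0.8), sigma=0.5, size=100_000)
        # per-kg delivered reduction; 5th percentile = dependable effectiveness
        dependable = np.quantile(effectiveness, 0.05)
        print(f"illustrative trading ratio ~ {1.0 / dependable:.2f} : 1")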

  2. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1971-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite. GEOS-C will be tracked by a number of the conventional satellite tracking systems, as well as by two advanced systems; a satellite-to-satellite tracking system and lasers capable of decimeter accuracies which are being developed in connection with the Goddard Earth and Ocean Dynamics Applications program. The discussion is organized in terms of a specific type of GEOS-C orbit which would satisfy a number of scientific objectives including the study of the gravitational field by means of both the altimeter and the satellite-to-satellite tracking system, studies of tides, and the Gulf Stream meanders.

  3. Scientific Misconduct.

    PubMed

    Gross, Charles

    2016-01-01

    Scientific misconduct has been defined as fabrication, falsification, and plagiarism. Scientific misconduct has occurred throughout the history of science. The US government began to take systematic interest in such misconduct in the 1980s. Since then, a number of studies have examined how frequently individual scientists have observed scientific misconduct or were involved in it. Although the studies vary considerably in their methodology and in the nature and size of their samples, in most studies at least 10% of the scientists sampled reported having observed scientific misconduct. In addition to studies of the incidence of scientific misconduct, this review considers the recent increase in paper retractions, the role of social media in scientific ethics, several instructional examples of egregious scientific misconduct, and potential methods to reduce research misconduct. PMID:26273897

  4. The COST 731 Action: A review on uncertainty propagation in advanced hydro-meteorological forecast systems

    NASA Astrophysics Data System (ADS)

    Rossa, Andrea; Liechti, Katharina; Zappa, Massimiliano; Bruen, Michael; Germann, Urs; Haase, Günther; Keil, Christian; Krahe, Peter

    2011-05-01

    Quantifying uncertainty in flood forecasting is a difficult task, given the multiple and strongly non-linear model components involved in such a system. Much effort has been and is being invested in the quest to deal with uncertain precipitation observations and forecasts and the propagation of such uncertainties through hydrological and hydraulic models predicting river discharges and risk for inundation. The COST 731 Action is one such effort: a European initiative which deals with the quantification of forecast uncertainty in hydro-meteorological forecast systems. COST 731 addresses three major lines of development: (1) combining meteorological and hydrological models to form a forecast chain, (2) propagating uncertainty information through this chain and making it available to end users in a suitable form, (3) advancing high-resolution numerical weather prediction precipitation forecasts by using non-conventional observations from, for instance, radar to determine details in the initial conditions on scales smaller than what can be resolved by conventional observing systems. Recognizing the interdisciplinarity of the challenge, COST 731 has organized its work by forming Working Groups at the interfaces between the different scientific disciplines involved, i.e. between observation and atmospheric (and hydrological) modelling (WG-1), between atmospheric and hydrologic modelling (WG-2) and between hydrologic modelling and end-users (WG-3). This paper summarizes the COST 731 activities and their context, provides a review of the recent progress made in dealing with uncertainties in flood forecasting, and sets the scene for the papers of this Thematic Issue. In particular, a bibliometric analysis highlights the strong recent increase in addressing uncertainty analysis in flood forecasting from an integrated perspective. Such a perspective necessarily involves the areas of meteorology, hydrology, and decision making in order to take operational advantage

  5. Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions

    SciTech Connect

    Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.

    2011-09-30

    This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.

  6. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  7. Initiative Addresses Subsurface Energy and Environment Problems

    NASA Astrophysics Data System (ADS)

    Bodvarsson, Gudmundur S.; Majer, Ernest L.; Wang, Joseph S. Y.; Colwell, Frederick; Redden, George

    2006-01-01

    Members of the geoscience community are cooperating in conceptualizing fundamental, crosscutting research to address major obstacles to solving energy and environmental problems related to the subsurface, through the SECUREarth initiative, which began in 2004. Addressing problems such as reliable nuclear waste storage and safe carbon dioxide (CO2) sequestration is critical to maintaining an economical and safe energy supply and a clean environment. A recent workshop in Golden, Colo., helped to further the development of the SECUREarth (Scientific Energy/Environmental Crosscutting Underground Research for Urgent Solutions to Secure the Earth's Future) initiative by identifying the key scientific challenges in the geosciences and targeting possible approaches for overcoming roadblocks.

  8. A Probabilistic Approach for Analysis of Modeling Uncertainties in Quantification of Trading Ratios in Nonpoint to Point Source Nutrient Trading Programs

    NASA Astrophysics Data System (ADS)

    Tasdighi, A.; Arabi, M.

    2015-12-01

    Quantifying nonpoint source pollutant loads and assessing the water quality benefits of conservation practices (BMPs) are prone to different types of uncertainties, which have to be taken into account when developing nutrient trading programs. Although various types of modeling uncertainty (parameter, input and structure) have been examined in the literature to varying degrees, the impact of modeling uncertainties on the evaluation of BMPs has not been addressed sufficiently. Currently, "trading ratios" are used within nutrient trading programs to account for the variability of nonpoint source loads. However, we were not able to find any case of a rigorous scientific approach to accounting for any type of uncertainty in trading ratios. In this study, Bayesian inference was applied to incorporate input, parameter and structural uncertainties using a statistically valid likelihood function. IPEAT (Integrated Parameter Estimation and Uncertainty Analysis Tool), a framework developed for simultaneous evaluation of parameterization, input data, model structure, and observation data uncertainty and their contribution to predictive uncertainty, was used to quantify the uncertainties in the effectiveness of agricultural BMPs while propagating different sources of uncertainty. SWAT was used as the simulation model. SWAT parameterization was done for three different model structures (SCS CN I, SCS CN II and G&A methods) using a Bayesian Markov chain Monte Carlo (MCMC) method named Differential Evolution Adaptive Metropolis (DREAM). For each model structure, the Integrated Bayesian Uncertainty Estimator (IBUNE) was employed to generate latent variables from input data. Bayesian Model Averaging (BMA) was then used to combine the models, and the Expectation-Maximization (EM) optimization technique was used to estimate the BMA weights. Using this framework, the impact of different sources of uncertainty on nutrient loads from nonpoint sources and subsequently effectiveness of BMPs in
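
    The BMA weight estimation step can be sketched with the common Gaussian-error EM formulation (not necessarily the exact IPEAT configuration); the three "model" prediction series below are synthetic.

        # Sketch of BMA weight estimation by EM for three model predictions of
        # the same observations, assuming Gaussian model errors (a common BMA
        # formulation; not necessarily the exact IPEAT configuration).
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        obs = rng.normal(10, 2, size=200)
        preds = np.stack([obs + rng.normal(0.5, 1.0, 200),   # three competing
                          obs + rng.normal(-1.0, 2.0, 200),  # model outputs
                          obs + rng.normal(0.0, 3.0, 200)])  # (synthetic)

        w = np.ones(3) / 3
        sigma = np.ones(3)
        for _ in range(200):                                 # EM iterations
            like = w[:, None] * norm.pdf(obs, preds, sigma[:, None])
            z = like / like.sum(axis=0)                      # E-step: memberships
            w = z.mean(axis=1)                               # M-step: weights
            sigma = np.sqrt((z * (obs - preds)**2).sum(axis=1) / z.sum(axis=1))
        print("BMA weights:", np.round(w, 3))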

  9. Messaging climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Cooke, Roger M.

    2015-01-01

    Climate change is full of uncertainty and the messengers of climate science are not getting the uncertainty narrative right. To communicate uncertainty one must first understand it, and then avoid repeating the mistakes of the past.

  10. Exploring uncertainty in the Earth Sciences - the potential field perspective

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Blakely, R. J.

    2013-12-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
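
    A worked example of the non-uniqueness under discussion: the external gravity anomaly of a buried sphere depends only on its total anomalous mass and depth, so a small dense body and a large light one at the same depth are exactly indistinguishable. The geometry and densities below are arbitrary illustrative values.

        # Non-uniqueness demo: the gravity anomaly of a buried sphere depends
        # only on its total anomalous mass and depth, so a small dense body
        # and a large light one at the same depth are indistinguishable.
        import numpy as np

        G = 6.674e-11                      # m^3 kg^-1 s^-2

        def sphere_gz(x, depth, radius, drho):
            mass = 4.0 / 3.0 * np.pi * radius**3 * drho
            return G * mass * depth / (x**2 + depth**2)**1.5

        x = np.linspace(-2000.0, 2000.0, 9)
        g1 = sphere_gz(x, 500.0, 100.0, 800.0)   # small, dense (kg/m^3)
        g2 = sphere_gz(x, 500.0, 200.0, 100.0)   # large, light: same R^3*drho
        print(np.allclose(g1, g2))               # True -- identical anomalies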

  11. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic in science generally, and in modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, the modelers themselves, and the users of the results. This paper addresses important components of uncertainty in modeling water temperatures and discusses several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.

  12. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location there is a distribution expressing a set of possible outcomes at that location. We consider such distribution data to be multi-valued data, since it consists of a collection of values about a single variable; a multi-valued data set thus represents both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells, such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column, or other user-selectable straight transect of the 2D domain. First, at each grid cell in a given slice (row, column, or transect), we compute a smooth density estimate from the underlying data. Such a density estimate of the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then the collection of PDFs along a given slice is presented vertically above the slice, forming a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly, since peaks represent the modes (or bumps) in the PDFs. We have defined roughness as the number of peaks in the distribution; roughness is another useful summary statistic for multimodal distributions. The uncertainty of the multi
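
    A minimal sketch of the per-cell density estimation and the peak-count "roughness" summary described above, using a generic Gaussian kernel density estimator; the data and grid are invented for illustration, and the paper's own estimator is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    grid = np.linspace(-4, 8, 200)        # support on which each PDF is evaluated

    def cell_pdf(samples):
        """Smooth density estimate for the possible outcomes at one grid cell."""
        return gaussian_kde(samples)(grid)

    def roughness(pdf):
        """Number of peaks (local maxima) in the estimated PDF."""
        d = np.diff(pdf)
        return int(np.sum((d[:-1] > 0) & (d[1:] <= 0)))

    # one slice (row) of a hypothetical multi-valued map: 10 cells, 50 outcomes each;
    # every third cell is made bimodal to exercise the roughness summary
    slice_samples = [
        np.concatenate([rng.normal(0, 1, 25), rng.normal(4, 1, 25)]) if i % 3 == 0
        else rng.normal(i * 0.3, 1, 50)
        for i in range(10)
    ]
    wall = np.array([cell_pdf(s) for s in slice_samples])   # the "PDF wall"
    print([roughness(p) for p in wall])   # 2 for bimodal cells, 1 elsewhere
    ```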

  13. A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty

    NASA Astrophysics Data System (ADS)

    Vezer, M. A.

    2009-12-01

    The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominantly to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4’s Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4’s Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. The investigation begins (§2) with an explanation of

  14. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, all uncertainty estimates are model dependent, and therefore, besides a thorough characterization of experimental uncertainties, particular attention must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related
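
    The point that uncertainty estimates are model dependent can be illustrated with a toy grid-based Bayesian inversion: the same data analyzed under two structurally different forward models yield two different, equally tidy, posterior spreads, each conditional on its assumed structure. Everything below is hypothetical and numpy-only, not the author's method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0.1, 1.0, 20)
    y = 1.0 * x + rng.normal(0, 0.05, x.size)   # synthetic data, noise sd = 0.05

    theta = np.linspace(0.0, 2.0, 2001)          # flat prior on a parameter grid

    def posterior(forward):
        """Grid posterior for theta under a given forward model (known noise)."""
        resid = y[None, :] - forward(x[None, :], theta[:, None])
        logl = -0.5 * np.sum((resid / 0.05) ** 2, axis=1)
        p = np.exp(logl - logl.max())
        return p / np.trapz(p, theta)

    for name, fwd in [("linear",    lambda x, t: t * x),
                      ("quadratic", lambda x, t: t * x**2)]:
        p = posterior(fwd)
        mean = np.trapz(theta * p, theta)
        sd = np.sqrt(np.trapz((theta - mean) ** 2 * p, theta))
        print(f"{name}: posterior mean {mean:.3f}, sd {sd:.3f}")
    # both runs report a clean posterior sd, but the numbers differ:
    # the "uncertainty" is conditional on the assumed model structure
    ```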

  15. ICYESS 2013: Understanding and Interpreting Uncertainty

    NASA Astrophysics Data System (ADS)

    Rauser, F.; Niederdrenk, L.; Schemann, V.; Schmidt, A.; Suesser, D.; Sonntag, S.

    2013-12-01

    We will report the outcomes and highlights of the Interdisciplinary Conference of Young Earth System Scientists (ICYESS) on Understanding and Interpreting Uncertainty, held in September 2013 in Hamburg, Germany. This conference is aimed at early career scientists (Masters to Postdocs) from a wide variety of scientific disciplines and backgrounds (natural, social, and political sciences) and will enable three days of discussion on a variety of uncertainty-related aspects: 1) How do we deal with implicit and explicit uncertainty in our daily scientific work? What is uncertain for us, and for which reasons? 2) How can we communicate these uncertainties to other disciplines? For example, is uncertainty in cloud parameterization, and correspondingly in equilibrium climate sensitivity, a concept that is understood equally well in the natural and social sciences that deal with Earth System questions? Or, vice versa, is normative uncertainty, as in choosing a discount rate, relevant for natural scientists? How can those uncertainties be reconciled? 3) How can science communicate this uncertainty to the public? Is it useful at all? How are the different possible measures of uncertainty understood in different realms of public discourse? Basically, we want to learn from all disciplines that work together in the broad Earth System Science community how to understand and interpret uncertainty, and then transfer this understanding to the problem of how to communicate with the public, or its different layers/agents. ICYESS is structured so that participation is only possible via presentation, so every participant will give their own professional input into how their discipline deals with uncertainty. Additionally, a strong focus is placed on communication techniques; there are no 'standard presentations' at ICYESS. Keynote lectures by renowned scientists and discussions will lead to a deeper interdisciplinary understanding of what we do not really know, and how to deal with it. Many

  16. Scientific Fraud.

    ERIC Educational Resources Information Center

    Goodstein, David

    1991-01-01

    A discussion of fraud in the presentation of results of scientific research cites cases and looks at variations in the degree of misrepresentation, kinds and intents of fraud, attention given by public agencies (National Institutes of Health, National Science Foundation, Public Health Service), and differences between scientific and civil fraud. (MSE)

  17. Davis-Besse uncertainty study

    SciTech Connect

    Davis, C B

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.

  18. Evaluating Uncertainty to Strengthen Epidemiologic Data for Use in Human Health Risk Assessments

    PubMed Central

    Burns, Carol J.; Wright, J. Michael; Bateson, Thomas F.; Burstyn, Igor; Goldstein, Daniel A.; Klaunig, James E.; Luben, Thomas J.; Mihlan, Gary; Ritter, Leonard; Schnatter, A. Robert; Symons, J. Morel; Don Yi, Kun

    2014-01-01

    Background: There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. Although there is uncertainty associated with the results of most epidemiologic studies, techniques exist to characterize uncertainty that can be applied to improve weight-of-evidence evaluations and risk characterization efforts. Methods: This report derives from a Health and Environmental Sciences Institute (HESI) workshop held in Research Triangle Park, North Carolina, to discuss the utility of using epidemiologic data in risk assessments, including the use of advanced analytic methods to address sources of uncertainty. Epidemiologists, toxicologists, and risk assessors from academia, government, and industry convened to discuss uncertainty, exposure assessment, and application of analytic methods to address these challenges. Synthesis: Several recommendations emerged to help improve the utility of epidemiologic data in risk assessment. For example, improved characterization of uncertainty is needed to allow risk assessors to quantitatively assess potential sources of bias. Data are needed to facilitate this quantitative analysis, and interdisciplinary approaches will help ensure that sufficient information is collected for a thorough uncertainty evaluation. Advanced analytic methods and tools such as directed acyclic graphs (DAGs) and Bayesian statistical techniques can provide important insights and support interpretation of epidemiologic data. Conclusions: The discussions and recommendations from this workshop demonstrate that there are practical steps that the scientific community can adopt to strengthen epidemiologic data for decision making. Citation: Burns CJ, Wright JM, Pierson JB, Bateson TF, Burstyn I, Goldstein DA, Klaunig JE, Luben TJ, Mihlan G, Ritter L, Schnatter AR, Symons JM, Yi KD. 2014. Evaluating uncertainty to strengthen

  19. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as uniform and Gaussian probability distributions and as intervals. Within cyber settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties, where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may also exist, where the defender lacks sufficient knowledge or information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
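
    A minimal sketch of the distinction drawn above, with invented numbers: aleatory payoff variability represented by distributions, epistemic ignorance by intervals that propagate to bounds (a degenerate probability box) on the expected payoff.

    ```python
    import numpy as np

    # hypothetical attacker payoffs for three attack outcomes
    probs = np.array([0.5, 0.3, 0.2])               # outcome probabilities
    gauss = [(10.0, 2.0), (4.0, 1.0), (1.0, 0.5)]   # aleatory: (mean, sd) per outcome
    intervals = np.array([[8.0, 14.0],              # epistemic: [lo, hi] bounds when
                          [2.0,  6.0],              # the payoff mechanism is unknown
                          [0.0,  3.0]])

    # aleatory case: the expectation propagates through the distribution means
    e_aleatory = sum(p * mu for p, (mu, _) in zip(probs, gauss))

    # epistemic case: only bounds on the expectation are justified
    e_lower = probs @ intervals[:, 0]
    e_upper = probs @ intervals[:, 1]
    print(f"aleatory expectation: {e_aleatory:.2f}")
    print(f"epistemic bounds on expectation: [{e_lower:.2f}, {e_upper:.2f}]")
    ```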

  20. Scientific Globish versus scientific English.

    PubMed

    Tychinin, Dmitry N; Kamnev, Alexander A

    2013-10-01

    The proposed adoption of 'scientific Globish' as a simplified language standard for scholarly communication may appeal to authors who have difficulty with English proficiency. However, Globish might not justify the hopes being pinned on it and might open the door to further deterioration of the quality of English-language scientific writing. PMID:23928006

  1. Dissociating Uncertainty Responses and Reinforcement Signals in the Comparative Study of Uncertainty Monitoring

    ERIC Educational Resources Information Center

    Smith, J. David; Redford, Joshua S.; Beran, Michael J.; Washburn, David A.

    2006-01-01

    Although researchers are exploring animals' capacity for monitoring their states of uncertainty, the use of some paradigms allows the criticism that animals map avoidance responses to error-causing stimuli not because of uncertainty monitored but because of feedback signals and stimulus aversion. The authors addressed this criticism with an…

  2. Addressivity in cogenerative dialogues

    NASA Astrophysics Data System (ADS)

    Hsu, Pei-Ling

    2014-03-01

    Ashraf Shady's paper provides a first-hand reflection on how a foreign teacher used cogens as culturally adaptive pedagogy to address cultural misalignments with students. In this paper, Shady drew on several cogen sessions to showcase his journey of using different forms of cogens with his students. To improve the quality of cogens, one strategy he used was to adjust the number of participants in cogens. As a result, some cogens worked and others did not. During the course of reading his paper, I was impressed by his creative and flexible use of cogens and at the same time was intrigued by the question of why some cogens work and not others. In searching for an answer, I found that Mikhail Bakhtin's dialogism, especially the concept of addressivity, provides a comprehensive framework to address this question. In this commentary, I reanalyze the cogen episodes described in Shady's paper in the light of dialogism. My analysis suggests that addressivity plays an important role in mediating the success of cogens. Cogens with high addressivity function as internally persuasive discourse that allows diverse consciousnesses to coexist and so likely affords productive dialogues. The implications of addressivity in teaching and learning are further discussed.

  3. Addressing the Creationist Challenge.

    ERIC Educational Resources Information Center

    Seaford, H. Wade, Jr.

    1990-01-01

    Describes a method of contrasting "scientific creationism" and evolution, or pseudo-science and science, that was utilized in a freshman seminar at Dickinson College. Discusses how the seminar format fostered analytical thinking, research, and writing skills. Presents responses given by creationist students after the course. (JS)

  4. AMCA Presidential Address

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The American Mosquito Control Association and mosquito control will be discussed. The American Mosquito Control Association is a non-profit scientific organization dedicated to promoting the highest standards in professional mosquito control. It comprises more than 1500 members representing st...

  5. Analysis of Infiltration Uncertainty

    SciTech Connect

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1), as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein; it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the

  6. Sources Sought for Innovative Scientific Instrumentation for Scientific Lunar Rovers

    NASA Technical Reports Server (NTRS)

    Meyer, C.

    1993-01-01

    Lunar rovers should be designed as integrated scientific measurement systems that address scientific goals as their main objective. Scientific goals for lunar rovers are presented. Teleoperated robotic field geologists will allow the science team to make discoveries using a wide range of sensory data collected by electronic 'eyes' and sophisticated scientific instrumentation. Rovers need to operate in geologically interesting terrain (rock outcrops) and to identify and closely examine interesting rock samples. Enough flight-ready instruments are available to fly on the first mission, but additional instrument development based on emerging technology is desirable. Various instruments that need to be developed for later missions are described.

  7. Scientific Misconduct.

    ERIC Educational Resources Information Center

    Goodstein, David

    2002-01-01

    Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

  8. The Scientific Competitiveness of Nations

    PubMed Central

    Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco

    2014-01-01

    We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation—that is, the competitiveness of its research system—and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research system as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of “markers” of national scientific competitiveness are those not necessarily of high technological requirements, but rather those addressing the most “sophisticated” needs of the society. PMID:25493626

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. Two sources of uncertainty in geometric discretization are addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
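
    The ray-count tradeoff can be illustrated with a toy stand-in for the ray tracer: the spread of repeated estimates falls roughly as 1/sqrt(N) while cost grows linearly in N, which is the cost-benefit curve the paper optimizes. The geometry and attenuation response below are invented for illustration and are not the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def trace_dose(n_rays):
        """Toy stand-in for a ray-traced dose estimate: average response over
        random shield thicknesses sampled along n_rays directions."""
        thickness = rng.uniform(1.0, 10.0, n_rays)    # hypothetical geometry
        return np.mean(np.exp(-0.3 * thickness))      # toy attenuation response

    for n in [10, 100, 1000, 10000]:
        estimates = [trace_dose(n) for _ in range(200)]
        print(f"{n:6d} rays: spread (sd) = {np.std(estimates):.5f}")
    # the spread falls roughly as 1/sqrt(n) while cost grows linearly with n,
    # so the optimal ray count balances accuracy against computational expense
    ```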

  11. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of the U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world in responding to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remain highly idealized. For example, the efficacy of "no regrets" adaptation efforts, or of "mainstreaming" adaptation into decision-making, is rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be undertaken carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of maladaptation.

  12. Application of uncertainty analysis to cooling tower thermal performance tests

    SciTech Connect

    Yost, J.G.; Wheeler, D.E.

    1986-01-01

    The purpose of this paper is to provide an overview of uncertainty analyses. The following topics are addressed: (1) a review and summary of the basic constituents of an uncertainty analysis, with definitions and discussion of basic terms; (2) a discussion of the benefits and uses of uncertainty analysis; and (3) example uncertainty analyses, with emphasis on the problems, limitations, and site-specific complications.

  13. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  14. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses reliability issues in loss assessment following strong earthquakes when worldwide systems are applied in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and the offer of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties on the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties on the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking; etc. One need not be a sharp specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in the determination of strong-event parameters by alert seismological surveys, and of the simulation models used at all stages, from estimating shaking intensity

  15. Addressing Social Issues.

    ERIC Educational Resources Information Center

    Schoebel, Susan

    1991-01-01

    Maintains that advertising can help people become more aware of social responsibilities. Describes a successful nationwide newspaper advertising competition for college students in which ads address social issues such as literacy, drugs, teen suicide, and teen pregnancy. Notes how the ads have helped grassroots programs throughout the United…

  16. States Address Achievement Gaps.

    ERIC Educational Resources Information Center

    Christie, Kathy

    2002-01-01

    Summarizes 2 state initiatives to address the achievement gap: North Carolina's report by the Advisory Commission on Raising Achievement and Closing Gaps, containing an 11-point strategy, and Kentucky's legislation putting in place 10 specific processes. The North Carolina report is available at www.dpi.state.nc.us.closingthegap; Kentucky's…

  17. Address of the President

    ERIC Educational Resources Information Center

    Ness, Frederic W.

    1976-01-01

    The president of the Association of American Colleges addresses at the 62nd annual meeting the theme of the conference: "Looking to the Future--Liberal Education in a Radically Changing Society." Contributions to be made by AAC are examined. (LBH)

  18. Addressing Sexual Harassment

    ERIC Educational Resources Information Center

    Young, Ellie L.; Ashbaker, Betty Y.

    2008-01-01

    This article discusses ways on how to address the problem of sexual harassment in schools. Sexual harassment--simply defined as any unwanted and unwelcome sexual behavior--is a sensitive topic. Merely providing students, parents, and staff members with information about the school's sexual harassment policy is insufficient; schools must take…

  19. Space sciences - Keynote address

    NASA Technical Reports Server (NTRS)

    Alexander, Joseph K.

    1990-01-01

    The present status and projected future developments of the NASA Space Science and Applications Program are addressed. Emphasis is given to biochemistry experiments that are planned for the Space Station. Projects for the late 1990s which will study the sun, the earth's magnetosphere, and the geosphere are briefly discussed.

  20. The Peter Shaw Award Acceptance Address: An Immigrant Sociologist

    ERIC Educational Resources Information Center

    Hollander, Paul

    2003-01-01

    This article presents the author's acceptance address for receiving the Peter Shaw award. In this address, the author, an immigrant sociologist, tells how this award helps to resolve questions and uncertainties he has as to the degree to which he can or should consider himself an American--about the extent to which he has become a part, a member…

  1. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the most important task is to analyze how the uncertainties arise and develop, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and on verification & validation technology, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  2. Physician uncertainty and the art of persuasion.

    PubMed

    Rizzo, J A

    1993-12-01

    Incomplete information is a chronic feature of medical markets. Much attention has focused on information asymmetries between physicians and their patients. In contrast, physician uncertainty has received far less attention. This is a significant omission. Physician uncertainty may be an even more important reason than consumer uncertainty for the high cost of health care. This paper reviews and evaluates major approaches for managing physician uncertainty. We argue that quantitative approaches alone, such as scientific advancement and the application of decision analysis to clinical reasoning, are insufficient for dealing with uncertainty. Qualitative approaches, such as forging consensus through expert panels, and teaching physicians to accept and cope with uncertainty, will play a valuable role in promoting more effective clinical decision-making under conditions of uncertainty. The current tensions between those who would eradicate physician uncertainty through quantitative approaches and those who favor qualitative methods have parallels in many other fields, including economics and mathematics. These tensions are unfortunate, since the most promising initiatives to promote better clinical decision-making will likely need to draw upon both approaches. The recent initiative to implement medical practice guidelines is one example of a broad-based approach to improve clinical decision-making. Guidelines draw upon available scientific evidence, but typically involve consensus-building as well. They seek to persuade and educate physicians about appropriate treatments, without mandating changes in physician treatment patterns. Given the persistent uncertainties physicians will undoubtedly confront regarding appropriate clinical decision-making, this flexible approach may be the best way to mitigate market failures resulting from inappropriate clinical decisions.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8303329

  3. An Ontology for Uncertainty in Climate Change Projections

    NASA Astrophysics Data System (ADS)

    King, A. W.

    2011-12-01

    Paraphrasing Albert Einstein's aphorism about scientific quantification: not all uncertainty that counts can be counted, and not all uncertainty that can be counted counts. The meaning of the term "uncertainty" in climate change science and assessment is itself uncertain. Different disciplines and perspectives bring different nuances, if not meanings, of the term to the conversation. For many scientists, uncertainty is somehow associated with statistical dispersion and standard error. For many users of climate change information, uncertainty is more related to their confidence, or lack thereof, in climate models. These "uncertainties" may be related, but they are not identical, and there is considerable room for confusion and misunderstanding. A knowledge framework, a system of concepts and vocabulary, for communicating uncertainty can add structure to the characterization and quantification of uncertainty and aid communication among scientists and users. I have developed an ontology for uncertainty in climate change projections derived largely from the report of the W3C Uncertainty Reasoning for the World Wide Web Incubator Group (URW3-XG), which dealt with the problem of uncertainty representation and reasoning on the World Wide Web. I have adapted this ontology for uncertainty about information to uncertainty about climate change. Elements of the ontology apply with little or no translation to the information of climate change projections, with climate change almost a use case. Other elements can be translated into language used in climate-change discussions; translating aleatory uncertainty in the UncertaintyNature class as irreducible uncertainty is an example. I have added classes for the source of uncertainty (UncertaintySource) (different model physics, for example) and for metrics of uncertainty (UncertaintyMetric), at least, in the case of the latter, for those instances of uncertainty that can be quantified (i.e., counted). The statistical standard deviation is a member
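
    A minimal sketch of how the ontology's classes might be encoded. UncertaintyNature, UncertaintySource, and UncertaintyMetric are named in the abstract; the member values and the statement container below are illustrative assumptions, not the author's schema.

    ```python
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class UncertaintyNature(Enum):
        ALEATORY = "irreducible"       # e.g. internal climate variability
        EPISTEMIC = "reducible"        # e.g. incomplete knowledge of model physics

    class UncertaintySource(Enum):     # class added by the author to the URW3-XG base
        MODEL_PHYSICS = "different model physics"
        FORCING_SCENARIO = "emissions scenario"
        INITIAL_CONDITIONS = "initial conditions"

    class UncertaintyMetric(Enum):     # only for uncertainty that can be "counted"
        STANDARD_DEVIATION = "ensemble standard deviation"
        RANGE = "min-max range"

    @dataclass
    class UncertaintyStatement:
        about: str                     # the projection the uncertainty concerns
        nature: UncertaintyNature
        source: UncertaintySource
        metric: Optional[UncertaintyMetric]   # None when it cannot be quantified

    stmt = UncertaintyStatement(
        about="global mean warming in 2100 under a fixed scenario",
        nature=UncertaintyNature.EPISTEMIC,
        source=UncertaintySource.MODEL_PHYSICS,
        metric=UncertaintyMetric.STANDARD_DEVIATION,
    )
    ```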

  4. Rethinking Uncertainty: What Does the Public Need to Know?

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    The late Steven Schneider is often quoted as addressing the double-bind of science communication: that to be a good scientist one has to be cautious and acknowledge uncertainty, but to reach the media and the public one has to be bold, incautious, and even a bit dramatic. Here, I focus on a related but different double-bind: the double bind of responding to doubt. In our recent book, Merchants of Doubt, Erik M. Conway and I showed how doubt-mongers exploited scientific uncertainty as a political strategy to confuse the public and delay action on a range of environmental issues, from the harms of tobacco to the reality of anthropogenic climate change. This strategy is effective because it appeals to lay people's, journalists', and even fellow scientists' sense of fair play—that it is right to hear "both sides" of an issue. Scientists are then caught in a double-bind: refusing to respond seems smug and elitist, but responding scientifically seems to confirm that there is in fact a scientific debate. Doubt-mongering is also hard to counter because our knowledge is, in fact, uncertain, so when we communicate in conventional scientific ways, acknowledging the uncertainties and limits in our understanding, we may end up reinforcing the uncertainty framework. The difficulty is exacerbated by the natural tendency of scientists to focus on novel and original results, rather than on matters that are well established, lest we be accused of lacking originality or of taking credit for others' work. The net result is the impression among lay people that our knowledge is very likely to change and is therefore a weak basis for making public policy decisions. History of science, however, suggests a different picture: we know that a good deal of scientific knowledge has proved temporally robust and has provided a firm basis for effective public policy. Action on earlier environmental issues such as DDT and acid rain, guided by scientific knowledge, has worked to limit environmental damage

  5. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
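
    A small illustration of why descriptive statistics become set-valued over interval data, and of the computability issue the report catalogues (corner enumeration grows as 2^n with sample size). The data values below are invented.

    ```python
    import numpy as np
    from itertools import product

    # each measurement is known only to lie within an interval [lo, hi]
    data = np.array([[1.0, 1.4], [2.1, 2.2], [0.8, 1.9], [3.0, 3.5]])
    lo, hi = data[:, 0], data[:, 1]

    # the sample mean of interval data is itself an interval
    print("mean bounds:", (lo.mean(), hi.mean()))

    # upper bound on the sample variance: variance is convex in each data point,
    # so its maximum over the box of intervals is attained at a corner; this is
    # feasible to enumerate only for small n (2^n corners)
    corners = (np.var(np.where(c, hi, lo), ddof=1)
               for c in product([False, True], repeat=len(data)))
    print("variance upper bound:", max(corners))
    # the variance lower bound is harder: its minimizer can lie inside the box
    # (it is 0 whenever all intervals share a common point)
    ```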

  6. Uncertainty estimation and prediction for interdisciplinary ocean dynamics

    SciTech Connect

    Lermusiaux, Pierre F.J. . E-mail: pierrel@pacific.harvard.edu

    2006-09-01

    Scientific computations for the quantification, estimation and prediction of uncertainties for ocean dynamics are developed and exemplified. Primary characteristics of ocean data, models and uncertainties are reviewed and quantitative data assimilation concepts defined. Challenges involved in realistic data-driven simulations of uncertainties for four-dimensional interdisciplinary ocean processes are emphasized. Equations governing uncertainties in the Bayesian probabilistic sense are summarized. Stochastic forcing formulations are introduced and a new stochastic-deterministic ocean model is presented. The computational methodology and numerical system, Error Subspace Statistical Estimation, that is used for the efficient estimation and prediction of oceanic uncertainties based on these equations is then outlined. Capabilities of the ESSE system are illustrated in three data-assimilative applications: estimation of uncertainties for physical-biogeochemical fields, transfers of ocean physics uncertainties to acoustics, and real-time stochastic ensemble predictions with assimilation of a wide range of data types. Relationships with other modern uncertainty quantification schemes and promising research directions are discussed.

  7. Assessing uncertainty in stormwater quality modelling.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2016-10-15

    Designing effective stormwater pollution mitigation strategies is a challenge in urban stormwater management. This is primarily due to the limited reliability of catchment scale stormwater quality modelling tools. As such, assessing the uncertainty associated with the information generated by stormwater quality models is important for informed decision making. Quantitative assessment of build-up and wash-off process uncertainty, which arises from the variability associated with these processes, is a major concern, as typical uncertainty assessment approaches do not adequately account for process uncertainty. The research study undertaken found that the variability of build-up and wash-off processes for different particle size ranges leads to process uncertainty. After variability and the resulting process uncertainties are accurately characterised, they can be incorporated into catchment stormwater quality predictions. Accounting for process uncertainty influences the uncertainty limits associated with predicted stormwater quality. The impact of build-up process uncertainty on stormwater quality predictions is greater than that of wash-off process uncertainty. Accordingly, decision making should facilitate the design of mitigation strategies which specifically address variations in the load and composition of pollutants accumulated during dry weather periods. Moreover, the study found that the influence of process uncertainty differs among stormwater quality predictions corresponding to storm events with different intensity, duration and runoff volume generated. These storm events were also found to be significantly different in terms of the Runoff-Catchment Area ratio. As such, the selection of storm events in the context of designing stormwater pollution mitigation strategies needs to take into consideration not only the storm event characteristics, but also the influence of process uncertainty on stormwater quality predictions. PMID:27423532

  8. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
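
    The per-property error budget described above (uncertainty contribution = sensitivity x measurement uncertainty) is simple to tabulate. The sketch below uses placeholder numbers, not the paper's values, and assumes independent errors combined in quadrature, a standard convention that the abstract does not spell out.

    ```python
    import numpy as np

    # sensitivity of DRF to each input (W m^-2 per unit of the property) and the
    # typical measurement uncertainty of that property -- values are placeholders
    inputs = {
        #                     sensitivity, measurement uncertainty
        "optical_depth":      (-40.0, 0.01),
        "single_scat_albedo": ( 60.0, 0.03),
        "asymmetry_param":    ( 25.0, 0.02),
        "surface_albedo":     ( 30.0, 0.02),
    }

    contrib = {k: s * u for k, (s, u) in inputs.items()}
    total = np.sqrt(sum(c**2 for c in contrib.values()))  # quadrature sum,
                                                          # assuming independence
    for k, c in sorted(contrib.items(), key=lambda kv: -abs(kv[1])):
        print(f"{k:20s} {abs(c):5.2f} W m^-2")
    print(f"{'total':20s} {total:5.2f} W m^-2")
    # ranking the contributions identifies the property that most limits
    # accuracy (here, as in the abstract, single scattering albedo dominates)
    ```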

  9. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Gour, Gilad

    2014-03-01

    Uncertainty relations are a distinctive characteristic of quantum theory that imposes intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring non-commuting observables. However, I will show here that there is no fundamental reason for using entropies as quantifiers; in fact, any functional relation that characterizes the uncertainty of the measurement outcomes can be used to define an uncertainty relation. Starting from a simple assumption that any measure of uncertainty is non-decreasing under mere relabeling of the measurement outcomes, I will show that Schur-concave functions are the most general uncertainty quantifiers. I will then introduce a novel fine-grained uncertainty relation written in terms of a majorization relation, which generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary measures of uncertainty. This infinite family of uncertainty relations includes all the known entropic uncertainty relations, but is not limited to them. In this sense, the relation is universally valid and captures the essence of the uncertainty principle in quantum theory. This talk is based on a joint work with Shmuel Friedland and Vlad Gheorghiu. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada and by the Pacific Institute for Mathematical Sciences (PIMS).
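
    The majorization machinery referred to in the talk is compact enough to state schematically; the exact state-independent bound ω is the technical content of the cited joint work and is not reproduced here, so the following should be read as a sketch of the general form.

    ```latex
    % x is majorized by y (written x \prec y) when, with components sorted in
    % decreasing order (denoted x^{\downarrow}), the partial sums satisfy
    \sum_{i=1}^{k} x_i^{\downarrow} \le \sum_{i=1}^{k} y_i^{\downarrow}
    \quad (k = 1, \dots, n-1),
    \qquad
    \sum_{i=1}^{n} x_i^{\downarrow} = \sum_{i=1}^{n} y_i^{\downarrow}.

    % Schematic form of the fine-grained relation: the outcome distributions
    % p, q of two incompatible measurements are bounded, for every state, by a
    % state-independent vector \omega:
    p \otimes q \;\prec\; \omega.

    % Any Schur-concave measure f (f(x) \ge f(y) whenever x \prec y) then yields
    % a scalar uncertainty relation; Shannon entropy, being Schur-concave and
    % additive over \otimes, recovers an entropic form:
    f(p \otimes q) \ge f(\omega), \qquad H(p) + H(q) \ge H(\omega).
    ```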

  10. Fission Spectrum Related Uncertainties

    SciTech Connect

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  11. Scientific Documentation.

    ERIC Educational Resources Information Center

    Pieper, Gail W.

    1980-01-01

    Describes how scientific documentation is taught in three 50-minute sessions in a technical writing course. Tells how session one distinguishes between in-text notes, footnotes, and reference entries; session two discusses the author-year system of citing references; and session three is concerned with the author-number system of reference…

  12. Scientific Inquiry

    ERIC Educational Resources Information Center

    National Science Teachers Association (NJ1), 2004

    2004-01-01

    Scientific inquiry reflects how scientists come to understand the natural world, and it is at the heart of how students learn. From a very early age, children interact with their environment, ask questions, and seek ways to answer those questions. Understanding science content is significantly enhanced when ideas are anchored to inquiry…

  13. [Scientific presentation].

    PubMed

    Kraft, Giuliano

    2002-01-01

    To give a correct and effective scientific presentation is an arduous task that calls for close examination of the basic techniques of communication. This article offers guidance and suggestions to help public speakers become communicators and use visual aids, and it explains how to capture the audience's attention. PMID:12599721

  14. Carbon Sequestration: Enhanced Evaluation of Uncertainty

    NASA Astrophysics Data System (ADS)

    McNeish, J. A.; Wang, Y.; Dewars, T.; Hadgu, T.; Jove Colon, C. F.; Sun, A.

    2010-12-01

    Carbon capture and sequestration (CCS) is an option to mitigate impacts of atmospheric carbon emission. Initial studies indicate that for long-term geologic storage of carbon to be effective, the leakage rates must be less than 0.1 - 0.01%/yr. Recent efforts have been made to apply the existing probabilistic performance assessment (PA) methodology developed for deep nuclear waste geologic repositories to evaluate the effectiveness of subsurface carbon storage. However, to address the most pressing management, regulatory, and scientific concerns with subsurface carbon storage (CS), the existing PA methodology and tools must be enhanced and upgraded. For example, in the evaluation of a nuclear waste repository, a PA model is essentially a forward model that samples input parameters and runs multiple realizations to estimate future consequences and determine important parameters driving the system performance. In the CS evaluation, however, a PA model must be able to run both forward and inverse calculations to support real-time site monitoring as an integral part of the design and operational phases. The monitoring data must be continually fused into the PA model through model inversion and parameter estimation. Model calculations will in turn guide the design of optimal monitoring and carbon-injection strategies (e.g., in terms of monitoring techniques, locations, and time intervals). This study formulates the advanced PA concept for CS systems and establishes a prototype PA framework for the concept. The new PA framework includes a built-in optimization capability for model parameterization and monitoring system design. The capabilities of this framework will be demonstrated with a hypothetical CS system. The work lays the foundation for the development of a new generation of PA tools for effective management of CS activities. The work supports energy security and climate change/adaptation by furthering the capability to effectively manage proposed carbon capture

  15. Excerpts from keynote address

    SciTech Connect

    Creel, G.C.

    1995-06-01

    Excerpts from the keynote principally address emissions issues in the fossil power industry as related to heat rate improvements. Stack emissions of both sulfur and nitrogen oxides are discussed, and a number of examples are given: (1) PEPCO's Potomac River Station, and (2) Morgantown station's NOx reduction efforts. Circulating water emissions are also briefly discussed, as are O&M costs of emission controls.

  16. Holographic content addressable storage

    NASA Astrophysics Data System (ADS)

    Chao, Tien-Hsin; Lu, Thomas; Reyes, George

    2015-03-01

    We have developed a Holographic Content Addressable Storage (HCAS) architecture. The HCAS system consists of a DMD (Digital Micromirror Device) as the input Spatial Light Modulator (SLM), a CMOS (Complementary Metal-Oxide Semiconductor) sensor as the output photodetector, and a photorefractive crystal as the recording medium. The HCAS system is capable of performing optical correlation of an input image/feature against a massive reference data set stored in the holographic memory. A detailed system analysis will be reported in this paper.

  17. Scientific Software Component Technology

    SciTech Connect

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  18. Pore Velocity Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Devary, J. L.; Doctor, P. G.

    1982-08-01

    Geostatistical data analysis techniques were used to stochastically model the spatial variability of groundwater pore velocity in a potential waste repository site. Kriging algorithms were applied to Hanford Reservation data to estimate hydraulic conductivities, hydraulic head gradients, and pore velocities. A first-order Taylor series expansion for pore velocity was used to statistically combine hydraulic conductivity, hydraulic head gradient, and effective porosity surfaces and uncertainties to characterize the pore velocity uncertainty. Use of these techniques permits the estimation of pore velocity uncertainties when pore velocity measurements do not exist. Large pore velocity estimation uncertainties were found to be located in the region where the hydraulic head gradient relative uncertainty was maximal.
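
    The first-order Taylor combination mentioned here can be made concrete. Assuming the standard Darcy definition of pore velocity, v = K*i/n (hydraulic conductivity times hydraulic head gradient over effective porosity), and independent input errors (an editorial simplification; the study worked with kriged surfaces and their uncertainties), the propagation reads as in this Python sketch. The numbers are illustrative, not Hanford data.

        import math

        def pore_velocity_uncertainty(K, sK, i, si, n, sn):
            # First-order (Taylor series) propagation for v = K * i / n,
            # neglecting correlations between the three inputs.
            v = K * i / n
            dK, di, dn = i / n, K / n, -K * i / n**2   # partial derivatives
            sv = math.sqrt((dK * sK)**2 + (di * si)**2 + (dn * sn)**2)
            return v, sv

        v, sv = pore_velocity_uncertainty(K=1.0, sK=0.5, i=1e-3, si=2e-4,
                                          n=0.2, sn=0.05)
        print(f"v = {v:.2e} +/- {sv:.2e} m/d")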

  19. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  20. Scientific Claims versus Scientific Knowledge.

    ERIC Educational Resources Information Center

    Ramsey, John

    1991-01-01

    Provides activities that help students to understand the importance of the scientific method. The activities include the science of fusion and cold fusion; a group activity that analyzes and interprets the events surrounding cold fusion; and an application research project concerning a current science issue. (ZWH)

  1. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, J. Storrs; Levy, Saul; Smith, Donald E.; Miyake, Keith M.

    1992-01-01

    A parameterized version of the tree processor was designed and tested (by simulation). The leaf processor design is 90 percent complete. We expect to complete and test a combination of tree and leaf cell designs in the next period. Work is proceeding on algorithms for the content addressable memory (CAM), and once the design is complete we will begin simulating algorithms for large problems. The following topics are covered: (1) the practical implementation of content addressable memory; (2) design of a LEAF cell for the Rutgers CAM architecture; (3) a circuit design tool user's manual; and (4) design and analysis of efficient hierarchical interconnection networks.

  2. Bioreactors Addressing Diabetes Mellitus

    PubMed Central

    Minteer, Danielle M.; Gerlach, Jorg C.

    2014-01-01

    The concept of bioreactors in biochemical engineering is a well-established process; however, the idea of applying bioreactor technology to biomedical and tissue engineering issues is relatively novel and has been rapidly accepted as a culture model. Tissue engineers have developed and adapted various types of bioreactors in which to culture many different cell types and therapies addressing several diseases, including diabetes mellitus types 1 and 2. With a rising world of bioreactor development and an ever-increasing diagnosis rate of diabetes, this review aims to highlight bioreactor history and emerging bioreactor technologies used for diabetes-related cell culture and therapies. PMID:25160666

  3. Scientific Misconduct

    NASA Astrophysics Data System (ADS)

    Moore, John W.

    2002-12-01

    These cases provide a good basis for discussions of scientific ethics, particularly with respect to the responsibilities of colleagues in collaborative projects. With increasing numbers of students working in cooperative or collaborative groups, there may be opportunities for more than just discussion—similar issues of responsibility apply to the members of such groups. Further, this is an area where, “no clear, widely accepted standards of behavior exist” (1). Thus there is an opportunity to point out to students that scientific ethics, like science itself, is incomplete and needs constant attention to issues that result from new paradigms such as collaborative research. Finally, each of us can resolve to pay more attention to the contributions we and our colleagues make to collaborative projects, applying to our own work no less critical an eye than we would cast on the work of those we don’t know at all.

  4. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004] which was further refined by lengthy discussions among many IPCC Lead Authors, for over a year, resulting in a short summary of a standard approach to be followed for that

  5. Addressing Environmental Health Inequalities.

    PubMed

    Gouveia, Nelson

    2016-01-01

    Environmental health inequalities refer to health hazards disproportionately or unfairly distributed among the most vulnerable social groups, which are generally the most discriminated, poor populations and minorities affected by environmental risks. Although it has been known for a long time that health and disease are socially determined, only recently has this idea been incorporated into the conceptual and practical framework for the formulation of policies and strategies regarding health. In this Special Issue of the International Journal of Environmental Research and Public Health (IJERPH), "Addressing Environmental Health Inequalities-Proceedings from the ISEE Conference 2015", we incorporate nine papers that were presented at the 27th Conference of the International Society for Environmental Epidemiology (ISEE), held in Sao Paulo, Brazil, in 2015. This small collection of articles provides a brief overview of the different aspects of this topic. Addressing environmental health inequalities is important for the transformation of our reality and for changing the actual development model towards more just, democratic, and sustainable societies driven by another form of relationship between nature, economy, science, and politics. PMID:27618906

  6. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  7. Uncertainty, conflict and consent: revisiting the futility debate in neurotrauma.

    PubMed

    Honeybul, Stephen; Gillett, Grant R; Ho, Kwok M

    2016-07-01

    The concept of futility has been debated for many years, and a precise definition remains elusive. This is not entirely surprising given the increasingly complex and evolving nature of modern medicine. Progressively more complex decisions are required when considering increasingly sophisticated diagnostic and therapeutic interventions. Allocating resources appropriately amongst a population whose expectations continue to increase raises a number of ethical issues, not least of which are the difficulties encountered when consideration is being given to withholding "life-preserving" treatment. In this discussion we have used decompressive craniectomy for severe traumatic brain injury as a clinical example with which to frame an approach to the concept. We have defined those issues that initially lead us to consider futility and thereafter actually provoke a significant discussion. We contend that these issues are uncertainty, conflict and consent. We then examine recent scientific advances in outcome prediction that may address some of the uncertainty and perhaps help achieve consensus amongst stakeholders. Whilst we do not anticipate that this re-framing of the idea of futility is applicable to all medical situations, the approach to specify patient-centred benefit may assist those making such decisions when patients are incompetent to participate. PMID:27143027

  8. Scientific Component Technology Initiative

    SciTech Connect

    Kohn, S; Bosl, B; Dahlgren, T; Kumfert, G; Smith, S

    2003-02-07

    The laboratory has invested a significant amount of resources towards the development of high-performance scientific simulation software, including numerical libraries, visualization, steering, software frameworks, and physics packages. Unfortunately, because this software was not designed for interoperability and re-use, it is often difficult to share these sophisticated software packages among applications due to differences in implementation language, programming style, or calling interfaces. This LDRD Strategic Initiative investigated and developed software component technology for high-performance parallel scientific computing to address problems of complexity, re-use, and interoperability for laboratory software. Component technology is an extension of scripting and object-oriented software development techniques that specifically focuses on the needs of software interoperability. Component approaches based on CORBA, COM, and Java technologies are widely used in industry; however, they do not support massively parallel applications in science and engineering. Our research focused on the unique requirements of scientific computing on ASCI-class machines, such as fast in-process connections among components, language interoperability for scientific languages, and data distribution support for massively parallel SPMD components.

  9. Uncertainty Analysis of Model Coupling

    NASA Astrophysics Data System (ADS)

    Held, H.; Knopf, B.; Schneider von Deimling, T.; Schellnhuber, H.-J.

    The Earth System is a highly complex system that is often modelled by coupling several nonlinear submodules. For predicting the climate with these models, the following uncertainties play an essential role: parameter uncertainty, uncertainty in initial conditions or model uncertainty. Here we will address uncertainty in initial conditions as well as model uncertainty. As the process of coupling is an important part of modeling, the main aspect of this work is the investigation of uncertainties that are due to the coupling process. For this study we use conceptual models that, compared to GCMs, have the advantage that the model itself as well as the output can be treated in a mathematically elaborated way. As the time for running the model is much shorter, the investigation is also possible for a longer period, e.g. for paleo runs. In consideration of these facts it is feasible to analyse the whole phase space of the model. The process of coupling is investigated by using different methods of examining low order coupled atmosphere-ocean systems. In the dynamical approach a fully coupled system of the two submodules can be compared to a system where one submodule forces the other. For a particular atmosphere-ocean system, based on the Lorenz model for the atmosphere, there can be shown significant differences in the predictability of a forced system depending on whether the subsystems are coupled in a linear or a nonlinear way. In [1] it is shown that in the linear case the forcing cannot represent the coupling, but in the nonlinear case, which we investigated in our study, the variability and the statistics of the coupled system can be reproduced by the forcing. Another approach to analyse the coupling is to carry out a bifurcation analysis. Here the bifurcation diagram of a single atmosphere system is compared to that of a coupled atmosphere-ocean system. Again it can be seen from the different behaviour of the coupled and the uncoupled system, that the
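
    A minimal version of such a low-order coupled system can be written down directly. The Python sketch below is an editorial illustration: the coupling form, the parameters, and the slow "ocean" equation are guesses for demonstration, not the system analyzed in [1]. It integrates a Lorenz-63 "atmosphere" nonlinearly coupled to one slow variable; replacing the coupling term c*x*w with a prescribed forcing time series would give the forced configuration contrasted with full coupling above.

        import numpy as np

        def coupled_lorenz(state, s=10.0, r=28.0, b=8.0/3.0, c=0.1):
            # Lorenz-63 'atmosphere' (x, y, z) with a nonlinear coupling
            # term c*x*w to a slow 'ocean' variable w.
            x, y, z, w = state
            return np.array([s * (y - x),
                             x * (r - z) - y + c * x * w,
                             x * y - b * z,
                             0.01 * (x - w)])          # slow ocean response

        def rk4(f, state, dt=0.01, steps=20000):
            # Plain fourth-order Runge-Kutta integration.
            traj = np.empty((steps, state.size))
            for k in range(steps):
                k1 = f(state)
                k2 = f(state + 0.5 * dt * k1)
                k3 = f(state + 0.5 * dt * k2)
                k4 = f(state + dt * k3)
                state = state + (dt / 6.0) * (k1 + 2*k2 + 2*k3 + k4)
                traj[k] = state
            return traj

        traj = rk4(coupled_lorenz, np.array([1.0, 1.0, 1.0, 0.0]))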

  10. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, Josh; Levy, Saul; Smith, D.; Wei, S.; Miyake, K.; Murdocca, M.

    1991-01-01

    The progress on the Rutgers CAM (Content Addressable Memory) Project is described. The overall design of the system is completed at the architectural level and described. The machine is composed of two kinds of cells: (1) the CAM cells, which include both memory and processor and support local processing within each cell; and (2) the tree cells, which have a smaller instruction set and provide global processing over the CAM cells. A parameterized design of the basic CAM cell is completed. Progress was made on the final specification of the CPS. The machine architecture was driven by the design of algorithms whose requirements are reflected in the resulting instruction set(s). A few of these algorithms are described.

  11. Bax: Addressed to kill.

    PubMed

    Renault, Thibaud T; Manon, Stéphen

    2011-09-01

    The pro-apoptotic protein Bax (Bcl-2 Associated protein X) plays a central role in the mitochondria-dependent apoptotic pathway. In healthy mammalian cells, Bax is essentially cytosolic and inactive. Following a death signal, the protein is translocated to the outer mitochondrial membrane, where it promotes a permeabilization that favors the release of different apoptogenic factors, such as cytochrome c. The regulation of Bax translocation is associated with conformational changes that are under the control of different factors. The evidence showing the involvement of different Bax domains in its mitochondrial localization is presented. The interactions between Bax and its different partners are described in relation to their ability to promote (or prevent) Bax conformational changes leading to mitochondrial addressing and to the acquisition of the capacity to permeabilize the outer mitochondrial membrane. PMID:21641962

  12. PHIGS PLUS for scientific graphics

    SciTech Connect

    Crawfis, R.A.

    1991-01-14

    This paper gives a brief overview of the use of computer graphics standards in the scientific community. It particularly details how PHIGS PLUS meets the needs of users at the Lawrence Livermore National Laboratory. Although standards for computer graphics have improved substantially over the past decade, their acceptance in the scientific community has been slow. As the use and diversity of computers have increased, the scientific graphics libraries have not been able to keep pace with the additional capabilities these new machines offer. Therefore, several organizations have converted, or are now working on converting, their scientific libraries to rest upon a portable standard. This paper will address why this transition has been so slow and offer suggestions for future standards work to enhance scientific visualization. This work was performed under the auspices of the US Department of Energy by Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  13. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
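
    The Monte Carlo approach described here can be illustrated with a toy example. In the Python sketch below, discharge uncertainty is represented by sampling the parameters of a hypothetical power-law rating curve Q = a*h**b, and the spread of a simple signature (mean flow) is summarized from the ensemble. The rating form, parameter distributions, and stage record are invented for illustration; the paper itself samples feasible rating curves and other data uncertainties.

        import numpy as np

        rng = np.random.default_rng(1)
        stage = rng.uniform(0.2, 1.5, size=365)   # hypothetical stage record (m)
        a = rng.normal(5.0, 0.5, size=2000)       # feasible rating parameters
        b = rng.normal(1.8, 0.1, size=2000)

        # Propagate each sampled rating curve to the signature value.
        mean_flow = np.array([np.mean(ai * stage**bi) for ai, bi in zip(a, b)])

        lo, med, hi = np.percentile(mean_flow, [5, 50, 95])
        print(f"mean flow {med:.2f} m3/s, 90% interval [{lo:.2f}, {hi:.2f}], "
              f"relative width +/-{100 * (hi - lo) / (2 * med):.0f}%")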

  14. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-04-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, including for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.

  15. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  16. Uncertainty of decibel levels.

    PubMed

    Taraldsen, Gunnar; Berge, Truls; Haukland, Frode; Lindqvist, Bo Henry; Jonasson, Hans

    2015-09-01

    The mean sound exposure level from a source is routinely estimated by the mean of the observed sound exposures from repeated measurements. A formula for the standard uncertainty based on the Guide to the Expression of Uncertainty in Measurement (GUM) is derived. An alternative formula is derived for the case where the GUM method fails. The formulas are applied to several examples and compared with a Monte Carlo calculation of the standard uncertainty. The recommended formula can be seen simply as a convenient translation of the uncertainty on an energy scale into the decibel level scale, but with a theoretical foundation. PMID:26428824
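
    At first order, the "translation of the uncertainty on an energy scale into the decibel level scale" is the usual GUM linearization. In the editorial notation below (any additional terms in the paper's recommended formula are omitted), with the mean exposure \bar{E} estimated from n repeated measurements of sample standard deviation s:

        L = 10\,\log_{10}\!\left(\frac{\bar{E}}{E_0}\right),
        \qquad
        u(L) \approx \frac{10}{\ln 10}\,\frac{u(\bar{E})}{\bar{E}}
             \approx 4.34\,\frac{u(\bar{E})}{\bar{E}},
        \qquad
        u(\bar{E}) = \frac{s}{\sqrt{n}}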

  17. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km² Mahurangi catchment in New Zealand and the 135 km² Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty

  18. Assessing what to address in science communication.

    PubMed

    Bruine de Bruin, Wändi; Bostrom, Ann

    2013-08-20

    As members of a democratic society, individuals face complex decisions about whether to support climate change mitigation, vaccinations, genetically modified food, nanotechnology, geoengineering, and so on. To inform people's decisions and public debate, scientific experts at government agencies, nongovernmental organizations, and other organizations aim to provide understandable and scientifically accurate communication materials. Such communications aim to improve people's understanding of the decision-relevant issues, and if needed, promote behavior change. Unfortunately, existing communications sometimes fail when scientific experts lack information about what people need to know to make more informed decisions or what wording people use to describe relevant concepts. We provide an introduction for scientific experts about how to use mental models research with intended audience members to inform their communication efforts. Specifically, we describe how to conduct interviews to characterize people's decision-relevant beliefs or mental models of the topic under consideration, identify gaps and misconceptions in their knowledge, and reveal their preferred wording. We also describe methods for designing follow-up surveys with larger samples to examine the prevalence of beliefs as well as the relationships of beliefs with behaviors. Finally, we discuss how findings from these interviews and surveys can be used to design communications that effectively address gaps and misconceptions in people's mental models in wording that they understand. We present applications to different scientific domains, showing that this approach leads to communications that improve recipients' understanding and ability to make informed decisions. PMID:23942122

  19. Assessing what to address in science communication

    PubMed Central

    Bruine de Bruin, Wändi; Bostrom, Ann

    2013-01-01

    As members of a democratic society, individuals face complex decisions about whether to support climate change mitigation, vaccinations, genetically modified food, nanotechnology, geoengineering, and so on. To inform people’s decisions and public debate, scientific experts at government agencies, nongovernmental organizations, and other organizations aim to provide understandable and scientifically accurate communication materials. Such communications aim to improve people’s understanding of the decision-relevant issues, and if needed, promote behavior change. Unfortunately, existing communications sometimes fail when scientific experts lack information about what people need to know to make more informed decisions or what wording people use to describe relevant concepts. We provide an introduction for scientific experts about how to use mental models research with intended audience members to inform their communication efforts. Specifically, we describe how to conduct interviews to characterize people’s decision-relevant beliefs or mental models of the topic under consideration, identify gaps and misconceptions in their knowledge, and reveal their preferred wording. We also describe methods for designing follow-up surveys with larger samples to examine the prevalence of beliefs as well as the relationships of beliefs with behaviors. Finally, we discuss how findings from these interviews and surveys can be used to design communications that effectively address gaps and misconceptions in people’s mental models in wording that they understand. We present applications to different scientific domains, showing that this approach leads to communications that improve recipients’ understanding and ability to make informed decisions. PMID:23942122

  20. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2011-10-06

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: data evaluation; cross section measurements; experimental techniques; uncertainties and covariances; fission properties; current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean

  1. Demonstrating the value of medicines: evolution of value equation and stakeholder perception of uncertainties.

    PubMed

    Narayanan, Siva

    2016-01-01

    It is important to evaluate how the value of medicine is assessed, as it may have important implications for health technology and reimbursement assessments. The value equation could comprise 'incremental benefit/outcome' (relative results of care in terms of patient health, comparing the innovation to best available alternative(s)) in the numerator and 'cost' (relative costs involved in the full cycle of care (or a defined period) for the patient's medical condition, incorporating the relevant cost-offsets due to displacement of best available alternative(s)) in the denominator. This 'relative value' combined with the overall net budget impact (of including the drug in the formulary or reimbursed drug list) at the concerned population level in the given institution/region/country may better inform the usefulness of the new therapeutic option to the healthcare system. As product value messages are created, anticipating external stakeholder questions and information needs, including addressing three main categories of 'uncertainties', namely the scientific uncertainties, usage uncertainties, and financial uncertainties, could facilitate demonstration of optimal product value and help informed decision-making to benefit all stakeholders involved in the process. PMID:27489585
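
    Schematically, the value equation described above can be written as follows; the notation is editorial shorthand for the abstract's terms, not symbols from the source:

        \text{Relative value}
        \;=\;
        \frac{\text{incremental benefit}}{\text{relative cost}}
        \;=\;
        \frac{\text{outcome}_{\text{innovation}} - \text{outcome}_{\text{best alternative}}}
             {\text{cost}_{\text{full cycle of care}} - \text{cost offsets}_{\text{displaced alternative(s)}}}

    This relative value is then weighed together with the overall net budget impact at the population level when judging the usefulness of a new therapeutic option.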

  2. Icarus's discovery: Acting on global climate change in the face of uncertainty

    SciTech Connect

    Brooks, D.G.; Maracas, K.B.; Hayslip, R.M.

    1994-12-31

    The mythological character Icarus had the misfortune of learning the consequences of his decision to fly too near the sun at the same time he carried it out. Although Daedalus tried to reduce the uncertainties of his son's decision by warning Icarus of the possible outcome, Icarus had no empirical knowledge of what would actually happen until his waxen wings melted and he fell to the sea. Like Icarus, man has no empirical knowledge or conclusive evidence today of the possible effects of global climate change. And though the consequences of policy decisions toward global climate change may not be as catastrophic as falling into the sea, the social and economic impacts of those decisions will be substantial. There are broad uncertainties related to the scientific and ecological aspects of global climate change. But clearly the "politics" of global climate change issues are moving at a faster rate than the science. There is a public outcry for action now, in the face of uncertainty. This paper profiles a case study of a southwestern utility's use of multi-attribute preference theory to reduce uncertainties and analyze its options for addressing global climate change issues.

  3. Magnetic content addressable memories

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenye

    Content Addressable Memories are designed with comparison circuits built into every bit cell. This parallel structure can increase the speed of searching from O(n) (as with Random Access Memories) to O(1), where n is the number of entries being searched. The high hardware cost limits the application of CAM to situations where higher search speed is strongly desired. Spintronics technology can build non-volatile Magnetic RAM with only one device per bit cell. Various technologies are involved, such as Magnetic Tunnel Junctions (MTJs), the off-easy-axis programming method, Synthetic Anti-Ferromagnetic (SAF) tri-layers, Domain Wall (DW) displacement, and Spin Transfer Torque (STT) tri-layers. With these, particularly the Tunnel Magneto-Resistance (TMR) variation in an MTJ due to the difference in magnetization polarity of its two magnets, a Magnetic CAM can be developed at reduced hardware cost, as the discussion in this dissertation demonstrates. Six MCAM designs are discussed. In the first design, the comparand (C), the local information (S), and their complements are stored in 4 MTJs connected in an XOR-gate pattern. The other five designs have one or two stacks for both information storage and comparison, and can take advantage of the full TMR ratio. Two challenges for these five are programming C specifically without changing S, and selectively programming one cell out of an array. Specific programming is solved by confining the programming field for C in a ring structure design; by using field programming and spin-polarized current programming respectively for C and S in the SAF+DW and SAF+STT tri-layer designs; and by making use of the difference in thresholds between direct-mode and toggle-mode switching in the SAF+SAF design. The problem of selective programming is addressed by the off-easy-axis method and by including SAF tri-layers. A cell with STT tri-layers for both C and S can completely avoid the problems of specific and selective programming, but is subject to the limit of
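
    A few lines of software convey the search semantics that the per-cell comparison circuits implement. The Python sketch below is purely illustrative: a software loop is O(n), whereas the CAM hardware compares every stored word against the key simultaneously, which is what yields the O(1) search quoted above. Here 'X' marks a ternary don't-care bit of the kind mask-capable CAM cells provide.

        def tcam_lookup(key, entries):
            # Return the addresses of all stored words matching the search key;
            # 'X' in a stored word matches either key bit (ternary CAM).
            def matches(word):
                return all(w in ('X', k) for w, k in zip(word, key))
            return [addr for addr, word in enumerate(entries) if matches(word)]

        entries = ["1010", "10X1", "0XX0", "1111"]
        print(tcam_lookup("1011", entries))   # -> [1]: only '10X1' matches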

  4. A review of uncertainty research in impact assessment

    SciTech Connect

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  5. Because Doubt Is A Sure Thing: Incorporating Uncertainty Characterization Into Climate Change Decision-Making

    NASA Astrophysics Data System (ADS)

    Moss, R.; Rice, J.; Scott, M. J.; Unwin, S.; Whitney, P.

    2012-12-01

    This presentation describes the results of new research to develop a stakeholder-driven uncertainty characterization (UC) process to help address the challenges of regional climate change mitigation and adaptation decisions. Integrated regional Earth system models are a promising approach for modeling how climate change may affect natural resources, infrastructure, and socioeconomic conditions at regional scales, and how different adaptation and mitigation strategies may interact. However, the inherent complexity, long run-times, and large numbers of uncertainties in coupled regional human-environment systems render standard, model-driven approaches for uncertainty characterization infeasible. This new research focuses on characterizing stakeholder decision support needs as part of an overall process to identify the key uncertainties relevant for the application in question. The stakeholder-driven process reduces the dimensionality of the uncertainty modeling challenge while providing robust insights for science and decision-making. This research is being carried out as part of the integrated Regional Earth System Model (iRESM) initiative, a new scientific framework developed at Pacific Northwest National Laboratory to evaluate the interactions between human and environmental systems and mitigation and adaptation decisions at regional scales. The framework provides a flexible architecture for model couplings between a regional Earth system model, a regional integrated assessment model, and highly spatially resolved models of crop productivity, building energy demands, electricity infrastructure operation and expansion, and water supply and management. In an example of applying the stakeholder-driven UC process, the presentation first identifies stakeholder decision criteria for a particular regional mitigation or adaptation question. These criteria are used in conjunction with the flexible architecture to determine the relevant component models for coupling and the

  6. Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2008-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance like so, 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags), to provide a path to the nominal model parameter file. And, as shown in [1
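
    A few lines of Python convey the flavor of such a tolerance markup mini-language. The sketch below parses only the simplest absolute- and percent-tolerance forms with an optional tag; the real language's order-of-magnitude tolerances, distribution choices, and C-style format suffixes are not handled, and the grammar here is an editorial assumption, not the published DSL.

        import re

        TOLERANCE = re.compile(r"(?P<value>[-+]?\d*\.?\d+)\s*\+/-\s*"
                               r"(?P<tol>\d*\.?\d+)(?P<pct>%?)"
                               r"(?:\s*'(?P<tag>[^']*)')?")

        def parse_tolerance(text):
            # Parse e.g. "1 +/- 0.5" or "0.7 +/- 20% 'T_effective'".
            m = TOLERANCE.search(text)
            if m is None:
                raise ValueError(f"no tolerance found in {text!r}")
            value, tol = float(m.group("value")), float(m.group("tol"))
            half_width = value * tol / 100.0 if m.group("pct") else tol
            return {"value": value, "half_width": half_width, "tag": m.group("tag")}

        print(parse_tolerance("0.7 +/- 20% 'T_effective'"))
        # -> value 0.7, half-width ~0.14, tag 'T_effective'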

  7. Scientific Data Management Center Scientific Data Integration

    SciTech Connect

    Critchlow, T J; Liu, L; Pu, C; Gupta, A; Ludaescher, B; Altintas, I; Vouk, M; Bitzer, D; Singh, M; Rosnick, D

    2003-01-31

    The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This has resulted in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. Thus instead of benefiting from this information rich environment, scientists become experts on a small number of sources and use those sources almost exclusively. Enabling information based scientific advances, in domains such as functional genomics, requires fully utilizing all available information. We are developing an end-to-end solution using leading-edge automatic wrapper generation, mediated query, and agent technology that will allow scientists to interact with more information sources than currently possible. Furthermore, by taking a workflow-based approach to this problem, we allow them to easily adjust the dataflow between the various sources to address their specific research needs.

  8. Balancing Certainty and Uncertainty in Clinical Practice

    ERIC Educational Resources Information Center

    Kamhi, Alan G.

    2011-01-01

    Purpose: In this epilogue, I respond to each of the five commentaries, discussing in some depth a central issue raised in each commentary. In the final section, I discuss how my thinking about certainty and uncertainty in clinical practice has evolved since I wrote the initial article. Method: Topics addressed include the similarities/differences…

  9. Gaps in scientific knowledge about the carcinogenic potential of asphalt/bitumen fumes.

    PubMed

    Schulte, Paul A

    2007-01-01

    Despite a relatively large body of published research, the potential carcinogenicity of asphalt/bitumen fumes is still a vexing question. Various uncertainties and gaps in scientific knowledge need to be addressed. These include uncertainties in chemistry, animal studies, and human studies. The chemistry of asphalt/bitumen fumes is complex and varies according to the source of the crude oil and the application parameters. The epidemiological studies, while showing weak evidence of lung cancer, are inconsistent, and many confounding factors have not been addressed. Studies of animal exposure are also inconsistent regarding laboratory and field-generated fumes. There is a need for further human studies that address potential confounding factors such as smoking, diet, coal tar, and diesel exposures. Animal inhalation studies need to be conducted with asphalt/bitumen fumes that are chemically representative of roofing and paving fumes. Underlying all of this is the need for continued characterization of fumes so their use in animal and field studies can be properly assessed. Nonetheless, uncertainties such as these should not preclude appropriate public health actions to protect workers in the event that asphalt fumes are found to be a carcinogenic hazard. PMID:17503268

  10. Climate change, uncertainty, and natural resource management

    USGS Publications Warehouse

    Nichols, J.D.; Koneff, M.D.; Heglund, P.J.; Knutson, M.G.; Seamans, M.E.; Lyons, J.E.; Morton, J.M.; Jones, M.T.; Boomer, G.S.; Williams, B.K.

    2011-01-01

    Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives, and the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. © 2010 The Wildlife Society.

  11. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
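
    A standard way to satisfy such a 95%/95% acceptance criterion, not named in this abstract but common in best-estimate-plus-uncertainty practice, is Wilks' order-statistic formula: with N random best-estimate runs, the largest computed value bounds the 95th percentile of the output with 95% confidence once 1 - 0.95^N >= 0.95, which gives N = 59 at first order. A minimal sketch:

        import math

        def wilks_sample_size(coverage=0.95, confidence=0.95):
            # Smallest N with 1 - coverage**N >= confidence
            # (one-sided, first-order Wilks tolerance limit).
            return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

        print(wilks_sample_size())   # -> 59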

  12. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  13. Adapting to Uncertainty: Comparing Methodological Approaches to Climate Adaptation and Mitigation Policy

    NASA Astrophysics Data System (ADS)

    Huda, J.; Kauneckis, D. L.

    2013-12-01

    Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.

  14. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  15. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  16. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  17. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  18. Economic uncertainty and econophysics

    NASA Astrophysics Data System (ADS)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework, in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  19. Managing the Future: Public Policy, Scientific Uncertainty, and Global Warming.

    ERIC Educational Resources Information Center

    Jamieson, Dale

    Due to the injection of carbon dioxide and various other gases into the atmosphere, the world of the 21st century may well have a climate that is beyond the parameters of human existence. Physical science produces information regarding the physical effects of increasing concentrations of "greenhouse" gases. Once this information is developed, it…

  20. Uncertainty in hydrological signatures for gauged and ungauged catchments

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
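
    The core of the gauged-uncertainty estimate, Monte Carlo sampling over feasible rating curves, is easy to sketch. The following is a minimal illustration with synthetic stage data and an assumed power-law rating curve; the parameter ranges and the mean-flow signature are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Observed stage (water level) time series, in metres (synthetic here).
stage = rng.gamma(shape=2.0, scale=0.4, size=365)

# Power-law rating curve Q = a * (h - h0)^b with uncertain parameters;
# the ranges below stand in for "multiple feasible rating curves".
n_curves = 1000
a  = rng.uniform(4.0, 6.0, n_curves)
b  = rng.uniform(1.4, 1.8, n_curves)
h0 = rng.uniform(0.0, 0.05, n_curves)

signature = np.empty(n_curves)
for i in range(n_curves):
    q = a[i] * np.clip(stage - h0[i], 0, None) ** b[i]  # discharge series
    signature[i] = q.mean()                             # mean-flow signature

lo, med, hi = np.percentile(signature, [2.5, 50, 97.5])
print(f"mean flow: {med:.2f} (95% interval {lo:.2f}-{hi:.2f})")
print(f"relative uncertainty: +/-{100 * (hi - lo) / (2 * med):.0f}%")
```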

  1. PIV uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard

    2016-08-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5–10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
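
    The paper's point that the random uncertainty of a statistical quantity decreases with the square root of the effective number of independent samples can be illustrated in a few lines. The sketch below assumes, purely for illustration, that the sampled velocity behaves like an AR(1) process, for which the effective sample size has the standard closed form N_eff = N(1 - rho)/(1 + rho).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic correlated time series standing in for PIV velocity samples
# at one point (AR(1) model with unit variance, assumed for illustration).
N, rho = 5000, 0.6
u = np.empty(N)
u[0] = rng.normal()
for k in range(1, N):
    u[k] = rho * u[k - 1] + np.sqrt(1 - rho**2) * rng.normal()

sigma = u.std(ddof=1)
# Effective number of independent samples (large-N AR(1) approximation):
N_eff = N * (1 - rho) / (1 + rho)
u_mean_unc = sigma / np.sqrt(N_eff)
print(f"mean = {u.mean():.4f} +/- {u_mean_unc:.4f}  (N_eff ~ {N_eff:.0f} of {N})")
```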

  2. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  3. Exploring Cloud Computing for Large-scale Scientific Applications

    SciTech Connect

    Lin, Guang; Han, Binh; Yin, Jian; Gorton, Ian

    2013-06-27

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  4. Estimating the magnitude of prediction uncertainties for the APLE model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  5. The challenges on uncertainty analysis for pebble bed HTGR

    SciTech Connect

    Hao, C.; Li, F.; Zhang, H.

    2012-07-01

    Uncertainty analysis is widely practiced and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. IAEA will soon launch a Coordination Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities, and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are: the lack of experience, the totally different code packages, and the coupling of power distribution, temperature distribution and burnup distribution through the temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow, the uncertainty in pebble bed flow modeling, and their contribution to the uncertainty of maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  6. Assessing MODIS Macrophysical Cloud Property Uncertainties

    NASA Astrophysics Data System (ADS)

    Maddux, B. C.; Ackerman, S. A.; Frey, R.; Holz, R.

    2013-12-01

    Cloud, being multifarious and ephemeral, is difficult to observe and quantify in a systematic way. Even basic terminology used to describe cloud observations is fraught with ambiguity in the scientific literature. Any observational technique, method, or platform will contain inherent and unavoidable measurement uncertainties. Quantifying these uncertainties in cloud observations is a complex task that requires an understanding of all aspects of the measurement. We will use cloud observations obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) to obtain metrics of the uncertainty of its cloud observations. Our uncertainty analyses will contain two main components: 1) an attempt to estimate bias or uncertainty with respect to active measurements from CALIPSO, and 2) a relative uncertainty within the MODIS cloud climatologies themselves. Our method will link uncertainty to the physical observation and its environmental/scene characteristics. Our aim is to create statistical uncertainties that are based on the cloud observational values, satellite view geometry, surface type, etc., for cloud amount and cloud top pressure. The MODIS instruments on the NASA Terra and Aqua satellites provide observations over a broad spectral range (36 bands between 0.415 and 14.235 micron) and high spatial resolution (250 m for two bands, 500 m for five bands, 1000 m for 29 bands), which the MODIS cloud mask algorithm (MOD35) utilizes to provide clear/cloud determinations over a wide array of surface types, solar illuminations and view geometries. For this study we use the standard MODIS products, MOD03, MOD06 and MOD35, all of which were obtained from the NASA Level 1 and Atmosphere Archive and Distribution System.

  7. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
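
    The Polynomial Chaos step can be sketched compactly: fit a spectral expansion to a handful of model runs, then propagate input uncertainty through the cheap surrogate. The toy model, the standard-normal input, and the expansion degree below are assumptions made for illustration; the actual CLM application is far higher-dimensional.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander, hermeval

rng = np.random.default_rng(7)

def expensive_model(x):          # stand-in for a costly climate-model run
    return np.sin(x) + 0.3 * x**2

# A small number of training runs, reflecting sparse model evaluations.
x_train = rng.normal(size=20)    # standard-normal uncertain input (assumed)
y_train = expensive_model(x_train)

# Least-squares fit of a degree-4 polynomial chaos expansion in
# probabilists' Hermite polynomials (orthogonal under the normal measure).
deg = 4
coef, *_ = np.linalg.lstsq(hermevander(x_train, deg), y_train, rcond=None)

# Propagation is now nearly free: evaluate the surrogate, not the model.
x_mc = rng.normal(size=200_000)
y_mc = hermeval(x_mc, coef)
print(f"surrogate mean = {y_mc.mean():.3f}, std = {y_mc.std():.3f}")
```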

  8. Confronting Uncertainty in Wildlife Management: Performance of Grizzly Bear Management

    PubMed Central

    Artelle, Kyle A.; Anderson, Sean C.; Cooper, Andrew B.; Paquet, Paul C.; Reynolds, John D.; Darimont, Chris T.

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone – discrepancy between expected and realized mortality levels – led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty. PMID:24223134
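
    The target-setting method can be mimicked in a few lines of Monte Carlo: simulate both biological uncertainty (the population estimate) and outcome uncertainty (scatter of realized kills around the target), then take the largest target whose probability of exceeding the mortality limit stays below the chosen threshold. All distributions and numbers below are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

n_sim = 100_000
# Biological uncertainty: lognormal scatter on the population estimate (assumed).
pop = rng.lognormal(mean=np.log(500), sigma=0.2, size=n_sim)
limit = 0.06 * pop          # mortality limit: 6% of the (unknown) true population

def prob_over(target):
    # Outcome uncertainty: realized kills scatter around the target (assumed CV).
    realized = target * rng.lognormal(0.0, 0.25, size=n_sim)
    return np.mean(realized > limit)

# Largest target whose probability of exceeding the limit stays below 5%:
targets = np.arange(1, 60)
ok = [t for t in targets if prob_over(t) <= 0.05]
print("max allowable target:", max(ok))
```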

  9. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    PubMed

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty. PMID:24223134

  10. Estimating uncertainty of streamflow simulation using Bayesian neural networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xuesong; Liang, Faming; Srinivasan, Raghavan; van Liew, Michael

    2009-02-01

    Recent studies have shown that Bayesian neural networks (BNNs) are powerful tools for providing reliable hydrologic prediction and quantifying the prediction uncertainty. The reasonable estimation of the prediction uncertainty, a valuable tool for decision making to address water resources management and design problems, is influenced by the techniques used to deal with different uncertainty sources. In this study, four types of BNNs with different treatments of the uncertainties related to parameters (neural network's weights) and model structures were applied for uncertainty estimation of streamflow simulation in two U.S. Department of Agriculture Agricultural Research Service watersheds (Little River Experimental Watershed in Georgia and Reynolds Creek Experimental Watershed in Idaho). An advanced Markov chain Monte Carlo algorithm, evolutionary Monte Carlo, was used to train the BNNs and to estimate uncertainty limits of streamflow simulation. The results obtained in these two case study watersheds show that the 95% uncertainty limits estimated by different types of BNNs are different from each other. The BNNs that only consider the parameter uncertainty with noninformative prior knowledge contain the least number of observed streamflow data in their 95% uncertainty bound. By considering variable model structure and informative prior knowledge, the BNNs can provide more reasonable quantification of the uncertainty of streamflow simulation. This study stresses the need for improving understanding and quantifying methods of different uncertainty sources for effective estimation of uncertainty of hydrologic simulation using BNNs.
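
    A minimal sketch of the idea is below: a tiny one-hidden-layer network whose weights are sampled with plain random-walk Metropolis rather than the paper's evolutionary Monte Carlo, on synthetic data with assumed priors and noise level. It also illustrates the paper's first finding: a band reflecting parameter (weight) uncertainty alone tends to contain fewer observations than its nominal 95%.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "streamflow" data: noisy nonlinear response to a driver.
x = np.linspace(-2, 2, 80)
y = np.tanh(1.5 * x) + 0.5 + rng.normal(0, 0.15, x.size)

H = 4  # hidden units of a tiny one-hidden-layer network

def unpack(w):
    W1 = w[:H].reshape(H, 1)
    b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H]
    b2 = w[3 * H]
    return W1, b1, W2, b2

def predict(w, xs):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(xs[:, None] @ W1.T + b1) @ W2 + b2

def log_post(w, sigma=0.15):
    resid = y - predict(w, x)
    # Gaussian likelihood with known noise + standard-normal prior on weights.
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(w**2)

# Random-walk Metropolis over the network weights.
n_w = 3 * H + 1
w, lp = rng.normal(0, 0.5, n_w), -np.inf
samples = []
for step in range(20_000):
    w_new = w + rng.normal(0, 0.05, n_w)
    lp_new = log_post(w_new)
    if np.log(rng.random()) < lp_new - lp:
        w, lp = w_new, lp_new
    if step > 5_000 and step % 10 == 0:
        samples.append(w.copy())

preds = np.stack([predict(ws, x) for ws in samples])
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)
coverage = np.mean((y >= lo) & (y <= hi))
print(f"observations inside the 95% parameter-only band: {coverage:.0%}")
```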

  11. The legal status of uncertainty

    NASA Astrophysics Data System (ADS)

    Ferraris, L.; Miozzo, D.

    2009-09-01

    Authorities of civil protection are giving extreme importance to scientific assessment through the widespread use of mathematical models that have been implemented in order to prevent and mitigate the effect of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We are, in fact, presently experiencing a detrimental increase of legal actions taken against the authorities of civil protection who, relying on the forecasts of mathematical models, fail to protect the population. It is our profound concern that civilians have been granted the right to be protected, by any means and to the same extent, from natural hazards and from the fallacious behaviour of those who should guarantee individual safety. But, at the same time, a dangerous overcriminalization could have a negative impact on the Civil Protection system, inducing a dangerous defensive behaviour which is costly and ineffective. A few case studies are presented in which the role of uncertainty in numerical predictions is made evident and discussed. Scientists thus need to help policymakers agree on sound procedures that recognize the real level of unpredictability. Hence, we suggest the creation of an international and interdisciplinary committee, with the aim of having politics, jurisprudence and science communicate, to find common solutions to a common problem.

  12. EPA scientific integrity policy draft

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-08-01

    The U.S. Environmental Protection Agency (EPA) issued its draft scientific integrity policy on 5 August. The draft policy addresses scientific ethical standards, communications with the public, the use of advisory committees and peer review, and professional development. The draft policy was developed by an ad hoc group of EPA senior staff and scientists in response to a December 2010 memorandum on scientific integrity from the White House Office of Science and Technology Policy. The agency is accepting public comments on the draft through 6 September; comments should be sent to osa.staff@epa.gov. For more information, see http://www.epa.gov/stpc/pdfs/draft-scientific-integrity-policy-aug2011.pdf.

  13. The cascade of uncertainty in modeling the impacts of climate change on Europe's forests

    NASA Astrophysics Data System (ADS)

    Reyer, Christopher; Lasch-Born, Petra; Suckow, Felicitas; Gutsch, Martin

    2015-04-01

    Projecting the impacts of global change on forest ecosystems is a cornerstone for designing sustainable forest management strategies and paramount for assessing the potential of Europe's forests to contribute to the EU bioeconomy. Research on climate change impacts on forests relies to a large extent on model applications along a model chain, from Integrated Assessment Models to General and Regional Circulation Models that provide important driving variables for forest models, or to decision support systems that synthesize findings of more detailed forest models to inform forest managers. At each step in the model chain, model-specific uncertainties about, amongst others, parameter values, input data or model structure accumulate, leading to a cascade of uncertainty. For example, climate change impacts on forests strongly depend on the in- or exclusion of CO2 effects or on the use of an ensemble of climate models rather than relying on one particular climate model. In the past, these uncertainties have been considered only partly, or not at all, in studies of climate change impacts on forests. This has left managers and decision-makers in doubt about how robust the projected impacts on forest ecosystems are. We deal with this cascade of uncertainty in a structured way, and the objective of this presentation is to assess how different types of uncertainties affect projections of the effects of climate change on forest ecosystems. To address this objective we synthesized a large body of scientific literature on modeled productivity changes and the effects of extreme events on plant processes. Furthermore, we apply the process-based forest growth model 4C to forest stands all over Europe and assess how different climate models, emission scenarios and assumptions about the parameters and structure of 4C affect the uncertainty of the model projections. We show that there are consistent regional changes in forest productivity such as an increase in NPP in cold and wet regions while

  14. Cascading rainfall uncertainties into 2D inundation impact models

    NASA Astrophysics Data System (ADS)

    Souvignet, Maxime; de Almeida, Gustavo; Champion, Adrian; Garcia Pintado, Javier; Neal, Jeff; Freer, Jim; Cloke, Hannah; Odoni, Nick; Coxon, Gemma; Bates, Paul; Mason, David

    2013-04-01

    Existing precipitation products show differences in their spatial and temporal distribution, and several studies have presented how these differences influence the ability to predict hydrological responses. However, the atmospheric-hydrologic-hydraulic uncertainty cascade is seldom explored and, importantly, how input uncertainties propagate through this cascade is still poorly understood. Such a project requires a combination of rainfall forecasting capabilities, runoff generation predictions based on those rainfall forecasts, and hydraulic flood wave propagation based on the runoff predictions. Accounting for uncertainty in each component is important in decision making for issuing flood warnings, monitoring or planning. We suggest that a better understanding of uncertainties in inundation impact modelling must consider these differences in rainfall products. This will improve our understanding of the effect of input uncertainties on our predictive capability. In this paper, we propose to address this issue by exploring the effects of errors in rainfall on inundation predictive capacity within an uncertainty framework, i.e. testing inundation uncertainty against different comparable meteorological conditions (i.e. using different rainfall products). Our method cascades rainfall uncertainties into a lumped hydrologic model (FUSE) within the GLUE uncertainty framework. The resultant prediction uncertainties in discharge provide uncertain boundary conditions, which are cascaded into a simplified shallow water 2D hydraulic model (LISFLOOD-FP). Rainfall data captured by three different measurement techniques, namely rain gauges, gridded data and numerical weather prediction (NWP) models, are used to assess the combined input data and model parameter uncertainty. The study is performed in the Severn catchment over the period between June and July 2007, when a series of rainfall events caused record floods in the study area. Changes in flood area extent are compared and the uncertainty envelope is
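
    The GLUE step of such a cascade can be compressed into a short sketch: sample hydrologic-model parameters, retain "behavioral" sets scored by Nash-Sutcliffe efficiency, and turn the behavioral ensemble into discharge bounds that would drive the downstream hydraulic model. The linear-reservoir model and all numbers are stand-ins for FUSE, and the plain percentiles below are a common simplification of likelihood-weighted quantiles.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(k, rain):
    """Toy linear-reservoir rainfall-runoff model (stand-in for FUSE)."""
    s, q = 0.0, np.empty(rain.size)
    for t, p in enumerate(rain):
        s += p                 # add rainfall to storage
        q[t] = k * s           # outflow proportional to storage
        s -= q[t]
    return q

rain = rng.gamma(0.8, 2.0, 200)                        # one rainfall realization
q_obs = simulate(0.3, rain) + rng.normal(0, 0.3, 200)  # synthetic "observations"

# GLUE: sample the parameter, score with Nash-Sutcliffe efficiency,
# and retain "behavioral" sets above a threshold.
ks = rng.uniform(0.05, 0.9, 2000)
sims = np.stack([simulate(k, rain) for k in ks])
nse = 1 - np.sum((sims - q_obs)**2, axis=1) / np.sum((q_obs - q_obs.mean())**2)
behavioral = nse > 0.5

# 5-95% discharge bounds from the behavioral ensemble: these uncertain
# series would become boundary conditions for the 2D hydraulic model.
lo, hi = np.percentile(sims[behavioral], [5, 95], axis=0)
print(f"{behavioral.sum()} behavioral sets; mean bound width {(hi - lo).mean():.2f}")
```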

  15. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
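
    A minimal sketch of the two ingredients, replicated Latin hypercube sampling and a variance-ratio importance indicator Var(E[Y|Xi])/Var(Y), follows; the three-input test function is invented, and the binned estimator of the conditional expectation is one simple choice among several. Note that no linearity assumption is needed.

```python
import numpy as np

rng = np.random.default_rng(11)

def model(x1, x2, x3):                  # toy prediction model (invented)
    return x1**2 + 0.5 * x2 + 0.01 * x3

def lhs(n):
    """One Latin hypercube sample of n points on [0, 1]."""
    return (np.argsort(rng.random(n)) + rng.random(n)) / n

# Replicated LHS: r independent hypercubes of size n.
n, r = 200, 20
y_all, x1_all = [], []
for _ in range(r):
    x1, x2, x3 = lhs(n), lhs(n), lhs(n)
    y_all.append(model(x1, x2, x3))
    x1_all.append(x1)
y, x1 = np.concatenate(y_all), np.concatenate(x1_all)

# Variance ratio for input x1: Var(E[Y|X1]) / Var(Y), estimated by binning.
bins = np.digitize(x1, np.linspace(0, 1, 11))
means = np.array([y[bins == b].mean() for b in range(1, 11)])
counts = np.array([(bins == b).sum() for b in range(1, 11)])
eta2 = np.sum(counts * (means - y.mean())**2) / (y.size * y.var())
print(f"importance of x1 (variance ratio): {eta2:.2f}")
```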

  16. 78 FR 64211 - FIFRA Scientific Advisory Panel; Notice of Cancellation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... Federal Register on August 9, 2013 (78 FR 48672, FRL-9394-3). This meeting will be rescheduled in the near... review scientific uncertainties associated with corn rootworm resistance monitoring for Bt corn...

  17. Conundrums with uncertainty factors.

    PubMed

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
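
    The singularity argument can be reconstructed in a short calculation (our hedged reading of the abstract, in simplified notation): interpreting the animal-to-human factor as a ratio of response rates, and applying the same factor to chronic and subchronic dosing, forces a linear constraint among the logged rates.

```latex
% Hedged sketch of the dependence argument, not the article's notation.
% The animal-to-human uncertainty factor as a ratio of response rates,
% assumed identical for chronic and subchronic dosing:
\[
  \mathrm{UF}_{A\to H}
  = \frac{R_{\text{animal,chronic}}}{R_{\text{human,chronic}}}
  = \frac{R_{\text{animal,sub}}}{R_{\text{human,sub}}}.
\]
% Taking logarithms yields a linear constraint among the four rates,
\[
  \log R_{\text{animal,chronic}} - \log R_{\text{human,chronic}}
  - \log R_{\text{animal,sub}} + \log R_{\text{human,sub}} = 0,
\]
% so the covariance matrix of the logged response rates is singular, and a
% Monte Carlo analysis built on it conditions on a set of probability zero.
```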

  18. Science, Uncertainty, and Adaptive Management in Large River Restoration Programs: Trinity River example

    NASA Astrophysics Data System (ADS)

    McBain, S.

    2002-12-01

    Following construction of Trinity and Lewiston dams on the upper Trinity River in 1964, dam-induced changes to streamflows and sediment regime had severely simplified channel morphology and aquatic habitat downstream of the dams. This habitat change, combined with blocked access to over 100 miles of salmon and steelhead habitat upstream of the dams, caused salmon and steelhead populations to quickly plummet. An instream flow study was initiated in 1984 to address the flow needs to restore the fishery, and this study relied on the Physical Habitat Simulation (PHABSIM) Model to quantify instream flow needs. In 1992, geomorphic and riparian studies were integrated into the instream flow study, with the overall study completed in 1999 (USFWS 1999). This 13-year process continued through three presidential administrations, several agency managers, and many turnovers of the agency technical staff responsible for conducting the study. This process culminated in 1996-1998 when a group of scientists was convened to integrate all the studies and data to produce the final instream flow study document. This 13-year, non-linear process resulted in many uncertainties that could not be resolved in the short amount of time allowed for completing the instream flow study document. Shortly after completion of the instream flow study document, the Secretary of Interior issued a Record of Decision to implement the recommendations contained in the instream flow study document. The uncertainties encountered as the instream flow study report was prepared were highlighted in the report, and the Record of Decision initiated an Adaptive Environmental Assessment and Management program to address these existing uncertainties and improve future river management. There have been many lessons learned going through this process, and the presentation will summarize: 1) The progression of science used to develop the instream flow study report; 2) How the scientists preparing the report addressed

  19. The Journalism of Uncertainty.

    ERIC Educational Resources Information Center

    Patterson, Joye

    1979-01-01

    Science journalism is in a period of change from its prior position of reporting the pronouncements of scientists to one of challenging the conclusions of scientists and using multiple sources to comment on scientific discovery. It is necessary that educational institutions anticipate the need for competent scientific journalists. (RE)

  20. Physics and Operational Research: measure of uncertainty via Nonlinear Programming

    NASA Astrophysics Data System (ADS)

    Davizon-Castillo, Yasser A.

    2008-03-01

    Physics and Operational Research present an interdisciplinary interaction in problems such as Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming), subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study as a novel way to quantify uncertainty.

  1. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-20

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. Finally, we illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  2. 2014 ASHG Awards and Addresses

    PubMed Central

    2015-01-01

    Each year at the annual meeting of The American Society of Human Genetics (ASHG), addresses are given in honor of The Society and a number of award winners. A summary of each of these addresses is given below. On the following pages, we have printed the presidential address and the addresses for the William Allan Award, the Curt Stern Award, and the Victor A. McKusick Leadership Award. Webcasts of these addresses, as well as those of many other presentations, can be found at http://www.ashg.org.

  3. 2013 ASHG Awards and Addresses

    PubMed Central

    2014-01-01

    Each year at the annual meeting of The American Society of Human Genetics (ASHG), addresses are given in honor of The Society and a number of award winners. A summary of each of these addresses is given below. On the following pages, we have printed the Presidential Address and the addresses for the William Allan Award, the Curt Stern Award, and the Victor A. McKusick Leadership Award. Webcasts of these addresses, as well as those of many other presentations, can be found at http://www.ashg.org.

  4. Uncertainty of Pyrometers in a Casting Facility

    SciTech Connect

    Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.

    2001-12-07

    This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than ±20.5 °C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly ±14.7 °C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than ±12.7 °C or ±18.1 °C when certified on a Y-12 Standards Laboratory strip lamp or in a production area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are ±16.7 °C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 °C bias was measured in a large 1987 data set, which is believed to be caused primarily by use of Pyrex™ windows (not present in the current configuration) and window fogging. Large variability (2σ = 28.6 °C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23 ± 4 °C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 °C) to minimize pyrometer bias, and to calibrate the pyrometer if error exceeds vendor specifications. A procedure has been written to assure conformance.

  5. Classification images with uncertainty

    PubMed Central

    Tjan, Bosco S.; Nandy, Anirvan S.

    2009-01-01

    Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms. PMID:16889477

  6. Visualization of Uncertainty

    NASA Astrophysics Data System (ADS)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization, one that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density field. The problem thus devolves into a minimization. Computation of such a spatial decomposition is O(N*N) and can be performed iteratively, making updates over time both easy and fast. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartograms familiar to the information visualization community in the depiction of quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely, to be replaced by symbols or glyphs
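
    A one-dimensional analogue makes the equal-content idea concrete: place cell boundaries at the quantiles of the uncertainty density, so that each cell holds the same uncertainty mass and cell width signals local concentration. The density below is a toy assumption; the paper's method does this in 2-D/3-D with weighted Voronoi cells.

```python
import numpy as np

# 1D analogue of the equal-content decomposition: choose cell boundaries
# so every cell holds the same amount of uncertainty "mass" U(x).
x = np.linspace(0, 10, 1001)
U = np.exp(-(x - 3)**2) + 0.5 * np.exp(-(x - 7)**2 / 4)   # toy uncertainty density

cdf = np.cumsum(U)
cdf /= cdf[-1]
N = 8                                            # number of cells
edges = np.interp(np.linspace(0, 1, N + 1), cdf, x)
print("cell widths:", np.round(np.diff(edges), 2))
# Small cell width = concentrated uncertainty, exactly as in the 2-D method.
```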

  7. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications. PMID:22042902

  8. Model development and data uncertainty integration

    SciTech Connect

    Swinhoe, Martyn Thomas

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity; both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.
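
    The "effect = initial uncertainty × sensitivity" bookkeeping is easy to mimic with finite differences, as in the hedged sketch below; the detector response function and the uncertainty values are invented placeholders, not the report's MCNP6 results.

```python
import numpy as np

# Toy detector response: a counting rate as a function of nuclear-data inputs
# (spontaneous-fission rate sf, (alpha,n) intensity an, detection efficiency eff).
def doubles_rate(sf, an, eff):
    return eff**2 * (sf + 0.1 * an)            # illustrative form only

nominal = dict(sf=1000.0, an=200.0, eff=0.4)
rel_unc = dict(sf=0.02, an=0.10, eff=0.01)     # assumed 1-sigma relative uncertainties

# Effect of each input = sensitivity x uncertainty, with relative sensitivities
# estimated by finite differences around the nominal point.
y0 = doubles_rate(**nominal)
for name in nominal:
    p = dict(nominal)
    p[name] *= 1.01                            # +1% perturbation
    sens = (doubles_rate(**p) - y0) / (0.01 * y0)
    effect = abs(sens) * rel_unc[name] * 100
    print(f"{name}: relative sensitivity {sens:+.2f}, contribution {effect:.1f}%")
```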

  9. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  10. A review of approaches for communicating uncertainty in radioactive waste disposal programmes

    NASA Astrophysics Data System (ADS)

    McEvoy, Fiona; West, Julie; Bloodworth, Andrew

    2014-05-01

    The technical safety case for a geological repository is based in part on assessments of long-term future behaviour. Technical specialists are required to provide evidence to the greatest extent possible that the predictions are sufficiently reliable for the purpose of making the safety case. This process involves comparison of modelling results with laboratory and field results and with observations on natural and man-made analogue systems. A collection of arguments and evidence is required to help establish the basis for the safety of the repository, as well as to help reduce uncertainty and develop confidence in the analyses themselves. The safety case prepared for a proposed repository must be understood by regulators responsible for scrutinising and judging its acceptability. For the general public, however, it is difficult to make all of the arguments sufficiently transparent and understandable to ensure they share the same level of confidence as the technical specialists. A large body of qualified knowledge resides in the worldwide radioactive waste technical community. This knowledge should provide a firm scientific basis on which the long-term performance and safety of a geological repository can be discussed with confidence, so that informed decisions can be made. Despite this, many countries around the world continue to face difficulties with implementing programmes for the deep geological disposal of radioactive waste. Geology, and effective communication of geological knowledge and uncertainty, are essential parts of the 'tool kit' needed to allow meaningful communication and engagement with the public. These tools can be used to build and maintain public confidence at each step in the process. The search for a geological disposal site is complex, with many stages. At each stage, geological uncertainty will inevitably exist as we will never know everything about the sub-surface unless it is mined out, at which point it is of no use as a repository! What level

  11. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue. PMID:14678210
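
    The QRA arithmetic has the following shape; this is a hedged illustration with made-up numbers (the article's contribution is the mucosal factor range, not these values).

```latex
% Illustrative only; the NOAEL and factor values below are hypothetical.
\[
  \mathrm{AEL}
  = \frac{\mathrm{NOAEL}}
         {\mathrm{SUF}_{\text{interindividual}} \times \mathrm{SUF}_{\text{matrix}}
          \times \mathrm{SUF}_{\text{use}}}
  = \frac{1000~\mu\mathrm{g/cm^2}}{10 \times 3.3 \times 3.3}
  \approx 9.2~\mu\mathrm{g/cm^2}.
\]
% For exposure to non-keratinized mucosal tissue, the article proposes
% widening the default factor by up to 20-fold, lowering the acceptable
% exposure level correspondingly.
```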

  12. Demonstrating the value of medicines: evolution of value equation and stakeholder perception of uncertainties

    PubMed Central

    Narayanan, Siva

    2016-01-01

    It is important to evaluate how the value of medicine is assessed, as it may have important implications for health technology and reimbursement assessments. The value equation could comprise ‘incremental benefit/outcome’ (relative results of care in terms of patient health, comparing the innovation to best available alternative(s)) in the numerator and ‘cost’ (relative costs involved in the full cycle of care (or a defined period) for the patient's medical condition, incorporating the relevant cost-offsets due to displacement of best available alternative(s)) in the denominator. This ‘relative value’ combined with the overall net budget impact (of including the drug in the formulary or reimbursed drug list) at the concerned population level in the given institution/region/country may better inform the usefulness of the new therapeutic option to the healthcare system. As product value messages are created, anticipating external stakeholder questions and information needs, including addressing three main categories of ‘uncertainties’, namely the scientific uncertainties, usage uncertainties, and financial uncertainties, could facilitate demonstration of optimal product value and help informed decision-making to benefit all stakeholders involved in the process. PMID:27489585

  13. Nature of Science, Scientific Inquiry, and Socio-Scientific Issues Arising from Genetics: A Pathway to Developing a Scientifically Literate Citizenry

    ERIC Educational Resources Information Center

    Lederman, Norman G.; Antink, Allison; Bartos, Stephen

    2014-01-01

    The primary focus of this article is to illustrate how teachers can use contemporary socio-scientific issues to teach students about nature of scientific knowledge as well as address the science subject matter embedded in the issues. The article provides an initial discussion about the various aspects of nature of scientific knowledge that are…

  14. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
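
    In symbols, the two families of relations compared here look like this for position Q and momentum P (sketched with ħ = 1 and quadratic means; the paper's general result covers means of order α, with numerically determined constants):

```latex
% Preparation uncertainty: standard deviations in a single state obey
\[
  \Delta Q \,\Delta P \;\ge\; \tfrac{1}{2} .
\]
% Measurement uncertainty: for an approximate joint measurement with
% quadratic error measures \varepsilon(Q), \varepsilon(P), the same bound holds,
\[
  \varepsilon(Q)\,\varepsilon(P) \;\ge\; \tfrac{1}{2},
\]
% reflecting the abstract's statement that the optimal constants coincide
% for the preparation and measurement cases.
```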

  15. Measurement uncertainty relations

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-01

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  16. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being are resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  17. Weighted Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-03-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation.
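
    For context, one of the Maccone-Pati sum relations that this work builds on reads as follows (our transcription, with |ψ⊥⟩ any state orthogonal to the system state |ψ⟩); the weighted family then considers λ-dependent combinations of the variances, with the bound optimized over the weight λ > 0, as the abstract describes.

```latex
% Our transcription of one Maccone-Pati relation (hedged, for context):
\[
  (\Delta A)^2 + (\Delta B)^2 \;\ge\;
  \pm i\,\langle [A,B] \rangle
  + \bigl|\langle \psi^{\perp} |\, (A \pm iB) \,| \psi \rangle\bigr|^2 .
\]
% The weighted relations replace the unweighted sum by combinations such as
% \lambda (\Delta A)^2 + \lambda^{-1} (\Delta B)^2, \lambda > 0, whose lower
% bound is then optimized over \lambda, removing the eigenstate restriction.
```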

  18. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation. PMID:26984295

  19. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  20. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement of many SSA functions, including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments such as air and missile defense make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state, the covariance, and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten
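
    The "banana"-shaped level sets mentioned above are a generic signature of pushing Gaussian uncertainty through a nonlinear coordinate map. A minimal numpy sketch (purely illustrative, not the authors' filter or their PDF family; all numbers are invented) makes the effect visible:

        import numpy as np

        # A Gaussian in polar-like coordinates (range r, along-track angle theta)
        # maps to a curved, "banana"-shaped cloud in Cartesian coordinates: the
        # same qualitative non-Gaussianity that motivates the orbital-state PDFs.
        rng = np.random.default_rng(0)
        n = 5000
        r = rng.normal(7000.0, 5.0, n)       # km: tight radial uncertainty
        theta = rng.normal(0.0, 0.02, n)     # rad: larger along-track uncertainty

        x = r * np.cos(theta)
        y = r * np.sin(theta)

        # Gaussian summary ("covariance realism"):
        mean = [x.mean(), y.mean()]
        cov = np.cov(np.vstack([x, y]))

        # Non-Gaussian signature the Gaussian summary discards: the cloud curves,
        # so x is skewed ("uncertainty realism" keeps such higher cumulants).
        skew_x = np.mean((x - x.mean())**3) / np.std(x)**3
        print(mean, cov, skew_x)

    Growing the along-track spread (as happens over long propagation gaps) makes the skew, and hence the inadequacy of a pure covariance description, much larger.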

  1. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  2. Cloud Feedbacks on Climate: A Challenging Scientific Problem

    ScienceCinema

    Norris, Joe [Scripps Institution of Oceanography, University of California, San Diego, California, USA

    2010-09-01

    One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. Contrastingly, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.

  3. Cloud Feedbacks on Climate: A Challenging Scientific Problem

    SciTech Connect

    Norris, Joe

    2010-05-12

    One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. Contrastingly, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.

  4. Cloud Feedbacks on Climate: A Challenging Scientific Problem

    SciTech Connect

    Norris, Joel

    2010-05-10

    One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. Contrastingly, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.

  5. The National Academy of Sciences offers a new framework for addressing global warming issues.

    PubMed

    Barnard, R C; Morgan, D L

    2000-02-01

    The recent landmark report by the National Academy of Sciences reviewed the science on which the Kyoto Protocol was based. NAS concluded that the policy choices and the mandatory reductions in greenhouse gases by the developed nations were based on incomplete science with significant uncertainties. In view of these uncertainties, the NAS report developed a comprehensive strategic 10-year research program to address the basic issue of whether human activity that results in environmental changes is responsible for climate changes. The report provides a new framework for consideration of global warming issues. The UN Intergovernmental Panel on Climate Change (the UN science advisor), in its 1997 report to the Kyoto parties, pointed out the confusing difference between the scientific usage of the term "climate change," which distinguishes human from natural causes of change, and the official usage, which combines natural and human causes of changes in climate. The conclusion of the UN panel on human causes is equivocal. The 1999 report of the U.S. Global Science Research Committee also reached an equivocal conclusion on human causes and announced a 10-year research program to be developed in consultation with NAS. The precautionary measures provided in the 1992 UN Framework Convention differ from the ill-defined "precautionary principle" based on fear of uncertainty, and are consistent with the objectives of the NAS proposed research program. These developments, together with the third report of the UN Intergovernmental Science Panel on developments in climate science due in 2001, merit consideration by the Conference of the Parties under the Kyoto Protocol. PMID:10715229

  6. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  7. Uncertainty in NIST Force Measurements

    PubMed Central

    Bartel, Tom

    2005-01-01

    This paper focuses upon the uncertainty of force calibration measurements at the National Institute of Standards and Technology (NIST). The uncertainty of the realization of force for the national deadweight force standards at NIST is discussed, as well as the uncertainties associated with NIST’s voltage-ratio measuring instruments and with the characteristics of transducers being calibrated. The combined uncertainty is related to the uncertainty of dissemination for force transfer standards sent to NIST for calibration. PMID:27308181

  8. Uncertainty law in ambient modal identification-Part I: Theory

    NASA Astrophysics Data System (ADS)

    Au, Siu-Kui

    2014-10-01

    Ambient vibration test has gained increasing popularity in practice as it provides an economical means for modal identification without artificial loading. Since the signal-to-noise ratio cannot be directly controlled, the uncertainty associated with the identified modal parameters is a primary concern. From a scientific point of view, it is of interest to know on what factors the uncertainty depends and what the relationship is. For planning or specification purposes, it is desirable to have an assessment of the test configuration required to achieve a specified accuracy in the modal parameters. For example, what is the minimum data duration to achieve a 30% coefficient of variation (c.o.v.) in the damping ratio? To address these questions, this work investigates the leading order behavior of the 'posterior uncertainties' (i.e., given data) of the modal parameters in a Bayesian identification framework. In the context of well-separated modes, small damping and sufficient data, it is shown rigorously that, among other results, the posterior c.o.v. of the natural frequency and damping ratio are asymptotically equal to (ζ/2πN_cB_f)^{1/2} and 1/(2πζN_cB_ζ)^{1/2}, respectively, where ζ is the damping ratio; N_c is the data length as a multiple of the natural period; B_f and B_ζ are data length factors that depend only on the bandwidth utilized for identification, for which explicit expressions have been derived. As the Bayesian approach allows full use of information contained in the data, the results are fundamental characteristics of the ambient modal identification problem. This paper develops the main theory. The companion paper investigates the implication of the results and verification with field test data.
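
    Taking the asymptotic expressions at face value, the planning question posed in the abstract has a closed-form answer:

        \mathrm{c.o.v.}(\zeta) \;\approx\; \Bigl(\frac{1}{2\pi\,\zeta\,N_c\,B_\zeta}\Bigr)^{1/2}
        \quad\Longrightarrow\quad
        N_c \;\approx\; \frac{1}{2\pi\,\zeta\,B_\zeta\,\delta_\zeta^{2}}

    So, for an assumed ζ = 1% and B_ζ = 1 (an illustrative value only; the paper derives the actual bandwidth factors), a target c.o.v. of δ_ζ = 0.3 requires N_c ≈ 1/(2π × 0.01 × 0.09) ≈ 180 natural periods of data.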

  9. Essentiality, toxicity, and uncertainty in the risk assessment of manganese.

    PubMed

    Boyes, William K

    2010-01-01

    Risk assessments of manganese by inhalation or oral routes of exposure typically acknowledge the duality of manganese as an essential element at low doses and a toxic metal at high doses. Previously, however, risk assessors were unable to describe manganese pharmacokinetics quantitatively across dose levels and routes of exposure, to account for mass balance, and to incorporate this information into a quantitative risk assessment. In addition, the prior risk assessment of inhaled manganese conducted by the U.S. Environmental Protection Agency (EPA) identified a number of specific factors that contributed to uncertainty in the risk assessment. In response to a petition regarding the use of a fuel additive containing manganese, methylcyclopentadienyl manganese tricarbonyl (MMT), the U.S. EPA developed a test rule under the U.S. Clean Air Act that required, among other things, the generation of pharmacokinetic information. This information was intended not only to aid in the design of health outcome studies, but also to help address uncertainties in the risk assessment of manganese. To date, the work conducted in response to the test rule has yielded substantial pharmacokinetic data. This information will enable the generation of physiologically based pharmacokinetic (PBPK) models capable of making quantitative predictions of tissue manganese concentrations following inhalation and oral exposure, across dose levels, and accounting for factors such as duration of exposure, different species of manganese, and changes of age, gender, and reproductive status. The work accomplished in response to the test rule, in combination with other scientific evidence, will enable future manganese risk assessments to consider tissue dosimetry more comprehensively than was previously possible. PMID:20077286

  10. Scientific Word Processors Proliferate.

    ERIC Educational Resources Information Center

    Analytical Chemistry, 1985

    1985-01-01

    Briefly describes most of the currently available scientific word processing software packages. Unless noted, these products (including Molecular Presentation Graphics, ProofWriter, Spellbinder Scientific, Volkswriter Scientific, and WordMARC) run on the IBM PC family of microcomputers. (JN)

  11. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  12. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling.

  13. Uncertainty and nonseparability

    NASA Astrophysics Data System (ADS)

    de La Torre, A. C.; Catuogno, P.; Ferrando, S.

    1989-06-01

    A quantum covariance function is introduced whose real and imaginary parts are related to the independent contributions to the uncertainty principle: noncommutativity of the operators and nonseparability. It is shown that factorizability of states is a sufficient but not necessary condition for separability. It is suggested that all quantum effects could be considered to be a consequence of nonseparability alone.
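
    In standard notation, the decomposition reads as follows (a hedged restatement consistent with the abstract, not necessarily the authors' exact conventions). For observables A, B and state |ψ⟩,

        C(A,B) = \langle AB \rangle - \langle A \rangle\langle B \rangle,
        \qquad
        \operatorname{Re} C = \tfrac{1}{2}\langle \{A,B\} \rangle - \langle A \rangle\langle B \rangle,
        \qquad
        \operatorname{Im} C = \tfrac{1}{2i}\langle [A,B] \rangle,

    and the Schrödinger uncertainty relation bounds the variances by both contributions at once:

        \Delta A^2\, \Delta B^2 \;\ge\; |C(A,B)|^2 = (\operatorname{Re} C)^2 + (\operatorname{Im} C)^2 .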

  14. Asymptotic entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Adamczak, Radosław; Latała, Rafał; Puchała, Zbigniew; Życzkowski, Karol

    2016-03-01

    We analyze entropic uncertainty relations for two orthogonal measurements on a N-dimensional Hilbert space, performed in two generic bases. It is assumed that the unitary matrix U relating both bases is distributed according to the Haar measure on the unitary group. We provide lower bounds on the average Shannon entropy of probability distributions related to both measurements. The bounds are stronger than those obtained with use of the entropic uncertainty relation by Maassen and Uffink, and they are optimal up to additive constants. We also analyze the case of a large number of measurements and obtain strong entropic uncertainty relations, which hold with high probability with respect to the random choice of bases. The lower bounds we obtain are optimal up to additive constants and allow us to prove a conjecture by Wehner and Winter on the asymptotic behavior of constants in entropic uncertainty relations as the dimension tends to infinity. As a tool we develop estimates on the maximum operator norm of a submatrix of a fixed size of a random unitary matrix distributed according to the Haar measure, which are of independent interest.
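
    For context, the Maassen-Uffink bound that this paper strengthens is, in standard form,

        H(p) + H(q) \;\ge\; -2 \log c, \qquad c = \max_{j,k} |U_{jk}|,

    where p and q are the outcome distributions of the two measurements and U is the unitary relating the bases; the paper's average-case bounds improve on this for Haar-random U.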

  15. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  16. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy. PMID:10174798

  17. Innovative Legal Approaches to Address Obesity

    PubMed Central

    Pomeranz, Jennifer L; Teret, Stephen P; Sugarman, Stephen D; Rutkow, Lainie; Brownell, Kelly D

    2009-01-01

    Context: The law is a powerful public health tool with considerable potential to address the obesity issue. Scientific advances, gaps in the current regulatory environment, and new ways of conceptualizing rights and responsibilities offer a foundation for legal innovation. Methods: This article connects developments in public health and nutrition with legal advances to define promising avenues for preventing obesity through the application of the law. Findings: Two sets of approaches are defined: (1) direct application of the law to factors known to contribute to obesity and (2) original and innovative legal solutions that address the weak regulatory stance of government and the ineffectiveness of existing policies used to control obesity. Specific legal strategies are discussed for limiting children's food marketing, confronting the potential addictive properties of food, compelling industry speech, increasing government speech, regulating conduct, using tort litigation, applying nuisance law as a litigation strategy, and considering performance-based regulation as an alternative to typical regulatory actions. Finally, preemption is an overriding issue and can play both a facilitative and a hindering role in obesity policy. Conclusions: Legal solutions are immediately available to the government to address obesity and should be considered at the federal, state, and local levels. New and innovative legal solutions represent opportunities to take the law in creative directions and to link legal, nutrition, and public health communities in constructive ways. PMID:19298420

  18. Professionalism, scientific freedom and dissent: individual and institutional roles and responsibilities in geoethics

    NASA Astrophysics Data System (ADS)

    Bilham, Nic

    2015-04-01

    Debate and dissent are at the heart of scientific endeavour. A diversity of perspectives, alternative interpretations of evidence and the robust defence of competing theories and models drive the advancement of scientific knowledge. Just as importantly, legitimate dissent and diversity of views should not be covered up when offering scientific advice to policy-makers and providing evidence to inform public debate - indeed, they should be valued. We should offer what Andy Stirling has termed 'plural and conditional' scientific advice, not just for the sake of democratic legitimacy, but because it supports better informed and more effective policy-making. 'Monocultures' of scientific advice may have a superficial appeal to policy-makers, but they devalue the contribution of scientists, undermine the resilience of regulatory structures, are often misleading, and can lead to catastrophic policy failure. Furthermore, many of the great societal challenges now facing us require interdisciplinary approaches, across the natural sciences and more widely still, which bring to the fore the need for humility, recognition that we do not have all the answers, and mutual respect for the views of others. In contentious areas such as climate change, extraction of shale gas and radioactive waste disposal, however, such open dialogue may make researchers and practitioners vulnerable to advocates and campaigners who cherry-pick the evidence, misinterpret it, or seek to present scientific uncertainty and debate as mere ignorance. Nor are scientists themselves always above such unethical tactics. The apparent authority conferred on unscrupulous 'campaigning scientists' by their academic and professional credentials may make it all but impossible to distinguish them from those who legitimately make the case for a minority scientific view (and may be marginalised by the mainstream of their discipline in doing so). There is a risk that real scientific debate may be thwarted. Individual

  19. Integrating Scientific Inquiry into an Undergraduate Applied Remote Sensing Course

    NASA Astrophysics Data System (ADS)

    Sivanpillai, R.

    2015-12-01

    Inquiry-based learning (IBL) methods require students to engage in learning activities instead of focusing on learning concepts and facts. Working with the instructor, students have to formulate their research questions, collect and analyze data, and arrive at conclusions. In other words, the focus is shifted from preparing for exams to learning to apply the concepts introduced in the classroom. This experience could result in better understanding of the scientific concepts but instructors have to devote more time for designing and implementing IBL methods in their classroom. At the University of Wyoming, an applied remote sensing course has been taught since 2008. Students enrolled in this course are required to complete a project that is designed around IBL methods. Students do not receive detailed instructions for completing their project, but are trained to develop their own research questions, design an experiment, review literature, and collect, analyze and interpret their data. Additionally they learn about uncertainties and strategies for addressing them at various stages of their project. This presentation will describe the work involved in designing, implementing and mentoring students to successfully complete the course requirements and learn scientific research methods. Lessons learned from this course could provide insights to other instructors interested in implementing IBL or other active learning methods in their classroom.

  20. Uncertainty As a Trigger for a Paradigm Change in Science Communication

    NASA Astrophysics Data System (ADS)

    Schneider, S.

    2014-12-01

    Over the last decade, the need to communicate uncertainty has increased. Climate science and environmental science have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit the deep societal mistrust of uncertainty to allege unethical and intentional delusion of decision makers and the public by scientists in their consultative function. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically modified seeds have to face massive campaigns against their research, and sometimes against their persons and lives as well. Hence, new strategies to communicate uncertainty have to confront the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind is well suited to practical decision making through its sensory structures; therefore, many irrational conceptions of uncertainty are mitigated if data are presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominated by preconceptions about terms such as uncertainty, vagueness, or probability. In parallel with the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty into focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While the first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Science communicators are therefore in need of new and innovative ways to talk about scientific uncertainty.

  1. The Crossroads between Biology and Mathematics: The Scientific Method as the Basics of Scientific Literacy

    ERIC Educational Resources Information Center

    Karsai, Istvan; Kampis, George

    2010-01-01

    Biology is changing and becoming more quantitative. Research is creating new challenges that need to be addressed in education as well. New educational initiatives focus on combining laboratory procedures with mathematical skills, yet it seems that most curricula center on a single relationship between scientific knowledge and scientific method:…

  2. PREDON Scientific Data Preservation 2014

    NASA Astrophysics Data System (ADS)

    Diaconu, C.; Kraml, S.; Surace, C.; Chateigner, D.; Libourel, T.; Laurent, A.; Lin, Y.; Schaming, M.; Benbernou, S.; Lebbah, M.; Boucon, D.; Cérin, C.; Azzag, H.; Mouron, P.; Nief, J.-Y.; Coutin, S.; Beckmann, V.

    Scientific data collected with modern sensors or dedicated detectors very often exceed the perimeter of the initial scientific design. These data are obtained more and more frequently with large material and human effort. A large class of scientific experiments is in fact unique because of its large scale, with very small chances of being repeated or superseded by new experiments in the same domain: for instance, high energy physics and astrophysics experiments involve multi-annual developments, and a simple duplication of effort in order to reproduce old data is simply not affordable. Other scientific experiments are unique by nature: earth science, medical sciences, etc., since the collected data are "time-stamped" and thereby non-reproducible by new experiments or observations. In addition, scientific data collection has increased dramatically in recent years, contributing to the so-called "data deluge" and inviting common reflection in the context of "big data" investigations. The new knowledge obtained using these data should be preserved for the long term, such that access and re-use remain possible and enhance the initial investment. Data observatories, based on open access policies and coupled with multi-disciplinary techniques for indexing and mining, may lead to truly new paradigms in science. It is therefore of utmost importance to pursue a coherent and vigorous approach to preserving scientific data for the long term. Preservation nevertheless remains a challenge due to the complexity of the data structure, the fragility of custom-made software environments, and the lack of rigorous approaches in workflows and algorithms. To address this challenge, the PREDON project was initiated in France in 2012 within the MASTODONS program: a Big Data scientific challenge, initiated and supported by the Interdisciplinary Mission of the National Centre for Scientific Research (CNRS). PREDON is a study group formed by

  3. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
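
    As a concrete instance of the "classic numerical method as probabilistic inference" viewpoint, here is a minimal Bayesian quadrature sketch (our own illustration under standard Gaussian-process assumptions, not code from the paper; the kernel, length-scale, and integrand are invented): integration returns a Gaussian posterior, i.e., both an estimate and an uncertainty.

        import numpy as np
        from scipy.special import erf

        def rbf(x1, x2, ell):
            # RBF kernel matrix k(x, x') = exp(-(x - x')^2 / (2 ell^2))
            return np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

        def kernel_mean(x, a, b, ell):
            # z_i = int_a^b k(x, x_i) dx, closed form for the RBF kernel
            return ell * np.sqrt(np.pi / 2) * (
                erf((b - x) / (np.sqrt(2) * ell)) - erf((a - x) / (np.sqrt(2) * ell)))

        a, b, ell, jitter = 0.0, 1.0, 0.25, 1e-9
        f = lambda x: np.exp(-x) * np.sin(3 * x)   # toy integrand
        xs = np.linspace(a, b, 8)                  # 8 function evaluations
        ys = f(xs)

        K = rbf(xs, xs, ell) + jitter * np.eye(len(xs))
        z = kernel_mean(xs, a, b, ell)
        w = np.linalg.solve(K, z)                  # quadrature weights

        post_mean = w @ ys
        # Prior double integral of the kernel, via fine 1-D quadrature:
        grid = np.linspace(a, b, 2001)
        kk = np.trapz(kernel_mean(grid, a, b, ell), grid)
        post_var = kk - z @ np.linalg.solve(K, z)

        print(f"integral = {post_mean:.6f} +/- {np.sqrt(max(post_var, 0)):.2e}")

    The weights w play the role of a classical quadrature rule; the posterior variance is the "numerical uncertainty" the paper argues should be reported alongside the estimate.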

  4. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

    sand and clay), (b) Dose Parameters (34 parameters), (c) Material Properties (20 parameters), (d) Surface Water Flows (6 parameters), and (e) Vadose and Aquifer Flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that K_d and the vadose/aquifer flow parameters, both of which impact transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis. This is described in Table 1-1 which provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis and the simulation time was reduced from 10,000 to 2,000 years. A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, with more realizations (1,000) and finer time steps. This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new windows cluster. The last part of the work looked at the contribution to overall uncertainty from the main

  5. Temporal uncertainty of geographical information

    NASA Astrophysics Data System (ADS)

    Shu, Hong; Qi, Cuihong

    2005-10-01

    Temporal uncertainty is a crossing point of temporal and error-aware geographical information systems. In Geoinformatics, temporal uncertainty is as important as the spatial and thematic uncertainty of geographical information. However, only very recently did the standards organizations ISO/TC 211 and FGDC claim temporal uncertainty as one of the geospatial data quality elements. Over the past decades, the temporal uncertainty of geographical information has been modeled insufficiently. To lay a foundation for logically or physically modeling temporal uncertainty, this paper aims to clarify the semantics of temporal uncertainty to some extent. General uncertainty is conceptualized with a taxonomy of uncertainty. Semantically, temporal uncertainty is progressively classified into uncertainty of time coordinates, of changes, and of dynamics. Uncertainty of multidimensional time (valid time, database time, conceptual time, etc.) is emphasized. It is observed that time scale (granularity) transitions may lead to temporal uncertainty because transition details go missing. It is dialectically concluded that temporal uncertainty is caused by the complexity of the human-machine-earth system.

  6. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

    In previous work, the determination of uncertainty models via minimum norm model validation was based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this leads to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result will lead to less conservative and more realistic uncertainty models for use in robust control.

  7. Uncertainty in mapping urban air quality using crowdsourcing techniques

    NASA Astrophysics Data System (ADS)

    Schneider, Philipp; Castell, Nuria; Lahoz, William; Bartonova, Alena

    2016-04-01

    Small and low-cost sensors measuring various air pollutants have become available in recent years owing to advances in sensor technology. Such sensors have significant potential for improving high-resolution mapping of air quality in the urban environment as they can be deployed in comparatively large numbers and therefore are able to provide information at unprecedented spatial detail. However, such sensor devices are subject to significant and currently little understood uncertainties that affect their usability. Not only do these devices exhibit random errors and biases of occasionally substantial magnitudes, but these errors may also shift over time. In addition, there often tends to be significant inter-sensor variability even when supposedly identical sensors from the same manufacturer are used. We need to quantify accurately these uncertainties to make proper use of the information they provide. Furthermore, when making use of the data and producing derived products such as maps, the measurement uncertainties that propagate throughout the analysis need to be clearly communicated to the scientific and non-scientific users of the map products. Based on recent experiences within the EU-funded projects CITI-SENSE and hackAIR we discuss the uncertainties along the entire processing chain when using crowdsourcing techniques for mapping urban air quality. Starting with the uncertainties exhibited by the sensors themselves, we present ways of quantifying the error characteristics of a network of low-cost microsensors and show suitable statistical metrics for summarizing them. Subsequently, we briefly present a data-fusion-based method for mapping air quality in the urban environment and illustrate how we propagate the uncertainties of the individual sensors throughout the mapping system, resulting in detailed maps that document the pixel-level uncertainty for each concentration field. Finally, we present methods for communicating the resulting spatial uncertainty
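
    The first step described above, quantifying per-sensor error characteristics against a co-located reference instrument, can be summarized with a few standard statistics. A hedged sketch follows (bias, RMSE, and Pearson r are common choices, not necessarily the exact metrics used in CITI-SENSE or hackAIR; the sensor data are simulated):

        import numpy as np

        def sensor_metrics(sensor, reference):
            # Summarize one low-cost sensor against a reference time series.
            sensor = np.asarray(sensor, float)
            reference = np.asarray(reference, float)
            resid = sensor - reference
            return {
                "bias": resid.mean(),                       # systematic offset
                "rmse": np.sqrt((resid**2).mean()),         # total error magnitude
                "r": np.corrcoef(sensor, reference)[0, 1],  # linear agreement
            }

        # Example: three hypothetical NO2 sensors with different drift/noise
        rng = np.random.default_rng(1)
        ref = 40 + 10 * rng.standard_normal(500)            # ug/m3 reference
        for drift, noise in [(0.0, 2.0), (5.0, 4.0), (-3.0, 8.0)]:
            obs = ref + drift + noise * rng.standard_normal(500)
            print(sensor_metrics(obs, ref))

    Per-sensor summaries like these are what would then be propagated through the data-fusion step so that each pixel of the resulting map carries its own uncertainty estimate.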

  8. Addressing health literacy in patient decision aids

    PubMed Central

    2013-01-01

    Background: Effective use of a patient decision aid (PtDA) can be affected by the user’s health literacy and the PtDA’s characteristics. Systematic reviews of the relevant literature can guide PtDA developers to attend to the health literacy needs of patients. The reviews reported here aimed to assess: 1. a) the effects of health literacy / numeracy on selected decision-making outcomes, and b) the effects of interventions designed to mitigate the influence of lower health literacy on decision-making outcomes, and 2. the extent to which existing PtDAs a) account for health literacy, and b) are tested in lower health literacy populations. Methods: We reviewed the literature for evidence relevant to these two aims. When high-quality systematic reviews existed, we summarized their evidence. When reviews were unavailable, we conducted our own systematic reviews. Results: Aim 1: In an existing systematic review of PtDA trials, lower health literacy was associated with lower patient health knowledge (14 of 16 eligible studies). Fourteen studies reported practical design strategies to improve knowledge for lower health literacy patients. In our own systematic review, no studies reported on values clarity per se, but in 2, lower health literacy was related to higher decisional uncertainty and regret. Lower health literacy was associated with less desire for involvement in 3 studies, less question-asking in 2, and less patient-centered communication in 4 studies; its effects on other measures of patient involvement were mixed. Only one study assessed the effects of a health literacy intervention on outcomes; it showed that using video to improve the salience of health states reduced decisional uncertainty. Aim 2: In our review of 97 trials, only 3 PtDAs overtly addressed the needs of lower health literacy users. In 90% of trials, user health literacy and readability of the PtDA were not reported. However, increases in knowledge and informed choice were reported in those studies

  9. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  10. Hendra in the news: public policy meets public morality in times of zoonotic uncertainty.

    PubMed

    Degeling, Chris; Kerridge, Ian

    2013-04-01

    Public discourses have influence on policymaking for emerging health issues. Media representations of unfolding events, scientific uncertainty, and real and perceived risks shape public acceptance of health policy and therefore policy outcomes. To characterize and track views in popular circulation on the causes, consequences and appropriate policy responses to the emergence of Hendra virus as a zoonotic risk, this study examines coverage of this issue in Australian mass media for the period 2007-2011. Results demonstrate the predominant explanation for the emergence of Hendra became the encroachment of flying fox populations on human settlement. Depictions of scientific uncertainty as to whom and what was at risk from Hendra virus promoted the view that flying foxes were a direct risk to human health. Descriptions of the best strategy to address Hendra have become polarized between recognized health authorities advocating individualized behaviour changes to limit risk exposure; versus populist calls for flying fox control and eradication. Less than a quarter of news reports describe the ecological determinants of emerging infectious disease or upstream policy solutions. Because flying foxes rather than horses were increasingly represented as the proximal source of human infection, existing policies of flying fox protection became equated with government inaction; the plight of those affected by flying foxes representative of a moral failure. These findings illustrate the potential for health communications for emerging infectious disease risks to become entangled in other political agendas, with implications for the public's likelihood of supporting public policy and risk management strategies that require behavioural change or seek to address the ecological drivers of incidence. PMID:23294874

  11. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  12. Plate tectonics: Scientific revolution or scientific program?

    NASA Astrophysics Data System (ADS)

    Mareschal, Jean-Claude

    In The Structure of Scientific Revolutions, Thomas S. Kuhn suggested that science progresses discontinuously: As a scientific theory becomes obsolete, a period of crisis results, at the end of which the old theory is overthrown and replaced by a new, sounder, more complete theory [Kuhn, 1962]. After the scientific community has accepted the new paradigm, it undertakes only routine research until a new crisis occurs, usually as a result of an anomalous experiment that accidentally happens to be critical.

  13. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty-quantified trait prediction, outperforming the state of the art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
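
    To make the base technique concrete, here is a minimal gap-filling sketch using plain (non-hierarchical, non-Bayesian) matrix factorization with alternating ridge-regularized least squares; BHPMF builds on this idea by shrinking factors toward the taxonomic hierarchy and reporting MCMC-based uncertainty, neither of which is shown here. All names and data are illustrative.

        import numpy as np

        def pmf_fill(X, rank=3, lam=0.1, iters=50, seed=0):
            # X: species-by-trait matrix with NaN for unmeasured entries.
            mask = ~np.isnan(X)
            rng = np.random.default_rng(seed)
            n, m = X.shape
            U = 0.1 * rng.standard_normal((n, rank))   # species factors
            V = 0.1 * rng.standard_normal((m, rank))   # trait factors
            I = lam * np.eye(rank)
            for _ in range(iters):
                for i in range(n):                     # update species factors
                    o = mask[i]
                    if o.any():
                        Vo = V[o]
                        U[i] = np.linalg.solve(Vo.T @ Vo + I, Vo.T @ X[i, o])
                for j in range(m):                     # update trait factors
                    o = mask[:, j]
                    if o.any():
                        Uo = U[o]
                        V[j] = np.linalg.solve(Uo.T @ Uo + I, Uo.T @ X[o, j])
            return U @ V.T                             # dense predictions

        # Toy usage: 3 species x 3 traits, NaN marks gaps to be imputed
        X = np.array([[1.2, np.nan, 3.1],
                      [1.0, 2.0, np.nan],
                      [np.nan, 2.2, 3.0]])
        print(pmf_fill(X, rank=2))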

  14. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
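
    For contrast with CUU, the conventional formulation criticized above can be written in a few lines: fit parameters by minimizing the squared model-data mismatch and report a single point estimate. This is a hedged sketch with an invented decay model, not an example from the report; CUU would instead return a distribution over parameters that also accounts for model error.

        import numpy as np
        from scipy.optimize import least_squares

        def model(theta, t):
            # Hypothetical two-parameter decay model (illustrative only)
            k, c = theta
            return c * np.exp(-k * t)

        t = np.linspace(0, 5, 20)
        rng = np.random.default_rng(2)
        data = model([0.8, 2.0], t) + 0.05 * rng.standard_normal(t.size)

        # Deterministic calibration: the model is treated as exact ("true"),
        # so all mismatch is attributed to the data.
        fit = least_squares(lambda th: model(th, t) - data, x0=[0.5, 1.0])
        print("point estimate:", fit.x)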

  15. Tutorial examples for uncertainty quantification methods.

    SciTech Connect

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
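
    In the same spirit as the heat-transfer-through-a-window example mentioned above (all numbers and details invented, not taken from the actual tutorial), a basic Monte Carlo uncertainty propagation looks like this:

        import numpy as np

        # Steady 1-D conduction through a pane: Q = k * A * dT / L, with the
        # conductivity k and thickness L treated as uncertain inputs.
        rng = np.random.default_rng(3)
        n = 100_000
        k = rng.normal(0.96, 0.05, n)       # W/(m K), glass conductivity
        L = rng.normal(0.006, 0.0005, n)    # m, pane thickness
        A, dT = 1.5, 20.0                   # m^2 and K, held fixed

        Q = k * A * dT / L                  # W, heat flow samples
        print(f"Q = {Q.mean():.0f} W +/- {Q.std():.0f} W (1-sigma)")

    Sampling the inputs and pushing them through the model is the simplest UQ method to explain, which is what makes examples like this useful as an introduction.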

  16. "So a Frackademic and an Environmentalist Walk into an Error Bar...": Communicating Uncertainty Amidst Controversy

    NASA Astrophysics Data System (ADS)

    Kroepsch, A.

    2013-12-01

    above. In striving to separate 'signal' from 'noise' in the public discourse, we have experimented with literary devices (metaphor and narrative), pedagogical tools (the 'what we know, what we don't know, and what we hope to learn' format), journalistic practices (the humanizing profile), and, perhaps most importantly, disarming delivery techniques (humor). In describing these methods, and their effectiveness at addressing scientific uncertainty, the author will be sure to acknowledge the uncertainties inherent therein.

  17. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: ``What are the general classes of uncertainty and error sources in complex, computational simulations?`` This is the first step of a two step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases is discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and. therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  18. Multi-scenario modelling of uncertainty in stochastic chemical systems

    SciTech Connect

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-09-15

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small-scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems, as they are stochastic in nature and carry a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state composed of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two-gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems depends on both the uncertain distribution and the system under investigation. -- Highlights: •A method to model uncertainty in stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than kinetic Monte Carlo.
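
    A minimal illustration of the underlying idea, assuming a Gillespie simulation of the isomerization A <-> B with a sampled uncertain forward rate (the paper itself works with the Chemical Master Equation rather than Monte Carlo trajectories, so this is a sketch of the concept, not the authors' method):

        import numpy as np

        def ssa(k1, k2, nA, nB, t_end, rng):
            # Gillespie SSA for A <-> B; returns the A count at t_end.
            t = 0.0
            while True:
                a1, a2 = k1 * nA, k2 * nB       # reaction propensities
                a0 = a1 + a2
                if a0 == 0:
                    return nA
                t += rng.exponential(1 / a0)    # time to next reaction
                if t > t_end:
                    return nA
                if rng.random() < a1 / a0:
                    nA, nB = nA - 1, nB + 1     # A -> B fires
                else:
                    nA, nB = nA + 1, nB - 1     # B -> A fires

        rng = np.random.default_rng(4)
        # Uncertain forward rate k1; averaging SSA runs per sample gives a
        # composite picture of parameter-induced vs intrinsic variability.
        k1_samples = rng.normal(1.0, 0.2, 50).clip(min=0.01)
        means = [np.mean([ssa(k1, 0.5, 100, 0, 2.0, rng) for _ in range(200)])
                 for k1 in k1_samples]
        print(f"E[nA(t=2)] = {np.mean(means):.1f}, "
              f"parameter-induced sd = {np.std(means):.1f}")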

  19. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management. PMID:27450905

  20. Managing Uncertainty in Data and Models: UncertWeb

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Cornford, D.; Pebesma, E. J.

    2010-12-01

    There is an increasing recognition that issues of quality, error and uncertainty are central concepts to both scientific progress and practical decision making. Recent moves towards evidence driven policy and complex, uncertain scientific investigations into climate change and its likely impacts have heightened the awareness that uncertainty is critical in linking our observations and models to reality. The most natural, principled framework is provided by Bayesian approaches, which recognise a variety of sources of uncertainty such as aleatory (variability), epistemic (lack of knowledge) and possibly ontological (lack of agreed definitions). Most current information models used in the geosciences do not fully support the communication of uncertain results, although some do provide limited support for quality information in metadata. With the UncertWeb project (http://www.uncertweb.org), involving statisticians, geospatial and application scientists and informaticians, we are developing a framework for representing and communicating uncertainty in observational data and models which builds on existing standards such as the Observations and Measurements conceptual model, and related Open Geospatial Consortium and ISO standards to allow the communication and propagation of uncertainty in chains of model services. A key component is the description of uncertainties in observational data, based on a revised version of UncertML, a conceptual model and encoding for representing uncertain quantities. In this talk we will describe how we envisage using UncertML with existing standards to describe the uncertainty in observational data and how this uncertainty information can then be propagated through subsequent analysis. We will highlight some of the tools which we are developing within UncertWeb to support the management of uncertainty in web based geoscientific applications.
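
    UncertML's actual schema is not reproduced here; the following is a hypothetical, minimal stand-in showing the core idea the abstract describes: attach a distribution to a quantity rather than a single value, and propagate it through a chain of model services by sampling. The class name, the two toy models, and all parameter values are assumptions for illustration:

```python
from dataclasses import dataclass

import numpy as np

rng = np.random.default_rng(1)

@dataclass
class UncertainQuantity:
    """Hypothetical stand-in for an UncertML-style uncertain quantity."""
    mean: float
    variance: float

    def sample(self, n: int) -> np.ndarray:
        return rng.normal(self.mean, np.sqrt(self.variance), size=n)

def model_a(x):  # e.g. an observation-processing service
    return 1.8 * x + 32.0

def model_b(y):  # e.g. a downstream impact model
    return np.exp(y / 50.0)

obs = UncertainQuantity(mean=20.0, variance=4.0)
out = model_b(model_a(obs.sample(10_000)))   # propagate by Monte Carlo
print(out.mean(), out.var())                 # summarize propagated uncertainty
```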

  1. Nature of Science, Scientific Inquiry, and Socio-Scientific Issues Arising from Genetics: A Pathway to Developing a Scientifically Literate Citizenry

    NASA Astrophysics Data System (ADS)

    Lederman, Norman G.; Antink, Allison; Bartos, Stephen

    2012-06-01

    The primary focus of this article is to illustrate how teachers can use contemporary socio-scientific issues to teach students about nature of scientific knowledge as well as address the science subject matter embedded in the issues. The article provides an initial discussion about the various aspects of nature of scientific knowledge that are addressed. It is important to remember that the aspects of nature of scientific knowledge are not considered to be a comprehensive list, but rather a set of important ideas for adolescent students to learn about scientific knowledge. These ideas have been advocated as important for secondary students by numerous reform documents internationally. Then, several examples are used to illustrate how genetically based socio-scientific issues can be used by teachers to improve students' understandings of the discussed aspects of nature of scientific knowledge.

  2. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for phase uncertainty, whose magnitude is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
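
    The paper's exact expression is not reproduced here; as a sketch of the general idea of dispersing data within asymmetric bounds in a Monte Carlo analysis, one simple option is a split-normal (two-piece Gaussian) draw. The function name, the sigma_minus/sigma_plus parameters, and the coefficient values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_asymmetric(nominal, sigma_minus, sigma_plus, n):
    """Disperse a value within asymmetric bounds using a split normal:
    draw a standard normal and scale negative deviations by sigma_minus,
    positive deviations by sigma_plus."""
    z = rng.standard_normal(n)
    return nominal + np.where(z < 0.0, sigma_minus, sigma_plus) * z

# Hypothetical pitching-moment coefficient near a shock: the uncertainty is
# larger on the high side because the shock location is itself uncertain.
cm = sample_asymmetric(nominal=-0.012, sigma_minus=0.002, sigma_plus=0.006,
                       n=100_000)
print(np.percentile(cm, [2.5, 50, 97.5]))
```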

  3. New Programming Environments for Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E. P.; Banta, E. R.; Christensen, S.; Cooley, R. L.; Ely, D. M.; Babendreier, J.; Leavesley, G.; Tonkin, M.; Julich, R.

    2005-12-01

    We live in a world of faster computers, better GUIs and visualization technology, increasing international cooperation made possible by new digital infrastructure, new agreements between US federal agencies (such as ISCMEM), new European Union programs (such as Harmoniqua), and greater collaboration between US university scientists through CUAHSI. These changes provide new resources for tackling the difficult job of quantifying how well our models perform. This talk introduces new programming environments that take advantage of these new developments and will change the paradigm of how we develop methods for uncertainty evaluation. For example, the programming environments provided by the COSU API, the JUPITER API, and the Sensitivity/Optimization Toolbox provide enormous opportunities for faster and more meaningful evaluation of uncertainties. Instead of waiting years for ideas and theories to be compared in the complex circumstances of interest to resource managers, these new programming environments will expedite the process. In the new paradigm, unproductive ideas and theories will be revealed more quickly, and productive ideas and theories will more quickly be used to address our increasingly difficult water resources problems. As examples, two ideas in JUPITER API applications are presented: uncertainty correction factors that account for system complexities not represented in models, and PPR and OPR statistics used to identify new data needed to reduce prediction uncertainty.

  4. Using Models that Incorporate Uncertainty

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.

    2002-01-01

    In this article, the author discusses the use in policy analysis of models that incorporate uncertainty. He believes that all models should consider incorporating uncertainty, but that at the same time it is important to understand that sampling variability is not usually the dominant driver of uncertainty in policy analyses. He also argues that…

  5. Sensitivity of Flow Uncertainty to Radar Rainfall Uncertainty in the Context of Operational Distributed Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Carpenter, T. M.; Georgakakos, K. P.; Georgakakos, K. P.

    2001-12-01

    The current study focuses on the sensitivity of distributed model flow forecast uncertainty to the uncertainty in the radar rainfall input. Various studies estimate a 30 to 100% uncertainty in radar rainfall estimates from the operational NEXRAD radars. This study addresses the following questions: How does this uncertainty in rainfall input impact the flow simulations produced by a hydrologic model? How does this effect compare to the uncertainty in flow forecasts resulting from initial condition and model parametric uncertainty? HRCDHM, the model used in this study, is a catchment-based, distributed hydrologic model that accepts hourly precipitation input from the operational WSR-88D weather radar. A GIS is used to process digital terrain data, delineate sub-catchments of a given large watershed, and supply sub-catchment characteristics (subbasin area, stream length, stream slope and channel-network topology) to the hydrologic model components. HRCDHM uses an adaptation of the U.S. NWS operational Sacramento soil moisture accounting model to produce runoff for each sub-catchment within the larger study watershed. Kinematic or Muskingum-Cunge channel routing is implemented to combine and route sub-catchment flows through the channel network. Available spatial soils information is used to vary hydrologic model parameters from sub-catchment to sub-catchment. HRCDHM was applied to the 2,500 km2 Illinois River watershed in Arkansas and Oklahoma with outlet at Tahlequah, Oklahoma. The watershed is under the coverage of the operational WSR-88D radar at Tulsa, Oklahoma. For distributed modeling, the watershed area has been subdivided into sub-catchments with an average area of 80 km2. Flow simulations are validated at various gauged locations within the watershed. A Monte Carlo framework was used to assess the sensitivity of the simulated flows to uncertainty in radar input for different radar error distributions (uniform or exponential), and to make comparisons to the flow
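
    HRCDHM itself is not public code; the sketch below uses a toy linear-reservoir runoff model to illustrate the Monte Carlo design the abstract describes: perturb the radar rainfall with a multiplicative error and examine the induced spread in simulated flows. The model, the uniform error range, and the rainfall series are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def linear_reservoir(rain, k=0.1, s0=0.0):
    """Toy runoff model: storage S gains rainfall, releases q = k*S per step."""
    s, q = s0, np.empty_like(rain)
    for i, r in enumerate(rain):
        s += r
        q[i] = k * s
        s -= q[i]
    return q

rain = np.array([0, 5, 12, 8, 3, 0, 0, 1, 0, 0], dtype=float)  # mm/h

# 30-100% radar rainfall uncertainty: here, multiplicative uniform errors.
ensemble = np.array([linear_reservoir(rain * rng.uniform(0.7, 1.3, rain.size))
                     for _ in range(1000)])
print(ensemble.mean(axis=0))   # ensemble-mean hydrograph
print(ensemble.std(axis=0))    # flow spread induced by the rainfall error
```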

  6. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    SciTech Connect

    Díez, C.J.; Cabellos, O.; Martínez, J.S.

    2015-01-15

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures has not been carried out, and must be performed in order to analyse the limitations of using one-group uncertainties.
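
    A minimal sketch of the one-group idea: deplete a single nuclide under constant flux, where dN/dt = -sigma*phi*N gives N(t) = N0*exp(-sigma*phi*t), and propagate a relative uncertainty on the collapsed one-group cross section by Monte Carlo. All numerical values here are hypothetical placeholders, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# One-group depletion of a single nuclide: dN/dt = -sigma * phi * N.
N0 = 1.0e24          # initial atom density (arbitrary units)
phi = 1.0e14         # one-group flux, n/cm^2/s (hypothetical)
sigma0 = 50.0e-24    # nominal one-group cross section, cm^2 (hypothetical)
rel_unc = 0.05       # 5% relative one-group uncertainty (hypothetical)
t = 3.0e7            # roughly one year of irradiation, s

sigmas = rng.normal(sigma0, rel_unc * sigma0, size=10_000)
N_end = N0 * np.exp(-sigmas * phi * t)
# Propagated relative uncertainty on the end-of-irradiation inventory:
print(N_end.mean(), N_end.std() / N_end.mean())
```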

  7. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures has not been carried out, and must be performed in order to analyse the limitations of using one-group uncertainties.

  8. Addressing problems of employee performance.

    PubMed

    McConnell, Charles R

    2011-01-01

    Employee performance problems are essentially of 2 kinds: those that are motivational in origin and those resulting from skill deficiencies. Both kinds of problems are the province of the department manager. Performance problems differ from problems of conduct in that traditional disciplinary processes ordinarily do not apply. Rather, performance problems are addressed through educational and remedial processes. The manager has a basic responsibility in ensuring that everything reasonable is done to help each employee succeed. There are a number of steps the manager can take to address employee performance problems. PMID:21537142

  9. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

    Satellite altitude determination uncertainties will be discussed from the standpoint of the GEOS-C satellite and from the longer-range viewpoint afforded by the Geopause concept. The discussion focuses on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras; the other relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  10. Communicating uncertainties in assessments of future sea level rise

    NASA Astrophysics Data System (ADS)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels more than 60 meters if completely melted. There has been much confusion among policymakers about how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has largely focused on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between on the one hand the value-free ideal of science and on the other hand usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing

  11. Uncertainty and complexity in personal health records.

    PubMed

    Hudson, Donna L; Cohen, Maurice E

    2010-01-01

    New technologies in medicine have led to an explosion in the number of parameters that must be considered when diagnosing and treating a patient. Because of this high volume of data it is not possible for the human decision maker to take all information into account in arriving at a decision. Automated methods are needed to effectively evaluate electronic information in many formats and provide summaries to the medical professional. The task is complicated by the complexity of the data and the potential uncertainty of some of the results. In this article complexity and uncertainty in medical data are discussed in terms of both representation and types of analysis. Methods that can address multiple complex data types are illustrated and examples are provided for specific medical problems. These methods are particularly important for automated trend analysis in the personal health record as small errors can be propagated through the complex system resulting in incorrect diagnosis and treatment. PMID:21095837

  12. Theoretical uncertainties in proton lifetime estimates

    NASA Astrophysics Data System (ADS)

    Kolešová, Helena; Malinský, Michal; Mede, Timon

    2016-06-01

    We recapitulate the primary sources of theoretical uncertainties in proton lifetime estimates in renormalizable, four-dimensional & non-supersymmetric grand unifications that represent the most conservative framework in which this question may be addressed at the perturbative level. We point out that many of these uncertainties are so severe and often even irreducible that there are only very few scenarios in which an NLO approach, as crucial as it is for a real testability of any specific model, is actually sensible. Among these, the most promising seems to be the minimal renormalizable SO(10) GUT whose high-energy gauge symmetry is spontaneously broken by the adjoint and the five-index antisymmetric irreducible representations.

  13. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  14. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  15. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
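
    A minimal sketch of one of the methods listed, moving-block bootstrap resampling of a time series; the block length, series length, and the synthetic "recharge" series are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(5)

def block_bootstrap(series, block_len, n_samples):
    """Moving-block bootstrap: resample overlapping blocks with replacement
    and concatenate until the original series length is reached, preserving
    short-range autocorrelation inside each block."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=(n_samples, n_blocks))
    return np.array([
        np.concatenate([series[s:s + block_len] for s in row])[:n]
        for row in starts])

# Synthetic annual recharge anomalies with some persistence:
recharge = np.sin(np.arange(60) / 5.0) + rng.normal(0.0, 0.3, 60)
boot = block_bootstrap(recharge, block_len=5, n_samples=1000)
print(boot.mean(axis=1).std())  # bootstrap uncertainty of the series mean
```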

  16. Load Balancing Scientific Applications

    SciTech Connect

    Pearce, Olga Tkachyshyn

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
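
    A minimal sketch of the kind of when-to-balance cost model the abstract describes: rebalance only when the time predicted to be saved over the remaining steps exceeds the cost of running the balancing algorithm itself. The function, its parameters, and all numbers are hypothetical, not the dissertation's model:

```python
def should_rebalance(loads, steps_remaining, rebalance_cost):
    """Rebalance when predicted savings exceed the algorithm's own cost.
    In SPMD execution every step costs max(loads), since all ranks wait
    for the slowest; a perfect rebalance would cost mean(loads) instead."""
    t_max = max(loads)
    t_avg = sum(loads) / len(loads)
    predicted_savings = (t_max - t_avg) * steps_remaining
    return predicted_savings > rebalance_cost

# Per-processor work (seconds) measured for the current timestep:
loads = [1.0, 1.1, 0.9, 2.4]  # one overloaded rank drags every step
print(should_rebalance(loads, steps_remaining=100, rebalance_cost=5.0))
```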

  17. Addressing Phonological Questions with Ultrasound

    ERIC Educational Resources Information Center

    Davidson, Lisa

    2005-01-01

    Ultrasound can be used to address unresolved questions in phonological theory. To date, some studies have shown that results from ultrasound imaging can shed light on how differences in phonological elements are implemented. Phenomena that have been investigated include transitional schwa, vowel coalescence, and transparent vowels. A study of…

  18. Communities Address Barriers to Connectivity.

    ERIC Educational Resources Information Center

    Byers, Anne

    1996-01-01

    Rural areas lag behind urban areas in access to information technologies. Public institutions play a critical role in extending the benefits of information technologies to those who would not otherwise have access. The most successful rural telecommunications plans address barriers to use, such as unawareness of the benefits, technophobia, the…

  19. Keynote Address: Rev. Mark Massa

    ERIC Educational Resources Information Center

    Massa, Mark S.

    2011-01-01

    Rev. Mark S. Massa, S.J., is the dean and professor of Church history at the School of Theology and Ministry at Boston College. He was invited to give a keynote to begin the third Catholic Higher Education Collaborative Conference (CHEC), cosponsored by Boston College and Fordham University. Fr. Massa's address posed critical questions about…

  20. State of the Lab Address

    ScienceCinema

    King, Alex

    2013-03-01

    In his third-annual State of the Lab address, Ames Laboratory Director Alex King called the past year one of "quiet but strong progress" and called for Ames Laboratory to continue to build on its strengths while responding to changing expectations for energy research.

  1. State of the Lab Address

    SciTech Connect

    King, Alex

    2010-01-01

    In his third-annual State of the Lab address, Ames Laboratory Director Alex King called the past year one of "quiet but strong progress" and called for Ames Laboratory to continue to build on its strengths while responding to changing expectations for energy research.

  2. Addressing Extremes within the WCRP - GEWEX Framework

    NASA Astrophysics Data System (ADS)

    van Oevelen, P. J.; Stewart, R.; Detemmerman, V.

    2008-12-01

    For large international coordination programs such as the Global Energy and Water Cycle Experiment (GEWEX), part of the World Climate Research Programme (WCRP), it is difficult to strike a good balance between enabling as much international involvement as is possible and desirable, and keeping the objectives achievable. WCRP has decided that "Extremes Research" is one of several areas where it would like to see its efforts strengthened and scientific research pushed forward. The foci selected should be phrased such that they are practical and achievable within a time span of 1 to 3 years. Preferably these foci build upon expertise from cross-WCRP activities and are not restricted to single core project activities. In this presentation an overview will be given of the various activities within GEWEX that are related to extremes, and of which ones would be most suitable to address as WCRP foci from a GEWEX perspective. The rationale and context of extremes research will be presented, as well as links to other national and international programs. "Extremes Research" as a topic is attractive since it has high societal relevance and impact. However, numerous definitions of extremes exist and they are used in widely varying contexts, so it is not always clear what exactly is being addressed. This presentation will give an outlook on what can be expected research-wise in the near future, based upon the outcomes of the Extremes Workshop organised last June in Vancouver in the context of the Coordinated Energy and water cycle Observations Project (CEOP), part of GEWEX. In particular, it will be shown how these activities, which will only address certain types of extremes, can be linked to adaptation and mitigation efforts taking place in other organisations and by national and international bodies.

  3. Addressing Risks to Advance Mental Health Research

    PubMed Central

    Iltis, Ana S.; Misra, Sahana; Dunn, Laura B.; Brown, Gregory K.; Campbell, Amy; Earll, Sarah A.; Glowinski, Anne; Hadley, Whitney B.; Pies, Ronald; DuBois, James M.

    2015-01-01

    Objective Risk communication and management are essential to the ethical conduct of research, yet addressing risks may be time consuming for investigators and institutional review boards (IRBs) may reject study designs that appear too risky. This can discourage needed research, particularly in higher risk protocols or those enrolling potentially vulnerable individuals, such as those with some level of suicidality. Improved mechanisms for addressing research risks may facilitate much needed psychiatric research. This article provides mental health researchers with practical approaches to: 1) identify and define various intrinsic research risks; 2) communicate these risks to others (e.g., potential participants, regulatory bodies, society); 3) manage these risks during the course of a study; and 4) justify the risks. Methods As part of a National Institute of Mental Health (NIMH)-funded scientific meeting series, a public conference and a closed-session expert panel meeting were held on managing and disclosing risks in mental health clinical trials. The expert panel reviewed the literature with a focus on empirical studies and developed recommendations for best practices and further research on managing and disclosing risks in mental health clinical trials. IRB review was not required because there were no human subjects. The NIMH played no role in developing or reviewing the manuscript. Results Challenges, current data, practical strategies, and topics for future research are addressed for each of four key areas pertaining to management and disclosure of risks in clinical trials: identifying and defining risks, communicating risks, managing risks during studies, and justifying research risks. Conclusions Empirical data on risk communication, managing risks, and the benefits of research can support the ethical conduct of mental health research and may help investigators better conceptualize and confront risks and to gain IRB approval. PMID:24173618

  4. Characterizing Uncertainty in Epidemiological Studies for use in Human Health Risk Assessment

    EPA Science Inventory

    Characterization of scientific uncertainty can provide risk assessments with a level of confidence regarding decisions, which allows for evaluation of the degree that uncertainty plays in the analysis of consequences of specific policies. To the best of our knowledge, there are no ...

  5. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta"—spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  6. The maintenance of uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    Contents: Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ι-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary.

  7. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
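
    A minimal sketch of the kinetic-imprecision point: rate-constant uncertainty factors grow when extrapolating away from 298 K toward Antarctic temperatures, and Monte Carlo sampling of log-normally distributed rates propagates that imprecision into a computed quantity. The uncertainty-factor form follows the JPL kinetics-evaluation convention; the reaction parameters and the 1/k "concentration" stand-in are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(6)

def rate_uncertainty_factor(f298, g, temp):
    """JPL-style uncertainty factor, growing away from 298 K:
    f(T) = f(298) * exp(g * |1/T - 1/298|)."""
    return f298 * np.exp(g * abs(1.0 / temp - 1.0 / 298.0))

k298, f298, g = 1.0e-12, 1.3, 200.0   # hypothetical reaction parameters
temp = 250.0                          # Antarctic boundary-layer temperature, K
f = rate_uncertainty_factor(f298, g, temp)

# Log-normal sampling with 1-sigma multiplicative factor f:
k = k298 * np.exp(rng.normal(0.0, np.log(f), size=10_000))
conc = 1.0 / k  # toy steady-state concentration proportional to 1/k
print(np.percentile(conc, [16, 50, 84]) / np.median(conc))
```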

  8. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, and literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa.

  9. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. It also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. The paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  10. Going public: good scientific conduct.

    PubMed

    Meyer, Gitte; Sandøe, Peter

    2012-06-01

    The paper addresses issues of scientific conduct regarding relations between science and the media, relations between scientists and journalists, and attitudes towards the public at large. In the large and increasing body of literature on scientific conduct and misconduct, these issues seem underexposed as ethical challenges. Consequently, individual scientists here tend to be left alone with problems and dilemmas, with no guidance for good conduct. Ideas are presented about how to make up for this omission. Using a practical, ethical approach, the paper attempts to identify ways scientists might deal with ethical public relations issues, guided by a norm or maxim of openness. Drawing on and rethinking the CUDOS codification of the scientific ethos, as it was worked out by Robert K. Merton in 1942, we propose that this, which is echoed in current codifications of norms for good scientific conduct, contains a tacit maxim of openness which may naturally be extended to cover the public relations of science. Discussing openness as access, accountability, transparency and receptiveness, the argumentation concentrates on the possible prevention of misconduct with respect to, on the one hand, sins of omission (withholding important information from the public) and, on the other hand, abuses of the authority of science in order to gain publicity. Statements from interviews with scientists are used to illustrate how scientists might view the relevance of the issues raised. PMID:21088921

  11. Scientific Assistant Virtual Laboratory (SAVL)

    NASA Astrophysics Data System (ADS)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL: * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations. * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software. * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network. We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.

  12. Communicating uncertainties in earth sciences in view of user needs

    NASA Astrophysics Data System (ADS)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

    Uncertainties are inevitable in all results obtained in the earth sciences, regardless whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply "Progressive Disclosure of Information" (PDI), moving from non-technical information through more specialised information according to user needs. Generalized information is typically directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences to give insight in underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allow probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions due to

  13. Scientific issues in radiation dose reconstruction.

    PubMed

    Toohey, Richard E

    2008-07-01

    Stakeholders have raised numerous issues regarding the scientific basis of radiation dose reconstruction for compensation. These issues can be grouped into three broad categories: data issues, dosimetry issues, and compensation issues. Data issues include demographic data of the worker, changes in site operations over time (both production and exposure control), characterization of episodic vs. chronic exposures, and the use of coworker data. Dosimetry issues include methods for assessment of ambient exposures, missed dose, unmonitored dose, and medical x-ray dose incurred as a condition of employment. Specific issues related to external dose include the sensitivity, angular and energy dependence of personal monitors, exposure geometries, and the accompanying uncertainties. Those related to internal dose include sensitivity of bioassay methods, uncertainties in biokinetic models, appropriate dose coefficients, and modeling uncertainties. Compensation issues include uncertainties in the risk models and use of the 99th percentile of the distribution of probability of causation for awarding compensation. A review of the scientific literature and analysis of each of these issues distinguishes factors that play a major role in the compensation decision from those that do not. PMID:18545027

  14. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  15. Keynote Address: Science Since the Medicean Stars and the Beagle

    NASA Astrophysics Data System (ADS)

    Partridge, B.; Hillenbrand, L. A.; Grinspoon, D.

    2010-08-01

    In 2009, the world celebrates both the International Year of Astronomy (IYA), commemorating the 400th anniversary of Galileo's first observations of the heavens with his telescope, and the 200th anniversary of the birth of Charles Darwin and the 150th anniversary of the publication of his Origin of Species, a key impetus for the 2009 Year of Science. In this keynote address, the three presenters (distinguished scientists themselves) will reflect on how these recent centuries of astronomical and scientific discovery have changed our perspectives about the universe, the natural world, and ourselves—and underpin our education and public outreach efforts to help ensure continued scientific advance in the future.

  16. WWW: The Scientific Method

    ERIC Educational Resources Information Center

    Blystone, Robert V.; Blodgett, Kevin

    2006-01-01

    The scientific method is the principal methodology by which biological knowledge is gained and disseminated. As fundamental as the scientific method may be, its historical development is poorly understood, its definition is variable, and its deployment is uneven. Scientific progress may occur without the strictures imposed by the formal…

  17. 3 CFR - Scientific Integrity

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Presidential Documents, Memorandum of March 9, 2009: Scientific Integrity. Memorandum for the Heads of Executive Departments and Agencies. Science and the scientific process must inform and guide decisions of my Administration on a wide range of...

  18. Scientific Literacy: Whose Responsibility?

    ERIC Educational Resources Information Center

    Evans, Thomas P.

    1970-01-01

    Identifies various components of scientific literacy and characteristics of scientifically literate people. Discusses factors inhibiting scientific literacy. Suggested remedies: federal support for special programs, redesign of teacher education programs and science content courses at all levels, and setting up means of interpreting science to the…

  19. Redefining the "Scientific Method".

    ERIC Educational Resources Information Center

    Spiece, Kelly R.; Colosi, Joseph

    2000-01-01

    Surveys 15 introductory biology textbooks for their presentation of the scientific method. Teaching the scientific method involves more than simplified steps and subjectivity--human politics, cultural influences, and chance are all a part of science. Presents an activity for students to experience the scientific method. (Contains 34 references.)…

  20. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    SciTech Connect

    Elderkin, C.E. ); Kelly, G.N. )

    1990-09-01

    In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are then followed by the results of the working groups, and the abstracts of individual presentations.

  1. QUANTIFYING UNCERTAINTY IN LONG RANGE TRANSPORT MODELS: WORKSHOP REPORT ON SOURCES AND EVALUATION OF UNCERTAINTY IN LONG-RANGE TRANSPORT MODELS

    EPA Science Inventory

    The quantification of uncertainty in long-range transport model predictions and the implications of these uncertainties on formulations of control policy have been the subject of investigations by both the United States and Canada. To more fully address these topics, the American...

  2. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer; it could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
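
    For reference, the intrinsic limit -log2 c mentioned above comes from the entropic uncertainty relation in the presence of quantum memory (Berta et al., 2010), which for two measurements Q and R on a system A entangled with a memory B reads:

```latex
% c is the maximum overlap between the two measurement bases:
% c = max_{i,j} |\langle \psi_i | \phi_j \rangle|^2
S(Q|B) + S(R|B) \;\geq\; -\log_2 c \; + \; S(A|B)
```

    A negative conditional entropy S(A|B), signalling entanglement with the memory, is what allows the right-hand side to drop below -log2 c.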

  3. Risk communication: Uncertainties and the numbers game

    SciTech Connect

    Ortigara, M.

    1995-08-30

    The science of risk assessment seeks to characterize the potential risk in situations that may pose hazards to human health or the environment. However, the conclusions reached by the scientists and engineers are not an end in themselves - they are passed on to the involved companies, government agencies, legislators, and the public. All interested parties must then decide what to do with the information. Risk communication is a type of technical communication that involves some unique challenges. This paper first defines the relationships between risk assessment, risk management, and risk communication and then explores two issues in risk communication: addressing uncertainty and putting risk numbers into perspective.

  4. Uncertainties in container failure time predictions

    SciTech Connect

    Williford, R.E.

    1990-01-01

    Stochastic variations in the local chemical environment of a geologic waste repository can cause corresponding variations in container corrosion rates and failure times, and thus in radionuclide release rates. This paper addresses how well the future variations in repository chemistries must be known in order to predict container failure times that are bounded by a finite time period within the repository lifetime. Preliminary results indicate that a 5000 year scatter in predicted container failure times requires that repository chemistries be known to within ±10% over the repository lifetime. These are small uncertainties compared to current estimates. 9 refs., 3 figs.

  5. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  6. Living with uncertainty

    SciTech Connect

    Rau, N.; Fong, C.C.; Grigg, C.H.; Silverstein, B.

    1994-11-01

    In the electric utility industry, only one thing can be guaranteed with absolute certainty: one lives and works with many unknowns. Thus, the industry has embraced probability methods to varying degrees over the last 25 years. These techniques aid decision makers in planning, operations, and maintenance by quantifying uncertainty. Examples include power system reliability, production costing simulation, and assessment of environmental factors. A series of brainstorming sessions was conducted by the Application of Probability Methods (APM) Subcommittee of the IEEE Power Engineering Society to identify research and development needs and to ask the question, "Where should we go from here?" The subcommittee examined areas of need in data development, applications, and methods for decision making. The purpose of this article is to share the thoughts of APM members and their findings with a broader audience, and to invite comments and participation.

  7. Addressing the vaccine confidence gap.

    PubMed

    Larson, Heidi J; Cooper, Louis Z; Eskola, Juhani; Katz, Samuel L; Ratzan, Scott

    2011-08-01

    Vaccines--often lauded as one of the greatest public health interventions--are losing public confidence. Some vaccine experts have referred to this decline in confidence as a crisis. We discuss some of the characteristics of the changing global environment that are contributing to increased public questioning of vaccines, and outline some of the specific determinants of public trust. Public decision making related to vaccine acceptance is neither driven by scientific nor economic evidence alone, but is also driven by a mix of psychological, sociocultural, and political factors, all of which need to be understood and taken into account by policy and other decision makers. Public trust in vaccines is highly variable and building trust depends on understanding perceptions of vaccines and vaccine risks, historical experiences, religious or political affiliations, and socioeconomic status. Although provision of accurate, scientifically based evidence on the risk-benefit ratios of vaccines is crucial, it is not enough to redress the gap between current levels of public confidence in vaccines and levels of trust needed to ensure adequate and sustained vaccine coverage. We call for more research not just on individual determinants of public trust, but on what mix of factors are most likely to sustain public trust. The vaccine community demands rigorous evidence on vaccine efficacy and safety and technical and operational feasibility when introducing a new vaccine, but has been negligent in demanding equally rigorous research to understand the psychological, social, and political factors that affect public trust in vaccines. PMID:21664679

  8. Back to the future: The Grassroots of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, K. A.

    2013-12-01

    Uncertainties are widespread within hydrological science, and as society is looking to models to provide answers as to how climate change may affect our future water resources, the performance of hydrological models should be evaluated. With uncertainties being introduced from input data, parameterisation, model structure, validation data, and 'unknown unknowns', it is easy to be pessimistic about model outputs. But uncertainties are an opportunity for scientific endeavour, not a threat. Investigation and suitable presentation of uncertainties, which results in a range of potential outcomes, provides more insight into model projections than just one answer. This paper aims to demonstrate the feasibility of conducting computationally demanding parameter uncertainty estimation experiments on global hydrological models (GHMs). Presently, individual GHMs tend to present their one, best projection, but this leads to spurious precision - a false impression of certainty - which can be misleading to decision makers. Whilst uncertainty estimation is firmly established in catchment hydrology, GHM uncertainty, and parameter uncertainty in particular, has remained largely overlooked. Model inter-comparison studies that investigate model structure uncertainty have been undertaken (e.g. ISI-MIP, EU-WATCH, etc.), but these studies seem premature when the uncertainties within each individual model itself have not yet been considered. This study takes a few steps back, going down to one of the first introductions of assumptions in model development, the assignment of model parameter values. Making use of the University of Nottingham's High Performance Computer Cluster (HPC), the Mac-PDM.09 GHM has been subjected to rigorous uncertainty experiments. The Generalised Likelihood Uncertainty Estimation method (GLUE) with Latin Hypercube Sampling has been applied to a GHM for the first time, to produce 100,000 simultaneous parameter perturbations. The results of this ensemble of 100,000…
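
    The GLUE-with-Latin-Hypercube procedure described above can be sketched compactly. The Python below is a minimal illustration under stated assumptions: run_model(params) is a hypothetical stand-in for a GHM run returning a simulated series, obs is an observed series, and the Nash-Sutcliffe likelihood measure with a fixed behavioural threshold is a common GLUE choice rather than a detail taken from this study.

      import numpy as np

      def latin_hypercube(n, bounds, rng):
          """Stratified sampling: one draw per equal-probability slice per dimension."""
          d = len(bounds)
          u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
          for j in range(d):
              u[:, j] = u[rng.permutation(n), j]   # decorrelate the columns
          lo, hi = np.asarray(bounds, dtype=float).T
          return lo + u * (hi - lo)

      def glue(run_model, obs, bounds, n=10_000, threshold=0.3, seed=0):
          """Keep 'behavioural' parameter sets, weighted by a likelihood measure."""
          rng = np.random.default_rng(seed)
          params = latin_hypercube(n, bounds, rng)
          sims = np.array([run_model(p) for p in params])   # one series per set
          nse = 1.0 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()
          keep = nse > threshold                            # behavioural sets only
          return params[keep], nse[keep] / nse[keep].sum()

    Weighted quantiles of the behavioural simulations then give the range of potential outcomes the abstract argues for, rather than a single best projection.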

  9. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address…

  10. The Study of Address Tree Coding Based on the Maximum Matching Algorithm in Courier Business

    NASA Astrophysics Data System (ADS)

    Zhou, Shumin; Tang, Bin; Li, Wen

    As an important component of EMS monitoring systems, the address, unlike the user name, carries great uncertainty because there are many ways to represent it. Address standardization is therefore a difficult task, and address tree coding has been trying to resolve the issue for many years. The zip code, the most widely used scheme, can only subdivide an address down to a designated post office, not to the recipient's address, so accurate delivery still requires manual identification. This paper puts forward a new encoding algorithm for the address tree - the maximum matching algorithm - to solve the problem. The algorithm combines the characteristics of the address tree with best-match theory, and introduces associated layers of tree nodes to improve matching efficiency. To account for the variability of addresses, the thesaurus of the address tree should be updated in a timely manner by adding new nodes automatically through intelligent tools.
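
    The core idea - greedily matching the longest known address element at each position against a trie (the "address tree") of standardized elements - can be sketched as follows. This is a minimal illustration assuming a flat character trie; the paper's associated layers of tree nodes and automatic thesaurus updates are not modeled, and build_trie/max_match are hypothetical names.

      class TrieNode:
          def __init__(self):
              self.children = {}
              self.is_term = False      # marks the end of a known address element

      def build_trie(vocabulary):
          root = TrieNode()
          for term in vocabulary:
              node = root
              for ch in term:
                  node = node.children.setdefault(ch, TrieNode())
              node.is_term = True
          return root

      def max_match(address, root):
          """Greedy forward maximum matching: at each position take the longest
          vocabulary term that matches; otherwise emit a single character."""
          tokens, i = [], 0
          while i < len(address):
              node, longest = root, 0
              for j, ch in enumerate(address[i:], start=1):
                  node = node.children.get(ch)
                  if node is None:
                      break
                  if node.is_term:
                      longest = j
              if longest:
                  tokens.append(address[i:i + longest])
                  i += longest
              else:
                  tokens.append(address[i])
                  i += 1
          return tokens

      trie = build_trie(["East Lake District", "East Lake Road"])
      print(max_match("East Lake Road 12", trie))   # ['East Lake Road', ' ', '1', '2']

    Because matching always prefers the longest known element, ambiguous prefixes ("East Lake…") resolve to the most specific node in the thesaurus, which is the behaviour the maximum matching algorithm relies on.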

  11. I Am Sure There May Be a Planet There: Student Articulation of Uncertainty in Argumentation Tasks

    ERIC Educational Resources Information Center

    Buck, Zoë E.; Lee, Hee-Sun; Flores, Joanna

    2014-01-01

    We investigated how students articulate uncertainty when they are engaged in structured scientific argumentation tasks where they generate, examine, and interpret data to determine the existence of exoplanets. In this study, 302 high school students completed 4 structured scientific arguments that followed a series of computer-model-based…

  12. Impact of discharge data uncertainty on nutrient load uncertainty

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
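
    The propagation chain - sampled rating curves to discharge series to annual loads - can be sketched as a Monte Carlo loop. This is a schematic under stated assumptions: a power-law rating curve Q = a(h - h0)^b with invented parameter uncertainty stands in for the Voting Point method's realisations, and the stage and concentration series are synthetic.

      import numpy as np

      rng = np.random.default_rng(42)

      def sample_rating_curves(n, a_mu=5.0, a_sd=0.5, b_mu=1.8, b_sd=0.1):
          """Draw plausible power-law rating curves Q = a*(h - h0)^b; a stand-in
          for the Voting Point method's curve realisations."""
          a = rng.lognormal(np.log(a_mu), a_sd / a_mu, n)
          b = rng.normal(b_mu, b_sd, n)
          return a, b

      def annual_load(stage, conc, a, b, h0=0.2, dt_s=900):
          """Load = sum of Q(t)*C(t)*dt over 15-minute steps; conc assumed
          already interpolated to the same steps (kg, if C is in kg/m3)."""
          q = a * np.clip(stage - h0, 0, None) ** b          # discharge, m3/s
          return np.sum(q * conc) * dt_s

      stage = 0.5 + 0.3 * np.sin(np.linspace(0, 2 * np.pi, 35040))  # synthetic year
      conc = np.full_like(stage, 5e-5)                               # 0.05 mg/L
      a, b = sample_rating_curves(1000)
      loads = np.array([annual_load(stage, conc, ai, bi) for ai, bi in zip(a, b)])
      print(f"median {np.median(loads):,.0f} kg, "
            f"90% interval [{np.percentile(loads, 5):,.0f}, {np.percentile(loads, 95):,.0f}]")

    The spread of loads across sampled curves isolates the rating-curve contribution; the sparser monthly water quality sampling would be layered on top in the same Monte Carlo fashion.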

  13. Uncertainty in Regional Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Digar, Antara

    …concentrations (oxides of nitrogen) have been used to adjust probabilistic estimates of pollutant sensitivities based on the performance of simulations in reliably reproducing ambient measurements. Various observational metrics have been explored for better scientific understanding of how sensitivity estimates vary with measurement constraints. Future work could extend these methods to incorporate additional modeling uncertainties and alternate observational metrics, and explore the responsiveness of future air quality to projected trends in emissions and climate change.

  14. A framework for modeling anthropogenic impacts on waterbird habitats: addressing future uncertainty in conservation planning

    USGS Publications Warehouse

    Matchett, Elliott L.; Fleskes, Joseph P.; Young, Charles A.; Purkey, David R.

    2015-01-01

    The amount and quality of natural resources available for terrestrial and aquatic wildlife habitats are expected to decrease throughout the world in areas that are intensively managed for urban and agricultural uses. Changes in climate and management of increasingly limited water supplies may further impact water resources essential for sustaining habitats. In this report, we document adapting a Water Evaluation and Planning (WEAP) system model for the Central Valley of California. We demonstrate using this adapted model (WEAP-CVwh) to evaluate impacts produced from plausible future scenarios on agricultural and wetland habitats used by waterbirds and other wildlife. Processed output from WEAP-CVwh indicated varying levels of impact caused by projected climate, urbanization, and water supply management in scenarios used to exemplify this approach. Among scenarios, the NCAR-CCSM3 A2 climate projection had a greater impact than the CNRM-CM3 B1 climate projection, whereas expansive urbanization had a greater impact than strategic urbanization, on annual availability of waterbird habitat. Scenarios including extensive rice-idling or substantial instream flow requirements on important water supply sources produced large impacts on annual availability of waterbird habitat. In the year corresponding with the greatest habitat reduction for each scenario, the scenario including instream flow requirements resulted in the greatest decrease in habitats throughout all months of the wintering period relative to other scenarios. This approach provides a new and useful tool for habitat conservation planning in the Central Valley and a model to guide similar research investigations aiming to inform conservation, management, and restoration of important wildlife habitats.

  15. Using Robust Decision Making to Address Climate Change Uncertainties in Water Quality Management

    EPA Science Inventory

    Results of robust decision making simulations show that both climate and land use change will need to be taken into account in order to implement BMP strategies that are more likely to meet the goals for the Patuxent River for both phosphorus and nitrogen.

  16. Evaluating Health Risks from Inhaled Polychlorinated Biphenyls: Research Needs for Addressing Uncertainty

    EPA Science Inventory

    Indoor air polychlorinated biphenyl (PCB) concentrations in some U.S. schools are one or more orders of magnitude higher than background levels. In response to this, efforts have been made to assess the potential health risk posed by inhaled PCBs. These efforts are hindered by un...

  17. Soliciting scientific information and beliefs in predictive modeling and adaptive management

    NASA Astrophysics Data System (ADS)

    Glynn, P. D.; Voinov, A. A.; Shapiro, C. D.

    2015-12-01

    Post-normal science requires public engagement and adaptive corrections in addressing issues with high complexity and uncertainty. An adaptive management framework is presented for the improved management of natural resources and environments through a public participation process. The framework solicits the gathering, transformation, and/or modeling of scientific information, but also explicitly solicits the expression of participant beliefs. Beliefs and information are compared, explicitly discussed for alignments or misalignments, and ultimately melded back together as a "knowledge" basis for making decisions. An effort is made to recognize the human or participant biases that may affect the information base and the potential decisions. In a separate step, an attempt is made to recognize and predict the potential "winners" and "losers" (perceived or real) of any decision or action. These "winners" and "losers" include present human communities with different spatial, demographic or socio-economic characteristics as well as more dispersed or more diffusely characterized regional or global communities. "Winners" and "losers" may also include future human communities as well as communities of other biotic species. As in any adaptive management framework, assessment of predictions, iterative follow-through and adaptation of policies or actions are essential, and commonly very difficult or impossible to achieve. Recognizing beforehand the limits of adaptive management is essential. More generally, knowledge of the behavioral and economic sciences and of ethics and sociology will be key to a successful implementation of this adaptive management framework. Knowledge of biogeophysical processes will also be essential, but by definition of the issues being addressed, will always be incomplete and highly uncertain. The human dimensions of the issues addressed and the participatory processes used carry their own complexities and uncertainties. Some ideas and principles are…

  18. New approaches to uncertainty analysis for use in aggregate and cumulative risk assessment of pesticides.

    PubMed

    Kennedy, Marc C; van der Voet, Hilko; Roelofs, Victoria J; Roelofs, Willem; Glass, C Richard; de Boer, Waldo J; Kruisselbrink, Johannes W; Hart, Andy D M

    2015-05-01

    Risk assessments for human exposures to plant protection products (PPPs) have traditionally focussed on single routes of exposure and single compounds. Extensions to estimate aggregate (multi-source) and cumulative (multi-compound) exposure from PPPs present many new challenges and additional uncertainties that should be addressed as part of risk analysis and decision-making. A general approach is outlined for identifying and classifying the relevant uncertainties and variabilities. The implementation of uncertainty analysis within the MCRA software, developed as part of the EU-funded ACROPOLIS project to address some of these uncertainties, is demonstrated. An example is presented for dietary and non-dietary exposures to the triazole class of compounds. This demonstrates the chaining of models, linking variability and uncertainty generated from an external model for bystander exposure with variability and uncertainty in MCRA dietary exposure assessments. A new method is also presented for combining pesticide usage survey information with limited residue monitoring data, to address non-detect uncertainty. The results show that incorporating usage information reduces uncertainty in parameters of the residue distribution but that in this case quantifying uncertainty is not a priority, at least for UK grown crops. A general discussion of alternative approaches to treat uncertainty, either quantitatively or qualitatively, is included. PMID:25688423
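
    One way to picture the non-detect treatment is a bootstrap in which each non-detect is resolved using the probability that the crop was treated at all. This is a toy illustration of the idea, not the MCRA implementation; all names and numbers are invented placeholders.

      import numpy as np

      def sample_residues(detects, n_nondetect, lod, p_use, n_draws=100_000, seed=1):
          """Each draw picks a random sample; a non-detect is resolved as either a
          true zero (crop untreated) or an unobserved value below the LOD (treated)."""
          rng = np.random.default_rng(seed)
          detects = np.asarray(detects, dtype=float)
          n_total = len(detects) + n_nondetect
          idx = rng.integers(n_total, size=n_draws)
          out = np.where(idx < len(detects),
                         detects[np.minimum(idx, len(detects) - 1)], 0.0)
          nondet = idx >= len(detects)
          treated = nondet & (rng.random(n_draws) < p_use)   # usage-survey probability
          out[treated] = rng.uniform(0.0, lod, treated.sum())
          return out

      res = sample_residues(detects=[0.02, 0.05, 0.04],       # mg/kg, illustrative
                            n_nondetect=17, lod=0.01, p_use=0.35)
      print(res.mean(), np.percentile(res, 97.5))

    Incorporating usage data this way concentrates probability mass at true zeros rather than spreading it below the LOD, which is plausibly how usage information narrows the residue-distribution parameters.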

  19. Uncertainty and Anticipation in Anxiety

    PubMed Central

    Grupe, Dan W.; Nitschke, Jack B.

    2014-01-01

    Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we review the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199

  20. Nanoscale content-addressable memory

    NASA Technical Reports Server (NTRS)

    Davis, Bryan (Inventor); Principe, Jose C. (Inventor); Fortes, Jose (Inventor)

    2009-01-01

    A combined content addressable memory device and memory interface is provided. The combined device and interface includes one or more molecular wire crossbar memories having spaced-apart key nanowires, spaced-apart value nanowires adjacent to the key nanowires, and configurable switches between the key nanowires and the value nanowires. The combination further includes a key microwire-nanowire grid (key MNG) electrically connected to the spaced-apart key nanowires, and a value microwire-nanowire grid (value MNG) electrically connected to the spaced-apart value nanowires. A key or value MNG selects multiple nanowires for a given key or value.
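
    Functionally, the device behaves like a classic content-addressable memory: present a key pattern, read back the values whose stored keys match. The sketch below is a toy software model of that behaviour (not the patented nanowire hardware), with an optional mask for don't-care bits.

      class ContentAddressableMemory:
          """Toy key/value CAM: rows store (key, value); a lookup returns the
          values of every row whose key matches on the unmasked bit positions."""
          def __init__(self):
              self.rows = []

          def write(self, key, value):
              self.rows.append((key, value))

          def match(self, key, mask=None):
              # mask bit 1 = compare this position, 0 = don't care
              if mask is None:
                  mask = (1 << key.bit_length()) - 1 or 1
              return [v for k, v in self.rows if (k ^ key) & mask == 0]

      cam = ContentAddressableMemory()
      cam.write(0b1010, 0xA)
      cam.write(0b1011, 0xB)
      print(cam.match(0b1010))             # [10]
      print(cam.match(0b1010, 0b1110))     # [10, 11] - last bit is don't-care

    The mask loosely corresponds to partially selecting key lines: selecting multiple nanowires at once lets one query stand for a family of keys.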

  1. Addressing the workforce pipeline challenge

    SciTech Connect

    Leonard Bond; Kevin Kostelnik; Richard Holman

    2006-11-01

    A secure and affordable energy supply is essential for achieving U.S. national security, continuing U.S. prosperity, and laying the foundations to enable future economic growth. To meet this goal, the next generation energy workforce in the U.S., in particular those needed to support instrumentation, controls, and advanced operations and maintenance, is a critical element. The workforce is aging, and a new workforce pipeline, to support both the current generation of plants and new builds, has yet to be established. The paper reviews the challenges and some actions being taken to address this need.

  2. Identifying and Addressing Vaccine Hesitancy

    PubMed Central

    Kestenbaum, Lori A.; Feemster, Kristen A.

    2015-01-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as vaccine hesitant. This phenomenon has developed due to the confluence of multiple social, cultural, political and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance. PMID:25875982

  3. Identifying and addressing vaccine hesitancy.

    PubMed

    Kestenbaum, Lori A; Feemster, Kristen A

    2015-04-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as "vaccine hesitant." This phenomenon has developed due to the confluence of multiple social, cultural, political, and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance. PMID:25875982

  4. Scientific dishonesty and good scientific practice.

    PubMed

    Andersen, D; Axelsen, N H; Riis, P

    1993-04-01

    Scientific dishonesty has been the subject of much public interest in recent years. Although the problem has had a low profile in Denmark, there is no reason to believe that it is non-existent. Several preconditions known to be important prevail here as well as in other countries, such as pressure to publish and severe competition for research grants and senior academic positions. The Danish Medical Research Council (DMRC) decided to respond to this problem by preparing a report on scientific dishonesty, with suggestions to the research institutions on rules for good scientific practice and procedures for investigating suspected dishonesty. To this end, an investigatory system was suggested, consisting of two regional committees and one national committee, each headed by a high court judge, with experienced health science researchers as members. The committees will investigate cases reported to them and conclude whether dishonesty has been established and whether the scientific work should be retracted. Sanctions shall remain the task of the institutions. Preventive measures comprise open access to and a long storage period for scientific data. PMID:8495601

  5. Advances in Scientific Investigation and Automation.

    ERIC Educational Resources Information Center

    Abt, Jeffrey; And Others

    1987-01-01

    Six articles address: (1) the impact of science on the physical examination and treatment of books; (2) equipment for physical examination of books; (3) research using the cyclotron for historical analysis; (4) scientific analysis of paper and ink in early maps; (5) recent advances in automation; and (6) cataloging standards. (MES)

  6. Resource Materials on Scientific Integrity Issues.

    ERIC Educational Resources Information Center

    Macrina, Francis L., Ed.; Munro, Cindy L., Ed.

    1993-01-01

    The annotated bibliography contains 26 citations of books, monographs, and articles that may be useful to faculty and students in courses on scientific integrity. Topics addressed include ethical and legal considerations, fraud, technical writing and publication, intellectual property, notetaking, case study approach, conflict of interest, and…

  7. International scientific cooperation: past and future.

    NASA Astrophysics Data System (ADS)

    Roederer, J. G.

    1987-09-01

    This article addresses some non-scientific, yet no less significant, aspects of international cooperation in science, focuses on the social responsibility of the scientists engaged in cooperative research, and relates this to Marcel Nicolet's role in and contributions to international programs.

  8. Database Handling Software and Scientific Applications.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Discusses the general characteristics of database management systems and file systems. Also gives a basic framework for evaluating such software and suggests characteristics that should be considered when buying software for specific scientific applications. A list of vendor addresses for popular database management systems is included. (JN)

  9. Scientific review of great basin wildfire issues

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The University Nevada Reno, College of Agriculture and Resource Concepts Inc., co-sponsored a Great Basin Wildfire Forum in September 2007 to address a “Scientific Review of the Ecological and Management History of Great Basin Natural Resources and Recommendations to Achieve Ecosystem Restoration”. ...

  10. Scientific Review of Great Basin Wildfire Issues

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The University Nevada Reno, College of Agriculture and Resource Concepts Inc., co-sponsored a Great Basin Wildfire Forum in September 2007 to address a “Scientific Review of the Ecological and Management History of Great Basin Natural Resources and Recommendations to Achieve Ecosystem Restoration”. ...

  11. Connecting Scientific Reasoning and Causal Inference

    ERIC Educational Resources Information Center

    Kuhn, Deanna; Dean, David, Jr.

    2004-01-01

    Literature on multivariable causal inference (MCI) and literature on scientific reasoning (SR) have proceeded almost entirely independently, although they in large part address the same phenomena. An effort is made to bring these paradigms into close enough alignment with one another to compare implications of the two lines of work and examine how…

  12. A solution (data architecture) for handling time-series data - sensor data (4D), its visualisation and the questions around uncertainty of this data

    NASA Astrophysics Data System (ADS)

    Nayembil, Martin; Barkwith, Andrew

    2016-04-01

    Geo-environmental research is increasingly in the age of data-driven research. It has become necessary to collect, store, integrate and visualise more subsurface data for environmental research. The information required to facilitate data-driven research is often characterised by its variability, volume, complexity and frequency. This has necessitated the development of suitable data workflows, hybrid data architectures, and multiple visualisation solutions to provide the proper context to scientists and to enable their understanding of the different trends that the data display for their many scientific interpretations. However, these data, predominantly time series (4D) acquired through sensors and mostly telemetered, pose significant challenges in quantifying their uncertainty. To validate the research answers, including the data methodologies, the following open questions around uncertainty need addressing, i.e. uncertainty generated from: • the instruments used for data capture; • the transfer process of the data, often from remote locations through telemetry; • the data processing techniques used for harmonising and integrating data from multiple sensor outlets; • the approximations applied to visualise such data, from various conversion factors to unit standardisation. The main question remains: how do we deal with the issues around uncertainty when it comes to the large and variable amounts of time-series data we collect, harmonise and visualise for the data-driven geo-environmental research that we undertake today?

  13. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties including in CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.
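
    The quantification described above - six geological models crossed with 11 climate projections - can be illustrated with a balanced two-way variance decomposition. A minimal sketch with synthetic numbers; only the 6 x 11 design and the idea of partitioning ensemble variance are taken from the abstract.

      import numpy as np

      def variance_decomposition(changes):
          """Split the variance of projected changes (geology x climate grid)
          into main effects and an interaction/residual term."""
          grand = changes.mean()
          geo = changes.mean(axis=1) - grand          # 6 geological models
          clim = changes.mean(axis=0) - grand         # 11 climate models
          resid = changes - grand - geo[:, None] - clim[None, :]
          total = ((changes - grand) ** 2).mean()
          return {"geology": (geo ** 2).mean() / total,
                  "climate": (clim ** 2).mean() / total,
                  "interaction": (resid ** 2).mean() / total}

      rng = np.random.default_rng(7)
      changes = rng.normal(0.1, 0.05, (6, 11))   # e.g. fractional change in travel time
      print(variance_decomposition(changes))

    In this balanced design the three fractions sum to one, so the relative importance of geological versus climate model uncertainty can be read off directly for each projected variable.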

  14. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  15. [Local knowledge and dilemmas related to validity and applicability of scientific knowledge in rural areas].

    PubMed

    Rozemberg, Brani

    2007-01-01

    Based on previous experience from two research projects on schistosomiasis in rural populations, this article focuses on the relations between scientific health knowledge and health-related common sense in farming communities. The article discusses factors that affect the meaning of participation by these communities in exogenous programs, as well as the dilemmas related to the appropriation, validity, and applicability of the multiple, non-contextualized health information offered by such programs. The article discusses how the uncritical aggregation of large amounts of information, a feature of globalization, deepens the feeling of uncertainty in rural communities and the tendency to attribute diseases to fate. Meanwhile, the consumption of medical technologies is viewed as a symbol of progress and is highly valued by these groups. The discussion addresses the important role of health personnel in valuing local empirical knowledge, fostering the incorporation of useful technical knowledge without compromising the cultural heritage on which the identity and health of such groups are based. PMID:17308723

  16. Uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  17. Nanomedicine: Governing uncertainties

    NASA Astrophysics Data System (ADS)

    Trisolino, Antonella

    Nanomedicine is a promising and revolutionary field for improving medical diagnoses and therapies, leading to a higher quality of life for everybody. Huge benefits are expected from nanomedicine applications in the diagnostic and therapeutic fields. However, nanomedicine poses several issues concerning risks to human health. This thesis defends a perspective of risk governance that supports the scientific knowledge process by developing guidelines and providing the minimum safety standards acceptable to protect human health. Although nanomedicine is at an early stage of discovery, cautious measures are required to provide regulatory mechanisms able to respond to the unique set of challenges associated with nanomedicine. Nanotechnology offers a unique opportunity to intensify the interplay between different disciplines such as science and law. This multidisciplinary approach can contribute positively to finding reliable regulatory choices and responsive normative tools for dealing with the challenges of novel technologies.

  18. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP's impact on the equivalence principles, including the universality of gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Concerns about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
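
    One widely used higher-order form makes the minimal length explicit. The following LaTeX states an illustrative one-parameter GUP (conventions and signs vary between the approaches reviewed; this is a textbook form, not a result of the paper):

      \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left[ 1 + \beta\,(\Delta p)^2 \right],
      \qquad \beta = \beta_0\,\frac{\ell_{\mathrm{Pl}}^2}{\hbar^2}.

    Minimizing the bound over \Delta p gives the minimal length uncertainty \Delta x_{\min} = \hbar\sqrt{\beta} = \sqrt{\beta_0}\,\ell_{\mathrm{Pl}}, which is how a minimal measurable length enters these models.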

  19. A review of uncertainty visualization within the IPCC reports

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Reusser, Dominik; Wrobel, Markus

    2015-04-01

    Results derived from climate model simulations confront non-expert users with a variety of uncertainties. This gives rise to the challenge that the scientific information must be communicated so that it is easily understood, while the complexity of the science behind it is still conveyed. With respect to the assessment reports of the IPCC, the situation is even more complicated, because heterogeneous sources and multiple types of uncertainties need to be compiled together. Within this work, we systematically (1) analyzed the visual representation of uncertainties in the IPCC AR4 and AR5 reports, and (2) conducted a questionnaire to evaluate how different user groups, such as decision-makers and teachers, understand these uncertainty visualizations. In the first step, we classified visual uncertainty metaphors for spatial, temporal and abstract representations. As a result, we clearly identified a high complexity of the IPCC visualizations compared to standard presentation graphics, sometimes even integrating two or more uncertainty classes / measures together with the "certain" (mean) information. We further identified complex written uncertainty explanations within image captions, even within the summary reports for policy makers. In the second step, based on these observations, we designed a questionnaire to investigate how non-climate experts understand these visual representations of uncertainties, how visual uncertainty coding might hinder the perception of the "non-uncertain" data, and whether alternatives for certain IPCC visualizations exist. Within the talk/poster, we will present first results from this questionnaire. Summarizing, we identified a clear trend towards complex images within the latest IPCC reports, with a tendency to incorporate as much information as possible into the visual representations, resulting in proprietary, non-standard graphic representations that are not necessarily easy to comprehend at a glance. We conclude that…

  20. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    SciTech Connect

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.

  1. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  2. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
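
    As a sketch of the probabilistic side, the following Python shows a minimal non-intrusive polynomial chaos fit for a scalar response with one uncertain input. It illustrates the PC idea only, not the project's CSP machinery; the toy model function is invented.

      import numpy as np
      from math import factorial
      from numpy.polynomial.hermite_e import hermevander

      def pce_fit(model, degree=4, n_samples=2000, seed=0):
          """Non-intrusive PC: least-squares projection of the model response
          onto probabilists' Hermite polynomials of a standard normal input."""
          rng = np.random.default_rng(seed)
          xi = rng.standard_normal(n_samples)
          psi = hermevander(xi, degree)            # columns He_0 .. He_degree
          coeffs, *_ = np.linalg.lstsq(psi, model(xi), rcond=None)
          return coeffs

      # Toy network response with one uncertain (log-normally acting) rate parameter
      c = pce_fit(lambda xi: np.exp(0.3 * xi))
      # He_k orthogonality under the Gaussian measure: mean = c0, var = sum k!*c_k^2
      var = sum(factorial(k) * c[k] ** 2 for k in range(1, len(c)))
      print("PCE mean:", c[0], "PCE variance:", var)

    The fitted coefficients define a cheap surrogate whose slow/fast structure could then be examined by a CSP-style analysis; that step is beyond this sketch.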

  3. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    SciTech Connect

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, HTO/H2O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
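
    The combination step itself is the standard root-sum-of-squares of independent components under ISO 17025 / GUM assumptions. A minimal sketch; the component names follow the abstract, but the numerical values are invented placeholders.

      from math import sqrt

      def combined_standard_uncertainty(components):
          """Quadrature combination of independent relative standard uncertainties."""
          return sqrt(sum(u ** 2 for u in components.values()))

      components = {                                  # relative values, illustrative
          "counting statistics": 0.040,
          "container absorption/desorption": 0.010,
          "HTO/H2O distillation isotope effect": 0.005,
          "pipette volume": 0.003,
          "tritium standard": 0.015,
      }
      u_c = combined_standard_uncertainty(components)
      print(f"combined: {u_c:.3f}, expanded (k=2): {2 * u_c:.3f}")

    Expanding with k = 2 gives the roughly 95% coverage interval typically reported alongside environmental results, rather than counting uncertainty alone.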

  4. Varieties of uncertainty in health care: a conceptual taxonomy

    PubMed Central

    Han, Paul K.J.; Klein, William M.P.; Arora, Neeraj K.

    2011-01-01

    Uncertainty is a pervasive and important problem that has attracted increasing attention in health care, given the growing emphasis on evidence-based medicine, shared decision making, and patient-centered care. However, our understanding of this problem is limited, due in part to the absence of a unified, coherent concept of uncertainty. There are multiple meanings and varieties of uncertainty in health care, which are not often distinguished or acknowledged although each may have unique effects or warrant different courses of action. The literature on uncertainty in health care is thus fragmented, and existing insights have been incompletely translated to clinical practice. In this paper we attempt to address this problem by synthesizing diverse theoretical and empirical literature from the fields of communication, decision science, engineering, health services research, and psychology, and developing a new integrative conceptual taxonomy of uncertainty. We propose a three-dimensional taxonomy that characterizes uncertainty in health care according to its fundamental sources, issues, and locus. We show how this new taxonomy facilitates an organized approach to the problem of uncertainty in health care by clarifying its nature and prognosis, and suggesting appropriate strategies for its analysis and management. PMID:22067431

  5. Knowledge, consensus and uncertainty.

    PubMed

    Cavell, M

    1999-12-01

    Some months ago the editors of this journal asked me if I would undertake a series of short entries of a general sort on philosophical topics germane to current discussions in psychoanalysis. Both authors and topics were left to my discretion. I thought the series was a good idea and gladly agreed to do it. To my surprise and pleasure, all the philosophers I invited accepted. I am only sorry that the series could not be longer, as there are other philosophers as well who would have been splendid participants, and other topics I would like to have addressed. The essays that will follow in subsequent issues represent by and large the tradition of analytic philosophy, though this has come in the last few decades to comprise many of the themes we used to associate with the Continental tradition. Future entries, by James Conant, Donald Davidson, Pascal Engel, Dagfinn Føllesdal, James Hopkins, Ernest Le Pore, Jeffrey Malpas, Jerome Neu, Brian O'Shaughnessy, Richard Rorty and Richard Wollheim, will address the following topics: intersubjectivity, meaning and language, consciousness and perception, pragmatism, knowledge and belief, norms and nature, metaphor, hermeneutics, truth, self-deception, the emotions. The essay below on knowledge, which will also be the topic of another entry by a different author later on, is the only one in the series that I will write. PMID:10669971

  6. Assessing uncertainties in GHG emission estimates from Canada's oil sands developments

    NASA Astrophysics Data System (ADS)

    Kim, M. G.; Lin, J. C.; Huang, L.; Edwards, T. W.; Worthy, D.; Wang, D. K.; Sweeney, C.; White, J. W.; Andrews, A. E.; Bruhwiler, L.; Oda, T.; Deng, F.

    2013-12-01

    Reducing uncertainties in projections of surface emissions of CO2 and CH4 relies on continuously improving our scientific understanding of the exchange processes between the atmosphere and land at regional scales. In order to enhance our understanding in emission processes and atmospheric transports, an integrated framework that addresses individual natural and anthropogenic factors in a complementary way proves to be invaluable. This study presents an example of top-down inverse modeling that utilizes high precision measurement data collected at a Canadian greenhouse gas monitoring site. The measurements include multiple tracers encompassing standard greenhouse gas species, stable isotopes of CO2, and combustion-related species. The potential for the proposed analysis framework is demonstrated using Stochastic Time-Inverted Lagrangian Transport (STILT) model runs to yield a unique regional-scale constraint that can be used to relate the observed changes of tracer concentrations to the processes in their upwind source regions. The uncertainties in emission estimates are assessed using different transport fields and background concentrations coupled with the STILT model. Also, methods to further reduce uncertainties in the retrieved emissions by incorporating additional constraints including tracer-to-tracer correlations and satellite measurements are briefly discussed. The inversion approach both reproduces source areas in a spatially explicit way through sophisticated Lagrangian transport modeling and infers emission processes that leave imprints on atmospheric tracers. The results indicate that the changes in greenhouse gas concentration are strongly influenced by regional sources, including significant contributions from fossil fuel emissions, and that the integrated approach can be used for regulatory regimes to verify reported emissions of the greenhouse gas from oil sands developments.
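
    At the heart of such top-down estimates is the receptor-oriented relation between observed concentration changes and upwind fluxes. The sketch below is a schematic of that footprint-times-flux bookkeeping, not the STILT code or its actual interfaces; grids and numbers are synthetic.

      import numpy as np

      def enhancement(footprint, flux):
          """Receptor enhancement: the footprint (sensitivity, e.g. ppm per unit
          flux) convolved with a gridded surface flux over the upwind domain."""
          return float(np.sum(footprint * flux))

      def mixing_ratio(background, footprint, flux_components):
          """Background plus the enhancement contributed by each source process."""
          return background + sum(enhancement(footprint, f)
                                  for f in flux_components.values())

      rng = np.random.default_rng(3)
      fp = rng.random((50, 60)) * 1e-3                     # synthetic footprint grid
      fluxes = {"fossil_fuel": rng.random((50, 60)) * 2.0, # synthetic flux fields
                "biosphere": rng.random((50, 60)) * -1.0,
                "oil_sands": rng.random((50, 60)) * 0.5}
      print(mixing_ratio(400.0, fp, fluxes))

    Inversion then adjusts the flux components until modeled mixing ratios match the tower observations; that optimization step, and the transport-field and background uncertainties the abstract assesses, are omitted here.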

  7. Scientific integrity memorandum

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2009-03-01

    U.S. President Barack Obama signed a presidential memorandum on 9 March to help restore scientific integrity in government decision making. The memorandum directs the White House Office of Science and Technology Policy to develop a strategy within 120 days that ensures that "the selection of scientists and technology professionals for science and technology positions in the executive branch is based on those individuals' scientific and technological knowledge, credentials, and experience; agencies make available to the public the scientific or technological findings or conclusions considered or relied upon in policy decisions; agencies use scientific and technological information that has been subject to well-established scientific processes such as peer review; and agencies have appropriate rules and procedures to ensure the integrity of the scientific process within the agency, including whistleblower protection."

  8. Housing Uncertainty and Childhood Impatience

    ERIC Educational Resources Information Center

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  9. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  10. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  11. Scientific Journalism in Armenia

    NASA Astrophysics Data System (ADS)

    Farmanyan, S. V.; Mickaelian, A. M.

    2015-07-01

    In the present study, the problems of scientific journalism and the activities of Armenian science journalists are presented. Scientific journalism in the world, forms of its activities, Armenian Astronomical Society (ArAS) press-releases and their subjects, the ArAS website "Mass Media News" section, annual and monthly calendars of astronomical events, and the "Astghagitak" online journal are described. The most interesting astronomical subjects involved in scientific journalism, reasons for unsatisfactory science outreach, and possible solutions are discussed.

  12. Addressing viral resistance through vaccines

    PubMed Central

    Laughlin, Catherine; Schleif, Amanda; Heilman, Carole A

    2015-01-01

    Antimicrobial resistance is a serious healthcare concern affecting millions of people around the world. Antiviral resistance has been viewed as a lesser threat than antibiotic resistance, but it is important to consider approaches to address this growing issue. While vaccination is a logical strategy, and has been shown to be successful many times over, next generation viral vaccines with a specific goal of curbing antiviral resistance will need to clear several hurdles including vaccine design, evaluation and implementation. This article suggests that a new model of vaccination may need to be considered: rather than focusing on public health, this model would primarily target sectors of the population who are at high risk for complications from certain infections. PMID:26604979

  13. Addressing failures in exascale computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, Jim; Bose, Pradip; Cappello, Franck; Carlson, William; Chien, Andrew A.; Coteus, Paul; Debardeleben, Nathan A.; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Saverio, Fazzari; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Robert; Stearly, Jon; Van Hensbergen, Eric

    2014-05-01

    We present here a report produced by a workshop on “Addressing Failures in Exascale Computing” held in Park City, Utah, August 4–11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  14. Light addressable photoelectrochemical cyanide sensor

    SciTech Connect

    Licht, S.; Myung, N.; Sun, Y.

    1996-03-15

    A sensor is demonstrated that is capable of spatial discrimination of cyanide with use of only a single stationary sensing element. Different spatial regions of the sensing element are light activated to reveal the solution cyanide concentration only at the point of illumination. In this light addressable photoelectrochemical (LAP) sensor the sensing element consists of an n-CdSe electrode immersed in solution, with the open-circuit potential determined under illumination. In alkaline ferro-ferri-cyanide solution, the open-circuit photopotential is highly responsive to cyanide, with a linear response of (120 mV) log [KCN]. LAP detection of cyanide with a spatial resolution of ±1 mm is demonstrated. The response is almost linear for 0.001-0.100 m cyanide with a resolution of 5 mV. 38 refs., 7 figs., 1 tab.
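
    The reported response implies a simple calibration relation. A minimal sketch assuming the stated 120 mV per decade slope and ignoring the offset term, which the abstract does not give:

      from math import log10

      def photopotential_mV(kcn_molar, slope=120.0):
          """Open-circuit photopotential from the reported near-linear response,
          E = slope * log10([KCN]) (offset omitted)."""
          return slope * log10(kcn_molar)

      def kcn_from_potential(e_mV, slope=120.0):
          """Invert the calibration to recover concentration at the lit spot."""
          return 10 ** (e_mV / slope)

      # 5 mV resolution on a 120 mV/decade response resolves ~10% in concentration
      print(kcn_from_potential(photopotential_mV(0.01) + 5) / 0.01)   # ≈ 1.10

    As the last line illustrates, the stated 5 mV potential resolution corresponds to roughly a 10% relative uncertainty in the recovered cyanide concentration.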

  15. Addressing Failures in Exascale Computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert; Abraham, Jacob; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, J.; Bose, Pradip; Cappello, Franck; Carlson, Bill; Chien, Andrew; Coteus, Paul; DeBardeleben, Nathan; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Fazzari, Saverio; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Rob; Stearley, Jon; Van Hensbergen, Eric

    2014-01-01

    We present here a report produced by a workshop on 'Addressing Failures in Exascale Computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  16. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean…
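
    The construction described above - turning historical variability into probability distributions for growth drivers - can be sketched simply. Assumptions: normally distributed annual growth shocks fitted to an invented history; cross-country correlation and the AEEI dimension are omitted.

      import numpy as np

      def sample_growth_paths(history, n_paths=1000, n_years=50, seed=0):
          """Fit a normal distribution to historical labor-productivity growth
          rates and draw independent future paths for scenario ensembles."""
          rng = np.random.default_rng(seed)
          mu, sigma = history.mean(), history.std(ddof=1)
          shocks = rng.normal(mu, sigma, (n_paths, n_years))
          return np.cumprod(1.0 + shocks, axis=1)      # cumulative growth index

      history = np.array([0.021, 0.015, 0.028, 0.010, 0.019, 0.024])  # illustrative
      paths = sample_growth_paths(history)
      print(np.percentile(paths[:, -1], [5, 50, 95]))  # spread of 50-year outcomes

    Percentiles across the sampled paths then feed emissions scenarios as distributions rather than point projections, which is the basis for the probabilistic description of climate risk the abstract calls for.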

  17. Mars Returned Sample Science: Scientific Planning Related to Sample Quality

    NASA Astrophysics Data System (ADS)

    Beaty, D. W.; Liu, Y.; Borg, L. E.; Herd, C. D. K.; McLennan, S. M.; Allen, C. C.; Bass, D. S.; Farley, K. A.; Mattingly, R. L.

    2014-07-01

    We have evaluated the set of measurements central to addressing the science goals for MSR, and developed a list of the factors that would affect the usefulness of the samples for scientific investigations.

  18. Life and Scientific Work of Peter Guthrie Tait

    NASA Astrophysics Data System (ADS)

    Gilston Knott, Cargill

    2015-04-01

    Preface; 1. Memoir - Peter Guthrie Tait; 2. Experimental work; 3. Mathematical work; 4. Quaternions; 5. Thomson and Tait, 'T and T'', or Thomson and Tait's Natural Philosophy; 6. Other books; 7. Addresses, reviews, and correspondence; 8. Popular scientific articles; Bibliography; Index.

  19. Investments in energy technological change under uncertainty

    NASA Astrophysics Data System (ADS)

    Shittu, Ekundayo

    2009-12-01

    This dissertation addresses the crucial problem of how environmental policy uncertainty influences investments in energy technological change. The rising level of carbon emissions due to increasing global energy consumption calls for a policy shift. In order to stem the negative consequences on the climate, policymakers are concerned with carving an optimal regulation that will encourage technology investments. However, decision makers are facing uncertainties surrounding future environmental policy. The first part considers the treatment of technological change in theoretical models. This part has two purposes: (1) to show--through illustrative examples--that technological change can lead to quite different, and surprising, impacts on the marginal costs of pollution abatement. We demonstrate an intriguing and uncommon result: technological change can increase the marginal costs of pollution abatement over some range of abatement; (2) to show the impact, on policy, of this uncommon observation. We find that under the assumption of technical change that can increase the marginal cost of pollution abatement over some range, the ranking of policy instruments is affected. The second part builds on the first by considering the impact of uncertainty in the carbon tax on investments in a portfolio of technologies. We determine the response of energy R&D investments as the carbon tax increases, both in terms of overall and technology-specific investments. We determine the impact of risk in the carbon tax on the portfolio. We find that the response of the optimal investment in a portfolio of technologies to an increasing carbon tax depends on the relative costs of the programs and the elasticity of substitution between fossil and non-fossil energy inputs. In the third part, we zoom in on the portfolio model above to consider how uncertainty in the magnitude and timing of a carbon tax influences investments. Under a two-stage continuous-time optimal control model, we

  20. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may yield complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  1. Reformulating the Quantum Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may yield complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the “triviality” problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.
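
    A small numerical illustration of the kind of state-independent variance trade-off this abstract describes: for any pure qubit state, the variances of the three Pauli observables sum to exactly 2, so no state can make all three sharp at once. This is a textbook relation used here only for flavor, not the specific bound derived in the paper.

      import numpy as np

      # Pauli observables
      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
      sz = np.array([[1, 0], [0, -1]], dtype=complex)

      def variance(op, psi):
          """Var(O) = <O^2> - <O>^2 for a normalized pure state psi."""
          exp_o = np.real(psi.conj() @ op @ psi)
          exp_o2 = np.real(psi.conj() @ (op @ op) @ psi)
          return exp_o2 - exp_o**2

      # A random normalized pure qubit state
      rng = np.random.default_rng(7)
      psi = rng.normal(size=2) + 1j * rng.normal(size=2)
      psi /= np.linalg.norm(psi)

      total = sum(variance(s, psi) for s in (sx, sy, sz))
      # For any pure qubit state the three variances must sum to exactly 2:
      # no state makes all three Pauli observables sharp simultaneously.
      print(f"Var(sx)+Var(sy)+Var(sz) = {total:.6f}")   # -> 2.000000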

  2. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. PMID:21489684

  3. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  4. WELCOME ADDRESS: Welcome Address for the 60th Yamada Conference

    NASA Astrophysics Data System (ADS)

    Fukuyama, Hidetoshi

    2006-12-01

    discussions in the conference rooms but also through pleasant chatting on the lobby floor or even at the banquet table, which may give rise to other important international collaborative projects in the near future. Finally, we would like to express our sincere thanks to Professor Mitsuhiro Motokawa and the members of the organizing committee who have made every effort to bring in such a successful performance of the Conference. I hope all of you would enjoy the Conference and relax sometime staying in this interesting scientific city of Sendai. Thank you. Hidetoshi Fukuyama On behalf of Director General, Professor Yasusada Yamada Yamada Science Foundation

  5. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  6. Sensitivity and uncertainty analysis for the annual P loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  7. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  8. Estimating the magnitude of prediction uncertainties for field-scale P loss models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, an uncertainty analysis for the Annual P Loss Estima...

  9. Sensitivity and uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  10. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  11. Treatment of uncertainties associated with PRAs in risk-informed decision making (NUREG-1855).

    SciTech Connect

    Wheeler, Timothy A.

    2010-06-01

    This document provides guidance on how to treat uncertainties associated with probabilistic risk assessment (PRA) in risk-informed decisionmaking. The objectives of this guidance include fostering an understanding of the uncertainties associated with PRA and their impact on the results of PRA and providing a pragmatic approach to addressing these uncertainties in the context of the decisionmaking. In implementing risk-informed decisionmaking, the U.S. Nuclear Regulatory Commission expects that appropriate consideration of uncertainty will be given in the analyses used to support the decision and in the interpretation of the findings of those analyses. To meet the objective of this document, it is necessary to understand the role that PRA results play in the context of the decision process. To define this context, this document provides an overview of the risk-informed decisionmaking process itself. With the context defined, this document describes the characteristics of a risk model and, in particular, a PRA. This description includes recognition that a PRA, being a probabilistic model, characterizes aleatory uncertainty that results from randomness associated with the events of the model. Because the focus of this document is epistemic uncertainty (i.e., uncertainties in the formulation of the PRA model), it provides guidance on identifying and describing the different types of sources of epistemic uncertainty and the different ways that they are treated. The different types of epistemic uncertainty are parameter, model, and completeness uncertainties. The final part of the guidance addresses the uncertainty in PRA results in the context of risk-informed decisionmaking and, in particular, the interpretation of the results of the uncertainty analysis when comparing PRA results with the acceptance criteria established for a specified application. In addition, guidance is provided for addressing completeness uncertainty in risk-informed decision making. Such

  12. Scientific Opinion on Risk Assessment of Synthetic Biology.

    PubMed

    Epstein, Michelle M; Vermeire, Theo

    2016-08-01

    In 2013, three Scientific Committees of the European Commission (EC) drafted Scientific Opinions on synthetic biology that provide an operational definition and address risk assessment methodology, safety aspects, environmental risks, knowledge gaps, and research priorities. These Opinions contribute to the international discussions on the risk governance for synthetic biology developments. PMID:27234301

  13. Analogy and Intersubjectivity: Political Oratory, Scholarly Argument and Scientific Reports.

    ERIC Educational Resources Information Center

    Gross, Alan G.

    1983-01-01

    Focuses on the different ways political oratory, scholarly argument, and scientific reports use analogy. Specifically, analyzes intersubjective agreement in Franklin D. Roosevelt's First Inaugural address, the scholarly argument between Sir Karl Popper and Thomas S. Kuhn, and the scientific reports of various mathematicians and scientists. (PD)

  14. MeshTV: scientific visualization and graphical analysis software

    SciTech Connect

    Brugger, E S; Roberts, L; Wookey, S G

    1999-02-08

    The increasing data complexity engendered by the Accelerated Strategic Computing Initiative (ASCI) requires more capability in our scientific visualization software. B Division at Lawrence Livermore National Laboratory (LLNL) addresses these new and changing requirements with MeshTV. We began work on MeshTV around eight years ago, and have progressively refined the software to provide improved scientific analysis and visualization to well over 100 users at Livermore, Los Alamos, Sandia, and in private industry. (U)

  15. Flexible Scientific Workflow Modeling Using Frames, Templates, and Dynamic Embedding

    SciTech Connect

    Ngu, Anne Hee Hiong; Bowers, Shawn; Haasch, Nicholas; McPhillips, Timothy; Critchlow, Terence J.

    2008-07-30

    While most scientific workflow systems are based on dataflow, some amount of control-flow modeling is often necessary for engineering fault-tolerant, robust, and adaptive workflows. However, control-flow modeling within dataflow often results in workflow specifications that are hard to comprehend, reuse, and maintain. We describe new modeling constructs to address these issues that provide a structured approach for modeling control-flow within scientific workflows, and discuss their implementation within the Kepler scientific workflow system.
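
    Not the Kepler constructs themselves, but a rough plain-Python analogy of why structured control-flow wrappers around dataflow steps help: a generic retry "frame" encapsulates the fault-tolerance logic so the dataflow step itself stays simple. All names and the structure below are invented for illustration.

      import random

      random.seed(0)  # make the flaky step deterministic for this demo

      def with_retry(step, max_attempts=3):
          """Wrap a dataflow step in a control-flow 'frame' that retries on failure."""
          def framed(*inputs):
              for attempt in range(1, max_attempts + 1):
                  try:
                      return step(*inputs)
                  except RuntimeError as err:
                      print(f"attempt {attempt} failed: {err}")
              raise RuntimeError(f"step {step.__name__} failed {max_attempts} times")
          return framed

      def flaky_transform(x):
          # Stand-in for a remote service or cluster job that sometimes fails
          if random.random() < 0.5:
              raise RuntimeError("transient failure")
          return 2 * x

      pipeline = with_retry(flaky_transform)
      print(pipeline(21))   # the retry control flow stays hidden inside the frame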

  16. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year, around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, to understand the extent of the propagation of uncertainties throughout the process, from inundation studies through to risk analysis, and how much a proper flood risk analysis can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments, and Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation), and socio-demographic data (damage estimation), and to evaluate the uncertainty propagation (UP) considered in design flood risk estimation, in both numerical and cartographic expression. In order to consider the total uncertainty and understand which factors contribute most to the final uncertainty, we used the method of Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows for the development of a probabilistic model of the system in a deterministic setting. This is done by using random variables and polynomials to handle the effects of uncertainty. Results of applying these methods show better robustness than traditional analysis
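
    A minimal Monte Carlo sketch of uncertainty propagation through a flood risk chain of the kind discussed above, with toy stand-ins for the rainfall, hydrologic, hydraulic, and damage stages; the distributions and coefficients are invented for illustration and bear no relation to the study area.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Each stage of the chain carries its own (epistemic) uncertainty:
      rainfall     = rng.lognormal(np.log(120.0), 0.15, n)       # design storm, mm
      runoff_coeff = rng.uniform(0.4, 0.7, n)                    # hydrologic model
      peak_flow    = 2.5 * runoff_coeff * rainfall               # toy routing, m3/s
      depth        = 0.02 * peak_flow                            # toy rating curve, m
      damage       = 1e5 * np.clip(depth - 1.0, 0.0, None)**1.5  # toy damage model

      print("mean damage: %.0f" % damage.mean())
      print("5-95%% interval:", np.percentile(damage, [5, 95]).round(0))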

  17. Sources of Uncertainty in Climate Change Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Gutmann, Ethan; Clark, Martyn; Eidhammer, Trude; Ikeda, Kyoko; Deser, Clara; Brekke, Levi; Arnold, Jeffrey; Rasmussen, Roy

    2016-04-01

    Predicting the likely changes in precipitation due to anthropogenic climate influences is one of the most important problems in earth science today. This problem is complicated by the enormous uncertainty in current predictions. Until all such sources of uncertainty are adequately addressed and quantified, we cannot know which changes may be predictable and which are masked by the internal variability of the climate system itself. Here we assess multiple sources of uncertainty including those due to internal variability, climate model selection, emissions scenario, regional climate model physics, and statistical downscaling methods. This work focuses on the Colorado Rocky Mountains because these mountains serve as the water towers for much of the western United States, but the results are more broadly applicable, and results will be presented covering the Columbia River Basin and the California Sierra Nevadas as well. Internal variability is assessed using 30 members of the CESM Large Ensemble. Uncertainty due to the choice of climate models is assessed using 100 climate projections from the CMIP5 archive, including multiple emissions scenarios. Uncertainty due to regional climate model physics is assessed using a limited set of high-resolution Weather Research and Forecasting (WRF) model simulations in comparison to a larger multi-physics ensemble using the Intermediate Complexity Atmospheric Research (ICAR) model. Finally, statistical downscaling uncertainty is assessed using multiple statistical downscaling models. In near-term projections (25-35 years) internal variability is the largest source of uncertainty; however, over longer time scales (70-80 years) other sources of uncertainty become more important, with the importance of different sources of uncertainty varying depending on the metric assessed.
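
    The sketch below illustrates the bookkeeping behind such an assessment: given spreads attributed to internal variability, model/scenario choice, and downscaling method, it reports each source's share of the total variance. The ensemble sizes mirror those named above, but the numbers themselves are synthetic, and treating the sources as independent and additive is a simplifying assumption.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic end-of-century precipitation changes (%) for one grid cell,
      # with spread generated separately by each source of uncertainty:
      internal  = rng.normal(0, 4, size=30)    # 30 initial-condition members
      models    = rng.normal(2, 6, size=100)   # 100 CMIP5-style projections
      downscale = rng.normal(0, 2, size=5)     # 5 statistical downscaling methods

      components = {
          "internal variability": internal.var(ddof=1),
          "model + scenario":     models.var(ddof=1),
          "downscaling":          downscale.var(ddof=1),
      }
      total = sum(components.values())  # assumes independent, additive sources
      for name, v in components.items():
          print(f"{name:22s} {100 * v / total:5.1f}% of total variance")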

  18. Quantifying Uncertainty in Velocity Models using Bayesian Methods

    NASA Astrophysics Data System (ADS)

    Hobbs, R.; Caiado, C.; Majdański, M.

    2008-12-01

    Quantifying uncertainty in models derived from observed data is a major issue. Public and political understanding of uncertainty is poor, and for industry inadequate assessment of risk costs money. In this talk we will examine the geological structure of the subsurface; however, our principal exploration tool, controlled-source seismology, yields its data in time. Inversion tools exist to map these data into a depth model, but a full exploration of the uncertainty of the model is rarely done because robust strategies do not exist for large non-linear complex systems. There are two principal sources of uncertainty: the first comes from the input data, which is noisy and bandlimited; the second, and more sinister, is from the model parameterisation and forward algorithms themselves, which approximate the physics to make the problem tractable. To address these issues we propose a Bayesian approach. One philosophy is to estimate the uncertainty in a possible model derived using standard inversion tools. During the inversion stage we can use our geological prejudice to derive an acceptable model. Then we use a local random walk using the Metropolis-Hastings algorithm to explore the model space immediately around a possible solution. For models with a limited number of parameters we can use the forward modeling step from the inversion code. However, as the number of parameters increases and/or the cost of the forward modeling step becomes significant, we need to use fast emulators to act as proxies so a sufficient number of iterations can be performed on which to base our statistical measures of uncertainty. In this presentation we show examples of uncertainty estimation using both pre- and post-critical seismic data. In particular, we will demonstrate uncertainty introduced by the approximation of the physics by using a tomographic inversion of bandlimited data and show that uncertainty increases as the central frequency of the data decreases. This is consistent with the
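
    A minimal sketch of the local random-walk strategy described above: a toy two-layer travel-time forward model stands in for the (much more expensive) seismic forward step, and a Metropolis-Hastings walk started at an assumed inversion result maps out the posterior spread. The model, noise level, and step size are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(3)

      # Toy forward model: travel times through two layers with fixed thicknesses
      # and unknown velocities v1, v2 (km/s), for two ray paths.
      def forward(v):
          v1, v2 = v
          return np.array([2.0 / v1, 2.0 / v1 + 3.0 / v2])

      v_true = np.array([2.5, 4.0])
      sigma_obs = 0.02
      t_obs = forward(v_true) + rng.normal(0, sigma_obs, size=2)

      def log_post(v):
          if np.any(v <= 0):
              return -np.inf          # flat positivity prior
          r = t_obs - forward(v)
          return -0.5 * np.sum(r**2) / sigma_obs**2

      # Random-walk Metropolis-Hastings started at a deterministic inversion result
      v = np.array([2.4, 4.2])        # assumed output of a standard inversion
      lp = log_post(v)
      samples = []
      for _ in range(20000):
          prop = v + rng.normal(0, 0.05, size=2)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              v, lp = prop, lp_prop
          samples.append(v.copy())

      samples = np.array(samples[5000:])   # drop burn-in
      print("posterior mean:", samples.mean(axis=0))
      print("posterior std :", samples.std(axis=0))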

  19. Scientific Ability and Creativity

    ERIC Educational Resources Information Center

    Heller, Kurt A.

    2007-01-01

    Following an introductory definition of "scientific ability and creativity", product-oriented, personality and social psychological approaches to studying scientific ability are examined with reference to competence and performance. Studies in the psychometric versus cognitive psychological paradigms are dealt with in more detail. These two…

  20. Scientific rigor through videogames.

    PubMed

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. PMID:25300714

  1. Age and Scientific Performance.

    ERIC Educational Resources Information Center

    Cole, Stephen

    1979-01-01

    The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)

  2. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24-h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
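
    A sketch, in the spirit of the parameterization described above, of how a counting-statistics component and an empirically determined lognormal normalization component might be combined into a single log-scale uncertainty for one measurement. The 0.33 value sits in the 0.31-0.35 range quoted above; the quadrature combination and the count numbers are illustrative assumptions, not the paper's procedure.

      import numpy as np

      def ln_gsd_total(counts):
          """Combined log-scale uncertainty for a single urine measurement."""
          rel_count_err = 1.0 / np.sqrt(counts)   # Poisson counting component
          ln_gsd_norm = 0.33                      # empirical normalization term
          # For a lognormal with coefficient of variation c, the log-variance is
          # ln(1 + c^2); the two components are combined in quadrature here.
          return np.sqrt(np.log1p(rel_count_err**2) + ln_gsd_norm**2)

      for counts in (25, 100, 1000):
          gsd = np.exp(ln_gsd_total(counts))
          print(f"{counts:5d} counts -> GSD = {gsd:.3f}")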

  3. Addressing neurological disorders with neuromodulation.

    PubMed

    Oluigbo, Chima O; Rezai, Ali R

    2011-07-01

    Neurological disorders are becoming increasingly common in developed countries as a result of the aging population. In spite of medications, these disorders can result in progressive loss of function as well as chronic physical, cognitive, and emotional disability that ultimately places an enormous emotional and economic burden on the patient, caretakers, and society in general. Neuromodulation is emerging as a therapeutic option in these patients. Neuromodulation is a field involving implantable devices that allow the reversible, adjustable application of electrical, chemical, or biological agents to the central or peripheral nervous system with the objective of altering its functioning to achieve a therapeutic or clinically beneficial effect. It is a rapidly evolving field that brings together many different specialties in medicine, materials science, computer science and technology, biomedical and neural engineering, and the surgical and interventional specialties. It has multiple current and emerging indications, and an enormous potential for growth. The main challenge before it is the need for effective collaboration between engineers, basic scientists, and clinicians to develop innovations that address specific problems, resulting in new devices and clinical applications. PMID:21193369

  4. Gender: addressing a critical focus.

    PubMed

    Thornton, L; Wegner, M N

    1995-01-01

    The definition of gender was addressed at the Fourth World Conference on Women (Beijing, China). After extensive debate, the definition developed by the UN Population Fund in 1995 was adopted: "a set of qualities and behaviors expected from a female or male by society." The sustainability of family planning (FP) programs depends on acknowledgment of the role gender plays in contraceptive decision-making and use. For example, programs must consider the fact that women in many cultures do not make FP decisions without the consent of their spouse. AVSC is examining providers' gender-based ideas about clients and the effects of these views on the quality of reproductive health services. Questions such as how service providers can encourage joint responsibility for contraception without requiring spousal consent or how they can make men feel comfortable about using a male method in a society where FP is considered a woman's issue are being discussed. Also relevant is how service providers can discuss sexual matters openly with female clients in cultures that do not allow women to enjoy their sexuality. Another concern is the potential for physical violence to a client as a result of the provision of FP services. PMID:12294397

  5. UncertWeb: chaining web services accounting for uncertainty

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Jones, Richard; Bastin, Lucy; Williams, Matthew; Pebesma, Edzer; Nativi, Stefano

    2010-05-01

    The development of interoperable services that permit access to data and processes, typically using web service based standards opens up the possibility for increasingly complex chains of data and processes, which might be discovered and composed in increasingly automatic ways. This concept, sometimes referred to as the "Model Web", offers the promise of integrated (Earth) system models, with pluggable web service based components which can be discovered, composed and evaluated dynamically. A significant issue with such service chains, indeed in any composite model composed of coupled components, is that in all interesting (non-linear) cases the effect of uncertainties on inputs, or components within the chain will have complex, potentially unexpected effects on the outputs. Within the FP7 UncertWeb project we will be developing a mechanism and an accompanying set of tools to enable rigorous uncertainty management in web based service chains involving both data and processes. The project will exploit and extend the UncertML candidate standard to flexibly propagate uncertainty through service chains, including looking at mechanisms to develop uncertainty enabled profiles of existing Open Geospatial Consortium services. To facilitate the use of such services we will develop tools to address the definition of the input uncertainties (elicitation), manage the uncertainty propagation (emulation), undertake uncertainty and sensitivity analysis and visualise the output uncertainty. In this talk we will outline the challenges of the UncertWeb project, illustrating this with a prototype service chain we have created for correcting station level pressure to sea-level pressure, which accounts for the various uncertainties involved. In particular we will discuss some of the challenges of chaining Open Geospatial Consortium services using the Business Process Execution Language. We will also address the issue of computational cost and communication bandwidth requirements for
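
    A toy version of the prototype chain mentioned above (station pressure corrected to sea-level pressure), propagating input uncertainty by simple Monte Carlo sampling rather than through actual OGC services. The isothermal barometric reduction and all of the distributions below are simplifying assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 50_000

      # Inputs to the chain, each elicited as a distribution rather than a number:
      p_station = rng.normal(950.0, 0.5, size=n)   # hPa, sensor uncertainty
      height    = rng.normal(550.0, 5.0, size=n)   # m, station elevation uncertainty
      temp      = rng.normal(283.0, 2.0, size=n)   # K, mean column temperature

      # Simplified barometric reduction to sea level (isothermal column assumption)
      g, R = 9.80665, 287.05
      p_sea = p_station * np.exp(g * height / (R * temp))

      print("sea-level pressure: %.1f +/- %.1f hPa" % (p_sea.mean(), p_sea.std()))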

  6. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    SciTech Connect

    Phipps, Eric T.; Eldred, Michael S.; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow for heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  7. The effects of uncertainty on the analysis of atmospheric deposition

    SciTech Connect

    Bloyd, C.N. ); Small, M.J.; Henrion, M.; Rubin, E.S. )

    1988-01-01

    Research efforts on the problem of acid rain are directed at improving current scientific understanding in critical areas, including sources of precursor emissions, the transport and transformation of pollutants in the atmosphere, the deposition of acidic species, and the chemical and biological effects of acid deposition on aquatic systems, materials, forests, crops and human health. The general goal of these research efforts is to characterize the current situation and to develop analytical models which can be used to predict the response of various systems to changes in critical parameters. This paper describes a framework which enables one to characterize uncertainty at each major stage of the modeling process. Following a general presentation of the modeling framework, a description is given of the methods chosen to characterize uncertainty for each major step. Analysis is then performed to illustrate the effects of uncertainty on future lake acidification in the Adirondacks Park area of upstate New York.

  8. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: A computer code in terms of its logic, numerics, and fluid dynamics and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled by verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties
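
    A minimal sketch of the sensitivity-uncertainty procedure defined above: central-difference sensitivities of a computed output to its inputs, combined with assumed input uncertainties by first-order propagation. The toy drag function stands in for a verified CFD code; all coefficients and uncertainty values are invented.

      import numpy as np

      # Toy "computed result": drag coefficient as a function of two inputs.
      def drag(mach, alpha):
          return 0.02 + 0.001 * mach**2 + 0.004 * alpha + 0.0002 * mach * alpha

      x0 = {"mach": 0.8, "alpha": 2.0}    # nominal operating point
      u  = {"mach": 0.01, "alpha": 0.1}   # assumed one-sigma input uncertainties

      # Central-difference sensitivities d(drag)/dx_i at the nominal point
      sens, var = {}, 0.0
      for name in x0:
          hi = dict(x0); hi[name] += 1e-4
          lo = dict(x0); lo[name] -= 1e-4
          s = (drag(**hi) - drag(**lo)) / 2e-4
          sens[name] = s
          var += (s * u[name]) ** 2        # first-order uncertainty propagation

      print("sensitivities:", {k: round(v, 5) for k, v in sens.items()})
      print("drag uncertainty (1-sigma): %.6f" % np.sqrt(var))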

  9. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    SciTech Connect

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculations by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis, including the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across the different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges for unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominant contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
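
    A stripped-down illustration of how the time value of money enters an FCC calculation: each cost component is discounted from its lead or lag time relative to irradiation, then levelized by the energy produced. The component costs, timings, and energy yield below are placeholders, not the paper's data.

      # (component, unit cost $/kgU, payment time in years relative to irradiation)
      steps = [
          ("uranium",          250.0, -2.0),
          ("conversion",        10.0, -1.5),
          ("enrichment",       120.0, -1.0),
          ("fabrication",      300.0, -0.5),
          ("waste management", 400.0,  5.0),
      ]
      energy_per_kg = 360.0  # MWh(e) per kgU; notional burnup and efficiency

      def fcc(rate):
          """Levelized fuel cycle cost in $/MWh at a given discount rate."""
          # Costs paid early accrue carrying charges; late costs are discounted.
          present = sum(cost * (1.0 + rate) ** (-t) for _, cost, t in steps)
          return present / energy_per_kg

      for r in (0.00, 0.05, 0.10):
          print(f"discount rate {r:.0%}: FCC = {fcc(r):.2f} $/MWh")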

  10. Communicating Uncertainties for Microwave-Based ESDRs

    NASA Astrophysics Data System (ADS)

    Wentz, F. J.; Mears, C. A.; Smith, D. K.

    2011-12-01

    Currently, as part of NASA's MEaSUREs program, there is a 25-year archive of consistently processed and carefully inter-calibrated Earth Science Data Records (ESDR) consisting of geophysical products derived from satellite microwave radiometers. These products include ocean surface temperature and wind speed, total atmospheric water vapor and cloud water, surface rain rate, and deep-layer averages of atmospheric temperature. The product retrievals are based on a radiative transfer model (RTM) for the surface and intervening atmosphere. Thus, the accuracy of the retrieved products depends on the accuracy of the RTM, the accuracy of the measured brightness temperatures that serve as inputs to the retrieval algorithm, and the accuracy of any ancillary data used to adjust for unmeasured geophysical conditions. In addition, for gridded products that are averages over time or space, sampling error can become important. It is important not only to calculate the uncertainties associated with the ESDRs but also to effectively communicate these uncertainties to the Users in a way that is helpful for their particular set of applications. This is a challenging task that will require a multi-faceted approach consisting of (1) error bars assigned to each retrieval, (2) detailed interactive validation reports, and (3) peer-reviewed scientific papers on long-term trends. All of this information needs to be linked to the ESDRs in a manner that facilitates integration into the User's applications. Our talk will discuss the progress we are making in implementing these approaches.

  11. Complexity and Uncertainty in Soil Nitrogen Modeling

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Gu, C.

    2009-12-01

    Model uncertainty is rarely considered in the field of biogeochemical modeling. The standard biogeochemical modeling approach is to proceed based on one selected model with the “right” complexity level given data availability. However, other plausible models can give dissimilar answers to the scientific question at hand using the same set of data. Relying on a single model can lead to underestimation of the uncertainty associated with the results and therefore to unreliable conclusions. A multi-model ensemble strategy is a means to exploit the diversity of skillful predictions from different models with multiple levels of complexity. The aim of this study is twofold: first, to explore the impact of a model’s complexity level on the accuracy of the end results; and second, to introduce a probabilistic multi-model strategy in the context of a process-based biogeochemical model. We developed three different versions of a biogeochemical model, TOUGHREACT-N, with various complexity levels. Each of these models was calibrated against observed data from a tomato field in Western Sacramento County, California, and considered two different weighting sets on the objective function. In this way we created a set of six ensemble members. The Bayesian Model Averaging (BMA) approach was then used to combine these ensemble members by the likelihood that an individual model is correct given the observations. The results clearly indicate the need to consider a multi-model ensemble strategy over single model selection in biogeochemical modeling.
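
    A minimal Bayesian Model Averaging sketch of the kind of ensemble combination proposed above: each member's weight is proportional to its likelihood given the observations (equal priors assumed), and the combined prediction is the weighted average of the ensemble. The observations, predictions, and error level are made-up stand-ins for the calibrated model variants.

      import numpy as np

      # Observations and predictions from three hypothetical model variants
      y_obs = np.array([5.1, 4.8, 6.2, 5.5])
      preds = np.array([
          [5.0, 4.9, 6.0, 5.6],   # simple model, weighting A
          [5.3, 4.5, 6.5, 5.2],   # simple model, weighting B
          [5.1, 4.7, 6.1, 5.4],   # full model, weighting A
      ])
      sigma = 0.3                 # assumed observation error (1-sigma)

      # BMA weight is proportional to prior * likelihood; equal priors here
      log_lik = -0.5 * np.sum((preds - y_obs) ** 2, axis=1) / sigma**2
      w = np.exp(log_lik - log_lik.max())
      w /= w.sum()

      print("posterior model weights:", np.round(w, 3))
      print("BMA prediction:", np.round(w @ preds, 2))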

  12. Managing uncertainty in family practice.

    PubMed Central

    Biehn, J.

    1982-01-01

    Because patients present in the early stages of undifferentiated problems, the family physician often faces uncertainty, especially in diagnosis and management. The physician's uncertainty may be unacceptable to the patient and may lead to inappropriate use of diagnostic procedures. The problem is intensified by the physician's hospital training, which emphasizes mastery of available knowledge and decision-making based on certainty. Strategies by which a physician may manage uncertainty include (a) a more open doctor-patient relationship, (b) understanding the patient's reason for attending the office, (c) a thorough assessment of the problem, (d) a commitment to reassessment and (e) appropriate consultation. PMID:7074488

  13. Quantum Cryptography Without Quantum Uncertainties

    NASA Astrophysics Data System (ADS)

    Durt, Thomas

    2002-06-01

    Quantum cryptography aims at transmitting a random key in such a way that the presence of a spy eavesdropping on the communication would be revealed by disturbances in the transmission of the message. In standard quantum cryptography, this unavoidable disturbance is a consequence of the uncertainty principle of Heisenberg. We propose in this paper to replace quantum uncertainties by generalised, technological uncertainties, and discuss the realisability of such an idea. The proposed protocol can be considered as a simplification, but also as a generalisation, of the standard quantum cryptographic protocols.

  14. Assessing uncertainty in physical constants

    NASA Astrophysics Data System (ADS)

    Henrion, Max; Fischhoff, Baruch

    1986-09-01

    Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.
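
    A sketch of the kind of check the abstract describes: compare historical reported values and uncertainties for a constant against the currently recommended value, and count how often the true error exceeds the reported one. The numbers below are invented placeholders, not actual historical determinations.

      import numpy as np

      # Hypothetical historical determinations: (reported value, reported sigma)
      history = np.array([
          (6.6680, 0.0050),
          (6.6700, 0.0040),
          (6.6720, 0.0040),
          (6.6726, 0.0008),
      ])
      current = 6.6743   # stand-in for today's recommended value

      values, sigmas = history[:, 0], history[:, 1]
      z = np.abs(values - current) / sigmas

      print("normalized errors |x - x_now| / sigma:", z.round(2))
      # Well-calibrated uncertainties would put roughly 95% of |z| below 2;
      # a persistent excess above 2 signals overconfident error bars.
      print("fraction with |z| > 2:", np.mean(z > 2))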

  15. Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'

    NASA Astrophysics Data System (ADS)

    Casola, J. H.; Huber, D.

    2013-12-01

    Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision

  16. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

    In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop. The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix
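
    For flavor, a toy numerical check related to the M-delta setup, though not the minimal-realization procedure itself: once M(s) is formed, a conservative robust stability test is the unstructured small-gain condition max over w of sigma_max(M(jw)) * ||Delta|| < 1 (the structured singular value refines this for block-diagonal Delta). The transfer matrix and uncertainty bound below are invented.

      import numpy as np

      # A fixed 2x2 nominal closed-loop transfer matrix evaluated on a grid.
      def M(s):
          return np.array([[1.0 / (s + 2.0), 0.5 / (s + 1.0)],
                           [0.2 / (s + 3.0), 1.0 / (s + 4.0)]])

      freqs = np.logspace(-2, 2, 400)
      peak = max(np.linalg.svd(M(1j * w), compute_uv=False)[0] for w in freqs)

      delta_bound = 0.8   # assumed norm bound on the uncertainty block
      print(f"peak singular value of M: {peak:.3f}")
      print("small-gain condition satisfied:", peak * delta_bound < 1.0)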

  17. The representation of uncertainty in medical expert systems.

    PubMed

    Hughes, C

    1989-01-01

    The development of the rule-based expert system has provided important new techniques for the representation of knowledge. However, continued use of this representational scheme has highlighted some of its deficiencies. In particular, many within scientific and non-scientific fields attempting to use the rule-base design to describe natural phenomena often find it difficult to represent the complexities of the world as 'absolute' rules. For this reason, many investigators acknowledge the need to add an uncertainty mechanism to the rule-base construct. Such a facility would allow the quantification of accuracy or strength of association within individual rules. Although agreement exists on the need for an uncertainty representation facility, the debate concerning the most appropriate methodology is far from resolved. The purpose of this paper is to provide a review and commentary on the current state of debate over the five most popular candidate uncertainty models: symbolic representation, MYCIN certainty factors, Bayesian, Dempster-Shafer and fuzzy set logic. The advantages and disadvantages of each uncertainty calculus will be presented and assessed with respect to their applicability to the medical expert systems domain. PMID:2695713
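
    Of the five candidate calculi reviewed, MYCIN certainty factors are the simplest to show concretely. The sketch below implements the standard parallel-combination rule for multiple pieces of evidence bearing on one hypothesis; the particular CF values are arbitrary.

      def combine_cf(cf1, cf2):
          """MYCIN-style combination of two certainty factors in [-1, 1]."""
          if cf1 >= 0 and cf2 >= 0:
              return cf1 + cf2 * (1 - cf1)
          if cf1 < 0 and cf2 < 0:
              return cf1 + cf2 * (1 + cf1)
          return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

      # Two rules support the same hypothesis, a third weakly contradicts it:
      cf = combine_cf(0.6, 0.4)      # -> 0.76: converging evidence reinforces
      cf = combine_cf(cf, -0.2)      # mixed evidence pulls the estimate back
      print(f"combined certainty factor: {cf:.3f}")   # -> 0.700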

  18. Preliminary assessment of the impact of conceptual model uncertainty on site performance

    SciTech Connect

    Gallegos, D.P.; Pohl, P.I.; Olague, N.E.; Knowlton, R.G.; Updegraff, C.D.

    1990-10-01

    The US Department of Energy is responsible for the design, construction, operation, and decommissioning of a site for the deep geologic disposal of high-level radioactive waste (HLW). This involves site characterization and the use of performance assessment to demonstrate compliance with regulations for HLW disposal from the US Environmental Protection Agency (EPA) and the US Nuclear Regulatory Commission. The EPA standard states that a performance assessment should consider the associated uncertainties involved in estimating cumulative release of radionuclides to the accessible environment. To date, the majority of the efforts in uncertainty analysis have been directed toward data and parameter uncertainty, whereas little effort has been made to treat model uncertainty. Model uncertainty includes conceptual model uncertainty, mathematical model uncertainty, and any uncertainties derived from implementing the mathematical model in a computer code. Currently there is no systematic approach that is designed to address the uncertainty in conceptual models. The purpose of this investigation is to take a first step at addressing conceptual model uncertainty. This will be accomplished by assessing the relative impact of alternative conceptual models on the integrated release of radionuclides to the accessible environment for an HLW repository site located in unsaturated, fractured tuff. 4 refs., 2 figs.

  19. Relational grounding facilitates development of scientifically useful multiscale models

    PubMed Central

    2011-01-01

    We review grounding issues that influence the scientific usefulness of any biomedical multiscale model (MSM). Groundings are the collection of units, dimensions, and/or objects to which a variable or model constituent refers. To date, models that primarily use continuous mathematics rely heavily on absolute grounding, whereas those that primarily use discrete software paradigms (e.g., object-oriented, agent-based, actor) typically employ relational grounding. We review grounding issues and identify strategies to address them. We maintain that grounding issues should be addressed at the start of any MSM project and should be reevaluated throughout the model development process. We make the following points. Grounding decisions influence model flexibility, adaptability, and thus reusability. Grounding choices should be influenced by measures, uncertainty, system information, and the nature of available validation data. Absolute grounding complicates the process of combining models to form larger models unless all are grounded absolutely. Relational grounding facilitates referent knowledge embodiment within computational mechanisms but requires separate model-to-referent mappings. Absolute grounding can simplify integration by forcing common units and, hence, a common integration target, but context change may require model reengineering. Relational grounding enables synthesis of large, composite (multi-module) models that can be robust to context changes. Because biological components have varying degrees of autonomy, corresponding components in MSMs need to do the same. Relational grounding facilitates achieving such autonomy. Biomimetic analogues designed to facilitate translational research and development must have long lifecycles. Exploring mechanisms of normal-to-disease transition requires model components that are grounded relationally. Multi-paradigm modeling requires both hyperspatial and relational grounding. PMID:21951817

  20. OPENING ADDRESS: Heterostructures in Semiconductors

    NASA Astrophysics Data System (ADS)

    Grimmeiss, Hermann G.

    1996-01-01

    Good morning, Gentlemen! On behalf of the Nobel Foundation, I should like to welcome you to the Nobel Symposium on "Heterostructures in Semiconductors". It gives me great pleasure to see so many colleagues and old friends from all over the world in the audience and, in particular, to bid welcome to our Nobel laureates, Prof. Esaki and Prof. von Klitzing. In front of a different audience I would now commend the scientific and technological importance of heterostructures in semiconductors and emphatically emphasise that heterostructures, as an important contribution to microelectronics and, hence, information technology, have changed societies all over the world. I would also mention that information technology is one of the most important global key industries which covers a wide field of important areas each of which bears its own character. Ever since the invention of the transistor, we have witnessed a fantastic growth in semiconductor technology, leading to more complex functions and higher densities of devices. This development would hardly be possible without an increasing understanding of semiconductor materials and new concepts in material growth techniques which allow the fabrication of previously unknown semiconductor structures. But here and today I will not do it because it would mean to carry coals to Newcastle. I will therefore not remind you that heterostructures were already suggested and discussed in detail a long time before proper technologies were available for the fabrication of such structures. Now, heterostructures are a foundation in science and part of our everyday life. Though this is certainly true, it is nevertheless fair to say that not all properties of heterostructures are yet understood and that further technologies have to be developed before a still better understanding is obtained. The organisers therefore hope that this symposium will contribute not only to improving our understanding of heterostructures but also to opening new

  1. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, whether it is the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for describing the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wise to get an approximate solution to an

  2. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  3. Visualizing uncertainty about the future.

    PubMed

    Spiegelhalter, David; Pearson, Mike; Short, Ian

    2011-09-01

    We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge, or from essential indeterminacy about the future, remains a challenge. PMID:21903802

  4. Administrative automation in a scientific environment

    NASA Technical Reports Server (NTRS)

    Jarrett, J. R.

    1984-01-01

    Although the scientific personnel at GSFC were advanced in the development and use of hardware and software for scientific applications, resistance to the use of automation, or to the purchase of terminals, software, and services specifically for administrative functions, was widespread. The approach used to address problems and constraints, and plans for administrative automation within the Space and Earth Sciences Directorate, are delineated. Accomplishments thus far include reduction of paperwork and manual effort; improved communications through telemail and committees; additional support staff; increased awareness at all levels of ergonomic concerns and the need for training; better equipment; improved ADP skills through experience; management commitment; and an overall strategy for automating.

  5. An address geocoding solution for Chinese cities

    NASA Astrophysics Data System (ADS)

    Zhang, Xuehu; Ma, Haoming; Li, Qi

    2006-10-01

    We introduce the challenges of address geocoding for Chinese cities and present a potential solution, along with a prototype system, that deals with these challenges by combining and extending current geocoding solutions developed for the United States and Japan. The proposed solution starts by separating city addresses into "standard" addresses, which meet a predefined address model, and non-standard ones. The standard addresses are stored in a structured relational database in their normalized forms, while a selected portion of the non-standard addresses are stored as aliases to the standard addresses. An in-memory address index is then constructed from the address database and serves as the basis for real-time address matching. Test results were obtained from two trials conducted in the city of Beijing; on average, an 80% matching rate was achieved. Possible improvements to the current design are also discussed.
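
    A minimal sketch of the in-memory alias index described above, assuming a toy normalization rule; the class name, fields, and sample entries are hypothetical and not the authors' implementation (Python):

        # In-memory address index with alias support (simplified sketch).
        class AddressIndex:
            def __init__(self):
                self.index = {}    # normalized standard address -> (lon, lat)
                self.aliases = {}  # normalized alias -> standard address key

            @staticmethod
            def normalize(address):
                # A real system applies a full address model; stripping
                # whitespace here is only a placeholder for normalization.
                return "".join(address.split())

            def add(self, address, coord, aliases=()):
                key = self.normalize(address)
                self.index[key] = coord
                for alias in aliases:
                    self.aliases[self.normalize(alias)] = key

            def match(self, query):
                key = self.normalize(query)
                key = self.aliases.get(key, key)  # map alias to standard form
                return self.index.get(key)        # None if no match

        idx = AddressIndex()
        idx.add("北京市海淀区中关村大街1号", (116.32, 39.98),
                aliases=["中关村大街一号"])
        print(idx.match("中关村大街一号"))  # (116.32, 39.98)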

  6. Uncertainty-induced quantum nonlocality

    NASA Astrophysics Data System (ADS)

    Wu, Shao-xiong; Zhang, Jun; Yu, Chang-shui; Song, He-shan

    2014-01-01

    Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN), to measure quantum correlations. It can be considered an updated version of the original measurement-induced nonlocality (MIN), preserving its good computability while eliminating the non-contractivity problem. For 2×d-dimensional states, we show that UIN can be given in closed form. In addition, we investigate the maximal uncertainty-induced nonlocality.
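
    For orientation, the quantity the abstract builds on is the standard Wigner-Yanase skew information of a state rho with respect to an observable K; this definition is standard background, not the paper's closed form (LaTeX):

        I(\rho, K) = -\tfrac{1}{2}\,\mathrm{Tr}\!\left(\left[\rho^{1/2},\,K\right]^{2}\right)

    Per the abstract, UIN is then obtained by maximizing a skew-information-based quantity over a suitable class of local observables, with the closed form for 2×d states given in the paper itself.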

  7. Dynamical Realism and Uncertainty Propagation

    NASA Astrophysics Data System (ADS)

    Park, Inkwan

    In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues to grow. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a manner consistent with the highly nonlinear dynamical environment. To keep pace with this situation, various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. We notice that all of these methods focus on describing the dynamical system as precisely as possible from a mathematical perspective. This study proposes a new perspective based on understanding the dynamics of the evolution of uncertainty itself. We expect that deeper insight into the dynamical system could open the way to a new method for accurate uncertainty propagation. These considerations lead naturally to the goals of the study. First, we investigate the most dominant factors in the evolution of uncertainty in order to realize the dynamical system more rigorously. Second, we aim to develop a new method, based on the first investigation, that enables orbit uncertainty to be propagated efficiently while maintaining accuracy. We eliminate the short-period variations from the dynamical system, yielding a simplified dynamical system (SDS), in order to investigate the most dominant factors. To achieve this goal, the Lie transformation method is introduced, since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, dominate the propagation of uncertainty, i.e., short-period variations are negligible. Then, we develop the new method by combining the SDS with a higher-order nonlinear expansion method, called state transition tensors (STTs). The new method retains the advantages of the SDS and the STTs and propagates

  8. Natural Uncertainty Measure for Forecasting Floods in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Mantilla, Ricardo; Krajewski, Witold F.; Gupta, Vijay K.; Ayalew, Tibebu B.

    2015-04-01

    Recent data analyses have shown that peak flows for individual rainfall-runoff (RF-RO) events exhibit power-law scaling with respect to drainage area, but the scaling slopes and intercepts change from one event to the next. We test this feature in the 32,400 km2 Iowa River basin and give supporting evidence for our hypothesis that the scaling slope and intercept incorporate all the pertinent physical processes that produce floods. These developments serve as the foundation for the key question addressed here: how can uncertainty bounds for flood prediction be defined for each event? We theoretically introduce the concept of a Natural Uncertainty Measure for peak discharge (NUMPD) and test it using data from the Iowa River basin. We conjecture that NUMPD puts a limit on predictive uncertainty from measurements and modeling; in other words, the best any amount of data collection combined with any model can do is to come close to predicting NUMPD, but it cannot match or reduce it any further. For applications to flood prediction, the concepts of Type-I and Type-II uncertainty are explained. We demonstrate Type-I uncertainty using the concept of NUMPD, and our results offer a context for Type-II uncertainty. Our results make a unique contribution to the International Association of Hydrological Sciences (IAHS) decade-long initiative on Predictions in Ungauged Basins (PUB) (2003-2012).
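
    A brief sketch of the per-event power-law fit implied by the scaling statement above, Q = alpha * A**theta; the area and discharge values are hypothetical, and the NUMPD construction itself is not reproduced (Python):

        # Per-event power-law fit Q = alpha * A**theta via log-log regression.
        import numpy as np

        def fit_scaling(areas_km2, peaks_m3s):
            log_a, log_q = np.log10(areas_km2), np.log10(peaks_m3s)
            theta, log_alpha = np.polyfit(log_a, log_q, 1)  # slope, intercept
            return theta, 10.0 ** log_alpha

        # Hypothetical peak flows for one rainfall-runoff event:
        areas = np.array([50.0, 320.0, 2100.0, 32400.0])   # km^2
        peaks = np.array([12.0, 55.0, 260.0, 2400.0])      # m^3/s
        theta, alpha = fit_scaling(areas, peaks)
        print(f"slope theta = {theta:.2f}, intercept alpha = {alpha:.2f}")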

  9. Uncertainty in measurements by counting

    NASA Astrophysics Data System (ADS)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or of objects, such as colonies of bacteria in microbiology. Counting also plays a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, which offers no guidance on the topic. Furthermore, the issue of uncertainty in counting has received little or no attention in the literature, so it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
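
    The paper's general model is not reproduced in the abstract; a common baseline (an assumption here, not the authors' proposal) treats a count n as an observation of a Poisson variable, giving (LaTeX):

        n \sim \mathrm{Poisson}(\lambda), \qquad \hat{\lambda} = n, \qquad u(\hat{\lambda}) = \sqrt{n}

    Under this baseline, a count of n = 400 events carries a relative standard uncertainty of sqrt(400)/400 = 5%.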

  10. Uncertainty of empirical correlation equations

    NASA Astrophysics Data System (ADS)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
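
    A compact numerical sketch of the GLS machinery described above, using a toy design matrix rather than the IAPWS-95 equation; the propagation to a derived quantity uses the usual first-order Jacobian rule (Python):

        # Generalised least squares with parameter-covariance propagation.
        import numpy as np

        def gls(X, y, C):
            Ci = np.linalg.inv(C)                   # inverse data covariance
            cov_beta = np.linalg.inv(X.T @ Ci @ X)  # parameter covariance
            beta = cov_beta @ X.T @ Ci @ y          # GLS estimate
            return beta, cov_beta

        X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # toy model
        y = np.array([0.10, 1.08, 1.95])
        C = np.diag([0.01, 0.01, 0.04])  # data covariance (correlated in general)
        beta, cov_beta = gls(X, y, C)

        # Derived quantity f(beta) with Jacobian J: u^2(f) = J cov_beta J^T.
        J = np.array([1.0, 3.0])         # e.g. f = beta0 + 3 * beta1
        u_f = np.sqrt(J @ cov_beta @ J)
        print(beta, u_f)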

  11. Addressing the Public About Science and Religion

    NASA Astrophysics Data System (ADS)

    Peshkin, Murray

    2010-03-01

    Attacks on the integrity of science teaching in our public schools have recently become increasingly threatening. Geology and Darwinian evolution are the primary targets, and cosmology is at risk. Up to now, the Supreme Court has excluded teachings based on religion from public schools for constitutional, not scientific, reasons. But now the incumbent Supreme Court seems less committed to strict separation of church and state than were its predecessors, and federal courts are beginning to judge the science itself. In this situation, we need to create a climate of public opinion favorable to the protection of good science by explaining the issues both to students and to others. I have been trying to do that by addressing audiences such as church groups, other community groups, and high school and college classes. I do not seek to convert committed anti-evolutionists. I am trying to inform the reasonable majority who do not really know what science is and does, or what a theory is and how we know when it's right, or why we tell them that all knowledge is provisional but still insist that we are teaching the right science. Many have been advised by their religious teachers that there is no conflict between science and their religious beliefs but do not see how that can be. I try to explain how they are disjoint discussions. I also discuss the likely consequences for our country if we degrade the teaching of science in the public schools. My audiences have generally been receptive. Here I will relate some lessons I have learned from my experience with such talks. Without doubt, the most important lesson is that most Americans have religious beliefs that are important to them and are willing to consider what I say only because they know I respect their beliefs. This work was partially supported by the U.S. Dept. of Energy, Office of Nuclear Physics, under contract DE-AC02-06CH11357.

  12. Uncertainty assessment tool for climate change impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Jacob, Daniela; Rechid, Diana; Lückenkötter, Johannes; Juckes, Martin

    2015-04-01

    A major difficulty in the study of climate change impact indicators is dealing with the numerous sources of uncertainty in climate and non-climate data. Assessing these uncertainties, however, is needed in order to communicate to users the degree of certainty of climate change impact indicators. This communication of uncertainty is an important component of the FP7 project "Climate Information Portal for Copernicus" (CLIPC). CLIPC is developing a portal to provide a central point of access for authoritative scientific information on climate change. In this project, the Climate Service Center 2.0 is in charge of developing a tool to assess the uncertainty of climate change impact indicators. The calculation of climate change impact indicators will include climate data from satellite and in-situ observations, climate models and re-analyses, and non-climate data. There is a lack of a systematic classification of uncertainties arising from the whole range of climate change impact indicators. We develop a framework that intends to clarify the potential sources of uncertainty of a given indicator and provides, where possible, solutions for quantifying the uncertainties. To structure the sources of uncertainty of climate change impact indicators, we first classify uncertainties along a 'cascade of uncertainty' (Reyer 2013). Our cascade consists of three levels which correspond to the CLIPC meta-classification of impact indicators: Tier-1 indicators are intended to give information on the climate system. Tier-2 indicators attempt to quantify the impacts of climate change on biophysical systems (e.g., flood risks). Tier-3 indicators primarily aim at providing information on the socio-economic systems affected by climate change. At each level, the potential sources of uncertainty of the input data sets and their processing will be discussed. Reference: Reyer, C. (2013): The cascade of uncertainty in modeling forest ecosystem responses to environmental change and the challenge of sustainable

  13. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  14. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

  15. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of
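
    The root mean square error propagation referred to above combines the probable uncertainties of the four procedural categories; in its standard form (a reasonable reading of the method, not a quotation from the paper) it is (LaTeX):

        E_{P} = \sqrt{E_{1}^{2} + E_{2}^{2} + E_{3}^{2} + E_{4}^{2}}

    where E_1 through E_4 are the percentage uncertainties contributed by streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis, respectively.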

  16. Russia's scientific legacy

    NASA Astrophysics Data System (ADS)

    2015-01-01

    Many insights of Russian scientists are unknown or long-forgotten outside of Russia. Making the Russian literature accessible to the international scientific community could stimulate new lines of research.

  17. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  18. STARPROBE: Scientific rationale

    NASA Technical Reports Server (NTRS)

    Underwood, J. H. (Editor); Randolph, J. E. (Editor)

    1982-01-01

    The scientific rationale and instrumentation problems in the areas of solar internal dynamics and relativity, solar plasma and particle dynamics, and solar atmosphere structure were studied. Current STARPROBE mission and system design concepts are summarized.

  19. Scientific data requirements

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Each Scientific Data Requirement (SDR) is summarized in terms of professional discipline, research program, technical description, related parameters, geographical extent, resolution, error tolerance, space-based sensor systems, personnel, implementation expert, notes, and references.

  20. Anatomy of Scientific Evolution

    PubMed Central

    Yun, Jinhyuk; Kim, Pan-Jun; Jeong, Hawoong

    2015-01-01

    The quest for historically impactful science and technology provides invaluable insight into the innovation dynamics of human society, yet many studies are limited to qualitative and small-scale approaches. Here, we investigate scientific evolution through systematic analysis of a massive corpus of digitized English texts between 1800 and 2008. Our analysis reveals great predictability for long-prevailing scientific concepts based on the levels of their prior usage. Interestingly, once a threshold of early adoption rates is passed even slightly, scientific concepts can exhibit sudden leaps in their eventual lifetimes. We developed a mechanistic model to account for such results, indicating that slowly-but-commonly adopted science and technology surprisingly tend to have higher innate strength than fast-and-commonly adopted ones. The model prediction for disciplines other than science was also well verified. Our approach sheds light on unbiased and quantitative analysis of scientific evolution in society, and may provide a useful basis for policy-making. PMID:25671617

  1. The relationship between aerosol model uncertainty and radiative forcing uncertainty

    NASA Astrophysics Data System (ADS)

    Carslaw, Ken; Lee, Lindsay; Reddington, Carly

    2016-04-01

    There has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated aerosol-cloud forcing between pre-industrial and present day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the pre-industrial aerosol state. But the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are "equally acceptable" compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty, but this hides a range of very different aerosol models. These multiple so-called "equifinal" model variants predict a wide range of forcings. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  2. National Institutes of Health addresses the science of diversity

    PubMed Central

    Valantine, Hannah A.; Collins, Francis S.

    2015-01-01

    The US biomedical research workforce does not currently mirror the nation’s population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation’s health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity’s impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce. PMID:26392553

  3. National Institutes of Health addresses the science of diversity.

    PubMed

    Valantine, Hannah A; Collins, Francis S

    2015-10-01

    The US biomedical research workforce does not currently mirror the nation's population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation's health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity's impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce. PMID:26392553

  4. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    SciTech Connect

    Datta, D.

    2010-10-26

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of releases of radionuclides from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of risk assessment. The paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is adopted because of the lack of information on the parameters of the corresponding models. Soft computing in this domain basically means using fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
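
    A short sketch of the triangular membership function mentioned above, together with the alpha-cut intervals commonly used to propagate fuzzy (epistemic) uncertainty; all parameter values are hypothetical (Python):

        # Triangular fuzzy number (a, m, b) with support [a, b] and mode m.
        import numpy as np

        def triangular(x, a, m, b):
            """Membership grade of x in the triangular fuzzy number."""
            return float(np.maximum(np.minimum((x - a) / (m - a),
                                               (b - x) / (b - m)), 0.0))

        def alpha_cut(a, m, b, alpha):
            """Interval of values whose membership is at least alpha."""
            return (a + alpha * (m - a), b - alpha * (b - m))

        # Hypothetical fuzzy wind speed (m/s) entering a dispersion model:
        print(triangular(2.0, 1.0, 3.0, 6.0))   # 0.5
        print(alpha_cut(1.0, 3.0, 6.0, 0.5))    # (2.0, 4.5)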

  5. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distributing scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics, and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. The same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while

  6. Uncertainty relations for angular momentum

    NASA Astrophysics Data System (ADS)

    Dammeier, Lars; Schwonnek, René; Werner, Reinhard F.

    2015-09-01

    In this work we study various notions of uncertainty for angular momentum in the spin-s representation of SU(2). We characterize the 'uncertainty regions' given by all vectors whose components are specified by the variances of the three angular momentum components. A basic feature of this set is a lower bound for the sum of the three variances. We give a method for obtaining optimal lower bounds for uncertainty regions for general operator triples, and evaluate these for small s. Further lower bounds are derived by generalizing the technique by which Robertson obtained his state-dependent lower bound. These are optimal for large s, since they are saturated by states taken from the Holstein-Primakoff approximation. We show that, for all s, all variances are consistent with the so-called vector model, i.e., they can also be realized by a classical probability measure on a sphere of radius √(s(s+1)). Entropic uncertainty relations can be discussed similarly, but are minimized by different states than those minimizing the variances for small s. For large s the Maassen-Uffink bound becomes sharp and we explicitly describe the extremalizing states. Measurement uncertainty, as recently discussed by Busch, Lahti and Werner for position and momentum, is introduced and a generalized observable (POVM) which minimizes the worst case measurement uncertainty of all angular momentum components is explicitly determined, along with the minimal uncertainty. The output vectors for the optimal measurement all have the same length r(s), where r(s)/s → 1 as s → ∞.
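
    For reference, the state-dependent bound that the abstract generalizes is Robertson's relation, which for spin components (with hbar = 1) reads (LaTeX):

        \Delta J_x\,\Delta J_y \;\ge\; \tfrac{1}{2}\,\bigl|\langle J_z\rangle\bigr|

    The vector-model statement above then says that any admissible variance triple can also be produced by a classical probability measure on a sphere of radius √(s(s+1)).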

  7. Decisions on new product development under uncertainties

    NASA Astrophysics Data System (ADS)

    Huang, Yeu-Shiang; Liu, Li-Chen; Ho, Jyh-Wen

    2015-04-01

    In an intensively competitive market, developing a new product has become a valuable strategy for companies to establish their market positions and enhance their competitive advantages. It is therefore essential to manage the process of new product development (NPD) effectively. However, since various problems may arise in NPD projects, managers should set up milestones and construct evaluative mechanisms to assess their feasibility. This paper employs Bayesian decision analysis to deal with two crucial uncertainties in NPD: the future market share and the responses of competitors. The proposed decision process provides a systematic analytical procedure for determining whether an NPD project should be continued, while considering whether effective use is being made of organisational resources. Accordingly, the proposed decision model can assist managers in effectively addressing the NPD issue in a competitive market.
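
    A toy illustration of the Bayesian decision logic sketched above: a prior over market-share states is updated on a market-test outcome, and the continue/stop choice maximizes posterior expected payoff. All probabilities and payoffs are invented for illustration (Python):

        # Continue/stop decision for an NPD milestone via Bayes' rule.
        priors = {"high_share": 0.4, "low_share": 0.6}
        likelihood = {"high_share": 0.8, "low_share": 0.3}  # P(positive test | state)
        payoff_continue = {"high_share": 5.0, "low_share": -2.0}  # $M
        payoff_stop = 0.0

        # Posterior after observing a positive market test:
        evidence = sum(priors[s] * likelihood[s] for s in priors)
        posterior = {s: priors[s] * likelihood[s] / evidence for s in priors}

        ev_continue = sum(posterior[s] * payoff_continue[s] for s in priors)
        decision = "continue" if ev_continue > payoff_stop else "stop"
        print(posterior, ev_continue, decision)  # EV = 2.48 -> continue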

  8. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. We therefore aim to provide an uncertainty assessment that gives users indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of the different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' in that indicator. To meet users' requirements for effective communication of uncertainties, user feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  9. Visualizing uncertainty in biological expression data

    NASA Astrophysics Data System (ADS)

    Holzhüter, Clemens; Lex, Alexander; Schmalstieg, Dieter; Schulz, Hans-Jörg; Schumann, Heidrun; Streit, Marc

    2012-01-01

    Expression analysis of ~omics data using microarrays has become a standard procedure in the life sciences. However, microarrays are subject to technical limitations and errors, which render the gathered data uncertain. While a number of approaches exist to treat this uncertainty statistically, the uncertainty is hardly ever shown when the data are visualized, for example in clustered heatmaps. Yet doing so is highly useful when trying not to omit data that are "good enough" for an analysis and would otherwise be discarded as too unreliable by established conservative thresholds. Our approach addresses this shortcoming by first identifying the margin above the error threshold of uncertain, yet possibly still useful, data. It then displays this uncertain data in the context of the valid data by enhancing a clustered heatmap. We employ different visual representations for the different kinds of uncertainty involved. Finally, the user can interactively adjust the thresholds, with visual feedback in the heatmap representation, so that an informed choice of thresholds can be made instead of applying the usual rule-of-thumb cut-offs. We exemplify the usefulness of our concept with a concrete use case from our partners at the Medical University of Graz, thereby demonstrating our implementation of the general approach.
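
    A minimal sketch, under assumed thresholds and random placeholder data, of the display idea above: entries below a conservative error threshold are drawn normally, while the uncertain-but-possibly-useful margin is drawn faded (Python):

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        values = rng.normal(size=(20, 10))          # toy expression matrix
        errors = rng.uniform(0.0, 1.0, (20, 10))    # per-entry uncertainty

        reliable = errors < 0.5                     # conservative threshold
        usable = (errors >= 0.5) & (errors < 0.8)   # uncertain but maybe useful

        fig, ax = plt.subplots()
        ax.imshow(np.where(reliable, values, np.nan), cmap="RdBu_r")
        ax.imshow(np.where(usable, values, np.nan), cmap="RdBu_r", alpha=0.35)
        ax.set_title("Reliable (opaque) vs. uncertain (faded) entries")
        plt.show()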

  10. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth, and institutional changes pose a major challenge to the planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets while considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate several approaches to scheduling with many-objective heuristic search. We apply a multi-scenario, many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity, and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e., how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios, among other factors. The approach and results are compared with a previous study in which only the portfolio problem was addressed (without scheduling).

  11. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    SciTech Connect

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  12. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and of how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially in non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  13. Uncertainty and extreme events in future climate and hydrologic projections for the Pacific Northwest: providing a basis for vulnerability and core/corridor assessments

    USGS Publications Warehouse

    Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.

    2014-01-01

    The purpose of this project was to (1) provide an internally-consistent set of downscaled projections across the Western U.S., (2) include information about projection uncertainty, and (3) assess projected changes of hydrologic extremes. These objectives were designed to address decision support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections – in particular for extreme events – is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and a comparison with dynamical downscaling), and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts (in a rigorous and physically based way) for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: 1) extreme statistics for surface hydrology (e.g. frequency of soil moisture and summer water deficit) and streamflow (e.g. the 100-year flood, extreme 7-day low flows with a 10-year recurrence interval); 2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and, 3) uncertainty analyses for multiple climate

  14. A Physics MOSAIC: Scientific Skills and Explorations for Students

    NASA Astrophysics Data System (ADS)

    May, S.; Clements, C.; Erickson, P. J.; Rogers, A.

    2010-12-01

    A 21st century education needs to teach students how to manage information in an ever more digital age. High school students (like all of us) are inundated with information, and informed citizenship increasingly depends on the ability to be a critical consumer of data. In the scientific community, experimental data from remote, high quality systems are becoming increasingly available in real time. The same networks providing data also allow scientists to use the ubiquity of internet access to enlist citizen scientists to help with research. As a means of addressing and leveraging these trends, we describe a classroom unit developed as part of the NSF Research Experience for Teachers (RET) program at MIT Haystack Observatory in the summer of 2010. The unit uses accessible, real-time science data to teach high school physics students about the nature and process of scientific research, with the goal of teaching how to be an informed citizen, regardless of eventual vocation. The opportunity to study the atmosphere provides increased engagement in the classroom, and students have an authentic experience of asking and answering scientific questions when the answer cannot simply be found on the Web. MOSAIC (Mesospheric Ozone System for Atmospheric Investigations in the Classroom) is a relatively inexpensive tool for measuring mesospheric ozone by taking advantage of the sensitivity of commercially produced satellite TV dishes to the 11.072545 GHz rotational transition of ozone. Because the signal from ozone in the lower atmosphere is pressure-broadened, the system is able to isolate the signal from the 1% of Earth’s ozone that comes from the mesosphere. Our teaching unit takes advantage of measurements collected since 2008 from six East Coast observing sites at high schools and colleges. Data are available online within a day of their collection, and an easy to use web interface allows students to track mesospheric ozone in frequency, time of day, or day of year. The

  15. We underestimate uncertainties in our predictions.

    SciTech Connect

    Pilch, Martin M.

    2010-04-01

    Prediction is defined in the American Heritage Dictionary as follows: 'To state, tell about, or make known in advance, especially on the basis of special knowledge.' What special knowledge do we demand of modeling and simulation to assert that we have a predictive capability for high consequence applications? The 'special knowledge' question can be answered in two dimensions: the process and rigor by which modeling and simulation is executed and assessment results for the specific application. Here we focus on the process and rigor dimension and address predictive capability in terms of six attributes: (1) geometric and representational fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) validation, and (6) uncertainty quantification. This presentation will demonstrate through mini-tutorials, simple examples, and numerous case studies how each attribute creates opportunities for errors, biases, or uncertainties to enter into simulation results. The demonstrations will motivate a set of practices that minimize the risk in using modeling and simulation for high-consequence applications while defining important research directions. It is recognized that there are cultural, technical, infrastructure, and resource barriers that prevent analysts from performing all analyses at the highest levels of rigor. Consequently, the audience for this talk is (1) analysts, so they can know what is expected of them, (2) decision makers, so they can know what to expect from modeling and simulation, and (3) the R&D community, so they can address the technical and infrastructure issues that prevent analysts from executing analyses in a practical, timely, and quality manner.

  16. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    SciTech Connect

    Unal, Cetin; Williams, Brian; Mc Clure, Patrick; Nelson, Ralph A

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce the cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics, computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models underlying predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. Utilizing more multi-scale, multi-physics models introduces complexities into the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behavior in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow results to be extended to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  17. Calculating LiDAR Point Cloud Uncertainty and Propagating Uncertainty to Snow-Water Equivalent Data Products

    NASA Astrophysics Data System (ADS)

    Gadomski, P. J.; Deems, J. S.; Glennie, C. L.; Hartzell, P. J.; Butler, H.; Finnegan, D. C.

    2015-12-01

    The use of high-resolution topographic data in the form of three-dimensional point clouds obtained from laser scanning systems (LiDAR) is becoming common across scientific disciplines. However, little consideration has typically been given to the accuracy and the precision of LiDAR-derived measurements at the individual point scale. Numerous disparate sources contribute to the aggregate precision of each point measurement, including uncertainties in the range measurement, measurement of the attitude and position of the LiDAR collection platform, uncertainties associated with the interaction between the laser pulse and the target surface, and more. We have implemented open-source software tools to calculate per-point stochastic measurement errors for a point cloud using the general LiDAR georeferencing equation. We demonstrate the use of these propagated uncertainties by applying our methods to data collected by the Airborne Snow Observatory ALS, a NASA JPL project using a combination of airborne hyperspectral and LiDAR data to estimate snow-water equivalent distributions over full river basins. We present basin-scale snow depth maps with associated uncertainties, and demonstrate the propagation of those uncertainties to snow volume and snow-water equivalent calculations.
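
    A 2-D simplification (an illustration, not the ASO pipeline) of propagating per-point observation uncertainty through a georeferencing relation with the first-order Jacobian rule; the sigma values are hypothetical (Python):

        # Propagate range and scan-angle uncertainty into point coordinates:
        # x = r*sin(t), y = r*cos(t), C_xy = J C_obs J^T.
        import numpy as np

        r, t = 1200.0, np.deg2rad(15.0)             # range (m), scan angle
        sigma_r, sigma_t = 0.03, np.deg2rad(0.005)  # hypothetical 1-sigma values

        J = np.array([[np.sin(t),  r * np.cos(t)],
                      [np.cos(t), -r * np.sin(t)]])
        C_obs = np.diag([sigma_r**2, sigma_t**2])
        C_xy = J @ C_obs @ J.T                      # per-point covariance

        print(np.sqrt(np.diag(C_xy)))               # per-coordinate sigmas (m)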

  18. An active learning approach with uncertainty, representativeness, and diversity.

    PubMed

    He, Tianxu; Zhang, Shukui; Xin, Jie; Zhao, Pengpeng; Wu, Jian; Xian, Xuefeng; Li, Chunhua; Cui, Zhiming

    2014-01-01

    Big data from the Internet of Things may create big challenges for data classification. Most active learning approaches select either uncertain or representative unlabeled instances to query their labels. Although several active learning algorithms have been proposed to combine the two criteria for query selection, they are usually ad hoc in finding unlabeled instances that are both informative and representative, and they fail to take the diversity of instances into account. We address this challenge by presenting a new active learning framework which considers uncertainty, representativeness, and diversity. The proposed approach provides a systematic way of measuring and combining the uncertainty, representativeness, and diversity of an instance. First, instances' uncertainty and representativeness are used to constitute the most informative set. Then, the kernel k-means clustering algorithm filters out redundant samples, and the resulting samples are queried for labels. Extensive experimental results show that the proposed approach outperforms several state-of-the-art active learning approaches. PMID:25180208
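
    A sketch of the three-criterion selection described above, with simplified scoring; the RBF kernel and candidate-set size are assumptions, and scikit-learn's standard KMeans stands in for the paper's kernel k-means (Python):

        # Select diverse, informative, representative unlabeled points.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics.pairwise import rbf_kernel

        def select_queries(X_unlabeled, proba, n_candidates=50, n_queries=5):
            # Uncertainty: small margin between the two top class probabilities.
            p = np.sort(proba, axis=1)
            uncertainty = 1.0 - (p[:, -1] - p[:, -2])
            # Representativeness: mean similarity to other unlabeled points.
            representativeness = rbf_kernel(X_unlabeled).mean(axis=1)
            score = uncertainty * representativeness
            top = np.argsort(score)[-n_candidates:]  # most informative set
            # Diversity: pick the best-scoring candidate in each cluster.
            labels = KMeans(n_clusters=n_queries, n_init=10).fit_predict(
                X_unlabeled[top])
            return [top[labels == c][np.argmax(score[top][labels == c])]
                    for c in range(n_queries)]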

  19. An Active Learning Approach with Uncertainty, Representativeness, and Diversity

    PubMed Central

    He, Tianxu; Zhang, Shukui; Xin, Jie; Xian, Xuefeng; Li, Chunhua; Cui, Zhiming

    2014-01-01

    Big data from the Internet of Things may create big challenges for data classification. Most active learning approaches select either uncertain or representative unlabeled instances to query their labels. Although several active learning algorithms have been proposed to combine the two criteria for query selection, they are usually ad hoc in finding unlabeled instances that are both informative and representative, and they fail to take the diversity of instances into account. We address this challenge by presenting a new active learning framework which considers uncertainty, representativeness, and diversity. The proposed approach provides a systematic way of measuring and combining the uncertainty, representativeness, and diversity of an instance. First, instances' uncertainty and representativeness are used to constitute the most informative set. Then, the kernel k-means clustering algorithm filters out redundant samples, and the resulting samples are queried for labels. Extensive experimental results show that the proposed approach outperforms several state-of-the-art active learning approaches. PMID:25180208

  20. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way, scientific output can be communicated to the public, allowing people to experience climate change through comparison with well-known, real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions, and uncertainty ranges. Too many indicators representing various climate characteristics, combined with too narrow uncertainty ranges, will judge little or no area to be a region with similar climate, while too few indicators and too wide uncertainty ranges will identify overly large regions as having similar climate, which may not be correct. Similarity cannot be explored just by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, like maxima, minima, variation magnitude, frequency of extreme events, etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges, unless the results will be too vague conducting a