Sample records for safety uncertainty assessment

  1. Assessment of SFR Wire Wrap Simulation Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical
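
    The record above describes propagating parametric input uncertainties through a CFD solver driven by Dakota. As a hedged illustration of that general idea only (the Dakota/Nek5000 coupling scripts are not reproduced here), the sketch below pushes Monte Carlo samples of uncertain pipe-flow inputs through an analytic Darcy-Weisbach pressure-drop model standing in for the solver; all distributions and parameter values are assumptions for illustration.

```python
# Minimal sketch of parametric uncertainty propagation for a pipe-flow quantity
# of interest. The analytic Darcy-Weisbach pressure drop stands in for a CFD
# solver such as Nek5000; Dakota would normally drive the sampling.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 2000

# Uncertain inputs (illustrative distributions, not from the report)
roughness = rng.uniform(1e-5, 1e-4, n_samples)   # pipe roughness [m]
velocity  = rng.normal(2.0, 0.1, n_samples)      # mean velocity [m/s]
diameter  = 0.05                                 # pipe diameter [m]
length, rho, nu = 2.0, 1000.0, 1e-6              # length [m], density [kg/m3], viscosity [m2/s]

def pressure_drop(eps, u):
    """Surrogate 'solver': Darcy-Weisbach with the Haaland friction factor."""
    re = u * diameter / nu
    f = (-1.8 * np.log10((eps / (3.7 * diameter))**1.11 + 6.9 / re))**-2
    return f * (length / diameter) * 0.5 * rho * u**2

dp = pressure_drop(roughness, velocity)
print(f"mean dP = {dp.mean():.1f} Pa, std = {dp.std():.1f} Pa, "
      f"95% interval = [{np.percentile(dp, 2.5):.1f}, {np.percentile(dp, 97.5):.1f}] Pa")
```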

  2. Generally Recognized as Safe: Uncertainty Surrounding E-Cigarette Flavoring Safety.

    PubMed

    Sears, Clara G; Hart, Joy L; Walker, Kandi L; Robertson, Rose Marie

    2017-10-23

    Despite scientific uncertainty regarding the relative safety of inhaling e-cigarette aerosol and flavorings, some consumers regard the U.S. Food and Drug Administration's "generally recognized as safe" (GRAS) designation as evidence of flavoring safety. In this study, we assessed how college students' perceptions of e-cigarette flavoring safety are related to understanding of the GRAS designation. During spring 2017, an online questionnaire was administered to college students. Chi-square p-values and multivariable logistic regression were employed to compare perceptions among participants considering e-cigarette flavorings as safe and those considering e-cigarette flavorings to be unsafe. The total sample size was 567 participants. Only 22% knew that GRAS designation meant that a product is safe to ingest, not inhale, inject, or use topically. Of participants who considered flavorings to be GRAS, the majority recognized that the designation meant a product is safe to ingest but also considered it safe to inhale. Although scientific uncertainty on the overall safety of flavorings in e-cigarettes remains, health messaging can educate the public about the GRAS designation and its irrelevance to e-cigarette safety.

  3. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

    Climate change impact and adaptation assessments have traditionally adopted a 'top-down', scenario-based approach, in which information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Given the trade-off between computational cost and the need to include a wide range of GCMs for a fuller characterization of uncertainties, scenarios are better used for sensitivity testing and the appraisal of adaptation options. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Uncertainty both within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, flood defences whose designs are normally
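
    The abstract refers to fitting the Generalised Extreme Value (GEV) distribution to annual maxima by the method of probability weighted moments. A minimal sketch of that fitting step, using Hosking's approximation for the shape parameter and a synthetic 60-year record (the study's own series and models are not reproduced), might look like this:

```python
# Sketch: fit a GEV distribution to annual maxima with probability-weighted
# moments (Hosking's approximation) and evaluate a design quantile.
import numpy as np
from math import gamma, log

def gev_pwm_fit(x):
    """Return GEV (location xi, scale alpha, shape k) from sample PWMs."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    c = (2 * b1 - b0) / (3 * b2 - b0) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c**2                    # shape (Hosking sign convention)
    alpha = (2 * b1 - b0) * k / (gamma(1 + k) * (1 - 2**-k))
    xi = b0 + alpha * (gamma(1 + k) - 1) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, T):
    """Flood peak with return period T years."""
    return xi + alpha / k * (1 - (-log(1 - 1 / T))**k)

rng = np.random.default_rng(1)
annual_maxima = 100 + 30 * rng.gumbel(size=60)        # illustrative 60-year record
xi, alpha, k = gev_pwm_fit(annual_maxima)
print(f"xi={xi:.1f}, alpha={alpha:.1f}, k={k:.3f}, Q100={gev_quantile(xi, alpha, k, 100):.1f}")
# A +20% safety margin would then be compared against the spread of Q100 obtained
# when the whole chain (GCM, scenario, hydrological model) is resampled.
```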

  4. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, T.; Laville, C.; Dyrda, J.

    2012-07-01

    The sensitivities of the k-eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)

  5. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  6. Feedback from uncertainties propagation research projects conducted in different hydraulic fields: outcomes for engineering projects and nuclear safety assessment.

    NASA Astrophysics Data System (ADS)

    Bacchi, Vito; Duluc, Claire-Marie; Bertrand, Nathalie; Bardet, Lise

    2017-04-01

    In recent years, in the context of hydraulic risk assessment, much effort has been put into the development of sophisticated numerical model systems able to reproduce the surface flow field. These numerical models are based on a deterministic approach and the results are presented in terms of measurable quantities (water depths, flow velocities, etc.). However, the modelling of surface flows involves numerous uncertainties, associated with the numerical structure of the model, with knowledge of the physical parameters that force the system, and with the randomness inherent to natural phenomena. As a consequence, dealing with uncertainties can be a difficult task for both modelers and decision-makers [Ioss, 2011]. In the context of nuclear safety, IRSN assesses studies conducted by operators for different reference flood situations (local rain, small or large watershed flooding, sea levels, etc.), which are defined in the guide ASN N°13 [ASN, 2013]. The guide provides some recommendations for dealing with uncertainties, proposing a specific conservative approach to cover hydraulic modelling uncertainties. Depending on the situation, the influencing parameter might be the Strickler coefficient, levee behavior, simplified topographic assumptions, etc. Obviously, identifying the most influencing parameter and giving it a penalizing value is challenging and usually questionable. In this context, IRSN has conducted cooperative research activities since 2011 (with Compagnie Nationale du Rhone, the I-CiTy laboratory of Polytech'Nice, the Atomic Energy Commission, and the Bureau de Recherches Géologiques et Minières) in order to investigate the feasibility and benefits of Uncertainty Analysis (UA) and Global Sensitivity Analysis (GSA) when applied to hydraulic modelling. A specific methodology was tested using the computational environment Promethee, developed by IRSN, which allows uncertainty propagation studies to be carried out. This methodology was applied with various numerical models and in
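
    To make the Uncertainty Analysis / Global Sensitivity Analysis idea concrete, the sketch below estimates first-order and total Sobol indices with a plain pick-freeze Monte Carlo estimator. It does not use IRSN's Promethee environment; the toy water-level model, the input names and their ranges are all illustrative assumptions.

```python
# Generic sketch of a variance-based global sensitivity analysis (Sobol indices,
# pick-freeze estimators). The toy "hydraulic" model and input ranges are
# illustrative; IRSN's Promethee environment is not reproduced here.
import numpy as np

rng = np.random.default_rng(7)
N = 20000

def model(strickler, discharge, downstream_level):
    """Toy stand-in for a hydraulic solver returning a water level [m]."""
    return downstream_level + (discharge / (strickler * 50.0))**0.6

names  = ["strickler", "discharge", "downstream_level"]
bounds = [(20.0, 40.0), (500.0, 3000.0), (2.0, 4.0)]   # illustrative ranges

def sample(n):
    return np.column_stack([rng.uniform(lo, hi, n) for lo, hi in bounds])

A, B = sample(N), sample(N)
yA, yB = model(*A.T), model(*B.T)
var_y = np.concatenate([yA, yB]).var()

for i, name in enumerate(names):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # A with column i taken from B
    yABi = model(*ABi.T)
    S1 = np.mean(yB * (yABi - yA)) / var_y       # first-order index (Saltelli 2010 estimator)
    ST = 0.5 * np.mean((yA - yABi)**2) / var_y   # total-order index (Jansen estimator)
    print(f"{name:>17}: S1 = {S1:.2f}, ST = {ST:.2f}")
```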

  7. Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)

    EPA Science Inventory

    In vitro assays are increasingly being used in risk assessments. Uncertainty in assays leads to uncertainty in models used for risk assessments. This poster assesses uncertainty in the ER and AR models.

  8. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have, however, risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty will also be attributed to the different input parameters using a variance based sensitivity analysis. Assessing and visualizing the

  9. Where do uncertainties reside within environmental risk assessments? Testing UnISERA, a guide for uncertainty assessment.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2017-06-01

    A means for identifying and prioritising the treatment of uncertainty (UnISERA) in environmental risk assessments (ERAs) is tested, using three risk domains where ERA is an established requirement and one in which ERA practice is emerging. UnISERA's development draws on 19 expert elicitations across genetically modified higher plants, particulate matter, and agricultural pesticide release and is stress tested here for engineered nanomaterials (ENM). We are concerned with the severity of uncertainty; its nature; and its location across four accepted stages of ERAs. Using an established uncertainty scale, the risk characterisation stage of ERA harbours the highest severity level of uncertainty, associated with estimating, aggregating and evaluating expressions of risk. Combined epistemic and aleatory uncertainty is the dominant nature of uncertainty. The dominant location of uncertainty is associated with data in problem formulation, exposure assessment and effects assessment. Testing UnISERA produced agreements of 55%, 90%, and 80% for the severity level, nature and location dimensions of uncertainty between the combined case studies and the ENM stress test. UnISERA enables environmental risk analysts to prioritise risk assessment phases, groups of tasks, or individual ERA tasks and it can direct them towards established methods for uncertainty treatment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor the analysis' uncertainty into TMDL development, and the MOS is largel...
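
    For reference, the allocation identity usually quoted in TMDL development, in which the margin of safety appears as an explicit term (it may also be incorporated implicitly through conservative assumptions):

```latex
% Standard TMDL allocation identity, with the margin of safety as an explicit term.
% WLA: wasteload allocations (point sources); LA: load allocations (nonpoint sources
% and natural background); MOS: margin of safety, explicit or implicit.
\mathrm{TMDL} = \sum \mathrm{WLA} + \sum \mathrm{LA} + \mathrm{MOS}
```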

  11. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  12. Safety Assessment for the Kozloduy National Disposal Facility in Bulgaria - 13507

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biurrun, E.; Haverkamp, B.; Lazaro, A.

    2013-07-01

    Due to the early decommissioning of four Water-Water Energy Reactor (WWER) 440-V230 reactors at the Nuclear Power Plant (NPP) near the city of Kozloduy in Bulgaria, large amounts of low and intermediate level radioactive waste will arise much earlier than initially scheduled. In order to manage the radioactive waste from the early decommissioning, Bulgaria has intensified its efforts to provide a near surface disposal facility at Radiana with the required capacity. To this end, a project was launched and assigned in international competition to a German-Spanish consortium to provide the complete technical planning, including the preparation of the Intermediate Safety Assessment Report. Preliminary results of operational and long-term safety show compliance with the Bulgarian regulatory requirements. The long-term calculations carried out for the Radiana site are also a good example of how analysis of safety assessment results can be used for iterative improvement of the assessment, by pointing out uncertainties and areas of future investigation to reduce such uncertainties with regard to the potential radiological impact. The computer model used to estimate the long-term evolution of the future repository at Radiana predicted a maximum total annual dose for members of the critical group, of which approximately 80% is contributed by C-14 through a specific ingestion pathway. Based on this result and the outcome of the sensitivity analysis, existing uncertainties were evaluated and areas for reasonable future investigation to reduce these uncertainties were identified. (authors)

  13. Real-Time Safety Risk Assessment Based on a Real-Time Location System for Hydropower Construction Sites

    PubMed Central

    Fan, Qixiang; Qiang, Maoshan

    2014-01-01

    The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites has been proposed in this paper. The method provides construction managers who are in charge of safety with more abundant information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns. PMID:25114958

  14. Real-time safety risk assessment based on a real-time location system for hydropower construction sites.

    PubMed

    Jiang, Hanchen; Lin, Peng; Fan, Qixiang; Qiang, Maoshan

    2014-01-01

    The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites has been proposed in this paper. The method provides construction managers who are in charge of safety with more abundant information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns.
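
    Both records above rely on a hidden Markov model to turn real-time location observations into probabilities of hidden worker safety states. A generic sketch of the underlying forward (filtering) recursion is shown below; the two-state model, the transition and emission matrices, and the observation coding are illustrative assumptions, not the paper's calibrated values.

```python
# Sketch of the HMM forward algorithm: update the probability of hidden worker
# safety states ("safe", "at-risk") from a sequence of discretised RTLS
# observations (0 = outside hazard zone, 1 = inside hazard zone).
import numpy as np

states = ["safe", "at-risk"]
pi = np.array([0.9, 0.1])                    # initial state distribution (illustrative)
A  = np.array([[0.95, 0.05],                 # transition probabilities
               [0.30, 0.70]])
B  = np.array([[0.8, 0.2],                   # emission: P(observation | state)
               [0.1, 0.9]])

def forward_filter(observations):
    """Return P(state_t | obs_1..t) for each time step (normalised forward pass)."""
    alpha = pi * B[:, observations[0]]
    alpha /= alpha.sum()
    beliefs = [alpha]
    for obs in observations[1:]:
        alpha = (alpha @ A) * B[:, obs]
        alpha /= alpha.sum()
        beliefs.append(alpha)
    return np.array(beliefs)

obs_seq = [0, 0, 1, 1, 1, 0]                 # a worker drifts into a hazard zone
for t, belief in enumerate(forward_filter(obs_seq)):
    print(f"t={t}: P(at-risk) = {belief[1]:.2f}")
```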

  15. Addressing uncertainty in vulnerability assessments [Chapter 5]

    Treesearch

    Linda Joyce; Molly Cross; Evan Girvatz

    2011-01-01

    This chapter addresses issues and approaches for dealing with uncertainty specifically within the context of conducting climate change vulnerability assessments (i.e., uncertainties related to identifying and modeling the sensitivities, levels of exposure, and adaptive capacity of the assessment targets).

  16. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Health, safety and environmental unit performance assessment model under uncertainty (case study: steel industry).

    PubMed

    Shamaii, Azin; Omidvari, Manouchehr; Lotfi, Farhad Hosseinzadeh

    2017-01-01

    Performance assessment is a critical objective of management systems. As a result of the non-deterministic and qualitative nature of performance indicators, assessments are likely to be influenced by evaluators' personal judgments. Furthermore, in developing countries, performance assessments by the Health, Safety and Environment (HSE) department are based solely on the number of accidents. A questionnaire is used to conduct the study in one of the largest steel production companies in Iran. With respect to health, safety, and environment, the results revealed that control of disease, fire hazards, and air pollution are of paramount importance, with coefficients of 0.057, 0.062, and 0.054, respectively. Furthermore, health and environment indicators were found to be the most common causes of poor performance. Finally, it was shown that HSE management systems can affect the majority of safety performance indicators in the short run, whereas health and environment indicators require longer periods of time. The objective of this study is to present an HSE-MS unit performance assessment model for the steel industry. Moreover, we seek to answer the following question: what are the factors that affect the HSE unit system in the steel industry? Also, for each factor, the extent of its impact on the performance of the HSE management system in the organization is determined.

  18. Assessing Uncertainty in Expert Judgments About Natural Resources

    Treesearch

    David A. Cleaves

    1994-01-01

    Judgments are necessary in natural resources management, but uncertainty about these judgments should be assessed. When all judgments are rejected in the absence of hard data, valuable professional experience and knowledge are not utilized fully. The objective of assessing uncertainty is to get the best representation of knowledge and its bounds. Uncertainty...

  19. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
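
    As a brief illustration of the quantities involved (not taken from the article), DALYs combine years of life lost and years lived with disability, and the environmental share is obtained through a population attributable fraction; all numbers below are made up:

```python
# Sketch of an environmental burden of disease calculation: DALY = YLL + YLD,
# with the attributable share computed from a population attributable fraction
# (Levin's formula). All numbers are illustrative.
cases         = 1200      # incident cases per year
deaths        = 30        # deaths per year
life_exp_lost = 25.0      # average years of life lost per death
duration      = 4.0       # average duration of disease [years]
disability_wt = 0.20      # disability weight (0 = full health, 1 = death)

yll  = deaths * life_exp_lost              # years of life lost
yld  = cases * duration * disability_wt    # years lived with disability
daly = yll + yld

rr, exposed_fraction = 1.6, 0.35           # relative risk and exposure prevalence
paf = exposed_fraction * (rr - 1) / (exposed_fraction * (rr - 1) + 1)
print(f"DALY = {daly:.0f}, PAF = {paf:.2f}, attributable burden = {paf * daly:.0f} DALYs")
```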

  20. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  1. Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability

    NASA Technical Reports Server (NTRS)

    Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian

    2011-01-01

    The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock-turbulent boundary layer interaction on a compression corner, (2) shock-turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments was performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers was found. The uncertainty was quantified using metrics that measure discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.

  2. Assessing uncertainties in land cover projections.

    PubMed

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

    Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations over 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, which is at least as great as the differences attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover. © 2016 John Wiley & Sons Ltd.

  3. Flood resilience and uncertainty in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.

    2012-04-01

    Flood risk assessments do not normally take account of the uncertainty in assessing flood risk; there is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on the exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from two case studies in the UK where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.

  4. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
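
    A hedged sketch of the consequence calculation described above, comparing a probabilistic occupant evacuation time with a probabilistic onset time of untenable conditions for a single fire scenario; the lognormal/normal distributions and their parameters are illustrative assumptions, not values from the article.

```python
# Sketch: for one fire scenario, compare the distribution of occupant evacuation
# time (pre-movement + movement) with the distribution of time to untenable
# conditions, and estimate the probability that evacuation is not completed in time.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Time to untenable conditions [s], driven by the assumed fire growth rate
t_untenable = rng.lognormal(mean=np.log(300.0), sigma=0.3, size=n)

# Occupant evacuation time [s] = pre-movement time (probabilistic) + movement time
t_premove = rng.lognormal(mean=np.log(60.0), sigma=0.5, size=n)
t_move    = rng.normal(120.0, 20.0, size=n)
t_evac    = t_premove + t_move

p_fail = np.mean(t_evac > t_untenable)
print(f"P(evacuation not completed before untenable conditions) = {p_fail:.3f}")
# Multiplying this conditional consequence by the scenario occurrence probability
# from the time-dependent event tree gives the scenario's contribution to life-safety risk.
```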

  5. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.
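
    A toy sketch of the decision framing described above: pick the seismic design level that minimises expected lifetime cost while averaging over both aleatory demand variability and epistemic uncertainty in the hazard rate. Every number, the fragility model and the cost model are illustrative assumptions, not the authors' formulation.

```python
# Toy Bayesian-decision sketch: expected lifetime cost (construction + losses)
# as a function of design level, averaged over epistemic hazard-rate samples
# and aleatory ground-motion demand. All values are illustrative.
import numpy as np

rng = np.random.default_rng(11)
design_levels = np.linspace(0.1, 0.6, 11)          # design capacity [g]
n_epistemic, years = 500, 50

# Epistemic uncertainty: the annual rate of damaging events is itself uncertain
annual_rate = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n_epistemic)

def expected_cost(design):
    construction = 1.0 + 4.0 * design                                   # cost grows with capacity
    demand = rng.lognormal(np.log(0.2), 0.6, size=(n_epistemic, 200))   # aleatory demand [g]
    p_fail_per_event = (demand > design).mean(axis=1)                   # fragility, given an event
    expected_failures = annual_rate * years * p_fail_per_event
    loss = 10.0 * expected_failures                                     # loss of 10 units per failure
    return construction + loss.mean()                                   # average over epistemic samples

costs = [expected_cost(d) for d in design_levels]
print(f"optimal design level ~ {design_levels[int(np.argmin(costs))]:.2f} g")
```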

  7. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  8. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results are expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers are the area where future research effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  10. Decay heat uncertainty quantification of MYRRHA

    NASA Astrophysics Data System (ADS)

    Fiorito, Luca; Buss, Oliver; Hoefer, Axel; Stankovskiy, Alexey; Eynde, Gert Van den

    2017-09-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.

  11. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Sparks, R. S.

    2009-12-01

    many sources of uncertainty in forecasting the areas that volcanic activity will affect and the severity of the effects. Uncertainties arise from: natural variability, inadequate data, biased data, incomplete data, lack of understanding of the processes, limitations of predictive models, ambiguity, and unknown unknowns. The description of volcanic hazards is thus necessarily probabilistic and requires assessment of the attendant uncertainties. Several issues arise from the probabilistic nature of volcanic hazards and the intrinsic uncertainties. Although zonation maps require well-defined boundaries for administrative pragmatism, such boundaries cannot divide areas that are completely safe from those that are unsafe. Levels of danger or safety need to be defined to decide on and justify boundaries through the concepts of vulnerability and risk. More data, better observations and improved models may reduce uncertainties, but can also increase uncertainties and may lead to re-appraisal of zone boundaries. Probabilities inferred by statistical techniques are hard to communicate. Expert elicitation is an emerging methodology for risk assessment and uncertainty evaluation. The method has been applied at one major volcanic crisis (Soufrière Hills Volcano, Montserrat), and is being applied in planning for volcanic crises at Vesuvius.

  12. Impact of nuclear data uncertainty on safety calculations for spent nuclear fuel geological disposal

    NASA Astrophysics Data System (ADS)

    Herrero, J. J.; Rochman, D.; Leray, O.; Vasiliev, A.; Pecchia, M.; Ferroukhi, H.; Caruso, S.

    2017-09-01

    In the design of a spent nuclear fuel disposal system, one necessary condition is to show that the configuration remains subcritical at time of emplacement but also during long periods covering up to 1,000,000 years. In the context of criticality safety applying burn-up credit, k-eff eigenvalue calculations are affected by nuclear data uncertainty mainly in the burnup calculations simulating reactor operation and in the criticality calculation for the disposal canister loaded with the spent fuel assemblies. The impact of nuclear data uncertainty should be included in the k-eff value estimation to enforce safety. Estimations of the uncertainty in the discharge compositions from the CASMO5 burn-up calculation phase are employed in the final MCNP6 criticality computations for the intact canister configuration; in between, SERPENT2 is employed to get the spent fuel composition along the decay periods. In this paper, nuclear data uncertainty was propagated by Monte Carlo sampling in the burn-up, decay and criticality calculation phases and representative values for fuel operated in a Swiss PWR plant will be presented as an estimation of its impact.

  13. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and GCM model uncertainty, which is obvious at resolutions finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. There are only a few studies which found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, climatic uncertainty. We carried out a

  14. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  15. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations, in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
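
    The following numpy sketch mimics the mc2d idea of a two-dimensional (second-order) Monte Carlo, with an outer loop over parameter uncertainty and an inner loop over inter-individual variability. It does not reproduce the R packages, and the dose-response form and all parameter values are illustrative assumptions.

```python
# Numpy analogue of a second-order Monte Carlo: the outer loop samples parameter
# *uncertainty*, the inner loop samples inter-individual *variability*. The
# exponential dose-response and all values are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
n_unc, n_var = 200, 5000

risks = np.empty((n_unc, n_var))
for i in range(n_unc):
    # Uncertainty dimension: uncertain parameters (e.g., from bootstrapped fits)
    mean_log_conc = rng.normal(-2.0, 0.3)          # mean log10 concentration in ground beef
    r_dose_resp   = 10 ** rng.normal(-3.5, 0.2)    # exponential dose-response parameter
    # Variability dimension: individual servings and contamination levels
    conc    = 10 ** rng.normal(mean_log_conc, 1.0, n_var)    # CFU/g
    serving = rng.lognormal(np.log(50.0), 0.4, n_var)         # g per serving
    dose    = conc * serving
    risks[i] = 1.0 - np.exp(-r_dose_resp * dose)              # P(illness | serving)

# Summarise variability per uncertainty sample, then report the uncertainty band
median_risk = np.median(risks, axis=1)
print(f"median individual risk: {np.median(median_risk):.2e} "
      f"(95% uncertainty interval {np.percentile(median_risk, 2.5):.2e} - "
      f"{np.percentile(median_risk, 97.5):.2e})")
```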

  16. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
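
    A minimal sketch of the bootstrap idea underlying the framework, restricted to the sampling-uncertainty contribution for the peak-discharge quantile: resample the annual maxima record, refit a GEV each time, and read off the spread of the 100-year flood. The record is synthetic and scipy's maximum-likelihood fit is used for convenience rather than the authors' estimators.

```python
# Bootstrap sketch of sampling uncertainty in a design flood quantile.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(8)
record = genextreme.rvs(c=-0.1, loc=100, scale=30, size=40, random_state=rng)  # synthetic 40-year record

q100 = []
for _ in range(500):
    resample = rng.choice(record, size=record.size, replace=True)
    c, loc, scale = genextreme.fit(resample)
    q100.append(genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale))

q100 = np.array(q100)
print(f"Q100 = {np.median(q100):.0f} "
      f"[{np.percentile(q100, 5):.0f}, {np.percentile(q100, 95):.0f}] (90% interval)")
```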

  17. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  18. Propagating uncertainty from hydrology into human health risk assessment

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2013-12-01

    Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk (both cancer and non-cancer) assessment to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever, as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles modeled with sorption as 1) a linear equilibrium assumption and 2) first-order kinetics. Results show that the upscaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.
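
    A hedged sketch of the final step described above, turning an ensemble of simulated groundwater concentrations into a distribution of incremental lifetime cancer risk with the standard EPA-style intake equation and a linear slope factor; the concentration ensemble and all exposure parameters are illustrative assumptions rather than the study's values.

```python
# Sketch: propagate an ensemble of groundwater concentrations (stand-in for
# transport-model output) into a lifetime cancer risk distribution using a
# linear low-dose model. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(13)
n = 10_000

conc = rng.lognormal(mean=np.log(5e-3), sigma=0.8, size=n)   # mg/L, from the transport ensemble
ir   = rng.normal(2.0, 0.4, size=n).clip(0.5)                # drinking water intake [L/day]
bw   = rng.normal(70.0, 12.0, size=n).clip(30)               # body weight [kg]
ef, ed, at = 350.0, 30.0, 70.0 * 365.0                       # exposure freq. [d/yr], duration [yr], averaging time [d]
slope_factor = 0.5                                           # (mg/kg-day)^-1, illustrative

ladd = conc * ir * ef * ed / (bw * at)                       # lifetime average daily dose
risk = slope_factor * ladd                                   # incremental lifetime cancer risk
print(f"median risk = {np.median(risk):.1e}, "
      f"fraction above 1e-6 = {np.mean(risk > 1e-6):.2f}")
```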

  19. Scenario Analysis for the Safety Assessment of Nuclear Waste Repositories: A Critical Review.

    PubMed

    Tosoni, Edoardo; Salo, Ahti; Zio, Enrico

    2018-04-01

    A major challenge in scenario analysis for the safety assessment of nuclear waste repositories pertains to the comprehensiveness of the set of scenarios selected for assessing the safety of the repository. Motivated by this challenge, we discuss the aspects of scenario analysis relevant to comprehensiveness. Specifically, we note that (1) it is necessary to make it clear why scenarios usually focus on a restricted set of features, events, and processes; (2) there is not yet consensus on the interpretation of comprehensiveness for guiding the generation of scenarios; and (3) there is a need for sound approaches to the treatment of epistemic uncertainties. © 2017 Society for Risk Analysis.

  20. Communicating uncertainties in assessments of future sea level rise

    NASA Astrophysics Data System (ADS)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all the physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels by more than 60 meters if completely melted. There has been much confusion among policymakers about how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has largely focused on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between, on the one hand, the value-free ideal of science and, on the other hand, usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing

  1. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and of the different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formulation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines, raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004], which was further refined through lengthy discussions among many IPCC Lead Authors over more than a year, resulting in a short summary of a standard approach to be followed for that

  2. System Level Uncertainty Assessment for Collaborative RLV Design

    NASA Technical Reports Server (NTRS)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limited financial resources of both government and industry, strategic decision makers need more than traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace-industry-standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.

  3. A fuzzy model for assessing risk of occupational safety in the processing industry.

    PubMed

    Tadic, Danijela; Djapan, Marko; Misita, Mirjana; Stefanovic, Miladin; Milanovic, Dragan D

    2012-01-01

    Managing occupational safety in any kind of industry, especially in processing, is very important and complex. This paper develops a new method for occupational risk assessment in the presence of uncertainties. Uncertain values of hazardous factors and consequence frequencies are described with linguistic expressions defined by a safety management team. They are modeled with fuzzy sets. Consequence severities depend on current hazardous factors, and their values are calculated with the proposed procedure. The proposed model is tested with real-life data from fruit processing firms in Central Serbia.

  4. HSE's safety assessment principles for criticality safety.

    PubMed

    Simister, D N; Finnerty, M D; Warburton, S J; Thomas, E A; Macphail, M R

    2008-06-01

    The Health and Safety Executive (HSE) published its revised Safety Assessment Principles for Nuclear Facilities (SAPs) in December 2006. The SAPs are primarily intended for use by HSE's inspectors when judging the adequacy of safety cases for nuclear facilities. The revised SAPs relate to all aspects of safety in nuclear facilities including the technical discipline of criticality safety. The purpose of this paper is to set out for the benefit of a wider audience some of the thinking behind the final published words and to provide an insight into the development of UK regulatory guidance. The paper notes that it is HSE's intention that the Safety Assessment Principles should be viewed as a reflection of good practice in the context of interpreting primary legislation such as the requirements under site licence conditions for arrangements for producing an adequate safety case and for producing a suitable and sufficient risk assessment under the Ionising Radiations Regulations 1999 (SI1999/3232 www.opsi.gov.uk/si/si1999/uksi_19993232_en.pdf).

  5. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    NASA Astrophysics Data System (ADS)

    Kim, Ho Sung

    2013-12-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz. examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final year project thesis assessment, including peer assessment. A guide map can be generated by the method for finding expected uncertainties prior to the assessment implementation with a given set of variables. It employs a scale for visualisation of expertise levels, the derivation of which is based on quantified clarities of mental images for levels of the examiner's expertise and the examinee's expertise achieved. To identify the relevant expertise areas that depend on the complexity in assessment format, a graphical continuum model was developed. The continuum model consists of assessment task, assessment standards and criterion for the transition towards the complex assessment, owing to the relativity between implicitness and explicitness, and is capable of identifying the areas of expertise required for scale development.

  6. MANAGING UNCERTAINTIES ASSOCIATED WITH RADIOACTIVE WASTE DISPOSAL: TASK GROUP 4 OF THE IAEA PRISM PROJECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seitz, R.

    2011-03-02

    It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussion of the current results of the task group's work on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed, and some initial proposals for approaches that can be used to manage different types of uncertainties.

  7. Uncertainty analysis in geospatial merit matrix–based hydropower resource assessment

    DOE PAGES

    Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; ...

    2016-03-30

    Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along the stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates the uncertainties and their effects on model output parameters: the total potential power and the number of potential locations (stream-reach). These parameters are quantified through Monte Carlo Simulation (MCS) linked with a geospatial merit matrix-based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. The output parameters are more sensitive to hydraulic head in steep terrain than in flat and mild terrains, and more sensitive to mean annual streamflow in flat terrain.
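
    As a minimal illustration of the Monte Carlo propagation described above, the sketch below perturbs hydraulic head and mean annual streamflow for a single hypothetical stream reach and propagates them to potential power; the nominal values and coefficients of variation are assumptions, and the actual GMM-HRA model, which aggregates many reaches and applies merit criteria, is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Nominal stream-reach values (hypothetical)
        H_nom, Q_nom = 12.0, 8.5          # hydraulic head [m], mean annual flow [m^3/s]
        rho, g, eta = 1000.0, 9.81, 0.85  # water density, gravity, assumed efficiency

        # Input uncertainty expressed as coefficients of variation (illustrative)
        H = rng.normal(H_nom, 0.20 * H_nom, N).clip(min=0.1)
        Q = rng.normal(Q_nom, 0.16 * Q_nom, N).clip(min=0.1)

        P = rho * g * eta * Q * H / 1e6   # potential power [MW]

        print(f"mean potential power : {P.mean():.2f} MW")
        print(f"output CV            : {P.std() / P.mean():.1%}")
        print(f"90% interval         : [{np.percentile(P, 5):.2f}, {np.percentile(P, 95):.2f}] MW")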

  8. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    PubMed

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates are further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.
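
    A minimal sketch of the kind of Bayesian dose estimation the record refers to is shown below for the dicentric chromosome assay: a Poisson likelihood for the aberration count is combined with a flat prior on dose over a grid. The linear-quadratic calibration coefficients and the example counts are illustrative assumptions, not values from the cited projects.

        import numpy as np
        from scipy.stats import poisson

        # Illustrative linear-quadratic calibration curve for dicentrics per cell
        C, alpha, beta = 0.001, 0.02, 0.06   # per cell; per Gy; per Gy^2

        def posterior_dose(n_cells, n_dicentrics, d_grid=np.linspace(0.0, 6.0, 601)):
            """Grid posterior for absorbed dose given a dicentric count,
            assuming Poisson scoring statistics and a flat prior on dose."""
            lam = n_cells * (C + alpha * d_grid + beta * d_grid ** 2)
            like = poisson.pmf(n_dicentrics, lam)
            post = like / like.sum()              # normalized over the grid
            mean = (d_grid * post).sum()
            cdf = np.cumsum(post)
            lo = d_grid[np.searchsorted(cdf, 0.025)]
            hi = d_grid[np.searchsorted(cdf, 0.975)]
            return mean, (lo, hi)

        mean, ci = posterior_dose(n_cells=500, n_dicentrics=25)
        print(f"dose estimate: {mean:.2f} Gy, 95% credible interval: {ci[0]:.2f}-{ci[1]:.2f} Gy")

    The credible interval carries through to triage: reporting it alongside the point estimate, rather than the point estimate alone, is what keeps a borderline case from being placed in the wrong dosimetric category.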

  9. Offshore safety case approach and formal safety assessment of ships.

    PubMed

    Wang, J

    2002-01-01

    Tragic marine and offshore accidents have caused serious consequences, including loss of lives, loss of property, and damage to the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail, with particular reference to the design aspects. The current practices and the latest developments in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. Recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.

  10. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, which will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum with each component of the inner sum (an individual CCDF
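
    Step c) above mentions propagation by first-order second-moment (FOSM) methods or by Monte Carlo. The sketch below contrasts the two on a deliberately simple, hypothetical prediction function of two uncertain parameters; the model and the parameter statistics are illustrative stand-ins, not the Hanford site-wide groundwater model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy prediction: a concentration-like output as a function of hydraulic
        # conductivity K and recharge R (purely illustrative model)
        def predict(K, R):
            return 50.0 * R / (K + 1.0)

        # Uncertain parameters (means and standard deviations are assumptions)
        mu = np.array([10.0, 2.0])   # K [m/d], R [cm/yr]
        sd = np.array([3.0, 0.5])

        # First-order second-moment (FOSM) propagation via central differences
        eps = 1e-4
        grad = np.array([
            (predict(mu[0] + eps, mu[1]) - predict(mu[0] - eps, mu[1])) / (2 * eps),
            (predict(mu[0], mu[1] + eps) - predict(mu[0], mu[1] - eps)) / (2 * eps),
        ])
        sd_fosm = np.sqrt(np.sum((grad * sd) ** 2))   # assumes independent inputs

        # Monte Carlo propagation (conductivity clipped to stay physically positive)
        K = rng.normal(mu[0], sd[0], 50_000).clip(min=0.5)
        R = rng.normal(mu[1], sd[1], 50_000)
        samples = predict(K, R)

        print(f"FOSM        : mean = {predict(*mu):.3f}, sd = {sd_fosm:.3f}")
        print(f"Monte Carlo : mean = {samples.mean():.3f}, sd = {samples.std():.3f}")

    For a nearly linear model the two agree closely; the more nonlinear the model, the more the Monte Carlo result departs from the FOSM approximation, which is one criterion for deciding when FOSM is "applicable" in the sense used above.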

  12. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long-duration space missions, risks may approach radiation exposure limits; therefore, the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because pharmaceutical countermeasures are expected to play only a limited role, uncertainty reduction continues to be the optimal approach to improving radiation safety for space missions.

  13. A review of uncertainty research in impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  14. Assessment and uncertainty analysis of groundwater risk.

    PubMed

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels, land subsidence and saltwater intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mining regions, it is necessary to conduct an assessment of groundwater risk. In this paper, risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet decision makers' demand for risk information and overcome the limitation of previous risk assessments, whose results were expressed as deterministic point estimates that ignore the uncertainty of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. The importance of hydrological uncertainty assessment methods in climate change impact studies

    NASA Astrophysics Data System (ADS)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined what sources of uncertainty could be identified at all. This demonstrated that one could arrive at rather different conclusions about the causes behind

  16. The default response to uncertainty and the importance of perceived safety in anxiety and stress: An evolution-theoretical perspective.

    PubMed

    Brosschot, Jos F; Verkuil, Bart; Thayer, Julian F

    2016-06-01

    From a combined neurobiological and evolution-theoretical perspective, the stress response is a subcortically subserved response to uncertainty that is not 'generated' but 'default': the stress response is 'always there' but as long as safety is perceived, the stress response is under tonic prefrontal inhibition, reflected by high vagally mediated heart rate variability. Uncertainty of safety leads to disinhibiting the default stress response, even in the absence of threat. Due to the stress response's survival value, this 'erring on the side of caution' is passed to us via our genes. Thus, intolerance of uncertainty is not acquired during the life cycle, but is a given property of all living organisms, only to be alleviated in situations of which the safety is learned. When the latter is deficient, generalized unsafety ensues, which underlies chronic anxiety and stress and their somatic health risks, as well as other highly prevalent conditions carrying such risks, including loneliness, obesity, aerobic unfitness and old age. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. International survey for good practices in forecasting uncertainty assessment and communication

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

    Achieving technically sound flood forecasts is a crucial objective for forecasters, but forecasts are of little use if users do not properly understand their significance and do not use them properly in decision making. One common way to make the forecasts' limitations explicit is to communicate some information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. The French FFS plans to publish graphical streamflow and level forecasts, along with uncertainty assessments, on its website (available to the general public) in the near future. In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as through 'interviews' carried out by phone, email discussions or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. In the analysis drawn from the raw results, attention was paid to the particular context within which each institution works. Results show that most services interviewed assess their forecasts' uncertainty. However, practices can differ significantly from one country to another. Ensemble approaches are popular techniques, as they allow several uncertainty sources to be taken into account. Statistical analysis of past forecasts (such as quantile regression) is also commonly used. Contrary to what was expected, only a few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty

  18. Assessing the inherent uncertainty of one-dimensional diffusions

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Cohen, Morrel H.

    2013-01-01

    In this paper we assess the inherent uncertainty of one-dimensional diffusion processes via a stochasticity classification which provides an à la Mandelbrot categorization into five states of uncertainty: infra-mild, mild, borderline, wild, and ultra-wild. Two settings are considered. (i) Stopped diffusions: the diffusion initiates from a high level and is stopped once it first reaches a low level; in this setting we analyze the inherent uncertainty of the diffusion's maximal exceedance above its initial high level. (ii) Stationary diffusions: the diffusion is in dynamical statistical equilibrium; in this setting we analyze the inherent uncertainty of the diffusion's equilibrium level. In both settings general closed-form analytic results are established, and their application is exemplified by stock prices in the stopped-diffusions setting, and by interest rates in the stationary-diffusions setting. These results provide a highly implementable decision-making tool for the classification of uncertainty in the context of one-dimensional diffusions.

  19. Quantification for Complex Assessment: Uncertainty Estimation in Final Year Project Thesis Assessment

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2013-01-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final…

  20. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE PAGES

    Ilas, Germina; Liljenfeldt, Henrik

    2017-05-19

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  1. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Liljenfeldt, Henrik

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  2. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (laboratory or in situ surveys), improving the measurement methods or evaluating calculation procedures with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e
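
    The abstract ends by pointing to possibility distributions as a more flexible representation when information is scarce. As a hedged illustration of what that buys, the sketch below encodes an expert's judgment about a safety factor as a triangular possibility distribution and computes the possibility and necessity of failure (SF below a threshold); the numbers are invented for the example, and this is only one simple instance of the possibilistic tools the authors review.

        import numpy as np

        def triangular_membership(x, a, m, b):
            """Membership function of a triangular possibility distribution (a, m, b)."""
            x = np.asarray(x, dtype=float)
            left = np.clip((x - a) / (m - a), 0.0, 1.0)
            right = np.clip((b - x) / (b - m), 0.0, 1.0)
            return np.minimum(left, right)

        def possibility_necessity_of_failure(a, m, b, threshold=1.0, n=10001):
            """Possibility and necessity that the safety factor falls below the threshold."""
            x = np.linspace(min(a, threshold) - 1.0, max(b, threshold) + 1.0, n)
            mu = triangular_membership(x, a, m, b)
            fail = x < threshold
            poss = mu[fail].max() if fail.any() else 0.0
            nec = 1.0 - (mu[~fail].max() if (~fail).any() else 0.0)
            return poss, nec

        # Safety factor judged "around 1.3, surely between 0.9 and 1.8" (illustrative elicitation)
        poss, nec = possibility_necessity_of_failure(a=0.9, m=1.3, b=1.8, threshold=1.0)
        print(f"possibility of failure: {poss:.2f}, necessity of failure: {nec:.2f}")

    The pair [necessity, possibility] brackets any probability of failure consistent with the imprecise information, which is the kind of statement a single probability value cannot make when data are very scarce.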

  3. Uncertainty and Risk Assessment in the Design Process for Wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick R.

    This report summarizes the concepts and opinions that emerged from an initial study on the subject of uncertainty in wind design that included expert elicitation during a workshop held at the National Wind Technology Center at the National Renewable Energy Laboratory, July 12-13, 2016. In this paper, five major categories of uncertainties are identified. The first category is associated with direct impacts on turbine loads (i.e., the inflow including extreme events, aero-hydro-servo-elastic response, soil-structure interaction, and load extrapolation). The second category encompasses material behavior and strength. Site suitability and due-diligence aspects pertain to the third category. Calibration of partial safety factors and optimal reliability levels make up the fourth one. The last category is associated with uncertainties in computational modeling. The main sections of this paper follow this organization.

  4. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed through the comparison of these maps with a reference database, which is intended to represent the "real" land cover, with this comparison reported via thematic accuracy measures derived from confusion matrices. However, these reference databases are also a representation of reality and contain errors due to human uncertainty in assigning the land cover class that best characterizes a certain area, which biases the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that this uncertainty may have on the thematic accuracy measures reported to end users. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we study the utility of fuzzy set theory, more precisely fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases and its impact on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that address the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated using a case study in which the accuracy of a land cover map for Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) data is assessed. The results demonstrate that the inclusion of human uncertainty in reference databases provides much more information about the quality of land cover maps than the traditional approach to accuracy assessment.
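
    The dissertation propagates labelling uncertainty through fuzzy confusion matrices; the sketch below shows a simplified, interval-valued version of that idea (equivalent to working with a single alpha-cut of the fuzzy numbers). The confusion matrix and the interval half-widths are invented for illustration.

        import numpy as np

        # Crisp confusion matrix (rows = map classes, columns = reference classes)
        crisp = np.array([[48.0,  4.0,  2.0],
                          [ 6.0, 55.0,  5.0],
                          [ 1.0,  3.0, 38.0]])

        # Interval-valued counts: each cell becomes [low, high] to encode the
        # labeller's uncertainty about the reference class (illustrative +/- 3 samples)
        low = np.maximum(crisp - 3.0, 0.0)
        high = crisp + 3.0

        def interval_overall_accuracy(low, high):
            """Bounds on overall accuracy when each confusion-matrix cell is an interval.
            The lower bound pairs the fewest agreements with the most disagreements,
            and the upper bound does the opposite."""
            diag_lo, diag_hi = np.trace(low), np.trace(high)
            off_lo = low.sum() - np.trace(low)
            off_hi = high.sum() - np.trace(high)
            return diag_lo / (diag_lo + off_hi), diag_hi / (diag_hi + off_lo)

        oa_lo, oa_hi = interval_overall_accuracy(low, high)
        print(f"crisp overall accuracy    : {np.trace(crisp) / crisp.sum():.3f}")
        print(f"interval overall accuracy : [{oa_lo:.3f}, {oa_hi:.3f}]")

    Reporting the accuracy as an interval rather than a single number is the simplest way to show end users how much of the headline figure rests on uncertain reference labels.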

  5. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: they provide generic guidance and/or express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users climate-impact-indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of the different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' of the respective indicator. To meet users' requirements for the effective communication of uncertainties, user feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  6. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to keff sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.

  7. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  8. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
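
    A minimal sketch of the fuzzy, non-probabilistic propagation described above is given below: triangular fuzzy inputs for an indoor-ventilation exposure scenario are pushed through a simple burden calculation interval by interval (alpha-cut by alpha-cut). The model form, parameter names and numbers are illustrative assumptions, not the framework's actual exposure-response evidence.

        import numpy as np

        def alpha_cut(tri, alpha):
            """Interval (alpha-cut) of a triangular fuzzy number tri = (low, mode, high)."""
            a, m, b = tri
            return a + alpha * (m - a), b - alpha * (b - m)

        # Illustrative triangular fuzzy inputs for a 'poor ventilation' scenario
        exposed_pop   = (40_000, 60_000, 90_000)   # people exposed
        baseline_rate = (0.010, 0.015, 0.022)      # baseline annual morbidity rate
        excess_risk   = (0.05, 0.15, 0.30)         # relative excess risk from exposure

        def fuzzy_burden(alphas):
            """Propagate the fuzzy inputs through burden = pop * rate * excess.
            The model is monotone increasing in every input, so interval endpoints
            map directly to endpoints of the output interval."""
            out = []
            for a in alphas:
                p = alpha_cut(exposed_pop, a)
                r = alpha_cut(baseline_rate, a)
                e = alpha_cut(excess_risk, a)
                out.append((a, p[0] * r[0] * e[0], p[1] * r[1] * e[1]))
            return out

        for a, lo, hi in fuzzy_burden([0.0, 0.5, 1.0]):
            print(f"alpha = {a:.1f}: attributable cases per year in [{lo:,.0f}, {hi:,.0f}]")

    The alpha = 1 interval collapses to the most plausible single value, while the alpha = 0 interval spans everything the fuzzy inputs regard as at all possible, giving a best estimate together with its uncertainty envelope without requiring probability distributions.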

  9. Uncertainty after treatment for prostate cancer: definition, assessment, and management.

    PubMed

    Yu Ko, Wellam F; Degner, Lesley F

    2008-10-01

    Prostate cancer is the second most common type of cancer in men living in the United States and the most common type of malignancy in Canadian men, accounting for 186,320 new cases in the United States and 24,700 in Canada in 2008. Uncertainty, a component of all illness experiences, influences how men perceive the processes of treatment and adaptation. The Reconceptualized Uncertainty in Illness Theory explains the chronic nature of uncertainty in cancer survivorship by describing a shift from an emergent acute phase of uncertainty in survivors to a new level of uncertainty that is no longer acute and becomes a part of daily life. Proper assessment of certainty and uncertainty may allow nurses to maximize the effectiveness of patient-provider communication, cognitive reframing, and problem-solving interventions to reduce uncertainty after cancer treatment.

  10. Nuclear Data Uncertainty Propagation to Reactivity Coefficients of a Sodium Fast Reactor

    NASA Astrophysics Data System (ADS)

    Herrero, J. J.; Ochoa, R.; Martínez, J. S.; Díez, C. J.; García-Herranz, N.; Cabellos, O.

    2014-04-01

    The assessment of the uncertainty levels in the design and safety parameters of the innovative European Sodium Fast Reactor (ESFR) is mandatory. Some of these relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified here. In addition, the nuclear reaction data for which improvements would most benefit the design accuracy are identified. This work was performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.

  11. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats well the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach to the uncertainty assessment of conceptual hydrological models that takes the extreme flows into account. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and by traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimation over the entire flow range and in terms of application and computational efficiency. The study thus introduces a new Bayesian approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
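
    As a compact, generic illustration of the Metropolis-Hastings machinery the study relies on, the sketch below calibrates two parameters of a toy linear-reservoir model against synthetic discharge with an independent Gaussian error model; WASMOD itself, the AR(1)-based likelihoods and the modularization step are not reproduced, and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic "observed" discharge generated from a toy linear-reservoir model
        precip = rng.gamma(2.0, 2.0, 200)

        def simulate(k, c, p=precip):
            """Toy rainfall-runoff model: a fraction c of rain enters a store that
            releases a fraction k of its content at each time step."""
            q, s = np.empty_like(p), 0.0
            for i, x in enumerate(p):
                s += c * x
                q[i] = k * s
                s -= q[i]
            return q

        true_k, true_c, sigma = 0.3, 0.6, 0.4
        q_obs = simulate(true_k, true_c) + rng.normal(0.0, sigma, precip.size)

        def log_posterior(theta):
            k, c = theta
            if not (0.0 < k < 1.0 and 0.0 < c < 1.0):
                return -np.inf                      # flat prior on (0, 1) x (0, 1)
            resid = q_obs - simulate(k, c)
            return -0.5 * np.sum(resid ** 2) / sigma ** 2

        # Random-walk Metropolis-Hastings
        theta = np.array([0.5, 0.5])
        lp = log_posterior(theta)
        chain = []
        for _ in range(5000):
            prop = theta + rng.normal(0.0, 0.02, 2)
            lp_prop = log_posterior(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta.copy())

        chain = np.array(chain[1000:])              # discard burn-in
        print("posterior means      :", chain.mean(axis=0).round(3))
        print("posterior 95% bounds :", np.percentile(chain, [2.5, 97.5], axis=0).round(3))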

  12. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading: overly conservative or optimistic. 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) Aleatory Uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failures (MTBF); (b) Epistemic Uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.
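
    The aleatory/epistemic distinction drawn above can be made concrete with a small numeric sketch: aleatory scatter is represented by a Poisson demand for spares over the mission, while epistemic uncertainty is represented by a distribution over the ORU failure rate itself. This is a generic illustration of the two uncertainty categories, not the PACT evaluation, and all numbers are hypothetical.

        import numpy as np
        from scipy.stats import poisson, gamma

        rng = np.random.default_rng(3)

        # Hypothetical ORU: nominal MTBF of 20,000 hours over a 2-year horizon
        mission_hours = 2 * 8760
        mtbf_nominal = 20_000.0

        # Epistemic uncertainty on the failure rate: gamma distribution equivalent to
        # having observed 4 failures in 80,000 operating hours (illustrative numbers)
        rate_dist = gamma(a=4, scale=1.0 / 80_000)

        def prob_sufficient(n_spares, n_epistemic=20_000):
            """P(demand <= spares), with and without epistemic uncertainty on the rate."""
            # Aleatory only: Poisson demand at the nominal rate
            p_aleatory = poisson.cdf(n_spares, mission_hours / mtbf_nominal)
            # Aleatory + epistemic: average the Poisson cdf over sampled rates
            rates = rate_dist.rvs(n_epistemic, random_state=rng)
            p_both = poisson.cdf(n_spares, rates * mission_hours).mean()
            return p_aleatory, p_both

        for spares in range(4):
            pa, pb = prob_sufficient(spares)
            print(f"{spares} spares: aleatory only {pa:.3f}, with epistemic uncertainty {pb:.3f}")

    At the higher spares levels, where confidence targets are usually set, folding in the epistemic spread lowers the apparent confidence, which is the effect the critique of fixed confidence targets points at.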

  13. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to
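
    The final integration step mentions simple model averaging across the conditioned models. A hedged, minimal sketch of one common way to do this (equal weights, with within-model and between-model variance combined) is shown below; the biomass numbers and standard errors are invented, and the actual framework's weighting and conditioning choices are not reproduced.

        import numpy as np

        # Spawning stock biomass estimates (kt) from a suite of assessment models that
        # differ in growth, natural mortality and stock-recruit assumptions (illustrative)
        estimates = np.array([38.0, 45.0, 41.5, 52.0])      # point estimate per model
        std_errors = np.array([4.0, 5.5, 4.5, 7.0])         # estimation uncertainty per model
        weights = np.full(estimates.size, 1.0 / estimates.size)  # simple (equal) averaging

        mean = np.sum(weights * estimates)
        within = np.sum(weights * std_errors ** 2)           # estimation uncertainty
        between = np.sum(weights * (estimates - mean) ** 2)  # model (structural) uncertainty
        total_sd = np.sqrt(within + between)

        print(f"model-averaged SSB : {mean:.1f} kt")
        print(f"combined sd        : {total_sd:.1f} kt "
              f"(within-model {np.sqrt(within):.1f}, between-model {np.sqrt(between):.1f})")

    Keeping the within-model and between-model components separate shows how much of the overall uncertainty would remain even if each individual assessment fit were exact.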

  14. Correlation between safety climate and contractor safety assessment programs in construction

    PubMed Central

    Sparer, EH; Murphy, LA; Taylor, KM; Dennerlein, JT

    2015-01-01

    Background Contractor safety assessment programs (CSAPs) measure safety performance by integrating multiple data sources together; however, the relationship between these measures of safety performance and safety climate within the construction industry is unknown. Methods 401 construction workers employed by 68 companies on 26 sites and 11 safety managers employed by 11 companies completed brief surveys containing a nine-item safety climate scale developed for the construction industry. CSAP scores from ConstructSecure, Inc., an online CSAP database, classified these 68 companies as high or low scorers, with the median score of the sample population as the threshold. Spearman rank correlations evaluated the association between the CSAP score and the safety climate score at the individual level, as well as with various grouping methodologies. In addition, Spearman correlations evaluated the comparison between manager-assessed safety climate and worker-assessed safety climate. Results There were no statistically significant differences between safety climate scores reported by workers in the high and low CSAP groups. There were, at best, weak correlations between workers’ safety climate scores and the company CSAP scores, with marginal statistical significance with two groupings of the data. There were also no significant differences between the manager-assessed safety climate and the worker-assessed safety climate scores. Conclusions A CSAP safety performance score does not appear to capture safety climate, as measured in this study. The nature of safety climate in construction is complex, which may be reflective of the challenges in measuring safety climate within this industry. PMID:24038403
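
    The analysis hinges on Spearman rank correlations between company-level CSAP scores and safety climate scores. The snippet below shows the shape of that calculation with scipy on synthetic data; the numbers are generated for illustration and are not the study's data.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(11)

        # Synthetic company-level data (illustrative only)
        n_companies = 68
        csap_score = rng.uniform(0, 100, n_companies)
        # Worker-assessed safety climate only weakly related to the CSAP score
        safety_climate = 0.02 * csap_score + rng.normal(30.0, 4.0, n_companies)

        rho, p = spearmanr(csap_score, safety_climate)
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

        # Dichotomize companies at the median CSAP score, as in the study design
        high = (csap_score >= np.median(csap_score)).astype(int)
        rho_grp, p_grp = spearmanr(high, safety_climate)
        print(f"high/low CSAP grouping: rho = {rho_grp:.2f}, p = {p_grp:.3f}")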

  15. Integrating Safety Assessment Methods using the Risk Informed Safety Margins Characterization (RISMC) Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; Diego Mandelli

    Safety is central to the design, licensing, operation, and economics of nuclear power plants (NPPs). As the current light water reactor (LWR) NPPs age beyond 60 years, there are possibilities for increased frequency of systems, structures, and components (SSC) degradations or failures that initiate safety-significant events, reduce existing accident mitigation capabilities, or create new failure modes. Plant designers commonly “over-design” portions of NPPs and provide robustness in the form of redundant and diverse engineered safety features to ensure that, even in the case of well-beyond design basis scenarios, public health and safety will be protected with a very high degree of assurance. This form of defense-in-depth is a reasoned response to uncertainties and is often referred to generically as “safety margin.” Historically, specific safety margin provisions have been formulated primarily based on engineering judgment backed by a set of conservative engineering calculations. The ability to better characterize and quantify safety margin is important to improved decision making about LWR design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development (R&D) in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the RISMC Pathway provides methods and tools that enable mitigation options known as margins management strategies. The purpose of the RISMC Pathway R&D is to support plant decisions for risk

  16. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized with high dimensionality and tremendous complexity where various physics models are integrated in the form of coupled models (e.g. neutronic with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved via identifying the important/influential degrees of freedom (DoF) via the subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. Hence, in this work a Karhunen-Loeve (KL

  17. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
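
    A minimal sketch of the scenario-partitioning step described above, assuming hypothetical scenario parameters (service life, recycled-content fraction) and a synthetic rule for which scenarios are "resolved"; the decision tree only illustrates how such an algorithm isolates meaningful scenario groups and is not the authors' implementation.

        # Hypothetical illustration: a decision tree separates "resolved" scenarios
        # (one alternative clearly preferable) from unresolved ones.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)
        n = 1000
        scenarios = np.column_stack([
            rng.uniform(0, 50, n),      # e.g. service life in years (assumed parameter)
            rng.uniform(0.1, 0.9, n),   # e.g. recycled-content fraction (assumed parameter)
        ])
        # Toy rule: a scenario counts as resolved when >90% of probabilistic draws
        # favour one alternative; here that probability is a synthetic function.
        prob_a_better = 1.0 / (1.0 + np.exp(-(scenarios[:, 0] - 25.0) / 5.0))
        resolved = (prob_a_better > 0.9) | (prob_a_better < 0.1)

        tree = DecisionTreeClassifier(max_depth=3).fit(scenarios, resolved)
        print(export_text(tree, feature_names=["service_life", "recycled_fraction"]))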

  18. Assessing student understanding of measurement and uncertainty

    NASA Astrophysics Data System (ADS)

    Jirungnimitsakul, S.; Wattanakasiwich, P.

    2017-09-01

    The objectives of this study were to develop and assess student understanding of measurement and uncertainty. A test was adapted and translated from the Laboratory Data Analysis Instrument (LDAI) test; it consists of 25 questions covering three topics: measures of central tendency, experimental errors and uncertainties, and fitting regression lines. Its content validity was evaluated by three experts in physics laboratory teaching. In the pilot study, the Thai LDAI was administered to 93 freshmen enrolled in a fundamental physics laboratory course. The final draft of the test was administered to three groups—45 freshmen taking fundamental physics laboratory, 16 sophomores taking intermediate physics laboratory, and 21 juniors taking advanced physics laboratory at Chiang Mai University. The results showed that the freshmen had difficulties with experimental errors and uncertainties, and most students had problems with fitting regression lines. These results will be used to improve the teaching and learning of physics laboratories for physics students in the department.

  19. Assessing model uncertainty using hexavalent chromium and ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective of this analysis is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses. Methods: This analysis compared 7 publications analyzing two different chromate production sites in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients for 7 similar analyses ranged from 2.47

  20. Correlation between safety climate and contractor safety assessment programs in construction.

    PubMed

    Sparer, Emily H; Murphy, Lauren A; Taylor, Kathryn M; Dennerlein, Jack T

    2013-12-01

    Contractor safety assessment programs (CSAPs) measure safety performance by integrating multiple data sources; however, the relationship between these measures of safety performance and safety climate within the construction industry is unknown. Four hundred and one construction workers employed by 68 companies on 26 sites and 11 safety managers employed by 11 companies completed brief surveys containing a nine-item safety climate scale developed for the construction industry. CSAP scores from ConstructSecure, Inc., an online CSAP database, classified these 68 companies as high or low scorers, with the median score of the sample population as the threshold. Spearman rank correlations evaluated the association between the CSAP score and the safety climate score at the individual level, as well as with various grouping methodologies. In addition, Spearman correlations evaluated the comparison between manager-assessed safety climate and worker-assessed safety climate. There were no statistically significant differences between safety climate scores reported by workers in the high and low CSAP groups. There were, at best, weak correlations between workers' safety climate scores and the company CSAP scores, with marginal statistical significance for two groupings of the data. There were also no significant differences between the manager-assessed safety climate and the worker-assessed safety climate scores. A CSAP safety performance score does not appear to capture safety climate, as measured in this study. The nature of safety climate in construction is complex, which may reflect the challenges of measuring safety climate within this industry. Am. J. Ind. Med. 56:1463-1472, 2013. © 2013 Wiley Periodicals, Inc.
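
    A minimal sketch of the rank-based analysis described above, using hypothetical company-level scores for 68 companies rather than the study's data; the Mann-Whitney comparison of the median-split groups is included purely for illustration and is an assumption, not necessarily the authors' test.

        # Hypothetical data: one CSAP score and one mean safety-climate score per company.
        import numpy as np
        from scipy.stats import spearmanr, mannwhitneyu

        rng = np.random.default_rng(0)
        csap = rng.uniform(40, 95, size=68)
        climate = rng.normal(3.5, 0.6, size=68)

        rho, p = spearmanr(csap, climate)             # association between the two measures
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

        # Median split into high/low CSAP groups, then compare climate scores between groups.
        high = climate[csap >= np.median(csap)]
        low = climate[csap < np.median(csap)]
        print(mannwhitneyu(high, low))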

  1. Assessing performance of flaw characterization methods through uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Miorelli, R.; Le Bourdais, F.; Artusi, X.

    2018-04-01

    In this work, we assess the inversion performance in terms of crack characterization and localization based on synthetic signals associated with ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to speed up the computation and avoid the computational burden often associated with iterative inversion algorithms, we replace the standard forward solver by a suitable metamodel fitted to a database built offline. In a second step, we assess the inversion performance by adding uncertainties on a subset of the database parameters and then, through the metamodel, we propagate these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficient evaluation of the impact due to the lack of knowledge of some parameters used to describe the inspection scenarios, a situation commonly encountered in the industrial NDE context.
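
    A minimal sketch of the metamodel-assisted inversion idea, assuming a hypothetical one-parameter forward model: a Gaussian-process surrogate is fitted to an offline database and then used in place of the full solver inside a bounded search for the crack parameter. The forward model, training grid and optimizer are illustrative assumptions, not the paper's UT/EC solvers.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from sklearn.gaussian_process import GaussianProcessRegressor

        def forward_model(depth):                      # stand-in for an expensive simulation
            return np.sin(2.0 * depth) + 0.3 * depth

        train_x = np.linspace(0.0, 3.0, 40).reshape(-1, 1)   # database built offline
        train_y = forward_model(train_x.ravel())
        metamodel = GaussianProcessRegressor().fit(train_x, train_y)

        measured = forward_model(1.7) + 0.01                 # the "tested data"
        misfit = lambda d: (metamodel.predict([[d]])[0] - measured) ** 2
        result = minimize_scalar(misfit, bounds=(0.0, 3.0), method="bounded")
        print("estimated crack depth:", result.x)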

  2. Experimental uncertainty survey and assessment. [Space Shuttle Main Engine testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.

    1992-01-01

    An uncertainty analysis and assessment of the specific impulse determination during Space Shuttle Main Engine testing is reported. It is concluded that in planning and designing tests and in interpreting the results of tests, the bias and precision components of experimental uncertainty should be considered separately. Recommendations for future research efforts are presented.

  3. Assessing uncertainty in published risk estimates using ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective is to characterize model uncertainty by evaluating estimates across published epidemiologic studies of the same cohort. Methods: This analysis was based on 5 studies analyzing a cohort of 2,357 workers employed from 1950-74 in a chromate production plant in Maryland. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability within and between model forms. A total of 5 similarly parameterized analyses were considered across model form, and 16 analyses with alternative parameterizations were considered within model form (10 Cox; 6 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients (betas) for 5 similar analyses ranged from 2.47 to 4.33 (mean=2.97, σ²=0.63). Within the 10 Cox models, coefficients ranged from 2.53 to 4.42 (mean=3.29, σ²=0.

  4. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
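
    An illustrative sketch of the probabilistic-sampling idea: asteroid properties are drawn from assumed distributions and pushed through a toy energy and damage relation to produce a distribution of outcomes. The distributions and the cube-root damage scaling are placeholders, not the ERA team's entry, breakup and impact models.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        diameter_m = rng.lognormal(mean=np.log(50), sigma=0.5, size=n)   # assumed size distribution
        density = rng.uniform(1500, 3500, size=n)                        # kg/m^3
        velocity = rng.normal(20e3, 3e3, size=n)                         # m/s

        energy_j = 0.5 * density * (np.pi / 6) * diameter_m**3 * velocity**2
        energy_mt = energy_j / 4.184e15                                  # megatons TNT equivalent
        damage_radius_km = 2.0 * energy_mt ** (1.0 / 3.0)                # toy scaling law

        print("median damage radius:", np.median(damage_radius_km), "km")
        print("95th percentile     :", np.percentile(damage_radius_km, 95), "km")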

  5. Quantification of uncertainties in global grazing systems assessment

    NASA Astrophysics Data System (ADS)

    Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.

    2017-07-01

    Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of the grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder, it is low to very low. Largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the Taiga belt). In some regions like India or Western Europe, massive uncertainties even result in GI > 100% estimates. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in quality of the available global level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change or food security.
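
    A minimal sketch of the grazing-intensity (GI) calculation as the ratio of livestock feed intake to grazing-land NPP, evaluated for every combination of the input data sets; the arrays below are random placeholders standing in for the gridded feed-intake, NPP and grazing-extent products.

        import itertools
        import numpy as np

        rng = np.random.default_rng(2)
        feed_intake_maps = [rng.random((10, 10)) * 50 for _ in range(2)]             # gC/m2/yr consumed
        npp_maps = [rng.random((10, 10)) * 400 + 100 for _ in range(3)]              # gC/m2/yr produced
        grazed_fraction_maps = [rng.random((10, 10)) * 0.9 + 0.1 for _ in range(2)]  # grazed share of cell

        gi_maps = []
        for intake, npp, frac in itertools.product(feed_intake_maps, npp_maps, grazed_fraction_maps):
            gi_maps.append(intake / (npp * frac))        # grazing intensity per grid cell

        stack = np.stack(gi_maps)                        # 2 x 3 x 2 = 12 map combinations
        print("mean GI across all maps:", stack.mean())
        print("mean cell-wise spread (max - min):", (stack.max(0) - stack.min(0)).mean())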

  6. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    NASA Astrophysics Data System (ADS)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted in geodesy. The second step consists of Monte-Carlo-Simulations (MC-simulations) for the complete processing chain from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud that represents the uncertainty of the estimated coordinates; a confidence region can be assigned to this point cloud as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by using their covariance matrix. It offers a new way of assessing uncertainty in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example the local tie network in "Metsähovi Fundamental Station", Finland is used, where classical geodetic observations are combined with GNSS data.
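
    A minimal sketch of the two-step idea on a hypothetical three-observation levelling network: observation noise and one additional influence factor are drawn from assumed (GUM-style) distributions, a least-squares adjustment is run for each draw, and the spread of the resulting coordinate estimates plays the role of the point cloud described above.

        import numpy as np

        rng = np.random.default_rng(7)
        # Design matrix for two unknown heights observed from a fixed benchmark.
        A = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [-1.0, 1.0]])
        true_x = np.array([10.000, 10.500])
        obs_true = A @ true_x

        estimates = []
        for _ in range(5000):
            obs = obs_true + rng.normal(0.0, 0.002, size=3)   # observation noise (m)
            obs += rng.uniform(-0.001, 0.001, size=3)         # an assumed influence factor
            x_hat, *_ = np.linalg.lstsq(A, obs, rcond=None)
            estimates.append(x_hat)

        estimates = np.array(estimates)
        print("empirical standard deviation of the coordinates:", estimates.std(axis=0))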

  7. Assessing measurement uncertainty in meteorology in urban environments

    NASA Astrophysics Data System (ADS)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  8. Incorporating climate-system and carbon-cycle uncertainties in integrated assessments of climate change. (Invited)

    NASA Astrophysics Data System (ADS)

    Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.

    2013-12-01

    The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon-cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used for the representation of geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis that attempts to integrate socio-economic and geophysical uncertainties. These uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.

  9. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that CNG buses are more prone to fire fatality risk by 2.5 times that of diesel buses, with the bus passengers being more at risk by over two orders of magnitude. The study estimates a mean fire risk frequency of 2.2 × 10⁻⁵ fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 × 10⁻⁶ and 4.0 × 10⁻⁵, respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.
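
    A quick arithmetic check of the ratios quoted above, using only the figures given in the abstract.

        cng_all, diesel_all = 0.23, 0.091      # fatalities per 100-million miles, all people
        cng_pax, diesel_pax = 0.16, 0.0007     # bus passengers only

        print(f"all people : {cng_all / diesel_all:.1f}x higher for CNG")    # ~2.5x
        print(f"passengers : {cng_pax / diesel_pax:.0f}x higher for CNG")    # over two orders of magnitude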

  10. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    NASA Astrophysics Data System (ADS)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors from disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating these through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model that supports their decision making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A

  11. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
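
    A minimal sketch of information-criterion-based model averaging of the kind referred to above, with hypothetical criterion values and per-model head predictions for four alternative models; the split of the total variance into within-model (parametric) and between-model parts shows why model uncertainty can dominate.

        import numpy as np

        ic = np.array([210.3, 212.1, 215.8, 211.0])         # hypothetical information-criterion value per model
        delta = ic - ic.min()
        weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()

        pred_mean = np.array([101.2, 100.4, 99.8, 101.9])   # hypothetical head prediction per model (m)
        pred_var = np.array([0.30, 0.25, 0.40, 0.35])       # parametric variance per model

        avg_mean = np.sum(weights * pred_mean)
        # Total variance = weighted within-model variance + between-model variance.
        avg_var = np.sum(weights * (pred_var + (pred_mean - avg_mean) ** 2))
        print(f"model-averaged head: {avg_mean:.2f} m, total variance: {avg_var:.3f}")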

  12. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    ERIC Educational Resources Information Center

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  13. Uncertainty factors in screening ecological risk assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, L.D.; Taggart, M.

    2000-06-01

    The hazard quotient (HQ) method is commonly used in screening ecological risk assessments (ERAs) to estimate risk to wildlife at contaminated sites. Many ERAs use uncertainty factors (UFs) in the HQ calculation to incorporate uncertainty associated with predicting wildlife responses to contaminant exposure using laboratory toxicity data. The overall objective was to evaluate the current UF methodology as applied to screening ERAs in California, USA. Specific objectives included characterizing current UF methodology, evaluating the degree of conservatism in UFs as applied, and identifying limitations to the current approach. Twenty-four of 29 evaluated ERAs used the HQ approach: 23 of these used UFs in the HQ calculation. All 24 made interspecies extrapolations, and 21 compensated for its uncertainty, most using allometric adjustments and some using RFs. Most also incorporated uncertainty for same-species extrapolations. Twenty-one ERAs used UFs extrapolating from lowest observed adverse effect level (LOAEL) to no observed adverse effect level (NOAEL), and 18 used UFs extrapolating from subchronic to chronic exposure. Values and application of all UF types were inconsistent. Maximum cumulative UFs ranged from 10 to 3,000. Results suggest UF methodology is widely used but inconsistently applied and is not uniformly conservative relative to UFs recommended in regulatory guidelines and academic literature. The method is limited by lack of consensus among scientists, regulators, and practitioners about magnitudes, types, and conceptual underpinnings of the UF methodology.
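
    A minimal sketch of a screening-level hazard quotient with uncertainty factors applied to a laboratory toxicity value; the dose, LOAEL and UF magnitudes are illustrative assumptions chosen only to show how cumulative UFs enter the calculation.

        daily_dose = 0.8             # estimated wildlife exposure, mg/kg body weight per day (assumed)
        loael = 15.0                 # laboratory LOAEL, mg/kg body weight per day (assumed)

        uf_loael_to_noael = 10       # LOAEL-to-NOAEL extrapolation
        uf_subchronic_to_chronic = 10
        uf_interspecies = 3          # e.g. residual factor after an allometric adjustment

        cumulative_uf = uf_loael_to_noael * uf_subchronic_to_chronic * uf_interspecies
        trv = loael / cumulative_uf  # adjusted toxicity reference value
        hq = daily_dose / trv
        print(f"cumulative UF = {cumulative_uf}, HQ = {hq:.1f}")   # HQ > 1 flags the site for further evaluation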

  14. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.
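
    A minimal sketch of a Metropolis-Hastings sampler for a single model parameter under a Gaussian error model; the one-parameter "model" and synthetic observations are placeholders, not WASMOD or the likelihood functions compared in the study.

        import numpy as np

        rng = np.random.default_rng(3)
        obs = rng.normal(2.0, 0.3, size=100)                  # synthetic "observed" flows
        model = lambda k: np.full_like(obs, k)                # stand-in for a rainfall-runoff model

        def log_posterior(k, sigma=0.3):
            if not 0.0 < k < 10.0:                            # uniform prior bounds
                return -np.inf
            return -0.5 * np.sum(((obs - model(k)) / sigma) ** 2)

        chain, k = [], 1.0
        for _ in range(20_000):
            proposal = k + rng.normal(0.0, 0.1)
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(k):
                k = proposal
            chain.append(k)

        posterior = np.array(chain[5000:])                    # discard burn-in
        print("posterior mean:", posterior.mean(), "95% CI:", np.percentile(posterior, [2.5, 97.5]))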

  15. Safety Auditing and Assessments

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald (Ronnie)

    2005-01-01

    Safety professionals typically do not engage in audits and independent assessments with the same vigor as our quality brethren. Taking advantage of industry and government experience in conducting value-added independent assessments or audits benefits a safety program. Most other organizations simply call this process "internal audits." Sources of audit training are presented and compared. The logical relation between audit techniques and mishap investigation is discussed. An example of an audit process is offered. Shortcomings and pitfalls of auditing are covered.

  16. Safety Auditing and Assessments

    NASA Astrophysics Data System (ADS)

    Goodin, Ronnie

    2005-12-01

    Safety professionals typically do not engage in audits and independent assessments with the same vigor as our quality brethren. Taking advantage of industry and government experience in conducting value-added independent assessments or audits benefits a safety program. Most other organizations simply call this process "internal audits." Sources of audit training are presented and compared. The logical relation between audit techniques and mishap investigation is discussed. An example of an audit process is offered. Shortcomings and pitfalls of auditing are covered.

  17. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources, including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining the Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
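
    A minimal sketch of block-bootstrap resampling of simulation residuals to build a confidence band around a simulated streamflow series; the series, residuals and block length are synthetic assumptions used only to illustrate the mechanics.

        import numpy as np

        rng = np.random.default_rng(11)
        sim = 10 + 2 * np.sin(np.linspace(0, 12 * np.pi, 365))      # simulated daily flow
        residuals = rng.normal(0, 0.8, size=sim.size)               # model residuals
        obs = sim + residuals

        block, n_boot = 30, 1000
        replicates = []
        for _ in range(n_boot):
            starts = rng.integers(0, sim.size - block, size=sim.size // block + 1)
            resampled = np.concatenate([residuals[s:s + block] for s in starts])[:sim.size]
            replicates.append(sim + resampled)

        lower, upper = np.percentile(replicates, [2.5, 97.5], axis=0)
        coverage = np.mean((obs >= lower) & (obs <= upper))
        print(f"empirical coverage of the 95% band: {coverage:.2f}")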

  18. Assessing the safety effects of cooperative intelligent transport systems: A bowtie analysis approach.

    PubMed

    Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören

    2017-02-01

    The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effect of cooperative intelligent transport systems. These seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment

    NASA Astrophysics Data System (ADS)

    Tesfamichael, A. A.; Jagath, K. J.

    2004-12-01

    Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Due to limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect on risk estimates. The methodology will be demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue to estimate the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be two- to fourfold higher than those predicted with the current approach, while intermediate risks may be two- to threefold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology showed a lower intermediate risk. A closed-form solution of propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and the sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provided good insight into the importance of addressing uncertainty in observed water quality data and the need to predict likely maximum residues in risk assessment by considering the propagation of uncertainties.

  20. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information system (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from index system establishment to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors that affect the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that the adoption of machine learning methods should be adapted to the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and promotes the scientific level of subsequent planning and decision-making.

  1. Assessment of herbal medicinal products: Challenges, and opportunities to increase the knowledge base for safety assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Scott A., E-mail: scott.jordan@hc-sc.gc.c; Cunningham, David G.; Marles, Robin J.

    Although herbal medicinal products (HMP) have been perceived by the public as relatively low risk, there has been more recognition of the potential risks associated with this type of product as the use of HMPs increases. Potential harm can occur via inherent toxicity of herbs, as well as from contamination, adulteration, plant misidentification, and interactions with other herbal products or pharmaceutical drugs. Regulatory safety assessment for HMPs relies on both the assessment of cases of adverse reactions and the review of published toxicity information. However, the conduct of such an integrated investigation has many challenges in terms of the quantity and quality of information. Adverse reactions are under-reported, product quality may be less than ideal, herbs have a complex composition and there is a lack of information on the toxicity of medicinal herbs or their constituents. Nevertheless, opportunities exist to capitalise on newer information to increase the current body of scientific evidence. Novel sources of information are reviewed, such as the use of poison control data to augment adverse reaction information from national pharmacovigilance databases, and the use of more recent toxicological assessment techniques such as predictive toxicology and omics. The integration of all available information can reduce the uncertainty in decision making with respect to herbal medicinal products. The example of Aristolochia and aristolochic acids is used to highlight the challenges related to safety assessment, and the opportunities that exist to more accurately elucidate the toxicity of herbal medicines.

  2. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements can only provide a qualitative and non-descript estimation of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision

  3. Assessing Uncertainties in a Simple and Cheap Experiment

    ERIC Educational Resources Information Center

    de Souza, Paulo A., Jr.; Brasil, Gutemberg Hespanha

    2009-01-01

    This paper describes how to calculate measurement uncertainties using as a practical example the assessment of the thickness of ping-pong balls and their material density. The advantages of a randomized experiment are also discussed. This experiment can be reproduced in the physics laboratory for undergraduate students. (Contains 7 tables, 1…
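
    A minimal sketch of the kind of calculation the experiment involves: the density of a thin spherical shell and its standard uncertainty from first-order (GUM-style) propagation; the measured values and uncertainties below are invented for illustration.

        import numpy as np

        m, u_m = 2.70e-3, 0.01e-3        # mass (kg) and its standard uncertainty
        d, u_d = 40.0e-3, 0.1e-3         # outer diameter (m)
        t, u_t = 0.40e-3, 0.02e-3        # wall thickness (m)

        def density(m_, d_, t_):
            r_out = d_ / 2.0
            return m_ / (4.0 / 3.0 * np.pi * (r_out**3 - (r_out - t_)**3))

        rho = density(m, d, t)

        # First-order propagation via numerical partial derivatives.
        eps = 1e-9
        grads = [(density(m + eps, d, t) - rho) / eps,
                 (density(m, d + eps, t) - rho) / eps,
                 (density(m, d, t + eps) - rho) / eps]
        u_rho = np.sqrt(sum((g * u) ** 2 for g, u in zip(grads, (u_m, u_d, u_t))))
        print(f"density = {rho:.0f} +/- {u_rho:.0f} kg/m^3")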

  4. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.

  5. Assessment of Electrical Safety in Afghanistan

    DTIC Science & Technology

    2009-07-24

    effectiveness of command efforts to ensure the electrical safety of Department of Defense occupied and constructed facilities in Afghanistan. We...March 31, 2009, we announced the Assessment of Electrical Safety in Afghanistan. The objective of this assessment was to review the effectiveness of...used contractors to review and identify electrical deficiencies to include life, health, and safety issues at FOBs. According to TF POWER

  6. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors are contributing to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines activity concentrations and atom ratios of radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms

  7. Assessing student understanding of measurement and uncertainty

    NASA Astrophysics Data System (ADS)

    Abbott, David Scot

    A test to assess student understanding of measurement and uncertainty has been developed and administered to more than 500 students at two large research universities. The aim is two-fold: (1) to assess what students learn in the first semester of introductory physics labs and (2) to uncover patterns in student reasoning and practice. The forty-minute, eleven-item test focuses on direct measurement and student attitudes toward multiple measurements. After one revision cycle using think-aloud interviews, the test was administered to three groups of students: students enrolled in traditional laboratory sections of first-semester physics at North Carolina State University (NCSU), students in an experimental (SCALE-UP) section of first-semester physics at NCSU, and students in first-semester physics at the University of North Carolina at Chapel Hill. The results were analyzed using a mixture of qualitative and quantitative methods. In the traditional NCSU labs, where students receive no instruction in uncertainty and measurement, students show no improvement in any of the areas examined by the test. In SCALE-UP and at UNC, students show statistically significant gains in most areas of the test. Gains on specific test items in SCALE-UP and at UNC correspond to areas of instructional emphasis. Test items were grouped into four main aspects of performance: "point/set" reasoning, meaning of spread, ruler reading and "stacking." Student performance on the pretest was examined to identify links between these aspects. Items within each aspect are correlated to one another, sometimes quite strongly, but items from different aspects rarely show statistically significant correlation. Taken together, these results suggest that student difficulties may not be linked to a single underlying cause. The study shows that current instruction techniques improve student understanding, but that many students exit the introductory physics lab course without appreciation or coherent

  8. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGES

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  9. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  10. Assessing uncertainties of GRACE-derived terrestrial water-storage fields

    NASA Astrophysics Data System (ADS)

    Ferreira, Vagner; Montecino, Henry

    2017-04-01

    Space-borne sensors are producing large volumes of remotely sensed data and, consequently, different measurements of the same field are available to end users. Furthermore, different satellite processing centres are producing extensive products based on the data of only one mission. This is exactly the case with the Gravity Recovery and Climate Experiment (GRACE) mission, which has been monitoring terrestrial water storage (TWS) since April 2002, while the Centre for Space Research (CSR), the Jet Propulsion Laboratory (JPL), the GeoForschungsZentrum (GFZ), the Groupe de Recherche de Géodésie Spatiale (GRGS), among others, provide individual monthly solutions in the form of Stokes's coefficients. The inverted TWS maps from Stokes's coefficients are being used in many applications and, therefore, as no ground truth data exist, the uncertainties are unknown. An assessment of the uncertainties associated with these different products is mandatory in order to guide data producers and support the users in choosing the best dataset. However, the estimation of uncertainties of space-borne products often relies on ground truth data, and in the absence of such data, an assessment of their qualities is a challenge. A recent study (Ferreira et al. 2016) evaluates the quality of each processing centre (CSR, JPL, GFZ, and GRGS) by estimating their individual uncertainties using a generalised formulation of the three-cornered hat (TCH) method. It was found that the TCH results for the study period of August 2002 to June 2014 indicate that on a global scale, the CSR, GFZ, GRGS, and JPL present uncertainties of 9.4, 13.7, 14.8, and 13.2 mm, respectively. On a basin scale, the overall good performance of the CSR is observed at 91 river basins. The TCH-based results are confirmed by a comparison with an ensemble solution from the four GRACE processing centres. Reference Ferreira VG, Montecino HDC, Yakubu CI and Heck B (2016) Uncertainties of the Gravity Recovery and Climate Experiment time
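
    A minimal sketch of the classic three-cornered hat estimate for three co-located series with independent errors (the study uses a generalised formulation covering four centres); the synthetic series below only illustrate how individual error variances follow from the variances of the pairwise differences.

        import numpy as np

        rng = np.random.default_rng(5)
        signal = np.cumsum(rng.normal(0, 1, 150))             # common "true" TWS signal
        s1 = signal + rng.normal(0, 9.4, 150)                 # noise levels loosely echoing the abstract (mm)
        s2 = signal + rng.normal(0, 13.7, 150)
        s3 = signal + rng.normal(0, 13.2, 150)

        v12, v13, v23 = np.var(s1 - s2), np.var(s1 - s3), np.var(s2 - s3)
        var1 = 0.5 * (v12 + v13 - v23)                        # Var(e1), since the common signal cancels
        var2 = 0.5 * (v12 + v23 - v13)
        var3 = 0.5 * (v13 + v23 - v12)
        print("estimated error std per series:", np.sqrt([var1, var2, var3]))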

  11. Applicability and feasibility of systematic review for performing evidence-based risk assessment in food and feed safety.

    PubMed

    Aiassa, E; Higgins, J P T; Frampton, G K; Greiner, M; Afonso, A; Amzal, B; Deeks, J; Dorne, J-L; Glanville, J; Lövei, G L; Nienstedt, K; O'connor, A M; Pullin, A S; Rajić, A; Verloo, D

    2015-01-01

    Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare, and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimize biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and use of meta-analysis to estimate all parameters required by a risk model may not be always feasible. This paper describes novel approaches for determining question suitability and for prioritizing questions for systematic review in this area. Risk assessment questions that aim to estimate a parameter are likely to be suitable for systematic review. Such questions can be structured by their "key elements" [e.g., for intervention questions, the population(s), intervention(s), comparator(s), and outcome(s)]. Prioritization of questions to be addressed by systematic review relies on the likely impact and related uncertainty of individual parameters in the risk model. This approach to planning and prioritizing systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.

  12. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    The socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular flood risk. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as Deep Uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and therefore consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to a decision cannot agree on: i) the appropriate models describing the interaction among system variables, ii) the probability distributions to represent uncertainty about key parameters in the model, and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single, optimal solution to the problem at hand, such as crisp estimates for the costs of damages of the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute optimality with robustness as a decision criterion. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions, and should instead prefer to implement "robust" decisions that perform relatively well across all conceivable outcomes of unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics usually derived from observed historical data and therefore turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, with each scenario being a full description of what may happen
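
    To make the shift from optimality to robustness concrete, the sketch below contrasts an expected-cost-optimal choice with a minimax-regret (robust) choice over a handful of plausible scenarios; the options and cost matrix are illustrative assumptions, not values from the Dhaka City application.

    ```python
    import numpy as np

    # rows: flood-management options, columns: plausible future scenarios (costs, arbitrary units)
    costs = np.array([
        [10, 40, 90],   # do nothing
        [30, 35, 50],   # structural defences
        [25, 30, 55],   # mixed portfolio
    ])
    regret = costs - costs.min(axis=0)           # regret relative to the best option per scenario
    optimal = costs.mean(axis=1).argmin()        # "optimal" if all scenarios were equally likely
    robust = regret.max(axis=1).argmin()         # minimax-regret (robust) option
    print(f"expected-cost optimum: option {optimal}, minimax-regret choice: option {robust}")
    ```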

  13. Health and safety: Preliminary comparative assessment of the Satellite Power System (SPS) and other energy alternatives

    NASA Technical Reports Server (NTRS)

    Habegger, L. J.; Gasper, J. R.; Brown, C.

    1980-01-01

    Data readily available from the literature were used to make an initial comparison of the health and safety risks of a fission power system with fuel reprocessing; a combined-cycle coal power system with a low-Btu gasifier and open-cycle gas turbine; a central-station, terrestrial, solar photovoltaic power system; the satellite power system; and a first-generation fusion system. The assessment approach consists of the identification of health and safety issues in each phase of the energy cycle from raw material extraction through electrical generation, waste disposal, and system deactivation; quantitative or qualitative evaluation of impact severity; and the rating of each issue with regard to known or potential impact level and level of uncertainty.

  14. Characterizing Uncertainty and Variability in PBPK Models: State of the Science and Needs for Research and Implementation

    EPA Science Inventory

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variabilit...

  15. Constantly evolving safety assessment protocols for GM foods.

    PubMed

    Sesikeran, B; Vasanthi, Siruguri

    2008-01-01

    The introduction of GM foods has led to the evolution of a food safety assessment paradigm that establishes the safety of a GM food relative to its conventional counterpart. The GM foods currently approved and marketed in several countries have undergone extensive safety testing under a structured safety assessment framework evolved by international organizations such as FAO, WHO, Codex and OECD. The major elements of safety assessment include molecular characterization of inserted genes and stability of the trait, toxicity and allergenicity potential of the expressed substances, compositional analysis, potential for gene transfer to gut microflora and unintentional effects of the genetic modification. As a greater number and variety of food crops are brought under the genetic modification regime, the adequacy of existing safety assessment protocols for establishing the safety of these foods has been questioned. Such crops comprise GM crops with higher agronomic vigour, nutritional or health benefits achieved by modification of plant metabolic pathways, and those expressing bioactive substances and pharmaceuticals. The safety assessment challenge for these foods is the capacity of the methods to detect unintentional effects with higher sensitivity and rigor. Development of databases on food compositions, toxicants and allergens is currently seen as an important aid to the development of safety protocols. With the changing global trends in genetic modification technology, the future challenge will be to develop GM crops with a minimum amount of inserted foreign DNA so as to reduce the burden of complex safety assessments while ensuring the safety and utility of the technology.

  16. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation once the Pc uncertainty is considered. The findings counsel the integration of the developed methods into NASA CA operations.
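
    A minimal sketch of the underlying idea, producing a distribution of Pc values rather than a point estimate by sampling the uncertain inputs: the small hard-body-radius approximation used here is a generic 2-D collision probability formula, and the encounter geometry, covariance scale factor, and hard-body-radius distributions are hypothetical, not the CARA algorithm or data.

    ```python
    import numpy as np

    def pc_small_hbr(miss_x, miss_y, sx, sy, hbr):
        """Small hard-body-radius approximation of the 2-D collision probability:
        bivariate-normal density at the miss point times the hard-body area."""
        dens = np.exp(-0.5 * ((miss_x / sx) ** 2 + (miss_y / sy) ** 2)) / (2.0 * np.pi * sx * sy)
        return np.pi * hbr ** 2 * dens

    rng = np.random.default_rng(1)
    pcs = []
    for _ in range(10_000):
        k = rng.lognormal(0.0, 0.3)             # uncertain scale factor on the combined covariance
        hbr = max(rng.normal(20.0, 3.0), 1.0)   # uncertain combined hard-body radius (m)
        pcs.append(pc_small_hbr(250.0, 150.0, 200.0 * k, 100.0 * k, hbr))
    pcs = np.array(pcs)
    # the empirical Pc distribution, summarised by percentiles rather than a single value
    print(np.percentile(pcs, [5, 50, 95]))
    ```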

  17. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2011-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars entry vehicles. A survey was conducted of existing experimental heat-transfer and shock-shape data for high enthalpy, reacting-gas CO2 flows and five relevant test series were selected for comparison to predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared to these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  18. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2013-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars-entry vehicles. A survey was conducted of existing experimental heat transfer and shock-shape data for high-enthalpy reacting-gas CO2 flows, and five relevant test series were selected for comparison with predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared with these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  19. Assessment of Experimental Uncertainty for a Floating Wind Semisubmersible under Hydrodynamic Loading: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason

    The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainties and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.
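
    For context on how repeat tests and bias terms translate into uncertainty bounds, the sketch below combines a Type A (repeat-test) standard uncertainty with an assumed Type B (bias) contribution by root-sum-square; the repeat values and bias magnitude are illustrative, not measurements from the campaign.

    ```python
    import numpy as np

    # hypothetical repeat-test values of a response metric (e.g. a normalised surge response)
    repeats = np.array([1.02, 0.98, 1.01, 0.99, 1.00, 1.03])
    u_random = repeats.std(ddof=1) / np.sqrt(len(repeats))  # Type A: random uncertainty of the mean
    u_bias = 0.015                                          # Type B: assumed calibration/instrument bias
    u_combined = np.sqrt(u_random**2 + u_bias**2)           # root-sum-square combination
    print(f"expanded uncertainty (k=2): +/- {2 * u_combined:.3f}")
    ```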

  20. Handling glacially induced faults in the assessment of the long term safety of a repository for spent nuclear fuel at Forsmark, Sweden

    NASA Astrophysics Data System (ADS)

    Munier, R.

    2011-12-01

    Located deep within the Baltic Shield, far from active plate boundaries and volcanism, Swedish bedrock is characterised by a low frequency of earthquakes of small magnitudes. Yet faults, predominantly in the Lapland region, offsetting the Quaternary regolith by ten meters or more, reveal that Swedish bedrock suffered substantial earthquake activity in connection with the retreat of the latest continental ice sheet (the Weichselian). Storage of nuclear wastes, hazardous for hundreds of thousands of years, requires, firstly, isolation of radionuclides and, secondly, retardation of the nuclides should the barriers fail. Swedish regulations require that safety is demonstrated for a period of a million years. Consequently, the repository must be designed to resist the impact of several continental glaciers. Large, glacially induced earthquakes near the repository have the potential to trigger slip along fractures across the canisters containing the nuclear wastes, thereby simultaneously jeopardising isolation, retardation and, hence, long-term safety. It has therefore been crucial to assess the impact of such an intraplate earthquake upon the primary functions of the repository. We conclude that, by appropriate design of the repository, the negative impact of earthquakes on long-term safety can be considerably lessened. We were, additionally, able to demonstrate compliance with Swedish regulations in our safety assessment, SR-Site, submitted to the authorities earlier this year. However, the assessment required a number of critical assumptions, e.g. concerning the strain rate and the fracture properties of the rock, many of which are the subject of current research in the geoscientific community. Through a conservative approach, though, we judge that we have adequately propagated critical uncertainties through the assessment and bounded the uncertainty space.

  1. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education for an active involvement in the assessment process, developing autonomy, enhancing reflection, and understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the FYP assessment is studied. Validity and reliability for…

  2. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  3. Uncertainty in assessing value of oncology treatments.

    PubMed

    Mullins, C Daniel; Montgomery, Russ; Tunis, Sean

    2010-01-01

    Patients, clinicians, payers, and policymakers face an environment of significant evidentiary uncertainty as they attempt to achieve maximum value, or the greatest level of benefit possible at a given level of cost in their respective health care decisions. This is particularly true in the area of oncology, for which published evidence from clinical trials is often incongruent with real-world patient care, and a substantial portion of clinical use is for off-label indications that have not been systematically evaluated. It is this uncertainty in the knowledge of the clinical harms and benefits associated with oncology treatments that prevents postregulatory decision makers from making accurate assessments of the value of these treatments. Because of the incentives inherent in the clinical research enterprise, randomized controlled trials (RCTs) are designed for the specific purpose of regulatory approval and maximizing market penetration. The pursuit of these goals results in RCT study designs that achieve maximal internal validity at the expense of generalizability to diverse real-world patient populations that may have significant comorbidities and other clinically mitigating factors. As such, systematic reviews for the purposes of coverage and treatment decisions often find relevant and high-quality evidence to be limited or nonexistent. For a number of reasons, including frequent off-label use of medications and the expedited approval process for cancer drugs by the U.S. Food and Drug Administration, this situation is exacerbated in the area of oncology. This paper investigates the convergence of incentives and circumstances that lead to widespread uncertainty in oncology and proposes new paradigms for clinical research, including pragmatic clinical trials, methodological guidance, and coverage with evidence development. Each of these initiatives would support the design of clinical research that is more informative for postregulatory decision makers, and would

  4. Safety assessment of boron by application of new uncertainty factors and their subdivision.

    PubMed

    Hasegawa, Ryuichi; Hirata-Koizumi, Mutsuko; Dourson, Michael L; Parker, Ann; Ono, Atsushi; Hirose, Akihiko

    2013-02-01

    The available toxicity information for boron was reevaluated and four appropriate toxicity studies were selected in order to derive a tolerable daily intake (TDI) using newly proposed uncertainty factors (UFs) presented in Hasegawa et al. (2010). No observed adverse effect levels (NOAELs) of 17.5 and 8.8 mgB/kg/day for the critical effect of testicular toxicity were found in 2-year rat and dog feeding studies. Also, the 95% lower confidence limit of the benchmark doses for 5% reduction of fetal body weight (BMDL(05)) was calculated as 44.9 and 10.3 mgB/kg/day in mouse and rat developmental toxicity studies, respectively. Measured values available for differences in boron clearance between rats and humans and variability in the glomerular filtration rate (GFR) in pregnant women were used to derive chemical specific UFs. For the remaining uncertainty, newly proposed default UFs, which were derived from the latest applicable information with a probabilistic approach, and their subdivided factors for toxicokinetic and toxicodynamic variability were applied. Finally, overall UFs were calculated as 68 for rat testicular toxicity, 40 for dog testicular toxicity, 247 for mouse developmental toxicity and 78 for rat developmental toxicity. It is concluded that 0.13 mgB/kg/day is the most appropriate TDI for boron, based on rat developmental toxicity. Copyright © 2012 Elsevier Inc. All rights reserved.
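
    The TDI arithmetic reported in the abstract, point of departure divided by the overall uncertainty factor, can be reproduced directly; the short sketch below simply recomputes the four ratios stated above.

    ```python
    # point of departure (mgB/kg/day) and overall uncertainty factor for each study
    points_of_departure = {
        "rat testicular (NOAEL)":       (17.5, 68),
        "dog testicular (NOAEL)":       (8.8, 40),
        "mouse developmental (BMDL05)": (44.9, 247),
        "rat developmental (BMDL05)":   (10.3, 78),
    }
    for study, (pod, uf) in points_of_departure.items():
        print(f"{study}: TDI = {pod}/{uf} = {pod / uf:.3f} mgB/kg/day")
    # the rat developmental value (~0.13 mgB/kg/day) is the TDI adopted in the abstract
    ```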

  5. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    EPA Science Inventory

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...

  6. Managing geological uncertainty in CO2-EOR reservoir assessments

    NASA Astrophysics Data System (ADS)

    Welkenhuysen, Kris; Piessens, Kris

    2014-05-01

    Recently the European Parliament has agreed that an atlas of the storage potential for CO2 is of high importance for a successful commercial introduction of CCS (CO2 capture and geological storage) technology in Europe. CO2-enhanced oil recovery (CO2-EOR) is often proposed as a promising business case for CCS, and likely has a high potential in the North Sea region. Traditional economic assessments for CO2-EOR largely neglect the geological reality of reservoir uncertainties because these are difficult to introduce realistically into such calculations. There is indeed a gap between the outcome of a reservoir simulation and the input values for e.g. cost-benefit evaluations, especially where it concerns uncertainty. The approach outlined here is to turn the procedure around, and to start from the geological data that is typically (or minimally) requested for an economic assessment. Thereafter it is evaluated how these data can realistically be provided by geologists and reservoir engineers. For the storage of CO2 these parameters are total and yearly CO2 injection capacity, and containment or leakage potential. Specifically for the EOR operation, two additional parameters can be defined: the EOR ratio, or the ratio of recovered oil over injected CO2, and the CO2 recycling ratio, i.e. the share of injected CO2 that is reproduced after breakthrough at the production well. A critical but typically estimated parameter for CO2-EOR projects is the EOR ratio, taken in this brief outline as an example. The EOR ratio depends mainly on local geology (e.g. injection per well), field design (e.g. number of wells), and time. Costs related to engineering can be estimated fairly well, given some uncertainty range. The problem is usually to reliably estimate the geological parameters that define the EOR ratio. Reliable data are only available from (onshore) CO2-EOR projects in the US. Published studies for the North Sea generally refer to these data in a simplified form, without uncertainty ranges, and are

  7. Bringing social standards into project evaluation under dynamic uncertainty.

    PubMed

    Knudsen, Odin K; Scandizzo, Pasquale L

    2005-04-01

    Society often sets social standards that define thresholds of damage to society or the environment above which compensation must be paid to the state or other parties. In this article, we analyze the interdependence between the use of social standards and investment evaluation under dynamic uncertainty, where a negative externality above a threshold established by society requires an assessment and payment of damages. Under uncertainty, the party considering implementing a project or new technology must assess not only when the project is economically efficient to implement but also when to abandon a project that could potentially exceed the social standard. Using real-option theory and simple models, we demonstrate how such a social standard can be integrated into cost-benefit analysis through the use of a development option and a liability option coupled with a damage function. Uncertainty, in fact, implies that both parties interpret the social standard as a target for safety rather than an inflexible barrier that cannot be overcome. The larger the uncertainty, the greater the tolerance of both parties for damages in excess of the social standard.

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  9. Dealing With Uncertainty When Assessing Fish Passage Through Culvert Road Crossings

    NASA Astrophysics Data System (ADS)

    Anderson, Gregory B.; Freeman, Mary C.; Freeman, Byron J.; Straight, Carrie A.; Hagler, Megan M.; Peterson, James T.

    2012-09-01

    Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.
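
    To make the prioritisation logic concrete, the sketch below scores impassability from two simple criteria and ranks culverts by the expected habitat gain from remediation; the thresholds, weights, and culvert list are illustrative assumptions and only a crude stand-in for the study's Bayesian belief networks.

    ```python
    # (name, outlet drop in m, baseflow velocity in m/s, upstream habitat in km) -- hypothetical
    culverts = [
        ("C1", 0.35, 0.9, 4.2),
        ("C2", 0.05, 0.4, 1.1),
        ("C3", 0.60, 1.2, 2.5),
    ]

    def p_impassable(drop, velocity):
        """Crude probability that a culvert is impassable to small-bodied fishes:
        each exceeded threshold contributes a weight, capped at 1."""
        p = 0.0
        if drop > 0.10:      # assumed perched-outlet threshold
            p += 0.6
        if velocity > 0.6:   # assumed baseflow-velocity threshold
            p += 0.4
        return min(p, 1.0)

    # rank by expected habitat gain if the culvert were remediated
    ranked = sorted(culverts, key=lambda c: p_impassable(c[1], c[2]) * c[3], reverse=True)
    for name, drop, vel, habitat in ranked:
        print(name, round(p_impassable(drop, vel) * habitat, 2))
    ```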

  10. Dealing with uncertainty when assessing fish passage through culvert road crossings.

    PubMed

    Anderson, Gregory B; Freeman, Mary C; Freeman, Byron J; Straight, Carrie A; Hagler, Megan M; Peterson, James T

    2012-09-01

    Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.

  11. Dealing with uncertainty when assessing fish passage through culvert road crossings

    USGS Publications Warehouse

    Anderson, Gregory B.; Freeman, Mary C.; Freeman, Byron J.; Straight, Carrie A.; Hagler, Megan M.; Peterson, James T.

    2012-01-01

    Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.

  12. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field, illustrated by selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the

  13. Operational safety assessment of turbo generators with wavelet Rényi entropy from sensor-dependent vibration signals.

    PubMed

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-04-16

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects the time-varying operational characteristics of individual machinery. Derived from the sensor-dependent signals' wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied to a 50 MW turbo generator, where it is shown to be reasonable and effective for operation and maintenance.
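
    A minimal sketch of the entropy computation, using a standard wavelet packet from PyWavelets as a stand-in for the second generation (lifting-based) wavelet package described above; the test signals, wavelet choice, decomposition level, and Rényi order are assumptions for illustration.

    ```python
    import numpy as np
    import pywt  # assumes the PyWavelets package is available

    def wavelet_renyi_entropy(signal, wavelet="db4", level=4, alpha=2.0):
        """Rényi entropy of the normalised wavelet-packet energy distribution."""
        wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
        energies = np.array([np.sum(np.square(node.data))
                             for node in wp.get_level(level, order="freq")])
        p = energies / energies.sum()
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    # hypothetical vibration snippets: a steady sinusoid vs. a broadband (fault-like) signal
    t = np.linspace(0.0, 1.0, 4096)
    steady = np.sin(2.0 * np.pi * 50.0 * t)
    noisy = steady + 0.8 * np.random.default_rng(2).standard_normal(t.size)
    # higher entropy means a more spread-out energy distribution, i.e. greater operational uncertainty
    print(wavelet_renyi_entropy(steady), wavelet_renyi_entropy(noisy))
    ```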

  14. Assessing uncertainty in high-resolution spatial climate data across the US Northeast.

    PubMed

    Bishop, Daniel A; Beier, Colin M

    2013-01-01

    Local and regional-scale knowledge of climate change is needed to model ecosystem responses, assess vulnerabilities and devise effective adaptation strategies. High-resolution gridded historical climate (GHC) products address this need, but come with multiple sources of uncertainty that are typically not well understood by data users. To better understand this uncertainty in a region with a complex climatology, we conducted a ground-truthing analysis of two 4 km GHC temperature products (PRISM and NRCC) for the US Northeast using 51 Cooperative Network (COOP) weather stations utilized by both GHC products. We estimated GHC prediction error for monthly temperature means and trends (1980-2009) across the US Northeast and evaluated any landscape effects (e.g., elevation, distance from coast) on those prediction errors. Results indicated that station-based prediction errors for the two GHC products were similar in magnitude, but on average, the NRCC product predicted cooler than observed temperature means and trends, while PRISM was cooler for means and warmer for trends. We found no evidence for systematic sources of uncertainty across the US Northeast, although errors were largest at high elevations. Errors in the coarse-scale (4 km) digital elevation models used by each product were correlated with temperature prediction errors, more so for NRCC than PRISM. In summary, uncertainty in spatial climate data has many sources and we recommend that data users develop an understanding of uncertainty at the appropriate scales for their purposes. To this end, we demonstrate a simple method for utilizing weather stations to assess local GHC uncertainty and inform decisions among alternative GHC products.

  15. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision making process.

  16. The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.

    2010-12-01

    Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown to be subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.

  17. Assessing uncertainties in superficial water provision by different bootstrap-based techniques

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario

    2014-05-01

    An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different numbers of repetitions or sample sizes (N = 500; ...; 1000), and then applied to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the EFR methods assessment will be
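
    The moving-blocks bootstrap and its use for putting confidence intervals on an Environmental Flow Requirement statistic can be sketched as below; the synthetic daily flow record, block length, and choice of Q90 are illustrative assumptions, not the Cantareira data or the paper's full set of methods.

    ```python
    import numpy as np

    def moving_blocks_resample(series, block_len, rng):
        """One moving-blocks bootstrap resample: contiguous blocks drawn with
        replacement and concatenated back to the original record length."""
        n = len(series)
        n_blocks = int(np.ceil(n / block_len))
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        return np.concatenate([series[s:s + block_len] for s in starts])[:n]

    def q_exceedance(flow, pct):
        """Flow exceeded pct% of the time (e.g. Q90), read off the flow duration curve."""
        return np.percentile(flow, 100 - pct)

    rng = np.random.default_rng(3)
    flow = rng.lognormal(mean=1.5, sigma=0.7, size=24 * 365)  # hypothetical 24-year daily record
    q90_samples = [q_exceedance(moving_blocks_resample(flow, 30, rng), 90) for _ in range(1000)]
    print(np.percentile(q90_samples, [2.5, 97.5]))  # 95% confidence interval on the Q90-based EFR
    ```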

  18. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  19. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  20. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  1. Assessment of the uncertainty in future projection for summer climate extremes over the East Asia

    NASA Astrophysics Data System (ADS)

    Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun

    2017-04-01

    Future projections of climate extremes at regional and local scales are essential information for better adapting to climate change. However, future projections carry large uncertainties arising from internal and external processes, which reduce projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Summer mean and extreme projections of East Asian temperature and precipitation become larger with time. Moreover, the uncertainty cascades show wider scenario differences and inter-model ranges with increasing time. A positive mean-extreme relation is found in projections for both temperature and precipitation. In assessing the uncertainty factors for these projections, the dominant factors for temperature and precipitation change over time. For the uncertainty of mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For the uncertainty of mean precipitation projections, internal variability is more important than scenario uncertainty. Unlike mean precipitation, extreme precipitation shows that scenario uncertainty is expected to be a dominant factor in the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. The spatial patterns of these uncertainty factors generally follow the temporal changes in the fraction of total variance contributed by each factor across East Asian grid cells. ACKNOWLEDGEMENTS The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and the National Research Foundation of Korea Grant funded by the Ministry of
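
    A crude sketch of partitioning projection spread into scenario, model, and internal-variability components, in the spirit of the fraction-of-total-variance analysis described above; the synthetic ensemble and the simple variance decomposition are illustrative assumptions, not CMIP5 output or the study's method.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_scen, n_model, n_year = 2, 10, 90
    proj = (np.linspace(0.0, 3.0, n_year)                       # common forced trend
            + rng.normal(0.0, 0.3, (n_scen, 1, 1))              # scenario spread
            + rng.normal(0.0, 0.5, (1, n_model, 1))             # model spread
            + rng.normal(0.0, 0.4, (n_scen, n_model, n_year)))  # internal variability

    scenario_var = np.var(proj.mean(axis=(1, 2)))               # spread of scenario means
    model_var = np.var(proj.mean(axis=(0, 2)))                  # spread of model means
    resid = proj - proj.mean(axis=(0, 1))                       # remove the common forced signal
    internal_var = resid.var(axis=2).mean()                     # year-to-year noise
    total = scenario_var + model_var + internal_var
    parts = {"scenario": scenario_var, "model": model_var, "internal": internal_var}
    print({k: round(v / total, 2) for k, v in parts.items()})   # fraction of total variance
    ```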

  2. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty of the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. In this case the major part of the
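
    A minimal Monte Carlo sketch of an uncertainty cascade feeding a net-present-value calculation for an adaptation measure; the distributions, damage model, cost figures, and the assumption that the measure halves flood damages are all illustrative, not the Odense case study.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, horizon = 20_000, 50
    climate_factor = rng.normal(1.3, 0.2, n)                # change in design rainfall intensity
    damage_per_event = rng.lognormal(np.log(2e6), 0.5, n)   # cost per flood (stage-depth x unit cost)
    events_per_year = 0.2 * climate_factor                  # expected flood frequency
    adaptation_cost = rng.normal(5e6, 1e6, n)               # capital cost of the measure
    discount_rate = rng.uniform(0.02, 0.05, n)

    years = np.arange(1, horizon + 1)
    discount = (1.0 / (1.0 + discount_rate))[:, None] ** years
    avoided = 0.5 * events_per_year * damage_per_event      # assume the measure halves damages
    npv = (avoided[:, None] * discount).sum(axis=1) - adaptation_cost
    print(f"P(NPV > 0) = {np.mean(npv > 0):.2f}")           # robustness of the decision to act
    ```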

  3. [Status Quo, Uncertainties and Trends Analysis of Environmental Risk Assessment for PFASs].

    PubMed

    Hao, Xue-wen; Li, Li; Wang, Jie; Cao, Yan; Liu, Jian-guo

    2015-08-01

    This study systematically reviewed the definitions and evolution of terms, categories and applications of perfluoroalkyl and polyfluoroalkyl substances (PFASs) in the international academic literature, focusing on the environmental risk and exposure assessment of PFASs, to comprehensively analyze the current status, uncertainties and trends of PFAS environmental risk assessment. Overall, the risk assessment of PFASs faces a complicated situation involving complex substance pedigrees, various types, complex derivative relations, confidential business information and risk uncertainties. Although the environmental risk of long-chain PFASs has been widely recognized, the short-chain PFASs and short-chain fluorotelomers used as their alternatives still present many research gaps and uncertainties in environmental hazards, environmental fate and exposure risk. The scope of risk control of PFASs in the international community is still open to discussion. Owing to trade secrets and market competition, the chemical structures and risk information of PFAS alternatives generally lack openness and transparency. The environmental risk of most fluorinated and non-fluorinated alternatives is not clear. In general, international research on PFAS risk assessment is gradually shifting from long-chain perfluoroalkyl acids (PFAAs), represented by perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA), to short-chain PFAAs, and then extending to other PFASs. The main problems requiring urgent and continued research are: environmental hazard assessment indices, such as bioaccumulation and environmental migration, and their optimization methods; the environmental release and multimedia environmental fate of short-chain PFASs; the environmental fate of neutral PFASs and their transformation and contribution as precursors of short-chain PFASs; and the risk identification and assessment of fluorinated and non-fluorinated alternatives to PFASs.

  4. Operational Safety Assessment of Turbo Generators with Wavelet Rényi Entropy from Sensor-Dependent Vibration Signals

    PubMed Central

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-01-01

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects the time-varying operational characteristics of individual machinery. Derived from the sensor-dependent signals’ wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied to a 50 MW turbo generator, where it is shown to be reasonable and effective for operation and maintenance. PMID:25894934

  5. Uncertainty quantification and reliability assessment in operational oil spill forecast modeling system.

    PubMed

    Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao

    2017-03-15

    As oil transport increases in the Texas bays, the greater risk of ship collisions becomes a challenge, with oil spill accidents as a possible consequence. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. State-of-the-art operational oil spill forecast modeling systems have advanced oil spill response to a new stage. However, uncertainty due to the predicted data inputs often compromises the reliability of the forecast result, leading to misdirection in contingency planning. Thus, understanding the forecast uncertainty and reliability becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is thus quantified by comparing the forecast probability map and the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, helping emergency managers improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
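
    A minimal sketch of building a forecast probability map by Monte Carlo perturbation of forcing: particles are advected under uncertain currents and the per-cell hit frequency across ensemble members gives the probability; the forcing values, diffusion, grid, and particle counts are assumptions, not the HyosPy configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_ens, n_part, n_steps, dt = 200, 500, 48, 3600.0   # ensemble members, particles, hourly steps
    prob = np.zeros((50, 50))                           # 50 x 50 grid of 2 km cells
    for _ in range(n_ens):
        u = 0.3 + rng.normal(0.0, 0.1)                  # uncertain eastward current (m/s)
        v = 0.1 + rng.normal(0.0, 0.1)                  # uncertain northward current (m/s)
        x = np.zeros(n_part)
        y = np.zeros(n_part)
        for _ in range(n_steps):
            x += u * dt + rng.normal(0.0, 50.0, n_part)  # advection plus turbulent diffusion
            y += v * dt + rng.normal(0.0, 50.0, n_part)
        hit = np.zeros((50, 50))
        ix = np.clip((x / 2000.0).astype(int), 0, 49)
        iy = np.clip((y / 2000.0).astype(int), 0, 49)
        hit[iy, ix] = 1.0                               # cells touched by this member
        prob += hit
    prob /= n_ens
    print((prob > 0.5).sum(), "cells oiled in more than half of the ensemble members")
    ```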

  6. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component

  7. Using measurement uncertainty in decision-making and conformity assessment

    NASA Astrophysics Data System (ADS)

    Pendrill, L. R.

    2014-08-01

    Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred in statistical acceptance sampling as by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such
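
    One common way to fold measurement uncertainty into a conformity decision is "guarded acceptance": a measured value is accepted only if it lies inside the tolerance by more than the expanded uncertainty. The sketch below assumes this particular decision rule plus illustrative tolerance limits and a coverage factor; the paper itself reviews a broader range of rules and risk-sharing schemes.

```python
# Minimal sketch of a conformity decision that accounts for measurement uncertainty
# via a guard band (guarded acceptance). All numbers are illustrative assumptions.

def conforms(measured, lower, upper, u, k=2.0, guarded=True):
    """Accept only if the value lies inside the tolerance by more than k*u."""
    guard = k * u if guarded else 0.0
    return (lower + guard) <= measured <= (upper - guard)

u = 0.05                       # standard measurement uncertainty
lower, upper = 9.0, 11.0       # specification (tolerance) limits

for value in (9.02, 9.15, 10.5, 10.95):
    simple = conforms(value, lower, upper, u, guarded=False)
    guarded = conforms(value, lower, upper, u, guarded=True)
    print(f"value={value:5.2f}  simple accept={simple}  guarded accept={guarded}")
```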

  8. Consideration of the FQPA Safety Factor and Other Uncertainty Factors in Cumulative Risk Assessment of Chemicals Sharing a Common Mechanism of Toxicity

    EPA Pesticide Factsheets

    This guidance document provides OPP's current thinking on application of the provision in FFDCA about an additional safety factor for the protection of infants and children in the context of cumulative risk assessments.

  9. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Gregory M; Key, Brian P; Zerkle, David K

    2009-01-01

    The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
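
    A rough sketch of the idea is to express each assessment result as a basic probability assignment over risk levels and compute uncertainty measures from it. The code below uses the Hartley-based non-specificity measure and, as a stand-in for a conflict measure, the Shannon entropy of the pignistic probabilities; these particular formulas, the frame of risk levels, and the example assignments are assumptions, not necessarily the extensions proposed in the paper.

```python
import numpy as np

# Minimal sketch: information-uncertainty measures for an evidential-reasoning result
# expressed as a basic probability assignment (BPA) over a frame of risk levels.
frame = ("low", "medium", "high")

def nonspecificity(bpa):
    # N(m) = sum over focal sets A of m(A) * log2(|A|)
    return sum(m * np.log2(len(A)) for A, m in bpa.items() if m > 0)

def pignistic_entropy(bpa):
    # Spread each focal-set mass evenly over its elements, then take Shannon entropy.
    p = {e: 0.0 for e in frame}
    for A, m in bpa.items():
        for e in A:
            p[e] += m / len(A)
    probs = np.array([v for v in p.values() if v > 0])
    return float(-(probs * np.log2(probs)).sum())

# Two assessment results to compare (keys are focal sets, values are masses).
result_1 = {("low",): 0.6, ("low", "medium"): 0.3, ("low", "medium", "high"): 0.1}
result_2 = {("low",): 0.35, ("high",): 0.35, ("low", "medium", "high"): 0.3}

for name, bpa in (("result_1", result_1), ("result_2", result_2)):
    print(name, "non-specificity=%.3f" % nonspecificity(bpa),
          "pignistic entropy=%.3f" % pignistic_entropy(bpa))
```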

  10. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty which can exist for pesticide leaching assessments, due to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts because the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that the regulation of agricultural chemicals in the future should include consideration for changing land use.

  11. Contribution of crop model structure, parameters and climate projections to uncertainty in climate change impact assessments.

    PubMed

    Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H

    2018-03-01

    Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach in assessing climate change impact on barley growth and yield at Jokioinen, Finland in the Boreal climatic zone and Lleida, Spain in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contribution of crop model structure, crop model parameters and climate projections to the total variance of ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median of simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s, at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contribution of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We concluded that the triple-ensemble probabilistic approach that accounts for the uncertainties from multiple important sources provide more comprehensive
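
    For a balanced model × parameter × projection ensemble, the ANOVA decomposition amounts to comparing the sums of squares of the factor means with the total sum of squares. The sketch below uses synthetic ensemble values purely to illustrate the bookkeeping; the factor sizes and effect magnitudes are assumptions, not the study's data.

```python
import numpy as np

# Minimal sketch: decompose the variance of a full-factorial ensemble of simulated
# yield changes into main effects of crop model structure, parameter set and climate
# projection (plus a residual interaction term). The ensemble values are synthetic.
rng = np.random.default_rng(2)

n_models, n_params, n_gcms = 7, 3, 8
y = (rng.normal(0, 6, size=(n_models, 1, 1))      # model-structure effect
     + rng.normal(0, 2, size=(1, n_params, 1))    # parameter effect
     + rng.normal(0, 3, size=(1, 1, n_gcms))      # climate-projection effect
     + rng.normal(0, 1, size=(n_models, n_params, n_gcms)))  # interactions/noise

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()

def main_effect_ss(axis_keep):
    # Sum of squares of the factor-level means around the grand mean.
    axes = tuple(a for a in range(3) if a != axis_keep)
    means = y.mean(axis=axes)
    n_per_level = y.size / means.size
    return n_per_level * ((means - grand) ** 2).sum()

ss = {"crop model": main_effect_ss(0),
      "parameters": main_effect_ss(1),
      "climate projection": main_effect_ss(2)}
ss["interactions/residual"] = ss_total - sum(ss.values())

for k, v in ss.items():
    print(f"{k:>22s}: {100 * v / ss_total:5.1f}% of total variance")
```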

  12. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program

  13. Characterizing Uncertainty and Variability in PBPK Models ...

    EPA Pesticide Factsheets

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological impro

  14. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
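
    The effect of ignoring correlation can be seen with a simple sampling experiment: propagating two correlated inputs through a toy impact model with and without their correlation changes the output variance by the cross term 2ab·rho·s1·s2. The linear model, means, standard deviations, and correlation below are illustrative assumptions, not the paper's electricity case study.

```python
import numpy as np

# Minimal sketch: propagate two correlated LCA input parameters through a simple
# impact model, with and without their correlation, and compare output variances.
rng = np.random.default_rng(3)

mu = np.array([5.0, 2.0])          # mean values of the two inputs
sd = np.array([0.8, 0.5])          # standard deviations
rho = 0.7                          # correlation between the inputs
a, b = 1.2, 3.0                    # coefficients of the toy model y = a*x1 + b*x2
n = 100_000

def sample(correlated: bool) -> np.ndarray:
    r = rho if correlated else 0.0
    cov = np.array([[sd[0] ** 2, r * sd[0] * sd[1]],
                    [r * sd[0] * sd[1], sd[1] ** 2]])
    x = rng.multivariate_normal(mu, cov, size=n)
    return a * x[:, 0] + b * x[:, 1]

var_corr = sample(True).var()
var_indep = sample(False).var()
# Analytical check: Var = a^2 s1^2 + b^2 s2^2 + 2ab rho s1 s2
analytic = a**2 * sd[0]**2 + b**2 * sd[1]**2 + 2 * a * b * rho * sd[0] * sd[1]

print(f"output variance with correlation:     {var_corr:.3f} (analytic {analytic:.3f})")
print(f"output variance ignoring correlation: {var_indep:.3f}")
```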

  15. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect, but acknowledged, knowledge in describing an environmental process (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be particular targets of uncertainty assessment because their inferences are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environment-related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is large enough to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied e.g. to contour the probability of any event, which can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A
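
    Once a stack of equally probable realizations is available, a local-entropy map can be produced by classifying each cell in each realization and taking the Shannon entropy of the resulting per-cell class frequencies. The sketch below uses synthetic realizations and assumed class thresholds, not the Hungarian SOM data.

```python
import numpy as np

# Minimal sketch: derive a local-entropy uncertainty map from a stack of equally
# probable stochastic realizations of a soil property. All inputs are synthetic.
rng = np.random.default_rng(4)

n_real, ny, nx = 500, 40, 60
# Synthetic realizations: a smooth "true" field plus realization-to-realization noise.
truth = 3.0 + 1.5 * np.sin(np.linspace(0, 3, nx))[None, :] * np.ones((ny, 1))
realizations = truth[None, :, :] + rng.normal(0, 0.8, size=(n_real, ny, nx))

# Classify each cell in each realization into SOM classes (class edges are assumed).
edges = np.array([2.0, 3.0, 4.0])
classes = np.digitize(realizations, edges)          # values 0..3
n_classes = len(edges) + 1

# Per-cell class probabilities across the realizations, then normalized Shannon entropy.
probs = np.stack([(classes == k).mean(axis=0) for k in range(n_classes)])
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = np.where(probs > 0, probs * np.log(probs), 0.0)
local_entropy = -plogp.sum(axis=0) / np.log(n_classes)   # 0 = certain, 1 = maximally uncertain

print("local entropy: min=%.2f mean=%.2f max=%.2f"
      % (local_entropy.min(), local_entropy.mean(), local_entropy.max()))
```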

  16. Intercomparison and Uncertainty Assessment of Nine Evapotranspiration Estimates Over South America

    NASA Astrophysics Data System (ADS)

    Sörensson, Anna A.; Ruscica, Romina C.

    2018-04-01

    This study examines the uncertainties and the representations of anomalies of a set of evapotranspiration products over climatologically distinct regions of South America. The products, coming from land surface models, reanalysis, and remote sensing, are chosen from sources that are readily available to the community of users. The results show that the spatial patterns of maximum uncertainty differ among metrics, with dry regions showing maximum relative uncertainties of annual mean evapotranspiration, while energy-limited regions present maximum uncertainties in the representation of the annual cycle and monsoon regions in the representation of anomalous conditions. Furthermore, it is found that land surface models driven by observed atmospheric fields detect meteorological and agricultural droughts in dry regions unequivocally. The remote sensing products employed do not distinguish all agricultural droughts and this could be attributed to the forcing net radiation. The study also highlights important characteristics of individual data sets and recommends users to include assessments of sensitivity to evapotranspiration data sets in their studies, depending on region and nature of study to be conducted.

  17. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
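
    A common way to combine independent vertical error sources is a root-sum-of-squares, with an expanded value at 95% confidence used as the smallest sea-level rise increment the data can support. The individual uncertainty values and the coverage-factor rule below are illustrative assumptions, not the values used in the Mobile Bay study.

```python
import math

# Minimal sketch: combine independent vertical error sources by root-sum-of-squares
# and derive a minimum sea-level rise increment supportable at ~95% confidence.
lidar_rmse = 0.10          # elevation data error (m), assumed
datum_transform = 0.05     # vertical datum / transformation uncertainty (m), assumed
water_level = 0.06         # tidal / water level datum uncertainty (m), assumed

cumulative = math.sqrt(lidar_rmse**2 + datum_transform**2 + water_level**2)
# Expanded uncertainty at ~95% confidence (coverage factor k = 1.96 for a normal model).
min_slr_increment = 1.96 * cumulative

print(f"cumulative vertical uncertainty (1 sigma): {cumulative:.3f} m")
print(f"minimum usable sea-level rise increment (95%): {min_slr_increment:.3f} m")
```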

  18. Nothing is safe: Intolerance of uncertainty is associated with compromised fear extinction learning.

    PubMed

    Morriss, Jayne; Christakou, Anastasia; van Reekum, Carien M

    2016-12-01

    Extinction-resistant fear is considered to be a central feature of pathological anxiety. Here we sought to determine if individual differences in Intolerance of Uncertainty (IU), a potential risk factor for anxiety disorders, underlies compromised fear extinction. We tested this hypothesis by recording electrodermal activity in 38 healthy participants during fear acquisition and extinction. We assessed the temporality of fear extinction, by examining early and late extinction learning. During early extinction, low IU was associated with larger skin conductance responses to learned threat vs. safety cues, whereas high IU was associated with skin conductance responding to both threat and safety cues, but no cue discrimination. During late extinction, low IU showed no difference in skin conductance between learned threat and safety cues, whilst high IU predicted continued fear expression to learned threat, indexed by larger skin conductance to threat vs. safety cues. These findings suggest a critical role of uncertainty-based mechanisms in the maintenance of learned fear. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.

  20. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
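
    The mixed aleatory/epistemic propagation can be pictured as a double loop: an outer loop over evidence-theory focal intervals for the epistemic variable and an inner Monte Carlo over the aleatory failures, yielding lower and upper bounds on the expected utility. The sketch below is only a crude illustration of that structure under assumed numbers; it is not the MCS-UUA implementation, and the cost/benefit model is a toy.

```python
import numpy as np

# Minimal sketch: double-loop propagation of mixed uncertainties.
# Epistemic variable (market price trend): interval focal elements with masses.
# Aleatory variables (launch and component failures): sampled probabilistically.
rng = np.random.default_rng(8)

focal_elements = [((-0.02, 0.00), 0.3), ((0.00, 0.03), 0.5), ((0.03, 0.06), 0.2)]
p_launch_fail, p_component_fail = 0.05, 0.10      # assumed aleatory probabilities
years, base_revenue, servicing_cost = 10, 100.0, 40.0

def lifecycle_utility(price_growth, n_samples=20_000):
    """Inner aleatory Monte Carlo: expected lifecycle net benefit for a fixed growth."""
    launch_ok = rng.random(n_samples) > p_launch_fail
    comp_fail_year = np.where(rng.random(n_samples) < p_component_fail,
                              rng.integers(1, years + 1, n_samples), years)
    growth = (1 + price_growth) ** np.arange(years)
    revenue = base_revenue * growth.cumsum()[comp_fail_year - 1]
    return float(np.mean(launch_ok * revenue) - servicing_cost)

# Outer epistemic loop: utility range over each focal interval -> belief/plausibility bounds.
lows, highs, masses = [], [], []
for (lo, hi), mass in focal_elements:
    u_lo, u_hi = lifecycle_utility(lo), lifecycle_utility(hi)
    lows.append(min(u_lo, u_hi)); highs.append(max(u_lo, u_hi)); masses.append(mass)

print("expected-utility lower bound: %.1f" % np.dot(masses, lows))
print("expected-utility upper bound: %.1f" % np.dot(masses, highs))
```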

  1. Essentiality, toxicity, and uncertainty in the risk assessment of manganese.

    PubMed

    Boyes, William K

    2010-01-01

    Risk assessments of manganese by inhalation or oral routes of exposure typically acknowledge the duality of manganese as an essential element at low doses and a toxic metal at high doses. Previously, however, risk assessors were unable to describe manganese pharmacokinetics quantitatively across dose levels and routes of exposure, to account for mass balance, and to incorporate this information into a quantitative risk assessment. In addition, the prior risk assessment of inhaled manganese conducted by the U.S. Environmental Protection Agency (EPA) identified a number of specific factors that contributed to uncertainty in the risk assessment. In response to a petition regarding the use of a fuel additive containing manganese, methylcyclopentadienyl manganese tricarbonyl (MMT), the U.S. EPA developed a test rule under the U.S. Clean Air Act that required, among other things, the generation of pharmacokinetic information. This information was intended not only to aid in the design of health outcome studies, but also to help address uncertainties in the risk assessment of manganese. To date, the work conducted in response to the test rule has yielded substantial pharmacokinetic data. This information will enable the generation of physiologically based pharmacokinetic (PBPK) models capable of making quantitative predictions of tissue manganese concentrations following inhalation and oral exposure, across dose levels, and accounting for factors such as duration of exposure, different species of manganese, and changes of age, gender, and reproductive status. The work accomplished in response to the test rule, in combination with other scientific evidence, will enable future manganese risk assessments to consider tissue dosimetry more comprehensively than was previously possible.

  2. Safety assessment of plant food supplements (PFS).

    PubMed

    van den Berg, Suzanne J P L; Serra-Majem, Lluis; Coppens, Patrick; Rietjens, Ivonne M C M

    2011-12-01

    Botanicals and botanical preparations, including plant food supplements (PFS), are widely used in Western diets. The growing use of PFS is accompanied by an increasing concern because the safety of these PFS is not generally assessed before they enter the market. Regulatory bodies have become more aware of this and are increasing their efforts to ensure the safety of PFS. The present review describes an overview of the general framework for the safety assessment of PFS, focusing on the different approaches currently in use to assess the safety of botanicals and/or botanical compounds, including their history of safe use, the tiered approach proposed by the European Food Safety Authority (EFSA), the Threshold of Toxicological Concern (TTC) and the Margin of Exposure (MOE) concept. Moreover, some examples of botanical compounds in PFS that may be of concern are discussed. Altogether, it is clear that "natural" does not equal "safe" and that PFS may contain compounds of concern at levels far above those found in the regular diet. In addition, the traditional use of a PFS compound as a herb or tea does not guarantee its safety when used as a supplement. This points at a need for stricter regulation and control of botanical containing products, especially given their expanding market volume.

  3. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  4. Safety Assessment of Polyether Lanolins as Used in Cosmetics.

    PubMed

    Becker, Lillian C; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan; Heldreth, Bart

    The Cosmetic Ingredient Review (CIR) Expert Panel (Panel) assessed the safety of 39 polyether lanolin ingredients as used in cosmetics. These ingredients function mostly as hair conditioning agents, skin conditioning agent-emollients, and surfactant-emulsifying agents. The Panel reviewed available animal and clinical data, from previous CIR safety assessments of related ingredients and components. The similar structure, properties, functions, and uses of these ingredients enabled grouping them and using the available toxicological data to assess the safety of the entire group. The Panel concluded that these polyether lanolin ingredients are safe in the practices of use and concentration as given in this safety assessment.

  5. Scheme for the selection of measurement uncertainty models in blood establishments' screening immunoassays.

    PubMed

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard; de Sousa, Gracinda

    2015-02-01

    Blood establishments routinely perform screening immunoassays to assess safety of the blood components. As with any other screening test, results have an inherent uncertainty. In blood establishments the major concern is the chance of false negatives, due to its possible impact on patients' health. This article briefly reviews GUM and diagnostic accuracy models for screening immunoassays, recommending a scheme to support the screening laboratories' staffs on the selection of a model considering the intended use of the screening results (i.e., post-transfusion safety). The discussion is grounded on a "risk-based thinking", risk being considered from the blood donor selection to the screening immunoassays. A combination of GUM and diagnostic accuracy models to evaluate measurement uncertainty in blood establishments is recommended. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  7. Need for an "integrated safety assessment" of GMOs, linking food safety and environmental considerations.

    PubMed

    Haslberger, Alexander G

    2006-05-03

    Evidence for substantial environmental influences on health and food safety comes from work with environmental health indicators which show that agroenvironmental practices have direct and indirect effects on human health, concluding that "the quality of the environment influences the quality and safety of foods" (Fennema, O. Environ. Health Perspect. 1990, 86, 229-232). In the field of genetically modified organisms (GMOs), Codex principles have been established for the assessment of GM food safety and the Cartagena Protocol on Biosafety outlines international principles for an environmental assessment of living modified organisms. Both concepts also contain starting points for an assessment of health/food safety effects of GMOs in cases when the environment is involved in the chain of events that could lead to hazards. The environment can act as a route of unintentional entry of GMOs into the food supply, such as in the case of gene flow via pollen or seeds from GM crops, but the environment can also be involved in changes of GMO-induced agricultural practices with relevance for health/food safety. Examples for this include potential regional changes of pesticide uses and reduction in pesticide poisonings resulting from the use of Bt crops or influences on immune responses via cross-reactivity. Clearly, modern methods of biotechnology in breeding are involved in the reasons behind the rapid reduction of local varieties in agrodiversity, which constitute an identified hazard for food safety and food security. The health/food safety assessment of GM foods in cases when the environment is involved needs to be informed by data from environmental assessment. Such data might be especially important for hazard identification and exposure assessment. International organizations working in these areas will very likely be needed to initiate and enable cooperation between those institutions responsible for the different assessments, as well as for exchange and analysis of

  8. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, improved technologies and newly available data have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc. They have also begun to understand the role of uncertainty in seismic hazard analysis. However, how to handle the existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained in a statistical way, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts from classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is solved using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California in the conventional way. In this study, the standard deviations that show variations within each site class obtained by fuzzy set theory and by the classical approach are compared. The results show that when data for hazard assessment are insufficient, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
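
    The membership-function idea can be sketched with overlapping triangular functions on a site-classification axis, so a site near a class border belongs partially to both neighboring classes instead of being forced into one. The Vs30 axis, class centers, and example site below are illustrative assumptions, not the classification used in the study.

```python
import numpy as np

# Minimal sketch: overlapping triangular membership functions for site classes.
def triangular(x, left, center, right):
    """Triangular membership function on [left, right] peaking at center."""
    x = np.asarray(x, dtype=float)
    up = np.clip((x - left) / (center - left), 0.0, 1.0)
    down = np.clip((right - x) / (right - center), 0.0, 1.0)
    return np.minimum(up, down)

# Assumed class centers (m/s) for soft soil, stiff soil and rock.
classes = {
    "soft soil":  (100.0, 200.0, 400.0),
    "stiff soil": (200.0, 400.0, 800.0),
    "rock":       (400.0, 800.0, 1500.0),
}

vs30 = 380.0   # a site close to the soft/stiff boundary
memberships = {name: float(triangular(vs30, *abc)) for name, abc in classes.items()}
print(memberships)   # partial membership in both 'soft soil' and 'stiff soil'
```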

  9. Damage assessment of composite plate structures with material and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  10. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p

  11. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties are possibly reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectivity of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
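
    One ingredient of such a framework, the learning from a limited flood record, can be sketched with a conjugate Beta-Binomial update of an annual exceedance probability: a short record leaves a wide credible interval that narrows as more years of data arrive. The prior, record lengths, and exceedance counts below are illustrative assumptions, not the AdaptRisk implementation.

```python
from scipy import stats

# Minimal sketch: Bayesian estimate of an annual flood-exceedance probability from
# a limited record, using a conjugate Beta-Binomial model.
prior_a, prior_b = 1.0, 20.0          # weakly informative prior (~5% exceedance), assumed

def posterior(n_years, n_exceed):
    return stats.beta(prior_a + n_exceed, prior_b + n_years - n_exceed)

for n_years, n_exceed in [(30, 2), (80, 5)]:      # short record vs. longer record
    post = posterior(n_years, n_exceed)
    lo, hi = post.ppf([0.05, 0.95])
    print(f"{n_years:3d} yrs, {n_exceed} exceedances: "
          f"mean p = {post.mean():.3f}, 90% credible interval [{lo:.3f}, {hi:.3f}]")
```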

  12. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul; Al Hassan, Mohammad; Ring, Robert

    2017-01-01

    Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion and, with examples, will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  13. Safety assessment and detection methods of genetically modified organisms.

    PubMed

    Xu, Rong; Zheng, Zhe; Jiao, Guanglian

    2014-01-01

    Genetically modified organisms (GMOs) are gaining importance in agriculture as well as in the production of food and feed. Along with the development of GMOs, health and food safety concerns have been raised. These concerns make it necessary to set up a strict system for the food safety assessment of GMOs. The food safety assessment of GMOs, the current development status of safety and precise transgenic technologies, and GMO detection are discussed in this review. Recent patents about GMOs and their detection methods are also reviewed. This review provides an elementary introduction on how to assess and detect GMOs.

  14. Analysis of safety impacts of access management alternatives using the surrogate safety assessment model : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    The purpose of this study was to evaluate if the Surrogate Safety Assessment Model (SSAM) could be used to assess the safety of a highway segment or an intersection in terms of the number and type of conflicts and to compare the safety effects of mul...

  15. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of the hydraulic model is not only associated with the limitations of that model and the shortcomings of the data. An important factor that has a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as an input to hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EUROCORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability can be very time consuming computationally. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the

  16. Risk Assessment in Underground Coalmines Using Fuzzy Logic in the Presence of Uncertainty

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Ala, Charan Kumar

    2018-04-01

    Fatal accidents occur every year as regular events in the Indian coal mining industry. To improve safety conditions, performing risk assessments of the various operations in mines has become a prerequisite. However, due to uncertain accident data, it is hard to conduct a risk assessment in mines. The objective of this study is to present a method to assess safety risks in underground coalmines. The assessment of safety risks is based on the fuzzy reasoning approach. A Mamdani fuzzy logic model is developed in the fuzzy logic toolbox of MATLAB. A case study is used to demonstrate the applicability of the developed model. The summary of the risk evaluation in the case study mine indicated that mine fire has the highest risk level among all the hazard factors. This study could help mine management to prepare safety measures based on the risk rankings obtained.

  17. Comparative safety assessment of plant-derived foods.

    PubMed

    Kok, E J; Keijer, J; Kleter, G A; Kuiper, H A

    2008-02-01

    The second generation of genetically modified (GM) plants that are moving towards the market are characterized by modifications that may be more complex and traits that more often are to the benefit of the consumer. These developments will have implications for the safety assessment of the resulting plant products. In part of the cases the same crop plant can, however, also be obtained by 'conventional' breeding strategies. The breeder will decide on a case-by-case basis what will be the best strategy to reach the set target and whether genetic modification will form part of this strategy. This article discusses important aspects of the safety assessment of complex products derived from newly bred plant varieties obtained by different breeding strategies. On the basis of this overview, we conclude that the current process of the safety evaluation of GM versus conventionally bred plants is not well balanced. GM varieties are elaborately assessed, yet at the same time other crop plants resulting from conventional breeding strategies may warrant further food safety assessment for the benefit of the consumer. We propose to develop a general screening frame for all newly developed plant varieties to select varieties that cannot, on the basis of scientific criteria, be considered as safe as plant varieties that are already on the market.

  18. Initial development of a practical safety audit tool to assess fleet safety management practices.

    PubMed

    Mitchell, Rebecca; Friswell, Rena; Mooren, Lori

    2012-07-01

    Work-related vehicle crashes are a common cause of occupational injury. Yet, there are few studies that investigate management practices used for light vehicle fleets (i.e. vehicles less than 4.5 tonnes). One of the impediments to obtaining and sharing information on effective fleet safety management is the lack of an evidence-based, standardised measurement tool. This article describes the initial development of an audit tool to assess fleet safety management practices in light vehicle fleets. The audit tool was developed by triangulating information from a review of the literature on fleet safety management practices and from semi-structured interviews with 15 fleet managers and 21 fleet drivers. A preliminary useability assessment was conducted with 5 organisations. The audit tool assesses the management of fleet safety against five core categories: (1) management, systems and processes; (2) monitoring and assessment; (3) employee recruitment, training and education; (4) vehicle technology, selection and maintenance; and (5) vehicle journeys. Each of these core categories has between 1 and 3 sub-categories. Organisations are rated at one of 4 levels on each sub-category. The fleet safety management audit tool is designed to identify the extent to which fleet safety is managed in an organisation against best practice. It is intended that the audit tool be used to conduct audits within an organisation to provide an indicator of progress in managing fleet safety and to consistently benchmark performance against other organisations. Application of the tool by fleet safety researchers is now needed to inform its further development and refinement and to permit psychometric evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Animal-Free Chemical Safety Assessment

    PubMed Central

    Loizou, George D.

    2016-01-01

    The exponential growth of the Internet of Things and the global popularity and remarkable decline in cost of the mobile phone is driving the digital transformation of medical practice. The rapidly maturing digital, non-medical world of mobile (wireless) devices, cloud computing and social networking is coalescing with the emerging digital medical world of omics data, biosensors and advanced imaging which offers the increasingly realistic prospect of personalized medicine. Described as a potential “seismic” shift from the current “healthcare” model to a “wellness” paradigm that is predictive, preventative, personalized and participatory, this change is based on the development of increasingly sophisticated biosensors which can track and measure key biochemical variables in people. Additional key drivers in this shift are metabolomic and proteomic signatures, which are increasingly being reported as pre-symptomatic, diagnostic and prognostic of toxicity and disease. These advancements also have profound implications for toxicological evaluation and safety assessment of pharmaceuticals and environmental chemicals. An approach based primarily on human in vivo and high-throughput in vitro human cell-line data is a distinct possibility. This would transform current chemical safety assessment practice which operates in a human “data poor” to a human “data rich” environment. This could also lead to a seismic shift from the current animal-based to an animal-free chemical safety assessment paradigm. PMID:27493630

  20. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  1. County-Level Climate Uncertainty for Risk Assessments: Volume 1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  2. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.

  3. Dynamic rating curve assessment in hydrometric stations and calculation of the associated uncertainties : Quality and monitoring indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine

    2013-04-01

    Whether we talk about safety reasons, energy production or regulation, water resources management is one of EDF's (French hydropower company) main concerns. To meet these needs, EDF-DTG has operated since the fifties a hydrometric network that includes more than 350 hydrometric stations. The data collected allow real-time monitoring of rivers (hydro-meteorological forecasts at points of interest), as well as hydrological studies and the sizing of structures. Ensuring the quality of stream flow data is a priority. A rating curve is an indirect method of estimating the discharge in rivers based on water level measurements. The value of discharge obtained from the rating curve is not entirely accurate due to the constant changes of the river bed morphology, to the precision of the gaugings (direct, point discharge measurements) and to the quality of the tracing. As time goes on, the discharge estimated from a rating curve "ages" and its uncertainty increases: therefore the final level of uncertainty remains particularly difficult to assess. Moreover, the current EDF capacity to produce a rating curve is not suited to the frequency of change of the stage-discharge relationship. The current method does not take into consideration the variation of the flow conditions and the modifications of the river bed which occur due to natural processes such as erosion, sedimentation and seasonal vegetation growth. In order to get the most accurate stream flow data and to improve their reliability, this study develops an original "dynamic" method to compute rating curves based on historical gaugings from a hydrometric station. A curve is computed for each new gauging and a model of uncertainty is adjusted for each of them. The model of uncertainty takes into account the inaccuracies in the measurement of the water height, the quality of the tracing, the uncertainty of the gaugings and the aging of the confidence intervals calculated with a variographic
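
    The core of a rating-curve update can be sketched as a power-law fit Q = a(h - h0)^b to the available gaugings, with an uncertainty band that is inflated as the gaugings age. The gauging data, cease-to-flow stage, and the simple aging rule below are illustrative assumptions, not the EDF-DTG variographic model.

```python
import numpy as np

# Minimal sketch: fit a power-law rating curve Q = a * (h - h0)**b to gaugings by
# least squares in log space, and inflate the uncertainty band with gauging age.
rng = np.random.default_rng(6)

h0 = 0.20                                      # assumed cease-to-flow stage (m)
stages = np.array([0.5, 0.8, 1.1, 1.6, 2.2, 3.0])           # gauged stages (m)
true_q = 12.0 * (stages - h0) ** 1.6
gauged_q = true_q * (1 + rng.normal(0, 0.05, size=stages.size))  # ~5% gauging error
age_years = np.array([6.0, 5.0, 4.0, 3.0, 1.0, 0.2])        # time since each gauging

# Log-log linear regression: log Q = log a + b * log(h - h0)
X = np.column_stack([np.ones_like(stages), np.log(stages - h0)])
coef, *_ = np.linalg.lstsq(X, np.log(gauged_q), rcond=None)
a, b = np.exp(coef[0]), coef[1]

def discharge(h):
    return a * (h - h0) ** b

# Simple aging rule (assumption): relative uncertainty grows with mean gauging age.
base_rel_unc = 0.07
rel_unc = base_rel_unc * (1 + 0.1 * age_years.mean())

for h in (1.0, 2.5):
    print(f"h = {h:.1f} m -> Q = {discharge(h):6.1f} m3/s  ± {100 * rel_unc:.0f}%")
```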

  4. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    NASA Astrophysics Data System (ADS)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody when the waterbody meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform predictive uncertainty analysis of stream flow. We used Monte-Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically-based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for predictive uncertainty analysis. We set aside flow data that occurred on days that bacteria samples were collected. We did not use these flows in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on 1,000 model runs. We also used several methods to visualize results with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
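
    The per-observation uncertainty calculation in the abstract can be sketched in a few lines. This is a hedged illustration on synthetic data (the ensemble size matches the 1,000 runs mentioned above, but the lognormal flows and the 50 observations are placeholders, not HSPF/PEST output): the percent uncertainty of each flow observation is taken as the width of the 95% ensemble interval relative to the ensemble median.

      import numpy as np

      rng = np.random.default_rng(1)
      n_runs, n_obs = 1000, 50                        # 1,000 parameter sets, hypothetical record length
      sim = rng.lognormal(mean=1.0, sigma=0.3, size=(n_runs, n_obs))   # simulated flows

      lo, med, hi = np.percentile(sim, [2.5, 50.0, 97.5], axis=0)
      pct_uncertainty = 100.0 * (hi - lo) / med       # percent uncertainty per flow observation
      print(pct_uncertainty[:5].round(1))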

  5. Assessment of uncertainties of the models used in thermal-hydraulic computer codes

    NASA Astrophysics Data System (ADS)

    Gricay, A. S.; Migrov, Yu. A.

    2015-09-01

    The article deals with matters concerned with the problem of determining the statistical characteristics of variable parameters (the variation range and distribution law) in analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular, in the closing correlations of the loop thermal hydraulics block, is shown. Such a method shall feature the minimal degree of subjectivism and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in the above-mentioned range provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated taking as an example the problem of estimating the uncertainty of a parameter appearing in the model describing transition to post-burnout heat transfer that is used in the thermal-hydraulic computer code KORSAR. The performed study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in the above-mentioned range by the Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes. In some cases, application of the method can make it possible to achieve a smaller degree of conservatism in the expert estimates of uncertainties pertinent to the model parameters used in computer codes.

  6. Dispelling urban myths about default uncertainty factors in chemical risk assessment--sufficient protection against mixture effects?

    PubMed

    Martin, Olwenn V; Scholze, Martin; Kortenkamp, Andreas

    2013-07-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an
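
    The sensitivity of the combined factor to the assumed distributions can be made concrete with a small Monte Carlo sketch (a hedged illustration, not taken from the paper: the medians of 10 for each sub-factor and the spreads below are purely illustrative). The same nominal 10 x 10 = 100 default factor yields noticeably different upper percentiles under lognormal and uniform assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      # Assumption: each sub-factor has a median of 10; the spreads are illustrative only.
      tk_logn = rng.lognormal(np.log(10), 0.4, n)     # toxicokinetic sub-factor, lognormal
      td_logn = rng.lognormal(np.log(10), 0.4, n)     # toxicodynamic sub-factor, lognormal
      tk_unif = rng.uniform(5, 15, n)                 # the same sub-factors, uniform
      td_unif = rng.uniform(5, 15, n)

      for label, product in [("lognormal", tk_logn * td_logn),
                             ("uniform", tk_unif * td_unif)]:
          print(f"{label:9s} 95th percentile of the combined factor: "
                f"{np.percentile(product, 95):.0f}")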

  7. Quantified Uncertainties in Comparative Life Cycle Assessment: What Can Be Concluded?

    PubMed Central

    2018-01-01

    Interpretation of comparative Life Cycle Assessment (LCA) results can be challenging in the presence of uncertainty. To aid in interpreting such results under the goal of any comparative LCA, we aim to provide guidance to practitioners by gaining insights into uncertainty-statistics methods (USMs). We review five USMs—discernibility analysis, impact category relevance, overlap area of probability distributions, null hypothesis significance testing (NHST), and modified NHST—and provide a common notation, terminology, and calculation platform. We further cross-compare all USMs by applying them to a case study on electric cars. USMs belong to either a confirmatory or an exploratory branch of statistics, each serving different purposes for practitioners. Results highlight that common uncertainties and the magnitude of differences per impact are key in offering reliable insights. Common uncertainties are particularly important as disregarding them can lead to incorrect recommendations. On the basis of these considerations, we recommend the modified NHST as a confirmatory USM. We also recommend discernibility analysis as an exploratory USM along with recommendations for its improvement, as it disregards the magnitude of the differences. While further research is necessary to support our conclusions, the results and supporting material provided can help LCA practitioners in delivering a more robust basis for decision-making. PMID:29406730
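
    A minimal sketch of the discernibility idea mentioned above (not the paper's implementation): with paired Monte Carlo samples that share the common uncertainties, the analysis simply counts how often one alternative scores lower than the other. The impact values, spreads and the shared-uncertainty term below are illustrative only.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 10_000
      shared = rng.normal(0.0, 0.1, n)                    # common (correlated) uncertainty
      impact_a = 10.0 + shared + rng.normal(0, 0.3, n)    # impact score of alternative A
      impact_b = 10.2 + shared + rng.normal(0, 0.3, n)    # impact score of alternative B

      frac_a_lower = np.mean(impact_a < impact_b)
      print(f"A has the lower impact in {100 * frac_a_lower:.1f} % of paired samples")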

  8. Safety assessment for hair-spray resins: risk assessment based on rodent inhalation studies.

    PubMed

    Carthew, Philip; Griffiths, Heather; Keech, Stephen; Hartop, Peter

    2002-04-01

    The methods involved in the safety assessment of resins used in hair-spray products have received little peer review, or debate in the published literature, despite their widespread use, in both hairdressing salons and the home. The safety assessment for these resins currently involves determining the type of lung pathology that can be caused in animal inhalation exposure studies, and establishing the no-observable-effect level (NOEL) for these pathologies. The likely human consumer exposure is determined by techniques that model the simulated exposure under "in use" conditions. From these values it is then possible to derive the likely safety factors for human exposure. An important part of this process would be to recognize the intrinsic differences between rodents and humans in terms of the respiratory doses that each species experiences during inhalation exposures, for the purpose of the safety assessment. Interspecies scaling factors become necessary when comparing the exposure doses experienced by rats, compared to humans, because of basic differences between species in lung clearance rates and the alveolar area in the lungs. The rodent inhalation data and modeled human exposure to Resin 6965, a resin polymer that is based on vinyl acetate, has been used to calculate the safety factor for human consumer exposure to this resin, under a range of "in use" exposure conditions. The use of this safety assessment process clearly demonstrates that Resin 6965 is acceptable for human consumer exposure under the conditions considered in this risk assessment.

  9. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
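
    The compliance point can be illustrated with a tiny sketch (illustrative numbers only, not taken from the chapter): a specification limit is only demonstrably exceeded when the result minus the expanded uncertainty still lies above the limit.

      def exceeds_limit(result, u_standard, limit, k=2.0):
          # True only if the limit is still exceeded after subtracting the expanded
          # uncertainty U = k * u_standard (coverage factor k of about 2 for ~95 %).
          return (result - k * u_standard) > limit

      print(exceeds_limit(result=54.0, u_standard=1.5, limit=50.0))   # True: exceedance demonstrated
      print(exceeds_limit(result=51.0, u_standard=1.5, limit=50.0))   # False: result is inconclusive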

  10. Space Radiation Cancer Risks and Uncertainties for Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Badhwar, G. D.; Saganti, P. B.; Dicello, J. F.

    2001-01-01

    Projecting cancer risks from exposure to space radiation is highly uncertain because of the absence of data for humans and because of the limited radiobiology data available for estimating late effects from the high-energy and charge (HZE) ions present in the galactic cosmic rays (GCR). Cancer risk projections involve many biological and physical factors, each of which has a differential range of uncertainty due to the lack of data and knowledge. We discuss an uncertainty assessment within the linear-additivity model using the approach of Monte Carlo sampling from subjective error distributions that represent the lack of knowledge in each factor to quantify the overall uncertainty in risk projections. Calculations are performed using the space radiation environment and transport codes for several Mars mission scenarios. This approach leads to estimates of the uncertainties in cancer risk projections of 400-600% for a Mars mission. The uncertainties in the quality factors are dominant. Using safety standards developed for low-Earth orbit, long-term space missions (>90 days) outside the Earth's magnetic field are currently unacceptable if the confidence levels in risk projections are considered. Because GCR exposures involve multiple particle or delta-ray tracks per cellular array, our results suggest that the shape of the dose response at low dose rates may be an additional uncertainty for estimating space radiation risks.
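
    A hedged sketch of the Monte Carlo propagation described above (the nominal risk and the subjective error distributions are illustrative placeholders, not the paper's inputs): each factor in the risk projection is multiplied by an uncertainty factor drawn from its error distribution, and the spread of the product quantifies the overall uncertainty.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      point_risk = 0.03                            # nominal lifetime cancer risk (hypothetical)
      x_quality = rng.lognormal(0.0, 0.6, n)       # quality-factor uncertainty (dominant term)
      x_dose = rng.lognormal(0.0, 0.3, n)          # dose / transport-code uncertainty
      x_biology = rng.lognormal(0.0, 0.2, n)       # dose-rate and other biological factors

      risk = point_risk * x_quality * x_dose * x_biology
      upper = np.percentile(risk, 97.5) / point_risk
      print(f"97.5th percentile of the risk distribution is {upper:.1f}x the point estimate")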

  11. Pharmacological mechanism-based drug safety assessment and prediction.

    PubMed

    Abernethy, D R; Woodcock, J; Lesko, L J

    2011-06-01

    Advances in cheminformatics, bioinformatics, and pharmacology in the context of biological systems are now at a point that these tools can be applied to mechanism-based drug safety assessment and prediction. The development of such predictive tools at the US Food and Drug Administration (FDA) will complement ongoing efforts in drug safety that are focused on spontaneous adverse event reporting and active surveillance to monitor drug safety. This effort will require the active collaboration of scientists in the pharmaceutical industry, academe, and the National Institutes of Health, as well as those at the FDA, to reach its full potential. Here, we describe the approaches and goals for the mechanism-based drug safety assessment and prediction program.

  12. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown in the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessments for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable and we assessed the uncertainty propagation on HD strength parameters obtained by

  13. Assessing the Uncertainties on Seismic Source Parameters: Towards Realistic Estimates of Moment Tensor Determinations

    NASA Astrophysics Data System (ADS)

    Magnoni, F.; Scognamiglio, L.; Tinti, E.; Casarotti, E.

    2014-12-01

    Seismic moment tensor is one of the most important source parameters defining the earthquake dimension and style of the activated fault. Moment tensor catalogues are ordinarily used by geoscientists; however, few attempts have been made to assess the possible impacts of moment magnitude uncertainties on their analyses. The 2012 May 20 Emilia mainshock is a representative event since it is defined in the literature with a moment magnitude value (Mw) spanning between 5.63 and 6.12. An uncertainty of ~0.5 units in magnitude leads to a controversial knowledge of the real size of the event. The possible uncertainty associated with this estimate could be critical for the inference of other seismological parameters, suggesting caution for seismic hazard assessment, coulomb stress transfer determination and other analyses where self-consistency is important. In this work, we focus on the variability of the moment tensor solution, highlighting the effect of four different velocity models, different types and ranges of filtering, and two different methodologies. Using a larger dataset, to better quantify the source parameter uncertainty, we also analyze the variability of the moment tensor solutions depending on the number, the epicentral distance and the azimuth of used stations. We stress that the estimate of seismic moment from moment tensor solutions, as well as the estimates of the other kinematic source parameters, cannot be considered absolute values and must be reported with the related uncertainties, within a reproducible framework characterized by disclosed assumptions and explicit processing workflows.

  14. Uncertainty in assessment of radiation-induced diffusion index changes in individual patients

    NASA Astrophysics Data System (ADS)

    Nazem-Zadeh, Mohammad-Reza; Chapman, Christopher H.; Lawrence, Theodore S.; Tsien, Christina I.; Cao, Yue

    2013-06-01

    The purpose of this study is to evaluate repeatability coefficients of diffusion tensor indices to assess whether longitudinal changes in diffusion indices were true changes beyond the uncertainty for individual patients undergoing radiation therapy (RT). Twenty-two patients who had low-grade or benign tumors and were treated by partial brain radiation therapy (PBRT) participated in an IRB-approved MRI protocol. The diffusion tensor images in the patients were acquired pre-RT, week 3 during RT, at the end of RT, and 1, 6, and 18 months after RT. As a measure of uncertainty, repeatability coefficients (RC) of diffusion indices in the segmented cingulum, corpus callosum, and fornix were estimated by using test-retest diffusion tensor datasets from the National Biomedical Imaging Archive (NBIA) database. The upper and lower limits of the 95% confidence interval of the estimated RC from the test and retest data were used to evaluate whether the longitudinal percentage changes in diffusion indices in the segmented structures in the individual patients were beyond the uncertainty and thus could be considered as true radiation-induced changes. Diffusion indices in different white matter structures showed different uncertainty ranges. The estimated RC for fractional anisotropy (FA) ranged from 5.3% to 9.6%, for mean diffusivity (MD) from 2.2% to 6.8%, for axial diffusivity (AD) from 2.4% to 5.5%, and for radial diffusivity (RD) from 2.9% to 9.7%. Overall, 23% of the patients treated by RT had FA changes, 44% had MD changes, 50% had AD changes, and 50% had RD changes beyond the uncertainty ranges. In the fornix, 85.7% and 100% of the patients showed changes beyond the uncertainty range at 6 and 18 months after RT, demonstrating that radiation has a pronounced late effect on the fornix compared to other segmented structures. It is critical to determine reliability of a change observed in an individual patient for clinical decision making. Assessments of the repeatability and
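
    The repeatability coefficient used above can be sketched briefly. This is a hedged example using the common definition RC = 1.96 * sqrt(2) * s_w, where s_w is the within-subject standard deviation of the test-retest pair; the fractional anisotropy values below are synthetic, not the NBIA data used in the study.

      import numpy as np

      rng = np.random.default_rng(4)
      true_fa = rng.normal(0.45, 0.05, 30)             # 30 hypothetical subjects
      test = true_fa + rng.normal(0, 0.01, 30)         # test scan
      retest = true_fa + rng.normal(0, 0.01, 30)       # retest scan

      diff = test - retest
      s_w = np.sqrt(np.mean(diff ** 2) / 2.0)          # within-subject standard deviation
      rc = 1.96 * np.sqrt(2.0) * s_w                   # repeatability coefficient
      rc_percent = 100.0 * rc / np.mean((test + retest) / 2.0)
      print(f"RC = {rc:.4f} (about {rc_percent:.1f} % of the mean FA)")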

  15. Relationship between Physicians' Uncertainty about Clinical Assessments and Patient-Centered Recommendations for Colorectal Cancer Screening in the Elderly.

    PubMed

    Dalton, Alexandra F; Golin, Carol E; Esserman, Denise; Pignone, Michael P; Pathman, Donald E; Lewis, Carmen L

    2015-05-01

    The goal of this study was to examine associations between physicians' clinical assessments, their certainty in these assessments, and the likelihood of a patient-centered recommendation about colorectal cancer (CRC) screening in the elderly. Two hundred seventy-six primary care physicians in the United States read 3 vignettes about an 80-year-old female patient and answered questions about her life expectancy, their confidence in their life expectancy estimate, the balance of benefits/downsides of CRC screening, their certainty in their benefit/downside assessment, and the best course of action regarding CRC screening. We used logistic regression to determine the relationship between these variables and patient-centered recommendations about CRC screening. In bivariate analyses, physicians had higher odds of making a patient-centered recommendation about CRC screening when their clinical assessments did not lead to a clear screening recommendation or when they experienced uncertainty in their clinical assessments. However, in a multivariate regression model, only benefit/downside assessment and best course of action remained statistically significant predictors of a patient-centered recommendation. Our findings demonstrate that when the results of clinical assessments do not lead to obvious screening decisions or when physicians feel uncertain about their clinical assessments, they are more likely to make patient-centered recommendations. Existing uncertainty frameworks do not adequately describe the uncertainty associated with patient-centered recommendations found in this study. Adapting or modifying these frameworks to better reflect the constructs associated with uncertainty and the interactions between uncertainty and the complexity inherent in clinical decisions will facilitate a more complete understanding of how and when physicians choose to include patients in clinical decisions. © The Author(s) 2015.

  16. LANL Safety Conscious Work Environment (SCWE) Self-Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargis, Barbara C.

    2014-01-29

    On December 21, 2012 Secretary of Energy Chu transmitted to the Defense Nuclear Facilities Safety Board (DNFSB) revised commitments on the implementation plan for Safety Culture at the Waste Treatment and Immobilization Plant. Action 2-5 was revised to require contractors and federal organizations to complete Safety Conscious Work Environment (SCWE) self-assessments and provide reports to the appropriate U.S. Department of Energy (DOE) - Headquarters Program Office by September 2013. Los Alamos National Laboratory (LANL) planned and conducted a Safety Conscious Work Environment (SCWE) Self-Assessment over the time period July through August, 2013 in accordance with the SCWE Self-Assessment Guidance provided by DOE. Significant field work was conducted over the 2-week period August 5-16, 2013. The purpose of the self-assessment was to evaluate whether programs and processes associated with a SCWE are in place and whether they are effective in supporting and promoting a SCWE.

  17. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s era shock-tube and constricted-arc experimental cases. It is shown that experiments contain shock layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range

  18. Probabilistic approach for decay heat uncertainty estimation using URANIE platform and MENDEL depletion code

    NASA Astrophysics Data System (ADS)

    Tsilanizara, A.; Gilardi, N.; Huynh, T. D.; Jouanne, C.; Lahaye, S.; Martinez, J. M.; Diop, C. M.

    2014-06-01

    The knowledge of the decay heat quantity and the associated uncertainties are important issues for the safety of nuclear facilities. Many codes are available to estimate the decay heat. ORIGEN, FISPACT, DARWIN/PEPIN2 are part of them. MENDEL is a new depletion code developed at CEA, with new software architecture, devoted to the calculation of physical quantities related to fuel cycle studies, in particular decay heat. The purpose of this paper is to present a probabilistic approach to assess decay heat uncertainty due to the decay data uncertainties from nuclear data evaluation like JEFF-3.1.1 or ENDF/B-VII.1. This probabilistic approach is based both on MENDEL code and URANIE software which is a CEA uncertainty analysis platform. As preliminary applications, single thermal fission of uranium 235, plutonium 239 and PWR UOx spent fuel cell are investigated.

  19. Safety management system needs assessment.

    DOT National Transportation Integrated Search

    2016-04-01

    The safety of the traveling public is critical as each year there are approximately 200 highway fatalities in Nebraska and numerous crash injuries. The objective of this research was to conduct a needs assessment to identify the requirements of a sta...

  20. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  1. Methodology to assess clinical liver safety data.

    PubMed

    Merz, Michael; Lee, Kwan R; Kullak-Ublick, Gerd A; Brueckner, Andreas; Watkins, Paul B

    2014-11-01

    Analysis of liver safety data has to be multivariate by nature and needs to take into account time dependency of observations. Current standard tools for liver safety assessment such as summary tables, individual data listings, and narratives address these requirements to a limited extent only. Using graphics in the context of a systematic workflow including predefined graph templates is a valuable addition to standard instruments, helping to ensure completeness of evaluation, and supporting both hypothesis generation and testing. Employing graphical workflows interactively allows analysis in a team-based setting and facilitates identification of the most suitable graphics for publishing and regulatory reporting. Another important tool is statistical outlier detection, accounting for the fact that for assessment of Drug-Induced Liver Injury, identification and thorough evaluation of extreme values has much more relevance than measures of central tendency in the data. Taken together, systematical graphical data exploration and statistical outlier detection may have the potential to significantly improve assessment and interpretation of clinical liver safety data. A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials.

  2. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.

  3. Dispelling urban myths about default uncertainty factors in chemical risk assessment – sufficient protection against mixture effects?

    PubMed Central

    2013-01-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an

  4. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    NASA Astrophysics Data System (ADS)

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information
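
    A minimal Bayesian-updating sketch in the spirit of the approach above (this is not the authors' BBN; the prior and the stage likelihoods are purely illustrative): the four candidate CSMs come from crossing the two source-zone and the two geological interpretations, and the belief in each is renormalised after every investigation stage.

      import numpy as np

      csms = ["separate phase + fractured till", "separate phase + unfractured till",
              "no separate phase + fractured till", "no separate phase + unfractured till"]
      belief = np.full(4, 0.25)                      # uniform prior over the four CSMs

      # Hypothetical likelihoods P(observations | CSM) for two investigation stages.
      stage_likelihoods = [
          np.array([0.6, 0.3, 0.4, 0.2]),            # screening investigation
          np.array([0.7, 0.2, 0.3, 0.1]),            # more detailed investigation
      ]

      for lik in stage_likelihoods:
          belief = belief * lik
          belief /= belief.sum()                     # renormalise the posterior belief

      for name, b in zip(csms, belief):
          print(f"{b:5.2f}  {name}")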

  5. Data related uncertainty in near-surface vulnerability assessments for agrochemicals in the San Joaquin Valley.

    PubMed

    Loague, Keith; Blanke, James S; Mills, Melissa B; Diaz-Diaz, Ricardo; Corwin, Dennis L

    2012-01-01

    Precious groundwater resources across the United States have been contaminated due to decades-long nonpoint-source applications of agricultural chemicals. Assessing the impact of past, ongoing, and future chemical applications for large-scale agriculture operations is timely for designing best-management practices to prevent subsurface pollution. Presented here are the results from a series of regional-scale vulnerability assessments for the San Joaquin Valley (SJV). Two relatively simple indices, the retardation and attenuation factors, are used to estimate near-surface vulnerabilities based on the chemical properties of 32 pesticides and the variability of both soil characteristics and recharge rates across the SJV. The uncertainties inherent in these assessments, derived from the uncertainties within the chemical and soil databases, are estimated using first-order analyses. The results are used to screen and rank the chemicals based on mobility and leaching potential, without and with consideration of data-related uncertainties. Chemicals of historic high visibility in the SJV (e.g., atrazine, DBCP [dibromochloropropane], ethylene dibromide, and simazine) are ranked in the top half of those considered. Vulnerability maps generated for atrazine and DBCP, featured for their legacy status in the study area, clearly illustrate variations within and across the assessments. For example, the leaching potential is greater for DBCP than for atrazine, the leaching potential for DBCP is greater for the spatially variable recharge values than for the average recharge rate, and the leaching potentials for both DBCP and atrazine are greater for the annual recharge estimates than for the monthly recharge estimates. The data-related uncertainties identified in this study can be significant, targeting opportunities for improving future vulnerability assessments. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
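
    The two screening indices named above can be sketched with their commonly published forms (a hedged illustration: the retardation factor RF = 1 + rho_b * f_oc * K_oc / theta and the attenuation factor AF = exp(-0.693 * d * RF * theta / (q * t_half)) are the usual textbook definitions, and all parameter values below are illustrative rather than SJV soil or pesticide data).

      import math

      def retardation_factor(rho_b, f_oc, k_oc, theta):
          # RF: how strongly sorption slows the chemical relative to water (dimensionless).
          return 1.0 + rho_b * f_oc * k_oc / theta

      def attenuation_factor(depth_m, recharge_m_per_d, theta, t_half_d, rf):
          # AF: fraction of surface-applied mass expected to reach the given depth.
          travel_time_d = depth_m * rf * theta / recharge_m_per_d
          return math.exp(-0.693 * travel_time_d / t_half_d)

      rf = retardation_factor(rho_b=1.5, f_oc=0.005, k_oc=100.0, theta=0.30)
      af = attenuation_factor(depth_m=3.0, recharge_m_per_d=0.002, theta=0.30,
                              t_half_d=60.0, rf=rf)
      print(f"RF = {rf:.2f}, AF = {af:.2e}")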

  6. Evaluating uncertainty to strengthen epidemiologic data for use in human health risk assessments

    EPA Science Inventory

    Background: There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. While most epidemiologic studies result in uncertainty, tec...

  7. Assessment of the impact of sampler change on the uncertainty related to geothermal water sampling

    NASA Astrophysics Data System (ADS)

    Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa

    2018-02-01

    The aim of this study is to assess the impact of a change of samplers on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, an empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in each series of testing. The analyses of the results were done using the ROBAN software, based on the technique of robust analysis of variance (rANOVA). The research showed that, in the case of qualified and experienced samplers, the uncertainty connected with sampling can be reduced, which results in a small measurement uncertainty.
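
    A minimal, classical (non-robust) sketch of the duplicate-sample idea described above; the rANOVA used in the study is a robust refinement of the same principle, and the H2SiO3 concentrations below are synthetic, not the Podhale data.

      import numpy as np

      normal = np.array([55.2, 61.0, 48.7, 59.3, 52.1])       # mg/L, hypothetical normal samples
      duplicate = np.array([54.8, 60.1, 49.5, 58.9, 52.9])    # mg/L, hypothetical duplicates

      diff = normal - duplicate
      s_meas = np.sqrt(np.mean(diff ** 2) / 2.0)              # combined sampling + analysis SD
      rel_u = 100.0 * 2.0 * s_meas / np.mean(np.r_[normal, duplicate])
      print(f"expanded relative measurement uncertainty ~ {rel_u:.1f} % (k = 2)")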

  8. Safety Assessment of Synthetic Fluorphlogopite as Used in Cosmetics.

    PubMed

    Becker, Lillian C; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2015-01-01

    The Cosmetic Ingredient Review Expert Panel (the Panel) reviewed the safety of synthetic fluorphlogopite as used in cosmetics. Synthetic fluorphlogopite functions as a bulking agent and a viscosity-increasing agent. The Panel reviewed available animal and human data related to this ingredient along with a previous safety assessment of other magnesium silicates. The Panel concluded that synthetic fluorphlogopite was safe as a cosmetic ingredient in the practices of use and concentration as given in this safety assessment. © The Author(s) 2015.

  9. A novel safety assessment strategy applied to non-selective extracts.

    PubMed

    Koster, Sander; Leeman, Winfried; Verheij, Elwin; Dutman, Ellen; van Stee, Leo; Nielsen, Lene Munch; Ronsmans, Stefan; Noteborn, Hub; Krul, Lisette

    2015-06-01

    A main challenge in food safety research is to demonstrate that processing of foodstuffs does not lead to the formation of substances for which the safety upon consumption might be questioned. This is especially so since food is a complex matrix in which the analytical detection of substances, and consequent risk assessment thereof, is difficult to determine. Here, a pragmatic novel safety assessment strategy is applied to the production of non-selective extracts (NSEs), complex food mixtures prepared from reference juices and used for different purposes in food, such as colouring. The Complex Mixture Safety Assessment Strategy (CoMSAS) is an exposure-driven approach that enables efficient assessment of the safety of an NSE by focussing on newly formed substances or substances whose exposure may increase during the processing of the NSE. CoMSAS makes it possible to distinguish toxicologically relevant from toxicologically less relevant substances in relation to their respective levels of exposure. This will reduce the amount of work needed for identification, characterisation and safety assessment of unknown substances detected at low concentration, without the need for toxicity testing using animal studies. In this paper, the CoMSAS approach has been applied to elderberry and pumpkin NSEs used for food colouring purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. An end-to-end assessment of range uncertainty in proton therapy using animal tissues.

    PubMed

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-21

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams' superior dose advantage over conventional photon-based radiation therapy.

  11. Mine safety assessment using gray relational analysis and bow tie model

    PubMed Central

    2018-01-01

    Mine safety assessment is a precondition for ensuring orderly and safe production. The main purpose of this study was to prevent mine accidents more effectively by proposing a composite risk analysis model. First, the weights of the assessment indicators were determined by the revised integrated weight method, in which the objective weights were determined by a variation coefficient method and the subjective weights determined by the Delphi method. A new formula was then adopted to calculate the integrated weights based on the subjective and objective weights. Second, after the assessment indicator weights were determined, gray relational analysis was used to evaluate the safety of mine enterprises. Mine enterprise safety was ranked according to the gray relational degree, and weak links of mine safety practices were identified based on gray relational analysis. Third, to validate the revised integrated weight method adopted in the process of gray relational analysis, the fuzzy evaluation method was applied to the safety assessment of mine enterprises. Fourth, for the first time, the bow tie model was adopted to identify the causes and consequences of weak links and allow corresponding safety measures to be taken to guarantee the mine's safe production. A case study of mine safety assessment was presented to demonstrate the effectiveness and rationality of the proposed composite risk analysis model, which can be applied to other related industries for safety evaluation. PMID:29561875
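
    A hedged sketch of the gray relational analysis step described above (not the paper's code, and the indicator scores and weights are illustrative): indicator values are compared with an ideal reference sequence, gray relational coefficients are computed with a distinguishing coefficient of 0.5, and the integrated weights turn them into a gray relational degree for ranking the mines.

      import numpy as np

      scores = np.array([[0.8, 0.6, 0.9],       # mine A, three normalised safety indicators
                         [0.7, 0.9, 0.5],       # mine B
                         [0.6, 0.7, 0.8]])      # mine C
      weights = np.array([0.5, 0.3, 0.2])       # integrated (subjective + objective) weights
      rho = 0.5                                 # distinguishing coefficient

      reference = scores.max(axis=0)            # ideal sequence (larger is better)
      delta = np.abs(scores - reference)        # deviation from the ideal sequence
      coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
      degree = coeff @ weights                  # gray relational degree per mine
      print(degree.round(3))                    # higher degree = better safety ranking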

  12. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k-eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select
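
    A minimal sketch of the sensitivity-based "sandwich" propagation that underlies this kind of tool: with a vector S of relative sensitivity coefficients and a relative covariance matrix C for the cross-section data, the relative variance of the response follows S C S^T. The three nuclide-reaction pairs, their sensitivities and the covariance values below are illustrative and unrelated to any SCALE/TSUNAMI library.

      import numpy as np

      S = np.array([0.30, -0.15, 0.05])          # d(k)/k per d(sigma)/sigma for three nuclide-reactions
      C = np.array([[0.0025, 0.0005, 0.0],       # relative covariance matrix of the cross sections
                    [0.0005, 0.0016, 0.0],
                    [0.0,    0.0,    0.0100]])

      rel_var = S @ C @ S                        # sandwich rule: S C S^T
      print(f"cross-section-induced uncertainty in k-eff: {100 * np.sqrt(rel_var):.2f} %")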

  13. Mountain torrents: Quantifying vulnerability and assessing uncertainties

    PubMed Central

    Totschnig, Reinhold; Fuchs, Sven

    2013-01-01

    Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. With respect to this goal to merge different data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing vulnerability of residential buildings towards torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature. This comparison showed the wider applicability of the derived vulnerability functions. The uncertainty inherent to regression functions was quantified by the calculation of confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards within the European Alps. The method is transferable to other mountain regions if the input data needed are available. PMID:27087696

  14. Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models

    NASA Astrophysics Data System (ADS)

    Chang, Joseph C.

    This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of the dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (˜20 km in scale), in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data. This is because, despite the fact that the test site was a flat area, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. It was found that the two components were comparable for the DP26 field data, with variability more important than uncertainty closer to the source, and less important farther away from the source. Therefore, reducing data errors for input meteorology may not necessarily increase model accuracy due to random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well-measured. Another typical application of dispersion modeling is a forensic study where the data are usually quite scarce. An example would be the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization since the 1981 Iran-Iraq-war. Therefore the meteorological fields inside Iraq must be estimated by models such as prognostic mesoscale meteorological models, based on

  15. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.

  16. Assessing the Risks to Human Health in Heterogeneous Aquifers under Uncertainty

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe

    2015-04-01

    Reliable quantification of human health risk from toxic chemicals present in groundwater is a challenging task. The main difficulty lies in the fact that many of the components that constitute human health risk assessment are uncertain and require interdisciplinary knowledge. Understanding the impact from each of these components in risk estimation can provide guidance for decision makers to manage contaminated sites and best allocate resources towards minimal prediction uncertainty. This presentation will focus on the impact of aquifer heterogeneity on human health risk. Spatial heterogeneity of the hydrogeological properties can lead to the formation of preferential flow channels which control the plume spreading rates and travel time statistics, both of which are critical in assessing the risk level. By making use of an integrated hydrogeological-health stochastic framework, the significance of characteristic length scales (e.g. characterizing flow, transport and sampling devices) in both controlling the uncertainty of health risk and determining data needs is highlighted. Through a series of examples, we show how fundamental knowledge of the main physical mechanisms affecting solute pathways is necessary to understand the human health response to varying drivers.

  17. The practice of pre-marketing safety assessment in drug development.

    PubMed

    Chuang-Stein, Christy; Xia, H Amy

    2013-01-01

    The last 15 years have seen a substantial increase in efforts devoted to safety assessment by statisticians in the pharmaceutical industry. While some of these efforts were driven by regulations and public demand for safer products, much of the motivation came from the realization that there is a strong need for a systematic approach to safety planning, evaluation, and reporting at the program level throughout the drug development life cycle. An efficient process can help us identify safety signals early and afford us the opportunity to develop effective risk minimization plan early in the development cycle. This awareness has led many pharmaceutical sponsors to set up internal systems and structures to effectively conduct safety assessment at all levels (patient, study, and program). In addition to process, tools have emerged that are designed to enhance data review and pattern recognition. In this paper, we describe advancements in the practice of safety assessment during the premarketing phase of drug development. In particular, we share examples of safety assessment practice at our respective companies, some of which are based on recommendations from industry-initiated working groups on best practice in recent years.

  18. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic modeling oriented analysis of the gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications on drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. The uncertainty results in large disparities of drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies higher appeal for anthropogenic water storage and irrigation systems.

  19. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Borgonovo; C. L. Smith

    2012-10-01

    Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed to the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty i) induces notable variability in the point estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
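
    As a rough illustration of the quantities discussed above (not the authors' actual PRA model), the sketch below computes point-estimate RAW values for a small hypothetical fault-tree-style risk metric and an ERAW-like average over epistemic uncertainty; the two-branch risk metric and the lognormal uncertainty on the basic-event probabilities are assumptions made purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

def risk_metric(p):
    """Hypothetical risk metric: top event occurs if component A fails
    OR both B and C fail."""
    pA, pB, pC = p
    return 1.0 - (1.0 - pA) * (1.0 - pB * pC)

# Nominal (point-estimate) basic-event failure probabilities.
p_nom = np.array([1e-3, 5e-3, 2e-2])

def raw(component, p):
    """Risk Achievement Worth: risk with the component failed (p_i = 1)
    divided by the base-case risk."""
    p_failed = p.copy()
    p_failed[component] = 1.0
    return risk_metric(p_failed) / risk_metric(p)

# Epistemic uncertainty: lognormal spread around the nominal values
# (hypothetical error factors), sampled independently.
n = 20_000
samples = np.clip(p_nom * rng.lognormal(mean=0.0, sigma=0.7, size=(n, 3)), 0.0, 1.0)

for i, name in enumerate("ABC"):
    point = raw(i, p_nom)
    expected = np.mean([raw(i, s) for s in samples])  # ERAW-like average
    print(f"component {name}: point RAW = {point:8.1f}, "
          f"expected RAW under epistemic uncertainty = {expected:8.1f}")
```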

  20. Bayesian-network-based safety risk assessment for steel construction projects.

    PubMed

    Leu, Sou-Sen; Chang, Ching-Miao

    2013-05-01

    There are four primary accident types at steel building construction (SC) projects: falls (tumbles), object falls, object collapse, and electrocution. Several systematic safety risk assessment approaches, such as fault tree analysis (FTA) and failure mode and effect criticality analysis (FMECA), have been used to evaluate safety risks at SC projects. However, these traditional methods do not effectively address dependencies among safety factors at various levels and therefore fail to provide early warnings that could prevent occupational accidents. To overcome the limitations of traditional approaches, this study addresses the development of a safety risk-assessment model for SC projects by establishing Bayesian networks (BN) based on fault tree (FT) transformation. The BN-based safety risk-assessment model was validated against the safety inspection records of six SC building projects and nine projects in which site accidents occurred. The ranks of posterior probabilities from the BN model were highly consistent with the accidents that occurred at each project site. The model supports site safety management by calculating the probabilities of safety risks and by analyzing the causes of accidents based on the relationships encoded in the BN. In practice, based on the analysis of accident risks and significant safety factors, proper preventive safety management strategies can be established to reduce the occurrence of accidents on SC sites. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

    Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying the different sources of uncertainty is of high importance and can help modeling agencies improve current models and scenarios. In this study, we have assessed the future changes in three climate variables (i.e. precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (i.e. BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period 1970-2000 and the future period 2010-2099. For the future projections, two scenarios, RCP4.5 and RCP8.5, were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of BMA simulations over individual models. Increasing temperature and precipitation are projected at the annual timescale; however, the changes are not uniform among seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total, especially in summer.
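
    As a schematic illustration of how Bayesian Model Averaging can weight an ensemble (a simplified sketch, not the EM-based BMA algorithm typically used for hydro-climatic ensembles), the code below assigns each model a weight proportional to the Gaussian likelihood of synthetic historical observations under its hindcast; the toy data, the hypothetical model names, and the fixed predictive spread are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical observations and three model hindcasts (e.g. annual
# precipitation anomalies); real applications use bias-corrected GCM output.
obs = rng.normal(0.0, 1.0, size=30)
models = {
    "GCM-A": obs + rng.normal(0.2, 0.8, size=30),
    "GCM-B": obs + rng.normal(-0.5, 1.2, size=30),
    "GCM-C": obs + rng.normal(0.0, 0.5, size=30),
}

sigma = 1.0  # assumed common predictive spread (in practice estimated, e.g. by EM)

# Weight each model by the log-likelihood of the observations given its hindcast.
log_lik = {name: -0.5 * np.sum(((obs - sim) / sigma) ** 2)
           for name, sim in models.items()}
m = max(log_lik.values())
weights = {name: np.exp(ll - m) for name, ll in log_lik.items()}
total = sum(weights.values())
weights = {name: w / total for name, w in weights.items()}

# BMA projection: weighted average of the models' (toy) future projections.
future = {name: sim.mean() + 0.3 for name, sim in models.items()}
bma_projection = sum(weights[name] * future[name] for name in models)

print("weights:", {k: round(v, 3) for k, v in weights.items()})
print("BMA-weighted projection:", round(bma_projection, 3))
```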

  2. Extended time-to-collision measures for road traffic safety assessment.

    PubMed

    Minderhoud, M M; Bovy, P H

    2001-01-01

    This article describes two new safety indicators, based on the time-to-collision notion, that are suitable for comparative road traffic safety analyses. Such safety indicators can be applied in the comparison of a do-nothing case with an adapted situation, e.g. the introduction of intelligent driver support systems. In contrast to the classical time-to-collision value, measured at a cross section, the improved safety indicators use vehicle trajectories collected over a specific time horizon for a certain roadway segment to calculate the overall safety indicator value. Vehicle-specific indicator values as well as safety-critical probabilities can easily be determined from the developed safety measures. Application of the derived safety indicators is demonstrated for the assessment of the potential safety impacts of driver support systems, from which it appears that some Autonomous Intelligent Cruise Control (AICC) designs are more safety-critical than the reference case without these systems. It is suggested that the indicator threshold value applied in the safety assessment has to be adapted when advanced AICC systems with safe characteristics are introduced.
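
    A minimal sketch of the underlying time-to-collision computation along vehicle trajectories (a simplified illustration, not the authors' extended indicators): at each sampled instant the TTC is the bumper-to-bumper spacing divided by the closing speed, defined only while the follower is closing in; the toy trajectories and the 4 s threshold are assumptions.

```python
import numpy as np

def time_to_collision(x_lead, x_follow, v_lead, v_follow, length_lead=4.5):
    """TTC profile along two vehicle trajectories sampled at the same instants.

    x_*: positions [m], v_*: speeds [m/s]; TTC is defined only when the
    follower is faster than the leader (closing), otherwise it is infinite.
    """
    gap = x_lead - x_follow - length_lead          # bumper-to-bumper spacing
    closing_speed = v_follow - v_lead              # positive while closing
    ttc = np.full_like(gap, np.inf, dtype=float)
    closing = closing_speed > 0
    ttc[closing] = gap[closing] / closing_speed[closing]
    return ttc

# Toy trajectories sampled at 1 Hz: a follower approaches a slower leader.
t = np.arange(0, 10)
x_lead = 50.0 + 20.0 * t          # leader at constant 20 m/s
x_follow = 25.0 * t               # follower at constant 25 m/s
v_lead = np.full_like(t, 20.0, dtype=float)
v_follow = np.full_like(t, 25.0, dtype=float)

ttc = time_to_collision(x_lead, x_follow, v_lead, v_follow)
threshold = 4.0                   # s, an assumed safety-critical threshold
exposure = np.sum(ttc < threshold)  # seconds spent below the threshold
print("minimum TTC [s]:", round(float(np.nanmin(ttc)), 2))
print("time spent below threshold [s]:", int(exposure))
```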

  3. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    PubMed

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. The aim is to extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation, by considering diffusion based on expert beliefs, with and without further research, conditional on the strength of evidence. We use expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform implementation dynamics. We illustrate use of the resulting dynamic expected value of research for a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.

  4. Probability and Confidence Trade-Space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter-Journet, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    Purpose of presentation: (1) provide a status update on the developing methodology to revise sub-system sparing targets; (2) describe how to incorporate uncertainty into sparing assessments and why it is important to do so; and (3) demonstrate hardware risk postures through PACT evaluation.

  5. Safety analysis and review system (SARS) assessment report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, E.T.

    1981-03-01

    Under DOE Order 5481.1, Safety Analysis and Review System for DOE Operations, safety analyses are required for DOE projects in order to ensure that: (1) potential hazards are systematically identified; (2) potential impacts are analyzed; (3) reasonable measures have been taken to eliminate, control, or mitigate the hazards; and (4) there is documented management authorization of the DOE operation based on an objective assessment of the adequacy of the safety analysis. This report is intended to provide the DOE Office of Plans and Technology Assessment (OPTA) with an independent evaluation of the adequacy of the ongoing safety analysis effort. As part of this effort, a number of site visits and interviews were conducted, and FE SARS documents were reviewed. The latter included SARS Implementation Plans for a number of FE field offices, as well as safety analysis reports completed for certain FE operations. This report summarizes SARS related efforts at the DOE field offices visited and evaluates the extent to which they fulfill the requirements of DOE 5481.1.

  6. Using cost-benefit concepts in design floods improves communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi

    2017-04-01

    Flood frequency analysis, i.e. the study of the relationship between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the planning/design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded at gauging stations, and the final estimates over a region are thus affected by uncertainty, due to limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community has dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not been demonstrated to be useful for design purposes: they tend to disorient decision makers, as the design flood is no longer univocally defined, leaving the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014), which allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method provides an explicit multiplication factor that corrects the traditional (without uncertainty) design flood estimates to incorporate the effects of uncertainty in the estimate at the same safety level. Even though the UNCODE method was developed for design purposes, it can represent a powerful and robust tool to help clarify the effects of uncertainty in statistical estimation. As the procedure produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or to the insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs

  7. Recent advancements in GRACE mascon regularization and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Loomis, B. D.; Luthcke, S. B.

    2017-12-01

    The latest release of the NASA Goddard Space Flight Center (GSFC) global time-variable gravity mascon product applies a new regularization strategy along with new methods for estimating noise and leakage uncertainties. The critical design component of mascon estimation is the construction of the applied regularization matrices, and different strategies exist between the different centers that produce mascon solutions. The new approach from GSFC directly applies the pre-fit Level 1B inter-satellite range-acceleration residuals in the design of time-dependent regularization matrices, which are recomputed at each step of our iterative solution method. We summarize this new approach, demonstrating the simultaneous increase in recovered time-variable gravity signal and reduction in the post-fit inter-satellite residual magnitudes, until solution convergence occurs. We also present our new approach for estimating mascon noise uncertainties, which are calibrated to the post-fit inter-satellite residuals. Lastly, we present a new technique for end users to quickly estimate the signal leakage errors for any selected grouping of mascons, and we test the viability of this leakage assessment procedure on the mascon solutions produced by other processing centers.

  8. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasan, Sanjay

    2014-09-30

    In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of the geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data are required, the approach provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that yield dynamic responses closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir

  9. Use of the Home Safety Self-Assessment Tool (HSSAT) within Community Health Education to Improve Home Safety.

    PubMed

    Horowitz, Beverly P; Almonte, Tiffany; Vasil, Andrea

    2016-10-01

    This exploratory research examined the benefits of a health education program utilizing the Home Safety Self-Assessment Tool (HSSAT) to increase perceived knowledge of home safety, recognition of unsafe activities, ability to safely perform activities, and ability to develop home safety plans among 47 older adults. Focus groups in two senior centers explored social workers' perspectives on use of the HSSAT in community practice. Results for the health education program showed significant differences in reported knowledge of home safety (p = .02), ability to recognize unsafe activities (p = .01), to safely perform activities (p = .04), and to develop a safety plan (p = .002). Social workers identified home safety as a major concern and the HSSAT as a promising assessment tool. The research has implications for reducing environmental fall risks.

  10. Selection of climate policies under the uncertainties in the Fifth Assessment Report of the IPCC

    NASA Astrophysics Data System (ADS)

    Drouet, L.; Bosetti, V.; Tavoni, M.

    2015-10-01

    Strategies for dealing with climate change must incorporate and quantify all the relevant uncertainties, and be designed to manage the resulting risks. Here we employ the best available knowledge so far, summarized by the three working groups of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5), to quantify the uncertainty of mitigation costs, climate change dynamics, and economic damage for alternative carbon budgets. We rank climate policies according to different decision-making criteria concerning uncertainty, risk aversion and intertemporal preferences. Our findings show that preferences over uncertainties are as important as the choice of the widely discussed time discount factor. Climate policies consistent with limiting warming to 2 °C above preindustrial levels are compatible with a subset of decision-making criteria and some model parametrizations, but not with the commonly adopted expected utility framework.

  11. [Agricultural biotechnology safety assessment].

    PubMed

    McClain, Scott; Jones, Wendelyn; He, Xiaoyun; Ladics, Gregory; Bartholomaeus, Andrew; Raybould, Alan; Lutter, Petra; Xu, Haibin; Wang, Xue

    2015-01-01

    Genetically modified (GM) crops were first introduced to farmers in 1995 with the intent to provide better crop yield and meet the increasing demand for food and feed. GM crops have evolved to include a thorough safety evaluation for their use in human food and animal feed. Safety considerations begin at the level of DNA, whereby the inserted GM DNA is evaluated for its content, position and stability once placed into the crop genome. The safety of the proteins coded by the inserted DNA and potential effects on the crop are considered, and the purpose is to ensure that the transgenic novel proteins are safe from a toxicity, allergy, and environmental perspective. In addition, the grain that provides the processed food or animal feed is also tested to evaluate its nutritional content and identify unintended effects on the plant composition when warranted. To provide a platform for the safety assessment, the GM crop is compared to non-GM comparators in what is typically referred to as composition equivalence testing. New technologies, such as mass spectrometry and well-designed antibody-based methods, allow better analytical measurements of crop composition, including endogenous allergens. Many of the analytical methods and their intended uses are based on regulatory guidance documents, some of which are outlined in globally recognized documents such as Codex Alimentarius. In certain cases, animal models are recommended by some regulatory agencies in specific countries, but there is typically no hypothesis or justification of their use in testing the safety of GM crops. The quality and standardization of testing methods can be supported, in some cases, by employing good laboratory practices (GLP), which is recognized in China as important to ensure quality data. Although the number of recommended, and in some cases required, methods for safety testing is increasing in some regulatory agencies, it should be noted that GM crops registered to date have been shown to be

  12. Comprehensive safety management and assessment at rugby football competitions.

    PubMed

    Tajima, T; Chosa, E; Kawahara, K; Nakamura, Y; Yoshikawa, D; Yamaguchi, N; Kashiwagi, T

    2014-11-01

    The present study aims to improve medical systems by designing objective safety assessment criteria for rugby competitions. We evaluated 195 competitions between 2002 and 2011 using an original safety scale comprising the following sections: 1) competence of staff such as referees, medical attendants and the match day doctor; 2) environment, such as weather, wet bulb globe temperature and field conditions; and 3) emergency medical care systems at the competitions. Each section was subdivided into groups A, B and C according to good, normal or fair degrees of safety determined by combinations of the results. Overall safety was assessed as A, B and C for 110, 78 and 7 competitions, respectively. The assessments of individual major factors were mostly favorable for staff, but the environment and medical care systems were assessed as C in 25 and 70, respectively, of the 195 competitions. Medical management involves not only having a match day doctor, but also comprehensive management including preventive factors and responses from the staff, environment and medical-care systems. Six cases of severe injuries and accidents occurred between 2002 and 2011, all of which were observed in Grade A competitions. These cases had favorable prognoses without obvious impairment, thus confirming the value of the present assessment scale. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Assessing and mitigating uncertainties in the Noah-MP land-model simulations over the Tibet Plateau region

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Chen, F.; Gan, Y.

    2017-12-01

    Uncertainties in the Noah with multiparameterization (Noah-MP) land surface model were assessed through physics ensemble simulations for four sparsely vegetated sites located in the Tibetan Plateau region. Those simulations were evaluated using observations at the four sites during the third Tibetan Plateau Experiment (TIPEX III). The impacts of uncertainties in the precipitation data used as forcing, and in parameterizations of sub-processes such as soil organic matter and the rhizosphere, on the physics-ensemble simulations were identified using two different methods: natural selection and Tukey's test. This study attempts to answer the following questions: 1) what is the relative contribution of precipitation-forcing uncertainty to the overall uncertainty range of Noah-MP simulations at those sites, compared to that at a moister and more densely vegetated site; 2) what are the most sensitive physical parameterizations for those sites; and 3) can we identify the parameterizations that need to be improved? The investigation was conducted by evaluating the simulated seasonal evolution of soil temperature, soil moisture and surface heat fluxes through a number of Noah-MP ensemble simulations.

  14. Safety Assessment of Alkyl Esters as Used in Cosmetics.

    PubMed

    Fiume, Monice M; Heldreth, Bart A; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2015-09-01

    The Cosmetic Ingredient Review Expert Panel (Panel) assessed the safety of 237 alkyl esters for use in cosmetics. The alkyl esters included in this assessment have a variety of reported functions in cosmetics, with skin-conditioning agent being the most common function. The Panel reviewed available animal and clinical data in making its determination of safety on these ingredients, and where there were data gaps, similarity in structure, properties, functions, and uses of these ingredients allowed for extrapolation of the available toxicological data to assess the safety of the entire group. The Panel concluded that these ingredients are safe in cosmetic formulations in the present practices of use and concentration when formulated to be nonirritating. © The Author(s) 2015.

  15. Safety assessment in plant layout design using indexing approach: implementing inherent safety perspective. Part 1 - guideword applicability and method description.

    PubMed

    Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2008-12-15

    Layout planning plays a key role in the inherent safety performance of process plants since this design feature controls the possibility of accidental chain-events and the magnitude of possible consequences. A lack of suitable methods to promote the effective implementation of inherent safety in layout design calls for the development of new techniques and methods. In the present paper, a safety assessment approach suitable for layout design in the critical early phase is proposed. The concept of inherent safety is implemented within this safety assessment; the approach is based on an integrated assessment of inherent safety guideword applicability within the constraints typically present in layout design. Application of these guidewords is evaluated along with unit hazards and control devices to quantitatively map the safety performance of different layout options. Moreover, the economic aspects related to safety and inherent safety are evaluated by the method. Specific sub-indices are developed within the integrated safety assessment system to analyze and quantify the hazard related to domino effects. The proposed approach is quick in application, auditable, and shares a common framework applicable in other phases of the design lifecycle (e.g. process design). The present work is divided into two parts: Part 1 (current paper) presents the application of inherent safety guidelines in layout design and the index method for safety assessment; Part 2 (accompanying paper) describes the domino hazard sub-index and demonstrates the proposed approach with a case study, thus showing the introduction of inherent safety features in layout design.

  16. Assessing the importance of rainfall uncertainty on hydrological models with different spatial and temporal scale

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2015-04-01

    Precipitation is one of the key inputs for hydrological models. As long as the values of the hydrological model parameters are fixed, a variation of the rainfall input is expected to induce a change in the model output. Given the increased awareness of uncertainty in rainfall records, it becomes more important to understand the impact of this input-output dynamic. Yet modellers often still intend to mimic the observed flow, whatever the deviation of the employed records from the actual rainfall might be, by recklessly adapting the model parameter values. But is it actually possible to vary the model parameter values in such a way that a certain (observed) model output can be generated based on inaccurate rainfall inputs? In other words, how important is the rainfall uncertainty for the model output relative to the importance of the model parameters? To address this question, we apply the Sobol' sensitivity analysis method to assess and compare the importance of the rainfall uncertainty and the model parameters for the output of the hydrological model. In order to treat the regular model parameters and the input uncertainty in the same way, and to allow a comparison of their influence, a possible approach is to represent the rainfall uncertainty by a parameter. To this end, we apply so-called rainfall multipliers to hydrologically independent storm events, as a probabilistic parameter representation of the possible rainfall variation. As available rainfall records are very often point measurements at a discrete time step (hourly, daily, monthly, …), they contain uncertainty due to a lack of captured spatial and temporal variability. The influence of this variability can also differ between hydrological models with different spatial and temporal scales. Therefore, we perform the sensitivity analyses on a semi-distributed model (SWAT) and a lumped model (NAM). The assessment and comparison of the importance of the rainfall uncertainty and the
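
    To illustrate the idea of treating rainfall uncertainty as one extra parameter in a Sobol' analysis, here is a minimal sketch (a hypothetical toy runoff model and parameter ranges, not SWAT or NAM) that estimates first-order indices for two model parameters and a rainfall multiplier using a Saltelli-style estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_hydro_model(theta):
    """Hypothetical lumped rainfall-runoff response: peak flow as a nonlinear
    function of a storage parameter k, a runoff coefficient c, and a rainfall
    multiplier m applied to a nominal storm depth."""
    k, c, m = theta.T
    storm_depth = 40.0 * m                 # nominal 40 mm storm, scaled by multiplier
    return c * storm_depth * np.exp(-1.0 / k)

# Parameter ranges: storage k in [1, 10], runoff coefficient c in [0.2, 0.8],
# rainfall multiplier m in [0.7, 1.3] (all assumed for illustration).
lo = np.array([1.0, 0.2, 0.7])
hi = np.array([10.0, 0.8, 1.3])

n = 50_000
A = lo + (hi - lo) * rng.random((n, 3))
B = lo + (hi - lo) * rng.random((n, 3))
yA, yB = toy_hydro_model(A), toy_hydro_model(B)
var_y = np.var(np.concatenate([yA, yB]))

# First-order Sobol' indices via the Saltelli-style estimator:
# V_i ~ mean( f(B) * (f(A with column i from B) - f(A)) ).
for i, name in enumerate(["storage k", "runoff coeff c", "rainfall multiplier m"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    yABi = toy_hydro_model(ABi)
    S1 = np.mean(yB * (yABi - yA)) / var_y
    print(f"S1({name}) = {S1:.3f}")
```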

  17. Prediction uncertainty and data worth assessment for groundwater transport times in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Zell, Wesley O.; Culver, Teresa B.; Sanford, Ward E.

    2018-06-01

    Uncertainties about the age of base-flow discharge can have serious implications for the management of degraded environmental systems where subsurface pathways, and the ongoing release of pollutants that accumulated in the subsurface during past decades, dominate the water quality signal. Numerical groundwater models may be used to estimate groundwater return times and base-flow ages and thus predict the time required for stakeholders to see the results of improved agricultural management practices. However, the uncertainty inherent in the relationship between (i) the observations of atmospherically-derived tracers that are required to calibrate such models and (ii) the predictions of system age that the observations inform has not been investigated. For example, few if any studies have assessed the uncertainty of numerically-simulated system ages or evaluated the uncertainty reductions that may result from the expense of collecting additional subsurface tracer data. In this study we combine numerical flow and transport modeling of atmospherically-derived tracers with prediction uncertainty methods to accomplish four objectives. First, we show the relative importance of head, discharge, and tracer information for characterizing response times in a uniquely data-rich catchment that includes 266 age-tracer measurements (SF6, CFCs, and 3H) in addition to long-term monitoring of water levels and stream discharge. Second, we calculate uncertainty intervals for model-simulated base-flow ages using both linear and non-linear methods, and find that the prediction sensitivity vector used by linear first-order second-moment methods results in much larger uncertainties than non-linear Monte Carlo methods operating on the same parameter uncertainty. Third, by combining prediction uncertainty analysis with multiple models of the system, we show that data-worth calculations and monitoring network design are sensitive to variations in the amount of water leaving the system via
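
    As a generic illustration (not the study's groundwater model) of why linear first-order second-moment (FOSM) propagation and nonlinear Monte Carlo propagation of the same parameter uncertainty can give different prediction intervals, the sketch below pushes an assumed Gaussian uncertainty on log-conductivity through a hypothetical nonlinear travel-time prediction both ways.

```python
import numpy as np

rng = np.random.default_rng(7)

def travel_time(log_k):
    """Hypothetical prediction: groundwater travel time [yr] as a nonlinear
    (exponential) function of log10 hydraulic conductivity."""
    return 30.0 * np.exp(-1.5 * (log_k - (-4.0)))

# Assumed parameter uncertainty: log10(K) ~ N(-4.0, 0.4^2).
mu, sigma = -4.0, 0.4

# FOSM / linear propagation: variance from the local sensitivity (derivative).
eps = 1e-6
sens = (travel_time(mu + eps) - travel_time(mu - eps)) / (2 * eps)
fosm_sd = abs(sens) * sigma
fosm_interval = (travel_time(mu) - 2 * fosm_sd, travel_time(mu) + 2 * fosm_sd)

# Nonlinear Monte Carlo propagation of the same parameter uncertainty.
samples = travel_time(rng.normal(mu, sigma, size=100_000))
mc_interval = tuple(np.percentile(samples, [2.5, 97.5]))

print("FOSM ~95% interval [yr]:", tuple(round(float(v), 1) for v in fosm_interval))
print("Monte Carlo 95% interval [yr]:", tuple(round(float(v), 1) for v in mc_interval))
```

    In this toy case the linear interval even extends to negative travel times, underlining why the two approaches can disagree in either direction for nonlinear predictions.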

  18. Safety Assessment of Pentaerythrityl Tetraesters as Used in Cosmetics.

    PubMed

    Becker, Lillian C; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2015-09-01

    The Cosmetic Ingredient Review (CIR) Expert Panel (Panel) reviewed the safety of 16 pentaerythrityl tetraester compounds as used in cosmetics. These ingredients mostly function as hair-conditioning agents, skin-conditioning agents-miscellaneous and binders, skin-conditioning agents-occlusive, viscosity-increasing agents-nonaqueous, and skin-conditioning agents-emollient. The Panel reviewed the available animal and human data related to these ingredients and previous safety assessments of the fatty acid moieties. The Panel concluded that pentaerythrityl tetraisostearate and the other pentaerythrityl tetraester compounds were safe in the practices of use and concentration as given in this safety assessment. © The Author(s) 2015.

  19. Assessing theoretical uncertainties in fission barriers of superheavy nuclei

    DOE PAGES

    Agbemava, S. E.; Afanasjev, A. V.; Ray, D.; ...

    2017-05-26

    Here, theoretical uncertainties in the predictions of inner fission barrier heights in superheavy elements have been investigated in a systematic way for a set of state-of-the-art covariant energy density functionals which represent the major classes of functionals used in covariant density functional theory. They differ in basic model assumptions and fitting protocols. Both systematic and statistical uncertainties have been quantified, with the former turning out to be larger. Systematic uncertainties are substantial in superheavy elements, and their behavior as a function of proton and neutron numbers contains a large random component. Benchmarking the functionals to the experimental data on fission barriers in the actinides allows one to reduce the systematic theoretical uncertainties for the inner fission barriers of unknown superheavy elements. However, even then they increase, on average, on moving away from the region where the benchmarking has been performed. In addition, a comparison with the results of non-relativistic approaches is performed in order to define the full systematic theoretical uncertainties over the state-of-the-art models. Even for the models benchmarked in the actinides, the difference in the inner fission barrier height of some superheavy elements reaches 5-6 MeV. This uncertainty in the fission barrier heights translates into huge (many tens of orders of magnitude) uncertainties in the spontaneous fission half-lives.

  20. Probabilistic safety assessment of the design of a tall buildings under the extreme load

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk

    2016-06-08

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394 and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probability-based design of structures using finite element methods is demonstrated on the example of a probabilistic safety analysis of a tall building.

  1. Probabilistic safety assessment of the design of a tall buildings under the extreme load

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2016-06-01

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394 and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probability-based design of structures using finite element methods is demonstrated on the example of a probabilistic safety analysis of a tall building.
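
    A minimal sketch of the kind of simulation-based safety verification described above (a hypothetical resistance/load limit state, not the paper's finite-element model): the failure probability is estimated with crude Monte Carlo sampling and with a simple Latin Hypercube Sampling variant of the same problem.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def limit_state(R, E):
    """Safety margin g = R - E: failure when resistance R is below load effect E."""
    return R - E

def sample_lognormal(mean, cov, u):
    """Map uniform samples u to a lognormal with given mean and coefficient
    of variation (standard reparameterization)."""
    sigma_ln = np.sqrt(np.log(1.0 + cov**2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln**2
    return np.exp(mu_ln + sigma_ln * norm.ppf(u))

n = 200_000
# Crude Monte Carlo: plain uniform random numbers.
u_mc = rng.random((n, 2))
# Latin Hypercube Sampling: stratify each marginal into n equal-probability bins.
u_lhs = np.column_stack([(rng.permutation(n) + rng.random(n)) / n for _ in range(2)])

for label, u in [("Monte Carlo", u_mc), ("LHS", u_lhs)]:
    R = sample_lognormal(mean=420.0, cov=0.10, u=u[:, 0])   # resistance (e.g. kN), assumed
    E = sample_lognormal(mean=300.0, cov=0.20, u=u[:, 1])   # load effect (e.g. kN), assumed
    pf = np.mean(limit_state(R, E) < 0.0)
    print(f"{label}: estimated failure probability = {pf:.2e}")
```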

  2. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies only included a few of the sources of model uncertainty. To help develop this area, the paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has rarely been applied in the wastewater field. The model was based on activated-sludge models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis provided useful insights for WWTP modelling, identifying the crucial aspects where higher uncertainty lies and where, therefore, more effort should be devoted to both data gathering and modelling practice.
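
    The GLUE procedure itself can be sketched in a few lines (a generic illustration with a hypothetical one-parameter decay model, not the ASM1/ASM2 application): parameter sets are sampled at random, scored with an informal likelihood, rejected below a behavioural threshold, and the retained sets define likelihood-weighted prediction bounds.

```python
import numpy as np

rng = np.random.default_rng(5)

def model(k, t):
    """Hypothetical first-order decay of a substrate concentration."""
    return 100.0 * np.exp(-k * t)

# Synthetic observations generated with a 'true' k = 0.30 plus noise.
t = np.linspace(0.0, 10.0, 25)
obs = model(0.30, t) + rng.normal(0.0, 3.0, size=t.size)

# GLUE step 1: Monte Carlo sampling of the parameter from a broad prior range.
n = 20_000
k_samples = rng.uniform(0.05, 0.80, size=n)
sims = model(k_samples[:, None], t[None, :])

# GLUE step 2: informal likelihood (a Nash-Sutcliffe-type efficiency) and a
# threshold separating behavioural from non-behavioural parameter sets.
sse = np.sum((sims - obs) ** 2, axis=1)
nse = 1.0 - sse / np.sum((obs - obs.mean()) ** 2)
behavioural = nse > 0.7
sims_b = sims[behavioural]
weights = nse[behavioural] / nse[behavioural].sum()

# GLUE step 3: likelihood-weighted 90% prediction bounds from behavioural runs.
lower, upper = [], []
for j in range(t.size):
    order = np.argsort(sims_b[:, j])
    s = sims_b[order, j]
    w = np.cumsum(weights[order])
    lower.append(np.interp(0.05, w, s))
    upper.append(np.interp(0.95, w, s))

print(f"behavioural sets: {behavioural.sum()} of {n}")
print("prediction bounds at t = 5:", round(lower[12], 1), "to", round(upper[12], 1))
```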

  3. Assessing risk of baleen whale hearing loss from seismic surveys: The effect of uncertainty and individual variation.

    PubMed

    Gedamke, Jason; Gales, Nick; Frydman, Sascha

    2011-01-01

    The potential for seismic airgun "shots" to cause acoustic trauma in marine mammals is poorly understood. There are just two empirical measurements of temporary threshold shift (TTS) onset levels from airgun-like sounds in odontocetes. Considering these limited data, a model was developed examining the impact of individual variability and uncertainty on risk assessment of baleen whale TTS from seismic surveys. In each of 100 simulations, 10,000 "whales" are assigned TTS onset levels accounting for inter-individual variation, uncertainty over the population's mean, and uncertainty over the weighting of odontocete data used to obtain baleen whale onset levels. Randomly distributed whales are exposed to one seismic survey passage, with the cumulative exposure level calculated. In the base scenario, 29% of whales (5th/95th percentiles: 10%/62%) that approached to within 1-1.2 km were exposed to levels sufficient for TTS onset. By comparison, no whales are at risk outside 0.6 km when uncertainty and variability are not considered. Potentially "exposure altering" parameters (movement, avoidance, surfacing, and effective quiet) were also simulated. Until more research refines the model inputs, the results suggest a reasonable likelihood that whales at a kilometer or more from seismic surveys could potentially be susceptible to TTS, and demonstrate the large impact that uncertainty and variability can have on risk assessment.
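
    A schematic of the two-level Monte Carlo structure described above (hypothetical onset and exposure levels, not the study's acoustic propagation model): an outer loop samples epistemic uncertainty in the population-mean TTS onset level, an inner loop samples individual variation, and each realization records the fraction of simulated whales whose onset level is exceeded.

```python
import numpy as np

rng = np.random.default_rng(11)

n_sims, n_whales = 100, 10_000

# Epistemic uncertainty: the population-mean TTS onset level (dB, cumulative
# sound exposure level) is itself uncertain; values assumed for illustration.
mean_onset_prior = rng.normal(186.0, 4.0, size=n_sims)

# Hypothetical received cumulative exposure level at a fixed range from the survey.
received_level = 181.0

fractions = []
for mean_onset in mean_onset_prior:
    # Aleatory variability: individual whales vary around the population mean.
    onset_levels = rng.normal(mean_onset, 5.0, size=n_whales)
    fractions.append(np.mean(received_level >= onset_levels))

fractions = np.array(fractions)
print(f"median fraction at risk: {np.median(fractions):.0%}")
print(f"5th/95th percentiles: {np.percentile(fractions, 5):.0%} / "
      f"{np.percentile(fractions, 95):.0%}")
```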

  4. Processes of technology assessment: The National Transportation Safety Board

    NASA Technical Reports Server (NTRS)

    Weiss, E.

    1972-01-01

    The functions and operations of the Safety Board as related to technology assessment are described, and a brief history of the Safety Board is given. Recommendations made for safety in all areas of transportation and the actions taken are listed. Although accident investigation is an important aspect of NTSB's activity, it is felt that the greatest contribution is in pressing for development of better accident prevention programs. Efforts of the Safety Board in changing transportation technology to improve safety and prevent accidents are illustrated.

  5. Addressing Uncertainty in the ISCORS Multimedia Radiological Dose Assessment of Municipal Sewage Sludge and Ash

    NASA Astrophysics Data System (ADS)

    Chiu, W. A.; Bachmaier, J.; Bastian, R.; Hogan, R.; Lenhart, T.; Schmidt, D.; Wolbarst, A.; Wood, R.; Yu, C.

    2002-05-01

    Managing municipal wastewater at publicly owned treatment works (POTWs) leads to the production of considerable amounts of residual solid material, which is known as sewage sludge or biosolids. If the wastewater entering a POTW contains radioactive material, then the treatment process may concentrate radionuclides in the sludge, leading to possible exposure of the general public or the POTW workers. The Sewage Sludge Subcommittee of the Interagency Steering Committee on Radiation Standards (ISCORS), which consists of representatives from the Environmental Protection Agency, the Nuclear Regulatory Commission, the Department of Energy, and several other federal, state, and local agencies, is developing guidance for POTWs on the management of sewage sludge that may contain radioactive materials. As part of this effort, they are conducting an assessment of potential radiation exposures using the Department of Energy's RESidual RADioactivity (RESRAD) family of computer codes developed by Argonne National Laboratory. This poster describes several approaches used by the Subcommittee to address the uncertainties associated with their assessment. For instance, uncertainties in the source term are addressed through a combination of analytic and deterministic computer code calculations. Uncertainties in the exposure pathways are addressed through the specification of a number of hypothetical scenarios, some of which can be scaled to address changes in exposure parameters. In addition, the uncertainty in some physical and behavioral parameters is addressed through probabilistic methods.

  6. Safety Assessment of Talc as Used in Cosmetics.

    PubMed

    Fiume, Monice M; Boyer, Ivan; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2015-01-01

    The Cosmetic Ingredient Review Expert Panel (Panel) assessed the safety of talc for use in cosmetics. The safety of talc has been the subject of much debate through the years, partly because the relationship between talc and asbestos is commonly misunderstood. Industry specifications state that cosmetic-grade talc must contain no detectable fibrous, asbestos minerals. Therefore, the large amount of available animal and clinical data the Panel relied on in assessing the safety of talc only included those studies on talc that did not contain asbestos. The Panel concluded that talc is safe for use in cosmetics in the present practices of use and concentration (some cosmetic products are entirely composed of talc). Talc should not be applied to the skin when the epidermal barrier is missing or significantly disrupted. © The Author(s) 2015.

  7. Matrix approach to uncertainty assessment and reduction for modeling terrestrial carbon cycle

    NASA Astrophysics Data System (ADS)

    Luo, Y.; Xia, J.; Ahlström, A.; Zhou, S.; Huang, Y.; Shi, Z.; Wang, Y.; Du, Z.; Lu, X.

    2017-12-01

    Terrestrial ecosystems absorb approximately 30% of the anthropogenic carbon dioxide emissions. This estimate has been deduced indirectly: combining analyses of atmospheric carbon dioxide concentrations with ocean observations to infer the net terrestrial carbon flux. In contrast, when knowledge about the terrestrial carbon cycle is integrated into different terrestrial carbon models, they make widely different predictions. To improve the terrestrial carbon models, we have recently developed a matrix approach to uncertainty assessment and reduction. Specifically, the terrestrial carbon cycle has been commonly represented by a series of carbon balance equations to track carbon influxes into and effluxes out of individual pools in earth system models. This representation matches our understanding of carbon cycle processes well and can be reorganized into one matrix equation without changing any modeled carbon cycle processes and mechanisms. We have developed matrix equations of several global land C cycle models, including CLM3.5, 4.0 and 4.5, CABLE, LPJ-GUESS, and ORCHIDEE. Indeed, the matrix equation is generic and can be applied to other land carbon models. This matrix approach offers a suite of new diagnostic tools, such as the 3-dimensional (3-D) parameter space, traceability analysis, and variance decomposition, for uncertainty analysis. For example, predictions of carbon dynamics with complex land models can be placed in a 3-D parameter space (carbon input, residence time, and storage potential) as a common metric to measure how much model predictions differ. These differences can be traced to their source components by decomposing model predictions into a hierarchy of traceable components. Then, variance decomposition can help attribute the spread in predictions among multiple models to precisely identify sources of uncertainty. The highly uncertain components can be constrained by data as the matrix equation makes data assimilation computationally possible. We
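
    For reference, a commonly used generic form of the matrix equation alluded to above is shown below (notation varies between models and papers; this is the usual textbook-style statement of the approach rather than the exact equation of any one model).

```latex
\frac{d\mathbf{X}(t)}{dt} \;=\; \mathbf{B}\,u(t) \;-\; \mathbf{A}\,\boldsymbol{\xi}(t)\,\mathbf{K}\,\mathbf{X}(t)
```

    Here X(t) is the vector of carbon pool sizes, u(t) the carbon input (e.g. NPP), B the allocation vector, A the inter-pool transfer matrix, ξ(t) a diagonal matrix of environmental scalars, and K a diagonal matrix of baseline turnover rates; residence times and the storage potential used in the traceability analysis follow from A ξ(t) K and its inverse.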

  8. Safety assessment methodology in management of spent sealed sources.

    PubMed

    Mahmoud, Narmine Salah

    2005-02-14

    Environmental hazards can be caused by radioactive waste after its disposal. It was therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards, and to institute safety measures that prevent the evolution of these hazards. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are unused sources because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur at the moment they become spent and before their disposal. For that reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding of and confidence in this study, a validation analysis was undertaken by considering the scenario of an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident involving an iridium-192 source). The text of this work includes considerations related to the safety assessment of spent sealed sources, comprising the assessment context, the processes leading an active source to become spent, accident scenarios, mathematical models for dose calculations, radiological consequences and regulatory criteria. The text also includes a validation study, which was carried out by evaluating a theoretical scenario against the real scenario of the Meet-Halfa accident, based on the clinical assessment of affected individuals.

  9. 76 FR 74723 - New Car Assessment Program (NCAP); Safety Labeling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... DEPARTMENT OF TRANSPORTATION National Highway Traffic Safety Administration 49 CFR Part 575 [Docket No. NHTSA 2010-0025] RIN 2127-AK51 New Car Assessment Program (NCAP); Safety Labeling AGENCY: National Highway Traffic Safety Administration (NHTSA), Department of Transportation (DOT). ACTION...

  10. High-Throughput Toxicity Testing: New Strategies for Assessing Chemical Safety

    EPA Science Inventory

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct add...

  11. Assessment of Safety Standards for Automotive Electronic Control Systems

    DOT National Transportation Integrated Search

    2016-06-01

    This report summarizes the results of a study that assessed and compared six industry and government safety standards relevant to the safety and reliability of automotive electronic control systems. These standards include ISO 26262 (Road Vehicles - ...

  12. Why the Eurocontrol Safety Regulation Commission Policy on Safety Nets and Risk Assessment is Wrong

    NASA Astrophysics Data System (ADS)

    Brooker, Peter

    2004-05-01

    Current Eurocontrol Safety Regulation Commission (SRC) policy says that the Air Traffic Management (ATM) system (including safety minima) must be demonstrated through risk assessments to meet the Target Level of Safety (TLS) without needing to take safety nets (such as Short Term Conflict Alert) into account. This policy is wrong. The policy is invalid because it does not build rationally and consistently from ATM's firm foundations of TLS and hazard analysis. The policy is bad because it would tend to retard safety improvements. Safety net policy must rest on a clear and rational treatment of integrated ATM system safety defences. A new safety net policy, appropriate to safe ATM system improvements, is needed, which recognizes that safety nets are an integrated part of ATM system defences. The effects of safety nets in reducing deaths from mid-air collisions should be fully included in hazard analysis and safety audits in the context of the TLS for total system design.

  13. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies for and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty and to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  14. Global assessment of water policy vulnerability under uncertainty in water scarcity projections

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Kahil, Taher; Satoh, Yusuke; Burek, Peter; Fischer, Günther; Tramberend, Sylvia; Byers, Edward; Flörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Langan, Simon; Wada, Yoshihide

    2017-04-01

    Water scarcity is a critical environmental issue worldwide, driven by the significant increase in water extractions during the last century. In the coming decades, climate change is projected to further exacerbate water scarcity conditions in many regions around the world. At present, one important question for policy debate is the identification of water policy interventions that could address the mounting water scarcity problems. The main interventions include investing in water storage infrastructure, water transfer canals, efficient irrigation systems, and desalination plants, among many others. This type of intervention involves long-term planning, long-lived investments and some irreversibility in choices, which can shape the development of countries for decades. Making decisions on these water infrastructures requires anticipating the long-term environmental conditions, needs and constraints under which they will function. This brings large uncertainty into the decision-making process, for instance from demographic or economic projections. But today, climate change is bringing another layer of uncertainty that makes decisions even more complex. In this study, we assess, in a probabilistic approach, the uncertainty in global water scarcity projections following different socioeconomic pathways (SSPs) and climate scenarios (RCPs) within the first half of the 21st century. By utilizing an ensemble of 45 future water scarcity projections based on (i) three state-of-the-art global hydrological models (PCR-GLOBWB, H08, and WaterGAP), (ii) five climate models, and (iii) three water scenarios, we have assessed changes in water scarcity and the associated uncertainty distribution worldwide. The water scenarios used here were developed by IIASA's Water Futures and Solutions (WFaS) Initiative. The main objective of this study is to improve the contribution of hydro-climatic information to effective policymaking by identifying spatial and temporal policy

  15. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups of contributions: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not hold for sampling uncertainty, which, because it faces several obstacles and there is no clarity on how to perform the procedures, has been neglected, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.
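
    When the two contributions can be treated as independent, the usual way to combine them (the standard quadrature rule, used for example in the Eurachem/CITAC guidance on measurement uncertainty arising from sampling) is:

```latex
u_{\mathrm{meas}} \;=\; \sqrt{u_{\mathrm{sampling}}^{2} + u_{\mathrm{analytical}}^{2}},
\qquad
U \;=\; k\,u_{\mathrm{meas}} \quad (\text{typically } k = 2 \text{ for } \approx 95\,\%\ \text{coverage}).
```

    For instance, with relative standard uncertainties of 6 % from sampling and 2.5 % from analysis, the combined relative uncertainty is sqrt(36 + 6.25) ≈ 6.5 %, illustrating how the sampling contribution can dominate the total.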

  16. Safety Sufficiency for NextGen: Assessment of Selected Existing Safety Methods, Tools, Processes, and Regulations

    NASA Technical Reports Server (NTRS)

    Xu, Xidong; Ulrey, Mike L.; Brown, John A.; Mast, James; Lapis, Mary B.

    2013-01-01

    NextGen is a complex socio-technical system and, in many ways, it is expected to be more complex than the current system. It is vital to assess the safety impact of the NextGen elements (technologies, systems, and procedures) in a rigorous and systematic way and to ensure that they do not compromise safety. In this study, the NextGen elements in the form of Operational Improvements (OIs), Enablers, Research Activities, Development Activities, and Policy Issues were identified. The overall hazard situation in NextGen was outlined; a high-level hazard analysis was conducted with respect to multiple elements in a representative NextGen OI known as OI-0349 (Automation Support for Separation Management); and the hazards resulting from the highly dynamic complexity involved in an OI-0349 scenario were illustrated. A selected but representative set of the existing safety methods, tools, processes, and regulations was then reviewed and analyzed to determine whether they are sufficient to assess safety in the elements of that OI and to ensure that safety will not be compromised, and whether they might incur intolerably high costs.

  17. Assessment of Uncertainty in the Determination of Activation Energy for Polymeric Materials

    NASA Technical Reports Server (NTRS)

    Darby, Stephania P.; Landrum, D. Brian; Coleman, Hugh W.

    1998-01-01

    An assessment of the experimental uncertainty in obtaining the kinetic activation energy from thermogravimetric analysis (TGA) data is presented. A neat phenolic resin, Borden SC1008, was heated at three heating rates to obtain weight loss versus temperature data. Activation energy was calculated by two methods: the traditional Flynn and Wall method, based on the slope of log(q) versus 1/T, and a modification of this method in which the ordinate and abscissa are reversed in the linear regression. The modified method produced a more accurate curve fit of the data, was more sensitive to data nonlinearity, and gave a value of activation energy 75 percent greater than the original method. An uncertainty analysis using the modified method yielded a 60 percent uncertainty in the average activation energy. Based on this result, the activation energy for a carbon-phenolic material was doubled and used to calculate the ablation rate in a typical solid rocket environment. Doubling the activation energy increased surface recession by 3 percent. Current TGA data reduction techniques that use the traditional Flynn and Wall approach to calculate activation energy should be changed to the modified method.
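
    The comparison between the two regression orderings can be illustrated with a short script (synthetic temperatures at a fixed conversion level and a nominal activation energy are assumed; the Flynn-Wall relation is used in its common approximate form, Ea ≈ -(R/0.457)·d(log10 q)/d(1/T)).

```python
import numpy as np

R = 8.314        # J/(mol K)
Ea_true = 180e3  # J/mol, assumed nominal activation energy
rng = np.random.default_rng(9)

# Heating rates [K/min] and the temperatures reached at a fixed conversion level,
# generated from the Flynn-Wall relation plus a little scatter (synthetic data).
q = np.array([2.0, 10.0, 50.0])
inv_T = np.log10(q[-1] / q) * R / (0.457 * Ea_true) + 1.0 / 800.0
inv_T += rng.normal(0.0, 2e-6, size=q.size)   # measurement scatter
log_q = np.log10(q)

# Traditional ordering: regress log10(q) on 1/T, Ea from the slope.
slope1 = np.polyfit(inv_T, log_q, 1)[0]
Ea_traditional = -slope1 * R / 0.457

# Modified ordering: regress 1/T on log10(q), then invert the slope.
slope2 = np.polyfit(log_q, inv_T, 1)[0]
Ea_modified = -R / (0.457 * slope2)

print(f"Ea (traditional ordering): {Ea_traditional / 1e3:.0f} kJ/mol")
print(f"Ea (modified ordering):    {Ea_modified / 1e3:.0f} kJ/mol")
```

    With perfectly linear data the two orderings coincide; scatter or nonlinearity in the data is what drives them apart, which is the effect the abstract quantifies.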

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Measuring Best Practices for Workplace Safety, Health, and Well-Being: The Workplace Integrated Safety and Health Assessment.

    PubMed

    Sorensen, Glorian; Sparer, Emily; Williams, Jessica A R; Gundersen, Daniel; Boden, Leslie I; Dennerlein, Jack T; Hashimoto, Dean; Katz, Jeffrey N; McLellan, Deborah L; Okechukwu, Cassandra A; Pronk, Nicolaas P; Revette, Anna; Wagner, Gregory R

    2018-05-01

    To present a measure of effective workplace organizational policies, programs, and practices that focuses on working conditions and organizational facilitators of worker safety, health and well-being: the workplace integrated safety and health (WISH) assessment. Development of this assessment used an iterative process involving a modified Delphi method, extensive literature reviews, and systematic cognitive testing. The assessment measures six core constructs identified as central to best practices for protecting and promoting worker safety, health and well-being: leadership commitment; participation; policies, programs, and practices that foster supportive working conditions; comprehensive and collaborative strategies; adherence to federal and state regulations and ethical norms; and data-driven change. The WISH Assessment holds promise as a tool that may inform organizational priority setting and guide research around causal pathways influencing implementation and outcomes related to these approaches.

  20. Quantitative risk assessment of CO2 transport by pipelines--a review of uncertainties and their impacts.

    PubMed

    Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André

    2010-05-15

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainties that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO2. A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on assumptions made. In existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO2 pipelines. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  1. NASA Aviation Safety Program Systems Analysis/Program Assessment Metrics Review

    NASA Technical Reports Server (NTRS)

    Louis, Garrick E.; Anderson, Katherine; Ahmad, Tisan; Bouabid, Ali; Siriwardana, Maya; Guilbaud, Patrick

    2003-01-01

    The goal of this project is to evaluate the metrics and processes used by NASA's Aviation Safety Program in assessing technologies that contribute to NASA's aviation safety goals. There were three objectives for reaching this goal. First, NASA's main objectives for aviation safety were documented and their consistency was checked against the main objectives of the Aviation Safety Program. Next, the metrics used for technology investment by the Program Assessment function of AvSP were evaluated. Finally, other metrics that could be used by the Program Assessment Team (PAT) were identified and evaluated. This investigation revealed that the objectives are in fact consistent across organizational levels at NASA and with the FAA. Some of the major issues discussed in this study, which should be further investigated, are the removal of the Cost and Return-on-Investment metrics, the lack of metrics to measure the balance of investment and technology, the interdependencies between some of the metric risk driver categories, and the conflict between 'fatal accident rate' and 'accident rate' in the language of the Aviation Safety goal as stated in different sources.

  2. Patient safety competencies in undergraduate nursing students: a rapid evidence assessment.

    PubMed

    Bianchi, Monica; Bressan, Valentina; Cadorin, Lucia; Pagnucci, Nicola; Tolotti, Angela; Valcarenghi, Dario; Watson, Roger; Bagnasco, Annamaria; Sasso, Loredana

    2016-12-01

    To identify patient safety competencies and determine the clinical learning environments that facilitate the development of patient safety competencies in nursing students. Patient safety in nursing education is of key importance for health professional environments, settings and care systems. To be effective, safe nursing practice requires a good integration between increasing knowledge and the different clinical practice settings. Nurse educators have the responsibility to develop effective learning processes and ensure patient safety. Rapid Evidence Assessment. MEDLINE, CINAHL, SCOPUS and ERIC were searched, yielding 500 citations published between 1 January 2004 and 30 September 2014. Following the Rapid Evidence Assessment process, 17 studies were included in this review. Hawker's (2002) quality assessment tool was used to assess the quality of the selected studies. Undergraduate nursing students need to develop competencies to ensure patient safety. The quality of the pedagogical atmosphere in the clinical setting has an important impact on the students' overall level of competence. Active student engagement in clinical processes stimulates their critical reasoning, improves interpersonal communication and facilitates adequate supervision and feedback. Few studies describe the nursing students' patient safety competencies and exactly what they need to learn. In addition, studies describe only briefly which clinical learning environments facilitate the development of patient safety competencies in nursing students. Further research is needed to identify additional pedagogical strategies and the specific characteristics of the clinical learning environments that encourage the development of nursing students' patient safety competencies. © 2016 John Wiley & Sons Ltd.

  3. Uncertainty and Variability in Physiologically-Based ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment.

  4. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  5. Technical note: Design flood under hydrological uncertainty

    NASA Astrophysics Data System (ADS)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
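
    In outline, the simplified procedure amounts to scaling the uncertainty-free design value by a correction coefficient obtained from the paper's equations for the chosen distribution, sample length, and return period. Those equations are not reproduced here, so the factor in the sketch below is purely illustrative.

    ```python
    def corrected_design_flood(q_uncertainty_free_m3s: float, correction_factor: float) -> float:
        """Apply a multiplicative correction to an uncertainty-free design flood,
        in the spirit of the simplified UNCODE procedure; the factor itself must come
        from the paper's equations and is only a placeholder here."""
        return correction_factor * q_uncertainty_free_m3s

    # Illustrative numbers only: a 100-year design flood of 350 m^3/s from standard
    # frequency analysis and a hypothetical correction factor of 1.12 for a short record.
    print(corrected_design_flood(350.0, 1.12))  # -> 392.0 m^3/s
    ```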

  6. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression.

    PubMed

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition in terms of experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.

  7. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression

    PubMed Central

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R. Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition in terms of experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores. PMID:29632505

  8. Bayesian assessment of uncertainty in aerosol size distributions and index of refraction retrieved from multiwavelength lidar measurements.

    PubMed

    Herman, Benjamin R; Gross, Barry; Moshary, Fred; Ahmed, Samir

    2008-04-01

    We investigate the assessment of uncertainty in the inference of aerosol size distributions from backscatter and extinction measurements that can be obtained from a modern elastic/Raman lidar system with a Nd:YAG laser transmitter. To calculate the uncertainty, an analytic formula for the correlated probability density function (PDF) describing the error for an optical coefficient ratio is derived based on a normally distributed fractional error in the optical coefficients. Assuming a monomodal lognormal particle size distribution of spherical, homogeneous particles with a known index of refraction, we compare the assessment of uncertainty using a more conventional forward Monte Carlo method with that obtained from a Bayesian posterior PDF assuming a uniform prior PDF and show that substantial differences between the two methods exist. In addition, we use the posterior PDF formalism, which was extended to include an unknown refractive index, to find credible sets for a variety of optical measurement scenarios. We find the uncertainty is greatly reduced with the addition of suitable extinction measurements in contrast to the inclusion of extra backscatter coefficients, which we show to have a minimal effect and strengthens similar observations based on numerical regularization methods.
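
    The forward Monte Carlo baseline mentioned in the abstract can be sketched generically as below: perturb the optical coefficients with a normally distributed fractional error and look at the induced spread of a derived ratio. The coefficient values and the 10% error are assumptions for illustration; the paper's Bayesian posterior treatment of size-distribution parameters is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed "true" backscatter (m^-1 sr^-1) and extinction (m^-1) coefficients and a
    # common 10 % normally distributed fractional measurement error (illustrative values).
    beta_true, alpha_true, frac_err = 1.0e-6, 3.0e-5, 0.10
    n = 100_000

    beta = beta_true * (1.0 + frac_err * rng.standard_normal(n))
    alpha = alpha_true * (1.0 + frac_err * rng.standard_normal(n))

    # Forward Monte Carlo estimate of the uncertainty in the extinction-to-backscatter
    # ratio (lidar ratio); a Bayesian posterior would instead weight candidate
    # microphysical parameters by how well they reproduce the measured optics.
    lidar_ratio = alpha / beta
    print(lidar_ratio.mean(), lidar_ratio.std())
    ```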

  9. Edible safety requirements and assessment standards for agricultural genetically modified organisms.

    PubMed

    Deng, Pingjian; Zhou, Xiangyang; Zhou, Peng; Du, Zhong; Hou, Hongli; Yang, Dongyan; Tan, Jianjun; Wu, Xiaojin; Zhang, Jinzhou; Yang, Yongcun; Liu, Jin; Liu, Guihua; Li, Yonghong; Liu, Jianjun; Yu, Lei; Fang, Shisong; Yang, Xiaoke

    2008-05-01

    This paper describes the background, principles, concepts and methods of framing the technical regulation for edible safety requirement and assessment of agricultural genetically modified organisms (agri-GMOs) for Shenzhen Special Economic Zone in the People's Republic of China. It provides a set of systematic criteria for edible safety requirements and the assessment process for agri-GMOs. First, focusing on the degree of risk and impact of different agri-GMOs, we developed hazard grades for toxicity, allergenicity, anti-nutrition effects, and unintended effects and standards for the impact type of genetic manipulation. Second, for assessing edible safety, we developed indexes and standards for different hazard grades of recipient organisms, for the influence of types of genetic manipulation and hazard grades of agri-GMOs. To evaluate the applicability of these criteria and their congruency with other safety assessment systems for GMOs applied by related organizations all over the world, we selected some agri-GMOs (soybean, maize, potato, capsicum and yeast) as cases to put through our new assessment system, and compared our results with the previous assessments. It turned out that the result of each of the cases was congruent with the original assessment.

  10. There is more to risk and safety planning than dramatic risks: Mental health nurses' risk assessment and safety-management practice.

    PubMed

    Higgins, Agnes; Doyle, Louise; Downes, Carmel; Morrissey, Jean; Costello, Paul; Brennan, Michael; Nash, Michael

    2016-04-01

    Risk assessment and safety planning are considered a cornerstone of mental health practice, yet limited research exists into how mental health nurses conceptualize 'risk' and how they engage with risk assessment and safety planning. The aim of the present study was to explore mental health nurses' practices and confidence in risk assessment and safety planning. A self-completed survey was administered to 381 mental health nurses in Ireland. The findings indicate that nurses focus on risk to self and risk to others, with the risk of suicide, self-harm, substance abuse, and violence being most frequently assessed. Risk from others and 'iatrogenic' risk were less frequently considered. Overall, there was limited evidence of recovery-oriented practice in relation to risk. The results demonstrate a lack of meaningful engagement with respect to collaborative safety planning, the identification and inclusion of protective factors, and the inclusion of positive risk-taking opportunities. In addition, respondents report a lack of confidence working with positive risk taking and involving family/carers in the risk-assessment and safety-planning process. Gaps in knowledge about risk-assessment and safety-planning practice, which could be addressed through education, are identified, as are the implications of the findings for practice and research. © 2015 Australian College of Mental Health Nurses Inc.

  11. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in recent decades. An assessment of groundwater contamination risk is needed to provide a sound basis for risk-based management decisions. Therefore, the objective of this study is to develop an integrated fuzzy stochastic approach to evaluate risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site within a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support for identifying proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.

  12. Assessing concentration uncertainty estimates from passive microwave sea ice products

    NASA Astrophysics Data System (ADS)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  13. [Safety assessment of foods derived from genetically modified plants].

    PubMed

    Pöting, A; Schauzu, M

    2010-06-01

    The placing of genetically modified plants and derived food on the market falls under Regulation (EC) No. 1829/2003. According to this regulation, applicants need to perform a safety assessment according to the Guidance Document of the Scientific Panel on Genetically Modified Organisms of the European Food Safety Authority (EFSA), which is based on internationally agreed recommendations. This article gives an overview of the underlying legislation as well as the strategy and scientific criteria for the safety assessment, which should generally be based on the concept of substantial equivalence and carried out in relation to an unmodified conventional counterpart. Besides the intended genetic modification, potential unintended changes also have to be assessed with regard to potential adverse effects for the consumer. All genetically modified plants and derived food products, which have been evaluated by EFSA so far, were considered to be as safe as products derived from the respective conventional plants.

  14. Multi-model ensembles for assessment of flood losses and associated uncertainty

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Schröter, Kai; Weiss-Motz, Alexander; Martina, Mario L. V.; Kreibich, Heidi

    2018-05-01

    Flood loss modelling is a crucial part of risk assessments. However, it is subject to large uncertainty that is often neglected. Most models available in the literature are deterministic, providing only single point estimates of flood loss, and large disparities tend to exist among them. Adopting any one such model in a risk assessment context is likely to lead to inaccurate loss estimates and sub-optimal decision-making. In this paper, we propose the use of multi-model ensembles to address these issues. This approach, which has been applied successfully in other scientific fields, is based on the combination of different model outputs with the aim of improving the skill and usefulness of predictions. We first propose a model rating framework to support ensemble construction, based on a probability tree of model properties, which establishes relative degrees of belief between candidate models. Using 20 flood loss models in two test cases, we then construct numerous multi-model ensembles, based both on the rating framework and on a stochastic method, differing in terms of participating members, ensemble size and model weights. We evaluate the performance of ensemble means, as well as their probabilistic skill and reliability. Our results demonstrate that well-designed multi-model ensembles represent a pragmatic approach to consistently obtain more accurate flood loss estimates and reliable probability distributions of model uncertainty.
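
    A weighted multi-model combination of loss estimates can be sketched as follows; the three model outputs and the weights (standing in for relative degrees of belief from a rating framework) are invented, not the paper's models or values.

    ```python
    import numpy as np

    # Invented losses (EUR) predicted by three flood loss models for one event, and
    # illustrative weights expressing relative degrees of belief in each model.
    losses = np.array([1.2e6, 0.8e6, 1.5e6])
    weights = np.array([0.5, 0.2, 0.3])

    ensemble_mean = np.average(losses, weights=weights)
    spread = np.sqrt(np.average((losses - ensemble_mean) ** 2, weights=weights))
    print(ensemble_mean, spread)  # point estimate plus a crude measure of inter-model uncertainty
    ```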

  15. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  16. Safety assessment of Vitis vinifera (grape)-derived ingredients as used in cosmetics.

    PubMed

    Fiume, Monice M; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2014-01-01

    The Cosmetic Ingredient Review Expert Panel (Panel) assessed the safety of 24 Vitis vinifera (grape)-derived ingredients and found them safe in the present practices of use and concentration in cosmetics. These ingredients function in cosmetics mostly as skin-conditioning agents, but some function as antioxidants, flavoring agents, and/or colorants. The Panel reviewed the available animal and clinical data to determine the safety of these ingredients. Additionally, some constituents of grapes have been assessed previously for safety as cosmetic ingredients by the Panel, and others are compounds that have been discussed in previous Panel safety assessments. © The Author(s) 2014.

  17. Transforming Medical Assessment: Integrating Uncertainty Into the Evaluation of Clinical Reasoning in Medical Education.

    PubMed

    Cooke, Suzette; Lemay, Jean-Francois

    2017-06-01

    In an age where practicing physicians have access to an overwhelming volume of clinical information and are faced with increasingly complex medical decisions, the ability to execute sound clinical reasoning is essential to optimal patient care. The authors propose two concepts that are philosophically paramount to the future assessment of clinical reasoning in medicine: assessment in the context of "uncertainty" (when, despite all of the information that is available, there is still significant doubt as to the best diagnosis, investigation, or treatment), and acknowledging that it is entirely possible (and reasonable) to have more than "one correct answer." The purpose of this article is to highlight key elements related to these two core concepts and discuss genuine barriers that currently exist on the pathway to creating such assessments. These include acknowledging situations of uncertainty, creating clear frameworks that define progressive levels of clinical reasoning skills, providing validity evidence to increase the defensibility of such assessments, considering the comparative feasibility with other forms of assessment, and developing strategies to evaluate the impact of these assessment methods on future learning and practice. The authors recommend that concerted efforts be directed toward these key areas to help advance the field of clinical reasoning assessment, improve the clinical care decisions made by current and future physicians, and have positive outcomes for patients. It is anticipated that these and subsequent efforts will aid in reaching the goal of making future assessment in medical education more representative of current-day clinical reasoning and decision making.

  18. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  19. Quantitative safety assessment of air traffic control systems through system control capacity

    NASA Astrophysics Data System (ADS)

    Guo, Jingjing

    Quantitative Safety Assessments (QSA) are essential to safety benefit verification and regulations of developmental changes in safety-critical systems like the Air Traffic Control (ATC) systems. Effectiveness of the assessments is particularly desirable today in the safe implementations of revolutionary ATC overhauls like NextGen and SESAR. QSA of ATC systems is, however, challenged by system complexity and a lack of accident data. Extending from the idea "safety is a control problem" in the literature, this research proposes to assess system safety from the control perspective, through quantifying a system's "control capacity". A system's safety performance correlates to this "control capacity" in the control of "safety critical processes". To examine this idea in QSA of the ATC systems, a Control-capacity Based Safety Assessment Framework (CBSAF) is developed which includes two control capacity metrics and a procedural method. The two metrics are Probabilistic System Control-capacity (PSC) and Temporal System Control-capacity (TSC); each addresses an aspect of a system's control capacity. The procedural method consists of three general stages: I) identification of safety critical processes, II) development of system control models, and III) evaluation of system control capacity. The CBSAF was tested in two case studies. The first one assesses an en-route collision avoidance scenario and compares three hypothetical configurations. The CBSAF was able to capture the uncoordinated behavior between two means of control, as was observed in a historic midair collision accident. The second case study compares CBSAF with an existing risk based QSA method in assessing the safety benefits of introducing a runway incursion alert system. Similar conclusions are reached between the two methods, while the CBSAF has the advantage of simplicity and provides a new control-based perspective and interpretation to the assessments. The case studies are intended to investigate the

  20. Qalibra: a general model for food risk-benefit assessment that quantifies variability and uncertainty.

    PubMed

    Hart, Andy; Hoekstra, Jeljer; Owen, Helen; Kennedy, Marc; Zeilmaker, Marco J; de Jong, Nynke; Gunnlaugsdottir, Helga

    2013-04-01

    The EU project BRAFO proposed a framework for risk-benefit assessment of foods, or changes in diet, that present both potential risks and potential benefits to consumers (Hoekstra et al., 2012a). In higher tiers of the BRAFO framework, risks and benefits are integrated quantitatively to estimate net health impact measured in DALYs or QALYs (disability- or quality-adjusted life years). This paper describes a general model that was developed by a second EU project, Qalibra, to assist users in conducting these assessments. Its flexible design makes it applicable to a wide range of dietary questions involving different nutrients, contaminants and health effects. Account can be taken of variation between consumers in their diets and also other characteristics relevant to the estimation of risk and benefit, such as body weight, gender and age. Uncertainty in any input parameter may be quantified probabilistically, using probability distributions, or deterministically by repeating the assessment with alternative assumptions. Uncertainties that are not quantified should be evaluated qualitatively. Outputs produced by the model are illustrated using results from a simple assessment of fish consumption. More detailed case studies on oily fish and phytosterols are presented in companion papers. The model can be accessed as web-based software at www.qalibra.eu. Copyright © 2012. Published by Elsevier Ltd.
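
    The core bookkeeping of a quantified-uncertainty risk-benefit integration can be sketched as below: a net health impact per person-year with parameter uncertainty carried through by sampling. The DALY rates and distributions are invented for illustration and are not Qalibra model outputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Invented per-person annual DALY rates: harm from a contaminant and benefit from a
    # nutrient, each given a distribution that expresses parameter uncertainty.
    daly_lost = rng.lognormal(mean=np.log(4.0e-4), sigma=0.4, size=50_000)
    daly_gained = rng.lognormal(mean=np.log(1.1e-3), sigma=0.3, size=50_000)

    net_benefit = daly_gained - daly_lost  # DALYs averted per person-year
    print(net_benefit.mean(), (net_benefit > 0).mean())  # mean impact and P(net benefit > 0)
    ```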

  1. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  2. Assessment of zero gravity effects on space worker health and safety

    NASA Technical Reports Server (NTRS)

    1980-01-01

    One objective of the study is to assess the effects of all currently known deviations from normal of medical, physiological, and biochemical parameters which appear to be due to the zero gravity (zero-g) environment and to the acceleration and deceleration to be experienced by space workers, as outlined in the reference Solar Power Satellite (SPS) design. Study results include identification of possible health or safety effects on space workers, either immediate or delayed, due to the zero gravity environment and acceleration and deceleration; estimation of the probability that an individual will be adversely affected; description of the possible consequences to work efficiency in persons adversely affected; and description of the possible/probable consequences to immediate and future health of individuals exposed to this environment. A research plan, which addresses the uncertainties in current knowledge regarding the health and safety hazards to exposed SPS space workers, is presented. Although most adverse effects experienced during space flight soon disappeared upon return to the Earth's environment, there remains a definite concern for the long-term effects to SPS space workers who might spend as much as half their time in space during a possible five-year career period. The proposed 90-day up/90-day down cycle, coupled with the fact that most of the effects of weightlessness may persist throughout the flight, along with the realization that recovery may occupy much of the terrestrial stay, may keep the SPS workers in a deviant physical condition or state of flux for 60 to 100% of their five-year career.

  3. Assessing Uncertainties in Gridded Emissions: A Case Study for Fossil Fuel Carbon Dioxide (FFCO2) Emission Data

    NASA Technical Reports Server (NTRS)

    Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.

    2017-01-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar

  4. Assessing uncertainties in gridded emissions: A case study for fossil fuel carbon dioxide (FFCO2) emission data

    NASA Astrophysics Data System (ADS)

    Oda, T.; Ott, L. E.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M. O.; Baker, D. F.; Pawson, S.

    2017-12-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar

  5. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with the Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with the Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood
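
    The Gaussian residual assumption that the paper tests can be written down as a standard i.i.d. Gaussian log-likelihood, as in the sketch below; the formal generalized likelihood of Schoups and Vrugt (2010) relaxes exactly these assumptions and is not reproduced here. The residual values are invented.

    ```python
    import numpy as np

    def gaussian_loglik(residuals: np.ndarray, sigma: float) -> float:
        """Log-likelihood of i.i.d. Gaussian residuals with standard deviation sigma;
        the assumption the paper finds violated for its reactive transport model."""
        n = residuals.size
        return -0.5 * n * np.log(2.0 * np.pi * sigma ** 2) - np.sum(residuals ** 2) / (2.0 * sigma ** 2)

    # Illustrative residuals (observed minus simulated concentrations) only.
    res = np.array([0.02, -0.05, 0.01, 0.08, -0.03])
    print(gaussian_loglik(res, sigma=0.05))
    ```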

  6. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. There are two main uncertainties surrounding land use data, positional and categorical. This paper focuses on the second one, as the first one has in general less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to inform on a basic requirement to propagate uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.

  7. Safety Assessment of Alumina and Aluminum Hydroxide as Used in Cosmetics.

    PubMed

    Becker, Lillian C; Boyer, Ivan; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2016-11-01

    This is a safety assessment of alumina and aluminum hydroxide as used in cosmetics. Alumina functions as an abrasive, absorbent, anticaking agent, bulking agent, and opacifying agent. Aluminum hydroxide functions as a buffering agent, corrosion inhibitor, and pH adjuster. The Food and Drug Administration (FDA) evaluated the safe use of alumina in several medical devices and aluminum hydroxide in over-the-counter drugs, which included a review of human and animal safety data. The Cosmetic Ingredient Review (CIR) Expert Panel considered the FDA evaluations as part of the basis for determining the safety of these ingredients as used in cosmetics. Alumina used in cosmetics is essentially the same as that used in medical devices. This safety assessment does not include metallic or elemental aluminum as a cosmetic ingredient. The CIR Expert Panel concluded that alumina and aluminum hydroxide are safe in the present practices of use and concentration described in this safety assessment. © The Author(s) 2016.

  8. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims to provide a self-sustained fuel cycle. The neutron reactions that compose the different microscopic cross-sections and angular distributions are uncertain, so when they are used to determine the spatial distribution of the neutron flux in a nuclear reactor, a methodology is needed to account for the associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the different neutron-reaction (as well as angular distribution) covariances that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important safety parameters related to multi-physics, such as peak cladding temperature along the axial direction of an RBWR fuel assembly. The objective of this study is to quantify the impact that ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have on the deterministic safety analysis of novel nuclear reactor designs.
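
    The Total Monte Carlo idea can be outlined as a loop over random nuclear-data realisations, rerunning the full coupled model each time and taking the spread of the output as the nuclear-data-induced uncertainty. The model call below is a placeholder returning synthetic peak cladding temperatures, not the coupled neutronic/thermal-hydraulic calculation used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def run_coupled_model(nd_sample_id: int) -> float:
        """Placeholder for one coupled neutronic/thermal-hydraulic run driven by the
        nd_sample_id-th random nuclear-data file (e.g. a TENDL random file); here it
        simply returns a synthetic peak cladding temperature in kelvin."""
        return 620.0 + 15.0 * rng.standard_normal()

    # Total Monte Carlo in outline: one full model run per random nuclear-data file.
    pct = np.array([run_coupled_model(i) for i in range(300)])
    print(pct.mean(), pct.std())  # spread attributable to nuclear-data uncertainty
    ```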

  9. Health and Safety Checklist for Early Care and Education Programs to Assess Key National Health and Safety Standards.

    PubMed

    Alkon, Abbey; Rose, Roberta; Wolff, Mimi; Kotch, Jonathan B; Aronson, Susan S

    2016-01-01

    The project aims were to (1) develop an observational Health and Safety Checklist to assess health and safety practices and conditions in early care and education (ECE) programs using Stepping Stones To Caring For Our Children, 3rd Edition national standards, (2) pilot test the Checklist, completed by nurse child care health consultants, to assess feasibility, ease of completion, objectivity, validity, and reliability, and (3) revise the Checklist based on the qualitative and quantitative results of the pilot study. The observable national health and safety standards were identified and then rated by health, safety, and child care experts using a Delphi technique to validate the standards as essential to prevent harm and promote health. Then, child care health consultants recruited ECE centers and pilot tested the 124-item Checklist. The pilot study was conducted in Arizona, California and North Carolina. The psychometric properties of the Checklist were assessed. The 37 participating ECE centers had 2627 children from ethnically-diverse backgrounds and primarily low-income families. The child care health consultants found the Checklist easy to complete, objective, and useful for planning health and safety interventions. The Checklist had content and face validity, inter-rater reliability, internal consistency, and concurrent validity. Based on the child care health consultant feedback and psychometric properties of the Checklist, the Checklist was revised and re-written at an 8th grade literacy level. The Health and Safety Checklist provides a standardized instrument of observable, selected national standards to assess the quality of health and safety in ECE centers.

  10. Multi-year assessment of soil-vegetation-atmosphere transfer (SVAT) modeling uncertainties over a Mediterranean agricultural site

    NASA Astrophysics Data System (ADS)

    Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.

    2012-04-01

    Vegetation productivity and water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes through land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. In addition, uncertainties attached to the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper consists in implementing the SVAT model and assessing its uncertainties over a long period of time (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainties are characterized: - Uncertainties in atmospheric forcing are assessed by comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested by comparing simulations fed with standard values from global databases (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation. The SVAT model being analyzed in this paper is ISBA in its a-gs version, which simulates the photosynthesis and its coupling with the stomata

  11. Environmental assessment overview

    NASA Technical Reports Server (NTRS)

    Valentino, A. R.

    1980-01-01

    The assessment program has as its objectives: to identify the environmental issues associated with the SPS Reference System; to prepare a preliminary assessment based on existing data; to suggest mitigating strategies and provide environmental data and guidance to other components of the program as required; and to plan long-range research to reduce the uncertainty in the preliminary assessment. The key environmental issues associated with the satellite power system are discussed and include human health and safety, ecosystems, climate, and interaction with electromagnetic systems.

  12. Uncertainty propagation in life cycle assessment of biodiesel versus diesel: global warming and non-renewable energy.

    PubMed

    Hong, Jinglan

    2012-06-01

    Uncertainty information is essential for the proper use of life cycle assessment and environmental assessments in decision making. To investigate the uncertainties of biodiesel and determine the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel, an explicit analytical approach based on the Taylor series expansion for lognormal distributions was applied in the present study. A biodiesel case study demonstrates that the probability of biodiesel having a lower global warming score and a lower non-renewable energy score than diesel is 92.3% and 93.1%, respectively. The results indicate the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel based on the global warming and non-renewable energy scores. Copyright © 2011 Elsevier Ltd. All rights reserved.
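
    The probability statement in the abstract can be approximated numerically by sampling two lognormal scores and counting how often biodiesel comes out lower, as sketched below; the paper instead derives this analytically via a Taylor series expansion, and the medians and geometric standard deviations used here are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Invented lognormal life cycle scores (e.g. kg CO2-eq per MJ of fuel): medians and
    # geometric standard deviations are illustrative, not the study's inventory results.
    biodiesel = rng.lognormal(mean=np.log(0.060), sigma=np.log(1.3), size=200_000)
    diesel = rng.lognormal(mean=np.log(0.085), sigma=np.log(1.2), size=200_000)

    # Probability that biodiesel scores lower (better) than diesel.
    print((biodiesel < diesel).mean())
    ```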

  13. Numerical Simulation and Quantitative Uncertainty Assessment of Microchannel Flow

    NASA Astrophysics Data System (ADS)

    Debusschere, Bert; Najm, Habib; Knio, Omar; Matta, Alain; Ghanem, Roger; Le Maitre, Olivier

    2002-11-01

    This study investigates the effect of uncertainty in physical model parameters on computed electrokinetic flow of proteins in a microchannel with a potassium phosphate buffer. The coupled momentum, species transport, and electrostatic field equations give a detailed representation of electroosmotic and pressure-driven flow, including sample dispersion mechanisms. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. To quantify uncertainty, the governing equations are reformulated using a pseudo-spectral stochastic methodology, which uses polynomial chaos expansions to describe uncertain/stochastic model parameters, boundary conditions, and flow quantities. Integration of the resulting equations for the spectral mode strengths gives the evolution of all stochastic modes for all variables. Results show the spatiotemporal evolution of uncertainties in predicted quantities and highlight the dominant parameters contributing to these uncertainties during various flow phases. This work is supported by DARPA.
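
    For orientation, the sketch below (Python) illustrates the polynomial chaos idea in its simplest non-intrusive form: a stand-in scalar model with one Gaussian uncertain parameter is projected onto probabilists' Hermite polynomials, and output statistics are read off the spectral mode strengths. The model, parameter values, and truncation order are assumed for illustration and are unrelated to the electrokinetic flow solver of the study.

      import numpy as np
      from math import factorial
      from numpy.polynomial.hermite_e import hermegauss, hermeval

      def model(k):
          """Hypothetical scalar response as a function of one uncertain parameter k."""
          return np.exp(-0.5 * k)

      mean_k, std_k = 1.0, 0.2                # assumed Gaussian parameter uncertainty
      order = 4                               # truncation order of the expansion
      nodes, weights = hermegauss(order + 1)  # probabilists' Gauss-Hermite quadrature
      resp = model(mean_k + std_k * nodes)    # model evaluated at the quadrature nodes

      # Project the response onto probabilists' Hermite polynomials He_n (spectral modes)
      coeffs = []
      for n in range(order + 1):
          he_n = hermeval(nodes, [0] * n + [1])
          coeffs.append(np.sum(weights * resp * he_n) / np.sum(weights * he_n ** 2))

      # Output statistics follow directly from the spectral mode strengths
      mean_out = coeffs[0]
      var_out = sum(c ** 2 * factorial(n) for n, c in enumerate(coeffs) if n > 0)
      print(f"mean ≈ {mean_out:.4f}, variance ≈ {var_out:.6f}")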

  14. Environment, Safety, and Health Self-Assessment Report, Fiscal Year 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chernowski, John

    2009-02-27

    Lawrence Berkeley National Laboratory's Environment, Safety, and Health (ES&H) Self-Assessment Program ensures that Integrated Safety Management (ISM) is implemented institutionally and by all divisions. The Self-Assessment Program, managed by the Office of Contract Assurance (OCA), provides for an internal evaluation of all ES&H programs and systems at LBNL. The functions of the program are to ensure that work is conducted safely, and with minimal negative impact to workers, the public, and the environment. The Self-Assessment Program is also the mechanism used to institute continuous improvements to the Laboratory's ES&H programs. The program is described in LBNL/PUB 5344, Environment, Safety, and Health Self-Assessment Program and is composed of four distinct assessments: the Division Self-Assessment, the Management of Environment, Safety, and Health (MESH) review, ES&H Technical Assurance, and the Appendix B Self-Assessment. The Division Self-Assessment uses the five core functions and seven guiding principles of ISM as the basis of evaluation. Metrics are created to measure performance in fulfilling ISM core functions and guiding principles, as well as promoting compliance with applicable regulations. The five core functions of ISM are as follows: (1) Define the Scope of Work; (2) Identify and Analyze Hazards; (3) Control the Hazards; (4) Perform the Work; and (5) Feedback and Improvement. The seven guiding principles of ISM are as follows: (1) Line Management Responsibility for ES&H; (2) Clear Roles and Responsibilities; (3) Competence Commensurate with Responsibilities; (4) Balanced Priorities; (5) Identification of ES&H Standards and Requirements; (6) Hazard Controls Tailored to the Work Performed; and (7) Operations Authorization. Performance indicators are developed by consensus with OCA, representatives from each division, and Environment, Health, and Safety (EH&S) Division program managers. Line management of each division performs the Division

  15. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  16. Towards Measurement of Confidence in Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim

    2011-01-01

    Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements are systematically collected, quantitative arguments offer far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
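
    The calculation a Bayesian Network performs over such an argument fragment can be sketched by direct enumeration, as below (Python). The conditional probability table and evidence confidences are purely hypothetical, standing in for a GSN goal supported by two independent items of evidence.

      def claim_confidence(p_e1, p_e2, cpt):
          """P(top-level claim holds) given two independent items of evidence.

          cpt[(e1, e2)] = P(claim | evidence states e1, e2), each state True/False.
          """
          total = 0.0
          for e1 in (True, False):
              for e2 in (True, False):
                  p_state = (p_e1 if e1 else 1 - p_e1) * (p_e2 if e2 else 1 - p_e2)
                  total += cpt[(e1, e2)] * p_state
          return total

      # Hypothetical conditional probability table for a goal such as
      # "the separation function is acceptably safe"
      cpt = {(True, True): 0.99, (True, False): 0.70,
             (False, True): 0.60, (False, False): 0.05}
      print(claim_confidence(p_e1=0.95, p_e2=0.85, cpt=cpt))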

  17. Impact of Pilot Delay and Non-Responsiveness on the Safety Performance of Airborne Separation

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria; Hoadley, Sherwood; Wing, David; Baxley, Brian; Allen, Bonnie Danette

    2008-01-01

    Assessing the safety effects of prediction errors and uncertainty on automation-supported functions in the Next Generation Air Transportation System concept of operations is of foremost importance, particularly for safety-critical functions such as separation that involve human decision-making. Both ground-based and airborne automation of separation functions must be designed to account for, and mitigate the impact of, information uncertainty and varying human response. This paper describes an experiment that addresses the potential impact of operator delay when interacting with separation support systems. In this study, we evaluated an airborne separation capability operated by a simulated pilot. The experimental runs are part of the Safety Performance of Airborne Separation (SPAS) experiment suite that examines the safety implications of prediction errors and system uncertainties on airborne separation assistance systems. Pilot actions required by the airborne separation automation to resolve traffic conflicts were delayed within a wide range, varying from five to 240 seconds, while a percentage of randomly selected pilots were programmed to completely miss the conflict alerts and therefore take no action. Results indicate that the strategic Airborne Separation Assistance System (ASAS) functions exercised in the experiment can sustain pilot response delays of up to 90 seconds or more, depending on the traffic density. However, when pilots or operators fail to respond to conflict alerts, the safety effects are substantial, particularly at higher traffic densities.

  18. Safety assessment of modified terephthalate polymers as used in cosmetics.

    PubMed

    Becker, Lillian C; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2014-01-01

    The safety of 6 modified terephthalate polymers as cosmetic ingredients was assessed. These ingredients mostly function as exfoliants, bulking agents, hair fixatives, and viscosity-increasing agents-nonaqueous. Polyethylene terephthalate (PET) is used in leave-on products up to 100% and in rinse-off products up to 2%. The Cosmetic Ingredient Review Expert Panel (Panel) considered that the PET used in cosmetics is chemically equivalent to that used in medical devices. The Panel determined that the Food and Drug Administration's determination of safety of PET in several medical devices, which included human and animal safety data, can be used as the basis for the determination of safety of PET and related polymers used in cosmetics. Use studies of cosmetic eye products that contain PET demonstrated no ocular irritation or dermal sensitization. The Panel concluded that modified terephthalate polymers were safe as cosmetic ingredients in the practices of use and concentration described in this safety assessment. © The Author(s) 2014.

  19. Uncertainty-Sensitive Heterogeneous Information Fusion: Assessing Threat with Soft, Uncertain, and Conflicting Evidence

    DTIC Science & Technology

    2016-01-01

    planning exercises and wargaming. Initial Experimentation Late in the research, the prototype platform and the various fusion methods came together. This... Chapter Four points to prior research... in mind multimethod fusing of complex information... our research is assessing the threat of terrorism posed by individuals or groups under scrutiny. Broadly, the ultimate objectives, which go well

  20. Risk assessments of regional climate change over Europe: generation of probabilistic ensemble and uncertainty assessment for EURO-CORDEX

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Kopp, R. E.

    2017-12-01

    Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are that CMIP5 model runs, which drive EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections, and that climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue, in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11º) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs in RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the 5 runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) deviating from the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving, e.g. the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX are small in winter. In summer, the temperatures of EURO

  1. Simulating geriatric home safety assessments in a three-dimensional virtual world.

    PubMed

    Andrade, Allen D; Cifuentes, Pedro; Mintzer, Michael J; Roos, Bernard A; Anam, Ramanakumar; Ruiz, Jorge G

    2012-01-01

    Virtual worlds could offer inexpensive and safe three-dimensional environments in which medical trainees can learn to identify home safety hazards. Our aim was to evaluate the feasibility, usability, and acceptability of virtual worlds for geriatric home safety assessments and to correlate performance efficiency in hazard identification with spatial ability, self-efficacy, cognitive load, and presence. In this study, 30 medical trainees found the home safety simulation easy to use, and their self-efficacy was improved. Men performed better than women in hazard identification. Presence and spatial ability were correlated significantly with performance. Educators should consider spatial ability and gender differences when implementing virtual world training for geriatric home safety assessments.

  2. Hierarchical Bayesian Approach To Reduce Uncertainty in the Aquatic Effect Assessment of Realistic Chemical Mixtures.

    PubMed

    Oldenkamp, Rik; Hendriks, Harrie W M; van de Meent, Dik; Ragas, Ad M J

    2015-09-01

    Species in the aquatic environment differ in their toxicological sensitivity to the various chemicals they encounter. In aquatic risk assessment, this interspecies variation is often quantified via species sensitivity distributions. Because the information available for the characterization of these distributions is typically limited, optimal use of information is essential to reduce uncertainty involved in the assessment. In the present study, we show that the credibility intervals on the estimated potentially affected fraction of species after exposure to a mixture of chemicals at environmentally relevant surface water concentrations can be extremely wide if a classical approach is followed, in which each chemical in the mixture is considered in isolation. As an alternative, we propose a hierarchical Bayesian approach, in which knowledge on the toxicity of chemicals other than those assessed is incorporated. A case study with a mixture of 13 pharmaceuticals demonstrates that this hierarchical approach results in more realistic estimations of the potentially affected fraction, as a result of reduced uncertainty in species sensitivity distributions for data-poor chemicals.

  3. Safety assessment of genetically modified plants with deliberately altered composition

    PubMed Central

    Halford, Nigel G; Hudson, Elizabeth; Gimson, Amy; Weightman, Richard; Shewry, Peter R; Tompkins, Steven

    2014-01-01

    The development and marketing of ‘novel’ genetically modified (GM) crops in which composition has been deliberately altered poses a challenge to the European Union (EU)'s risk assessment processes, which are based on the concept of substantial equivalence with a non-GM comparator. This article gives some examples of these novel GM crops and summarizes the conclusions of a report that was commissioned by the European Food Safety Authority on how the EU's risk assessment processes could be adapted to enable their safety to be assessed. PMID:24735114

  4. Reconciling Streamflow Uncertainty Estimation and River Bed Morphology Dynamics. Insights from a Probabilistic Assessment of Streamflow Uncertainties Using a Reliability Diagram

    NASA Astrophysics Data System (ADS)

    Morlot, T.; Mathevet, T.; Perret, C.; Favre Pugin, A. C.

    2014-12-01

    Streamflow uncertainty estimation has recently received considerable attention in the literature. A dynamic rating curve assessment method has been introduced (Morlot et al., 2014). This dynamic method computes a rating curve for each gauging and a continuous streamflow time series, together with the associated streamflow uncertainties. Streamflow uncertainty takes into account many sources of uncertainty (water level, rating curve interpolation and extrapolation, gauging aging, etc.) and produces an estimated distribution of streamflow for each day. In order to characterise streamflow uncertainty, a probabilistic framework has been applied to a large sample of hydrometric stations of the Division Technique Générale (DTG) of Électricité de France (EDF) hydrometric network (>250 stations) in France. A reliability diagram (Wilks, 1995) has been constructed for some stations, based on the streamflow distribution estimated for a given day and compared to a real streamflow observation estimated via a gauging. To build a reliability diagram, we computed the probability of an observed streamflow (gauging), given the streamflow distribution. The reliability diagram then allows us to check that the distribution of the probabilities of non-exceedance of the gaugings follows a uniform law (i.e., the quantiles should be equiprobable). Given the shape of the reliability diagram, the probabilistic calibration is characterised (underdispersion, overdispersion, bias) (Thyer et al., 2009). In this paper, we present case studies where reliability diagrams have different statistical properties for different periods. Compared to our knowledge of the river bed morphology dynamics of these hydrometric stations, we show how the reliability diagram gives invaluable information on river bed movements, such as continuous digging or backfilling of the hydraulic control due to erosion or sedimentation processes. Hence, the careful analysis of reliability diagrams allows us to reconcile statistics and long
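
    To illustrate how such a reliability diagram is built, the sketch below (Python, with synthetic ensembles and gaugings in place of the EDF hydrometric data) computes the probability of non-exceedance of each gauging under its predictive streamflow distribution and tabulates the empirical bin frequencies, which should be close to uniform for a well-calibrated assessment.

      import numpy as np

      rng = np.random.default_rng(0)

      # Assume, for each of 200 gauging dates, an ensemble describing the streamflow
      # distribution (m^3/s) estimated by the dynamic rating-curve method.
      n_gaugings, ensemble_size = 200, 500
      ensembles = rng.lognormal(mean=3.0, sigma=0.3, size=(n_gaugings, ensemble_size))
      gaugings = rng.lognormal(mean=3.0, sigma=0.3, size=n_gaugings)   # observed flows

      # Probability of non-exceedance of each gauging within its predictive distribution
      pit = np.array([np.mean(ens <= obs) for ens, obs in zip(ensembles, gaugings)])

      # Reliability check: empirical frequency of these probabilities per bin;
      # a well-calibrated (equiprobable) assessment gives ~10% of gaugings per bin.
      bins = np.linspace(0.0, 1.0, 11)
      freq, _ = np.histogram(pit, bins=bins)
      for lo, hi, f in zip(bins[:-1], bins[1:], freq):
          print(f"[{lo:.1f}, {hi:.1f}): {f / n_gaugings:.2%}")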

  5. Changes of heart: the switch-value method for assessing value uncertainty.

    PubMed

    John, Leslie K; Fischhoff, Baruch

    2010-01-01

    Medical choices often evoke great value uncertainty, as patients face difficult, unfamiliar tradeoffs. Those seeking to aid such choices must be able to assess patients' ability to reduce that uncertainty, to reach stable, informed choices. The authors demonstrate a new method for evaluating how well people have articulated their preferences for difficult health decisions. The method uses 2 evaluative criteria. One is internal consistency, across formally equivalent ways of posing a choice. The 2nd is compliance with principles of prospect theory, indicating sufficient task mastery to respond in predictable ways. Subjects considered a hypothetical choice between noncurative surgery and palliative care, posed by a brain tumor. The choice options were characterized on 6 outcomes (e.g., pain, life expectancy, treatment risk), using a drug facts box display. After making an initial choice, subjects indicated their willingness to switch, given plausible changes in the outcomes. These changes involved either gains (improvements) in the unchosen option or losses (worsening) in the chosen one. A 2 x 2 mixed design manipulated focal change (gains v. losses) within subjects and change order between subjects. In this demonstration, subjects' preferences were generally consistent 1) with one another: with similar percentages willing to switch for gains and losses, and 2) with prospect theory, requiring larger gains than losses, to make those switches. Informed consent requires understanding decisions well enough to articulate coherent preferences. The authors' method allows assessing individuals' success in doing so.

  6. Uncertainty Quantification in Geomagnetic Field Modeling

    NASA Astrophysics Data System (ADS)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  7. Data collection and analysis for local roadway safety assessment.

    DOT National Transportation Integrated Search

    2014-11-01

    The project Data Analysis for Local Roadway Assessment conducted systematic road-safety assessment and identified major risks that can be eliminated or reduced by practical road-improvement measures. Specifically, the primary task o...

  8. Ex-ante assessment of the safety effects of intelligent transport systems.

    PubMed

    Kulmala, Risto

    2010-07-01

    There is a need to develop a comprehensive framework for the safety assessment of Intelligent Transport Systems (ITS). This framework should: (1) cover all three dimensions of road safety (exposure, crash risk and consequence); (2) cover not only the engineering effect but also the effects due to behavioural adaptation; and (3) be compatible with the other aspects of state-of-the-art road safety theories. A framework based on nine ITS safety mechanisms is proposed and discussed with regard to the requirements set for the framework. In order to illustrate the application of the framework in practice, the paper presents a method based on the framework and the results from applying that method to twelve intelligent vehicle systems in Europe. The framework is also compared to two recent frameworks applied in the safety assessment of intelligent vehicle safety systems. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Catch me if I fall! Enacted uncertainty avoidance and the social safety net as country-level moderators in the job insecurity-job attitudes link.

    PubMed

    Debus, Maike E; Probst, Tahira M; König, Cornelius J; Kleinmann, Martin

    2012-05-01

    Job insecurity is related to many detrimental outcomes, with reduced job satisfaction and affective organizational commitment being the 2 most prominent reactions. Yet, effect sizes vary greatly, suggesting the presence of moderator variables. On the basis of Lazarus's cognitive appraisal theory, we assumed that country-level enacted uncertainty avoidance and a country's social safety net would affect an individual's appraisal of job insecurity. More specifically, we hypothesized that these 2 country-level variables would buffer the negative relationships between job insecurity and the 2 aforementioned job attitudes. Combining 3 different data sources, we tested the hypotheses in a sample of 15,200 employees from 24 countries by applying multilevel modeling. The results confirmed the hypotheses that both enacted uncertainty avoidance and the social safety net act as cross-level buffer variables. Furthermore, our data revealed that the 2 cross-level interactions share variance in explaining the 2 job attitudes. Our study responds to calls to look at stress processes from a multilevel perspective and highlights the potential importance of governmental regulation when it comes to individual stress processes. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  10. Safety assessment guidance in the International Atomic Energy Agency RADWASS Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vovk, I.F.; Seitz, R.R.

    1995-12-31

    The IAEA RADWASS programme is aimed at establishing a coherent and comprehensive set of principles and standards for the safe management of waste and formulating the guidelines necessary for their application. A large portion of this programme has been devoted to safety assessments for various waste management activities. Five Safety Guides are planned to be developed to provide general guidance to enable operators and regulators to develop the necessary framework for the safety assessment process in accordance with international recommendations. They cover predisposal, near surface disposal, geological disposal, uranium/thorium mining and milling waste, and decommissioning and environmental restoration. The Guide on safety assessment for near surface disposal is at the most advanced stage of preparation. This draft Safety Guide contains guidance on description of the disposal system, development of a conceptual model, identification and description of relevant scenarios and pathways, consequence analysis, presentation of results and confidence building. The set of RADWASS publications is currently undergoing in-depth review to ensure a harmonized approach throughout the Safety Series.

  11. Assessment of the safety of foods derived from genetically modified (GM) crops.

    PubMed

    König, A; Cockburn, A; Crevel, R W R; Debruyne, E; Grafstroem, R; Hammerling, U; Kimber, I; Knudsen, I; Kuiper, H A; Peijnenburg, A A C M; Penninks, A H; Poulsen, M; Schauzu, M; Wal, J M

    2004-07-01

    This paper provides guidance on how to assess the safety of foods derived from genetically modified crops (GM crops); it summarises conclusions and recommendations of Working Group 1 of the ENTRANSFOOD project. The paper provides an approach for adapting the test strategy to the characteristics of the modified crop and the introduced trait, and assessing potential unintended effects from the genetic modification. The proposed approach to safety assessment starts with the comparison of the new GM crop with a traditional counterpart that is generally accepted as safe based on a history of human food use (the concept of substantial equivalence). This case-focused approach ensures that foods derived from GM crops that have passed this extensive test-regime are as safe and nutritious as currently consumed plant-derived foods. The approach is suitable for current and future GM crops with more complex modifications. First, the paper reviews test methods developed for the risk assessment of chemicals, including food additives and pesticides, discussing which of these methods are suitable for the assessment of recombinant proteins and whole foods. Second, the paper presents a systematic approach to combine test methods for the safety assessment of foods derived from a specific GM crop. Third, the paper provides an overview on developments in this area that may prove of use in the safety assessment of GM crops, and recommendations for research priorities. It is concluded that the combination of existing test methods provides a sound test-regime to assess the safety of GM crops. Advances in our understanding of molecular biology, biochemistry, and nutrition may in future allow further improvement of test methods that will over time render the safety assessment of foods even more effective and informative. Copyright 2004 Elsevier Ltd.

  12. Computer-assisted uncertainty assessment of k0-NAA measurement results

    NASA Astrophysics Data System (ADS)

    Bučar, T.; Smodiš, B.

    2008-10-01

    In quantifying the measurement uncertainty of results obtained by k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result (the mass fraction of an element in the measured sample), taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of the calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable by allowing its incorporation into other applications (e.g., DLL and WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.
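
    A generic, much simplified sketch of the propagation step such a program automates is shown below (Python). The measurement equation, best estimates, and standard uncertainties are illustrative placeholders, not the actual k0-NAA formulae or nuclear data.

      import math

      def result(params):
          """Hypothetical measurement equation y = a * b / c."""
          return params["a"] * params["b"] / params["c"]

      values = {"a": 1.20, "b": 0.85, "c": 2.40}     # best estimates
      std_unc = {"a": 0.02, "b": 0.01, "c": 0.05}    # standard uncertainties

      y0 = result(values)
      combined_var = 0.0
      for name, u in std_unc.items():
          shifted = dict(values)
          shifted[name] += 1e-6 * values[name]       # small relative step
          dy_dx = (result(shifted) - y0) / (1e-6 * values[name])   # propagation factor
          combined_var += (dy_dx * u) ** 2           # uncorrelated inputs assumed

      print(f"y = {y0:.4f} ± {math.sqrt(combined_var):.4f} (k = 1)")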

  13. Estimation of Inherent Safety Margins in Loaded Commercial Spent Nuclear Fuel Casks

    DOE PAGES

    Banerjee, Kaushik; Robb, Kevin R.; Radulescu, Georgeta; ...

    2016-06-15

    We completed a novel assessment to determine the unquantified and uncredited safety margins (i.e., the difference between the licensing basis and as-loaded calculations) available in as-loaded spent nuclear fuel (SNF) casks. This assessment was performed as part of a broader effort to assess issues and uncertainties related to the continued safety of casks during extended storage and transportability following extended storage periods. Detailed analyses crediting the actual as-loaded cask inventory were performed for each of the casks at three decommissioned pressurized water reactor (PWR) sites to determine their characteristics relative to regulatory safety criteria for criticality, thermal, and shielding performance. These detailed analyses were performed in an automated fashion by employing a comprehensive and integrated data and analysis tool—Used Nuclear Fuel-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS). Calculated uncredited criticality margins from 0.07 to almost 0.30 Δkeff were observed; calculated decay heat margins ranged from 4 to almost 22 kW (as of 2014); and significant uncredited transportation dose rate margins were also observed. The results demonstrate that, at least for the casks analyzed here, significant uncredited safety margins are available that could potentially be used to compensate for SNF assembly and canister structural performance related uncertainties associated with long-term storage and subsequent transportation. The results also suggest that these inherent margins associated with how casks are loaded could support future changes in cask licensing to directly or indirectly credit the margins. Work continues to quantify the uncredited safety margins in the SNF casks loaded at other nuclear reactor sites.

  14. Assessment of elementary school safety restraint programs.

    DOT National Transportation Integrated Search

    1985-06-01

    The purpose of this research was to identify elementary school (K-6) safety belt education programs in use in the United States, to review their development, and to make administrative and impact assessments of their use in selected States. Six...

  15. Ultraviolet safety assessments of insect light traps.

    PubMed

    Sliney, David H; Gilbert, David W; Lyon, Terry

    2016-01-01

    Near-ultraviolet (UV-A: 315-400 nm), "black-light," electric lamps were invented in 1935 and ultraviolet insect light traps (ILTs) were introduced for use in agriculture around that time. Today ILTs are used indoors in several industries and in food-service as well as in outdoor settings. With recent interest in photobiological lamp safety, safety standards are being developed to test for potentially hazardous ultraviolet emissions. A variety of UV "Black-light" ILTs were measured at a range of distances to assess potential exposures. Realistic time-weighted human exposures are shown to be well below current guidelines for human exposure to ultraviolet radiation. These UV-A exposures would be far less than the typical UV-A exposure in the outdoor environment. Proposals are made for realistic ultraviolet safety standards for ILT products.

  16. Safety Assessment of Acyl Glucuronides-A Simplified Paradigm.

    PubMed

    Smith, Dennis A; Hammond, Timothy; Baillie, Thomas A

    2018-06-01

    While simple O- (ether-linked) and N-glucuronide drug conjugates generally are unreactive and considered benign from a safety perspective, the acyl glucuronides that derive from metabolism of carboxylic acid-containing xenobiotics can exhibit a degree of chemical reactivity that is dependent upon their molecular structure. As a result, concerns have arisen over the safety of acyl glucuronides as a class, several members of which have been implicated in the toxicity of their respective parent drugs. However, direct evidence in support of these claims remains sparse, and due to frequently encountered species differences in the systemic exposure to acyl glucuronides (both of the parent drug and oxidized derivatives thereof), coupled with their instability in aqueous media and potential to undergo chemical rearrangement (acyl migration), qualification of these conjugates by traditional safety assessment methods can be very challenging. In this Commentary, we discuss alternative (non-acyl glucuronide) mechanisms by which carboxylic acids may cause serious adverse reactions, and propose a novel, practical approach to compare systemic exposure to acyl glucuronide metabolites in humans to that in animal species used in preclinical safety assessment based on relative estimates of the total body burden of these circulating conjugates. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.

  17. Uncertainty in the Modeling of Tsunami Sediment Transport

    NASA Astrophysics Data System (ADS)

    Jaffe, B. E.; Sugawara, D.; Goto, K.; Gelfenbaum, G. R.; La Selle, S.

    2016-12-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. A recent study (Jaffe et al., 2016) explores sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami properties, study site characteristics, available input data, sediment grain size, and the model used. Although uncertainty has the potential to be large, case studies for both forward and inverse models have shown that sediment transport modeling provides useful information on tsunami inundation and hydrodynamics that can be used to improve tsunami hazard assessment. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and the development of hybrid modeling approaches to exploit the strengths of forward and inverse models. As uncertainty in tsunami sediment transport modeling is reduced, and with increased ability to quantify uncertainty, the geologic record of tsunamis will become more valuable in the assessment of tsunami hazard. Jaffe, B., Goto, K., Sugawara, D., Gelfenbaum, G., and La Selle, S., "Uncertainty in Tsunami Sediment Transport Modeling", Journal of Disaster Research Vol. 11 No. 4, pp. 647-661, 2016, doi: 10.20965/jdr.2016.p0647 https://www.fujipress.jp/jdr/dr/dsstr001100040647/

  18. Uncertainty Quantification in High Throughput Screening: Applications to Models of Endocrine Disruption, Cytotoxicity, and Zebrafish Development (GRC Drug Safety)

    EPA Science Inventory

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...

  19. Comparative health and safety assessment of the SPS and alternative electrical generation systems

    NASA Astrophysics Data System (ADS)

    Habegger, L. J.; Gasper, J. R.; Brown, C. D.

    1980-07-01

    A comparative analysis of health and safety risks is presented for the Satellite Power System and five alternative baseload electrical generation systems: a low-Btu coal gasification system with an open-cycle gas turbine combined with a steam topping cycle; a light water fission reactor system without fuel reprocessing; a liquid metal fast breeder fission reactor system; a central station terrestrial photovoltaic system; and a first generation fusion system with magnetic confinement. For comparison, risk from a decentralized roof-top photovoltaic system with battery storage is also evaluated. Quantified estimates of public and occupational risks within ranges of uncertainty were developed for each phase of the energy system. The potential significance of related major health and safety issues that remain unquantified are also discussed.

  20. Comparative health and safety assessment of the SPS and alternative electrical generation systems

    NASA Technical Reports Server (NTRS)

    Habegger, L. J.; Gasper, J. R.; Brown, C. D.

    1980-01-01

    A comparative analysis of health and safety risks is presented for the Satellite Power System and five alternative baseload electrical generation systems: a low-Btu coal gasification system with an open-cycle gas turbine combined with a steam topping cycle; a light water fission reactor system without fuel reprocessing; a liquid metal fast breeder fission reactor system; a central station terrestrial photovoltaic system; and a first generation fusion system with magnetic confinement. For comparison, risk from a decentralized roof-top photovoltaic system with battery storage is also evaluated. Quantified estimates of public and occupational risks within ranges of uncertainty were developed for each phase of the energy system. The potential significance of related major health and safety issues that remain unquantified are also discussed.

  1. A TIERED APPROACH TO LIFE STAGES TESTING FOR AGRICULTURAL CHEMICAL SAFETY ASSESSMENT

    EPA Science Inventory

    A proposal has been developed by the Agricultural Chemical Safety Assessment (ACSA) Technical Committee of the ILSI Health and Environmental Sciences Institute (HESI) for an improved approach to assessing the safety of crop protection chemicals. The goal is to ensure that studie...

  2. Bayesian Assessment of the Uncertainties of Estimates of a Conceptual Rainfall-Runoff Model Parameters

    NASA Astrophysics Data System (ADS)

    Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.

    2014-12-01

    This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for developing the studies. In this paper, we used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classical Normal likelihood, r ~ N(0, σ²); and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach was adequate for the proposed objectives and reinforced the importance of assessing the uncertainties associated with hydrological modeling.
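
    For readers unfamiliar with the workflow, the sketch below (Python) illustrates Bayesian parameter inference by MCMC under the classical Normal residual likelihood, using a toy linear-reservoir model, synthetic data, and plain random-walk Metropolis sampling in place of the Rio Grande model and DREAM.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate(k, rain):
          """Toy linear-reservoir runoff model with recession parameter k in (0, 1)."""
          q, store = np.zeros_like(rain), 0.0
          for t, p in enumerate(rain):
              store = store + p
              q[t] = k * store
              store -= q[t]
          return q

      rain = rng.gamma(2.0, 2.0, size=200)
      q_obs = simulate(0.3, rain) + rng.normal(0.0, 0.5, size=200)   # synthetic observations

      def log_post(k, sigma=0.5):
          if not 0.0 < k < 1.0:                       # uniform prior on (0, 1)
              return -np.inf
          res = q_obs - simulate(k, rain)
          return -0.5 * np.sum((res / sigma) ** 2)    # Normal likelihood, known sigma

      samples, k = [], 0.5
      lp = log_post(k)
      for _ in range(5000):
          k_new = k + rng.normal(0.0, 0.05)
          lp_new = log_post(k_new)
          if np.log(rng.uniform()) < lp_new - lp:     # Metropolis acceptance rule
              k, lp = k_new, lp_new
          samples.append(k)

      post = np.array(samples[1000:])                 # discard burn-in
      print(f"posterior mean k = {post.mean():.3f}, 95% interval = "
            f"[{np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f}]")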

  3. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
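
    The validation described above amounts to drawing samples from uncertainty distributions and propagating them through the code models. The sketch below (Python) illustrates the same Monte Carlo propagation idea with a textbook Gaussian plume expression and hypothetical parameter distributions; it is not the MACCS/COSYMA implementation.

      import numpy as np

      rng = np.random.default_rng(1)

      def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
          """Ground-reflected Gaussian plume concentration (g/m^3)."""
          lateral = np.exp(-y**2 / (2 * sigma_y**2))
          vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                      np.exp(-(z + H)**2 / (2 * sigma_z**2)))
          return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

      n = 10_000
      Q = 1.0                                            # source term, g/s
      u = rng.lognormal(np.log(4.0), 0.3, n)             # wind speed, m/s
      sigma_y = rng.lognormal(np.log(80.0), 0.4, n)      # lateral dispersion, m
      sigma_z = rng.lognormal(np.log(40.0), 0.5, n)      # vertical dispersion, m

      conc = gaussian_plume(Q, u, sigma_y, sigma_z, y=0.0, z=0.0, H=50.0)
      print("median:", np.median(conc), " 5th-95th:", np.quantile(conc, [0.05, 0.95]))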

  4. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    NASA Astrophysics Data System (ADS)

    Grunthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to Eurocode 8 result from the reassessment of the seismic hazard supported by the German Institution for Civil Engineering. This 2016 version of the hazard assessment for Germany as target area was based on a comprehensive incorporation of all accessible uncertainties in models and parameters into the approach and on the provision of a rational framework for treating these uncertainties in a transparent way. The developed seismic hazard model represents significant improvements; i.e. it is based on updated and extended databases, comprehensive ranges of models, robust methods and a selection of a set of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs as suggested by two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m/s for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years), in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps of spectral response accelerations for different spectral periods or of macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th percentile load parameters were provided together with the full calculation model to clearly illustrate the uncertainties of such a probabilistic assessment for a region of low-to-moderate seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimations) were analyzed and discussed.
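
    A minimal sketch of how mean and percentile load parameters are extracted from a weighted logic tree is given below (Python). The branch values and weights are hypothetical and far fewer than the 4040 end branches of the actual model.

      import numpy as np

      # Spectral acceleration (g) at one site and period from a handful of end branches
      values = np.array([0.08, 0.10, 0.11, 0.13, 0.16, 0.21])
      weights = np.array([0.10, 0.25, 0.20, 0.25, 0.15, 0.05])
      assert abs(weights.sum() - 1.0) < 1e-9

      def weighted_quantile(v, w, q):
          """Quantile q of the weighted empirical distribution of branch values."""
          order = np.argsort(v)
          v, w = v[order], w[order]
          cdf = np.cumsum(w)
          return np.interp(q, cdf, v)

      mean = np.sum(weights * values)
      print(f"mean = {mean:.3f} g")
      for q in (0.16, 0.50, 0.84):
          print(f"{int(q * 100)}th percentile = {weighted_quantile(values, weights, q):.3f} g")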

  5. Parameter and input data uncertainty estimation for the assessment of water resources in two sub-basins of the Limpopo River Basin

    NASA Astrophysics Data System (ADS)

    Oosthuizen, Nadia; Hughes, Denis A.; Kapangaziwiri, Evison; Mwenge Kahinda, Jean-Marc; Mvandaba, Vuyelwa

    2018-05-01

    The demand for water resources is rapidly growing, placing more strain on access to water and its management. In order to appropriately manage water resources, there is a need to accurately quantify available water resources. Unfortunately, the data required for such assessment are frequently far from sufficient in terms of availability and quality, especially in southern Africa. In this study, the uncertainty related to the estimation of water resources of two sub-basins of the Limpopo River Basin - the Mogalakwena in South Africa and the Shashe shared between Botswana and Zimbabwe - is assessed. Input data (and model parameters) are significant sources of uncertainty that should be quantified. In southern Africa water use data are among the most unreliable sources of model input data because available databases generally consist of only licensed information and actual use is generally unknown. The study assesses how these uncertainties impact the estimation of surface water resources of the sub-basins. Data on farm reservoirs and irrigated areas from various sources were collected and used to run the model. Many farm dams and large irrigation areas are located in the upper parts of the Mogalakwena sub-basin. Results indicate that water use uncertainty is small. Nevertheless, the medium to low flows are clearly impacted. The simulated mean monthly flows at the outlet of the Mogalakwena sub-basin were between 22.62 and 24.68 Mm3 per month when incorporating only the uncertainty related to the main physical runoff generating parameters. The range of total predictive uncertainty of the model increased to between 22.15 and 24.99 Mm3 when water use data such as small farm and large reservoirs and irrigation were included. For the Shashe sub-basin incorporating only uncertainty related to the main runoff parameters resulted in mean monthly flows between 11.66 and 14.54 Mm3. The range of predictive uncertainty changed to between 11.66 and 17.72 Mm3 after the uncertainty

  6. Safety Assessment of Dialkyl Sulfosuccinate Salts as Used in Cosmetics.

    PubMed

    Fiume, Monice M; Heldreth, Bart; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2016-11-01

    The Cosmetic Ingredient Review (CIR) Expert Panel (Panel) assessed the safety of 8 dialkyl sulfosuccinate salts for use in cosmetics, finding that these ingredients are safe in cosmetics in the present practices of use and concentration when formulated to be nonirritating. The dialkyl sulfosuccinate salts primarily function as surfactants in cosmetics. The Panel reviewed the new and existing available animal and clinical data in making its determination of safety. The Panel found it appropriate to extrapolate the data on diethylhexyl sodium sulfosuccinate to assess the safety of the entire group because all of the diesters are of a similar alkyl chain length, all are symmetrically substituted, and all have similar functions in cosmetic formulations. © The Author(s) 2016.

  7. Assessment of patient safety culture in private and public hospitals in Peru.

    PubMed

    Arrieta, Alejandro; Suárez, Gabriela; Hakim, Galed

    2018-04-01

    To assess the patient safety culture in Peruvian hospitals from the perspective of healthcare professionals, and to test for differences between the private and public healthcare sectors. Patient safety is defined as the avoidance and prevention of patient injuries or adverse events resulting from the processes of healthcare delivery. A non-random cross-sectional study conducted online. An online survey was administered from July to August 2016, in Peru. This study reports results from Lima and Callao, which are the capital and the port region of Peru. A total of 1679 healthcare professionals completed the survey. Participants were physicians, medical residents and nurses working in healthcare facilities from the private sector and public sector. Assessment of the degree of patient safety and 12 dimensions of patient safety culture in hospital units as perceived by healthcare professionals. Only 18% of healthcare professionals assess the degree of patient safety in their unit of work as excellent or very good. Significant differences are observed between the patient safety grades in the private sector (37%) compared to the public sub-sectors (13-15%). Moreover, in all patient safety culture dimensions, healthcare professionals from the private sector give more favorable responses for patient safety than those from the public sub-sectors. The most significant difference in support comes from patient safety administrators through communication and information about errors. Overall, the degree of patient safety in Peru is low, with significant gaps between the private and the public sectors.

  8. Assessing uncertainty in radar measurements on simplified meteorological scenarios

    NASA Astrophysics Data System (ADS)

    Molini, L.; Parodi, A.; Rebora, N.; Siccardi, F.

    2006-02-01

    A three-dimensional radar simulator model (RSM) developed by Haase (1998) is coupled with the nonhydrostatic mesoscale weather forecast model Lokal-Modell (LM). The radar simulator is able to model reflectivity measurements by using the following meteorological fields, generated by Lokal Modell, as inputs: temperature, pressure, water vapour content, cloud water content, cloud ice content, rain sedimentation flux and snow sedimentation flux. This work focuses on the assessment of some uncertainty sources associated with radar measurements: absorption by the atmospheric gases, e.g., molecular oxygen, water vapour, and nitrogen; attenuation due to the presence of a highly reflecting structure between the radar and a "target structure". RSM results for a simplified meteorological scenario, consisting of a humid updraft on a flat surface and four cells placed around it, are presented.

  9. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes, beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued, with the events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed, including the use of rapid response teams, selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area and the process of demolition. Through the post-event safety assessment program that occurred throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have potential to damage structures.

  10. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  11. An assessment of uncertainty in forest carbon budget projections

    Treesearch

    Linda S. Heath; James E. Smith

    2000-01-01

    Estimates of uncertainty are presented for projections of forest carbon inventory and average annual net carbon flux on private timberland in the US using the model FORCARB. Uncertainty in carbon inventory was approximately ±9% (2000 million metric tons) of the estimated median in the year 2000, rising to 11% (2800 million metric tons) in projection year 2040...

  12. Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.

    PubMed

    Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E

    2017-05-01

    Within the presented study, soil samples were collected in 2007 at 20 different locations of the Greek terrain, both from the surface and from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after 1986. At one location of relatively higher deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or paired core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate the 137Cs inventory and the corresponding depth migration, twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment process was used to apply the law of error propagation and demonstrate that the dominant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137Cs inventory. A secondary, yet also significant, component was identified to be the activity measurement process itself. Other less significant uncertainty parameters were the sampling methods, the variation of the soil field density with depth and the preparation of samples for measurement. The sampling grid experiment allowed for the quantitative evaluation of the uncertainty due to spatial variability, with the assistance of semivariance analysis. A denser, optimized grid could return more accurate values for this component, but at a significantly elevated laboratory cost in terms of both human and material resources. Using the hereby collected data and for the case of a single core soil sampling using a well-defined sampling methodology quality assurance, the uncertainty
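
    To illustrate the semivariance analysis used to quantify the spatial variability component, the sketch below (Python) computes an empirical semivariogram from synthetic core-sample inventories on a regular grid loosely patterned on the 60 m by 60 m, 20 m spacing design described above.

      import numpy as np

      rng = np.random.default_rng(7)

      # 4 x 4 grid of sampling nodes, 20 m apart, with synthetic 137Cs inventories (kBq/m^2)
      xs, ys = np.meshgrid(np.arange(0, 80, 20), np.arange(0, 80, 20))
      coords = np.column_stack([xs.ravel(), ys.ravel()])
      inventory = 20.0 + rng.normal(0.0, 3.0, size=len(coords))

      def empirical_semivariogram(pts, z, lag_edges):
          """Mean semivariance of all sample pairs falling in each lag-distance bin."""
          gamma, counts = np.zeros(len(lag_edges) - 1), np.zeros(len(lag_edges) - 1)
          for i in range(len(z)):
              for j in range(i + 1, len(z)):
                  h = np.linalg.norm(pts[i] - pts[j])
                  k = np.searchsorted(lag_edges, h) - 1
                  if 0 <= k < len(gamma):
                      gamma[k] += 0.5 * (z[i] - z[j]) ** 2
                      counts[k] += 1
          return np.divide(gamma, counts, out=np.full_like(gamma, np.nan), where=counts > 0)

      lag_edges = np.array([0.0, 25.0, 45.0, 65.0, 90.0])
      print(empirical_semivariogram(coords, inventory, lag_edges))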

  13. Method of operator safety assessment for underground mobile mining equipment

    NASA Astrophysics Data System (ADS)

    Działak, Paulina; Karliński, Jacek; Rusiński, Eugeniusz

    2018-01-01

    The paper presents a method of assessing the safety of operators of mobile mining equipment (MME) that is adapted to current and future geological and mining conditions. The authors focused on underground mines, with special consideration of copper mines (KGHM). As extraction reaches into deeper layers of the deposit, it can activate natural hazards that have so far been considered unusual and whose range and intensity differ depending on the field of operation. One of the main hazards that affects work safety, and can become the main barrier to exploiting deposits at greater depths, is the climatic hazard. The authors analysed the phenomena that may affect the safety of MME operators, including accidents that have not yet been studied and are not covered by the current safety standards for this group of miners. An attempt was made to develop a method for assessing the safety of MME operators that takes the mentioned natural hazards into account and is adapted to current and future environmental conditions in underground mines.

  14. Safety assessment of discharge chute isolation barrier preparation and installation. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meichle, R.H.

    1994-10-10

    This revision responds to RL comments and expands the discussion of the "effective hazard categorization" and the readiness review basis. The safety assessment covers the activities for the preparation and installation of the discharge chute isolation barriers. It includes a hazard assessment and a comparison of potential accidents/events to those addressed by the current safety basis documentation. No significant hazards were identified. An evaluation against the USQ evaluation questions was made, and it was determined that the activities do not represent a USQ. Hazard categorization techniques were used to provide a basis for readiness review classification.

  15. Accepting uncertainty, assessing risk: decision quality in managing wildfire, forest resource values, and new technology

    Treesearch

    Jeffrey G. Borchers

    2005-01-01

    The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...

  16. Ultraviolet safety assessments of insect light traps

    PubMed Central

    Sliney, David H.; Gilbert, David W.; Lyon, Terry

    2016-01-01

    Near-ultraviolet (UV-A: 315–400 nm), “black-light,” electric lamps were invented in 1935 and ultraviolet insect light traps (ILTs) were introduced for use in agriculture around that time. Today ILTs are used indoors in several industries and in food-service as well as in outdoor settings. With recent interest in photobiological lamp safety, safety standards are being developed to test for potentially hazardous ultraviolet emissions. A variety of UV “Black-light” ILTs were measured at a range of distances to assess potential exposures. Realistic time-weighted human exposures are shown to be well below current guidelines for human exposure to ultraviolet radiation. These UV-A exposures would be far less than the typical UV-A exposure in the outdoor environment. Proposals are made for realistic ultraviolet safety standards for ILT products. PMID:27043058

  17. Uncertainties in climate assessment for the case of aviation NOx

    PubMed Central

    Holmes, Christopher D.; Tang, Qi; Prather, Michael J.

    2011-01-01

    Nitrogen oxides emitted from aircraft engines alter the chemistry of the atmosphere, perturbing the greenhouse gases methane (CH4) and ozone (O3). We quantify uncertainties in radiative forcing (RF) due to short-lived increases in O3, long-lived decreases in CH4 and O3, and their net effect, using the ensemble of published models and a factor decomposition of each forcing. The decomposition captures major features of the ensemble, and also shows which processes drive the total uncertainty in several climate metrics. Aviation-specific factors drive most of the uncertainty for the short-lived O3 and long-lived CH4 RFs, but a nonaviation factor dominates for long-lived O3. The model ensemble shows strong anticorrelation between the short-lived and long-lived RF perturbations (R2 = 0.87). Uncertainty in the net RF is highly sensitive to this correlation. We reproduce the correlation and ensemble spread in one model, showing that processes controlling the background tropospheric abundance of nitrogen oxides are likely responsible for the modeling uncertainty in climate impacts from aviation. PMID:21690364

  18. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    PubMed

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties such as frequent drought, varied precipitation, multiple water resources, and different water demands brings new challenges to water transfer projects. Uncertainties exist for both transferred water and local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the degree of shortage under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation with a chance-constrained programming stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferred water and local surface water and to sample from the multivariate probability distribution; these samples are used as inputs for the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk. The possible available water and shortages can be calculated by applying the UWSRAM, together with the corresponding allocation measures under different water-availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water supply and degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers, and achieving sustainable development.
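    The copula-based Monte Carlo step described above can be sketched as follows: correlated uniform variates are drawn from a Gaussian copula and transformed through assumed marginal distributions for transferred and local water. The correlation, marginals, and demand figure are placeholders for illustration, not values fitted to the Lunan data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Assumed correlation between transferred water and local surface water.
    rho = 0.6
    cov = np.array([[1.0, rho], [rho, 1.0]])

    # Gaussian copula sampling: correlated normals -> uniforms -> marginals.
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
    u = stats.norm.cdf(z)

    # Illustrative marginal distributions (million m^3), not fitted to real data.
    transfer_water = stats.gamma(a=4.0, scale=50.0).ppf(u[:, 0])
    local_water = stats.lognorm(s=0.4, scale=120.0).ppf(u[:, 1])

    demand = 300.0  # assumed total demand (million m^3)
    shortage = np.maximum(demand - (transfer_water + local_water), 0.0)
    print("P(shortage) =", np.mean(shortage > 0))
    print("Mean shortage =", shortage.mean(), "million m^3")
    ```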

  19. Validity of instruments to assess students' travel and pedestrian safety.

    PubMed

    Mendoza, Jason A; Watson, Kathy; Baranowski, Tom; Nicklas, Theresa A; Uscanga, Doris K; Hanfling, Marcus J

    2010-05-18

    Safe Routes to School (SRTS) programs are designed to make walking and bicycling to school safe and accessible for children. Despite their growing popularity, few validated measures exist for assessing important outcomes such as type of student transport or pedestrian safety behaviors. This research validated the SRTS school travel survey and a pedestrian safety behavior checklist. Fourth grade students completed a brief written survey on how they got to school that day with set responses. Test-retest reliability was obtained 3-4 hours apart. Convergent validity of the SRTS travel survey was assessed by comparison to parents' report. For the measure of pedestrian safety behavior, 10 research assistants observed 29 students at a school intersection for completion of 8 selected pedestrian safety behaviors. Reliability was determined in two ways: correlations between the research assistants' ratings and those of the Principal Investigator (PI) and intraclass correlations (ICC) across research assistant ratings. The SRTS travel survey had high test-retest reliability (kappa = 0.97, n = 96, p < 0.001) and convergent validity (kappa = 0.87, n = 81, p < 0.001). The pedestrian safety behavior checklist had moderate reliability across research assistants' ratings (ICC = 0.48) and moderate correlation with the PI (r = 0.55, p < 0.01). When two raters simultaneously used the instrument, the ICC increased to 0.65. Overall percent agreement (91%), sensitivity (85%) and specificity (83%) were acceptable. These validated instruments can be used to assess SRTS programs. The pedestrian safety behavior checklist may benefit from further formative work.
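    The reliability statistics reported here (kappa for test-retest agreement, ICC across raters) are standard agreement measures. A minimal sketch of the kappa calculation on invented travel-mode responses is shown below; the data and coding scheme are hypothetical.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical travel-mode codes (0=walk, 1=bike, 2=car, 3=bus) reported by
    # the same students in the morning survey and the 3-4 hour re-test.
    first_survey  = [0, 1, 2, 2, 3, 0, 1, 2, 0, 3]
    second_survey = [0, 1, 2, 2, 3, 0, 1, 1, 0, 3]

    kappa = cohen_kappa_score(first_survey, second_survey)
    print(f"Test-retest kappa: {kappa:.2f}")
    ```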

  20. Validity of instruments to assess students' travel and pedestrian safety

    PubMed Central

    2010-01-01

    Background Safe Routes to School (SRTS) programs are designed to make walking and bicycling to school safe and accessible for children. Despite their growing popularity, few validated measures exist for assessing important outcomes such as type of student transport or pedestrian safety behaviors. This research validated the SRTS school travel survey and a pedestrian safety behavior checklist. Methods Fourth grade students completed a brief written survey on how they got to school that day with set responses. Test-retest reliability was obtained 3-4 hours apart. Convergent validity of the SRTS travel survey was assessed by comparison to parents' report. For the measure of pedestrian safety behavior, 10 research assistants observed 29 students at a school intersection for completion of 8 selected pedestrian safety behaviors. Reliability was determined in two ways: correlations between the research assistants' ratings and those of the Principal Investigator (PI) and intraclass correlations (ICC) across research assistant ratings. Results The SRTS travel survey had high test-retest reliability (κ = 0.97, n = 96, p < 0.001) and convergent validity (κ = 0.87, n = 81, p < 0.001). The pedestrian safety behavior checklist had moderate reliability across research assistants' ratings (ICC = 0.48) and moderate correlation with the PI (r = 0.55, p < 0.01). When two raters simultaneously used the instrument, the ICC increased to 0.65. Overall percent agreement (91%), sensitivity (85%) and specificity (83%) were acceptable. Conclusions These validated instruments can be used to assess SRTS programs. The pedestrian safety behavior checklist may benefit from further formative work. PMID:20482778

  1. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  2. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    USDA-ARS?s Scientific Manuscript database

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  3. Adverse Outcome Pathways can drive non-animal approaches for safety assessment

    PubMed Central

    Burden, Natalie; Sewell, Fiona; Andersen, Melvin E; Boobis, Alan; Chipman, J Kevin; Cronin, Mark T D; Hutchinson, Thomas H; Kimber, Ian; Whelan, Maurice

    2015-01-01

    Adverse Outcome Pathways (AOPs) provide an opportunity to develop new and more accurate safety assessment processes for drugs and other chemicals, and may ultimately play an important role in regulatory decision making. Not only can the development and application of AOPs pave the way for the development of improved evidence-based approaches for hazard and risk assessment, there is also the promise of a significant impact on animal welfare, with a reduced reliance on animal-based methods. The establishment of a useable and coherent knowledge framework under which AOPs will be developed and applied has been a first critical step towards realizing this opportunity. This article explores how the development of AOPs under this framework, and their application in practice, could benefit the science and practice of safety assessment, while in parallel stimulating a move away from traditional methods towards an increased acceptance of non-animal approaches. We discuss here the key areas where current and future initiatives should be focused to enable the translation of AOPs into routine chemical safety assessment, and lasting 3Rs benefits. © 2015 The Authors. Journal of Applied Toxicology published by John Wiley & Sons Ltd. PMID:25943792

  4. Including non-dietary sources into an exposure assessment of the European Food Safety Authority: The challenge of multi-sector chemicals such as Bisphenol A.

    PubMed

    von Goetz, N; Pirow, R; Hart, A; Bradley, E; Poças, F; Arcella, D; Lillegard, I T L; Simoneau, C; van Engelen, J; Husoy, T; Theobald, A; Leclercq, C

    2017-04-01

    In its most recent risk assessment of Bisphenol A, the European Food Safety Authority conducted a multi-route aggregate exposure assessment for the first time. This assessment includes exposure via dietary sources as well as contributions from the most important non-dietary sources. Both average and high aggregate exposures were calculated by source-to-dose modeling (forward calculation) for different age groups and compared with estimates based on urinary biomonitoring data (backward calculation). The aggregate exposure estimates obtained by forward and backward modeling are of the same order of magnitude, with forward modeling yielding higher estimates associated with larger uncertainty. Yet only forward modeling can indicate the relative contribution of different sources. Dietary exposure, especially via canned food, appears to be the most important exposure source and, based on the central aggregate exposure estimates, contributes around 90% to internal exposure to total (conjugated plus unconjugated) BPA. Dermal exposure via thermal paper and, to a lesser extent, via cosmetic products may contribute around 10% for some age groups. The uncertainty around these estimates is considerable, but since first-pass metabolism of BPA by conjugation is lacking after dermal absorption, dermal sources may be of equal or even higher toxicological relevance than dietary sources. Copyright © 2017 Elsevier Inc. All rights reserved.
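    A minimal sketch of the forward (source-to-dose) aggregation is given below. All numbers are placeholders chosen only to mirror the qualitative breakdown in the abstract (roughly 90% dietary, 10% dermal); they are not EFSA estimates.

    ```python
    # Hypothetical external exposures (microgram/day) and absorbed fractions.
    body_weight_kg = 60.0

    external_ug_per_day = {
        "canned_food": 5.4,
        "other_diet": 0.6,
        "thermal_paper_dermal": 6.0,
        "cosmetics_dermal": 1.0,
    }
    absorbed_fraction = {
        "canned_food": 1.0,
        "other_diet": 1.0,
        "thermal_paper_dermal": 0.1,   # assumed dermal absorption
        "cosmetics_dermal": 0.1,
    }

    internal = {s: external_ug_per_day[s] * absorbed_fraction[s]
                for s in external_ug_per_day}
    total = sum(internal.values())

    for source, dose in internal.items():
        print(f"{source:>22}: {dose / body_weight_kg * 1000:6.1f} ng/kg bw/day "
              f"({dose / total:.0%} of internal exposure)")
    ```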

  5. Revealing and Resolving Patient Safety Defects: The Impact of Leadership WalkRounds on Frontline Caregiver Assessments of Patient Safety

    PubMed Central

    Frankel, Allan; Grillo, Sarah Pratt; Pittman, Mary; Thomas, Eric J; Horowitz, Lisa; Page, Martha; Sexton, Bryan

    2008-01-01

    Objective To evaluate the impact of rigorous WalkRounds on frontline caregiver assessments of safety climate, and to clarify the steps and implementation of rigorous WalkRounds. Data Sources/Study Setting Primary outcome variables were baseline and post WalkRounds safety climate scores from the Safety Attitudes Questionnaire (SAQ). Secondary outcomes were safety issues elicited through WalkRounds. The study period was August 2002 to April 2005; seven hospitals in Massachusetts agreed to participate; and the project was implemented in all patient care areas. Study Design Prospective study of the impact of rigorously applied WalkRounds on frontline caregivers' assessments of safety climate in their patient care area. WalkRounds were conducted weekly and according to the seven-step WalkRounds Guide. The SAQ was administered at baseline and approximately 18 months post-WalkRounds implementation to all caregivers in patient care areas. Results Two of seven hospitals complied with the rigorous WalkRounds approach; hospital A was an academic teaching center and hospital B a community teaching hospital. Of 21 patient care areas, SAQ surveys were received from 62 percent of respondents at baseline and 60 percent post WalkRounds. At baseline, 10 of 21 care areas (48 percent) had safety climate scores below 60 percent, whereas post-WalkRounds three care areas (14 percent) had safety climate scores below 60 percent without improving by 10 points or more. Safety climate scale scores in hospital A were 62 percent at baseline and 77 percent post-WalkRounds (t=2.67, p=.03), and in hospital B were 46 percent at baseline and 56 percent post WalkRounds (t=2.06, p=.06). Main safety issues by category were equipment/facility (A [26 percent] and B [33 percent]) and communication (A [24 percent] and B [18 percent]). Conclusions WalkRounds implementation requires significant organizational will; sustainability requires outstanding project management and leadership engagement. In the patient...

  6. Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies

    PubMed Central

    Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon

    2012-01-01

    Objectives Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered by well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools, and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of consideration of the relative weights of different domains and elements. The development of a quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600

  7. Application of the SCALE TSUNAMI Tools for the Validation of Criticality Safety Calculations Involving 233U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T; Hollenbach, Daniel F

    2009-02-01

    The Radiochemical Development Facility at Oak Ridge National Laboratory has been storing solid materials containing 233U for decades. Preparations are under way to process these materials into a form that is inherently safe from a nuclear criticality safety perspective. This will be accomplished by down-blending the 233U materials with depleted or natural uranium. At the request of the U.S. Department of Energy, a study has been performed using the SCALE sensitivity and uncertainty analysis tools to demonstrate how these tools could be used to validate nuclear criticality safety calculations of selected process and storage configurations. ISOTEK nuclear criticality safety staff provided four models that are representative of the criticality safety calculations for which validation will be needed. The SCALE TSUNAMI-1D and TSUNAMI-3D sequences were used to generate energy-dependent keff sensitivity profiles for each nuclide and reaction present in the four safety analysis models, also referred to as the applications, and in a large set of critical experiments. The SCALE TSUNAMI-IP module was used together with the sensitivity profiles and the cross-section uncertainty data contained in the SCALE covariance data files to propagate the cross-section uncertainties (Δσ/σ) to keff uncertainties (Δk/k) for each application model. The SCALE TSUNAMI-IP module was also used to evaluate the similarity of each of the 672 critical experiments with each application. Results of the uncertainty analysis and similarity assessment are presented in this report. A total of 142 experiments were judged to be similar to application 1, and 68 experiments were judged to be similar to application 2. None of the 672 experiments were judged to be adequately similar to applications 3 and 4. Discussion of the uncertainty analysis and similarity assessment is provided for each of the four applications. Example upper subcritical limits (USLs) were...
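    The propagation of cross-section covariance data to a keff uncertainty through sensitivity profiles is commonly written as the "sandwich rule", (Δk/k)² = S C Sᵀ. The sketch below illustrates that arithmetic with a small illustrative sensitivity vector and covariance matrix; these numbers are not SCALE/TSUNAMI data.

    ```python
    import numpy as np

    # Illustrative sensitivity profile S (dk/k per dσ/σ) for three
    # nuclide-reaction pairs, and their relative covariance matrix C.
    S = np.array([0.12, -0.05, 0.30])
    C = np.array([[0.0004, 0.0001, 0.0000],
                  [0.0001, 0.0009, 0.0002],
                  [0.0000, 0.0002, 0.0016]])

    # Sandwich rule: relative variance of keff = S C S^T
    var_k = S @ C @ S.T
    print(f"Delta-k/k ~= {np.sqrt(var_k):.4f}")
    ```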

  8. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by...

  9. A 2-year study of patient safety competency assessment in 29 clinical laboratories.

    PubMed

    Reed, Robyn C; Kim, Sara; Farquharson, Kara; Astion, Michael L

    2008-06-01

    Competency assessment is critical for laboratory operations and is mandated by the Clinical Laboratory Improvement Amendments of 1988. However, no previous reports describe methods for assessing competency in patient safety. We developed and implemented a Web-based tool to assess performance of 875 laboratory staff from 29 laboratories in patient safety. Question categories included workplace culture, categorizing error, prioritization of patient safety interventions, strength of specific interventions, and general patient safety concepts. The mean score was 85.0%, with individual scores ranging from 56% to 100% and scores by category from 81.3% to 88.6%. Of the most difficult questions (<72% correct), 6 were about intervention strength, 3 about categorizing error, 1 about workplace culture, and 1 about prioritization of interventions. Of the 13 questions about intervention strength, 6 (46%) were in the lowest quartile, suggesting that this may be a difficult topic for laboratory technologists. Computer-based competency assessments help laboratories identify topics for continuing education in patient safety.

  10. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
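    The Monte Carlo propagation used here can be sketched with a toy exposure model: uncertain inputs are perturbed repeatedly and the resulting rank correlation with measurements is recorded. The model, input values, and uncertainty magnitudes below are invented for illustration; this is not the study's RF-EMF propagation model.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)

    def toy_exposure_model(power, distance, damping):
        """Crude far-field proxy: exposure ~ power / distance^2, attenuated
        by building damping. Not the actual RF-EMF model."""
        return power / distance**2 * np.exp(-damping)

    measured = np.array([0.8, 0.3, 1.2, 0.5, 0.9])       # hypothetical measurements
    power    = np.array([40.0, 20.0, 60.0, 25.0, 45.0])  # nominal inputs
    distance = np.array([120.0, 200.0, 90.0, 160.0, 110.0])
    damping  = np.array([0.5, 1.0, 0.2, 0.8, 0.4])

    rhos = []
    for _ in range(2000):
        # Perturb the uncertain inputs (assumed uncertainty magnitudes).
        d = distance * rng.normal(1.0, 0.10, size=distance.size)
        a = damping + rng.normal(0.0, 0.30, size=damping.size)
        rhos.append(spearmanr(toy_exposure_model(power, d, a), measured).correlation)

    print("Median Spearman rho under input uncertainty:", np.median(rhos))
    ```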

  11. Safety assessment for In-service Pressure Bending Pipe Containing Incomplete Penetration Defects

    NASA Astrophysics Data System (ADS)

    Wang, M.; Tang, P.; Xia, J. F.; Ling, Z. W.; Cai, G. Y.

    2017-12-01

    Incomplete penetration is a common defect in the welded joints of pressure pipes. However, the safety classification of pressure pipes containing incomplete penetration defects under the current periodic inspection regulations is rather conservative. To reduce unnecessary repairs of incomplete penetration defects, a scientific and applicable safety assessment method for pressure pipes is needed. In this paper, a stress analysis model of the pipe system was established for an in-service pressure bending pipe containing incomplete penetration defects. A local finite element model was set up to analyze the stress distribution at the defect location and to perform stress linearization. The applicability of two assessment methods, the simplified assessment and the U-factor assessment method, to incomplete penetration defects in pressure bending pipes was then analyzed. The results can provide technical support for the safety assessment of complex pipelines in the future.

  12. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Analyzing system safety in lithium-ion grid energy storage

    DOE PAGES

    Rosewater, David; Williams, Adam

    2015-10-08

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. Lastly, we conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  14. Analyzing system safety in lithium-ion grid energy storage

    NASA Astrophysics Data System (ADS)

    Rosewater, David; Williams, Adam

    2015-12-01

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. We conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  15. Current issues and perspectives in food safety and risk assessment.

    PubMed

    Eisenbrand, G

    2015-12-01

    In this review, current issues and opportunities in food safety assessment are discussed. Food safety is considered an essential element inherent in global food security. Hazard characterization is pivotal within the continuum of risk assessment, but it may be conceived only within a very limited frame as a true alternative to risk assessment. Elucidation of the mode of action underlying a given hazard is vital to create a plausible basis for human toxicology evaluation. Risk assessment, to convey meaningful risk communication, must be based on appropriate and reliable consideration of both exposure and mode of action. New perspectives, provided by monitoring human exogenous and endogenous exposure biomarkers, are considered of great promise to support classical risk extrapolation from animal toxicology. © The Author(s) 2015.

  16. Safety analysis, risk assessment, and risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamali, K.; Stack, D.W.; Sullivan, L.H.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.
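    The question of comparing PRA results against numerical risk acceptance criteria can be illustrated with a simple frequency-versus-criterion check. The criteria and accident sequences below are hypothetical placeholders, not DOE or NRC values.

    ```python
    # Hypothetical maximum tolerable frequency (per year) by consequence category.
    RAC = {"low": 1e-2, "moderate": 1e-4, "high": 1e-6}

    # Hypothetical accident sequences quantified by a risk assessment.
    sequences = [
        ("loss of ventilation", "low", 3e-3),
        ("spray release", "moderate", 2e-4),
        ("criticality event", "high", 4e-7),
    ]

    for name, category, frequency in sequences:
        verdict = "acceptable" if frequency <= RAC[category] else "exceeds RAC"
        print(f"{name:>20}: {frequency:.1e}/yr vs {RAC[category]:.1e}/yr -> {verdict}")
    ```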

  17. Assessing the near-term risk of climate uncertainty : interdependencies among the U.S. states.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.

    2010-04-01

    Policy makers will most likely need to make decisions about climate policy before climate scientists have resolved all relevant uncertainties about the impacts of climate change. This study demonstrates a risk-assessment methodology for evaluating uncertain future climatic conditions. We estimate the impacts of climate change on U.S. state- and national-level economic activity from 2010 to 2050. To understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions to mitigate the course of climate change, we focus on precipitation, one of the most uncertain aspects of future climate change. We use results of the climate-model ensemble from the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) as a proxy for representing climate uncertainty over the next 40 years, map the simulated weather from the climate models hydrologically to the county level to determine the physical consequences on economic activity at the state level, and perform a detailed 70-industry analysis of economic impacts among the interacting lower-48 states. We determine the industry-level contribution to the gross domestic product and employment impacts at the state level, as well as interstate population migration, effects on personal income, and consequences for the U.S. trade balance. We show that the mean or average risk of damage to the U.S. economy from climate change, at the national level, is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs.

  18. Risk assessment principle for engineered nanotechnology in food and drug.

    PubMed

    Hwang, Myungsil; Lee, Eun Ji; Kweon, Se Young; Park, Mi Sun; Jeong, Ji Yoon; Um, Jun Ho; Kim, Sun Ah; Han, Bum Suk; Lee, Kwang Ho; Yoon, Hae Jung

    2012-06-01

    While the ability to develop nanomaterials and incorporate them into products is advancing rapidly worldwide, understanding of the potential health and safety effects of nanomaterials has proceeded at a much slower pace. In 2008, the Korea Food and Drug Administration (KFDA) began an investigation to prepare a "Strategic Action Plan" for evaluating safety and managing nano risks associated with foods, drugs, medical devices and cosmetics that use nano-scale materials. Although there are some studies on the potential risks of nanomaterials, the physical-chemical characterization of nanomaterials is not yet clear, and these studies do not offer enough information due to their limitations. These uncertainties make it impossible to determine whether nanomaterials are actually hazardous to humans, and consequently it is currently difficult to conduct human exposure risk assessments. At the same time, uncertainty about safety may lead to polarized public debate and to businesses' unwillingness to invest further in nanotechnology. Therefore, criteria and methods to assess possible adverse effects of nanomaterials have been actively considered by many international organizations: the World Health Organization, the Organisation for Economic Co-operation and Development and the European Commission. The objective of this study was to develop risk assessment principles for the safety management of future nanoproducts and to identify areas of research to strengthen risk assessment for nanomaterials. The research roadmaps proposed in this study will help fill the current gaps in knowledge relevant to nano risk assessment.

  19. Using driving simulators to assess driving safety.

    PubMed

    Boyle, Linda Ng; Lee, John D

    2010-05-01

    Changes in drivers, vehicles, and roadways pose substantial challenges to the transportation safety community. Crash records and naturalistic driving data are useful for examining the influence of past or existing technology on drivers, and the associations between risk factors and crashes. However, they are limited because causation cannot be established and technology not yet installed in production vehicles cannot be assessed. Driving simulators have become an increasingly widespread tool to understand evolving and novel technologies. The ability to manipulate independent variables in a randomized, controlled setting also provides the added benefit of identifying causal links. This paper introduces a special issue on simulator-based safety studies. The special issue comprises 25 papers that demonstrate the use of driving simulators to address pressing transportation safety problems and includes topics as diverse as neurological dysfunction, work zone design, and driver distraction. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  20. Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness

    NASA Astrophysics Data System (ADS)

    Irias, X.; Cicala, D.

    2013-12-01

    Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is...
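    The info-gap idea sketched above (seek the largest uncertainty horizon for which a design still meets its performance requirement) can be written compactly. The fractional-error uncertainty model, capacities, and margin below are toy assumptions for illustration, not EBMUD's analysis.

    ```python
    import numpy as np

    def robustness(capacity, nominal_demand, required_margin, h_max=5.0, steps=501):
        """Info-gap robustness: the largest horizon h for which the design meets
        the requirement for every demand within the (toy) fractional-error model
        demand <= nominal_demand * (1 + h)."""
        h_hat = 0.0
        for h in np.linspace(0.0, h_max, steps):
            worst_demand = nominal_demand * (1.0 + h)
            if capacity - worst_demand < required_margin:
                break
            h_hat = h
        return h_hat

    # Hypothetical alternatives: larger capacity costs more but is more robust.
    for name, capacity in [("baseline supply", 1.2), ("seismic-upgraded supply", 1.8)]:
        print(name, "robustness =", round(robustness(capacity, 1.0, 0.1), 2))
    ```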

  1. INTERPRETING SPONTANEOUS RENAL LESIONS IN SAFETY AND RISK ASSESSMENT

    EPA Science Inventory

    Interpreting Spontaneous Renal Lesions in Safety and Risk Assessment
    Douglas C. Wolf, D.V.M., Ph.D.


    Risk assessment is a process whereby the potential adverse health effects from exposure to a xenobiotic are predicted after evaluation of the availab...

  2. Dynamic rating curve assessment for hydrometric stations and computation of the associated uncertainties: Quality and station management indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan

    2014-09-01

    A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements and the constant changes in the river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of gaugings and the increase of the uncertainty of discharge estimates with the age of the rating curve computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 [km2]) is used as an example throughout the paper. Other stations are used to illustrate certain points.
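    A rating curve of the usual power-law form Q = a (h - h0)^b can be fitted to gaugings with a standard least-squares routine, and the parameter covariance gives a first, static view of the curve's uncertainty (the paper's dynamic method additionally ages this uncertainty over time). The gaugings below are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rating_curve(h, a, h0, b):
        """Power-law stage-discharge relation Q = a * (h - h0)^b."""
        return a * np.clip(h - h0, 1e-6, None) ** b

    # Hypothetical gaugings: stage h [m] and measured discharge Q [m3/s].
    h = np.array([0.42, 0.55, 0.71, 0.90, 1.12, 1.35, 1.60])
    q = np.array([1.8, 3.5, 6.4, 11.0, 18.5, 28.0, 41.0])

    popt, pcov = curve_fit(rating_curve, h, q, p0=[20.0, 0.2, 1.8])
    perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
    print("a, h0, b =", popt, "+/-", perr)
    print("Q(1.0 m) =", rating_curve(1.0, *popt), "m3/s")
    ```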

  3. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  4. Application of Non-Deterministic Methods to Assess Modeling Uncertainties for Reinforced Carbon-Carbon Debris Impacts

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Fasanella, Edwin L.; Melis, Matthew; Carney, Kelly; Gabrys, Jonathan

    2004-01-01

    The Space Shuttle Columbia Accident Investigation Board (CAIB) made several recommendations for improving the NASA Space Shuttle Program. An extensive experimental and analytical program has been developed to address two recommendations related to structural impact analysis. The objective of the present work is to demonstrate the application of probabilistic analysis to assess the effect of uncertainties on debris impacts on Space Shuttle Reinforced Carbon-Carbon (RCC) panels. The probabilistic analysis is used to identify the material modeling parameters controlling the uncertainty. A comparison of the finite element results with limited experimental data provided confidence that the simulations were adequately representing the global response of the material. Five input parameters were identified as significantly controlling the response.

  5. Surrogate Safety Assessment Model (SSAM)--software user manual

    DOT National Transportation Integrated Search

    2008-05-01

    This document presents guidelines for the installation and use of the Surrogate Safety Assessment Model (SSAM) software. For more information regarding the SSAM application, including discussion of theoretical background and the results of a series o...

  6. Geographical scenario uncertainty in generic fate and exposure factors of toxic pollutants for life-cycle impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huijbregts, Mark A.J.; Lundi, Sven; McKone, Thomas E.

    In environmental life-cycle assessments (LCA), fate and exposure factors account for the general fate and exposure properties of chemicals under generic environmental conditions by means of 'evaluative' multi-media fate and exposure box models. To assess the effect of using different generic environmental conditions, fate and exposure factors of chemicals emitted under typical conditions of (1) Western Europe, (2) Australia and (3) the United States of America were compared with the multi-media fate and exposure box model USES-LCA. Comparing the results of the three evaluative environments, it was found that the uncertainty in fate and exposure factors for ecosystems and humans due to the choice of an evaluative environment, as represented by the ratio of the 97.5th and 50th percentile, is between a factor 2 and 10. Particularly, fate and exposure factors of emissions causing effects in fresh water ecosystems and effects on human health have relatively high uncertainty. This uncertainty is mainly caused by the continental difference in the average soil erosion rate, the dimensions of the fresh water and agricultural soil compartment, and the fraction of drinking water coming from ground water.

  7. A Framework for Assessment of Aviation Safety Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  8. AGRICULTURAL CHEMICAL SAFETY ASSESSMENT: A MULTISECTOR APPROACH TO THE MODERNIZATION OF HUMAN SAFETY REQUIREMENTS.

    EPA Science Inventory

    Better understanding of toxicological mechanisms, enhanced testing capabilities, and demands for more sophisticated data for safety and health risk assessment have generated international interest in improving the current testing paradigm for agricultural chemicals. To address th...

  9. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European-funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  10. Safety Assessment of Alkyl Ethylhexanoates as Used in Cosmetics.

    PubMed

    Fiume, Monice; Heldreth, Bart; Bergfeld, Wilma F; Belsito, Donald V; Hill, Ronald A; Klaassen, Curtis D; Liebler, Daniel C; Marks, James G; Shank, Ronald C; Slaga, Thomas J; Snyder, Paul W; Andersen, F Alan

    2015-01-01

    The Cosmetic Ingredient Review (CIR) Expert Panel (Panel) assessed the safety of 16 alkyl ethylhexanoates for use in cosmetics, concluding that these ingredients are safe in cosmetic formulations in the present practices of use and concentrations when formulated to be nonirritating. The alkyl ethylhexanoates primarily function as skin-conditioning agents in cosmetics. The highest concentration of use reported for any of the alkyl ethylhexanoates is 77.3% cetyl ethylhexanoate in rinse-off formulations used near the eye, and the highest leave-on use reported is 52% cetyl ethylhexanoate in lipstick formulations. The Panel reviewed available animal and clinical data related to these ingredients, and the similarities in structure, properties, functions, and uses of ingredients from previous CIR assessments on constituent alcohols that allowed for extrapolation of the available toxicological data to assess the safety of the entire group. © The Author(s) 2015.

  11. Climate uncertainty and implications for U.S. state-level risk assessment through 2050.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.

    2009-10-01

    Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best-estimate of climate change impacts may not be so important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change - precipitation - to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance-probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the range of multiple tens of billions of dollars, with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry, analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effect on personal income, and the consequences for the U.S. trade balance.

  12. Eating nanomaterials: cruelty-free and safe? the EFSA guidance on risk assessment of nanomaterials in food and feed.

    PubMed

    Sauer, Ursula G

    2011-12-01

    Nanomaterials are increasingly being added to food handling and packaging materials, or directly, to human food and animal feed. To ensure the safety of such engineered nanomaterials (ENMs), in May 2011, the European Food Safety Authority (EFSA) published a guidance document on Risk assessment of the application of nanoscience and nanotechnologies in the food and feed chain. It states that risk assessment should be performed by following a step-wise procedure. Whenever human or animal exposure to nanomaterials is expected, the general hazard characterisation scheme requests information from in vitro genotoxicity, toxicokinetic and repeated dose 90-day oral toxicity studies in rodents. Numerous prevailing uncertainties with regard to nanomaterial characterisation and their hazard and risk assessment are addressed in the guidance document. This article discusses the impact of these knowledge gaps on meeting the goal of ensuring human safety. The EFSA's guidance on the risk assessment of ENMs in food and animal feed is taken as an example for discussion, from the point of view of animal welfare, on what level of uncertainty should be considered acceptable for human safety assessment of products with non-medical applications, and whether animal testing should be considered ethically acceptable for such products.

  13. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    NASA Astrophysics Data System (ADS)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process and to minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending are vital for assessing rotor blade and turbine reliability over the 20-year design life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and established the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. The last part of the research quantified the uncertainty in the field conditions and the image processing methods. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images.

  14. Is it necessary to plan with safety margins for actively scanned proton therapy?

    NASA Astrophysics Data System (ADS)

    Albertini, F.; Hug, E. B.; Lomax, A. J.

    2011-07-01

    In radiation therapy, a plan is robust if the calculated and the delivered dose are in agreement, even in the presence of various uncertainties. The current practice is to use safety margins, expanding the clinical target volume sufficiently to account for treatment uncertainties. This, however, might not be ideal for proton therapy, and in particular when using intensity modulated proton therapy (IMPT) plans, as degradation in dose conformity can also occur in the middle of the target as a result of misalignments of steep in-field dose gradients. Single field uniform dose (SFUD) and IMPT plans have been calculated for different anatomical sites and the need for margins has been assessed by analyzing plan robustness to set-up and range uncertainties. We found that the use of safety margins is a good way to improve plan robustness for SFUD and IMPT plans with low in-field dose gradients, but not necessarily for highly modulated IMPT plans, for which only a marginal improvement in plan robustness could be detected through the definition of a planning target volume.

  15. Introduction of risk size in the determination of uncertainty factor UFL in risk assessment

    NASA Astrophysics Data System (ADS)

    Xue, Jinling; Lu, Yun; Velasquez, Natalia; Yu, Ruozhen; Hu, Hongying; Liu, Zhengtao; Meng, Wei

    2012-09-01

    The methodology for using uncertainty factors in health risk assessment has been developed over several decades. A default value is usually applied for the uncertainty factor UFL, which is used to extrapolate from the LOAEL (lowest observed adverse effect level) to the NAEL (no adverse effect level). Here, we have developed a new method that establishes a linear relationship between UFL and the additional risk level at the LOAEL based on the dose-response information, a very important factor that should be carefully considered. This linear formula makes it possible to select UFL properly over the additional risk range from 5.3% to 16.2%. The results also indicate that the default value of 10 may not be conservative enough when the additional risk level at the LOAEL exceeds 16.2%. Furthermore, this novel method not only provides a flexible UFL instead of the traditional default value, but also ensures a conservative estimate of UFL with fewer errors, and avoids the benchmark response selection involved in the benchmark dose method. These advantages can improve the estimation of the extrapolation starting point in risk assessment.

  16. Integration of expert knowledge and uncertainty in natural risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards in alpine regions during recent decades, such as interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting complex system behaviour, and the protection strategies subsequently designed and implemented cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions based on linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism for computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probability based on a triangular probability density function (T-PDF), which can follow the same flow-chart as FR. We implemented the Swiss natural hazard recommendations using FR and probability based on the T-PDF in order to obtain hazard zoning and

  17. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.

  18. Assessment of insurance incentives for safety belt usage

    DOT National Transportation Integrated Search

    1983-05-12

    This study assesses the feasibility of insurance companies to offer incentives, in the form of premium reductions or additional benefits, which would be effective in increasing safety belt usage. The insurance types considered in this report are auto...

  19. European drought under climate change and an assessment of the uncertainties in projections

    NASA Astrophysics Data System (ADS)

    Yu, R. M. S.; Osborn, T.; Conway, D.; Warren, R.; Hankin, R.

    2012-04-01

    Extreme weather/climate events have significant environmental and societal impacts, and anthropogenic climate change has altered and will continue to alter their characteristics (IPCC, 2011). Drought is one of the most damaging natural hazards through its effects on agricultural, hydrological, ecological and socio-economic systems. Climate change is stimulating demand, from public and private sector decision-makers and other stakeholders, for a better understanding of potential future drought patterns which could facilitate disaster risk management. There remain considerable levels of uncertainty in climate change projections, particularly in relation to extreme events. Our incomplete understanding of the behaviour of the climate system has led to the development of various emission scenarios, carbon cycle models and global climate models (GCMs). Uncertainties arise also from the different types and definitions of drought. This study examines climate change-induced changes in European drought characteristics, and illustrates the robustness of these projections by quantifying the effects of using different emission scenarios, carbon cycle models and GCMs. This is achieved by using the multi-institutional modular "Community Integrated Assessment System (CIAS)" (Warren et al., 2008), a flexible integrated assessment system for modelling climate change. Simulations generated by the simple climate model MAGICC6.0 are assessed. These include ten C4MIP carbon cycle models and eighteen CMIP3 GCMs under five IPCC SRES emission scenarios, four Representative Concentration Pathway (RCP) scenarios, and three mitigation scenarios with CO2-equivalent levels stabilising at 550 ppm, 500 ppm and 450 ppm. Using an ensemble of 2160 future precipitation scenarios, we present an analysis of both short (3-month) and long (12-month) meteorological droughts based on the Standardised Precipitation Index (SPI) for the baseline period (1951-2000) and two future periods of 2001-2050 and 2051
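
    The SPI computation referred to above can be illustrated with a short sketch. The following is a minimal, hypothetical example (not the authors' CIAS/MAGICC workflow): it aggregates a monthly precipitation series over a 3-month window, fits a gamma distribution to the totals, and maps the cumulative probabilities onto a standard normal deviate; the synthetic series and all parameter choices are assumptions made purely for illustration.

    ```python
    # Minimal sketch (not the authors' CIAS workflow): a 3-month Standardised
    # Precipitation Index (SPI) computed by fitting a gamma distribution to the
    # aggregated totals and mapping the cumulative probabilities onto a standard
    # normal deviate.
    import numpy as np
    from scipy import stats

    def spi(monthly_precip, window=3):
        """Return the SPI for rolling `window`-month precipitation totals."""
        p = np.asarray(monthly_precip, dtype=float)
        totals = np.convolve(p, np.ones(window), mode="valid")   # rolling sums
        nonzero = totals[totals > 0]
        shape, _, scale = stats.gamma.fit(nonzero, floc=0)        # two-parameter gamma
        q_zero = np.mean(totals == 0)                             # probability mass at zero
        cdf = q_zero + (1 - q_zero) * stats.gamma.cdf(totals, shape, loc=0, scale=scale)
        return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))       # SPI values

    # Hypothetical usage with synthetic data standing in for one ensemble member.
    rng = np.random.default_rng(0)
    series = rng.gamma(shape=2.0, scale=30.0, size=600)            # 50 years of months
    spi3 = spi(series, window=3)
    print("fraction of months in moderate-or-worse drought (SPI < -1):",
          np.mean(spi3 < -1.0))
    ```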

  20. Health technology assessment and primary data collection for reducing uncertainty in decision making.

    PubMed

    Goeree, Ron; Levin, Les; Chandra, Kiran; Bowen, James M; Blackhouse, Gord; Tarride, Jean-Eric; Burke, Natasha; Bischof, Matthias; Xie, Feng; O'Reilly, Daria

    2009-05-01

    Health care expenditures continue to escalate, and pressures for increased spending will continue. Health care decision makers from publicly financed systems, private insurance companies, or even individual health care institutions will continue to be faced with making difficult purchasing, access, and reimbursement decisions. As a result, decision makers are increasingly turning to evidence-based platforms to help control costs and make the most efficient use of existing resources. Most tools used to assist with evidence-based decision making focus on clinical outcomes. Health technology assessment (HTA) is increasing in popularity because it also considers other factors important for decision making, such as cost, social and ethical values, legal issues, and the feasibility of implementation. In some jurisdictions, HTAs have also been supplemented with primary data collection to help address uncertainty that may still exist after conducting a traditional HTA. The HTA process adopted in Ontario, Canada, is unique in that assessments are also made to determine what primary data research should be conducted and what should be collected in these studies. In this article, concerns with the traditional HTA process are discussed, followed by a description of the HTA process that has been established in Ontario, with a particular focus on the data collection program followed by the Programs for Assessment of Technology in Health Research Institute. An illustrative example is used to show how the Ontario HTA process works and the role that value-of-information analysis plays in addressing decision uncertainty, determining research feasibility, and determining study data collection needs.

  1. County-Level Climate Uncertainty for Risk Assessments: Volume 10 Appendix I - Historical Evaporation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  2. County-Level Climate Uncertainty for Risk Assessments: Volume 8 Appendix G - Historical Precipitation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  3. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties arising from the inherent nature of the system as well as from the external environment are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, the uncertainty quantification analysis and time-variant reliability estimation corresponding to closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of the controlled response histories are first confirmed with specific implementation of the random items. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems and the related solution details are further expounded. Two engineering examples are eventually presented to demonstrate the validity and applicability of the methodology developed.

  4. Radioactive waste management in France: safety demonstration fundamentals.

    PubMed

    Ouzounian, G; Voinis, S; Boissier, F

    2012-01-01

    The main challenge in development of the safety case for deep geological disposal is associated with the long periods of time over which high- and intermediate-level long-lived wastes remain hazardous. A wide range of events and processes may occur over hundreds of thousands of years. These events and processes are characterised by specific timescales. For example, the timescale for heat generation is much shorter than any geological timescale. Therefore, to reach a high level of reliability in the safety case, it is essential to have a thorough understanding of the sequence of events and processes likely to occur over the lifetime of the repository. It then becomes possible to assess the capability of the repository to fulfil its safety functions. However, due to the long periods of time and the complexity of the events and processes likely to occur, uncertainties related to all processes, data, and models need to be understood and addressed. Assessment is required over the lifetime of the radionuclides contained in the radioactive waste. Copyright © 2012. Published by Elsevier Ltd.

  5. Combining uncertainty factors in deriving human exposure levels of noncarcinogenic toxicants.

    PubMed

    Kodell, R L; Gaylor, D W

    1999-01-01

    Acceptable levels of human exposure to noncarcinogenic toxicants in environmental and occupational settings generally are derived by reducing experimental no-observed-adverse-effect levels (NOAELs) or benchmark doses (BDs) by a product of uncertainty factors (Barnes and Dourson, Ref. 1). These factors are presumed to ensure safety by accounting for uncertainty in dose extrapolation, uncertainty in duration extrapolation, differential sensitivity between humans and animals, and differential sensitivity among humans. The common default value for each uncertainty factor is 10. This paper shows how estimates of means and standard deviations of the approximately log-normal distributions of individual uncertainty factors can be used to estimate percentiles of the distribution of the product of uncertainty factors. An appropriately selected upper percentile, for example, 95th or 99th, of the distribution of the product can be used as a combined uncertainty factor to replace the conventional product of default factors.
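
    A small numerical sketch of the idea described above may help: if each uncertainty factor is approximately log-normal, the logarithm of their product is normal with summed means and variances, so an upper percentile of the product can replace the conventional product of default factors of 10. The geometric means and geometric standard deviations below are illustrative placeholders, not values taken from the paper.

    ```python
    # Sketch of the combined-uncertainty-factor idea: the log of a product of
    # log-normal factors is normal with summed means and variances, so an upper
    # percentile of the product can replace the product of default factors.
    # The geometric means/GSDs below are illustrative placeholders.
    import numpy as np
    from scipy import stats

    factors = {                         # (geometric mean, geometric standard deviation)
        "interspecies":          (3.0, 2.0),
        "intraspecies":          (3.0, 2.5),
        "LOAEL_to_NOAEL":        (3.0, 2.0),
        "subchronic_to_chronic": (2.0, 2.0),
    }

    mu = sum(np.log(gm) for gm, gsd in factors.values())
    var = sum(np.log(gsd) ** 2 for gm, gsd in factors.values())

    for pct in (0.95, 0.99):
        combined = np.exp(stats.norm.ppf(pct, loc=mu, scale=np.sqrt(var)))
        print(f"{pct:.0%} percentile of the product: {combined:,.0f}")

    print("default product of four factors of 10:", 10 ** 4)
    ```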

  6. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  7. Hazard Identification and Risk Assessment of Health and Safety Approach JSA (Job Safety Analysis) in Plantation Company

    NASA Astrophysics Data System (ADS)

    Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi

    2017-06-01

    The plantation company needed to identify hazards and perform a risk assessment for occupational health and safety, which in this study was approached using JSA (Job Safety Analysis). The identification was aimed at finding the potential hazards that might pose a risk of workplace accidents so that preventive action could be taken to minimize accidents. The data were collected by direct observation of the workers concerned and the results were recorded on a Job Safety Analysis form. The jobs assessed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker and crepe-decline worker. The results showed that the shredder worker had a risk value of 30, corresponding to an extreme risk level (risk values above 20). To minimize accidents, the company could provide appropriate Personal Protective Equipment (PPE) and information about health and safety, supervise the workers' activities, and reward workers who obey the rules that apply in the plantation.

  8. Assessment of Competence in Clinical Reasoning and Decision-Making under Uncertainty: The Script Concordance Test Method

    ERIC Educational Resources Information Center

    Ramaekers, Stephan; Kremer, Wim; Pilot, Albert; van Beukelen, Peter; van Keulen, Hanno

    2010-01-01

    Real-life, complex problems often require that decisions are made despite limited information or insufficient time to explore all relevant aspects. Incorporating authentic uncertainties into an assessment, however, poses problems in establishing results and analysing their methodological qualities. This study aims at developing a test on clinical…

  9. A Microbial Assessment Scheme to measure microbial performance of Food Safety Management Systems.

    PubMed

    Jacxsens, L; Kussaga, J; Luning, P A; Van der Spiegel, M; Devlieghere, F; Uyttendaele, M

    2009-08-31

    A Food Safety Management System (FSMS) implemented in a food processing industry is based on Good Hygienic Practices (GHP) and Hazard Analysis Critical Control Point (HACCP) principles and should address both food safety control and assurance activities in order to guarantee food safety. One of the key emerging challenges is to assess the performance of an existing FSMS. The objective of this work is to explain the development of a Microbial Assessment Scheme (MAS) as a tool for a systematic analysis of microbial counts in order to assess the current microbial performance of an implemented FSMS. It is assumed that low numbers of microorganisms and small variations in microbial counts indicate an effective FSMS. The MAS is a procedure that defines the identification of critical sampling locations, the selection of microbiological parameters, the assessment of sampling frequency, the selection of the sampling method and method of analysis, and finally data processing and interpretation. Based on the MAS assessment, microbial safety level profiles can be derived, indicating which microorganisms contribute to food safety, and to what extent, for a specific food processing company. The MAS concept is illustrated with a case study in the pork processing industry, where ready-to-eat meat products are produced (cured, cooked ham and cured, dried bacon).

  10. Aligning the 3Rs with new paradigms in the safety assessment of chemicals.

    PubMed

    Burden, Natalie; Mahony, Catherine; Müller, Boris P; Terry, Claire; Westmoreland, Carl; Kimber, Ian

    2015-04-01

    There are currently several factors driving a move away from the reliance on in vivo toxicity testing for the purposes of chemical safety assessment. Progress has started to be made in the development and validation of non-animal methods. However, recent advances in the biosciences provide exciting opportunities to accelerate this process and to ensure that the alternative paradigms for hazard identification and risk assessment deliver lasting 3Rs benefits, whilst improving the quality and relevance of safety assessment. The NC3Rs, a UK-based scientific organisation which supports the development and application of novel 3Rs techniques and approaches, held a workshop recently which brought together over 20 international experts in the field of chemical safety assessment. The aim of this workshop was to review the current scientific, technical and regulatory landscapes, and to identify key opportunities towards reaching these goals. Here, we consider areas where further strategic investment will need to be focused if significant impact on 3Rs is to be matched with improved safety science, and why the timing is right for the field to work together towards an environment where we no longer rely on whole animal data for the accurate safety assessment of chemicals.

  11. NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert

    2011-01-01

    System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking. First, the handbook takes the position that it is important to not just focus on risk on an individual

  12. Integrated assessment of urban drainage system under the framework of uncertainty analysis.

    PubMed

    Dong, X; Chen, J; Zeng, S; Zhao, D

    2008-01-01

    Due to rapid urbanization and the presence of a large number of aging urban infrastructures in China, urban drainage systems are facing a dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of the analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China has been implemented to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.

  13. Uncertainty Assessment of Space-Borne Passive Soil Moisture Retrievals

    NASA Technical Reports Server (NTRS)

    Quets, Jan; De Lannoy, Gabrielle; Reichle, Rolf; Cosh, Michael; van der Schalie, Robin; Wigneron, Jean-Pierre

    2017-01-01

    The uncertainty associated with passive soil moisture retrieval is hard to quantify, and known to be underlain by various, diverse, and complex causes. Factors affecting space-borne retrieved soil moisture estimation include: (i) the optimization or inversion method applied to the radiative transfer model (RTM), such as e.g. the Single Channel Algorithm (SCA), or the Land Parameter Retrieval Model (LPRM), (ii) the selection of the observed brightness temperatures (Tbs), e.g. polarization and incidence angle, (iii) the definition of the cost function and the impact of prior information in it, and (iv) the RTM parameterization (e.g. parameterizations officially used by the SMOS L2 and SMAP L2 retrieval products, ECMWF-based SMOS assimilation product, SMAP L4 assimilation product, and perturbations from those configurations). This study aims at disentangling the relative importance of the above-mentioned sources of uncertainty, by carrying out soil moisture retrieval experiments, using SMOS Tb observations in different settings, of which some are mentioned above. The ensemble uncertainties are evaluated at 11 reference CalVal sites, over a time period of more than 5 years. These experimental retrievals were inter-compared, and further confronted with in situ soil moisture measurements and operational SMOS L2 retrievals, using commonly used skill metrics to quantify the temporal uncertainty in the retrievals.

  14. Terrain Safety Assessment in Support of the Mars Science Laboratory Mission

    NASA Technical Reports Server (NTRS)

    Kipp, Devin

    2012-01-01

    In August 2012, the Mars Science Laboratory (MSL) mission will pioneer the next generation of robotic Entry, Descent, and Landing (EDL) systems by delivering the largest and most capable rover to date to the surface of Mars. The process to select the MSL landing site took over five years and began with over 50 initial candidate sites from which four finalist sites were chosen. The four finalist sites were examined in detail to assess overall science merit, EDL safety, and rover traversability on the surface. Ultimately, the engineering assessments demonstrated a high level of safety and robustness at all four finalist sites and differences in the assessment across those sites were small enough that neither EDL safety nor rover traversability considerations could significantly discriminate among the final four sites. Thus the MSL landing site at Gale Crater was selected from among the four finalists primarily on the basis of science considerations.

  15. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
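
    As a rough illustration of the propagation approach described above, the sketch below pushes assumed standard uncertainties of pressure measurements through a single data-reduction equation (the isentropic Mach number relation) by Monte Carlo sampling. It is a stand-in for the NTF data-reduction equations, which are not reproduced here, and all numerical values are assumptions.

    ```python
    # Minimal Monte Carlo propagation sketch (stand-in for the NTF data-reduction
    # equations): compute Mach number from total and static pressure, perturbing
    # each measurement by an assumed standard uncertainty, and report the
    # combined standard uncertainty of the result.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    gamma = 1.4

    # Nominal measurements and illustrative standard uncertainties (Pa).
    p_total  = rng.normal(180_000.0, 90.0, n)    # total pressure
    p_static = rng.normal(120_000.0, 60.0, n)    # static pressure

    # Isentropic relation: M = sqrt( (2/(gamma-1)) * ((pt/ps)^((gamma-1)/gamma) - 1) )
    mach = np.sqrt(2.0 / (gamma - 1.0) *
                   ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

    print(f"Mach mean = {mach.mean():.4f}, standard uncertainty = {mach.std(ddof=1):.5f}")
    ```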

  16. Assessing the uncertainty of forest carbon estimates using the FVS family of diameter increment equations

    Treesearch

    Matthew B. Russell; Aaron R. Weiskittel; Anthony W. D’Amato

    2012-01-01

    Serving as a carbon (C) accounting tool, the Forest Vegetation Simulator (FVS) is widely used by forest managers and researchers to forecast future forest C stocks. Assessments of the uncertainty that FVS equations provide in terms of their ability to accurately project forest biomass and C would seemingly differ, depending on the region and scale of interest to the...

  17. Assessing Explosives Safety Risks, Deviations, And Consequences

    DTIC Science & Technology

    2009-07-31

    Technical Paper 23, 31 July 2009, DDESB: Assessing Explosives Safety Risks, Deviations, And Consequences ... and approaches to assist warfighters in executing their mission, conserving resources, and maximizing operational effectiveness. When mission risk

  18. Influence of safety measures on the risks of transporting dangerous goods through road tunnels.

    PubMed

    Saccomanno, Frank; Haastrup, Palle

    2002-12-01

    Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model recently was developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decisionmakers. Nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented that involve the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.

  19. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is not simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) the lack of knowledge of input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. The LandSim 2.5 landfill simulator has been used to simulate the Turbhe landfill (Navi Mumbai, India) activities for various time horizons. The LandSim-simulated groundwater concentrations of six heavy metals have then been used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and the uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female populations has been quantified and found to be high (HI>1) for all the considered time horizons, which indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
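
    A hedged sketch of the combined treatment described above follows: the concentration and body weight are sampled by Monte Carlo, the daily water intake is handled as a triangular fuzzy number evaluated at an alpha-cut, and the non-carcinogenic hazard quotient is reported as an interval at each alpha level. The hazard-quotient form and every numerical value are illustrative assumptions, not the study's LandSim-derived inputs.

    ```python
    # Hedged sketch of an integrated probabilistic-fuzzy hazard-quotient estimate:
    # concentration and body weight are random (Monte Carlo), water intake is fuzzy
    # (triangular membership, evaluated at an alpha-cut), and the non-carcinogenic
    # hazard quotient HQ = (C * IR * EF * ED) / (BW * AT * RfD) is reported as an
    # interval at each alpha level. All numerical values are illustrative only.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20_000

    conc = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)    # mg/L, random
    bw   = rng.normal(60.0, 8.0, size=n)                          # kg, random

    EF, ED, AT, RfD = 350.0, 30.0, 30.0 * 365.0, 0.02             # illustrative values

    def intake_interval(alpha, low=1.0, mode=2.0, high=3.0):
        """Alpha-cut of a triangular fuzzy number for daily water intake (L/day)."""
        return (low + alpha * (mode - low), high - alpha * (high - mode))

    for alpha in (0.0, 0.5, 1.0):
        ir_lo, ir_hi = intake_interval(alpha)
        hq_lo = conc * ir_lo * EF * ED / (bw * AT * RfD)
        hq_hi = conc * ir_hi * EF * ED / (bw * AT * RfD)
        # 95th-percentile hazard quotient as an interval at this alpha level.
        print(f"alpha={alpha:.1f}: HQ95 in [{np.percentile(hq_lo, 95):.2f}, "
              f"{np.percentile(hq_hi, 95):.2f}]")
    ```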

  20. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    NASA Astrophysics Data System (ADS)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as the lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
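
    The GLUE procedure mentioned above can be sketched in a few lines. The example below uses a toy linear-reservoir model (not the authors' water balance model for the Chillan River): it samples candidate parameter values, scores each with the Nash-Sutcliffe efficiency, keeps the behavioural sets above a threshold, and forms likelihood-weighted 5-95% uncertainty bounds on the simulated runoff. All data are synthetic.

    ```python
    # Minimal GLUE-style sketch (toy linear-reservoir model, not the authors'
    # Chillan water-balance model): sample parameter sets, score each with the
    # Nash-Sutcliffe efficiency, keep "behavioural" sets above a threshold, and
    # form likelihood-weighted 5-95% uncertainty bounds on the simulated runoff.
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate(precip, k, s0=10.0):
        """Toy linear reservoir: storage S is topped up by rain, runoff Q = k * S."""
        s, q = s0, []
        for p in precip:
            s += p
            out = k * s
            s -= out
            q.append(out)
        return np.array(q)

    def weighted_quantile(values, wts, q):
        order = np.argsort(values)
        cdf = np.cumsum(wts[order])
        idx = min(np.searchsorted(cdf, q), len(values) - 1)
        return values[order][idx]

    precip = rng.gamma(2.0, 20.0, size=96)                      # 8 years of monthly rain
    observed = simulate(precip, k=0.35) * rng.normal(1.0, 0.1, size=96)

    samples = rng.uniform(0.05, 0.9, size=2000)                 # candidate values of k
    sims = np.array([simulate(precip, k) for k in samples])
    nse = 1.0 - np.sum((sims - observed) ** 2, axis=1) / np.sum(
        (observed - observed.mean()) ** 2)

    behavioural = nse > 0.6                                     # GLUE behavioural threshold
    weights = nse[behavioural] / nse[behavioural].sum()
    print(f"{behavioural.sum()} behavioural parameter sets out of {len(samples)}")

    bounds = np.array([[weighted_quantile(sims[behavioural, t], weights, q)
                        for q in (0.05, 0.95)] for t in range(len(precip))])
    print("mean width of the 5-95% uncertainty bound:",
          round(float((bounds[:, 1] - bounds[:, 0]).mean()), 2))
    ```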

  1. Preliminary Marine Safety Risk Assessment, Brandon Road Lock and Dam Invasive Species Control Measures

    DTIC Science & Technology

    2016-12-01

    Preliminary Marine Safety Risk Assessment, Brandon Road Lock & Dam Invasive Species Control Measures (CG-926 RDC). ... safety due to proposed invasive species control measures located in the vicinity of the Brandon Road Lock and Dam (BRLD) Navigation Project on the

  2. C-Band Airport Surface Communications System Engineering-Initial High-Level Safety Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Zelkin, Natalie; Henriksen, Stephen

    2011-01-01

    This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed C-band (5091- to 5150-MHz) airport surface communication system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents an initial high-level safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the C-band communication system after the profile is finalized and system rollout timing is determined. A security risk assessment has been performed by NASA as a parallel activity. While safety analysis is concerned with a prevention of accidental errors and failures, the security threat analysis focuses on deliberate attacks. Both processes identify the events that affect operation of the system; and from a safety perspective the security threats may present safety risks.

  3. Safety Assessment of Multi Purpose Small Payload Rack(MSPR)

    NASA Astrophysics Data System (ADS)

    Mizutani, Yoshinobu; Takada, Satomi; Murata, Kosei; Ozawa, Daisaku; Kobayashi, Ryoji; Nakamura, Yasuhiro

    2010-09-01

    We report a summary of the preliminary safety assessment for the Multi Purpose Small Payload Rack (MSPR), one of the microgravity experiment facilities being developed for second-phase utilization of the Japanese Experiment Module (JEM) and to be launched on the second flight of the H-II Transfer Vehicle (HTV) in 2011. The MSPR is used for multi-purpose microgravity experiments, providing experimental spaces and work stations. The MSPR has three experimental spaces. First, there is a space called the Work Volume (WV), with a capacity of approximately 350 liters, in which multiple resources including electricity, communication, and moving-image functions can be used. Within this space, devices can be installed simply and promptly by Velcro and pins, with a high degree of flexibility. Second, there is the Small Experiment Area (SEA), with a capacity of approximately 70 liters, in which electricity, communication, and moving-image functions can also be used in the same way as in the WV. These spaces protect experiment devices and specimens from contingent loads by the crewmembers. Third, there is the Work Bench, with an area of 0.5 square meters, which can be used for maintenance, inspection and data operations of installed devices; this bench can be stowed in the rack during a contingency. The Chamber for Combustion Experiment (CCE) that is planned to be installed in the WV is a pressure-resistant experimental container that can be used to seal hazardous materials from combustion experiments. The CCE has a double-seal design in the chamber itself, which resists gas leakage at normal temperature and pressure. Electricity, communication, and moving-image functions can be used in the same way as in the WV. The JAXA Phase 2 Safety Review Panel (SRP) was held in April 2010. For the safety analysis of the MSPR, hazards were identified based on Fault Tree Analysis methodology and these hazards were then classified into either eight ISS standard-type hazards or eight unique-type hazards that require

  4. Safety assessment in plant layout design using indexing approach: implementing inherent safety perspective. Part 2-Domino Hazard Index and case study.

    PubMed

    Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2008-12-15

    The design of layout plans requires adequate assessment tools for the quantification of safety performance. The general focus of the present work is to introduce an inherent safety perspective at different points of the layout design process. In particular, index approaches for safety assessment and decision-making in the early stages of layout design are developed and discussed in this two-part contribution. Part 1 (accompanying paper) of the current work presents an integrated index approach for safety assessment of early plant layout. In the present paper (Part 2), an index for evaluation of the hazard related to the potential of domino effects is developed. The index considers the actual consequences of possible escalation scenarios and scores or ranks the subsequent accident propagation potential. The effects of inherent and passive protection measures are also assessed. The result is a rapid quantification of domino hazard potential that can provide substantial support for choices in the early stages of layout design. Additionally, a case study concerning selection among various layout options is presented and analyzed. The case study demonstrates the use and applicability of the indices developed in both parts of the current work and highlights the value of introducing inherent safety features early in layout design.

  5. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, in which uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence only their collapsed one-group uncertainties. This approach has been applied successfully to several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with the use of multi-group energy structures had not been carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.
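
    The one-group propagation idea can be illustrated with a deliberately simple sketch: a single capture reaction A -> B under constant flux, with the one-group cross section sampled from an assumed relative uncertainty and the depletion balance solved analytically. This is not the Hybrid Method implementation or EFIT/ESFR data; every value is an assumption chosen for illustration.

    ```python
    # Hedged sketch of one-group nuclear-data uncertainty propagation through a
    # depletion (Bateman-type) balance: a single capture reaction A -> B under
    # constant flux, with the one-group capture cross section sampled from an
    # assumed relative uncertainty. Values are illustrative, not EFIT/ESFR data.
    import numpy as np

    rng = np.random.default_rng(3)
    n_samples = 10_000

    flux = 3.0e14                      # n/cm^2/s, constant one-group flux
    t = 365.0 * 24 * 3600.0            # one year of irradiation, s
    sigma_nominal = 2.7e-24            # one-group capture cross section, cm^2 (2.7 b)
    rel_unc = 0.08                     # assumed 8% relative (1-sigma) uncertainty

    sigma = rng.normal(sigma_nominal, rel_unc * sigma_nominal, n_samples)
    n_a0 = 1.0                         # initial number density of nuclide A (normalised)

    # Analytic solution of dN_A/dt = -sigma * flux * N_A (no decay, single reaction).
    n_a = n_a0 * np.exp(-sigma * flux * t)
    n_b = n_a0 - n_a                   # everything captured ends up in nuclide B

    print(f"N_A after 1 year: mean = {n_a.mean():.4f}, rel. std = {n_a.std()/n_a.mean():.2%}")
    print(f"N_B after 1 year: mean = {n_b.mean():.4f}, rel. std = {n_b.std()/n_b.mean():.2%}")
    ```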

  6. International Harmonization of Food Safety Assessment of Pesticide Residues.

    PubMed

    Ambrus, Árpád

    2016-01-13

    This paper summarizes the development of principles and methods applied within the program of the FAO/WHO Codex Alimentarius during the past 50 years for the safety assessment of pesticide residues in food and feed and establishing maximum residue limits (MRLs) to promote free international trade and assure the safety of consumers. The role of major international organizations in this process, the FAO capacity building activities, and some problematic areas that require special attention are briefly described.

  7. Assessing the reliability of dose coefficients for exposure to radioiodine by members of the public, accounting for dosimetric and risk model uncertainties.

    PubMed

    Puncher, M; Zhang, W; Harrison, J D; Wakeford, R

    2017-06-26

    Assessments of risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1-year, 10-year and 20-year old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129I and 131I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131I for 1-year, 10-year and 20-year olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations shows variations in the central estimates of risk of a factor of around 2-3. It is assumed that the direct proportionality of excess thyroid cancer

  8. Food Safety Practices Assessment Tool: An Innovative Way to Test Food Safety Skills among Individuals with Special Needs

    ERIC Educational Resources Information Center

    Carbone, Elena T.; Scarpati, Stanley E.; Pivarnik, Lori F.

    2013-01-01

    This article describes an innovative assessment tool designed to evaluate the effectiveness of a food safety skills curriculum for learners receiving special education services. As schools respond to the increased demand for training students with special needs about food safety, the need for effective curricula and tools is also increasing. A…

  9. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, there are always uncertainties in a forecast, which may affect the forecast results and lead to large variations. Therefore, uncertainties must be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecast results are measured separately using entropy, and information theory is used to describe how these uncertainties are transported and aggregated through these processes.
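
    As a small illustration of using entropy as an uncertainty measure, the sketch below computes the Shannon entropy of forecast-error distributions estimated from histograms: a sharper forecast yields a lower-entropy error distribution. It does not reproduce the entropy-spectral forecasting procedure described above, and the streamflow data are synthetic.

    ```python
    # Small illustration of measuring forecast uncertainty with Shannon entropy:
    # the entropy of the forecast-error distribution (estimated from a normalised
    # histogram) shrinks as the forecast gets sharper. Synthetic data only.
    import numpy as np

    def shannon_entropy(samples, bins=30):
        """Shannon entropy (nats) of a sample, from a normalised histogram."""
        counts, _ = np.histogram(samples, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    rng = np.random.default_rng(11)
    observed = rng.gamma(3.0, 50.0, size=2000)                  # synthetic streamflow

    sharp_forecast = observed + rng.normal(0.0, 10.0, size=2000)
    vague_forecast = observed + rng.normal(0.0, 60.0, size=2000)

    print("entropy of errors, sharp forecast:",
          round(shannon_entropy(sharp_forecast - observed), 3))
    print("entropy of errors, vague forecast:",
          round(shannon_entropy(vague_forecast - observed), 3))
    ```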

  10. Credibility of Uncertainty Analyses for 131-I Pathway Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, F O.; Anspaugh, L. R.; Apostoaei, A. I.

    2004-05-01

    We would like to make your readers aware of numerous concerns we have with respect to the paper by A. A. Simpkins and D. M. Hamby on Uncertainty in transport factors used to calculate historic dose from 131I releases at the Savannah River Site. The paper by Simpkins and Hamby concludes by saying their uncertainty analysis would add credibility to current dose reconstruction efforts of public exposures to historic releases of 131I from the operations at the Savannah River Site, yet we have found their paper to be afflicted with numerous errors in assumptions and methodology, which in turn lead to grossly misleading conclusions. Perhaps the most egregious errors are their conclusions, which state that: a. the vegetable pathway, not the ingestion of fresh milk, was the main contributor to thyroid dose for exposure to 131I (even though dietary intake of vegetables was less in the past than at present), and b. the probability distribution assigned to the fraction of iodine released in the elemental form (Uniform 0, 0.6) is responsible for 64.6% of the total uncertainty in thyroid dose, given a unit release of 131I to the atmosphere. The assumptions used in the paper by Simpkins and Hamby lead to a large overestimate of the contamination of vegetables by airborne 131I. The interception by leafy and non-leafy vegetables of freshly deposited 131I is known to be highly dependent on the growth form of the crop and the standing crop biomass of leafy material. Unrealistic assumptions are made for losses of 131I from food processing, preparation, and storage prior to human consumption. These assumptions tend to bias their conclusions toward an overestimate of the amount of 131I retained by vegetation prior to consumption. For example, the generic assumption of a 6-d hold-up time is used for the loss from radioactive decay for the time period from harvest to human consumption of fruits, vegetables, and grains. We anticipate hold-up times of many weeks, if not months

  11. REDUCING UNCERTAINTY IN RISK ASSESSMENT USING MECHANISTIC DATA: ENHANCING THE U.S. EPA DEVELOPMENTAL NEUROTOXICITY TESTING GUIDELINES

    EPA Science Inventory

    SUMMARY: Mechanistic data should provide the Agency with a more accurate basis to estimate risk than do the Agency’s default assumptions (10x uncertainty factors, etc.), thereby improving risk assessment decisions. NTD is providing mechanistic data for toxicant effects on two maj...

  12. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
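
    A minimal sketch of the Monte Carlo approach to signature uncertainty described above is given below: the parameters of an assumed power-law rating curve are perturbed, the discharge series is recomputed from stage, and the spread in one signature (the runoff ratio) is reported. The rating form, the unit conversion and all uncertainties are assumptions, not values from the Mahurangi or Brue catchments.

    ```python
    # Hedged sketch of Monte Carlo uncertainty for a hydrological signature:
    # perturb the parameters of an assumed power-law rating curve Q = a*(h - h0)^b,
    # recompute the discharge series from stage, and report the spread in the
    # runoff-ratio signature. All parameter values and uncertainties are assumed.
    import numpy as np

    rng = np.random.default_rng(5)
    n_mc = 5000

    stage  = rng.uniform(0.3, 2.0, size=365)        # synthetic daily stage, m
    precip = rng.gamma(2.0, 4.0, size=365)          # synthetic daily rainfall, mm
    to_runoff_mm = 1.0 / 5.0                        # assumed conversion of summed discharge to runoff depth (mm)

    a_s  = rng.normal(12.0, 1.2, n_mc)              # rating-curve parameters with
    b_s  = rng.normal(1.6, 0.08, n_mc)              # assumed standard uncertainties
    h0_s = rng.normal(0.1, 0.02, n_mc)

    runoff_ratio = np.empty(n_mc)
    for i in range(n_mc):
        q = a_s[i] * np.clip(stage - h0_s[i], 0.0, None) ** b_s[i]   # discharge series
        runoff_ratio[i] = q.sum() * to_runoff_mm / precip.sum()

    lo, hi = np.percentile(runoff_ratio, [5, 95])
    print(f"runoff ratio: median = {np.median(runoff_ratio):.2f}, 5-95% = [{lo:.2f}, {hi:.2f}]")
    ```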

  13. 3D gravity inversion and uncertainty assessment of basement relief via Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Pallero, J. L. G.; Fernández-Martínez, J. L.; Bonvalot, S.; Fudym, O.

    2017-04-01

    Nonlinear gravity inversion in sedimentary basins is a classical problem in applied geophysics. Although a 2D approximation is widely used, 3D models have also been proposed to better take into account the basin geometry. A common nonlinear approach to this 3D problem consists in modeling the basin as a set of right rectangular prisms with prescribed density contrast, whose depths are the unknowns. Then, the problem is iteratively solved via local optimization techniques from an initial model computed using some simplifications or estimated from prior geophysical models. Nevertheless, this kind of approach is highly dependent on the prior information that is used, and lacks a correct solution appraisal (nonlinear uncertainty analysis). In this paper, we use the family of global Particle Swarm Optimization (PSO) optimizers for the 3D gravity inversion and model appraisal of the solution that is adopted for basement relief estimation in sedimentary basins. Synthetic and real cases are illustrated, showing that robust results are obtained. Therefore, PSO seems to be a very good alternative for 3D gravity inversion and uncertainty assessment of basement relief when used in a "sampling while optimizing" approach. In that way, important geological questions can be answered probabilistically in order to support risk assessment in the decisions that are made.
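
    As a rough illustration of the "sampling while optimizing" idea, the sketch below implements a minimal global PSO and keeps every sampled model so that low-misfit models can later be used for approximate appraisal. The toy quadratic misfit and the two-parameter "prism depth" example are placeholders for a real gravity forward model; none of this reproduces the authors' specific PSO family or tuning.

    ```python
    import numpy as np

    def pso_minimize(misfit, lower, upper, n_particles=30, n_iter=200,
                     w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal global PSO; returns the best model and all sampled models so
        that those below a misfit tolerance can be used for appraisal."""
        rng = np.random.default_rng(seed)
        dim = len(lower)
        x = rng.uniform(lower, upper, size=(n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
        g = pbest[np.argmin(pbest_f)].copy()
        sampled = [x.copy()]
        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lower, upper)
            f = np.array([misfit(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            g = pbest[np.argmin(pbest_f)].copy()
            sampled.append(x.copy())
        return g, np.vstack(sampled)

    # Toy example: recover two hypothetical prism depths from a quadratic misfit
    true_depths = np.array([1.2, 0.8])  # km, invented
    misfit = lambda z: np.sum((z - true_depths) ** 2)
    best, samples = pso_minimize(misfit, lower=np.zeros(2), upper=np.full(2, 3.0))
    equivalent = samples[np.array([misfit(s) for s in samples]) < 0.05]
    print(best, equivalent.shape)
    ```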

  14. Preclinical QT safety assessment: cross-species comparisons and human translation from an industry consortium.

    PubMed

    Holzgrefe, Henry; Ferber, Georg; Champeroux, Pascal; Gill, Michael; Honda, Masaki; Greiter-Wilke, Andrea; Baird, Theodore; Meyer, Olivier; Saulnier, Muriel

    2014-01-01

    In vivo models have been required to demonstrate relative cardiac safety, but model sensitivity has not been systematically investigated. Cross-species and human translation of repolarization delay, assessed as QT/QTc prolongation, has not been compared employing common methodologies across multiple species and sites. Therefore, the accurate translation of repolarization results within and between preclinical species, and to man, remains problematic. Six pharmaceutical companies entered into an informal consortium designed to collect high-resolution telemetered data in multiple species (dog, n=34; cynomolgus, n=37; minipig, n=12; marmoset, n=14; guinea pig, n=5; and man, n=57). All animals received vehicle and varying doses of moxifloxacin (3-100 mg/kg, p.o.), with telemetered ECGs (≥500 Hz) obtained for 20-24 h post-dose. Individual probabilistic QT-RR relationships were derived for each subject. The rate-correction efficacies of the individual (QTca) and generic correction formulae (Bazett, Fridericia, and Van de Water) were objectively assessed as the mean squared slopes of the QTc-RR relationships. Normalized moxifloxacin QTca responses (Veh Δ%/μM) were derived for 1 h centered on the moxifloxacin Tmax. All QT-RR ranges demonstrated probabilistic uncertainty; slopes varied distinctly by species: dog and human exhibited the lowest QT rate-dependence, which was much steeper in the cynomolgus and guinea pig. Incorporating probabilistic uncertainty, the normalized QTca-moxifloxacin responses were similarly conserved across all species, including man. The current results provide the first unambiguous evidence that all preclinical in vivo repolarization assays, when accurately modeled and evaluated, yield results that are consistent with the conservation of moxifloxacin-induced QT prolongation across all common preclinical species. Furthermore, these outcomes are directly transferable across all species including man. The consortium results indicate that the
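
    The generic rate corrections named above, and the idea of scoring them by the squared slope of the QTc-RR relation, can be illustrated as follows. Bazett and Fridericia are standard formulas; the "individual" correction here is a simple log-log fit of QT on RR, and the synthetic beat data are invented for the example, so this is only a sketch of the comparison, not the consortium's analysis.

    ```python
    import numpy as np

    def qtc_bazett(qt_ms, rr_ms):
        """Bazett: QTc = QT / sqrt(RR), with RR in seconds."""
        return qt_ms / np.sqrt(rr_ms / 1000.0)

    def qtc_fridericia(qt_ms, rr_ms):
        """Fridericia: QTc = QT / RR**(1/3), with RR in seconds."""
        return qt_ms / (rr_ms / 1000.0) ** (1.0 / 3.0)

    def qtc_individual(qt_ms, rr_ms):
        """Individual correction: fit QT = a * RR**b in log space and remove the
        subject-specific rate dependence (one way to obtain a QTca-style value)."""
        b, _ = np.polyfit(np.log(rr_ms / 1000.0), np.log(qt_ms), 1)
        return qt_ms / (rr_ms / 1000.0) ** b

    def correction_efficacy(qtc_ms, rr_ms):
        """Squared slope of the QTc-RR relation: smaller means better rate correction."""
        slope, _ = np.polyfit(rr_ms, qtc_ms, 1)
        return slope ** 2

    # Synthetic beat-to-beat data for one subject
    rng = np.random.default_rng(0)
    rr = rng.uniform(500, 1200, 500)                       # ms
    qt = 300.0 * (rr / 1000.0) ** 0.35 + rng.normal(0, 5, rr.size)
    for name, qtc in [("Bazett", qtc_bazett(qt, rr)),
                      ("Fridericia", qtc_fridericia(qt, rr)),
                      ("Individual", qtc_individual(qt, rr))]:
        print(name, correction_efficacy(qtc, rr))
    ```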

  15. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  16. Opportunities to Apply the 3Rs in Safety Assessment Programs

    PubMed Central

    Sewell, Fiona; Edwards, Joanna; Prior, Helen; Robinson, Sally

    2016-01-01

    Abstract Before a potential new medicine can be administered to humans it is essential that its safety is adequately assessed. Safety assessment in animals forms an integral part of this process, from early drug discovery and initial candidate selection to the program of recommended regulatory tests in animals. The 3Rs (replacement, reduction, and refinement of animals in research) are integrated in the current regulatory requirements and expectations and, in the EU, provide a legal and ethical framework for in vivo research to ensure the scientific objectives are met whilst minimizing animal use and maintaining high animal welfare standards. Though the regulations are designed to uncover potential risks, they are intended to be flexible, so that the most appropriate approach can be taken for an individual product. This article outlines current and future opportunities to apply the 3Rs in safety assessment programs for pharmaceuticals, and the potential (scientific, financial, and ethical) benefits to the industry, across the drug discovery and development process. For example, improvements to, or the development of, novel, early screens (e.g., in vitro, in silico, or nonmammalian screens) designed to identify compounds with undesirable characteristics earlier in development have the potential to reduce late-stage attrition by improving the selection of compounds that require regulatory testing in animals. Opportunities also exist within the current regulatory framework to simultaneously reduce and/or refine animal use and improve scientific outcomes through improvements to technical procedures and/or adjustments to study designs. It is important that approaches to safety assessment are continuously reviewed and challenged to ensure they are science-driven and predictive of relevant effects in humans. PMID:28053076

  17. Assessment of patient safety culture in Palestinian public hospitals.

    PubMed

    Hamdan, Motasem; Saleem, Abed Alra'oof

    2013-04-01

    To assess the prevalent patient safety culture in Palestinian public hospitals. A cross-sectional design using an Arabic-translated version of the Hospital Survey on Patient Safety Culture was used, covering all 11 general public hospitals in the West Bank and a total of 1460 clinical and non-clinical hospital staff. Twelve patient safety culture composites and 2 outcome variables (patient safety grade and events reported in the past year) were measured. Most of the participants were nurses and physicians (69.2%) with direct contact with patients (92%), mainly employed in medical/surgical units (55.1%). The patient safety composites with the highest positive scores were teamwork within units (71%), organizational learning and continuous improvement (62%) and supervisor/manager expectations and actions promoting patient safety (56%). The composites with the lowest scores were non-punitive response to error (17%), frequency of events reported (35%), communication openness (36%), hospital management support for patient safety (37%) and staffing (38%). Although 53.2% of the respondents did not report any event in the past year, 63.5% rated patient safety level as 'excellent/very good'. Significant differences in patient safety scores and outcome variables were found between hospitals of different size and in relation to staff positions and work hours. This study highlights the existence of a punitive and blame culture, under-reporting of events, lack of communication openness and inadequate management support, which are key challenges for safe patient care in hospitals. The baseline survey results are valuable for designing and implementing the patient safety program and for measuring future progress.

  18. A Study on Urban Road Traffic Safety Based on Matter Element Analysis

    PubMed Central

    Hu, Qizhou; Zhou, Zhuping; Sun, Xu

    2014-01-01

    This paper presents a new evaluation of urban road traffic safety based on matter element analysis, avoiding the difficulties found in other traffic safety evaluations. The issue of urban road traffic safety is investigated through matter element analysis theory, with the chief aim of characterizing the features of urban road traffic safety. Emphasis is placed on the construction of a criterion function by which traffic safety is evaluated against a hierarchical system of objectives. Matter element analysis theory is used to create a comprehensive appraisal model of urban road traffic safety, employing a newly developed and versatile matter element analysis algorithm. The matter element matrix resolves the uncertainty and incompatibility of the factors used to assess urban road traffic safety. The application results showed the superiority of the evaluation model, and a didactic example is included to illustrate the computational procedure. PMID:25587267

  19. Simulating Geriatric Home Safety Assessments in a Three-Dimensional Virtual World

    ERIC Educational Resources Information Center

    Andrade, Allen D.; Cifuentes, Pedro; Mintzer, Michael J.; Roos, Bernard A.; Anam, Ramanakumar; Ruiz, Jorge G.

    2012-01-01

    Virtual worlds could offer inexpensive and safe three-dimensional environments in which medical trainees can learn to identify home safety hazards. Our aim was to evaluate the feasibility, usability, and acceptability of virtual worlds for geriatric home safety assessments and to correlate performance efficiency in hazard identification with…

  20. Incorporating uncertainty in predictive species distribution modelling.

    PubMed

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit, and assessing the significance of model covariates.

  1. The in-depth safety assessment (ISA) pilot projects in Ukraine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kot, C. A.

    1998-02-10

    Ukraine operates pressurized water reactors of the Soviet-designed type, VVER. All Ukrainian plants are currently operating with annually renewable permits until they update their safety analysis reports (SARs). After approval of the SARs by the Ukrainian Nuclear Regulatory Authority, the plants will be granted longer-term operating licenses. In September 1995, the Nuclear Regulatory Authority and the Government Nuclear Power Coordinating Committee of Ukraine issued new content requirements for the safety analysis reports of VVERs in Ukraine. They contain requirements in three major areas: design basis accident (DBA) analysis, probabilistic risk assessment (PRA), and beyond design-basis accident (BDBA) analysis. The DBA requirements are an expanded version of the older SAR requirements. The last two requirements, on PRA and BDBA, are new. The US Department of Energy (USDOE), through the International Nuclear Safety Program (INSP), has initiated an assistance and technology transfer program to Ukraine to assist their nuclear power stations in developing a Western-type technical basis for the new SARs. USDOE-sponsored In-Depth Safety Assessments (ISAs) have been initiated at three pilot nuclear reactor units in Ukraine: South Ukraine Unit 1, Zaporizhzhya Unit 5, and Rivne Unit 1. USDOE/INSP have structured the ISA program in such a way as to provide maximum assistance and technology transfer to Ukraine while encouraging and supporting the Ukrainian plants to take responsibility and initiative and to perform the required assessments.

  2. Portable Nanoparticle-Based Sensors for Food Safety Assessment

    PubMed Central

    Bülbül, Gonca; Hayat, Akhtar; Andreescu, Silvana

    2015-01-01

    The use of nanotechnology-derived products in the development of sensors and analytical measurement methodologies has increased significantly over the past decade. Nano-based sensing approaches include the use of nanoparticles (NPs) and nanostructures to enhance sensitivity and selectivity, design new detection schemes, improve sample preparation and increase portability. This review summarizes recent advancements in the design and development of NP-based sensors for assessing food safety. The most common types of NPs used to fabricate sensors for detection of food contaminants are discussed. Selected examples of NP-based detection schemes with colorimetric and electrochemical detection are provided with focus on sensors for the detection of chemical and biological contaminants including pesticides, heavy metals, bacterial pathogens and natural toxins. Current trends in the development of low-cost portable NP-based technology for rapid assessment of food safety as well as challenges for practical implementation and future research directions are discussed. PMID:26690169

  3. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  4. Characterizing uncertainty and variability in physiologically based pharmacokinetic models: state of the science and needs for research and implementation.

    PubMed

    Barton, Hugh A; Chiu, Weihsueh A; Setzer, R Woodrow; Andersen, Melvin E; Bailer, A John; Bois, Frédéric Y; Dewoskin, Robert S; Hays, Sean; Johanson, Gunnar; Jones, Nancy; Loizou, George; Macphail, Robert C; Portier, Christopher J; Spendiff, Martin; Tan, Yu-Mei

    2007-10-01

    Physiologically based pharmacokinetic (PBPK) models are used in mode-of-action based risk and safety assessments to estimate internal dosimetry in animals and humans. When used in risk assessment, these models can provide a basis for extrapolating between species, doses, and exposure routes or for justifying nondefault values for uncertainty factors. Characterization of uncertainty and variability is increasingly recognized as important for risk assessment; this represents a continuing challenge for both PBPK modelers and users. Current practices show significant progress in specifying deterministic biological models and nondeterministic (often statistical) models, estimating parameters using diverse data sets from multiple sources, using them to make predictions, and characterizing uncertainty and variability of model parameters and predictions. The International Workshop on Uncertainty and Variability in PBPK Models, held 31 Oct-2 Nov 2006, identified the state-of-the-science, needed changes in practice and implementation, and research priorities. For the short term, these include (1) multidisciplinary teams to integrate deterministic and nondeterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through improved documentation of model structure(s), parameter values, sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include (1) theoretical and practical methodological improvements for nondeterministic/statistical modeling; (2) better methods for evaluating alternative model structures; (3) peer-reviewed databases of parameters and covariates, and their distributions; (4) expanded coverage of PBPK models across chemicals with different properties; and (5) training and reference materials, such as case studies, bibliographies/glossaries, model repositories, and enhanced

  5. Assessing uncertainty in SRTM elevations for global flood modelling

    NASA Astrophysics Data System (ADS)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these errors upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built with LISFLOOD-FP, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
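
    A minimal sketch of the perturbation step: draw spatially correlated vertical-error fields from an assumed exponential covariance model and add them to the DEM to build a catalogue of plausible DEMs. The covariance form, the 3 m error standard deviation and the 500 m correlation length are assumptions chosen for illustration; in the study these quantities were estimated by comparing SRTM against near-truth LIDAR.

    ```python
    import numpy as np

    def correlated_error_field(shape, cell_size_m, sigma_m, corr_length_m, seed=0):
        """One realization of a spatially correlated vertical-error field, using an
        exponential covariance model and a Cholesky factorization (small grids only)."""
        rng = np.random.default_rng(seed)
        ny, nx = shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        coords = np.column_stack([yy.ravel(), xx.ravel()]) * cell_size_m
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        cov = sigma_m ** 2 * np.exp(-d / corr_length_m)
        L = np.linalg.cholesky(cov + 1e-8 * np.eye(cov.shape[0]))
        return (L @ rng.standard_normal(cov.shape[0])).reshape(shape)

    # Perturb a toy 20x20 "SRTM" tile: 90 m cells, 3 m errors correlated over ~500 m
    dem = np.full((20, 20), 10.0)   # m, flat floodplain
    plausible_dems = [dem + correlated_error_field(dem.shape, 90.0, 3.0, 500.0, seed=s)
                      for s in range(50)]
    print(np.mean([d.std() for d in plausible_dems]))
    ```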

  6. Transfer Standard Uncertainty Can Cause Inconclusive Inter-Laboratory Comparisons

    PubMed Central

    Wright, John; Toman, Blaza; Mickan, Bodo; Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens

    2016-01-01

    Inter-laboratory comparisons use the best available transfer standards to check the participants’ uncertainty analyses, identify underestimated uncertainty claims or unknown measurement biases, and improve the global measurement system. For some measurands, instability of the transfer standard can lead to an inconclusive comparison result. If the transfer standard uncertainty is large relative to a participating laboratory’s uncertainty, the commonly used standardized degree of equivalence ≤ 1 criterion does not always correctly assess whether a participant is working within their uncertainty claims. We show comparison results that demonstrate this issue and propose several criteria for assessing a comparison result as passing, failing, or inconclusive. We investigate the behavior of the standardized degree of equivalence and alternative comparison measures for a range of values of the transfer standard uncertainty relative to the individual laboratory uncertainty values. The proposed alternative criteria successfully discerned between passing, failing, and inconclusive comparison results for the cases we examined. PMID:28090123
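
    One common form of the standardized degree of equivalence divides the laboratory-reference difference by its expanded (k = 2) uncertainty, with |E_n| <= 1 read as "passing". The sketch below uses that convention (an assumption; conventions differ on whether standard or expanded uncertainties are supplied) and shows how a large transfer-standard uncertainty can turn a genuinely discrepant result into an apparently passing, i.e. inconclusive, one.

    ```python
    import numpy as np

    def standardized_doe(x_lab, u_lab, x_ref, u_ref, k=2.0):
        """Standardized degree of equivalence (often written E_n): the lab-reference
        difference divided by its expanded (k=2) uncertainty; |E_n| <= 1 'passes'."""
        return (x_lab - x_ref) / (k * np.sqrt(u_lab ** 2 + u_ref ** 2))

    # A lab whose error (0.8 units on a reading of ~100) exceeds its claimed
    # standard uncertainty (0.3) can still "pass" when the transfer standard
    # uncertainty is large (0.5).
    print(standardized_doe(100.8, 0.3, 100.0, 0.05))  # stable standard: |E_n| > 1
    print(standardized_doe(100.8, 0.3, 100.0, 0.5))   # unstable standard: |E_n| < 1
    ```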

  7. Infrared radiation and stealth characteristics prediction for supersonic aircraft with uncertainty

    NASA Astrophysics Data System (ADS)

    Pan, Xiaoying; Wang, Xiaojun; Wang, Ruixing; Wang, Lei

    2015-11-01

    The infrared radiation (IR) intensity is generally used to characterize the stealth performance of a supersonic aircraft, which directly affects its survivability in warfare. Research on the IR signature is therefore an important branch of stealth technology for enhancing survivability. Because of uncertainties in materials and the environment, the IR intensity is in fact a range rather than a specific value. In this paper, an analytic process combining uncertainty propagation and reliability evaluation is investigated, in which the object temperature, the atmospheric transmittance and the spectral emissivity of materials are all treated as uncertain parameters. The vertex method is used to analyze and estimate the dispersion of IR intensity, and the safety assessment of the aircraft's stealth performance is conducted by non-probabilistic reliability analysis. Monte Carlo simulation is discussed as well for comparison and verification. The validity, usability, and efficiency of the developed methodology are finally demonstrated by two application examples.
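
    The vertex method evaluates the response at every combination of interval endpoints and takes the extremes, which is exact only when the response is monotonic in each parameter over its interval. The grey-body intensity expression and the emissivity, transmittance and temperature intervals below are invented stand-ins for the paper's full IR signature model.

    ```python
    import itertools
    import numpy as np

    def vertex_method(model, intervals):
        """Evaluate the model at every combination of parameter-interval endpoints
        (2**n vertices) and return the resulting output interval."""
        vertices = itertools.product(*intervals)
        values = [model(np.array(v)) for v in vertices]
        return min(values), max(values)

    # Hypothetical grey-body IR intensity: emissivity * transmittance * sigma * T**4
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    model = lambda p: p[0] * p[1] * SIGMA * p[2] ** 4
    intervals = [(0.80, 0.95),     # spectral emissivity (assumed interval)
                 (0.60, 0.75),     # atmospheric transmittance (assumed interval)
                 (450.0, 520.0)]   # surface temperature, K (assumed interval)
    print(vertex_method(model, intervals))
    ```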

  8. Making the message meaningful: a qualitative assessment of media promoting all-terrain vehicle safety.

    PubMed

    Brann, Maria; Mullins, Samantha Hope; Miller, Beverly K; Eoff, Shane; Graham, James; Aitken, Mary E

    2012-08-01

    Millions of all-terrain vehicles (ATV) are used around the world for recreation by both adults and youth. This increase in use has led to a substantial increase in the number of injuries and fatalities each year. Effective strategies for reducing this incidence are clearly needed; however, minimal research exists regarding effective educational interventions. This study was designed to assess rural ATV riders' preferences for and assessment of safety messages. 13 focus group discussions with youth and adult ATV riders were conducted. 88 formative research participants provided feedback on existing ATV safety materials, which was used to develop more useful ATV safety messages. 60 evaluative focus group participants critiqued the materials developed for this project. Existing ATV safety materials have limited effectiveness, in part because they may not address the content or design needs of the target population. ATV riders want educational and action-oriented safety messages that inform youth and adult riders about their responsibilities to learn, educate and implement safety behaviours (eg, appropriate-sized ATV, safety gear, solo riding, speed limits, riding locations). In addition, messages should be clear, realistic, visually appealing and easily accessible. Newly designed ATV safety materials using the acronym TRIPSS (training, ride off-road, impairment, plan ahead, safety gear, single rider) meet ATV riders' safety messaging needs. To reach a target population, it is crucial to include them in the development and assessment of safety messages. Germane to this particular study, ATV riders provided essential information for creating useful ATV safety materials.

  9. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Díez, C.J., E-mail: cj.diez@upm.es; Cabellos, O.; Instituto de Fusión Nuclear, Universidad Politécnica de Madrid, 28006 Madrid

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (sodium fast reactor) to assess uncertainties in the isotopic composition. However, a comparison with using multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.
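
    The idea of propagating a collapsed one-group cross-section uncertainty through depletion can be reduced to a toy example: a single nuclide depleted through a single reaction channel, with the one-group cross section sampled from an assumed normal distribution. All numbers below are hypothetical; a real Hybrid Method calculation would involve the full depletion chain and covariance data.

    ```python
    import numpy as np

    def depleted_number_density(n0, sigma_b, flux, t_s):
        """Single-nuclide, single-channel depletion: N(t) = N0 * exp(-sigma*phi*t)."""
        sigma_cm2 = sigma_b * 1e-24          # barn -> cm^2
        return n0 * np.exp(-sigma_cm2 * flux * t_s)

    def propagate_sigma_uncertainty(n0, sigma_b, rel_unc, flux, t_s,
                                    n_samples=10000, seed=0):
        """Sample the collapsed one-group cross section and push it through depletion."""
        rng = np.random.default_rng(seed)
        sigmas = rng.normal(sigma_b, rel_unc * sigma_b, n_samples)
        n_end = depleted_number_density(n0, sigmas, flux, t_s)
        return n_end.mean(), n_end.std() / n_end.mean()

    # Hypothetical: 5 % one-group uncertainty over a one-year irradiation
    mean_n, rel_sd = propagate_sigma_uncertainty(n0=1e22, sigma_b=50.0, rel_unc=0.05,
                                                 flux=1e14, t_s=3.15e7)
    print(mean_n, rel_sd)
    ```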

  10. Niches, models, and climate change: Assessing the assumptions and uncertainties

    PubMed Central

    Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.

    2009-01-01

    As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750

  11. LNG safety assessment evaluation methods : task 3 letter report.

    DOT National Transportation Integrated Search

    2016-07-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries including Liquefied Natural Gas (LNG), hydrogen, land and marine transportation, as well as the US Department of Defense (DOD). All the methods ...

  12. Uncertainty in exposure to air pollution

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer; Helle, Kristina; Christoph, Stasch; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce

    2013-04-01

    To assess exposure to air pollution for a person or for a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at these times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the associated uncertainties, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution and the errors in that exposure. We used the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population of persons. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web. Some of the tools and protocols will be shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations. This difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow inputs other than the simulated activity schedules and air quality, at other resolutions. However, due to this flexibility, the Web services require standardized formats, and the overlay algorithm is not optimized for the specific use case, resulting in data and processing overhead. Hence, for this specific case we also implemented the full analysis in R, because the model web solution had difficulties with the massive data volumes.
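
    The combination step, pairing activity-model realizations with air-quality realizations and then fixing one of the two to isolate its contribution, can be sketched with random stand-in ensembles. The zone/hour structure and all distributions below are invented; they are not the Albatross or TAPM/EPISODE output formats.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_zones, n_hours, n_real = 5, 24, 200

    # Hypothetical Monte Carlo ensembles:
    # conc[j, z, t]  - air-quality realization j, zone z, hour t (ug/m3)
    # location[i, t] - activity realization i: zone occupied at hour t
    conc = rng.lognormal(mean=3.0, sigma=0.3, size=(n_real, n_zones, n_hours))
    location = rng.integers(0, n_zones, size=(n_real, n_hours))

    # Full uncertainty: pair each activity realization with each air-quality realization
    hours = np.arange(n_hours)
    exposure = np.array([[conc[j, location[i], hours].mean()
                          for j in range(n_real)] for i in range(n_real)])

    # "Cancel" one source by fixing it at a single realization
    only_airquality = exposure[0, :]   # person path fixed
    only_movement = exposure[:, 0]     # pollution field fixed
    print(exposure.std(), only_airquality.std(), only_movement.std())
    ```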

  13. Accounting for uncertainty in pedotransfer functions in vulnerability assessments of pesticide leaching to groundwater.

    PubMed

    Stenemo, Fredrik; Jarvis, Nicholas

    2007-09-01

    A simulation tool for site-specific vulnerability assessments of pesticide leaching to groundwater was developed, based on the pesticide fate and transport model MACRO, parameterized using pedotransfer functions and reasonable worst-case parameter values. The effects of uncertainty in the pedotransfer functions on simulation results were examined for 48 combinations of soils, pesticides and application timings, by sampling pedotransfer function regression errors and propagating them through the simulation model in a Monte Carlo analysis. An uncertainty factor, f(u), was derived, defined as the ratio between the concentration simulated with no errors, c(sim), and the 80th percentile concentration for the scenario. The pedotransfer function errors caused a large variation in simulation results, with f(u) ranging from 1.14 to 1440, with a median of 2.8. A non-linear relationship was found between f(u) and c(sim), which can be used to account for parameter uncertainty by correcting the simulated concentration, c(sim), to an estimated 80th percentile value. For fine-textured soils, the predictions were most sensitive to errors in the pedotransfer functions for two parameters regulating macropore flow (the saturated matrix hydraulic conductivity, K(b), and the effective diffusion pathlength, d) and two water retention function parameters (van Genuchten's N and alpha parameters). For coarse-textured soils, the model was also sensitive to errors in the exponent in the degradation water response function and the dispersivity, in addition to K(b), but showed little sensitivity to d. To reduce uncertainty in model predictions, improved pedotransfer functions for K(b), d, N and alpha would therefore be most useful. 2007 Society of Chemical Industry

  14. Uncertainty prediction for PUB

    NASA Astrophysics Data System (ADS)

    Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.

    2003-04-01

    The IAHS initiative on Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of predicting uncertainty that could be linked with new blueprints for PUB, showing how equifinality-based models can be handled using practical gauging strategies such as the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed using observations from the Potiribu Project, an NCE layout in representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope scale (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables, with changing likelihood surfaces of experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale, and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation to move from point (distributed) to area (lumped) processes. In this way, the STS carries the uncertainty bounds of model parameters into an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways for uncertainty prediction under a PUB perspective in representative basins of world biomes.

  15. A safety rule approach to surveillance and eradication of biological invasions

    Treesearch

    Denys Yemshanov; Robert G. Haight; Frank H. Koch; Robert Venette; Kala Studens; Ronald E. Fournier; Tom Swystun; Jean J. Turgeon; Yulin Gao

    2017-01-01

    Uncertainty about future spread of invasive organisms hinders planning of effective response measures. We present a two-stage scenario optimization model that accounts for uncertainty about the spread of an invader, and determines survey and eradication strategies that minimize the expected program cost subject to a safety rule for eradication success. The safety rule...

  16. Sources of uncertainty in annual forest inventory estimates

    Treesearch

    Ronald E. McRoberts

    2000-01-01

    Although design and estimation aspects of annual forest inventories have begun to receive considerable attention within the forestry and natural resources communities, little attention has been devoted to identifying the sources of uncertainty inherent in these systems or to assessing the impact of those uncertainties on the total uncertainties of inventory estimates....

  17. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    NASA Astrophysics Data System (ADS)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must adequately consider the uncertainties originating from these different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new. Typically, such models are applied for global-scale studies but rarely for local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to permafrost thaw in the rock slopes of Ritzlihorn: repeated rock fall events accumulated on the debris fan, forming a sediment source for debris flows that were transported down-valley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly

  19. Using nonlinear least squares to assess relative expression and its uncertainty in real-time qPCR studies.

    PubMed

    Tellinghuisen, Joel

    2016-03-01

    Relative expression ratios are commonly estimated in real-time qPCR studies by comparing the quantification cycle for the target gene with that for a reference gene in the treatment samples, normalized to the same quantities determined for a control sample. For the "standard curve" design, where data are obtained for all four of these at several dilutions, nonlinear least squares can be used to assess the amplification efficiencies (AE) and the adjusted ΔΔCq and its uncertainty, with automatic inclusion of the effect of uncertainty in the AEs. An algorithm is illustrated for the KaleidaGraph program. Copyright © 2015 Elsevier Inc. All rights reserved.
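
    A reduced version of the standard-curve idea: fit Cq against log dilution to estimate the amplification efficiency (with an approximate standard error from the fit covariance), then form an efficiency-corrected expression ratio. This is a Pfaffl-style sketch with invented Cq values, not the paper's full nonlinear least-squares treatment of all four dilution series at once.

    ```python
    import numpy as np

    def fit_standard_curve(dilutions, cq):
        """Fit Cq = a + slope*log10(dilution); return the amplification efficiency
        E = 10**(-1/slope) and an approximate standard error (delta method)."""
        (slope, a), cov = np.polyfit(np.log10(dilutions), cq, 1, cov=True)
        e = 10.0 ** (-1.0 / slope)
        se_e = abs(e * np.log(10.0) / slope ** 2) * np.sqrt(cov[0, 0])
        return e, se_e

    def expression_ratio(e_tgt, e_ref, dcq_tgt, dcq_ref):
        """Efficiency-corrected relative expression (Pfaffl-style), where
        dcq = Cq(control) - Cq(treatment) for each gene."""
        return e_tgt ** dcq_tgt / e_ref ** dcq_ref

    # Hypothetical target-gene standard curve (relative concentrations and Cq values)
    dil = np.array([1.0, 0.1, 0.01, 0.001])
    cq_tgt = np.array([18.1, 21.5, 24.8, 28.2])
    e_tgt, se = fit_standard_curve(dil, cq_tgt)
    print(round(e_tgt, 3), round(se, 3),
          expression_ratio(e_tgt, 2.0, dcq_tgt=2.0, dcq_ref=-0.1))
    ```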

  20. Assessing medical students' perceptions of patient safety: the medical student safety attitudes and professionalism survey.

    PubMed

    Liao, Joshua M; Etchegaray, Jason M; Williams, S Tyler; Berger, David H; Bell, Sigall K; Thomas, Eric J

    2014-02-01

    To develop and test the psychometric properties of a survey to measure students' perceptions about patient safety as observed on clinical rotations. In 2012, the authors surveyed 367 graduating fourth-year medical students at three U.S. MD-granting medical schools. They assessed the survey's reliability and construct and concurrent validity. They examined correlations between students' perceptions of organizational cultural factors, organizational patient safety measures, and students' intended safety behaviors. They also calculated percent positive scores for cultural factors. Two hundred twenty-eight students (62%) responded. Analyses identified five cultural factors (teamwork culture, safety culture, error disclosure culture, experiences with professionalism, and comfort expressing professional concerns) that had construct validity, concurrent validity, and good reliability (Cronbach alphas > 0.70). Across schools, percent positive scores for safety culture ranged from 28% (95% confidence interval [CI], 13%-43%) to 64% (30%-98%), while those for teamwork culture ranged from 47% (32%-62%) to 74% (66%-81%). They were low for error disclosure culture (range: 10% [0%-20%] to 27% [20%-35%]), experiences with professionalism (range: 7% [0%-15%] to 23% [16%-30%]), and comfort expressing professional concerns (range: 17% [5%-29%] to 38% [8%-69%]). Each cultural factor correlated positively with perceptions of overall patient safety as observed in clinical rotations (r = 0.37-0.69, P < .05) and at least one safety behavioral intent item. This study provided initial evidence for the survey's reliability and validity and illustrated its applicability for determining whether students' clinical experiences exemplify positive patient safety environments.

  1. DEMOGRAPHIC UNCERTAINTY IN ECOLOGICAL RISK ASSESSMENTS. (R825347)

    EPA Science Inventory

    We built a Ricker model incorporating demographic stochasticity to simulate the effects of demographic uncertainty on responses of gray-tailed vole (Microtus canicaudus) populations to pesticide applications. We constructed models with mark-recapture data collected from populat...
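
    A Ricker model with demographic stochasticity can be written by letting the deterministic map give the expected density and drawing the realized population from a Poisson distribution, with a pesticide application represented as a binomial survival event. The parameter values, spray schedule and survival below are placeholders, not the mark-recapture estimates used in the study.

    ```python
    import numpy as np

    def ricker_demographic(n0, r, k, n_steps, spray_times=(), spray_survival=1.0, seed=0):
        """Ricker dynamics with demographic stochasticity: the expected density
        N*exp(r*(1 - N/K)) is the mean of a Poisson draw; a pesticide application
        at the listed time steps is a binomial survival event."""
        rng = np.random.default_rng(seed)
        n = np.empty(n_steps + 1, dtype=int)
        n[0] = n0
        for t in range(n_steps):
            adults = n[t]
            if t in spray_times:
                adults = rng.binomial(adults, spray_survival)
            expected = adults * np.exp(r * (1.0 - adults / k))
            n[t + 1] = rng.poisson(expected)
        return n

    # 100 replicate trajectories for a hypothetical vole population, 70 % spray survival
    runs = np.array([ricker_demographic(50, r=1.0, k=200, n_steps=20,
                                        spray_times=(5, 10, 15),
                                        spray_survival=0.7, seed=s)
                     for s in range(100)])
    print(runs[:, -1].mean(), (runs[:, -1] == 0).mean())  # mean final size, extinction fraction
    ```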

  2. County-Level Climate Uncertainty for Risk Assessments: Volume 14 Appendix M - Historical Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  3. County-Level Climate Uncertainty for Risk Assessments: Volume 15 Appendix N - Forecast Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  4. County-Level Climate Uncertainty for Risk Assessments: Volume 12 Appendix K - Historical Rel. Humidity.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  5. County-Level Climate Uncertainty for Risk Assessments: Volume 17 Appendix P - Forecast Soil Moisture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  6. County-Level Climate Uncertainty for Risk Assessments: Volume 16 Appendix O - Historical Soil Moisture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  7. County-Level Climate Uncertainty for Risk Assessments: Volume 27 Appendix Z - Forecast Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  8. County-Level Climate Uncertainty for Risk Assessments: Volume 26 Appendix Y - Historical Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  9. Safety risk assessment using analytic hierarchy process (AHP) during planning and budgeting of construction projects.

    PubMed

    Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat

    2013-09-01

    The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
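
    The AHP step of such a framework can be illustrated with the standard principal-eigenvector calculation of priority weights from a reciprocal pairwise-comparison matrix, together with Saaty's consistency ratio. The three hazard categories and their comparison values below are hypothetical, and the paper's integration with the cost of safety model is not shown.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights from a reciprocal pairwise-comparison matrix (principal
        eigenvector), plus Saaty's consistency ratio (CR < 0.1 is usually acceptable)."""
        a = np.asarray(pairwise, dtype=float)
        n = a.shape[0]
        eigvals, eigvecs = np.linalg.eig(a)
        i = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, i].real)
        w /= w.sum()
        ci = (eigvals[i].real - n) / (n - 1)           # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty random index
        return w, ci / ri

    # Hypothetical comparison of three construction safety risks
    # (falls vs struck-by vs electrocution) on Saaty's 1-9 scale
    pairwise = [[1,     3,   5],
                [1 / 3, 1,   2],
                [1 / 5, 1 / 2, 1]]
    weights, cr = ahp_weights(pairwise)
    print(weights.round(3), round(cr, 3))
    ```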

  10. Challenges Ahead for Nuclear Facility Site-Specific Seismic Hazard Assessment in France: The Alternative Energies and the Atomic Energy Commission (CEA) Vision

    NASA Astrophysics Data System (ADS)

    Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.

    2017-09-01

    Seismic analysis in the context of nuclear safety in France is currently guided by a pure deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the 'hard core seismic levels', which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and they have been finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximal magnitude constraints, according to the practice of the French Atomic Energy Commission. Due to increases in strong motion databases, in terms of the number and quality of the records, their metadata, and the uncertainty characterization, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach, as in a

  11. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    NASA Astrophysics Data System (ADS)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied with a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea: walleye pollock, Pacific cod, and arrowtooth flounder using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.
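
    A minimal sketch of the Bayesian model-averaging idea described above: draws from three hypothetical model projections are mixed in proportion to assumed model weights to form a model-averaged distribution of a reference point. The model names, weights, and numbers are illustrative and are not taken from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative projections of a biological reference point from three
    # hypothetical assessment models (draws and weights are made up).
    projections = {
        "single_species": rng.normal(400_000, 30_000, 5_000),
        "temp_weight_at_age": rng.normal(380_000, 45_000, 5_000),
        "multispecies_climate": rng.normal(350_000, 60_000, 5_000),
    }
    # Posterior model probabilities would normally come from marginal likelihoods
    # or predictive skill; here they are simply assumed.
    weights = {"single_species": 0.40, "temp_weight_at_age": 0.35, "multispecies_climate": 0.25}

    # Model-averaged predictive distribution: mix draws in proportion to the weights.
    mixed = np.concatenate([
        rng.choice(projections[name], size=int(w * 10_000), replace=True)
        for name, w in weights.items()
    ])

    print(f"model-averaged mean: {mixed.mean():,.0f}")
    print(f"90% interval: {np.percentile(mixed, [5, 95]).round(0)}")
    ```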

  12. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments of simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  13. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
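
    As a rough illustration of one of the traditional methods named above, the sketch below fits a logistic regression of pass probability against age to simulated pass/fail data and bootstraps approximate uncertainty bounds on reliability; the Weibull failure-time and RADAR analyses are not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    rng = np.random.default_rng(0)

    # Simulated pass/fail data: each unit is tested once at a known age (years).
    age = rng.uniform(0, 20, 200)
    passed = rng.binomial(1, expit(3.0 - 0.15 * age))

    def neg_log_lik(beta, a, y):
        eta = beta[0] + beta[1] * a
        return -np.sum(y * eta - np.log1p(np.exp(eta)))

    # Maximum-likelihood logistic regression of pass probability on age.
    fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(age, passed))

    # Bootstrap resampling gives approximate uncertainty bounds on reliability at age 15.
    boot = []
    for _ in range(500):
        idx = rng.integers(0, len(age), len(age))
        b = minimize(neg_log_lik, x0=fit.x, args=(age[idx], passed[idx])).x
        boot.append(expit(b[0] + b[1] * 15.0))

    print(f"reliability at age 15: {expit(fit.x[0] + fit.x[1] * 15.0):.3f}")
    print(f"approx. 90% bounds: {np.percentile(boot, [5, 95]).round(3)}")
    ```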

  14. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.

  15. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
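
    A worked toy example of the "Level of Safety" definitions quoted above, using hypothetical flaw-exceedance and missed-detection probabilities for three locations:

    ```python
    import numpy as np

    # Hypothetical per-location inputs for a single inspection event:
    # p_exceed = probability that a flaw larger than the critical size is present,
    # p_miss   = probability that such a flaw goes undetected.
    p_exceed = np.array([1e-3, 5e-4, 2e-3])
    p_miss = np.array([0.10, 0.25, 0.05])

    # Discrete Level of Safety: complement of "critical flaw exists and is not detected".
    los_discrete = 1.0 - p_exceed * p_miss

    # Cumulative Level of Safety for the structure: product over all flaws,
    # damage types, and locations (three locations in this toy case).
    los_cumulative = np.prod(los_discrete)

    print("discrete LoS:", los_discrete)
    print(f"cumulative LoS: {los_cumulative:.6f}")
    ```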

  16. A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems

    DOT National Transportation Integrated Search

    2009-07-01

    This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...

  17. Safety assessment of foods from genetically modified crops in countries with developing economies.

    PubMed

    Delaney, Bryan

    2015-12-01

    Population growth, particularly in countries with developing economies, will result in a need to increase food production by 70% by the year 2050. Biotechnology has been utilized to produce genetically modified (GM) crops for insect and weed control, with benefits including increased crop yield, and will also be used in emerging countries. A multicomponent safety assessment paradigm has been applied to individual GM crops to determine whether they are as safe as foods from non-GM crops. This paper reviews the methods used to assess the safety of foods from the first generation of GM crops. The methods can readily be applied to new products developed within a country, and this paper emphasizes the concept of data portability: that safety data produced in one geographic location are suitable for safety assessment regardless of where they are utilized. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Use of evidential reasoning and AHP to assess regional industrial safety.

    PubMed

    Chen, Zhichao; Chen, Tao; Qu, Zhuohua; Yang, Zaili; Ji, Xuewei; Zhou, Yi; Zhang, Hui

    2018-01-01

    China's fast economic growth contributes to the rapid development of its urbanization process, and also gives rise to a series of industrial accidents, which often cause loss of life and damage to property and the environment, thus requiring the associated risk analysis and safety control measures to be implemented in advance. However, incompleteness of historical failure data before the occurrence of accidents makes it difficult to use traditional risk analysis approaches such as probabilistic risk analysis in many cases. This paper aims to develop a new methodology capable of assessing regional industrial safety (RIS) in an uncertain environment. A hierarchical structure for modelling the risks influencing RIS is first constructed. The hybrid of evidential reasoning (ER) and the Analytical Hierarchy Process (AHP) is then used to assess the risks in a complementary way, in which AHP is used to evaluate the weight of each risk factor and ER is employed to synthesise the safety evaluations of the investigated region(s) against the risk factors from the bottom to the top level in the hierarchy. The successful application of the hybrid approach in a real case analysis of RIS in several major districts of Beijing (capital of China) demonstrates its feasibility as well as provides risk analysts and safety engineers with useful insights on effective solutions to comprehensive risk assessment of RIS in metropolitan cities. The contribution of this paper is made by the findings on the comparison of risk levels of RIS at different regions against various risk factors so that best practices from the good performer(s) can be used to improve the safety of the others.

  19. Safety assessment in the urban park environment in Alborz Province, Iran.

    PubMed

    Oostakhan, Morteza; Babaei, Aliakbar

    2013-01-01

    Urban parks, as one of the recreational and sports sectors, can be the site of serious injuries among different age groups if safety issues are not considered in their design. These injuries can result from the equipment in the park, including play and sports equipment, or even from environmental factors. A lack of safety benchmarks for parks will affect the development of future proposals. In this article, attempts are made to incorporate the important safety factors of urban parks, including playgrounds, fitness equipment, pedestrian surfaces, and environmental factors, into a risk assessment. Hence, a checklist of safety factors was used. A Yes or No descriptor was allocated to each factor to determine the safety level. The study also offers recommendations for designers for future planning concerning existing failures. It was found that the safety levels of the regional and local parks differ from each other.

  20. Food and Feed Safety Assessment: The Importance of Proper Sampling.

    PubMed

    Kuiper, Harry A; Paoletti, Claudia

    2015-03-24

    The general principles for safety and nutritional evaluation of foods and feed and the potential health risks associated with hazardous compounds are described as developed by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) and further elaborated in the European Union-funded project Safe Foods. We underline the crucial role of sampling in foods/feed safety assessment. High quality sampling should always be applied to ensure the use of adequate and representative samples as test materials for hazard identification, toxicological and nutritional characterization of identified hazards, as well as for estimating quantitative and reliable exposure levels of foods/feed or related compounds of concern for humans and animals. The importance of representative sampling is emphasized through examples of risk analyses in different areas of foods/feed production. The Theory of Sampling (TOS) is recognized as the only framework within which to ensure accuracy and precision of all sampling steps involved in the field-to-fork continuum, which is crucial to monitor foods and feed safety. Therefore, TOS must be integrated in the well-established FAO/WHO risk assessment approach in order to guarantee a transparent and correct frame for the risk assessment and decision making process.

  1. Food and feed safety assessment: the importance of proper sampling.

    PubMed

    Kuiper, Harry A; Paoletti, Claudia

    2015-01-01

    The general principles for safety and nutritional evaluation of foods and feed and the potential health risks associated with hazardous compounds are described as developed by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) and further elaborated in the European Union-funded project Safe Foods. We underline the crucial role of sampling in foods/feed safety assessment. High quality sampling should always be applied to ensure the use of adequate and representative samples as test materials for hazard identification, toxicological and nutritional characterization of identified hazards, as well as for estimating quantitative and reliable exposure levels of foods/feed or related compounds of concern for humans and animals. The importance of representative sampling is emphasized through examples of risk analyses in different areas of foods/feed production. The Theory of Sampling (TOS) is recognized as the only framework within which to ensure accuracy and precision of all sampling steps involved in the field-to-fork continuum, which is crucial to monitor foods and feed safety. Therefore, TOS must be integrated in the well-established FAO/WHO risk assessment approach in order to guarantee a transparent and correct frame for the risk assessment and decision making process.

  2. Development and application of a safety assessment methodology for waste disposals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, R.H.; Torres, C.; Schaller, K.H.

    1996-12-31

    As part of a European Commission funded research programme, QuantiSci (formerly the Environmental Division of Intera Information Technologies) and Instituto de Medio Ambiente of the Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (IMA/CIEMAT) have developed and applied a comprehensive, yet practicable, assessment methodology for post-disposal safety assessment of land-based disposal facilities. This Safety Assessment Comparison (SACO) Methodology employs a systematic approach to the collection, evaluation and use of waste and disposal system data. It can be used to assess engineered barrier performance, the attenuating properties of host geological formations, and the long term impacts of a facility on the environment and human health, as well as allowing the comparison of different disposal options for radioactive, mixed and non-radioactive wastes. This paper describes the development of the methodology and illustrates its use.

  3. Assessment of uncertainty in discrete fracture network modeling using probabilistic distribution method.

    PubMed

    Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan

    2017-11-01

    There have been widespread concerns about solute transport problems in fractured media, e.g. the disposal of high-level radioactive waste in geological fractured rocks. Numerical simulation of particle tracking is gradually being employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider one particular realization of the fracture distribution based on fracture statistic features. This significantly underestimates the uncertainty of the risk of radioactive waste deposit evaluation. To adequately assess the uncertainty during the DFN modeling in a potential site for the disposal of high-level radioactive waste, this paper utilized the probabilistic distribution method (PDM). The method was applied to evaluate the risk of nuclear waste deposit in Beishan, China. Moreover, the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and multiple realizations were demonstrated. Probabilistic distributions of 20 realizations at different times were also obtained. The results showed that the employed PDM can be used to describe the ranges of the contaminant particle transport. The high-possibility contaminated areas near the release point were more concentrated than the farther areas after 5 × 10⁶ days, covering an area of 25,400 m².

  4. Safety envelope for load tolerance of structural element design based on multi-stage testing

    DOE PAGES

    Park, Chanyoung; Kim, Nam H.

    2016-09-06

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimation of the safety envelope of structural elements for load tolerances would be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lacking knowledge of actual physics, so that conservativeness in a safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.
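
    The sketch below conveys only the general idea of updating a Gaussian process surrogate with a single element test so that the predictive (calculation) uncertainty shrinks; it uses scikit-learn regression on made-up failure-load data and is not the paper's probabilistic-classification formulation.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Hypothetical calculated failure loads (kN) of an element at a few thicknesses (mm).
    x_calc = np.array([[2.0], [3.0], [4.0], [5.0]])
    y_calc = np.array([14.0, 22.0, 31.0, 41.0])

    kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_calc, y_calc)

    x_query = np.array([[3.5]])
    mean0, std0 = gp.predict(x_query, return_std=True)

    # Add a single (hypothetical) element test at 3.5 mm and refit: the predictive
    # standard deviation, i.e. the calculation uncertainty, shrinks.
    x_all = np.vstack([x_calc, [[3.5]]])
    y_all = np.append(y_calc, 27.0)
    gp_updated = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_all, y_all)
    mean1, std1 = gp_updated.predict(x_query, return_std=True)

    print(f"before test: {mean0[0]:.1f} +/- {std0[0]:.1f} kN")
    print(f"after test:  {mean1[0]:.1f} +/- {std1[0]:.1f} kN")
    ```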

  5. Sources of uncertainty in hydrological climate impact assessment: a cross-scale study

    NASA Astrophysics Data System (ADS)

    Hattermann, F. F.; Vetter, T.; Breuer, L.; Su, Buda; Daggupati, P.; Donnelly, C.; Fekete, B.; Flörke, F.; Gosling, S. N.; Hoffmann, P.; Liersch, S.; Masaki, Y.; Motovilov, Y.; Müller, C.; Samaniego, L.; Stacke, T.; Wada, Y.; Yang, T.; Krysnaova, V.

    2018-01-01

    Climate change impacts on water availability and hydrological extremes are major concerns as regards the Sustainable Development Goals. Impacts on hydrology are normally investigated as part of a modelling chain, in which climate projections from multiple climate models are used as inputs to multiple impact models, under different greenhouse gas emissions scenarios, which result in different amounts of global temperature rise. While the goal is generally to investigate the relevance of changes in climate for the water cycle, water resources or hydrological extremes, it is often the case that variations in other components of the model chain obscure the effect of climate scenario variation. This is particularly important when assessing the impacts of relatively lower magnitudes of global warming, such as those associated with the aspirational goals of the Paris Agreement. In our study, we use ANOVA (analyses of variance) to allocate and quantify the main sources of uncertainty in the hydrological impact modelling chain. In turn we determine the statistical significance of different sources of uncertainty. We achieve this by using a set of five climate models and up to 13 hydrological models, for nine large scale river basins across the globe, under four emissions scenarios. The impact variable we consider in our analysis is daily river discharge. We analyze overall water availability and flow regime, including seasonality, high flows and low flows. Scaling effects are investigated by separately looking at discharge generated by global and regional hydrological models respectively. Finally, we compare our results with other recently published studies. We find that small differences in global temperature rise associated with some emissions scenarios have mostly significant impacts on river discharge—however, climate model related uncertainty is so large that it obscures the sensitivity of the hydrological system.
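
    A minimal sketch of allocating variance with ANOVA, as described above: synthetic discharge changes for five climate models and four hydrological models are partitioned into climate-model, hydrological-model, and residual components. The numbers are fabricated for illustration and are not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic % changes in mean discharge for 5 climate models x 4 hydrological
    # models (one emissions scenario); values are made up for the sketch.
    gcm_signal = rng.normal(0, 8, size=(5, 1))
    hm_signal = rng.normal(0, 3, size=(1, 4))
    change = 5.0 + gcm_signal + hm_signal + rng.normal(0, 1.5, size=(5, 4))

    grand = change.mean()
    gcm_means = change.mean(axis=1, keepdims=True)
    hm_means = change.mean(axis=0, keepdims=True)

    # Two-way ANOVA sums of squares (no replication, so the residual absorbs interaction).
    ss_gcm = 4 * np.sum((gcm_means - grand) ** 2)
    ss_hm = 5 * np.sum((hm_means - grand) ** 2)
    ss_res = np.sum((change - gcm_means - hm_means + grand) ** 2)
    ss_tot = ss_gcm + ss_hm + ss_res

    for name, ss in [("climate models", ss_gcm), ("hydrological models", ss_hm), ("residual", ss_res)]:
        print(f"{name}: {100 * ss / ss_tot:.1f}% of total variance")
    ```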

  6. A Framework for Assessing Uncertainty Associated with Human Health Risks from MSW Landfill Leachate Contamination.

    PubMed

    Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Singh, Jitendra

    2017-07-01

    Landfilling is a cost-effective method, which makes it a widely used practice around the world, especially in developing countries. However, because of the improper management of landfills, high leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty based on estimations of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long-term risks by using a Monte Carlo simulation framework for selected heavy metals. The Turbhe sanitary landfill of Navi Mumbai, India, which was commissioned in the recent past, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. In this article, an integral approach in the form of a framework has been proposed to quantify the uncertainty that is intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses. © 2016 Society for Risk Analysis.
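
    A generic Monte Carlo sketch of a noncarcinogenic hazard-quotient calculation of the kind described above; the distributions, exposure factors, and reference dose are hypothetical and do not represent the Turbhe site or the study's LandSim-based simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Hypothetical input distributions for a child drinking groundwater impacted by
    # leachate; concentrations, exposure factors, and the reference dose are made up.
    conc = rng.lognormal(mean=np.log(0.02), sigma=0.6, size=n)  # metal concentration, mg/L
    intake = rng.triangular(0.5, 1.0, 1.5, size=n)              # water intake, L/day
    body_wt = rng.normal(18.0, 3.0, size=n).clip(10, None)      # body weight, kg
    ef = 350 / 365                                               # exposure frequency factor
    rfd = 0.003                                                  # reference dose, mg/kg-day

    # Noncarcinogenic hazard quotient: chronic daily intake divided by the reference dose.
    cdi = conc * intake * ef / body_wt
    hq = cdi / rfd

    print(f"median HQ: {np.median(hq):.2f}")
    print(f"95th percentile HQ: {np.percentile(hq, 95):.2f}")
    print(f"P(HQ > 1): {np.mean(hq > 1):.3f}")
    ```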

  7. Biomarkers: Dynamic "Tools" for Health and Safety Risk Assessment

    EPA Science Inventory

    Today informational flow from biomarkers contributes importantly to various types of health effects research, risk assessment and risk management decisions that impact, or have the potential to impact, public health and safety. Therefore, dependent upon the nature of the health r...

  8. Impact of Climate Change on high and low flows across Great Britain: a temporal analysis and uncertainty assessment.

    NASA Astrophysics Data System (ADS)

    Beevers, Lindsay; Collet, Lila

    2017-04-01

    Over the past decade there have been significant challenges to water management posed by both floods and droughts. In the UK, since 2000 flooding has caused over £5Bn worth of damage, and direct costs from the recent drought (2011-12) are estimated to be between £70-165M, arising from impacts on public and industrial water supply. Projections of future climate change suggest an increase in temperature and precipitation trends which may exacerbate the frequency and severity of such hazards, but there is significant uncertainty associated with these projections. It thus becomes urgent to assess the possible impact of these changes on extreme flows and evaluate the uncertainties related to these projections, particularly changes in the seasonality of such hazards. This paper aims to assess the changes in seasonality of peak and low flows across Great Britain as a result of climate change. It is based on the Future Flow database; an 11-member ensemble of transient river flow projections from January 1951 to December 2098. We analyse the daily river flow over the baseline (1961-1990) and the 2080s (2069-2098) for 281 gauging stations. For each ensemble member, annual maxima (AMAX) and minima (AMIN) are extracted for both time periods for each gauging station. The month of the year the AMAX and AMIN occur respectively are recorded for each of the 30 years in the past and the future time periods. The uncertainty of the AMAX and AMIN occurrence temporally (monthly) is assessed across the 11 ensemble members, as well as the changes to this temporal signal between the baseline and the 2080s. Ultimately, this work gives a national picture (spatially) of high and low flows occurrence temporally and allows the assessment of possible changes in hydrological dynamics as a result of climate change in a statistical framework. Results will quantify the uncertainty related to the Climate Model parameters which are cascaded into the modelling chain. This study highlights the issues
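
    A small sketch of the AMAX/AMIN extraction step described above for one synthetic daily discharge series: annual maxima and minima are located and the month of occurrence is tallied. A real application would repeat this for each gauging station and each ensemble member.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(11)

    # Synthetic daily discharge for one station and one ensemble member, 2069-2098.
    days = pd.date_range("2069-01-01", "2098-12-31", freq="D")
    seasonal = 50 + 30 * np.sin(2 * np.pi * (days.dayofyear - 30) / 365.25)
    flow = pd.Series(np.maximum(seasonal + rng.gamma(2.0, 10.0, len(days)), 1.0), index=days)

    # Annual maxima (AMAX) and minima (AMIN), and the month in which each occurs.
    amax_month = flow.groupby(flow.index.year).idxmax().dt.month
    amin_month = flow.groupby(flow.index.year).idxmin().dt.month

    print("AMAX month frequencies:\n", amax_month.value_counts().sort_index())
    print("AMIN month frequencies:\n", amin_month.value_counts().sort_index())
    ```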

  9. Input Uncertainty and its Implications on Parameter Assessment in Hydrologic and Hydroclimatic Modelling Studies

    NASA Astrophysics Data System (ADS)

    Chowdhury, S.; Sharma, A.

    2005-12-01

    present. SIMEX is based on the theory that the trend in alternate parameter estimates can be extrapolated back to the notional error-free zone. We illustrate the utility of SIMEX in a synthetic rainfall-runoff modelling scenario and in an application studying the dependence of uncertain, distributed sea surface temperature anomalies on an indicator of the El Nino Southern Oscillation, the Southern Oscillation Index (SOI). The errors in rainfall data and their effect are explored using the Sacramento rainfall-runoff model. The rainfall uncertainty is assumed to be multiplicative and temporally invariant. The model used to relate the sea surface temperature anomalies (SSTA) to the SOI is assumed to be of a linear form. The nature of uncertainty in the SSTA is additive and varies with time. The SIMEX framework allows assessment of the relationship between the error-free inputs and the response. Cook, J.R., Stefanski, L. A., Simulation-Extrapolation Estimation in Parametric Measurement Error Models, Journal of the American Statistical Association, 89 (428), 1314-1328, 1994.
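
    A minimal sketch of the SIMEX idea for a linear model with additive covariate error, assuming the measurement-error standard deviation is known: progressively larger amounts of extra error are added, the attenuated slope is tracked, and a quadratic extrapolation to the notional error-free case (lambda = -1) recovers an estimate close to the true slope. All data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 2_000

    # True relationship y = 2 x + noise, but x is observed with additive error of
    # known standard deviation sigma_u.
    x_true = rng.normal(0, 1, n)
    y = 2.0 * x_true + rng.normal(0, 0.5, n)
    sigma_u = 0.8
    x_obs = x_true + rng.normal(0, sigma_u, n)

    def slope(x, v):
        return np.cov(x, v)[0, 1] / np.var(x)

    # Simulation step: add extra error of variance lambda * sigma_u**2 and refit.
    lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    mean_slopes = [
        np.mean([slope(x_obs + rng.normal(0, np.sqrt(lam) * sigma_u, n), y) for _ in range(50)])
        for lam in lambdas
    ]

    # Extrapolation step: fit a quadratic in lambda and evaluate at lambda = -1,
    # the notional error-free case.
    simex_slope = np.polyval(np.polyfit(lambdas, mean_slopes, 2), -1.0)

    print(f"naive slope (attenuated): {slope(x_obs, y):.3f}")
    print(f"SIMEX slope:              {simex_slope:.3f}")  # closer to the true 2.0
    ```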

  10. Radiologist Uncertainty and the Interpretation of Screening

    PubMed Central

    Carney, Patricia A.; Elmore, Joann G.; Abraham, Linn A.; Gerrity, Martha S.; Hendrick, R. Edward; Taplin, Stephen H.; Barlow, William E.; Cutter, Gary R.; Poplack, Steven P.; D’Orsi, Carl J.

    2011-01-01

    Objective To determine radiologists’ reactions to uncertainty when interpreting mammography and the extent to which radiologist uncertainty explains variability in interpretive performance. Methods The authors used a mailed survey to assess demographic and clinical characteristics of radiologists and reactions to uncertainty associated with practice. Responses were linked to radiologists’ actual interpretive performance data obtained from 3 regionally located mammography registries. Results More than 180 radiologists were eligible to participate, and 139 consented for a response rate of 76.8%. Radiologist gender, more years interpreting, and higher volume were associated with lower uncertainty scores. Positive predictive value, recall rates, and specificity were more affected by reactions to uncertainty than sensitivity or negative predictive value; however, none of these relationships was statistically significant. Conclusion Certain practice factors, such as gender and years of interpretive experience, affect uncertainty scores. Radiologists’ reactions to uncertainty do not appear to affect interpretive performance. PMID:15155014

  11. Safety Assessment for a Surface Repository in the Chernobyl Exclusion Zone - Methodology for Assessing Disposal under Intervention Conditions - 13476

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, RWDF Buryakovka is still being operated, but its maximum capacity has nearly been reached. Plans for enlargement of the facility have existed for more than 10 years but have not yet been implemented. In the framework of a European Commission project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) based on the planned enlargement. Due to its history, RWDF Buryakovka does not fully comply with today's best international practices and the latest Ukrainian regulations in this area. The most critical aspects are its inventory of long-lived radionuclides, and the non-existent multi-barrier waste confinement system. A significant part of the project was dedicated, therefore, to the development of a methodology for the safety assessment that takes into consideration the facility's special situation, and to reaching an agreement with all stakeholders involved in the later review and approval procedure of the safety analysis reports. The main aspect of the agreed methodology was to analyze safety not strictly on the basis of regulatory requirements but on an assessment of the actual situation of the facility, including its location within the Exclusion Zone. For both safety analysis reports, SAR and PSAR, the assessment of the long-term safety led to results that were either within regulatory limits or within the limits allowing for a specific situational evaluation by the regulator. (authors)

  12. Intolerance of uncertainty, causal uncertainty, causal importance, self-concept clarity and their relations to generalized anxiety disorder.

    PubMed

    Kusec, Andrea; Tallon, Kathleen; Koerner, Naomi

    2016-06-01

    Although numerous studies have provided support for the notion that intolerance of uncertainty plays a key role in pathological worry (the hallmark feature of generalized anxiety disorder (GAD)), other uncertainty-related constructs may also have relevance for the understanding of individuals who engage in pathological worry. Three constructs from the social cognition literature, causal uncertainty, causal importance, and self-concept clarity, were examined in the present study to assess the degree to which these explain unique variance in GAD, over and above intolerance of uncertainty. N = 235 participants completed self-report measures of trait worry, GAD symptoms, and uncertainty-relevant constructs. A subgroup was subsequently classified as low in GAD symptoms (n = 69) or high in GAD symptoms (n = 54) based on validated cut scores on measures of trait worry and GAD symptoms. In logistic regressions, only elevated intolerance of uncertainty and lower self-concept clarity emerged as unique correlates of high (vs. low) GAD symptoms. The possible role of self-concept uncertainty in GAD and the utility of integrating social cognition theories and constructs into clinical research on intolerance of uncertainty are discussed.

  13. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    NASA Technical Reports Server (NTRS)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high risk conjunction event is small; however, the amount of data which needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other Owner/Operators faced with similar decisions.

  14. Assessment of spectral, misregistration, and spatial uncertainties inherent in the cross-calibration study

    USGS Publications Warehouse

    Chander, G.; Helder, D.L.; Aaron, David; Mishra, N.; Shrestha, A.K.

    2013-01-01

    Cross-calibration of satellite sensors permits the quantitative comparison of measurements obtained from different Earth Observing (EO) systems. Cross-calibration studies usually use simultaneous or near-simultaneous observations from several spaceborne sensors to develop band-by-band relationships through regression analysis. The investigation described in this paper focuses on evaluation of the uncertainties inherent in the cross-calibration process, including contributions due to different spectral responses, spectral resolution, spectral filter shift, geometric misregistrations, and spatial resolutions. The hyperspectral data from the Environmental Satellite SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY and the EO-1 Hyperion, along with the relative spectral responses (RSRs) from the Landsat 7 Enhanced Thematic Mapper (TM) Plus and the Terra Moderate Resolution Imaging Spectroradiometer sensors, were used for the spectral uncertainty study. The data from Landsat 5 TM over five representative land cover types (desert, rangeland, grassland, deciduous forest, and coniferous forest) were used for the geometric misregistrations and spatial-resolution study. The spectral resolution uncertainty was found to be within 0.25%, spectral filter shift within 2.5%, geometric misregistrations within 0.35%, and spatial-resolution effects within 0.1% for the Libya 4 site. The one-sigma uncertainties presented in this paper are uncorrelated, and therefore, the uncertainties can be summed orthogonally. Furthermore, an overall total uncertainty was developed. In general, the results suggested that the spectral uncertainty is more dominant compared to other uncertainties presented in this paper. Therefore, the effect of the sensor RSR differences needs to be quantified and compensated to avoid large uncertainties in cross-calibration results.
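
    Because the abstract states that the one-sigma components are uncorrelated and can be summed orthogonally, a root-sum-square combination of the quoted bounds (used here purely as placeholder values) looks like this:

    ```python
    import math

    # One-sigma component uncertainties quoted above for the Libya 4 site (percent),
    # taken here only as illustrative inputs.
    components = {
        "spectral resolution": 0.25,
        "spectral filter shift": 2.5,
        "geometric misregistration": 0.35,
        "spatial resolution": 0.1,
    }

    # Uncorrelated components combine in quadrature (root sum of squares).
    total = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined one-sigma uncertainty: {total:.2f}%")
    ```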

  15. Assessment of uncertainty in ROLO lunar irradiance for on-orbit calibration

    USGS Publications Warehouse

    Stone, T.C.; Kieffer, H.H.; Barnes, W.L.; Butler, J.J.

    2004-01-01

    A system to provide radiometric calibration of remote sensing imaging instruments on-orbit using the Moon has been developed by the US Geological Survey RObotic Lunar Observatory (ROLO) project. ROLO has developed a model for lunar irradiance which treats the primary geometric variables of phase and libration explicitly. The model fits hundreds of data points in each of 23 VNIR and 9 SWIR bands; input data are derived from lunar radiance images acquired by the project's on-site telescopes, calibrated to exoatmospheric radiance and converted to disk-equivalent reflectance. Experimental uncertainties are tracked through all stages of the data processing and modeling. Model fit residuals are approximately 1% in each band over the full range of observed phase and libration angles. Application of ROLO lunar calibration to SeaWiFS has demonstrated the capability for long-term instrument response trending with precision approaching 0.1% per year. Current work involves assessing the error in absolute responsivity and relative spectral response of the ROLO imaging systems, and propagation of error through the data reduction and modeling software systems with the goal of reducing the uncertainty in the absolute scale, now estimated at 5-10%. This level is similar to the scatter seen in ROLO lunar irradiance comparisons of multiple spacecraft instruments that have viewed the Moon. A field calibration campaign involving NASA and NIST has been initiated that ties the ROLO lunar measurements to the NIST (SI) radiometric scale.

  16. Urban transport safety assessment in Akure based on corresponding performance indicators

    NASA Astrophysics Data System (ADS)

    Oye, Adedamola; Aderinlewo, Olufikayo; Croope, Silvana

    2013-03-01

    The level of safety of the transportation system in Akure, Nigeria was assessed by identifying the associated road safety problems and developing the corresponding safety performance indicators. These indicators were analysed with respect to accidents that occurred within the city from the year 2005 to 2009, based on the corresponding attributable risk measures. The results of the analysis showed the state of existing safety programs in Akure town. Six safety performance indicators were identified, namely alcohol and drug use, excessive speeds, protection systems (use of seat belts and helmets), use of daytime running lights, state of vehicles (passive safety), and road condition. These indicators were used to determine the percentage of injury accidents as follows: 83.33% and 86.36% for years 2005 and 2006 respectively, 81.46% for year 2007, while years 2008 and 2009 had 82.86% and 78.12% injury accidents respectively.

  17. Health risks of climate change: an assessment of uncertainties and its implications for adaptation policies.

    PubMed

    Wardekker, J Arjan; de Jong, Arie; van Bree, Leendert; Turkenburg, Wim C; van der Sluijs, Jeroen P

    2012-09-19

    Projections of health risks of climate change are surrounded with uncertainties in knowledge. Understanding of these uncertainties will help the selection of appropriate adaptation policies. We made an inventory of conceivable health impacts of climate change, explored the type and level of uncertainty for each impact, and discussed its implications for adaptation policy. A questionnaire-based expert elicitation was performed using an ordinal scoring scale. Experts were asked to indicate the level of precision with which health risks can be estimated, given the present state of knowledge. We assessed the individual scores, the expertise-weighted descriptive statistics, and the argumentation given for each score. Suggestions were made for how dealing with uncertainties could be taken into account in climate change adaptation policy strategies. The results showed that the direction of change could be indicated for most anticipated health effects. For several potential effects, too little knowledge exists to indicate whether any impact will occur, or whether the impact will be positive or negative. For several effects, rough 'order-of-magnitude' estimates were considered possible. Factors limiting health impact quantification include: lack of data, multi-causality, unknown impacts considering a high-quality health system, complex cause-effect relations leading to multi-directional impacts, possible changes of present-day response-relations, and difficulties in predicting local climate impacts. Participants considered heat-related mortality and non-endemic vector-borne diseases particularly relevant for climate change adaptation. For possible climate related health impacts characterised by ignorance, adaptation policies that focus on enhancing the health system's and society's capability of dealing with possible future changes, uncertainties and surprises (e.g. through resilience, flexibility, and adaptive capacity) are most appropriate. For climate related health

  18. Chapter 8: Uncertainty assessment for quantifying greenhouse gas sources and sinks

    Treesearch

    Jay Breidt; Stephen M. Ogle; Wendy Powers; Coeli Hoover

    2014-01-01

    Quantifying the uncertainty of greenhouse gas (GHG) emissions and reductions from agriculture and forestry practices is an important aspect of decision-making for farmers, ranchers and forest landowners as the uncertainty range for each GHG estimate communicates our level of confidence that the estimate reflects the actual balance of GHG exchange between...

  19. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation.

    PubMed

    Zarr, Robert R

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors.

  20. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation

    PubMed Central

    Zarr, Robert R.

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors. PMID:27134779

  1. Safety assessment in schools: beyond risk: the role of child psychiatrists and other mental health professionals.

    PubMed

    Rappaport, Nancy; Pollack, William S; Flaherty, Lois T; Schwartz, Sarah E O; McMickens, Courtney

    2015-04-01

    This article presents an overview of a comprehensive school safety assessment approach for students whose behavior raises concern about their potential for targeted violence. Case vignettes highlight the features of 2 youngsters who exemplify those seen, the comprehensive nature of the assessment, and the kind of recommendations that enhance a student's safety, connection, well-being; engage families; and share responsibility of assessing safety with the school. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  3. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  4. Quantification of allyl hexanoate in pineapple beverages and yogurts as a case study to characterise a source of uncertainty in dietary exposure assessment to flavouring substances.

    PubMed

    Raffo, A; D'Aloise, A; Magrì, A D; Leclercq, C

    2012-01-01

    One source of uncertainty in the estimation of dietary exposure to flavouring substances is the uncertainty in the occurrence and concentration levels of these substances naturally present or added to foodstuffs. The aim of this study was to assess the variability of concentration levels of allyl hexanoate, considered as a case study, in two main food categories to which it is often added: pineapple juice-based beverages and yogurts containing pineapple. Thirty-four beverages and 29 yogurts, with pineapple fruit or juice and added flavourings declared as ingredients on the package, were purchased from the local market (in Rome) and analysed. Analytical methods based on the stir bar sorptive extraction (SBSE) technique for the isolation of the target analyte, and on GC-MS analysis for final determination, were developed for the two food categories. In beverages, allyl hexanoate concentrations ranged from less than 0.01 to 16.71 mg l(-1), whereas in yogurts they ranged from 0.02 to 89.41 mg kg(-1). Average concentrations in beverages and yogurts with pineapple as the main fruit ingredient (1.91 mg l(-1) for beverages, 9.61 mg kg(-1) for yogurts) were in fair agreement with average use level data reported from industry surveys for the relevant food categories (4.5 and 6.0 mg kg(-1), respectively). Within the group of yogurts a single product was found to contain a level of allyl hexanoate more than 10-fold higher than the average reported use level. The screening techniques developed by the European Food Safety Authority (EFSA) using use level data provided by industry gave estimates of exposure that were of the same order of magnitude as the estimates obtained for regular consumers who would be loyal to the pineapple yogurt and beverage products containing the highest observed concentration of the substance of interest. In this specific case the uncertainty in the results obtained with the use of standard screening techniques for exposure assessment based on industry

  5. Use of evidential reasoning and AHP to assess regional industrial safety

    PubMed Central

    Chen, Zhichao; Chen, Tao; Qu, Zhuohua; Ji, Xuewei; Zhou, Yi; Zhang, Hui

    2018-01-01

    China’s fast economic growth contributes to the rapid development of its urbanization process, and also gives rise to a series of industrial accidents, which often cause loss of life and damage to property and the environment, thus requiring the associated risk analysis and safety control measures to be implemented in advance. However, incompleteness of historical failure data before the occurrence of accidents makes it difficult to use traditional risk analysis approaches such as probabilistic risk analysis in many cases. This paper aims to develop a new methodology capable of assessing regional industrial safety (RIS) in an uncertain environment. A hierarchical structure for modelling the risks influencing RIS is first constructed. The hybrid of evidential reasoning (ER) and the Analytical Hierarchy Process (AHP) is then used to assess the risks in a complementary way, in which AHP is used to evaluate the weight of each risk factor and ER is employed to synthesise the safety evaluations of the investigated region(s) against the risk factors from the bottom to the top level in the hierarchy. The successful application of the hybrid approach in a real case analysis of RIS in several major districts of Beijing (capital of China) demonstrates its feasibility as well as provides risk analysts and safety engineers with useful insights on effective solutions to comprehensive risk assessment of RIS in metropolitan cities. The contribution of this paper is made by the findings on the comparison of risk levels of RIS at different regions against various risk factors so that best practices from the good performer(s) can be used to improve the safety of the others. PMID:29795593

  6. Assessment of contributions to patient safety knowledge by the Agency for Healthcare Research and Quality-funded patient safety projects.

    PubMed

    Sorbero, Melony E S; Ricci, Karen A; Lovejoy, Susan; Haviland, Amelia M; Smith, Linda; Bradley, Lily A; Hiatt, Liisa; Farley, Donna O

    2009-04-01

    To characterize the activities of projects funded in Agency for Healthcare Research and Quality (AHRQ)'s patient safety portfolio and assess their aggregate potential to contribute to knowledge development. Information abstracted from proposals for projects funded in AHRQ's patient safety portfolio, information on safety practices from the AHRQ Evidence Report on Patient Safety Practices, and products produced by the projects. This represented one part of the process evaluation conducted as part of a longitudinal evaluation based on the Context–Input–Process–Product model. The 234 projects funded through AHRQ's patient safety portfolio examined a wide variety of patient safety issues and extended their work beyond the hospital setting to less studied parts of the health care system. Many of the projects implemented and tested practices for which the patient safety evidence report identified a need for additional evidence. The funded projects also generated a substantial body of new patient safety knowledge through a growing number of journal articles and other products. The projects funded in AHRQ's patient safety portfolio have the potential to make substantial contributions to the knowledge base on patient safety. The full value of this new knowledge remains to be confirmed through the synthesis of results

  7. Uncertainty in recharge estimation: impact on groundwater vulnerability assessments for the Pearl Harbor Basin, O'ahu, Hawai'i, U.S.A.

    NASA Astrophysics Data System (ADS)

    Giambelluca, Thomas W.; Loague, Keith; Green, Richard E.; Nullet, Michael A.

    1996-06-01

    In this paper, uncertainty in recharge estimates is investigated relative to its impact on assessments of groundwater contamination vulnerability using a relatively simple pesticide mobility index, attenuation factor (AF). We employ a combination of first-order uncertainty analysis (FOUA) and sensitivity analysis to investigate recharge uncertainties for agricultural land on the island of O'ahu, Hawai'i, that is currently, or has been in the past, under sugarcane or pineapple cultivation. Uncertainty in recharge due to recharge component uncertainties is 49% of the mean for sugarcane and 58% of the mean for pineapple. The components contributing the largest amounts of uncertainty to the recharge estimate are irrigation in the case of sugarcane and precipitation in the case of pineapple. For a suite of pesticides formerly or currently used in the region, the contribution to AF uncertainty of recharge uncertainty was compared with the contributions of other AF components: retardation factor (RF), a measure of the effects of sorption; soil-water content at field capacity (ΘFC); and pesticide half-life (t1/2). Depending upon the pesticide, the contribution of recharge to uncertainty ranks second or third among the four AF components tested. The natural temporal variability of recharge is another source of uncertainty in AF, because the index is calculated using the time-averaged recharge rate. Relative to the mean, recharge variability is 10%, 44%, and 176% for the annual, monthly, and daily time scales, respectively, under sugarcane, and 31%, 112%, and 344%, respectively, under pineapple. In general, uncertainty in AF associated with temporal variability in recharge at all time scales exceeds AF. For chemicals such as atrazine or diuron under sugarcane, and atrazine or bromacil under pineapple, the range of AF uncertainty due to temporal variability in recharge encompasses significantly higher levels of leaching potential at some locations than that indicated by the
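
    For orientation, the sketch below evaluates one common textbook form of the attenuation-factor index (AF = exp(-0.693 t_r / t1/2), with travel time t_r = depth × ΘFC × RF / recharge) and propagates an assumed recharge uncertainty through it with a first-order (Taylor-series) step; the parameter values are hypothetical and the exact AF formulation and uncertainty analysis used in the paper may differ.

    ```python
    import numpy as np

    # One common form of the attenuation factor index: AF = exp(-0.693 * t_r / t_half),
    # with travel time t_r = depth * theta_fc * RF / q (q = net recharge rate).
    # All parameter values below are hypothetical.
    depth = 5.0       # m, depth to groundwater
    theta_fc = 0.30   # soil-water content at field capacity (-)
    rf = 2.0          # retardation factor (-)
    t_half = 180.0    # pesticide half-life, days
    q_mean = 0.004    # mean recharge, m/day
    sigma_q = 0.002   # one-sigma recharge uncertainty (roughly 50% of the mean)

    def attenuation_factor(q):
        travel_time = depth * theta_fc * rf / q
        return np.exp(-0.693 * travel_time / t_half)

    # First-order (Taylor-series) propagation of the recharge uncertainty into AF,
    # with the derivative evaluated numerically; for such a large relative input
    # uncertainty this linearization is only a rough guide.
    dq = 1e-6
    d_af_dq = (attenuation_factor(q_mean + dq) - attenuation_factor(q_mean - dq)) / (2 * dq)
    sigma_af = abs(d_af_dq) * sigma_q

    print(f"AF at mean recharge: {attenuation_factor(q_mean):.3e}")
    print(f"first-order sigma(AF) from recharge alone: {sigma_af:.3e}")
    ```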

  8. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model uses uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. The report describes the development of the data sets and uncertainty estimates used in building the new model, presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations, and introduces a new methodology for assessing the uncertainty associated with linear regressions.
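
    The abstract does not detail the regression methodology, so the following is only a minimal sketch of how confidence intervals for a straight-line fit are commonly obtained from classical ordinary-least-squares standard errors; the function name and interface are hypothetical, not taken from the report.

      import numpy as np
      from scipy import stats

      def linear_fit_with_uncertainty(x, y, confidence=0.95):
          """Fit y = a + b*x and return (a, b) plus confidence half-widths,
          using the classical OLS standard-error formulas."""
          x = np.asarray(x, dtype=float)
          y = np.asarray(y, dtype=float)
          n = x.size
          b, a = np.polyfit(x, y, 1)              # slope, intercept
          resid = y - (a + b * x)
          s2 = resid @ resid / (n - 2)            # residual variance
          sxx = np.sum((x - x.mean()) ** 2)
          se_b = np.sqrt(s2 / sxx)                # standard error of the slope
          se_a = np.sqrt(s2 * (1.0 / n + x.mean() ** 2 / sxx))
          t = stats.t.ppf(0.5 + confidence / 2.0, df=n - 2)
          return a, b, t * se_a, t * se_b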

  9. Uncertainties in selected river water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2007-02-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of river water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fraction, phosphorus fraction, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out, supplemented by additional experimental data from the Elbe river. All data for compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations, owing to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample, and these sampling uncertainties are highly site specific. The uncertainty in sampling can only be estimated by taking at least a proportion of samples in duplicate. Compared with sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can nevertheless contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments (500-3000 km2) reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For
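
    As an illustration of the duplicate-sample approach mentioned above (a generic sketch, not the authors' procedure; the function name is hypothetical), the combined sampling-plus-analytical standard deviation can be estimated from duplicate pairs with the classical paired-difference formula.

      import numpy as np

      def sampling_sd_from_duplicates(first, second):
          """Estimate the sampling + analytical standard deviation from
          duplicate field samples: s = sqrt(sum(d_i^2) / (2k)), where
          d_i is the difference within pair i and k the number of pairs."""
          d = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
          return np.sqrt(np.sum(d ** 2) / (2 * d.size))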

  10. Uncertainties in selected surface water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2006-09-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of surface water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected surface water quality data, i.e. suspended sediment, nitrogen fraction, phosphorus fraction, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2006). A literature review was carried out, supplemented by additional experimental data from the Elbe river. All data for compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations, owing to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample, and these sampling uncertainties are highly site specific. The uncertainty in sampling can only be estimated by taking at least a proportion of samples in duplicate. Compared with sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can nevertheless contribute considerably to the overall uncertainty of surface water quality data. Temporal autocorrelation of surface water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended

  11. Error disclosure: a new domain for safety culture assessment.

    PubMed

    Etchegaray, Jason M; Gallagher, Thomas H; Bell, Sigall K; Dunlap, Ben; Thomas, Eric J

    2012-07-01

    To (1) develop and test survey items that measure error disclosure culture, (2) examine relationships among error disclosure culture, teamwork culture and safety culture and (3) establish predictive validity for survey items measuring error disclosure culture. All clinical faculty from six health institutions (four medical schools, one cancer centre and one health science centre) in The University of Texas System were invited to anonymously complete an electronic survey containing questions about safety culture and error disclosure. The authors found two factors to measure error disclosure culture: one factor is focused on the general culture of error disclosure and the second factor is focused on trust. Both error disclosure culture factors were unique from safety culture and teamwork culture (correlations were less than r=0.85). Also, error disclosure general culture and error disclosure trust culture predicted intent to disclose a hypothetical error to a patient (r=0.25, p<0.001 and r=0.16, p<0.001, respectively) while teamwork and safety culture did not predict such an intent (r=0.09, p=NS and r=0.12, p=NS). Those who received prior error disclosure training reported significantly higher levels of error disclosure general culture (t=3.7, p<0.05) and error disclosure trust culture (t=2.9, p<0.05). The authors created and validated a new measure of error disclosure culture that predicts intent to disclose an error better than other measures of healthcare culture. This measure fills an existing gap in organisational assessments by assessing transparent communication after medical error, an important aspect of culture.

  12. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. To assess the uncertainty of the 24 parameters used in HEC-HMS, three flood events were used for calibration and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), namely Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining ones (L5-L7) fall into the formal category. L5 builds on the relationship between traditional least-squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters depend strongly on the likelihood function and vary between likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have a broadly similar effect on parameter sensitivity. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, results showed that the DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7
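
    For readers unfamiliar with the informal criteria named above, a brief sketch of two of them, Nash-Sutcliffe efficiency and the index of agreement, follows; the paper's exact likelihood formulations may differ, and the function names are ours, not the authors'.

      import numpy as np

      def nash_sutcliffe(obs, sim):
          """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def index_of_agreement(obs, sim):
          """Willmott's index of agreement (IOA)."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
          return 1.0 - np.sum((obs - sim) ** 2) / denom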

  13. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques, along with the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  14. Uncertainty and extreme events in future climate and hydrologic projections for the Pacific Northwest: providing a basis for vulnerability and core/corridor assessments

    USGS Publications Warehouse

    Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.

    2014-01-01

    The purpose of this project was to (1) provide an internally-consistent set of downscaled projections across the Western U.S., (2) include information about projection uncertainty, and (3) assess projected changes of hydrologic extremes. These objectives were designed to address decision support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections, in particular for extreme events, is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and a comparison with dynamical downscaling) and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts (in a rigorous and physically based way) for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: 1) extreme statistics for surface hydrology (e.g. frequency of soil moisture and summer water deficit) and streamflow (e.g. the 100-year flood, extreme 7-day low flows with a 10-year recurrence interval); 2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and, 3) uncertainty analyses for multiple climate
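
    The project's own extreme-value methodology is not described in this abstract; purely to illustrate what a return-period statistic such as the 100-year flood means, one might fit a Gumbel (extreme value type I) distribution to a series of annual maximum flows. This is an assumed, generic approach, not necessarily the UW CIG method, and the function name is hypothetical.

      import numpy as np
      from scipy import stats

      def flood_quantile(annual_maxima, return_period=100.0):
          """Estimate the T-year flood from annual maximum flows by fitting
          a Gumbel distribution and evaluating its (1 - 1/T) quantile."""
          loc, scale = stats.gumbel_r.fit(np.asarray(annual_maxima, float))
          return stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc, scale)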

  15. Uncertainty Quantification in High Throughput Screening ...

    EPA Pesticide Factsheets

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
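
    A generic sketch of the bootstrap idea described above, resampling concentration-response points and refitting a simple Hill model to obtain percentile confidence intervals, is given below; it is not the EPA/ToxCast implementation, and the model form and function names are illustrative only.

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, top, ac50, slope):
          """Simple Hill model for a concentration-response curve."""
          return top * conc**slope / (ac50**slope + conc**slope)

      def bootstrap_fit(conc, resp, n_boot=1000, seed=0):
          """Resample the (conc, resp) points with replacement, refit the Hill
          model, and return 95% percentile intervals for (top, AC50, slope)."""
          rng = np.random.default_rng(seed)
          conc, resp = np.asarray(conc, float), np.asarray(resp, float)
          p0 = [resp.max(), np.median(conc), 1.0]
          fits = []
          for _ in range(n_boot):
              idx = rng.integers(0, conc.size, conc.size)
              try:
                  p, _ = curve_fit(hill, conc[idx], resp[idx], p0=p0, maxfev=2000)
                  fits.append(p)
              except RuntimeError:          # skip resamples where the fit fails
                  continue
          return np.percentile(np.array(fits), [2.5, 97.5], axis=0)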

  16. Uncertainty and variability in computational and mathematical models of cardiac physiology.

    PubMed

    Mirams, Gary R; Pathmanathan, Pras; Gray, Richard A; Challenor, Peter; Clayton, Richard H

    2016-12-01

    Mathematical and computational models of cardiac physiology have been an integral component of cardiac electrophysiology since its inception, and are collectively known as the Cardiac Physiome. We identify and classify the numerous sources of variability and uncertainty in model formulation, parameters and other inputs that arise from both natural variation in experimental data and lack of knowledge. The impact of uncertainty on the outputs of Cardiac Physiome models is not well understood, and this limits their utility as clinical tools. We argue that incorporating variability and uncertainty should be a high priority for the future of the Cardiac Physiome. We suggest investigating the adoption of approaches developed in other areas of science and engineering while recognising unique challenges for the Cardiac Physiome; it is likely that novel methods will be necessary that require engagement with the mathematics and statistics community. The Cardiac Physiome effort is one of the most mature and successful applications of mathematical and computational modelling for describing and advancing the understanding of physiology. After five decades of development, physiological cardiac models are poised to realise the promise of translational research via clinical applications such as drug development and patient-specific approaches as well as ablation, cardiac resynchronisation and contractility modulation therapies. For models to be included as a vital component of the decision process in safety-critical applications, rigorous assessment of model credibility will be required. This White Paper describes one aspect of this process by identifying and classifying sources of variability and uncertainty in models as well as their implications for the application and development of cardiac models. We stress the need to understand and quantify the sources of variability and uncertainty in model inputs, and the impact of model structure and complexity and their consequences for

  17. DEVELOPMENT OF ADME DATA IN AGRICULTURAL CHEMICAL SAFETY ASSESSMENTS

    EPA Science Inventory

    DEVELOPMENT OF ADME DATA IN AGRICULTURAL CHEMICAL SAFETY ASSESSMENTS
    Pastoor, Timothy1, Barton, Hugh2
    1 Syngenta Crop Protection, Greensboro, NC, USA.
    2 EPA, Office of Research and Development-NHEERL, RTP, NC, USA.

    A multi-stakeholder series of discussions d...

  18. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals

    PubMed Central

    Severtson, Dolores J.

    2015-01-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings. PMID:26412960

  19. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals.

    PubMed

    Severtson, Dolores J

    2015-02-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings.

  20. Safety Assessment and Biological Effects of a New Cold Processed SilEmulsion for Dermatological Purpose

    PubMed Central

    Salgado, Ana; Gonçalves, Lídia; Pinto, Pedro C.; Urbano, Manuela; Ribeiro, Helena M.

    2013-01-01

    It is of crucial importance to evaluate the safety profile of the ingredients used in dermatological emulsions. A suitable equilibrium between safety and efficacy is a pivotal concern before the marketing of a dermatological product. The aim was to assess the safety and biological effects of a new cold processed silicone-based emulsion (SilEmulsion). The hazard, exposure, and dose-response assessment were used to characterize the risk for each ingredient. EpiSkin assay and human repeat insult patch tests were performed to compare the theoretical safety assessment to in vitro and in vivo data. The efficacy of the SilEmulsion was studied using biophysical measurements in human volunteers during 21 days. According to the safety assessment of the ingredients, 1,5-pentanediol was an ingredient of special concern since its margin of safety was below the threshold of 100 (36.53). EpiSkin assay showed that the tissue viability after the application of the SilEmulsion was 92 ± 6% and, thus considered nonirritant to the skin. The human studies confirmed that the SilEmulsion was not a skin irritant and did not induce any sensitization on the volunteers, being safe for human use. Moreover, biological effects demonstrated that the SilEmulsion increased both the skin hydration and skin surface lipids. PMID:24294598
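
    For context, the margin of safety referred to above is conventionally computed in cosmetic ingredient risk assessment (an assumption about the authors' convention, not quoted from the paper) as

        \mathrm{MoS} = \frac{\mathrm{NOAEL}}{\mathrm{SED}}

    where NOAEL is the no-observed-adverse-effect level and SED the systemic exposure dose; a value below the customary threshold of 100, such as the 36.53 reported here for 1,5-pentanediol, flags an ingredient of special concern.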