Sample records for complete probabilistic risk

  1. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independently of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and that preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks, including the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely with specific dual-process models.
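
    The mediation structure at the heart of these models reduces to products of path coefficients. A minimal sketch in Python, with invented path values (not the study's estimates), of how an indirect effect via perception of risk would be computed:

    ```python
    # Hypothetical path coefficients for a simple mediation chain; these are
    # illustrative placeholders, not values reported by Denovan et al.
    a = -0.30        # probabilistic reasoning (e.g., conjunction fallacy) -> perception of risk
    b = 0.45         # perception of risk -> terrorism-related behavior change
    c_direct = 0.10  # direct path: probabilistic reasoning -> behavior change

    indirect = a * b             # effect transmitted through perception of risk
    total = c_direct + indirect  # total effect in a linear path model

    print(f"indirect = {indirect:.3f}, total = {total:.3f}")
    ```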

  2. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models. PMID:29062288

  3. Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.P.; Stover, R.L.; Hashimoto, P.S.

    1989-01-01

    Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed.

  4. On a true value of risk

    NASA Astrophysics Data System (ADS)

    Kozine, Igor

    2018-04-01

    The paper suggests looking at probabilistic risk quantities and concepts through the prism of accepting one of two views: that a true value of risk exists, or that it does not. It is argued that discussions until now have primarily focused on closely related topics that are different from the topic of the current paper. The paper examines and contrasts the operational consequences of adhering to each view. It is demonstrated that the operational differences in which probabilistic measures can be assessed, and in how they can be interpreted, are tangible. In particular, this concerns prediction intervals, the use of Bayes' rule, models of complete ignorance, hierarchical models of uncertainty, assignment of probabilities over a possibility space and interpretation of derived probabilistic measures. Behavioural implications of favouring either view are also briefly described.

  5. Progress report on the Worldwide Earthquake Risk Management (WWERM) Program

    USGS Publications Warehouse

    Algermissen, S.T.; Hays, Walter W.; Krumpe, Paul R.

    1992-01-01

    Considerable progress has been made in the Worldwide Earthquake Risk Management (WWERM) Program since its initiation in late 1989 as a cooperative program of the Agency for International Development (AID), Office of U.S. Foreign Disaster Assistance (OFDA), and the U.S. Geological Survey. Probabilistic peak acceleration and peak Modified Mercalli intensity (MMI) maps have been prepared for Chile and for Sulawesi province in Indonesia. Earthquake risk (loss) studies for dwellings in Gorontalo, North Sulawesi, have been completed and risk studies for dwellings in selected areas of central Chile are underway. A special study of the effect of site response on earthquake ground motion estimation in central Chile has also been completed and indicates that site response may modify the ground shaking by as much as plus or minus two units of MMI. A program for the development of national probabilistic ground motion maps for the Philippines is now underway and pilot studies of earthquake ground motion and risk are being planned for Morocco.

  6. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
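
    The modularized-simulation idea can be illustrated with a toy Monte Carlo: sample the number of impacts over the period of interest, then sample consequences per impact, and report distributional risk metrics rather than a single mean. All inputs below are invented placeholders, not the authors' calibrated distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N_YEARS, N_RUNS = 100, 50_000
    P_IMPACT_PER_YEAR = 1e-4   # assumed annual probability of a damaging NEO impact

    # Number of impacts in each simulated 100-year history.
    n_impacts = rng.binomial(N_YEARS, P_IMPACT_PER_YEAR, size=N_RUNS)

    # Heavy-tailed (lognormal) casualty model per impact -- a placeholder shape.
    casualties = np.array([rng.lognormal(8.0, 2.5, k).sum() if k else 0.0
                           for k in n_impacts])

    print("mean casualties per 100 y:", casualties.mean())
    print("99th percentile:", np.quantile(casualties, 0.99))
    print("P(at least one impact):", (n_impacts > 0).mean())
    ```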

  7. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Students need probabilistic thinking to make decisions, and elementary school students are required to develop it as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion, and credibility of the data was established through time triangulation. The boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers in formulating probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  8. Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.

    PubMed

    Wilhelm, C J; Mitchell, S H

    2008-10-01

    Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
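
    The two tasks are conventionally scored with hyperbolic discounting functions: V = A/(1 + kD) for delay, and V = A/(1 + h*theta) for probability, where theta = (1 - p)/p is the odds against reward delivery. A small sketch with invented parameters (steeper discounting corresponding to the HAD-like pattern):

    ```python
    def delay_discounted_value(amount, delay, k):
        """Hyperbolic delay discounting: V = A / (1 + k*D)."""
        return amount / (1.0 + k * delay)

    def probability_discounted_value(amount, p, h):
        """Hyperbolic probability discounting over odds-against theta = (1-p)/p."""
        theta = (1.0 - p) / p
        return amount / (1.0 + h * theta)

    # Illustrative parameter pairs only; not fitted values from the study.
    for label, k, h in [("steep (HAD-like)", 0.50, 2.0),
                        ("shallow (LAD-like)", 0.05, 0.5)]:
        v_d = delay_discounted_value(100, delay=30, k=k)
        v_p = probability_discounted_value(100, p=0.5, h=h)
        print(f"{label}: delayed value {v_d:.1f}, probabilistic value {v_p:.1f}")
    ```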

  9. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
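
    The arithmetic behind a probabilistically derived exposure limit can be sketched as a Monte Carlo division of a point of departure by lognormally uncertain adjustment factors; the 90% confidence interval of the resulting HDMI then spans the kind of fold-range the abstract reports. The distributions below are assumptions for illustration, not the paper's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 200_000

    def lognormal(p50, p95_ratio, size):
        """Lognormal from a median and a P95/P50 uncertainty ratio."""
        sigma = np.log(p95_ratio) / 1.645
        return rng.lognormal(np.log(p50), sigma, size)

    pod = lognormal(10.0, 2.0, N)        # point of departure, mg/kg-day (assumed)
    af_inter = lognormal(4.0, 3.0, N)    # animal-to-human adjustment (assumed)
    af_intra = lognormal(3.0, 4.0, N)    # human variability at incidence I (assumed)

    hd_mi = pod / (af_inter * af_intra)  # target human dose HD_M^I
    lo, med, hi = np.quantile(hd_mi, [0.05, 0.50, 0.95])
    print(f"90% CI: {lo:.3f}-{hi:.3f} mg/kg-day (median {med:.3f}, {hi/lo:.0f}-fold)")
    ```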

  10. The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.

    2010-01-01

    HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.
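
    A toy cut-set calculation shows how a human error probability (HEP) from an HRA enters a PRA and where training or design improvements would show up. All numbers are illustrative, not from any NASA model:

    ```python
    # Loss requires a hardware failure AND failure of both recovery paths
    # (independence assumed for this sketch).
    p_valve_fails = 1e-3            # hardware demand-failure probability
    p_crew_misses_recovery = 5e-2   # HEP for the manual backup action (from HRA)
    p_auto_backup_fails = 1e-2

    p_loss = p_valve_fails * p_crew_misses_recovery * p_auto_backup_fails
    print(f"cut-set contribution: {p_loss:.2e}")

    # Sensitivity to the human action: halving the HEP through training halves
    # this cut set -- the kind of insight the abstract says a quality HRA provides.
    p_loss_trained = p_valve_fails * (p_crew_misses_recovery / 2) * p_auto_backup_fails
    print(f"with improved training: {p_loss_trained:.2e}")
    ```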

  11. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Chiu, Weihsueh A; Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  12. Risk taking and adult attention deficit/hyperactivity disorder: A gap between real life behavior and experimental decision making.

    PubMed

    Pollak, Yehuda; Shalit, Reut; Aran, Adi

    2018-01-01

    Adults with attention deficit/hyperactivity disorder (ADHD) are prone to suboptimal decision making and risk taking. The aim of this study was to test performance on a theoretically-based probabilistic decision making task in well-characterized adults with and without ADHD, and to examine the relation between experimental risk taking and history of real-life risk-taking behavior, defined as cigarette, alcohol, and street drug use. University students with and without ADHD completed a modified version of the Cambridge Gambling Test, in which they had to choose between alternatives varying in level of risk, and reported their history of substance use. Both groups showed similar patterns of risk taking on the experimental decision making task, suggesting that ADHD is not linked to low sensitivity to risk. Past and present substance use was more prevalent in adults with ADHD. These findings question the validity of the experimental probabilistic decision making task as a model for ADHD-related risk-taking behavior.

  13. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 5: Auxiliary shuttle risk analyses

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.

  14. Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.

    PubMed

    Avramenko, M; Bolyatko, V; Kosterev, V

    2005-01-01

    Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.

  15. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies across situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. Testing the theories with the data from three experimental studies makes evident the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk. When the probabilistic theories are tested against each other, decision field theory provides the best account of the observed behavior.
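
    A common way to turn a deterministic value theory into a probabilistic one is to pass the difference in subjective values through a logit (softmax) choice rule, so the same pair of gambles yields a choice probability rather than a fixed choice. A minimal sketch using expected value as the core theory (the paper's theories use richer value functions):

    ```python
    import math

    def expected_value(gamble):
        """gamble: list of (probability, outcome) pairs."""
        return sum(p * x for p, x in gamble)

    def p_choose_A(gamble_A, gamble_B, theta):
        """Logit choice rule: larger theta -> closer to deterministic choice."""
        diff = expected_value(gamble_A) - expected_value(gamble_B)
        return 1.0 / (1.0 + math.exp(-theta * diff))

    A = [(0.8, 40), (0.2, 0)]   # EV = 32
    B = [(1.0, 30)]             # EV = 30
    for theta in (0.05, 0.5, 5.0):
        print(f"theta={theta}: P(choose A) = {p_choose_A(A, B, theta):.3f}")
    ```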

  16. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data.
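
    The final risk characterization step amounts to comparing a PEC distribution against a PNEC distribution derived from the probabilistic species sensitivity distribution. A compact sketch with placeholder distributions loosely shaped like the abstract's reported quantiles:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 100_000

    pec = rng.lognormal(np.log(0.12), 1.2, N)   # predicted env. concentration, ug/L (assumed shape)
    pnec = rng.lognormal(np.log(25.0), 1.0, N)  # no-effect concentration from the PSSD (assumed)

    rq = pec / pnec                             # risk quotient
    print("median RQ:", np.median(rq))
    print("P(RQ > 1):", (rq > 1).mean())        # probability of exceeding the no-effect level
    ```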

  17. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  18. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact on ISS and on future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and in the PRA trade studies being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA and commercial spacecraft, for making crew rescue decisions, and for setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs) and robotic operations. This paper describes some applications of the ISS PRA model and how they impacted the final decision, and discusses future analysis topics such as life extension and the requirements of new commercial vehicles visiting ISS.

  19. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
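
    The core computation in elements (1)-(4) can be sketched as sampling primitive variables, forming a response margin, and reading off the failure probability and the response CDF. The distributions below are assumed for illustration only:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 1_000_000

    load = rng.normal(100.0, 15.0, N)                 # applied stress, MPa (assumed)
    strength = rng.lognormal(np.log(160.0), 0.10, N)  # resistance, MPa (assumed)

    margin = strength - load
    print(f"failure probability P(margin <= 0): {(margin <= 0).mean():.2e}")

    # Points on the cumulative distribution function of the response margin.
    for q in (0.01, 0.50, 0.99):
        print(f"margin P{int(q*100):02d}: {np.quantile(margin, q):.1f} MPa")
    ```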

  20. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation and control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
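
    With sparse data, each conditional-probability-table entry of such a network can be estimated by conjugate Bayesian updating rather than raw frequencies. A minimal single-node sketch with invented counts:

    ```python
    # Beta prior encodes generic/handbook knowledge; sparse plant data update it.
    prior_alpha, prior_beta = 1.0, 9.0   # prior mean failure-on-demand ~ 0.1
    failures, demands = 2, 7             # sparse observed data (hypothetical)

    post_alpha = prior_alpha + failures
    post_beta = prior_beta + (demands - failures)
    cpt_entry = post_alpha / (post_alpha + post_beta)  # posterior mean for the CPT

    print(f"P(fails | demand) = {cpt_entry:.3f} after {demands} observations")
    ```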

  1. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  2. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  3. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
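
    The scheme can be sketched as each agent applying a fixed column-stochastic transition matrix whose stationary distribution is the desired swarm density; propagating the density shows both convergence and the self-repair property. A toy 3-bin example (matrix chosen by hand for illustration, not taken from the paper):

    ```python
    import numpy as np

    # Columns sum to 1; the stationary distribution is the desired density.
    M = np.array([[0.80, 0.10, 0.05],
                  [0.15, 0.80, 0.15],
                  [0.05, 0.10, 0.80]])

    x = np.array([1.0, 0.0, 0.0])   # all agents start in bin 0
    for _ in range(200):
        x = M @ x                   # independent probabilistic transitions, no communication
    print("converged density:", x.round(3))

    # Self-repair: perturb the density ("damage") and iterate the same rule.
    x = np.array([0.60, 0.10, 0.30])
    for _ in range(200):
        x = M @ x
    print("after repair:   ", x.round(3))
    ```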

  4. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  5. WIPCast: Probabilistic Forecasting for Aviation Decision Aid Applications

    DTIC Science & Technology

    2011-06-01

    traders, or families planning an outing – manage weather-related risk. By quantifying risk, probabilistic forecasting enables optimization of actions via...confidence interval to the user's risk tolerance helps drive highly effective and innovative decision support mechanisms for visually quantifying risk for

  6. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology basically validates the risk parameters of the project or system design. For high risk or high dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, resulting in more efficient and cost effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, helping deliver a safe and cost effective product.

  7. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems faced by the risk assessment of nanoparticles is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.

  8. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet, probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operator characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment.
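
    The central calculation the guide teaches is Bayes' theorem applied to a positive CHR assessment. A short sketch with hypothetical accuracy and prevalence figures (not values from the paper):

    ```python
    def positive_predictive_value(prevalence, sensitivity, specificity):
        """Bayes' theorem: post-test probability given a positive CHR assessment."""
        p_pos = (prevalence * sensitivity
                 + (1.0 - prevalence) * (1.0 - specificity))
        return prevalence * sensitivity / p_pos

    sens, spec, prev = 0.96, 0.85, 0.15   # hypothetical instrument and sample
    ppv = positive_predictive_value(prev, sens, spec)
    lr_plus = sens / (1.0 - spec)         # positive likelihood ratio
    print(f"PPV = {ppv:.2f}, LR+ = {lr_plus:.1f}")
    ```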

  9. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  10. Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions

    EPA Pesticide Factsheets

    Presents general concepts and principles of Probabilistic Risk Assessment (PRA), describes how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing the uncertainty in decision making.

  11. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    NASA Technical Reports Server (NTRS)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  12. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor-Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  13. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  14. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  15. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
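
    For reference, the deterministic three-field functional underlying the PHWVP has the standard form below; in the probabilistic version the constitutive tensor, loads and boundary data become random variables or fields. A sketch of the functional (notation assumed here, not copied from the paper):

    ```latex
    \Pi_{HW}(u,\varepsilon,\sigma)
      = \int_{\Omega} \left[ \tfrac{1}{2}\,\varepsilon : C : \varepsilon
        + \sigma : \left( \nabla^{s} u - \varepsilon \right) - b \cdot u \right] d\Omega
      - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
    ```

    Stationarity with respect to the independent fields u, epsilon and sigma recovers equilibrium, the constitutive law and compatibility, which is what lets probabilistic distributions enter each ingredient separately.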

  16. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
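
    The workhorse of PEA is Monte Carlo propagation of variability through a standard intake equation such as ADD = (C x IR x EF x ED)/(BW x AT). A compact sketch with assumed, illustrative distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    C = rng.lognormal(np.log(2.0), 0.5, N)   # concentration in water, mg/L (assumed)
    IR = rng.normal(1.4, 0.3, N).clip(0.2)   # intake rate, L/day (assumed)
    BW = rng.normal(70.0, 12.0, N).clip(30)  # body weight, kg (assumed)
    EF_ED_over_AT = (350 * 30) / (365 * 30)  # exposure frequency/duration over averaging time

    add = C * IR * EF_ED_over_AT / BW        # average daily dose, mg/kg-day
    print("median ADD:", np.median(add))
    print("P95 ADD:", np.quantile(add, 0.95))
    ```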

  17. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
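
    The proposed modification replaces two of the three categorical RPN scores with relative frequencies, so the rate of failures that occur and escape detection becomes a computable quantity, p_occ x (1 - p_det). A small sketch with invented failure modes and numbers:

    ```python
    failure_modes = {
        # name: (p_occurrence per run, p_detection given occurrence, severity 1-10)
        "wrong reference spectrum": (0.002, 0.95, 8),
        "operator skips blank":     (0.010, 0.80, 5),
        "probe misalignment":       (0.005, 0.90, 7),
    }

    total = 0.0
    for name, (p_occ, p_det, sev) in failure_modes.items():
        p_undetected = p_occ * (1.0 - p_det)  # occurs AND escapes detection
        total += p_undetected
        print(f"{name}: undetected rate {p_undetected:.1e}, severity {sev}")

    # For small rates the sum approximates 1 - prod(1 - p_i).
    print(f"procedure-level undetected failure rate ~ {total:.1e} per run")
    ```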

  18. The Experimental Breeder Reactor II seismic probabilistic risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roglans, J; Hill, D J

    1994-02-01

    The Experimental Breeder Reactor II (EBR-II) is a US Department of Energy (DOE) Category A research reactor located at Argonne National Laboratory (ANL)-West in Idaho. EBR-II is a 62.5 MW-thermal Liquid Metal Reactor (LMR) that started operation in 1964 and is currently being used as a testbed in the Integral Fast Reactor (IFR) Program. ANL has completed a Level 1 Probabilistic Risk Assessment (PRA) for EBR-II. The Level 1 PRA for internal events and most external events was completed in June 1991. The seismic PRA for EBR-II has recently been completed. The EBR-II reactor building contains the reactor, the primary system, and the decay heat removal systems. The reactor vessel, which contains the core, and the primary system, consisting of two primary pumps and an intermediate heat exchanger, are immersed in the sodium-filled primary tank, which is suspended by six hangers from a beam support structure. Three systems or functions in EBR-II were identified as the most significant from the standpoint of risk of seismic-induced fuel damage: (1) the reactor shutdown system, (2) the structural integrity of the passive decay heat removal systems, and (3) the integrity of major structures, like the primary tank containing the reactor, that could threaten both the reactivity control and decay heat removal functions. As part of the seismic PRA, efforts were concentrated on studying these three functions or systems. The passive safety response of the EBR-II reactor -- both passive reactivity shutdown and passive decay heat removal, demonstrated in a series of tests in 1986 -- was explicitly accounted for in the seismic PRA, as it had been included in the internal events assessment.

  19. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  20. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
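
    Numerically, the risk integral is a discretized convolution of the hazard curve's occurrence density with the fragility curve; the post-mainshock adaptation raises the hazard (aftershocks) and shifts the fragility (mainshock damage). A sketch with invented curves:

    ```python
    import numpy as np
    from scipy.stats import lognorm

    im = np.linspace(0.05, 3.0, 300)               # intensity measure, e.g. Sa in g

    hazard = 1e-2 * (im / 0.1) ** -2.5             # annual exceedance rate (assumed)
    fragility = lognorm.cdf(im, s=0.5, scale=0.8)  # P(collapse | IM), median 0.8 g (assumed)

    density = -np.gradient(hazard, im)             # occurrence density of IM
    rate_pre = np.trapz(fragility * density, im)
    print(f"pre-mainshock collapse rate: {rate_pre:.2e} /yr")

    # Post-mainshock: elevated aftershock hazard, damaged (more fragile) building.
    hazard_as = 5.0 * hazard
    fragility_dmg = lognorm.cdf(im, s=0.5, scale=0.6)
    rate_post = np.trapz(fragility_dmg * -np.gradient(hazard_as, im), im)
    print(f"post-mainshock collapse rate: {rate_post:.2e} /yr-equivalent")
    ```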

  1. Extravehicular Activity Probabilistic Risk Assessment Overview for Thermal Protection System Repair on the Hubble Space Telescope Servicing Mission

    NASA Technical Reports Server (NTRS)

    Bigler, Mark; Canga, Michael A.; Duncan, Gary

    2010-01-01

    The Shuttle Program initiated an Extravehicular Activity (EVA) Probabilistic Risk Assessment (PRA) to assess the risks associated with performing a Shuttle Thermal Protection System (TPS) repair during the Space Transportation System (STS)-125 Hubble repair mission as part of risk trades between TPS repair and crew rescue.

  2. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  3. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored altogether due to the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. To carry out this framework, some steps are needed: (1) using physically based stochastic landslide and flood models, we aim to calculate the probability of the physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that needs to be studied.

  4. Risk assessment for produced water discharges to Louisiana open bays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meinhold, A.F.; Holtzman, S.; DePhillips, M.P.

    1995-11-01

    Potential human health and environmental impacts from the discharge of produced water to the Gulf of Mexico concern regulators at the State and Federal levels, environmental interest groups, industry and the public. Current regulations in the United States require or propose a zero discharge limit for coastal facilities, based primarily on studies performed in low-energy, poorly flushed environments. Produced water discharges in coastal Louisiana, however, include a number located in open bays, where potential impacts are likely to be larger than the minimal impacts associated with offshore discharges, but smaller than those demonstrated in low-energy canal environments. This paper summarizes results of a conservative screening-level health and ecological assessment for contaminants discharged in produced water to open bays in Louisiana, and reports results of a probabilistic human health risk assessment for radium and lead. The initial human health and ecological risk assessments consisted of conservative screening analyses that identified potentially important contaminants and excluded others from further consideration. A more quantitative probabilistic risk assessment was completed for the human health effects of the two contaminants identified in this screen: radium and lead. This work is part of a series of studies on the health and ecological risks from discharges of produced water to the Gulf of Mexico, supported by the United States Department of Energy (USDOE).

  5. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
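
    Of the non-probabilistic approaches listed in this review, interval analysis is the simplest to illustrate. In the sketch below, inputs to a hazard-quotient-style ratio are known only as intervals; because the ratio is monotone in each input, exact bounds follow from evaluating at the interval endpoints. All numbers are hypothetical.

    # Interval bounds on dose = conc * intake / weight, monotone in each input
    conc   = (0.02, 0.08)   # mg/L DBP concentration (hypothetical interval)
    intake = (1.0, 3.0)     # L/day water ingestion
    weight = (50.0, 90.0)   # kg body weight
    rfd    = 0.01           # mg/kg-day reference dose (illustrative)

    dose_lo = conc[0] * intake[0] / weight[1]  # smallest numerator, largest denominator
    dose_hi = conc[1] * intake[1] / weight[0]  # largest numerator, smallest denominator

    print(f"hazard quotient interval: [{dose_lo / rfd:.3f}, {dose_hi / rfd:.3f}]")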

  6. 76 FR 55717 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... Subcommittee on Reliability and Probabilistic Risk Assessment The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on September 20, 2011, Room T-2B1, 11545 Rockville Pike... Memorandum on Modifying the Risk-Informed Regulatory Guidance for New Reactors. The Subcommittee will hear...

  7. 75 FR 18205 - Notice of Peer Review Meeting for the External Peer Review Drafts of Two Documents on Using...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled, ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision- Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and...

  8. 76 FR 28102 - Notice of Issuance of Regulatory Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-13

    ..., Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its...

  9. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  10. Finite element probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacvarov, D.C.

    1981-01-01

    A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for analysis of three-dimensional spaces where the parameters, such as trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, thus yielding high accuracy in critical regions, such as the area of the low-probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem at a manageably low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very low probability events.

  11. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high-energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model fitted to the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation environment will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
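
    A non-homogeneous Poisson process like the one fitted here can be sampled by Lewis-Shedler thinning: draw candidate events from a homogeneous process at the peak rate and accept each with probability intensity(t)/peak rate. The sketch below does this with a sinusoidal intensity standing in for an 11-year solar cycle; the intensity function is an illustrative assumption, not the authors' fitted model.

    import numpy as np

    rng = np.random.default_rng(0)

    def intensity(t_years):
        # SPEs per year, peaking mid-cycle (illustrative shape only)
        return 6.0 + 5.0 * np.sin(2.0 * np.pi * t_years / 11.0)

    def sample_nhpp(t_end, lam_max):
        t, events = 0.0, []
        while True:
            t += rng.exponential(1.0 / lam_max)         # candidate from HPP
            if t > t_end:
                return np.array(events)
            if rng.uniform() < intensity(t) / lam_max:  # thinning step
                events.append(t)

    times = sample_nhpp(t_end=11.0, lam_max=11.0)       # lam_max bounds intensity
    print(f"{times.size} simulated SPEs; first few times (yr): {times[:5].round(2)}")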

  12. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high-energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model fitted to the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation environment will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  13. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Sciences to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  14. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmitri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal, based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References: Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  15. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
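
    The multiplicative WMO form quoted above is easy to state concretely: with hazard as an annual event probability, vulnerability as an expected damage fraction, and exposure as the value at risk, the product is an expected annual loss. The inputs in the sketch below are hypothetical placeholders, not values from the Seaside or Crescent City studies.

    def risk(hazard: float, vulnerability: float, exposure: float) -> float:
        """Risk = Hazard x Vulnerability x Exposure (WMO [2008] form).

        hazard: annual probability of the damaging event
        vulnerability: expected damage fraction given the event
        exposure: value at risk, e.g. USD
        """
        return hazard * vulnerability * exposure

    # e.g. a 1%/yr inundation probability, 40% damage ratio, $50M exposed stock
    print(f"expected annual loss: ${risk(0.01, 0.4, 50e6):,.0f}")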

  16. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G.; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  17. A mediation model to explain decision making under conditions of risk among adolescents: the role of fluid intelligence and probabilistic reasoning.

    PubMed

    Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina

    2014-01-01

    This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) was explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability, which acted as a mediator. Specifically, fluid intelligence may enhance the ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by strengthening the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
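
    The mediation structure described above (fluid intelligence acting on decisions through probabilistic reasoning) is conventionally quantified as the product of two regression slopes with a bootstrap confidence interval. A minimal sketch on synthetic data follows; the variable names, effect sizes, and bootstrap settings are hypothetical, not the authors' dataset or analysis software.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 282
    x = rng.normal(size=n)                        # fluid intelligence (z-scores)
    m = 0.5 * x + rng.normal(scale=0.8, size=n)   # probabilistic reasoning
    y = 0.6 * m + 0.1 * x + rng.normal(size=n)    # decision-making score

    def indirect(x, m, y):
        a = np.polyfit(x, m, 1)[0]                        # slope of M ~ X
        design = np.column_stack([m, x, np.ones(len(x))])
        b = np.linalg.lstsq(design, y, rcond=None)[0][0]  # slope of Y ~ M, given X
        return a * b

    # Percentile bootstrap for the indirect effect a*b
    boot = [indirect(x[i], m[i], y[i])
            for i in (rng.integers(0, n, n) for _ in range(2000))]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect = {indirect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")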

  18. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  19. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  20. [Theoretical evaluation of the risk of decompression illness during simulated extravehicular activity].

    PubMed

    Nikolaev, V P

    2008-01-01

    A theoretical analysis of the risk of decompression illness (DI) during extravehicular activity following the Russian and NASA decompression protocols (D-R and D-US, respectively) was performed. In contrast to the traditional approach of evaluating decompression stress by the degree of tissue supersaturation with nitrogen, our probabilistic theory of decompression safety provides a fully reasoned evaluation and comparison of the levels of hazard of these decompression protocols. According to this theory, the function of cumulative DI risk is equal to the sum of the functions of cumulative risk of lesion of all body tissues by gas bubbles and of their supersaturation with dissolved gases. Based on modeling the dynamics of these functions, growth of the cumulative DI risk in the course of D-R and D-US follows essentially similar trajectories within time frames of up to 330 minutes. However, further extension of D-US, but not of D-R, raises the risk of DI drastically.

  1. A probabilistic approach to radiative energy loss calculations for optically thick atmospheres - Hydrogen lines and continua

    NASA Technical Reports Server (NTRS)

    Canfield, R. C.; Ricchiazzi, P. J.

    1980-01-01

    An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and an ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.

  2. Probabilistic risk analysis of mercury intake via food consumption in Spain.

    PubMed

    Moreno-Ortega, Alicia; Moreno-Rojas, Rafael; Martínez-Álvarez, Jesús Román; González Estecha, Montserrat; Castro González, Numa Pompilio; Amaro López, Manuel Ángel

    2017-09-01

    In Spain, public institutions have recently issued advice to the population on fish consumption and the health risk posed by the mercury supposedly contained in fish, and several scientific societies have published work along the same lines. Until now, however, there has been no probabilistic assessment of the risk from mercury due to fish and seafood intake in Spain, which is the objective of the present work. For that purpose, we took individual data from a survey of the total diet of 3000 people, whose consumption of the principal fish and seafood species (49) was estimated. We compiled individualized data (2000) on the total mercury content of those species, which were completed and validated with bibliographic statistical data. After estimating the distributions for each fish and seafood species, both of consumption and of mercury content, the distribution of mercury intake from fish and seafood was simulated, indicating that 2.6% of the Spanish population is at risk of exceeding total mercury recommendations, and between 12.2% and 21.2% of exceeding those for methylmercury. The main species responsible were tuna, swordfish and hake. Significant differences in fish consumption were identified between sexes and ages, although the risk percentage notably increases with age. Copyright © 2017 Elsevier GmbH. All rights reserved.

  3. Application of Probabilistic Methods to Assess Risk Due to Resonance in the Design of J-2X Rocket Engine Turbine Blades

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; DeHaye, Michael; DeLessio, Steven

    2011-01-01

    The LOX-Hydrogen J-2X Rocket Engine, which is proposed for use as an upper-stage engine for numerous earth-to-orbit and heavy-lift launch vehicle architectures, is presently in the design phase and will move shortly to the initial development test phase. Analysis of the design has revealed numerous potential resonance issues with hardware in the turbomachinery turbine-side flow path. The analysis of the fuel pump turbine blades requires particular care because resonant failure of the blades, which rotate in excess of 30,000 revolutions per minute (RPM), could be catastrophic for the engine and the entire launch vehicle. This paper describes a series of probabilistic analyses performed to assess the risk of failure of the turbine blades due to resonant vibration during past and present test series. Some significant results are that the probability of failure during a single complete engine hot-fire test is low (1%) because of the small likelihood of resonance, but that the probability increases to around 30% for a more focused turbomachinery-only test, because all speeds will be ramped through and there is a greater likelihood of dwelling at more speeds. These risk calculations have been invaluable to program management in deciding whether risk-reduction methods such as dampers are necessary immediately or whether testing can proceed before the risk-reduction hardware is ready.

  4. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  5. 76 FR 11525 - Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA), Room T-2B1, 11545 Rockville Pike, Rockville, Maryland...

  6. 76 FR 22934 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on May 11, 2011, Room T-2B3, 11545...

  7. 76 FR 71609 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-18

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on December 14, 2011, Room T-2B3...

  8. Adolescents' Heightened Risk-Seeking in a Probabilistic Gambling Task

    ERIC Educational Resources Information Center

    Burnett, Stephanie; Bault, Nadege; Coricelli, Giorgio; Blakemore, Sarah-Jayne

    2010-01-01

    This study investigated adolescent males' decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 adolescents (aged 9-11), 26 young adolescents (aged 12-15), 20 mid-adolescents (aged 15-18) and 17…

  9. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in the decision making for the shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? How likely is it to occur? What are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  10. Probabilistic Description of the Hydrologic Risk in Agriculture

    NASA Astrophysics Data System (ADS)

    Vico, G.; Porporato, A. M.

    2011-12-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climatic variability on agroecosystem productivity and profitability, at the expense of increasing water requirements for irrigation purposes. Optimizing water allocation for crop yield preservation and sustainable development needs to account for hydro-climatic variability, which is by far the main source of uncertainty affecting crop yields and irrigation water requirements. In this contribution, a widely applicable probabilistic framework is proposed to quantitatively define the hydrologic risk of yield reduction for both rainfed and irrigated agriculture. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season. Based on these linkages, long-term and real-time yield reduction risk indices are defined as functions of climate, soil and crop parameters, as well as irrigation strategy. The former risk index is suitable for long-term irrigation strategy assessment and investment planning, while the latter provides a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season. This probabilistic framework also allows assessing the impact of limited water availability on crop yield, thus guiding the optimal allocation of water resources for human and environmental needs. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios, facilitating the assessment of the impact of increasingly frequent water shortages on agricultural productivity, profitability, and sustainability.

  11. The benefits of probabilistic exposure assessment: three case studies involving contaminated air, water, and soil.

    PubMed

    Finley, B; Paustenbach, D

    1994-02-01

    Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
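
    The factor-of-5 and 99.9th-percentile findings come from compounding conservative inputs. The sketch below reproduces the effect for a generic tapwater ingestion dose: a point estimate built from 95th-percentile concentration and intake paired with a low-end body weight lands far above the Monte Carlo 95th percentile. The distributions are illustrative, not those recommended in the paper.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    conc = rng.lognormal(np.log(0.005), 0.6, n)  # mg/L contaminant in tapwater
    ir   = rng.lognormal(np.log(1.4), 0.3, n)    # L/day ingestion rate
    bw   = rng.normal(70.0, 10.0, n)             # kg body weight

    dose = conc * ir / bw                        # mg/kg-day

    # Regulatory-style point estimate: upper-bound inputs compounded
    point = np.percentile(conc, 95) * np.percentile(ir, 95) / 50.0

    print(f"point estimate sits at the {100 * np.mean(dose < point):.1f}th "
          f"percentile of the simulated dose distribution")
    print(f"true 95th percentile dose: {np.percentile(dose, 95):.2e} mg/kg-day")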

  12. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed the risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrates the relevance of accounting for the full range of flood events and their relation to both potential damages and benefits in risk assessments. Management measures may thus be designed to reflect local contexts and support benefits of natural hydrologic processes, while minimizing flood damage.
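
    One way to compare probabilistic damages and benefits across the full range of flood events is to integrate each against annual exceedance probability (the reciprocal of the recurrence interval), giving expected annual values. The event magnitudes and curves below are hypothetical, not the Candaba estimates.

    import numpy as np

    return_periods = np.array([1.3, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])  # years
    p_exceed = 1.0 / return_periods               # annual exceedance probability

    # Hypothetical damage and livelihood benefit per event size (million $)
    damage  = np.array([0.5, 1.0, 3.0, 6.0, 12.0, 20.0, 33.0])
    benefit = np.array([9.0, 15.0, 30.0, 45.0, 60.0, 75.0, 91.0])

    def expected_annual(value, p):
        # Trapezoidal area under the value vs. exceedance-probability curve
        return np.sum(0.5 * (value[:-1] + value[1:]) * (p[:-1] - p[1:]))

    print(f"expected annual damage:  ${expected_annual(damage, p_exceed):.1f}M")
    print(f"expected annual benefit: ${expected_annual(benefit, p_exceed):.1f}M")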

  13. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in geometric and material properties on the structural response of turbopump blades. For this study, a number of important probabilistic variables were identified which are believed to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a Space Shuttle Main Engine blade geometry, using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  14. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  15. 76 FR 18586 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on April 20, 2011, Room T-2B1, 11545...

  16. Lossed in translation: an off-the-shelf method to recover probabilistic beliefs from loss-averse agents.

    PubMed

    Offerman, Theo; Palley, Asa B

    2016-01-01

    Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk-neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of 1/2, and (ii) for moderate beliefs agents simply report 1/2. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
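
    The baseline mechanism being modified here is a strictly proper scoring rule, under which a risk-neutral agent maximizes expected payoff by reporting exactly their belief. The sketch below verifies this numerically for the quadratic (Brier) rule; it illustrates only the benchmark property, not the paper's loss-aversion correction.

    import numpy as np

    def quadratic_score(report, outcome):
        # Payoff 1 - (outcome - report)^2, with outcome in {0, 1}
        return 1.0 - (outcome - report) ** 2

    belief = 0.7                              # true subjective P(event)
    reports = np.linspace(0.0, 1.0, 101)
    expected = (belief * quadratic_score(reports, 1)
                + (1.0 - belief) * quadratic_score(reports, 0))

    print(f"belief = {belief}, optimal risk-neutral report = "
          f"{reports[np.argmax(expected)]:.2f}")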

  17. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  18. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  19. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented, concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shocks; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and data bases and special applications.

  20. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.

  1. Diffusion tensor tractography of the arcuate fasciculus in patients with brain tumors: Comparison between deterministic and probabilistic models

    PubMed Central

    Li, Zhixi; Peck, Kyung K.; Brennan, Nicole P.; Jenabi, Mehrnaz; Hsu, Meier; Zhang, Zhigang; Holodny, Andrei I.; Young, Robert J.

    2014-01-01

    Purpose: The purpose of this study was to compare the deterministic and probabilistic tracking methods of diffusion tensor white matter fiber tractography in patients with brain tumors. Materials and Methods: We identified 29 patients with left brain tumors <2 cm from the arcuate fasciculus who underwent pre-operative language fMRI and DTI. The arcuate fasciculus was reconstructed using a deterministic Fiber Assignment by Continuous Tracking (FACT) algorithm and a probabilistic method based on an extended Monte Carlo Random Walk algorithm. Tracking was controlled using two ROIs corresponding to Broca's and Wernicke's areas. Tracts in tumor-affected hemispheres were examined for extension between Broca's and Wernicke's areas, anterior-posterior length and volume, and compared with the normal contralateral tracts. Results: Probabilistic tracts displayed more complete anterior extension to Broca's area than did FACT tracts on the tumor-affected and normal sides (p < 0.0001). The median length ratio for tumor:normal sides was greater for probabilistic tracts than FACT tracts (p < 0.0001). The median tract volume ratio for tumor:normal sides was also greater for probabilistic tracts than FACT tracts (p = 0.01). Conclusion: Probabilistic tractography reconstructs the arcuate fasciculus more completely and performs better through areas of tumor and/or edema. The FACT algorithm tends to underestimate the anterior-most fibers of the arcuate fasciculus, which are crossed by primary motor fibers. PMID:25328583

  2. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method, which accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta=25%), closest to the source area, followed by the inner estuary (delta=4%) and the Floodvale tributary (delta=2%), with the lowest risk in the outer estuary (delta=0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
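
    The joint-probability-curve idea can be sketched by Monte Carlo: draw an exposure concentration and a species effect threshold from their respective distributions and estimate delta as the probability that exposure exceeds the threshold. The lognormal parameters below are illustrative only, not the Penrhyn Estuary data.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 500_000

    exposure    = rng.lognormal(np.log(5.0), 1.0, n)   # VCH exposure, ug/L
    sensitivity = rng.lognormal(np.log(50.0), 0.8, n)  # species NOEC, ug/L

    delta = np.mean(exposure > sensitivity)
    print(f"risk delta ~ {100.0 * delta:.1f}%")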

  3. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with presence of environmental pollutants. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entrance of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not represent a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.

  4. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    NASA Astrophysics Data System (ADS)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consists of some conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model matches both the experimental data and is consistent with classical probability theory.

  5. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations for assessing parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, observed body burden concentrations fall into the distributions simulated at both sites.

  6. Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires

    Treesearch

    Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes

    2010-01-01

    Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...

  7. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
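
    The outer Monte Carlo loop of a PAIR-style assessment can be sketched compactly: sample uncertain impactor properties, convert them to impact energy, and aggregate scenario outcomes against a damage threshold. The distributions and the 1 Mt threshold below are illustrative assumptions, not the published model inputs.

    import numpy as np

    rng = np.random.default_rng(11)
    n = 1_000_000

    diameter = rng.lognormal(np.log(30.0), 0.8, n)            # m
    density  = rng.uniform(1500.0, 3500.0, n)                 # kg/m^3
    velocity = rng.normal(20_000.0, 5_000.0, n).clip(11_000)  # m/s, >= escape speed

    mass = density * np.pi / 6.0 * diameter**3                # spherical impactor, kg
    energy_mt = 0.5 * mass * velocity**2 / 4.184e15           # megatons TNT

    threshold_mt = 1.0                                        # assumed damage threshold
    print(f"P(energy > {threshold_mt} Mt) = {np.mean(energy_mt > threshold_mt):.3f}")
    print(f"median scenario energy: {np.median(energy_mt):.2f} Mt")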

  8. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  9. Evidence-based risk communication: a systematic review.

    PubMed

    Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A

    2014-08-19

    Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
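
    The absolute-versus-relative distinction the review highlights is easy to make concrete. With hypothetical event rates, the same trial result can be framed three ways:

    ```python
    # Hypothetical trial: 4% event rate untreated, 3% treated.
    control_rate = 0.04
    treated_rate = 0.03

    arr = control_rate - treated_rate   # absolute risk reduction: 0.01
    rrr = arr / control_rate            # relative risk reduction: 25%
    nnt = 1 / arr                       # number needed to treat: 100

    print(f"ARR = {arr:.3f}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
    ```

    A "25% relative reduction" sounds far larger than a "1 percentage point absolute reduction", which is consistent with the review's finding that relative formats are more likely to sway decisions to accept therapy.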

  10. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment does not remain simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) the lack of knowledge of input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) for various time horizons. The LandSim-simulated concentrations of six heavy metals in groundwater have then been used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability, and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn, and Fe for the male and female populations has been quantified and found to be high (HI>1) for all the considered time horizons, which indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
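
    One step of such a hybrid calculation can be sketched as follows. All numbers are invented for illustration; the study's actual membership functions, distributions, and reference doses are not reproduced here.

    ```python
    # Hybrid step: a fuzzy intake is cut at a chosen alpha level (yielding an
    # interval), random variables are Monte Carlo sampled, and each sample
    # then yields an interval for the hazard quotient
    # HQ = conc * intake / (bw * RfD).
    import numpy as np

    def alpha_cut(low, mode, high, alpha):
        """Interval of a triangular fuzzy number at membership level alpha."""
        return low + alpha * (mode - low), high - alpha * (high - mode)

    rng = np.random.default_rng(1)
    alpha = 0.5
    intake_lo, intake_hi = alpha_cut(1.0, 2.0, 3.0, alpha)  # intake (L/day)
    rfd = 0.02                                  # reference dose (mg/kg/day)

    n = 10_000
    conc = rng.lognormal(np.log(0.05), 1.0, n)  # metal concentration (mg/L)
    bw = rng.normal(60.0, 8.0, n)               # body weight (kg)

    hq_lo = conc * intake_lo / (bw * rfd)       # lower HQ bound per sample
    hq_hi = conc * intake_hi / (bw * rfd)       # upper HQ bound per sample
    print("P(upper-bound HQ > 1) =", np.mean(hq_hi > 1.0))
    ```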

  11. Cost-Risk Trade-off of Solar Radiation Management and Mitigation under Probabilistic Information on Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann

    2017-04-01

    In principle, solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However, we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), a decision-analytic framework that trades off the expected welfare loss from climate policy costs against the climate risks from transgressing a climate target. In both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA to include not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for the following three valuation scenarios: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. For now, the Giorgi regions are weighted equally. We find that for regionally differentiated precipitation targets, the usage of SRM is comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number reduces by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it saves only 70% to 75% of the welfare loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when regional precipitation risks are considered in the precipitation-risk-only and both-risks scenarios. It remains to be shown how the inclusion of further risks or different regional weights would change this picture.

  12. Quantum structure in economics: The Ellsberg paradox

    NASA Astrophysics Data System (ADS)

    Aerts, Diederik; Sozzo, Sandro

    2012-03-01

    The expected utility hypothesis and Savage's Sure-Thing Principle are violated in real-life decisions, as shown by the Allais and Ellsberg paradoxes. The popular explanation in terms of ambiguity aversion is not completely accepted. As a consequence, uncertainty is still problematical in economics. To overcome these difficulties a distinction between risk and ambiguity has been introduced which depends on the existence of a Kolmogorovian probabilistic structure modeling these uncertainties. On the other hand, evidence from everyday life suggests that context plays a fundamental role in human decisions under uncertainty. Moreover, it is well known from physics that any probabilistic structure modeling contextual interactions between entities structurally needs a non-Kolmogorovian framework admitting a quantum-like representation. For this reason, we have recently introduced a notion of contextual risk to mathematically capture situations in which ambiguity occurs. We prove in this paper that the contextual risk approach can be applied to the Ellsberg paradox, and we elaborate a sphere model within our hidden measurement formalism which reveals that it is the overall conceptual landscape that is responsible for the disagreement between actual human decisions and the predictions of expected utility theory, which generates the paradox. This result points to the presence of a quantum conceptual layer in human thought which is superposed on the usually assumed classical logical layer, and it conceptually supports the thesis of several authors suggesting the presence of quantum structure in economics and decision theory.

  13. Costing the satellite power system

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1978-01-01

    The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, cost estimating relationships, grass-roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.

  14. Comparative assessment of analytical approaches to quantify the risk for introduction of rare animal diseases: the example of avian influenza in Spain.

    PubMed

    Sánchez-Vizcaíno, Fernando; Perez, Andrés; Martínez-López, Beatriz; Sánchez-Vizcaíno, José Manuel

    2012-08-01

    Trade in animals and animal products imposes an uncertain and variable risk of exotic animal disease introduction into importing countries. Risk analysis provides importing countries with an objective, transparent, and internationally accepted method for assessing that risk. Over the last decades, European Union countries have quite frequently conducted probabilistic risk assessments to quantify the risk of rare animal disease introduction into their territories. Most probabilistic animal health risk assessments have typically been classified into one-level and multilevel binomial models. One-level models are simpler than multilevel models because they assume that animals or products originate from one single population. However, it is unknown whether such simplification may result in substantially different results compared to those obtained through the use of multilevel models. Here, data used in a probabilistic multilevel binomial model formulated to assess the risk of highly pathogenic avian influenza introduction into Spain were reanalyzed using a one-level binomial model and their outcomes were compared. An alternative ordinal model is also proposed here, which makes use of simpler assumptions and less information compared to those required by traditional one-level and multilevel approaches. Results suggest that, at least under certain circumstances, results of the one-level and ordinal approaches are similar to those obtained using multilevel models. Consequently, we argue that, when data are insufficient to run traditional probabilistic models, the ordinal approach presented here may be a suitable alternative for ranking exporting countries in terms of the risk that they impose for the spread of rare animal diseases into disease-free countries. © 2012 Society for Risk Analysis.
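
    In its simplest form, the one-level binomial model reduces to a single closed-form expression. The numbers below are invented; only the structure matters.

    ```python
    # One-level binomial introduction model: n independent imports, each
    # infected with probability p; risk = P(at least one infected import).
    p = 1e-4   # per-bird infection probability (hypothetical)
    n = 5_000  # birds imported per year (hypothetical)

    risk = 1 - (1 - p) ** n
    print(f"annual introduction risk = {risk:.3f}")  # about 0.39
    ```

    A multilevel model would instead first sample a flock-level prevalence and then the within-flock infection status of each animal, which is exactly the extra structure whose practical payoff the authors question.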

  15. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods, respectively. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to increase further in the near future, in the face of the projected alterations of rainfall patterns and increases in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability while maintaining sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields, the corresponding probability density function of required irrigation volumes, and the probability density function of yields under the most common case of limited water availability are obtained analytically, as functions of irrigation strategy, climate, soil, and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, knowledge of the probability density function of yield allows us to quantify the hydrologic risk of yield reduction. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.

  17. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  18. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  19. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  20. Alcohol Increases Delay and Probability Discounting of Condom-Protected Sex: A Novel Vector for Alcohol-Related HIV Transmission.

    PubMed

    Johnson, Patrick S; Sweeney, Mary M; Herrmann, Evan S; Johnson, Matthew W

    2016-06-01

    Alcohol use, especially at binge levels, is associated with sexual HIV risk behavior, but the mechanisms through which alcohol increases sexual risk taking are not well examined. Delay discounting, that is, devaluation of future consequences as a function of delay to their occurrence, has been implicated in a variety of problem behaviors, including risky sexual behavior. Probability discounting is studied within a similar framework as delay discounting, but is a distinct process in which a consequence is devalued because it is uncertain or probabilistic. Twenty-three nondependent alcohol users (13 male, 10 female; mean age = 25.3 years) orally consumed alcohol (1 g/kg) or placebo in 2 separate experimental sessions. During sessions, participants completed tasks examining delay and probability discounting of hypothetical condom-protected sex (Sexual Delay Discounting Task, Sexual Probability Discounting Task) and of hypothetical and real money. Alcohol decreased the likelihood that participants would wait to have condom-protected sex versus having immediate, unprotected sex. Alcohol also decreased the likelihood that participants would use an immediately available condom given a specified level of sexually transmitted infection (STI) risk. Alcohol did not affect delay discounting of money, but it did increase participants' preferences for larger, probabilistic monetary rewards over smaller, certain rewards. Acute, binge-level alcohol intoxication may increase sexual HIV risk by decreasing willingness to delay sex in order to acquire a condom in situations where one is not immediately available, and by decreasing sensitivity to perceived risk of STI contraction. These findings suggest that delay and probability discounting are critical, but heretofore unrecognized, processes that may mediate the relations between alcohol use and HIV risk. Copyright © 2016 by the Research Society on Alcoholism.
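
    The two discounting constructs are commonly formalized with single-parameter hyperbolic forms (Mazur's delay model and Rachlin's odds-against probability model). A sketch with arbitrary parameter values; the study's tasks are hypothetical-choice analogues of these equations:

    ```python
    # Hyperbolic delay discounting and probability discounting. Larger k or h
    # means steeper devaluation of delayed or uncertain outcomes.
    def delayed_value(amount, delay, k):
        """Delay discounting: V = A / (1 + k*D)."""
        return amount / (1.0 + k * delay)

    def probabilistic_value(amount, p, h):
        """Probability discounting: V = A / (1 + h*theta), theta = (1-p)/p."""
        odds_against = (1.0 - p) / p
        return amount / (1.0 + h * odds_against)

    print(delayed_value(100, delay=30, k=0.05))      # 40.0
    print(probabilistic_value(100, p=0.25, h=1.0))   # 25.0
    ```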

  1. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). This framework considers the following areas:
    - Probabilistic models to provide information specific to advanced SMRs
    - Representation of specific SMR design issues such as co-located modules and passive safety features
    - Use of modern, open-source, and readily available analysis methods
    - Internal and external events resulting in impacts to safety
    - All-hazards considerations
    - Methods to support the identification of design vulnerabilities
    - Mechanistic and probabilistic data needs to support modeling and tools
    In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  2. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic FEM analysis code.

  3. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT nodes include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  4. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; et al.

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods for performing risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s; fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was, however, not to happen. Early in the Apollo program, estimates of the probability of a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values, and NASA became discouraged from performing further quantitative risk analyses until some two decades later, when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

  5. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
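
    The purely probabilistic leg of such a comparison can be sketched directly: epistemic uncertainty about the basic event probabilities is expressed as distributions, sampled, and propagated through the fault tree logic. The tree and the distributions below are invented, not the article's example.

    ```python
    # Epistemic Monte Carlo over a small fault tree: top = (A AND B) OR C,
    # with independent basic events. Each sample is one epistemic "world".
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000

    pA = rng.lognormal(np.log(1e-3), 0.5, n)  # epistemic distribution of P(A)
    pB = rng.lognormal(np.log(2e-3), 0.5, n)  # ... of P(B)
    pC = rng.lognormal(np.log(1e-4), 0.5, n)  # ... of P(C)

    p_top = 1.0 - (1.0 - pA * pB) * (1.0 - pC)  # OR of C with (A AND B)

    print("median P(top event):", np.median(p_top))
    print("95th percentile:", np.quantile(p_top, 0.95))
    ```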

  6. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden lead to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This work involves close collaboration between government agencies, research communities, and stakeholders. Three examples are cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled in an equally rigorous way to those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  7. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.

  8. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of sigma PAH8 in the surface water of Taihu Lake. The results from MCS and LHS were probability distributions of the hazard quotient of sigma PAH8, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.000 18-0.89 and 0.000 17-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of sigma PAH8 to aquatic organisms ranged from 0.000 17 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.001 5, 0.016 3) calculated using fuzzy set theory, and (0.000 16, 0.88) based on variance propagation. These results indicated that the ecological risk of sigma PAH8 to aquatic organisms was low. Each method is based on a different theory and has its own advantages and limitations; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate to assess the risk of sigma PAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the management and control of organic pollutants in water.
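
    The probabilistic side of the comparison is easy to reproduce in outline. The sketch below contrasts plain Monte Carlo sampling with Latin Hypercube sampling for a hazard quotient; all distributions are illustrative stand-ins, not the Taihu Lake data.

    ```python
    # MCS vs. LHS for a hazard quotient HQ = concentration / toxicity.
    import numpy as np
    from scipy.stats import qmc, lognorm

    rng = np.random.default_rng(3)
    n = 10_000

    # Plain Monte Carlo sampling.
    conc_mc = rng.lognormal(np.log(0.1), 1.0, n)   # exposure concentration
    tox_mc = rng.lognormal(np.log(1.0), 0.8, n)    # effect threshold
    hq_mc = conc_mc / tox_mc

    # Latin Hypercube: stratified uniforms, then invert the marginal CDFs.
    u = qmc.LatinHypercube(d=2, seed=3).random(n)
    conc_lhs = lognorm.ppf(u[:, 0], s=1.0, scale=0.1)
    tox_lhs = lognorm.ppf(u[:, 1], s=0.8, scale=1.0)
    hq_lhs = conc_lhs / tox_lhs

    print("P(HQ > 1), MCS:", np.mean(hq_mc > 1.0))
    print("P(HQ > 1), LHS:", np.mean(hq_lhs > 1.0))
    ```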

  9. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA consist in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.

  10. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system-scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that received considerable emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps in understanding failure mechanisms and improving component and system design. PRA is a system-scenario-based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk-informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.

  11. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  12. A probabilistic asteroid impact risk model: assessment of sub-300 m impacts

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2017-06-01

    A comprehensive asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain input parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions for objects up to 300 m in diameter. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data have little effect on the metrics of interest.

  13. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. Up to 15% of crew time may be spent on EVA, with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for NASA mission planning. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.

  14. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    This study presents a risk assessment methodology, and its application, for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic model for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely ultrasonic inspection with an identified flaw indication and ultrasonic inspection without a flaw indication, are considered in the derivation. To estimate fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and from fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic steam turbine rotor application, and a risk analysis under given safety criteria is provided to support maintenance planning.
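
    The "no flaw indication" scenario illustrates the flavor of the derivation: the prior flaw-size density is reweighted by the probability that a flaw of that size would have been missed by the inspection. The POD curve and the prior below are invented, not the paper's calibrated models.

    ```python
    # "No indication" update: posterior(a) ~ (1 - POD(a)) * prior(a).
    import numpy as np
    from scipy.stats import norm

    a = np.linspace(0.01, 10.0, 2000)   # flaw size grid (mm)
    da = a[1] - a[0]

    # Log-normal POD curve (invented): 50% detection at 1 mm.
    pod = norm.cdf((np.log(a) - np.log(1.0)) / 0.6)

    # Invented lognormal prior density for flaw size.
    prior = norm.pdf((np.log(a) - np.log(0.5)) / 0.8) / (a * 0.8)
    prior /= prior.sum() * da           # normalize on the grid

    posterior = (1.0 - pod) * prior     # Bayes: missed-flaw likelihood x prior
    posterior /= posterior.sum() * da

    mean_prior = (a * prior).sum() * da
    mean_post = (a * posterior).sum() * da
    print(f"prior mean {mean_prior:.2f} mm -> posterior mean {mean_post:.2f} mm")
    ```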

  15. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  16. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight-of-evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight-of-evidence approaches ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables given the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
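
    At its smallest scale, the probabilistic calculus involved is just Bayes' rule over a directed graph. A hand-rolled two-evidence network with invented conditional probabilities, computed by direct enumeration as a network engine would do at this size:

    ```python
    # Two lines of evidence E1, E2, conditionally independent given the
    # causal hypothesis C; posterior P(C | e1, e2) by enumeration.
    prior_c = 0.30                      # prior belief the stressor is causal
    p_e1 = {True: 0.80, False: 0.20}    # P(E1 observed | C)
    p_e2 = {True: 0.60, False: 0.30}    # P(E2 observed | C)

    def posterior(e1: bool, e2: bool) -> float:
        def joint(c: bool) -> float:
            l1 = p_e1[c] if e1 else 1.0 - p_e1[c]
            l2 = p_e2[c] if e2 else 1.0 - p_e2[c]
            return (prior_c if c else 1.0 - prior_c) * l1 * l2
        return joint(True) / (joint(True) + joint(False))

    print(posterior(e1=True, e2=True))    # agreeing evidence: ~0.77
    print(posterior(e1=True, e2=False))   # conflicting evidence: ~0.49
    ```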

  17. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  18. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different risk assessment methods for construction projects of transport infrastructure objects. Managing projects of this type demands special probabilistic methods because of the large uncertainty involved in their implementation; risk management in such projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments for projects. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Any competent specialist can calculate the damage from the occurrence of a risk event; assessing the probability of occurrence of a risk event, however, requires special probabilistic methods based on the proposed robust approaches. Practice shows the approach to be effective and its results reliable. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
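
    The robust-versus-classical contrast can be illustrated with invented data: cost-overrun observations containing outliers. The sample mean is dragged toward the outliers, while a Huber M-estimator (a robust, maximum-likelihood-type estimate) stays near the bulk of the data. This stands in for the paper's robust procedures, whose exact form the abstract does not specify.

    ```python
    # Huber M-estimation of a typical cost overrun, with outliers present.
    import numpy as np
    from scipy.optimize import minimize_scalar

    overruns = np.array([4.0, 5.5, 3.8, 6.1, 4.9, 5.2, 38.0, 45.0])  # % overrun

    def huber_loss(mu, delta=1.5):
        # Quadratic near mu, linear in the tails: bounded outlier influence.
        r = overruns - mu
        quad = np.minimum(np.abs(r), delta)
        return np.sum(0.5 * quad**2 + delta * (np.abs(r) - quad))

    robust_mu = minimize_scalar(huber_loss, bounds=(0, 50), method="bounded").x
    print("sample mean:", overruns.mean())   # pulled toward the outliers (~14)
    print("Huber estimate:", robust_mu)      # close to the typical projects (~5)
    ```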

  19. Probabilistic Assessment of Cancer Risk for Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2009-01-01

    During future lunar missions, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon transit. NASA's new lunar program anticipates that up to 15% of crew time may be on EVA, with minimal radiation shielding. For the operational challenge of responding to events of unknown size and duration, a probabilistic risk assessment approach is essential for mission planning and design. Using the historical database of proton measurements during the past 5 solar cycles, a typical hazard function for SPE occurrence was defined using a non-homogeneous Poisson model as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions ranging from the 5th to 95th percentile of particle fluences for a specified mission period were simulated. Organ doses corresponding to particle fluences at the median and at the 95th percentile for a specified mission period were assessed using NASA's baryon transport model, BRYNTRN. The cancer fatality risk for astronauts as a function of age, gender, and solar cycle activity was then analyzed. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated. Future work will involve using this probabilistic risk assessment approach to SPE forecasting, combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
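
    The occurrence side of such a model can be sketched with a non-homogeneous Poisson process sampled by thinning. The hazard function below is an invented smooth bump over a 4000-day cycle, standing in for the hazard fitted from the five-cycle proton database.

    ```python
    # Sample SPE occurrence times from a non-homogeneous Poisson process by
    # thinning (Lewis-Shedler): propose events at a constant bounding rate,
    # accept each with probability hazard(t) / lam_max.
    import numpy as np

    rng = np.random.default_rng(11)
    T = 4000.0  # cycle length (days)

    def hazard(t):
        """Illustrative SPE rate (events/day), peaking mid-cycle."""
        return 0.02 * np.exp(-0.5 * ((t - 2000.0) / 700.0) ** 2)

    lam_max = 0.02  # upper bound of hazard on [0, T]

    def sample_spe_times():
        times, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / lam_max)
            if t > T:
                return times
            if rng.uniform() < hazard(t) / lam_max:
                times.append(t)

    counts = [len(sample_spe_times()) for _ in range(1000)]
    print("mean SPEs per simulated cycle:", np.mean(counts))
    ```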

  20. [Self-esteem level and correlation to risk behaviour of students at the University of Almería (Spain)].

    PubMed

    Muñoz-París, M José; Ruiz-Muñoz, Ana del Mar

    2008-01-01

    To evaluate self-esteem levels in college students at the University of Almería (Spain) and their possible correlation with risk behaviors, specifically drug use and sexual behavior. We performed an observational, descriptive, cross-sectional, prolective study. A self-completed questionnaire was used to gather data. Students attending specific university services of the University of Almería were selected by non-probabilistic sampling. Self-esteem was measured using Coopersmith's scale. In the 123 students studied, self-esteem was very low in 7.9%, medium-low in 29.3%, medium in 12.2%, medium-high in 46.3%, and very high in 4.9%. No significant differences were found between the sexes. No significant correlation was found between sexual behavior and level of self-esteem. Consumption of alcohol, cannabis, cocaine, designer drugs, and amphetamines was higher in groups with higher self-esteem. Self-esteem is important in every sphere of life and can be considered a basic human need. Self-esteem increases the level of personal security and has been described as a protective factor against risk behaviors. However, our data indicate increased drug consumption among young people with higher self-esteem. Given the importance of the topic and the novelty of our results, in future studies we intend to broaden the sample and perform probabilistic stratified sampling in order to extrapolate the results to the entire population of the University of Almería.

  1. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  2. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to the use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, and focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
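
    A minimal example of the probabilistic framing, with invented values: full-shift exposures modelled as lognormal, and risk reported as an exceedance probability rather than a binary acceptability call.

    ```python
    # Exceedance fraction: probability that a random shift exposure exceeds
    # the occupational exposure limit (OEL), under a lognormal model.
    import numpy as np
    from scipy.stats import norm

    gm = 0.12    # geometric mean exposure, mg/m^3 (hypothetical)
    gsd = 2.5    # geometric standard deviation (hypothetical)
    oel = 0.50   # occupational exposure limit, mg/m^3 (hypothetical)

    z = (np.log(oel) - np.log(gm)) / np.log(gsd)
    exceedance = 1.0 - norm.cdf(z)
    print(f"exceedance fraction = {exceedance:.1%}")  # about 6%
    ```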

  3. Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS.

    PubMed

    Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

    2009-12-20

    This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole-effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from 5 wastewater treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements, and precise distance measurements using Geographical Information Systems (GIS). The model was implemented in a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk and revealed that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances which might be causing sublethal effects in the aquatic species present in the Henares River.

  4. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools that synthesize multidisciplinary integration, probabilistic analysis, and optimization are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high-fidelity model of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  5. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner that is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via traditional PRA techniques, i.e., event trees and fault trees, in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  6. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
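
    As an illustration of how a bagged ensemble of regression trees yields a loss distribution rather than a point estimate, the following minimal sketch trains such a model on synthetic data; the features, their ranges, and the response are invented placeholders, not the BT-FLEMO data or implementation. The spread of the per-tree predictions is what carries the uncertainty information discussed above.

```python
# A minimal sketch of probabilistic loss estimation with bagged regression
# trees, in the spirit of BT-FLEMO but not the authors' implementation.
# Features, ranges, and the response are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)

# Hypothetical training data: [water depth (m), duration (h), precaution score]
X = rng.uniform([0.0, 1.0, 0.0], [3.0, 72.0, 1.0], size=(500, 3))
# Synthetic relative loss in [0, 1], for illustration only
y = np.clip(0.2 * X[:, 0] + 0.002 * X[:, 1] - 0.1 * X[:, 2]
            + rng.normal(0.0, 0.05, 500), 0.0, 1.0)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                         random_state=0).fit(X, y)

# For a query case, each tree yields one loss estimate, so the ensemble
# delivers a distribution of loss rather than a single point value.
x_new = np.array([[1.5, 24.0, 0.5]])
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print(f"median loss ratio: {np.median(per_tree):.3f}, "
      f"90% interval: [{np.quantile(per_tree, 0.05):.3f}, "
      f"{np.quantile(per_tree, 0.95):.3f}]")
```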

  7. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive because they require multiple tsunami numerical simulations, which limits their versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the response variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that the generation of earthquakes follows a Poisson process, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation depth, and vulnerability, and based on these we conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation without conducting multiple additional tsunami numerical simulations.
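
    The Monte Carlo chain described in this abstract can be sketched directly from the reported response surface. Note that the input distributions for fault depth and slip, and the fragility parameters, are assumed below purely for illustration, so the script will not reproduce the reported 22.5%.

```python
# Sketch of the response-surface Monte Carlo described above. The
# regression coefficients come from the abstract; all distributions
# are illustrative assumptions.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)
n = 1_000_000  # simulated events, conditioned on an earthquake occurring

x1 = rng.uniform(5.0, 25.0, n)   # fault depth (km); assumed range
x2 = rng.uniform(0.5, 5.0, n)    # fault slip (m); assumed range

# Response surface from the abstract: inundation depth y (m)
y = np.maximum(0.2615 * x1 + 3.1763 * x2 - 1.1802, 0.0)

# Log-normal fragility of a wood building: P(damage | inundation depth).
# Median (2.0 m) and log-standard deviation (0.5) are assumed values.
p_damage = lognorm.cdf(y, s=0.5, scale=2.0)

print(f"expected damage probability given an earthquake: {p_damage.mean():.3f}")
```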

  8. A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, typical risk management process, origins of risk matrix, performance measures, performance objectives and Bayes theorem.

  9. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and aims to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, owing to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.

  10. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  11. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments.

    PubMed

    Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A

    2017-11-06

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
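
    A minimal sketch of the kind of probabilistic comparison described above, for a single hypothetical handler scenario; all distributions and the NOEL below are invented placeholders rather than values from AHED/PHED or the EPA assessments.

```python
# Illustrative Monte Carlo for one handler scenario: sample unit exposure,
# amount handled, and body weight; compare the resulting dose distribution
# with a no-observable-effect level. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Dermal unit exposure (mg per lb of active ingredient handled), assumed lognormal
unit_exposure = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)
amount_handled = rng.uniform(40.0, 400.0, n)   # lb a.i. handled per day, assumed
body_weight = rng.normal(80.0, 12.0, n)        # kg, assumed

dose = unit_exposure * amount_handled / body_weight  # mg/kg/day

noel = 0.5  # mg/kg/day, hypothetical no-observable-effect level
frac_exceeding = np.mean(dose > noel)
print(f"fraction of simulated workers above the level of concern: {frac_exceeding:.1%}")
```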

  12. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.

    2016-12-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  13. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for three main reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins; (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources; and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As a vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased, at no increase in total delineated area, by simply swapping delineated low-risk areas for previously non-delineated high-risk areas. We also show that further improvements are often available at only a small increase in delineated area. Depending on the context, increases or reductions of delineated area translate directly into costs and benefits if the land is priced, or if land owners must be compensated for land-use restrictions.

  14. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.

    2017-12-01

    The tsunamis that have occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  15. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn

    2017-04-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  16. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot take full advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be represented, according to its nature, as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain, and the propagation of hybrid uncertainties is then carried out in the same domain. This methodology makes direct use of the original site information as far as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  17. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses from European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to developing physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms in both time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  18. A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites

    NASA Astrophysics Data System (ADS)

    Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.

    2010-12-01

    Excavation sites such as those associated with the construction of subway lines, railways and highway tunnels are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on a probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practices and it employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from the reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain. It is used to estimate and minimize risks for a potential construction site of an underground station for the new subway line in the Barcelona metropolitan area.
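
    The combination step can be illustrated with a toy fault tree. The tree structure and basic-event probabilities below are hypothetical; in the study above, the basic-event probabilities would come from stochastic groundwater simulations rather than fixed values.

```python
# Toy fault-tree evaluation by Monte Carlo: sample basic events, apply
# the tree's AND/OR logic, and estimate the top-event probability.
# Structure and probabilities are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Basic events (per excavation stage), assumed probabilities:
dewatering_fails = rng.random(n) < 0.02   # pumping system failure
high_inflow      = rng.random(n) < 0.10   # groundwater inflow exceeds design
wall_leaks       = rng.random(n) < 0.05   # diaphragm wall leakage

# Fault tree: flooding of the excavation requires high inflow AND
# (dewatering failure OR wall leakage)
flooding = high_inflow & (dewatering_fails | wall_leaks)
print(f"P(flooding) ~ {flooding.mean():.4f}")
# Analytical check under independence: 0.10 * (1 - 0.98 * 0.95)
print(f"analytical:   {0.10 * (1 - 0.98 * 0.95):.4f}")
```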

  19. Risk assessment for furan contamination through the food chain in Belgian children.

    PubMed

    Scholl, Georges; Huybrechts, Inge; Humblet, Marie-France; Scippo, Marie-Louise; De Pauw, Edwin; Eppe, Gauthier; Saegerman, Claude

    2012-08-01

    Young, old, pregnant, and immuno-compromised persons are of great concern for risk assessors as they represent the sub-populations most at risk. The present paper focuses on risk assessment linked to furan exposure in children. Only the Belgian population was considered, because the individual contamination and consumption data required for accurate risk assessment were available for Belgian children only. Two risk assessment approaches, the so-called deterministic and probabilistic approaches, were applied, and the results for the estimated daily intake were compared. A significant difference in the average Estimated Daily Intake (EDI) was found between the deterministic (419 ng kg⁻¹ body weight (bw) day⁻¹) and the probabilistic (583 ng kg⁻¹ bw day⁻¹) approaches, which results from the mathematical treatment of the null consumption and contamination data. The risk was characterised in two ways: (1) the classical approach, comparing the EDI to a reference dose (RfD(chronic-oral)), and (2) the more recent Margin of Exposure (MoE) approach. Both reached similar conclusions: the risk level is not of major concern, but neither is it negligible. In the first approach, only 2.7% or 6.6% of the studied population (in the deterministic and probabilistic modes, respectively) presented an EDI above the RfD(chronic-oral). In the second approach, the percentage of children displaying an MoE above 10,000 is 3% versus 0%, and below 100 is 20% versus 0.01%, in the deterministic and probabilistic modes, respectively. In addition, children were compared to adults, and significant differences between the contamination patterns were highlighted. While the major contamination was linked to coffee consumption in adults (55%), no single item predominantly contributed to the contamination in children; the most important were soups (19%), dairy products (17%), pasta and rice (11%), and fruit and potatoes (9% each).
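
    The two risk-characterisation routes can be sketched as follows; the consumption and contamination distributions, body weight, reference dose, and benchmark dose are all invented placeholders, not the Belgian survey data.

```python
# Sketch of EDI-versus-RfD and Margin-of-Exposure characterisation.
# All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
bw = 20.0  # child body weight, kg (assumed)

# Hypothetical per-day intake from two food groups: consumption (g/day)
# times furan contamination (ng/g), sampled to mimic the probabilistic mode.
soup  = rng.lognormal(np.log(150), 0.6, n) * rng.lognormal(np.log(20), 0.5, n)
dairy = rng.lognormal(np.log(200), 0.5, n) * rng.lognormal(np.log(10), 0.5, n)

edi = (soup + dairy) / bw           # ng/kg bw/day

rfd = 1000.0                        # ng/kg bw/day, hypothetical RfD
bmdl = 960_000.0                    # ng/kg bw/day, hypothetical benchmark dose
moe = bmdl / edi                    # Margin of Exposure per simulated child

print(f"mean EDI: {edi.mean():.0f} ng/kg bw/day")
print(f"fraction with EDI > RfD: {np.mean(edi > rfd):.1%}")
print(f"fraction with MoE < 10,000: {np.mean(moe < 10_000):.1%}")
```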

  20. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  1. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents a methodology, and a practical example of its application, for using an optimization process to select earthquake scenarios that best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth, and fault characteristics. Such a catalogue provides full distributions of events in time, space, and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear programming formulation is developed in this study. This approach yields a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a catalogue representing 10,000 years of events and consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computational power. The authors have used this approach for risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
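
    The scenario-reduction idea can be illustrated with a toy example. Note that this sketch uses a simple greedy forward selection with rate re-weighting in place of the paper's mixed-integer linear program, and synthetic rates and site intensities in place of the 84,000-event catalogue.

```python
# Toy illustration of hazard-consistent scenario reduction: pick a small
# subset of scenarios whose re-weighted hazard curve best matches the
# full catalogue's hazard curve.
import numpy as np

rng = np.random.default_rng(5)
n_full, n_keep = 500, 20
rates = rng.exponential(1e-4, n_full)                 # annual occurrence rates
intensity = rng.lognormal(np.log(0.1), 0.8, n_full)   # site PGA (g), synthetic

levels = np.linspace(0.05, 1.0, 30)                   # hazard-curve abscissa

def hazard_curve(r, a):
    # Annual rate of exceeding each ground-motion level
    return np.array([(r * (a > x)).sum() for x in levels])

H_full = hazard_curve(rates, intensity)

selected = []
for _ in range(n_keep):
    best, best_err = -1, np.inf
    for i in range(n_full):
        if i in selected:
            continue
        idx = selected + [i]
        # Re-scale the kept rates so the total event rate is preserved
        w = rates[idx] * rates.sum() / rates[idx].sum()
        err = np.abs(H_full - hazard_curve(w, intensity[idx])).sum()
        if err < best_err:
            best, best_err = i, err
    selected.append(best)

print(f"kept {len(selected)} of {n_full} scenarios; "
      f"hazard-curve L1 error: {best_err:.3e}")
```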

  2. Relationship between Adolescent Risk Preferences on a Laboratory Task and Behavioral Measures of Risk-taking

    PubMed Central

    Rao, Uma; Sidhartha, Tanuj; Harker, Karen R.; Bidesi, Anup S.; Chen, Li-Ann; Ernst, Monique

    2010-01-01

    Purpose The goal of the study was to assess individual differences in risk-taking behavior among adolescents in the laboratory. A second aim was to evaluate whether laboratory-based risk-taking behavior is associated with other behavioral and psychological measures of risk-taking. Methods Eighty-two adolescents with no personal history of psychiatric disorder completed a computerized decision-making task, the Wheel of Fortune (WOF). By offering choices between clearly defined probabilities and real monetary outcomes, this task assesses risk preferences when participants are confronted with potential rewards and losses. The participants also completed a variety of behavioral and psychological measures associated with risk-taking behavior. Results Performance on the task varied based on the probability and anticipated outcomes. In the winning sub-task, participants selected the low probability-high magnitude reward (high-risk choice) less frequently than the high probability-low magnitude reward (low-risk choice). In the losing sub-task, participants selected the low probability-high magnitude loss more often than the high probability-low magnitude loss. On average, the selection of probabilistic rewards was optimal and similar to performance in adults. There were, however, individual differences in performance, and one-third of the adolescents made the high-risk choice more frequently than the low-risk choice when selecting a reward. After controlling for sociodemographic and psychological variables, high-risk choice on the winning task predicted "real-world" risk-taking behavior and substance-related problems. Conclusions These findings highlight individual differences in risk-taking behavior. Preliminary data on the face validity of the WOF task suggest that it might be a valuable laboratory tool for studying behavioral and neurobiological processes associated with risk-taking behavior in adolescents. PMID:21257113

  3. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality from forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with confidence, in order to effectively target resources to the areas most at risk for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts, not only for public health decision makers but also within a multi-sectoral climate service framework.
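
    The colour-mapping idea lends itself to a compact sketch: a ternary forecast is a barycentric point, so it can be rendered as a mix of three corner colours, with saturation scaled by information gain relative to the uniform reference. The corner colours and the use of relative entropy as the gain measure below are illustrative choices, not necessarily those of the paper.

```python
# Map a ternary forecast (p_low, p_med, p_high) to an RGB colour via
# barycentric mixing, fading to white when the forecast carries no
# information beyond the climatological reference (1/3, 1/3, 1/3).
import numpy as np

CORNERS = np.array([[0.0, 0.3, 1.0],   # low risk  -> blue
                    [1.0, 0.8, 0.0],   # medium    -> yellow
                    [1.0, 0.0, 0.0]])  # high risk -> red

def forecast_colour(p, eps=1e-12):
    p = np.asarray(p, dtype=float)
    assert p.shape == (3,) and abs(p.sum() - 1.0) < 1e-6
    rgb = p @ CORNERS                      # barycentric colour mix
    # Information gain vs the uniform reference via relative entropy,
    # normalised by its maximum log(3) to give a 0..1 saturation factor.
    kl = np.sum(p * np.log((p + eps) / (1.0 / 3.0)))
    sat = min(kl / np.log(3.0), 1.0)
    white = np.ones(3)
    return (1.0 - sat) * white + sat * rgb  # fade to white when uninformative

print(forecast_colour([1/3, 1/3, 1/3]))  # ~white: no information gain
print(forecast_colour([0.1, 0.2, 0.7]))  # reddish: confident high-risk
```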

  4. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to the Space Shuttle environment, particularly to the potential of losing the vehicle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. Loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with losses due to Orbiter failure, external tank failure, and landing failure or error.

  5. On the Measurement and Properties of Ambiguity in Probabilistic Expectations

    ERIC Educational Resources Information Center

    Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn

    2015-01-01

    Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…

  6. Reliability, Risk and Cost Trade-Offs for Composite Designs

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1996-01-01

    Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed, and the probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (design parameter) in the longitudinal direction.

  7. Risk Assessment Guidance for Superfund (RAGS) Volume III: Part A

    EPA Pesticide Factsheets

    EPA's Risk Assessment Guidance for Superfund (RAGS) Volume 3A provides policies and guiding principles on the application of probabilistic risk assessment (PRA) methods to human health and ecological risk assessment in the EPA Superfund Program.

  8. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of: 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas, as in a national seismic hazard mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure; the result may be used to prepare emergency response plans, retrofit existing construction, or apply community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and which system components are most susceptible to failure, and allow evaluation of the combined effects of a severe earthquake on a whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction, but economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and the resultant loss of income produce widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase the resilience of vulnerable systems and communities.
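
    The risk equation in this abstract reduces to a one-line calculation once the three factors are quantified; the figures below are invented for illustration. A full assessment would integrate over all hazard levels rather than use a single shaking probability.

```python
# Minimal numerical sketch of expected loss = hazard x vulnerability x
# exposure. All figures are hypothetical.
annual_p_shaking = 0.01       # P(damaging shaking at the site in a year)
damage_ratio = 0.25           # expected fraction of value lost given shaking
exposure_value = 2_000_000.0  # replacement value of the facility (USD)

expected_annual_loss = annual_p_shaking * damage_ratio * exposure_value
print(f"expected annual loss: ${expected_annual_loss:,.0f}")
```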

  9. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for the Western Balkans, covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides, and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground-motion severity into monetary loss.

  10. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. We describe the ongoing effort to produce a comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards with a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of onshore seismic sources and the associated active faults, for which we provide rupture properties); 2) mass failures and collapses around the target area, spatially identified on the basis of their propensity to failure; and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei, and Ischia). All these natural sources are analyzed in a preliminary way and combined, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed for the case of significant temporary variations in the short-term PTHA due to an earthquake.

  11. Using Probabilistic Methods in Water Scarcity Assessments: A First Step Towards a Water Scarcity Risk Assessment Framework

    NASA Technical Reports Server (NTRS)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Phillip

    2016-01-01

    Water scarcity, driven by climate change, climate variability, and socioeconomic developments, is recognized as one of the most important global risks, both in terms of likelihood and impact. Whilst a wide range of studies have assessed the role of long-term climate change and socioeconomic trends in global water scarcity, the impact of variability is less well understood. Moreover, the interactions between different forcing mechanisms, and their combined effect on changes in water scarcity conditions, are often neglected. Therefore, we provide a first step towards a framework for global water scarcity risk assessments, applying probabilistic methods to estimate water scarcity risks for different return periods under current and future conditions, using multiple climate and socioeconomic scenarios.
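
    The probabilistic step, estimating the scarcity level associated with a given return period, can be sketched from a series of annual values; the synthetic series below stands in for hydrological model output.

```python
# Empirical return-period estimation from annual values using Weibull
# plotting positions. The "annual water gap" series is synthetic.
import numpy as np

rng = np.random.default_rng(11)
annual_gap = rng.gumbel(loc=100.0, scale=30.0, size=200)  # e.g. Mm3/yr

# Sort descending; the m-th largest value has exceedance probability
# m / (n + 1), i.e. a return period of (n + 1) / m years.
x = np.sort(annual_gap)[::-1]
p_exceed = np.arange(1, x.size + 1) / (x.size + 1)

for T in (2, 10, 50):
    # Value whose exceedance probability is closest to 1/T
    i = np.argmin(np.abs(p_exceed - 1.0 / T))
    print(f"{T:>3}-yr event: water gap ~ {x[i]:.0f}")
```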

  12. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss whether we indeed make better decisions on the basis of probabilistic forecasts.

  13. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  14. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes the probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include nonlinear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  15. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  16. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model at four levels of delta-V and with two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
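
    The probability-combination step can be sketched as a Poisson-binomial calculation: given an independent fracture probability for each rib, the distribution of the number of fractured ribs follows by direct convolution. The per-rib probabilities below are invented, and treating AIS3+ as three or more fractured ribs is a simplifying assumption.

```python
# Combine independent per-rib fracture probabilities into a distribution
# of the number of fractured ribs (Poisson-binomial via convolution).
# The per-rib probabilities are hypothetical; in the study they derive
# from strain and the age-adjusted ultimate-strain distribution.
import numpy as np

p_rib = np.array([0.02, 0.05, 0.15, 0.30, 0.25, 0.12,
                  0.03, 0.06, 0.18, 0.28, 0.22, 0.10])  # 12 ribs, assumed

# dist[k] = P(exactly k fractures); start with the intact ribcage
dist = np.array([1.0])
for p in p_rib:
    dist = np.convolve(dist, [1.0 - p, p])

p_3plus = dist[3:].sum()   # simplification: AIS3+ taken as 3+ fractures
print(f"P(>=3 fractured ribs) = {p_3plus:.3f}")
```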

  17. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method for evaluating the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast, generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii, based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties, and present the overall hazard in a meaningful and consistent way.
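
    The Green's function summation is linear, so it can be sketched in a few lines; the unit-slip waveforms below are synthetic stand-ins for the precomputed, stored model output described above.

```python
# Synthesize a coastal tsunami waveform for an arbitrary slip distribution
# as the slip-weighted sum of precomputed unit-slip subfault waveforms.
import numpy as np

rng = np.random.default_rng(2)
n_subfaults, n_t = 8, 600
t = np.linspace(0.0, 3.0, n_t)  # hours after origin time

# Precomputed unit-slip (1 m) waveforms, one per subfault: a decaying
# wave packet with a subfault-dependent arrival time.
arrivals = np.linspace(0.6, 1.4, n_subfaults)
unit_waves = np.array([
    0.05 * np.exp(-2.0 * np.maximum(t - ta, 0.0))
    * np.sin(2 * np.pi * 4.0 * (t - ta)) * (t >= ta)
    for ta in arrivals
])

# Any earthquake scenario is just a slip vector; synthesis is a weighted sum.
slip = rng.uniform(0.0, 6.0, n_subfaults)   # metres of slip per subfault
waveform = slip @ unit_waves                # coastal time series
print(f"peak tsunami amplitude: {waveform.max():.2f} m")
```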

  18. A probabilistic QMRA of Salmonella in direct agricultural reuse of treated municipal wastewater.

    PubMed

    Amha, Yamrot M; Kumaraswamy, Rajkumari; Ahmad, Farrukh

    2015-01-01

    Developing reliable quantitative microbial risk assessment (QMRA) procedures aids in setting recommendations on reuse applications of treated wastewater. In this study, a probabilistic QMRA was conducted to determine the risk of Salmonella infection resulting from the consumption of edible crops irrigated with treated wastewater. Quantitative polymerase chain reaction (qPCR) was used to enumerate Salmonella spp. in post-disinfection samples, which showed concentrations ranging from 90 to 1,600 cells/100 mL. The results were used to construct probabilistic exposure models for the raw consumption of three vegetables (lettuce, cabbage, and cucumber) irrigated with treated wastewater, and to estimate the disease burden using Monte Carlo analysis. The results showed median disease burdens above the acceptable level set by the World Health Organization, 10⁻⁶ disability-adjusted life years per person per year. Of the three vegetables considered, lettuce showed the highest risk of infection in all scenarios considered, while cucumber showed the lowest risk. The Salmonella concentrations obtained with qPCR were also compared with Escherichia coli concentrations for samples taken on the same sampling dates.
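
    A minimal sketch of the Monte Carlo exposure-to-burden chain described above, assuming an approximate beta-Poisson dose-response and illustrative intake and DALY values; none of these parameters are taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed input distributions (illustrative, not the paper's fitted values):
conc = rng.uniform(90, 1600, n) / 100.0      # Salmonella cells per mL
intake = rng.lognormal(np.log(5.0), 0.5, n)  # mL of water retained per serving
dose = conc * intake                         # ingested cells per serving

# Approximate beta-Poisson dose-response (alpha, beta are assumed values).
alpha, beta = 0.3126, 2884.0
p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

# Annual infection risk from daily consumption, then an assumed DALY per case.
p_annual = 1.0 - (1.0 - p_inf) ** 365
burden = p_annual * 1.0e-3                   # DALY per person per year

print("median disease burden (DALY/person/year): %.2e" % np.median(burden))
print("fraction above WHO 1e-6 threshold: %.2f" % (burden > 1e-6).mean())
```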

  19. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments

    PubMed Central

    Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.

    2018-01-01

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database, along with data from the Exposure Factors Handbook and other sources, to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no-observable-effect levels used in the EPA occupational risk assessments. For all three compounds, a percentage of workers was predicted to exceed the level of concern: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804

  20. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  1. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
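
    A minimal sketch of the approach: an OR-gate fault tree over uncertain sub-system failure rates, propagated by Monte Carlo into a CML distribution. The rates, repair times, and the simplifying assumption that every failure affects all customers are illustrative, not the values of the Swedish case study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Assumed annual failure rates (events/year) and repair times (hours), each
# uncertain; values are illustrative only.
rate_source  = rng.gamma(2.0, 0.5 / 2.0, n)   # raw-water source failure
rate_treat   = rng.gamma(2.0, 1.0 / 2.0, n)   # treatment failure
rate_distrib = rng.gamma(2.0, 2.0 / 2.0, n)   # distribution failure
hours_down   = rng.lognormal(np.log(6.0), 0.4, n)

# Top event "no water delivered" modeled as an OR gate over the sub-systems.
rate_top = rate_source + rate_treat + rate_distrib

# Customer Minutes Lost per year, assuming every failure affects all customers.
cml = rate_top * hours_down * 60.0
print("median CML/year: %.0f, 95th pct: %.0f"
      % (np.median(cml), np.percentile(cml, 95)))
```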

  2. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight of evidence approaches ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches to weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
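
    In the simplest case, probabilistically integrating a line of evidence reduces to Bayes' rule on a two-node network. The sketch below is a toy example with assumed priors and likelihoods, not a network from the paper.

```python
# A minimal two-node Bayesian network evaluated by enumeration: does a chemical
# stressor (S) cause an observed biological impairment (I)? All probabilities
# are illustrative, not values from the paper.
p_s = 0.30                                 # prior that the stressor is present
p_i_given_s = {True: 0.80, False: 0.10}    # P(impairment | stressor state)

# Posterior P(S | impairment observed) by Bayes' rule.
num = p_s * p_i_given_s[True]
den = num + (1 - p_s) * p_i_given_s[False]
posterior = num / den
print("P(stressor | impairment observed) = %.3f" % posterior)  # ~0.774
```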

  3. An Example of Risk Informed Design

    NASA Technical Reports Server (NTRS)

    Banke, Rick; Grant, Warren; Wilson, Paul

    2014-01-01

    NASA Engineering requested a Probabilistic Risk Assessment (PRA) to compare the difference in the risk of Loss of Crew (LOC) and Loss of Mission (LOM) between different designs of a fluid assembly. They were concerned that the configuration favored by the design team was more susceptible to leakage than a second proposed design, but realized that a quantitative analysis to compare the risks between the two designs might strengthen their argument. The analysis showed that while the second design did help improve the probability of LOC, it did not help from a probability of LOM perspective. This drove the analysis team to propose a minor design change that would drive the probability of LOM down considerably. The analysis also demonstrated that there was another major risk driver that was not immediately obvious from a typical engineering study of the design and was therefore unexpected. None of the proposed alternatives were addressing this risk. This type of trade study demonstrates the importance of performing a PRA in order to completely understand a system's design. It allows managers to use risk as another one of the commodities (e.g., mass, cost, schedule, fault tolerance) that can be traded early in the design of a new system.

  4. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    PubMed

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by upper-limit concepts, which turned out to be of limited applicability because they grossly overestimate migration. Probabilistic migration modelling makes it possible to consider uncertainty in the mass-transfer parameters as well as in other model inputs. With respect to a functional barrier, the most important parameters among others are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (i.e., diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.

  5. A generalized sizing method for revolutionary concepts under probabilistic design constraints

    NASA Astrophysics Data System (ADS)

    Nam, Taewoo

    Internal combustion (IC) engines that consume hydrocarbon fuels have dominated the propulsion systems of air-vehicles for the first century of aviation. In recent years, however, growing concern over rapid climate changes and national energy security has galvanized the aerospace community into delving into new alternatives that could challenge the dominance of the IC engine. Nevertheless, traditional aircraft sizing methods have significant shortcomings for the design of such unconventionally powered aircraft. First, the methods are specialized for aircraft powered by IC engines, and thus are not flexible enough to assess revolutionary propulsion concepts that produce propulsive thrust through a completely different energy conversion process. Another deficiency associated with the traditional methods is that a user of these methods must rely heavily on experts' experience and advice for determining appropriate design margins. However, the introduction of revolutionary propulsion systems and energy sources is very likely to entail an unconventional aircraft configuration, which inexorably disqualifies the conjecture of such "connoisseurs" as a means of risk management. Motivated by such deficiencies, this dissertation aims at advancing two aspects of aircraft sizing: (1) to develop a generalized aircraft sizing formulation applicable to a wide range of unconventionally powered aircraft concepts and (2) to formulate a probabilistic optimization technique that is able to quantify appropriate design margins that are tailored towards the level of risk deemed acceptable to a decision maker. A more generalized aircraft sizing formulation, named the Architecture Independent Aircraft Sizing Method (AIASM), was developed for sizing revolutionary aircraft powered by alternative energy sources by modifying several assumptions of the traditional aircraft sizing method. Along with advances in deterministic aircraft sizing, a non-deterministic sizing technique, named the Probabilistic Aircraft Sizing Method (PASM), was developed. The method allows one to quantify adequate design margins to account for the various sources of uncertainty via the application of the chance-constrained programming (CCP) strategy to AIASM. In this way, PASM can also provide insights into a good compromise between cost and safety.
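
    The core of the chance-constrained programming (CCP) strategy mentioned above is replacing a deterministic constraint with its probabilistic, deterministic-equivalent form, which makes the design margin an explicit function of the accepted risk level. A minimal sketch, assuming a normally distributed prediction error and illustrative numbers (not values from the thesis):

```python
from scipy.stats import norm

# Require: P(achieved range >= required range) >= confidence.
required_range = 1500.0   # km
sigma_range    = 120.0    # std. dev. of range prediction error (assumed normal)
confidence     = 0.95

# Deterministic equivalent: R_mean >= required + z * sigma, with z = Phi^-1(conf).
z = norm.ppf(confidence)
design_target = required_range + z * sigma_range
print("size the aircraft for a mean range of %.0f km (margin %.0f km)"
      % (design_target, z * sigma_range))
```

    The design margin here is no longer an expert's guess: tightening the confidence level directly and transparently increases the margin, which is the compromise between cost and safety the abstract refers to.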

  6. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

    A national standard for engineering design for tsunami effects has not existed before, and this significant risk has mostly been ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard; Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure as part of tsunami preparedness, and will also have significance as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models, ensuring that the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  7. A probabilistic topic model for clinical risk stratification from electronic health records.

    PubMed

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation of the labeled clinical dataset. To this end, in this paper we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called the probabilistic risk stratification model (PRSM), based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient's clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), which incorporates minimal prior information into the model in order to improve risk stratification accuracy and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and demonstrated their effectiveness in risk stratification of CHD in terms of area under the receiver operating characteristic curve (AUC) analysis. Moreover, WS-PRSM achieved a performance gain of over 2% relative to PRSM on the experimental dataset, demonstrating that incorporating risk scoring knowledge as prior information can improve risk stratification performance. Experimental results reveal that our models achieve competitive performance in risk stratification in comparison with existing supervised approaches. In addition, the unsupervised nature of our models makes them highly portable to the risk stratification tasks of various diseases. Moreover, patient sub-profiles and sub-profile-specific risk tiers generated by our models are coherent and informative, and provide significant potential to be explored in further tasks, such as patient cohort analysis. We hypothesize that the proposed framework can readily meet the demand for risk stratification from a large volume of EHRs in an open-ended fashion. Copyright © 2015 Elsevier Inc. All rights reserved.
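
    A minimal sketch of the unsupervised backbone described above: LDA over patient "word" counts, with the dominant latent sub-profile used as a crude tier. The data are synthetic stand-ins for the CHD cohort, and the three-component choice is an assumption.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(3)

# Rows = patients, columns = counts of clinical "words" (codes, findings)
# extracted from EHRs; synthetic data stand in for the real cohort.
X = rng.poisson(1.0, size=(200, 50))

# Unsupervised decomposition of each patient into latent sub-profiles, the
# LDA idea PRSM builds on (3 components is an assumption here).
lda = LatentDirichletAllocation(n_components=3, random_state=0)
theta = lda.fit_transform(X)        # patient-by-sub-profile mixture weights

# Crude tiering: assign each patient to the dominant latent sub-profile.
tiers = theta.argmax(axis=1)
print("patients per tier:", np.bincount(tiers, minlength=3))
```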

  8. Probabilistic confidence for decisions based on uncertain reliability estimates

    NASA Astrophysics Data System (ADS)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  9. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and determinist approaches to seismic hazard analysis.

  10. Grid Inertial Response-Based Probabilistic Determination of Energy Storage System Capacity Under High Solar Penetration

    DOE PAGES

    Yue, Meng; Wang, Xiaoyu

    2015-07-01

    It is well-known that responsive battery energy storage systems (BESSs) are an effective means to improve the grid inertial response to various disturbances including the variability of the renewable generation. One of the major issues associated with its implementation is the difficulty in determining the required BESS capacity mainly due to the large amount of inherent uncertainties that cannot be accounted for deterministically. In this study, a probabilistic approach is proposed to properly size the BESS from the perspective of the system inertial response, as an application of probabilistic risk assessment (PRA). The proposed approach enables a risk-informed decision-making process regarding (1) the acceptable level of solar penetration in a given system and (2) the desired BESS capacity (and minimum cost) to achieve an acceptable grid inertial response with a certain confidence level.

  11. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is presented and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given severities) and vulnerability (the probability of a limit state performance being reached, given a certain severity). Then, for each landslide, all the exposed goods (structures and infrastructures) within the landslide area and within a buffer (representative of the maximum extension of a landslide given a reactivation) are counted. The risk is the product of the damage probability and the ratio of the exposed goods of each landslide to the whole assets exposed to the same type of landslides. Since the risk is computed numerically and by the same procedure applied to all landslides, it is free from any subjective assessment such as those implied in qualitative methods.
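
    In discrete form, the damage probability described above is a sum over severity classes of hazard times fragility. The sketch below uses illustrative severity bins and probabilities, not the paper's survey-derived fragility functions.

```python
import numpy as np

# Discrete severity classes (landslide kinetic energy bins) with:
#  - hazard: annual probability mass of each severity occurring, and
#  - fragility: P(functional limit state reached | severity).
# Values are illustrative, not from the paper's damage surveys.
hazard    = np.array([0.020, 0.008, 0.002])   # low, medium, high severity
fragility = np.array([0.05, 0.40, 0.90])      # functional limit state

# Damage probability = discrete convolution (sum over severities) of hazard
# and fragility, i.e. the discrete form of Risk = P(D >= d | S, V).
p_damage = np.sum(hazard * fragility)

# Risk scales by the share of exposed goods affected by this landslide.
exposed_fraction = 0.12
print("annual damage probability: %.4f, risk: %.5f"
      % (p_damage, p_damage * exposed_fraction))
```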

  12. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  13. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing, the rational-deliberative versus the affective-experiential, and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  14. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential φ. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model to fit the historical database of measurements of protons with energy > 30 MeV, Φ30. The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPE and GCR is in support of mission design and operational planning to manage radiation risks for space exploration.
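
    Simulating a nonhomogeneous Poisson process of SPE occurrences is commonly done by thinning (accepting candidate events from a homogeneous envelope). The cycle-dependent rate function below is an assumed illustrative shape, not the fit to the Φ30 database.

```python
import numpy as np

rng = np.random.default_rng(11)

def spe_rate(t):
    """Assumed SPE occurrence rate (events/year) over an 11-year solar cycle:
    high near solar maximum, low near minimum. Illustrative shape only."""
    return 0.5 + 6.0 * np.exp(-((t - 4.0) ** 2) / 2.0)

def simulate_nhpp(t_end=11.0):
    """Thinning: simulate event times of a nonhomogeneous Poisson process."""
    lam_max = spe_rate(np.linspace(0.0, t_end, 1000)).max()
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)       # candidate from the envelope
        if t > t_end:
            return np.array(events)
        if rng.random() < spe_rate(t) / lam_max:  # accept w.p. lambda(t)/lam_max
            events.append(t)

print("simulated SPE times (years into cycle):", np.round(simulate_nhpp(), 2))
```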

  15. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will grant interoperability with other FOSS software and tools and, at the same time, keep the tool readily accessible to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, by considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and by assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).

  16. Probabilistic DHP adaptive critic for nonlinear stochastic control systems.

    PubMed

    Herzallah, Randa

    2013-06-01

    Following the recently developed algorithms for fully probabilistic control design for general dynamic stochastic systems (Herzallah & Kárný, 2011; Kárný, 1996), this paper presents the solution to the probabilistic dual heuristic programming (DHP) adaptive critic method (Herzallah & Kárný, 2011) and a randomized control algorithm for stochastic nonlinear dynamical systems. The purpose of the randomized control input design is to make the joint probability density function of the closed-loop system as close as possible to a predetermined ideal joint probability density function. This paper completes the previous work (Herzallah & Kárný, 2011; Kárný, 1996) by formulating and solving the fully probabilistic control design problem for the more general case of nonlinear stochastic discrete-time systems. A simulated example is used to demonstrate the use of the algorithm, and encouraging results have been obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds

    EPA Science Inventory

    When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...

  18. How Can You Support RIDM/CRM/RM Through the Use of PRA

    NASA Technical Reports Server (NTRS)

    DoVemto. Tpmu

    2011-01-01

    Probabilistic Risk Assessment (PRA) is one of the key Risk-Informed Decision Making (RIDM) tools. It is a scenario-based methodology aimed at identifying and assessing safety and technical performance risks in complex technological systems.

  19. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  20. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  1. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Xapsos, Michael

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements.
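
    The way such probabilistic models are used in design is to read off the environment at a specified confidence level. A one-line sketch, assuming a lognormal mission-integrated fluence model with placeholder parameters (not the model's actual fit):

```python
from scipy.stats import lognorm

# Assumed lognormal model of mission-integrated proton fluence (E > 30 MeV);
# the median and shape parameter below are illustrative placeholders.
median_fluence = 2.0e9    # protons/cm^2
sigma          = 1.2      # lognormal shape parameter

confidence = 0.95
design_fluence = lognorm.ppf(confidence, s=sigma, scale=median_fluence)
print("design to %.0f%% confidence: %.2e protons/cm^2"
      % (confidence * 100, design_fluence))
```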

  2. Probabilistic eruption forecasting at short and long time scales

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider that the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of the inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: in the short term (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; and in the long term (years to decades), it is the basic component for land use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  3. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC, by Brookhaven National Laboratory. The DCPRA is a full-scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  5. NSTS Orbiter auxiliary power unit turbine wheel cracking risk assessment

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Mcclung, R. C.; Torng, T. Y.

    1992-01-01

    The present investigation of turbine-wheel cracking in the hydrazine-fueled auxiliary power unit (APU) turbine wheel of the Space Shuttle Orbiter indicates the efficacy of systematic probabilistic risk assessment in flight certification and safety resolution. Nevertheless, real crack-initiation and propagation problems do not lend themselves to purely analytical studies. The high-cycle fatigue problem is noted to be generally unsuited to probabilistic modeling, due to its extremely high degree of intrinsic scatter. In the case treated, the cracks appear to trend toward crack arrest in a low-cycle fatigue mode, due to detuning of the resonant mode.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  7. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2008-06-01

    The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (constructing a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell's inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the “death of reality” which are typically linked to Bell-type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we found that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.

  8. CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.

    2006-01-01

    This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories was compiled and back-analyzed, and the data were then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment to CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.

  9. III: Use of biomarkers as Risk Indicators in Environmental Risk Assessment of oil based discharges offshore.

    PubMed

    Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M

    2017-06-01

    Offshore oil and gas activities are required not to cause adverse environmental effects, and risk-based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters used to monitor the development of risk-affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for management of oil-based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data and assessment methods. The data show several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways in which these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system, when linked with whole-organism responses, are discussed. This can be a useful approach for integrating biomarkers into probabilistic risk assessment related to oil-based discharges, representing a potential supplement to the information that biomarkers already provide about environmental impact and risk related to these kinds of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    PubMed

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic risk assessments for both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs with the highest contributions to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
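
    A minimal sketch of the Monte Carlo risk calculation for the dominant oral pathway, in the standard EPA-style incremental lifetime cancer risk form; every distribution and the slope factor below are illustrative assumptions, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Assumed exposure-factor distributions (illustrative only):
C   = rng.lognormal(np.log(0.8), 0.8, n)   # BaP-equivalent conc., mg/kg soil
IngR = 100.0                               # soil ingestion rate, mg/day
EF  = rng.triangular(180, 250, 350, n)     # exposure frequency, days/year
ED  = rng.uniform(20, 40, n)               # exposure duration, years
BW  = rng.normal(60, 8, n)                 # body weight, kg
AT  = 70 * 365                             # averaging time, days
SF  = 7.3                                  # oral slope factor, (mg/kg-day)^-1

# Lifetime average daily dose; 1e-6 converts mg of soil to kg.
dose = C * IngR * 1e-6 * EF * ED / (BW * AT)
ilcr = dose * SF
print("P(ILCR > 1e-6) = %.2f" % (ilcr > 1e-6).mean())
```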

  11. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which each successive safety measure reduces risk. Based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), it also indicates whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.

  12. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model through probability-based estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.

  13. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): (a) provides quantitative measures of probability, consequence, and uncertainty; and (b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  14. Asteroid Impact Risk: Ground Hazard versus Impactor Size

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan; Wheeler, Lorien; Dotson, Jessie; Aftosmis, Michael; Tarano, Ana

    2017-01-01

    We utilized a probabilistic asteroid impact risk (PAIR) model to stochastically assess the impact risk due to an ensemble population of Near-Earth Objects (NEOs). Concretely, we present the variation of risk with impactor size. Results suggest that large impactors dominate the average risk, even when only considering the subset of undiscovered NEOs.

  15. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

    Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three possible shop owners. Each type of shop has losses of quite different magnitude, should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring, which would inundate their shop and lead to those losses. In response, they have to decide whether to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating in two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take action based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.
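
    The game is built on the classic cost/loss decision model: acting is worthwhile whenever the forecast probability exceeds the cost/loss ratio. A sketch with hypothetical shop parameters (the actual game settings are not given in the abstract):

```python
# Classic cost/loss decision rule: protect whenever p * loss > cost,
# i.e. whenever the forecast probability p exceeds C/L.
shops = {
    "florist":     {"cost": 200.0,   "loss": 1_000.0},
    "electronics": {"cost": 2_000.0, "loss": 50_000.0},
    "bookshop":    {"cost": 500.0,   "loss": 5_000.0},
}

def best_action(p, cost, loss):
    """Compare expected outlays: act (pay cost) vs. do nothing (risk p*loss)."""
    return "protect" if p * loss > cost else "do nothing"

for p in (0.05, 0.2, 0.6):
    for name, s in shops.items():
        print(f"p={p:.2f} {name:12s} C/L={s['cost'] / s['loss']:.2f} -> "
              f"{best_action(p, **s)}")
```

    Running the loop shows exactly the pattern reported above: the shop with the lowest cost/loss ratio starts protecting at much lower forecast probabilities than the others.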

  16. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
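
    The exceedance probability p(PEC/PNEC > 1) at the core of this approach can be estimated by straightforward Monte Carlo sampling once PDFs are assigned. A minimal sketch, assuming illustrative log-normal marginals rather than the fitted site-specific distributions of the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical log-normal PDFs for the exposure (PEC) and no-effect (PNEC)
    # concentrations, in ug/L; parameters are illustrative, not fitted values.
    pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)
    pnec = rng.lognormal(mean=np.log(8.0), sigma=0.4, size=n)

    p_exceed = np.mean(pec / pnec > 1.0)   # p(PEC/PNEC > 1)
    print(f"p(PEC/PNEC > 1) = {p_exceed:.4f}")
    ```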

  17. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
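
    Expected annual damage figures of this kind are, in essence, the area under a loss-exceedance curve. A minimal sketch with hypothetical return periods and losses (not the study's data):

    ```python
    import numpy as np

    # Hypothetical loss-exceedance points: losses (million GBP) at return periods (years)
    return_periods = np.array([1000, 100, 30, 10, 5, 2])
    losses = np.array([600.0, 320.0, 180.0, 90.0, 40.0, 5.0])

    aep = 1.0 / return_periods          # annual exceedance probability, ascending
    ead = np.trapz(losses, aep)         # area under the loss-exceedance curve
    print(f"Expected annual damage ~ {ead:.0f} million GBP/year")
    ```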

  18. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  19. Probabilistic health risk assessment for arsenic intake through drinking groundwater in Taiwan's Pingtung Plain

    NASA Astrophysics Data System (ADS)

    Liang, C. P.; Chen, J. S.

    2017-12-01

    An abundant and inexpensive supply of groundwater is used to meet the drinking, agriculture, and aquaculture requirements of residents of the Pingtung Plain. Long-term groundwater quality monitoring data indicate that the As content of groundwater in the Pingtung Plain exceeds the maximum level of 10 µg/L recommended by the World Health Organization (WHO). The situation is further complicated by the fact that only 46.89% of the population in the Pingtung Plain is served with tap water, far below the national average of 92.93%. Considering the considerable variation in measured concentrations, from below the detection limit (<0.1 µg/L) to a maximum of 544 µg/L, as well as individual variation in consumption rate and body weight, the conventional approach to conducting a human health risk assessment may be insufficient for health risk management. This study presents a probabilistic risk assessment for inorganic As intake through the consumption of drinking groundwater by local residents in the Pingtung Plain, achieved using a Monte Carlo simulation technique based on the hazard quotient (HQ) and target cancer risk (TR) established by the U.S. Environmental Protection Agency. This study demonstrates the importance of individual variability in inorganic As intake through drinking groundwater consumption when evaluating a high-exposure sub-group of the population who drink groundwater with high As content.
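
    The HQ and TR calculations lend themselves to a compact Monte Carlo sketch. The input distributions below are hypothetical; the reference dose (3e-4 mg/kg/day) and oral slope factor (1.5 per mg/kg/day) are the standard US EPA values for inorganic arsenic:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical input distributions for one adult sub-group
    conc = rng.lognormal(np.log(50.0), 1.0, n)        # As concentration, ug/L
    intake = rng.normal(2.0, 0.5, n).clip(0.5, 4.0)   # water intake, L/day
    weight = rng.normal(60.0, 10.0, n).clip(35, 100)  # body weight, kg

    cdi = conc * 1e-3 * intake / weight               # chronic daily intake, mg/kg/day

    rfd = 3e-4   # US EPA oral reference dose for inorganic As, mg/kg/day
    sf = 1.5     # US EPA oral slope factor for inorganic As, (mg/kg/day)^-1

    hq = cdi / rfd                                    # hazard quotient
    tr = cdi * sf                                     # target cancer risk
    print(f"P(HQ > 1)    = {np.mean(hq > 1):.3f}")
    print(f"95th %ile TR = {np.percentile(tr, 95):.2e}")
    ```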

  20. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice in flood risk assessments. Most damage models describe complex damaging processes with simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. The significant advantage of the probabilistic flood loss estimation model BT-FLEMO is therefore that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.

  1. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA), adapted here to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
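
    The core PTHA computation, turning a stochastic source catalogue into time-independent exceedance probabilities, can be sketched in a few lines. The rates and heights below are synthetic, not the Indonesian catalogue, and Poisson occurrence is assumed:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic stochastic catalogue for one coastal site: annual rates and
    # tsunami heights (m) of 500 hypothetical earthquake scenarios.
    rates = rng.uniform(1e-5, 1e-3, size=500)
    heights = rng.lognormal(np.log(0.5), 1.0, size=500)

    def exceedance_prob(threshold, years=1.0):
        """Poisson probability of at least one exceedance in the time window."""
        lam = rates[heights > threshold].sum()
        return 1.0 - np.exp(-lam * years)

    for h in (0.5, 3.0):
        print(f"P(height > {h} m): 1 yr = {exceedance_prob(h):.4f}, "
              f"100 yr = {exceedance_prob(h, years=100):.4f}")
    ```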

  2. Are engineered nano iron oxide particles safe? an environmental risk assessment by probabilistic exposure, effects and risk modeling.

    PubMed

    Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd

    2016-12-01

    Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/l. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by PNEC values, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, the hazards of nano-FeOX to organisms in other ecosystems (such as sediment) need to be assessed to determine the overall risk of these particles to the environment.

  3. A further assessment of decision-making in anorexia nervosa.

    PubMed

    Adoue, C; Jaussent, I; Olié, E; Beziat, S; Van den Eynde, F; Courtet, P; Guillaume, S

    2015-01-01

    Anorexia nervosa (AN) may be associated with impaired decision-making. The cognitive processes underlying this impairment remain unclear, mainly because previous assessments of this complex cognitive function were completed with a single test. Furthermore, clinical features such as mood status may affect this association. We aim to further explore the hypothesis of altered decision-making in AN. Sixty-three adult women with AN and 49 female controls completed a clinical assessment and were assessed with three tasks related to decision-making [Iowa Gambling Task (IGT), Balloon Analogue Risk Task (BART), Probabilistic Reversal Learning Task (PRLT)]. People with AN had poorer performance on the IGT and made less risky choices on the BART, whereas performance did not differ on the PRLT. Notably, AN patients with a current major depressive disorder showed performance similar to those without one. These results tend to confirm an impaired decision-making process in people with AN and suggest that various cognitive processes, such as inhibition of risk-taking or intolerance of uncertainty, may underlie this condition. Furthermore, these impairments seem unrelated to potentially co-occurring major depressive disorders. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  4. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  5. Distribution and health risk assessment of trace metals in freshwater tilapia from three different aquaculture sites in Jelebu Region (Malaysia).

    PubMed

    Low, Kah Hin; Zain, Sharifuddin Md; Abas, Mhd Radzi; Md Salleh, Kaharudin; Teo, Yin Yin

    2015-06-15

    The trace metal concentrations in edible muscle of red tilapia (Oreochromis spp.) sampled from a former tin mining pool, concrete tank and earthen pond in Jelebu were analysed with microwave assisted digestion-inductively coupled plasma-mass spectrometry. Results were compared with established legal limits and the daily ingestion exposures simulated using the Monte Carlo algorithm for potential health risks. Among the metals investigated, arsenic was found to be the key contaminant, which may have arisen from the use of formulated feeding pellets. Although the risks of toxicity associated with consumption of red tilapia from the sites investigated were found to be within the tolerable range, the preliminary probabilistic estimation of As cancer risk shows that the 95th percentile risk level surpassed the benchmark level of 10⁻⁵. In general, the probabilistic health risks associated with ingestion of red tilapia can be ranked as follows: former tin mining pool > concrete tank > earthen pond. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. How to reduce the effect of framing on messages about health.

    PubMed

    Garcia-Retamero, Rocio; Galesic, Mirta

    2010-12-01

    Patients must be informed about risks before any treatment can be implemented. Yet serious problems in communicating these risks occur because of framing effects. To investigate the effects of different information frames when communicating health risks to people with high and low numeracy and determine whether these effects can be countered or eliminated by using different types of visual displays (i.e., icon arrays, horizontal bars, vertical bars, or pies). Experiment on probabilistic, nationally representative US (n = 492) and German (n = 495) samples, conducted in summer 2008. Participants' risk perceptions of the medical risk expressed in positive (i.e., chances of surviving after surgery) and negative (i.e., chances of dying after surgery) terms. Although low-numeracy people are more susceptible to framing than those with high numeracy, use of visual aids is an effective method to eliminate its effects. However, not all visual aids were equally effective: pie charts and vertical and horizontal bars almost completely removed the effect of framing. Icon arrays, however, led to a smaller decrease in the framing effect. Difficulties with understanding numerical information often do not reside in the mind, but in the representation of the problem.

  7. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
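
    One of the techniques the report reviews, simulating correlated variates for a given correlation measure and dependence model, is commonly done with a Gaussian copula. A minimal sketch with hypothetical marginals and dependence parameter:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 50_000
    rho = 0.7   # hypothetical dependence parameter of the Gaussian copula

    # Correlated standard normals -> uniforms -> arbitrary target marginals
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    u = stats.norm.cdf(z)
    x = stats.lognorm.ppf(u[:, 0], s=0.5, scale=2.0)   # log-normal marginal
    y = stats.gamma.ppf(u[:, 1], a=2.0, scale=1.5)     # gamma marginal

    print("Spearman rank correlation:", round(float(stats.spearmanr(x, y)[0]), 3))
    ```

    Because the copula fixes only the dependence structure, the same recipe works for any pair of marginals; bounding approaches are needed when, as the report discusses, the dependence itself is only partially known.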

  8. Monitoring and exposure assessment of pesticide residues in cowpea (Vigna unguiculata L. Walp) from five provinces of southern China.

    PubMed

    Huan, Zhibo; Xu, Zhi; Luo, Jinhui; Xie, Defang

    2016-11-01

    Residues of 14 pesticides were determined in 150 cowpea samples collected in five southern Chinese provinces in 2013 and 2014. One or more residues were detected in 70% of samples; 61.3% of samples were non-compliant, mainly because of the detection of unauthorized pesticides; and 14.0% of samples contained more than three pesticides. Deterministic and probabilistic methods were used to assess the chronic and acute risk of pesticides in cowpea to eight subgroups of people. Deterministic assessment showed that the estimated short-term intakes (ESTIs) of carbofuran were 1199.4%-2621.9% of the acute reference dose (ARfD), while the rates were 985.9%-4114.7% using probabilistic assessment. Probabilistic assessment showed that 4.2%-7.8% of subjects (especially children) may face unacceptable acute risk from carbofuran-contaminated cowpeas from the five provinces. However, undue concern is not warranted, because all the estimates are based on conservative assumptions. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or to misleading decisions. Including uncertainty information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that the effect on decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriate when making decisions based on uncertain information.
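
    Prospect theory's two ingredients, a value function that is steeper for losses and an inverse-S probability weighting function, are easy to state in code. A minimal sketch using the original Tversky-Kahneman (1992) parameter estimates; the flood costs and probability below are hypothetical:

    ```python
    def pt_value(x, alpha=0.88, lam=2.25):
        """Tversky-Kahneman value function: concave for gains, loss-averse."""
        return x**alpha if x >= 0 else -lam * (-x)**alpha

    def pt_weight(p, gamma=0.61):
        """Inverse-S probability weighting: small probabilities are overweighted."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    # Hypothetical choice: pay 10 to protect, or gamble on a flood with
    # probability 0.1 that would cause a loss of 100 (arbitrary units).
    p_flood = 0.1
    v_protect = pt_value(-10)
    v_gamble = pt_weight(p_flood) * pt_value(-100)
    print("protect" if v_protect > v_gamble else "do nothing")   # -> protect
    ```

    Loss aversion and the overweighting of the small flood probability both push the prospect-theoretic decision maker toward protecting, even when the expected monetary loss of the gamble is equal to the protection cost.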

  10. Assessment of possible airborne impact from nuclear risk sites - Part II: probabilistic analysis of atmospheric transport patterns in Euro-Arctic region

    NASA Astrophysics Data System (ADS)

    Mahura, A. G.; Baklanov, A. A.

    2003-10-01

    The probabilistic analysis of atmospheric transport patterns from the most important nuclear risk sites in the Euro-Arctic region is performed employing the methodology developed within the "Arctic Risk" Project of the NARP Programme (Baklanov and Mahura, 2003). The risk sites are the nuclear power plants in Northwest Russia, Finland, Sweden, Lithuania, the United Kingdom, and Germany, as well as the Novaya Zemlya test site of Russia. The geographical regions of interest are the Northern and Central European countries and Northwest Russia. The research tools employed in this study are a trajectory model, used to calculate a multiyear dataset of forward trajectories originating over the risk site locations, and a set of statistical methods (including exploratory, cluster, and probability fields analyses) for analysing the trajectory modelling results. The probabilistic analyses of trajectory modelling results for eleven sites are presented as a set of indicators of the risk sites' possible impact on the geographical regions and countries of interest. A nuclear risk site's possible impact (on a particular geographical region, territory, country, site, etc.) due to atmospheric transport from the site after a hypothetical accidental release of radioactivity can be properly estimated from a combined interpretation of the indicators (simple characteristics, atmospheric transport pathways, airflow and fast transport probability fields, maximum reaching distance and maximum possible impact zone, typical transport time and precipitation factor fields) for different time periods (annual, seasonal, and monthly) for any selected site (either separately for each site or grouped for several sites) in the Euro-Arctic region. Such estimates could be useful input for decision-making, risk assessment, and the planning of emergency response systems for sites of nuclear, chemical, and biological danger.

  11. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
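
    The PDA recipe, treat each driving parameter as a random variable, propagate it through a physics-based model, and count failures, can be sketched compactly. The toy capacity/demand limit state and distributions below are hypothetical, not the Ares I models:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 1_000_000

    # Hypothetical limit state: the system fails when demand exceeds capacity.
    capacity = rng.normal(100.0, 8.0, n)              # e.g. structural margin
    demand = rng.lognormal(np.log(70.0), 0.15, n)     # e.g. flight load

    p_fail = np.mean(demand > capacity)
    se = np.sqrt(p_fail * (1.0 - p_fail) / n)         # Monte Carlo standard error
    print(f"P(failure) = {p_fail:.2e} +/- {se:.1e}")
    ```

    A sensitivity analysis of the kind described would then perturb or regress on the input distributions to see which parameter dominates the predicted failure probability.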

  12. Probabilistic assessment of roadway departure risk in a curve

    NASA Astrophysics Data System (ADS)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though strict policies against speeding help to reduce accidents, other factors clearly exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled as random variables. These parameters are selected via a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Structural reliability methods are then employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system alerting the driver to dangerous situations.

  13. Probabilistic Modeling of High-Temperature Material Properties of a 5-Harness 0/90 Sylramic Fiber/ CVI-SiC/ MI-SiC Woven Composite

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh

    1998-01-01

    An integrated probabilistic approach has been developed to assess composites for high-temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. The approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of cumulative distribution functions (CDFs), probability density functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.

  14. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    NASA Technical Reports Server (NTRS)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  15. Probabilistic Asteroid Impact Risk Assessment for the Hypothetical PDC17 Impact Exercise

    NASA Technical Reports Server (NTRS)

    Wheeler, Lorien; Mathias, Donovan

    2017-01-01

    This work performs impact risk assessment for the 2017 Planetary Defense Conference (PDC17) hypothetical impact exercise, to take place at the PDC17 conference, May 15-20, 2017. Impact scenarios and trajectories were developed and provided by NASA's Near Earth Objects Office at JPL (Paul Chodas). These results represent purely hypothetical impact scenarios and do not reflect any known asteroid threat. Risk assessment was performed using the Probabilistic Asteroid Impact Risk (PAIR) model developed by the Asteroid Threat Assessment Project (ATAP) at NASA Ames Research Center. This presentation includes sample results that may be presented or used in discussions during the various stages of the impact exercise. Some cases represent alternate scenario options that may not be used during the actual impact exercise at the PDC17 conference. Updates to these initial assessments and/or additional scenario assessments may be performed throughout the impact exercise as different scenario options unfold.

  16. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability Program (LWRS) [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a "station blackout" (SBO), wherein offsite and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  17. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
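
    Fault-tree quantification of the kind both codes perform reduces, for independent basic events, to simple gate algebra. A minimal sketch with hypothetical basic-event probabilities (not the IME or discrete-engine trees):

    ```python
    # Minimal fault-tree quantification, assuming independent basic events.
    def p_and(*ps):
        """AND gate: the gate fails only if all inputs fail."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        """OR gate: the gate fails if any input fails."""
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    # Hypothetical basic-event probabilities for an engine subsystem
    p_valve, p_pump_a, p_pump_b, p_ctrl = 1e-3, 5e-3, 5e-3, 2e-4
    top = p_or(p_valve, p_and(p_pump_a, p_pump_b), p_ctrl)
    print(f"Top-event probability: {top:.3e}")
    ```

    A fuzzy fault-tree analysis would replace each point probability with a fuzzy number (e.g. a triangular membership function) and propagate those through the same gate structure, capturing vagueness rather than randomness.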

  18. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  19. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
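
    The Poisson-Binomial distribution referred to here is the distribution of the number of events among independent trials with unequal probabilities; its exact PMF can be built by convolution. A minimal sketch of such a reliability check, with hypothetical forecast probabilities and an assumed observed count:

    ```python
    import numpy as np

    def poisson_binomial_pmf(probs):
        """Exact PMF of the number of successes among independent,
        non-identical Bernoulli trials, built up by convolution."""
        pmf = np.array([1.0])
        for p in probs:
            pmf = np.convolve(pmf, [1.0 - p, p])
        return pmf

    # Hypothetical forecast probabilities and an observed number of events
    forecast_p = [0.1, 0.3, 0.05, 0.6, 0.2, 0.4, 0.15, 0.8, 0.25, 0.5]
    observed_k = 7

    pmf = poisson_binomial_pmf(forecast_p)
    # Two-sided exact p-value: how surprising is the observed count if the
    # forecasts are reliable (i.e., the stated probabilities are correct)?
    p_value = pmf[pmf <= pmf[observed_k]].sum()
    print(f"P(K = {observed_k}) = {pmf[observed_k]:.4f}, p-value = {p_value:.4f}")
    ```

    A small p-value indicates the observed event count is inconsistent with the issued probabilities, i.e., evidence of an unreliable forecast.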

  20. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River delta region in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, two to three tropical cyclones with maximum sustained wind speeds of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with growing exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries, including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  1. Probabilistic Risk Assessment for Bone Fracture - Bone Fracture Risk Module (BFxRM)

    NASA Technical Reports Server (NTRS)

    Licata, Angelo; Myers, Jerry G.; Lewandowski, Beth

    2013-01-01

    This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM). The overview includes an assessment of the strengths and limitations of the BFxRM and proposes a number of discussion questions to the panel regarding future development avenues for this simulation system.

  2. Review of methods for developing probabilistic risk assessments

    Treesearch

    D. A. Weinstein; P.B. Woodbury

    2010-01-01

    We describe methodologies, currently in use or under development, with features for estimating fire occurrence risk. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...

  3. Risk assessment for biodiversity conservation planning in Pacific Northwest forests

    Treesearch

    Becky K. Kerns; Alan Ager

    2007-01-01

    Risk assessment can provide a robust strategy for landscape-scale planning challenges associated with species conservation and habitat protection in Pacific Northwest forests. We provide an overview of quantitative and probabilistic ecological risk assessment with focus on the application of approaches and influences from the actuarial, financial, and technical...

  4. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  5. A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.

    PubMed

    Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C

    2018-05-03

    The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
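
    The inverse use of an SSD can be sketched compactly: fit a distribution to log-toxicity values and read HCp off its lower tail. The toy data below are hypothetical, and the median/MAD variant merely illustrates the robust idea; it is not the estimator proposed in the paper:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical species toxicity endpoints (e.g. EC50 values, ug/L)
    tox = np.array([12.0, 35.0, 8.5, 150.0, 60.0, 22.0, 95.0, 40.0, 18.0, 75.0])
    log_tox = np.log10(tox)

    # Classical log-normal SSD: HC5 is the 5th percentile of the fitted curve
    hc5 = 10 ** stats.norm.ppf(0.05, loc=log_tox.mean(), scale=log_tox.std(ddof=1))

    # A simple robust variant: median and scaled MAD replace mean and std,
    # so a single outlying species cannot drag the estimate
    mad = np.median(np.abs(log_tox - np.median(log_tox)))
    hc5_robust = 10 ** stats.norm.ppf(0.05, loc=np.median(log_tox), scale=1.4826 * mad)

    print(f"HC5 classical = {hc5:.1f} ug/L, robust = {hc5_robust:.1f} ug/L")
    ```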

  6. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford Kuofei

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
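
    The workflow described, propagate uncertain parameters by Monte Carlo and then rank them by regression on the outputs, can be sketched with a toy steady-state flux model. All distributions below are hypothetical, the flux expression is only a Fick's-law-like proxy, and plain least squares stands in for the stepwise procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 20_000

    # Hypothetical uncertain inputs for a toy steady-state flux model
    diff = rng.lognormal(np.log(1e-9), 0.5, n)         # diffusivity, m^2/s
    thick = rng.normal(20e-6, 3e-6, n).clip(5e-6)      # membrane thickness, m
    part = rng.lognormal(np.log(0.1), 0.7, n)          # partition coefficient

    flux = part * diff / thick                         # Fick's-law-like proxy

    # Rank inputs by standardized regression coefficients on the MC sample
    X = np.column_stack([np.log(diff), thick, np.log(part)])
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (np.log(flux) - np.log(flux).mean()) / np.log(flux).std()
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    for name, c in zip(("diffusivity", "thickness", "partition"), src):
        print(f"{name:>12}: SRC = {c:+.2f}")
    ```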

  7. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    DTIC Science & Technology

    2017-03-13

    support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system...AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault

  8. Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David

    2015-07-01

    Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.

  9. Automated segmentation of the prostate in 3D MR images using a probabilistic atlas and a spatially constrained deformable model.

    PubMed

    Martin, Sébastien; Troccaz, Jocelyne; Daanen, Vincent

    2010-04-01

    The authors present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. The approach requires the use of an anatomical atlas which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to get a probabilistic map on the atlas. The segmentation is then realized through a two stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves toward the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model and a statistical shape model. During the evolution of the surface, the probabilistic segmentation allows the introduction of a spatial constraint that prevents the deformable surface from leaking in an unlikely configuration. The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint is useful to increase the robustness of the deformable model comparatively to a deformable surface that is only driven by an image appearance model.
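
    The evaluation metric used here, the Dice similarity coefficient, is simple to compute from two binary masks. A minimal sketch with toy arrays standing in for the automatic and manual segmentations:

    ```python
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice similarity coefficient between two binary segmentation masks."""
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    # Toy 2D masks standing in for automatic vs. manual prostate contours
    auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True
    manual = np.zeros((10, 10), dtype=bool); manual[3:9, 3:9] = True
    print(f"Dice = {dice(auto, manual):.2f}")   # 1.0 would be perfect overlap
    ```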

  10. A Game-Theoretical Model to Improve Process Plant Protection from Terrorist Attacks.

    PubMed

    Zhang, Laobing; Reniers, Genserik

    2016-12-01

    The 9/11 terrorist attacks in New York City urged academia and industry alike to pay more attention to operational security research. The required focus in this type of research is human intention. Unlike safety-related accidents, security-related accidents have a deliberate nature, and one has to face intelligent adversaries with characteristics that traditional probabilistic risk assessment techniques are not capable of dealing with. In recent years, the mathematical tool of game theory, which can handle intelligent players, has been used in a variety of ways in terrorism risk assessment. In this article, we analyze the general intrusion detection system in process plants and propose a game-theoretical model for security management in such plants. Players in our model are assumed to be rational, and they play the game with complete information. Both the pure-strategy and the mixed-strategy solutions are explored and explained. We illustrate our model with an illustrative case and find that in our case no pure-strategy but, instead, a mixed-strategy Nash equilibrium exists. © 2016 Society for Risk Analysis.
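
    A mixed-strategy equilibrium of the kind found in the paper arises because each side randomizes so as to make the other indifferent. A minimal two-target attacker-defender sketch (hypothetical zero-sum payoffs, far simpler than the paper's model):

    ```python
    # Hypothetical two-target attacker-defender game (zero-sum).
    # The attacker gains V if the chosen target is undefended, v (< V) if defended.
    VA, vA = 10.0, 2.0   # target A: undefended / defended payoff to the attacker
    VB, vB = 6.0, 1.0    # target B

    # The defender defends A with probability q, chosen to make the attacker
    # indifferent between targets:  q*vA + (1-q)*VA = q*VB + (1-q)*vB
    q = (VA - vB) / ((VA - vB) + (VB - vA))
    value = q * vA + (1.0 - q) * VA
    print(f"defend A with p = {q:.2f}; attacker's expected payoff = {value:.2f}")
    ```

    No pure strategy survives here: a deterministic defender would simply be attacked at the other target, which is why the equilibrium must mix.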

  11. And the Humans Save the Day or Maybe They Ruin It: The Importance of Humans in the Loop

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Boyer, Roger; Bigler, Mark

    2017-01-01

    Flying a mission in space requires a massive commitment of resources, and without the talent and commitment of the people involved in this effort we would never leave the atmosphere of Earth. When we use the phrase "humans in the loop", it could apply to almost any endeavor since everything starts with humans developing a concept, completing the design process, building or implementing a product and using the product to achieve a goal or purpose. Narrowing the focus to spaceflights, there are a variety of individuals involved throughout the preparations for flight and the flight itself. All of the humans involved add value and support for program success. The purpose of this paper focuses on how a Probabilistic Risk Assessment (PRA) accounts for the human in the loop for potential missions using a technique called Human Reliability Analysis (HRA). Human actions can increase or decrease the overall risk via initiating events or mitigating them, thus removing the human from the loop doesn't always lower the risk.

  12. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  13. Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse

    EPA Science Inventory

    This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...

  14. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
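
    Hazard curves and their Bayesian percentiles can be illustrated by summarizing an ensemble of exceedance-probability curves. A minimal sketch with a synthetic ensemble standing in for the epistemic uncertainty (not BET_VH's actual event-tree computation):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Synthetic ensemble: each row is one epistemic realization of the
    # exceedance probability as a function of tephra load (kg/m^2)
    loads = np.linspace(0.1, 10.0, 50)
    ensemble = np.array([np.exp(-loads / rng.lognormal(0.0, 0.4))
                         for _ in range(1000)])

    mean_curve = ensemble.mean(axis=0)                     # "best guess" hazard curve
    p10, p90 = np.percentile(ensemble, [10, 90], axis=0)   # epistemic percentiles
    i = np.searchsorted(loads, 5.0)
    print(f"P(load > 5): mean = {mean_curve[i]:.3f}, "
          f"10th = {p10[i]:.3f}, 90th = {p90[i]:.3f}")
    ```

    Reading the percentile curves at a fixed intensity gives the Bayesian probability map for that threshold; reading along a single percentile gives one hazard curve.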

  15. What is in the feedback? Effect of induced happiness vs. sadness on probabilistic learning with vs. without exploration.

    PubMed

    Bakic, Jasmina; De Raedt, Rudi; Jepma, Marieke; Pourtois, Gilles

    2015-01-01

    According to dominant neuropsychological theories of affect, emotions signal the salience of events and in turn facilitate a wide spectrum of response options or action tendencies. The valence of an emotional experience is pivotal here, as it alters reward and punishment processing, as well as the balance between safety and risk taking, which can be translated into changes in the exploration-exploitation trade-off during reinforcement learning (RL). To test this idea, we compared the behavioral performance of three groups of participants who all completed a variant of a standard probabilistic learning task, but who differed in which mood state was induced and maintained (happy, sad or neutral). To foster a change from an exploration-based to an exploitation-based mode, we removed feedback information once learning was reliably established. Although the mood inductions were successful, learning performance was balanced between the three groups. Critically, when focusing on exploitation-driven learning only, the groups did not differ either. Moreover, mood valence did not alter the learning rate or exploration per se, when titrated using complementary computational modeling. By systematically comparing these results to our previous study (Bakic et al., 2014), we found that arousal levels did differ between studies, which might account for the limited modulatory effects of (positive) mood on RL in the present case. These results challenge the assumption that mood valence alone is enough to create strong shifts in the way exploitation or exploration is carried out during (probabilistic) learning. In this context, we discuss the possibility that both valence and arousal are necessary components of the emotional mood state to yield changes in the use and exploration of incentive cues during RL.
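
    The computational modeling referred to here is typically a delta-rule learner with a softmax choice rule, in which the learning rate and the inverse temperature (the exploration knob) are fitted per participant. A minimal sketch with hypothetical parameter values and reward contingencies:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def softmax(q, beta):
        e = np.exp(beta * (q - q.max()))   # subtract max for numerical stability
        return e / e.sum()

    # Delta-rule values plus softmax choice; beta is the exploration knob
    # (low beta -> more exploration, high beta -> more exploitation).
    alpha, beta = 0.2, 3.0                 # hypothetical fitted parameters
    p_reward = np.array([0.8, 0.2])        # hypothetical reward contingencies
    q = np.zeros(2)

    for _ in range(200):
        a = rng.choice(2, p=softmax(q, beta))
        r = float(rng.random() < p_reward[a])
        q[a] += alpha * (r - q[a])         # prediction-error update

    print("values:", q.round(2), "choice probabilities:", softmax(q, beta).round(2))
    ```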

  16. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before and during eruptions: BET_VHst for Mt. Etna

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Scollo, Simona; Costa, Antonio; Brancato, Alfonso; Prestifilippo, Michele

    2015-04-01

    Tephra dispersal, even in small amounts, may heavily affect public health and critical infrastructure, such as airports, train and road networks, and electric power supply systems. Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at managing and mitigating the risk posed by volcanic activity during crises and eruptions. Short-term PVHA (over time intervals on the order of hours to a few days) must account for rapidly changing information coming from the monitoring system, as well as updated wind forecasts, and it must be accomplished in near-real-time. In addition, while during unrest the primary goal is to forecast potential eruptions, during eruptions it is also fundamental to correctly account for the real-time status of the eruption and of tephra dispersal, as well as their potential evolution in the short term. Here, we present a preliminary application of the BET_VHst model (Selva et al. 2014) for Mt. Etna. The model has its roots in present-day deterministic procedures, and it deals with the large uncertainty that such procedures typically ignore, such as uncertainty in the potential position of the vent and the eruptive size, in the possible evolution of volcanological inputs during ongoing eruptions, and in the wind field. Uncertainty is treated by making use of Bayesian inference, alternative modeling procedures for tephra dispersal, and statistical mixing of long- and short-term analyses. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  17. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS's Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  18. Risk-based enteric pathogen reduction targets for non-potable and direct potable use of roof runoff, stormwater, and greywater

    EPA Science Inventory

    This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was use...

  19. Aviation Security, Risk Assessment, and Risk Aversion for Public Decisionmaking

    ERIC Educational Resources Information Center

    Stewart, Mark G.; Mueller, John

    2013-01-01

    This paper estimates risk reductions for each layer of security designed to prevent commercial passenger airliners from being commandeered by terrorists, kept under control for some time, and then crashed into specific targets. Probabilistic methods are used to characterize the uncertainty of rates of deterrence, detection, and disruption, as well…

  20. The Integrated Medical Model - A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles G.; Saile, Lynn; FreiredeCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma

    2010-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.

  1. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211% to 351% over the period 2000-2030 (5th and 95th percentiles). Driven mainly by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.
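
    Annual expected damage of the kind mapped here is commonly obtained by integrating damage against annual exceedance probability over a set of return periods. A sketch with invented numbers (not GLOFRIS output):

        # Expected annual damage (EAD) via trapezoidal integration of the
        # damage-exceedance curve; return periods and damages are illustrative.
        return_periods = [2, 10, 50, 100, 500]   # years
        damages = [0.1, 0.8, 2.5, 4.0, 9.0]      # damage per event (e.g., bn US$)

        aep = [1.0 / t for t in return_periods]  # annual exceedance probabilities
        pairs = sorted(zip(aep, damages), reverse=True)
        ead = sum(0.5 * (d0 + d1) * (p0 - p1)
                  for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]))
        print(f"expected annual damage: {ead:.3f}")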

  2. The ARGO Project: assessing NA-TECH risks on off-shore oil platforms

    NASA Astrophysics Data System (ADS)

    Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo

    2017-04-01

    ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a two-year project funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. In order to achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. Finally, a probabilistic multi-risk framework has been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.

  3. PRA and Conceptual Design

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Fuqua, Bryan; Wilson, Paul

    2013-01-01

    Once a project obtains approval, decision makers have to consider a variety of alternative paths for completing the project and meeting the project objectives. How decisions are made involves a variety of elements, including cost, experience, current technology, ideologies, politics, future needs and desires, capabilities, manpower, timing, and available information; for many ventures, management also needs to weigh risk against reward. The use of high-level Probabilistic Risk Assessment (PRA) models during conceptual design phases provides management with additional information during the decision-making process regarding the risk potential of proposed operations and design prototypes. The methodology can be used as a tool to: 1) allow trade studies to compare alternatives based on risk, 2) determine which elements (equipment, process or operational parameters) drive the risk, and 3) provide information to mitigate or eliminate risks early in the conceptual design to lower costs. Creating system models using conceptual design proposals and generic key systems based on what is known today can provide an understanding of the magnitudes of proposed system and operational risks and facilitates trade study comparisons early in the decision-making process. Identifying the "best" way to achieve the desired results is difficult, and generally occurs based on limited information. PRA provides a tool for decision makers to explore how some decisions will affect risk before the project is committed to that path, which can ultimately save time and money.
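
    The high-level PRA arithmetic behind such trade studies can be sketched as follows: with independent basic events, an OR gate combines failure probabilities as 1 - prod(1 - p) and an AND gate (redundancy) as prod(p). The probabilities and the two candidate designs below are illustrative, not from any NASA model.

        from math import prod

        def or_gate(probs):
            """P(at least one of several independent failures)."""
            return 1.0 - prod(1.0 - p for p in probs)

        def and_gate(probs):
            """P(all independent failures occur together)."""
            return prod(probs)

        valve_fail, pump_fail, control_fail = 1e-3, 5e-4, 2e-4
        design_a = or_gate([valve_fail, pump_fail, control_fail])   # single string
        design_b = or_gate([and_gate([valve_fail, valve_fail]),     # redundant valves
                            pump_fail, control_fail])
        print(f"design A loss probability: {design_a:.2e}, design B: {design_b:.2e}")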

  4. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle System risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  5. Probabilistic Seasonal Forecasts to Deterministic Farm Level Decisions: Innovative Approach

    NASA Astrophysics Data System (ADS)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate risk development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  6. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainty in the lines of evidence, including spatial and temporal variability as well as measurement error. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
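
    The core idea of probabilistically weighing lines of evidence can be sketched in odds form: start from a prior probability of impact and update the odds with one likelihood ratio per line of evidence. The likelihood ratios below are invented for illustration; the paper's actual model is a full Bayesian network.

        def update(prior, likelihood_ratios):
            """Odds-form Bayes update over independent lines of evidence."""
            odds = prior / (1.0 - prior)
            for lr in likelihood_ratios:   # LR > 1 favors "adverse effect"
                odds *= lr
            return odds / (1.0 + odds)

        prior = 0.2  # prior probability of adverse effects
        evidence = {"sediment chemistry": 3.0, "bioassay": 2.0, "benthic diversity": 0.8}
        posterior = update(prior, evidence.values())
        print(f"posterior probability of impact: {posterior:.2f}")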

  7. Probabilistic Risk Assessment Process for High-Power Laser Operations in Outdoor Environments

    DTIC Science & Technology

    2016-01-01

    ... avionics data bus. In the case of a UAS-mounted laser system, the control path will additionally include a radio or satellite communications link. ... hazard assessment purposes is not widespread within the laser safety community. The aim of this paper is to outline the basis of the probabilistic ...

  8. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
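
    The quoted "100-year" and "500-year" amplitudes correspond to 1% and 0.2% annual exceedance probabilities; under the usual Poisson arrival assumption, the chance of at least one exceedance over an exposure window follows directly. A quick sketch:

        import math

        def prob_exceedance(return_period_years, exposure_years):
            """P(at least one exceedance in the window), assuming Poisson arrivals."""
            rate = 1.0 / return_period_years
            return 1.0 - math.exp(-rate * exposure_years)

        for t in (100, 500):
            print(f"{t}-yr tsunami: annual P = {1 / t:.3%}, "
                  f"P over 50 years = {prob_exceedance(t, 50):.1%}")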

  9. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  10. Regional probabilistic risk assessment of heavy metals in different environmental media and land uses: An urbanization-affected drinking water supply area

    NASA Astrophysics Data System (ADS)

    Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping

    2016-11-01

    In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks of exposing residents to heavy metals in different environmental media and land uses. The mean and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks of cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure risks during outdoor recreation, farming and swimming along the Shunde Waterway are theoretically safe. Arsenic and cadmium were identified as the priority pollutants that contribute the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.
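
    Percentile statements such as "95th percentiles of total HQs greater than 3" come from propagating input distributions through HQ = average daily dose / reference dose. A generic Monte Carlo sketch with made-up lognormal inputs (not the study's data):

        import random

        def simulate_total_hq(n=10_000):
            """Total HQ for a single ingestion route; all inputs are invented."""
            totals = []
            for _ in range(n):
                conc = random.lognormvariate(0.0, 0.8)     # mg/kg in soil
                intake = random.lognormvariate(-4.0, 0.5)  # kg soil ingested/day
                bw = max(random.gauss(60.0, 10.0), 1.0)    # body weight, kg
                add = conc * intake / bw                   # average daily dose
                rfd = 3e-4                                 # reference dose, mg/kg-day
                totals.append(add / rfd)
            totals.sort()
            return totals[int(0.95 * n)]                   # 95th percentile

        print(f"95th percentile total HQ: {simulate_total_hq():.2f}")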

  11. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    DOT National Transportation Integrated Search

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  12. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritize regions for risk mitigation activities and/or more detailed hazard or risk assessment.
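
    The logic-tree treatment of epistemic uncertainty amounts to drawing whole modeling assumptions (e.g., a maximum magnitude) with fixed weights before generating each synthetic catalogue, with aleatory variability then sampled event by event inside it. A stripped-down sketch with invented branches:

        import random

        # Epistemic branches with logic-tree weights (values illustrative)
        mmax_branches = [(8.5, 0.3), (9.0, 0.5), (9.5, 0.2)]

        def sample_branch(branches):
            r, cum = random.random(), 0.0
            for value, weight in branches:
                cum += weight
                if r <= cum:
                    return value
            return branches[-1][0]

        # One epistemic draw fixes Mmax for an entire synthetic catalogue
        for _ in range(3):
            print(f"catalogue generated with Mmax = {sample_branch(mmax_branches)}")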

  13. Aggregate exposure approaches for parabens in personal care products: a case assessment for children between 0 and 3 years old

    PubMed Central

    Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke

    2014-01-01

    In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight into how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) to exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following the deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use was obtained from a small survey of consumers. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population whose exposure exceeds the level corresponding to the assumed “safe” MoE of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including its uncertainty, and allows the main drivers of exposure to be determined. Furthermore, it allows estimation of the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276
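
    The tier-2 logic can be pictured as a person-oriented Monte Carlo: draw product use and uptake per simulated child, compute MoE = NOAEL / internal dose, and report the fraction of the population below the assumed "safe" MoE of 100. Every input value below is an invented placeholder, not a number from the study.

        import random

        def fraction_below_safe_moe(noael=2.0, safe_moe=100.0, n=20_000):
            """Person-oriented Monte Carlo for a single paraben (inputs invented)."""
            below = 0
            for _ in range(n):
                amount = random.lognormvariate(-1.0, 0.7)    # g product applied/day
                conc = random.uniform(0.0, 0.004)            # paraben mass fraction
                absorbed = random.uniform(0.1, 0.4)          # dermal absorption fraction
                bw = max(random.gauss(10.0, 2.0), 2.0)       # child body weight, kg
                dose = amount * 1000 * conc * absorbed / bw  # mg/kg bw/day
                if dose > 0 and noael / dose < safe_moe:
                    below += 1
            return below / n

        print(f"fraction of population with MoE < 100: {fraction_below_safe_moe():.1%}")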

  14. The Use of the Direct Optimized Probabilistic Calculation Method in Design of Bolt Reinforcement for Underground and Mining Workings

    PubMed Central

    Krejsa, Martin; Janas, Petr; Yilmaz, Işık; Marschalko, Marian; Bouchal, Tomas

    2013-01-01

    The load-carrying system of each construction should fulfill several conditions which represent reliability criteria in the assessment procedure. It is the theory of structural reliability which determines the probability that a construction keeps its required properties. Using this theory, it is possible to apply probabilistic computations based on probability theory and mathematical statistics. Such methods have become more and more popular; they are used, in particular, in designs of load-carrying structures with a required level of reliability when at least some input variables in the design are random. The objective of this paper is to indicate the current scope which might be covered by the new method—Direct Optimized Probabilistic Calculation (DOProC)—in assessments of the reliability of load-carrying structures. DOProC uses a purely numerical approach without any simulation techniques. This provides more accurate solutions to probabilistic tasks and, in some cases, considerably faster completion of computations. DOProC can be used to solve a number of probabilistic computations efficiently. A very good sphere of application for DOProC is the assessment of bolt reinforcement in underground and mining workings. For the purposes above, a special software application—“Anchor”—has been developed. PMID:23935412
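
    The flavor of a direct (non-simulation) probabilistic calculation can be conveyed by discretizing the load and resistance distributions into histograms and summing the failure probability P(R - L < 0) exactly over the grid; DOProC optimizes this kind of computation, and the triangular distributions below are purely illustrative.

        def discretize(mean, spread, bins=201):
            """Symmetric triangular distribution as (value, probability) pairs."""
            lo, hi = mean - spread, mean + spread
            step = (hi - lo) / (bins - 1)
            pts = [lo + i * step for i in range(bins)]
            weights = [spread - abs(x - mean) for x in pts]
            total = sum(weights)
            return [(x, w / total) for x, w in zip(pts, weights)]

        load = discretize(mean=100.0, spread=30.0)        # e.g., anchor load, kN
        resistance = discretize(mean=160.0, spread=40.0)  # e.g., bolt capacity, kN

        p_fail = sum(pl * pr for l, pl in load for r, pr in resistance if r < l)
        print(f"probability of failure: {p_fail:.2e}")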

  15. Probabilistic simple sticker systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
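
    The probability bookkeeping described above is easy to state concretely: if each axiom carries a probability, a derived string's probability is the product over all axiom occurrences used in its derivation. A toy illustration (not an implementation of the sticker operations themselves):

        axioms = {"AT": 0.5, "CG": 0.3, "TA": 0.2}  # axiom fragments with probabilities

        def derivation_probability(used_axioms):
            """Multiply the probabilities of every axiom occurrence used."""
            p = 1.0
            for a in used_axioms:
                p *= axioms[a]
            return p

        # A derivation built from the occurrences AT, CG, AT:
        print(derivation_probability(["AT", "CG", "AT"]))  # 0.5 * 0.3 * 0.5 = 0.075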

  16. Application of Probabilistic Modeling to Quantify the Reduction Levels of Hepatocellular Carcinoma Risk Attributable to Chronic Aflatoxins Exposure.

    PubMed

    Wambui, Joseph M; Karuri, Edward G; Ojiambo, Julia A; Njage, Patrick M K

    2017-01-01

    Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus infection further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive, from secondary data, probability distributions of HCC cases and of the percentage reduction levels of the risk. Contaminated maize and groundnuts contributed 1,847 ± 514 and 158 ± 52 HCC cases per annum, respectively. The total contribution of both foods to the risk was additive, as it resulted in 2,000 ± 518 cases per annum. Consumption and contamination levels contributed significantly to the risk, with lower age groups most affected. Nonetheless, pre- and postharvest interventions might reduce the risk by 23.0-83.4% and 4.8-95.1%, respectively. Therefore, chronic exposure to aflatoxins increases the HCC risk in rural Kenya, but a significant reduction of the risk can be achieved by applying specific pre- and postharvest interventions.
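
    The central calculation in assessments of this kind multiplies aflatoxin intake by a carcinogenic potency that depends on hepatitis B prevalence, then scales to the population. A compressed Monte Carlo sketch; the intake distribution, HBV prevalence and potency values are illustrative stand-ins, not the paper's data:

        import random

        def hcc_cases(population=1_000_000, n=10_000):
            """Distribution of annual HCC cases from aflatoxin exposure."""
            draws = []
            for _ in range(n):
                intake = random.lognormvariate(3.0, 0.6)  # ng aflatoxin/kg bw/day
                hbv = random.uniform(0.05, 0.10)          # HBV-positive fraction
                # potency: cases/yr per 100,000 per ng/kg bw/day, by HBV status
                potency = hbv * 0.3 + (1 - hbv) * 0.01
                draws.append(intake * potency / 100_000 * population)
            draws.sort()
            return sum(draws) / n, draws[int(0.05 * n)], draws[int(0.95 * n)]

        mean, lo, hi = hcc_cases()
        print(f"cases/yr: mean {mean:.0f} (90% interval {lo:.0f}-{hi:.0f})")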

  17. Incorporating psychological influences in probabilistic cost analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the scope and magnitude of the cost-overrun problem, the benefits are likely to be significant.
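
    A compressed sketch of two of the paper's ingredients: correlated cost elements (here, two elements coupled through a Gaussian copula) with three-parameter Weibull marginals, and the MAIMS principle applied as element cost = max(allocated budget, actual cost), since underruns are not returned. The correlation, Weibull parameters and budgets are illustrative.

        import math
        import random

        def weibull3_inv(u, shape, scale, loc):
            """Inverse CDF of a three-parameter Weibull distribution."""
            u = min(max(u, 1e-12), 1 - 1e-12)  # guard against log(0)
            return loc + scale * (-math.log(1.0 - u)) ** (1.0 / shape)

        def norm_cdf(z):
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        def simulate_project(rho=0.6, n=20_000):
            budgets = (10.0, 15.0)                           # allocated budgets
            marginals = ((2.0, 4.0, 8.0), (2.5, 6.0, 11.0))  # (shape, scale, loc)
            totals = []
            for _ in range(n):
                z1 = random.gauss(0.0, 1.0)                  # Gaussian copula draws
                z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
                total = 0.0
                for z, budget, (k, lam, loc) in zip((z1, z2), budgets, marginals):
                    cost = weibull3_inv(norm_cdf(z), k, lam, loc)
                    total += max(budget, cost)               # MAIMS: underruns vanish
                totals.append(total)
            totals.sort()
            return sum(totals) / n, totals[int(0.8 * n)]

        mean, p80 = simulate_project()
        print(f"mean project cost: {mean:.1f}, 80th percentile: {p80:.1f}")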

  18. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct assessment of probabilistic seismic hazard and risk maps is the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach that was developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Due to this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods. These ground motion models have been used in the probabilistic seismic hazard assessment. The obtained seismic hazard maps show evidence that there were gaps in earlier seismic hazard assessments and that the present normative seismic hazard map needed a careful recalculation.

  19. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because once we set the design-basis tsunami height, there remains a possibility that the actual tsunami height exceeds it due to uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analyses. However, the evaluation method for estimating the fragility of structures and the procedure of system analysis are still being developed. (authors)

  20. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
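
    The essence of such a framework can be pictured as a nested Monte Carlo: for every risk item with unknown coordinates, a location is drawn inside its known region per simulation and portfolio loss statistics are accumulated. The sketch below samples uniformly in a bounding box and uses a toy distance-decay loss, both of which are assumptions for illustration rather than the authors' scheme.

        import random

        # Portfolio items: value plus fixed coordinates, or a bounding box
        portfolio = [
            {"value": 5.0, "loc": (11.58, 48.14)},            # known coordinates
            {"value": 3.0, "box": (11.0, 12.0, 48.0, 49.0)},  # unknown: sample
        ]

        def toy_loss(x, y, value, epicenter=(11.5, 48.2)):
            """Loss decaying with squared distance from one event's epicenter."""
            d2 = (x - epicenter[0]) ** 2 + (y - epicenter[1]) ** 2
            return value * max(0.0, 0.3 - d2)

        def simulate(n=10_000):
            losses = []
            for _ in range(n):
                total = 0.0
                for item in portfolio:
                    if "loc" in item:
                        x, y = item["loc"]
                    else:
                        x0, x1, y0, y1 = item["box"]
                        x, y = random.uniform(x0, x1), random.uniform(y0, y1)
                    total += toy_loss(x, y, item["value"])
                losses.append(total)
            mean = sum(losses) / n
            std = (sum((l - mean) ** 2 for l in losses) / (n - 1)) ** 0.5
            return mean, std

        mean, std = simulate()
        print(f"event loss: mean {mean:.3f}, std from location uncertainty {std:.3f}")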

  1. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    PubMed

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.
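
    One standard way to handle the varying-amplitude loading mentioned here is Miner's linear damage rule with a scattered S-N capacity; the sketch below is that generic textbook treatment, not the authors' system reliability method, and the load spectrum and S-N parameters are invented.

        import math
        import random

        def fatigue_failure_prob(years=50, n=20_000):
            """Monte Carlo on Miner's rule under a two-block load spectrum."""
            spectrum = [(60.0, 5.0e5), (120.0, 2.0e4)]  # (stress range MPa, cycles/yr)
            m = 3.0                                     # S-N slope
            failures = 0
            for _ in range(n):
                log_a = random.gauss(30.0, 0.5)         # scatter in S-N intercept
                capacity = math.exp(log_a)              # N(S) = capacity / S**m
                damage = years * sum(cyc * s ** m / capacity for s, cyc in spectrum)
                if damage >= 1.0:                       # Miner's sum reaches unity
                    failures += 1
            return failures / n

        print(f"P(fatigue failure within 50 yr): {fatigue_failure_prob():.2%}")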

  2. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair

    PubMed Central

    Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon

    2017-01-01

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768

  3. Analysis and probabilistic risk assessment of bioaccessible arsenic in polished and husked jasmine rice sold in Bangkok.

    PubMed

    Hensawang, Supanad; Chanpiwat, Penradee

    2018-09-01

    Food is one of the major sources of arsenic (As) exposure in humans. The objectives of this study were to determine the bioaccessible concentration of As in rice grain sold in Bangkok and to evaluate the potential health risks associated with rice consumption. Polished (n = 32) and husked (n = 17) jasmine rice were collected from local markets. In vitro digestion was performed to determine the bioaccessible As concentrations, which were used for probabilistic health risk assessments in different age groups of the population. Approximately 43.0% and 44.4% of the total As in the grain of polished and husked rice, respectively, was in the form of bioaccessible As. Significantly higher bioaccessible As concentrations were found in husked rice than in polished rice (1.5-3.8 times greater). The concentrations of bioaccessible As in polished and husked rice were lower than the Codex standard for As in rice. The average daily dose of As via rice consumption is equivalent to the daily ingestion of 2 L of water containing approximately 3.2-7.2 μg/L of As. Approximately 0.2%-13.7% and 10.7%-55.3% of the population may experience non-carcinogenic effects from polished and husked rice consumption, respectively. Approximately 1%-11.6% of children and 74.1%-99.8% of adults were at risk of cancer. The maximum cancer probabilities were 3 in 10,000 for children and 6 in 10,000 for adults. The probabilistic risk results indicated that children and adults were at risk of both non-carcinogenic and carcinogenic effects from consumption of both types of rice. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Clinical predictors of conversion to bipolar disorder in a prospective longitudinal familial high-risk sample: focus on depressive features.

    PubMed

    Frankland, Andrew; Roberts, Gloria; Holmes-Preston, Ellen; Perich, Tania; Levy, Florence; Lenroot, Rhoshel; Hadzi-Pavlovic, Dusan; Breakspear, Michael; Mitchell, Philip B

    2017-11-07

    Identifying clinical features that predict conversion to bipolar disorder (BD) in those at high familial risk (HR) would assist in identifying a more focused population for early intervention. In total 287 participants aged 12-30 (163 HR with a first-degree relative with BD and 124 controls (CONs)) were followed annually for a median of 5 years. We used the baseline presence of DSM-IV depressive, anxiety, behavioural and substance use disorders, as well as a constellation of specific depressive symptoms (as identified by the Probabilistic Approach to Bipolar Depression) to predict the subsequent development of hypo/manic episodes. At baseline, HR participants were significantly more likely to report ⩾4 Probabilistic features (40.4%) when depressed than CONs (6.7%; p < .05). Nineteen HR subjects later developed either threshold (n = 8; 4.9%) or subthreshold (n = 11; 6.7%) hypo/mania. The presence of ⩾4 Probabilistic features was associated with a seven-fold increase in the risk of 'conversion' to threshold BD (hazard ratio = 6.9, p < .05) above and beyond the fourteen-fold increase in risk related to major depressive episodes (MDEs) per se (hazard ratio = 13.9, p < .05). Individual depressive features predicting conversion were psychomotor retardation and ⩾5 MDEs. Behavioural disorders only predicted conversion to subthreshold BD (hazard ratio = 5.23, p < .01), while anxiety and substance disorders did not predict either threshold or subthreshold hypo/mania. This study suggests that specific depressive characteristics substantially increase the risk of young people at familial risk of BD going on to develop future hypo/manic episodes and may identify a more targeted HR population for the development of early intervention programs.

  5. Development of a probabilistic PCB-bioaccumulation model for six fish species in the Hudson River

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stackelberg, K. von; Menzie, C.

    1995-12-31

    In 1984 the US Environmental Protection Agency (USEPA) completed a Feasibility Study on the Hudson River that investigated remedial alternatives and issued a Record of Decision (ROD) later that year. In December 1989 USEPA decided to reassess the No Action decision for Hudson River sediments. This reassessment consists of three phases: Interim Characterization and Evaluation (Phase 1); Further Site Characterization and Analysis (Phase 2); and Feasibility Study (Phase 3). A Phase 1 report was completed in August 1991. The team then completed a Final Work Plan for Phase 2 in September 1992. This work plan identified various PCB fate and transport modeling activities to support the Hudson River PCB Reassessment Remedial Investigation and Feasibility Study (RI/FS). This talk describes the development of probabilistic bioaccumulation models for the uptake of PCBs on a congener-specific basis in six fish species. The authors have developed a framework for relating body burdens of PCBs in fish to exposure concentrations in Hudson River water and sediments. This framework is used to understand historical and current relationships as well as to predict fish body burdens for future conditions under specific remediation and no-action scenarios. The framework incorporates a probabilistic approach to predict distributions of PCB body burdens for selected fish species. These models can predict single population statistics such as the average expected values of PCBs under specific scenarios as well as the distribution of expected concentrations.

  6. Sustainable Odds: Towards Quantitative Decision Support when Relevant Probabilities are not Available

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2012-04-01

    There is, at present, no attractive foundation for quantitative probabilistic decision support in the face of model inadequacy, or given ambiguity (deep uncertainty) regarding the relative likelihood of various outcomes, known or unknown. True model error arguably precludes the extraction of objective probabilities from an ensemble of model runs drawn from an available (inadequate) model class, while the acknowledgement of incomplete understanding precludes the justified use of (if not the very formation of) an individual's subjective probabilities. An alternative approach based on Sustainable Odds is proposed and investigated. Sustainable odds differ from "fair odds" (and are easily distinguished from any claim implying well-defined probabilities) in that the probabilities implied by sustainable odds, summed over all outcomes, are expected to exceed one. Traditionally, a person's fair odds are found by identifying the probability level at which they would happily accept either side of a bet; thus the probabilities implied by fair odds always sum to one. Knowing that one has incomplete information and perhaps even erroneous beliefs, there is no compelling reason a rational agent should accept the constraint implied by "fair odds" in any bet. Rather, a rational agent might insist on longer odds both on the event and against the event in order to account for acknowledged ignorance. Let probabilistic odds denote any set of odds for which the implied probabilities sum to one; once model error is acknowledged, can one rationally demand non-probabilistic odds? The danger of using fair odds (or probabilities) in decision making is illustrated by considering the risk of ruin to which a cooperative insurance scheme using probabilistic odds is exposed. Cases are presented where knowing merely that the insurer's model is imperfect, and nothing else, is sufficient to place bets which drive the insurer to an unexpectedly early ruin. Methodologies which allow the insurer to avoid this early ruin are explored; those which prevent early ruin are said to provide "sustainable odds", and it is suggested that these must be non-probabilistic. The aim here is not for the insurance cooperative to make a profit in the long run (or to form a book in any one round) but rather to increase the chance that the cooperative will not go bust, merely breaking even in the long run and thereby continuing to provide a service. In the perfect model scenario, with complete knowledge of all uncertainties and unlimited computational resources, fair odds may prove to be sustainable. The implications these results hold for games against nature, which is perhaps a more relevant context for decision makers concerned with geophysical systems, are discussed. The claim that acknowledged model error makes fair (probabilistic) odds an irrational aim is considered, as are the challenges of working within the framework of sustainable (but non-probabilistic) odds.
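
    The numerical distinction drawn here is simple to state: decimal odds imply probabilities 1/odds, "fair" odds make those sum to exactly one, and sustainable odds shorten both sides so the implied probabilities sum to more than one, leaving a margin against model error. Illustrative numbers:

        def implied_probabilities(decimal_odds):
            """Implied probability of each outcome is the inverse of its decimal odds."""
            return [1.0 / o for o in decimal_odds]

        fair = [2.0, 2.0]         # even money both ways: implied probs sum to 1.0
        sustainable = [1.8, 1.8]  # shorter odds both ways: sum exceeds 1.0

        for name, odds in (("fair", fair), ("sustainable", sustainable)):
            probs = implied_probabilities(odds)
            print(f"{name}: implied probabilities {probs}, sum = {sum(probs):.2f}")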

  7. The impact of numeracy on reactions to different graphic risk presentation formats: An experimental analogue study.

    PubMed

    Wright, Alison J; Whitwell, Sophia C L; Takeichi, Chika; Hankins, Matthew; Marteau, Theresa M

    2009-02-01

    Numeracy, the ability to process basic mathematical concepts, may affect responses to graphical displays of health risk information. Displays of probabilistic risk information using grouped dots are easier to understand than displays using dispersed dots. However, dispersed dots may better convey the randomness with which health threats occur, so increasing perceived susceptibility. We hypothesized that low numeracy participants would better understand risks presented using grouped dot displays, while high numeracy participants would have good understanding, regardless of display type. Moreover, we predicted that dispersed dot displays, in contrast to grouped dot displays, would increase risk perceptions and worry only for highly numerate individuals. One hundred and forty smokers read vignettes asking them to imagine being at risk of Crohn's disease, in a 2(display type: dispersed/grouped dots) x 3(risk magnitude: 3%/6%/50%) x 2(numeracy: high/low) design. They completed measures of risk comprehension, perceived susceptibility and worry. More numerate participants had better objective risk comprehension, but this effect was not moderated by display type. There was marginally significant support for the predicted numeracy x display type interaction for worry about Crohn's disease, but not for perceived susceptibility to the condition. Dispersed dot displays somewhat increase worry in highly numerate individuals, but only numeracy influenced objective risk comprehension. The most effective display type for communicating risk information will depend on the numeracy of the population and the goal(s) of the communication.

  8. Near-Earth Phase Risk Comparison of Human Mars Campaign Architectures

    NASA Technical Reports Server (NTRS)

    Manning, Ted A.; Nejad, Hamed S.; Mattenberger, Chris

    2013-01-01

    A risk analysis of the launch, orbital assembly, and Earth-departure phases of human Mars exploration campaign architectures was completed as an extension of a probabilistic risk assessment (PRA) originally carried out under the NASA Constellation Program Ares V Project. The objective of the updated analysis was to study the sensitivity of loss-of-campaign risk to architectural factors such as the composition of the propellant delivery portion of the launch vehicle fleet (Ares V heavy-lift launch vehicle vs. smaller/cheaper commercial launchers) and the degree of launcher or Mars-bound spacecraft element sparing. Both a static PRA analysis and a dynamic, event-based Monte Carlo simulation were developed and used to evaluate the probability of loss of campaign under different sparing options. Results showed that with no sparing, loss-of-campaign risk is strongly driven by launcher count and on-orbit loiter duration, favoring an all-Ares V launch approach. Further, the reliability of the all-Ares V architecture showed significant improvement with the addition of a single spare launcher/payload. Among architectures utilizing a mix of Ares V and commercial launchers, those that minimized the on-orbit loiter duration of Mars-bound elements were found to exceed the reliability of the no-spare all-Ares V campaign if unlimited commercial vehicle sparing was assumed.
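
    Ignoring loiter effects, the benefit of sparing can be sketched with simple binomial arithmetic: a campaign needing k successful launches out of k + s attempts succeeds with the cumulative binomial probability below. The per-launch reliability and launch count are illustrative, not values from the study.

        from math import comb

        def campaign_success(p_launch=0.95, launches=6, spares=0):
            """P(at least `launches` successes out of launches + spares attempts)."""
            n = launches + spares
            return sum(comb(n, k) * p_launch ** k * (1 - p_launch) ** (n - k)
                       for k in range(launches, n + 1))

        for s in range(3):
            print(f"{s} spare(s): P(campaign proceeds) = {campaign_success(spares=s):.3f}")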

  9. How to Reduce the Effect of Framing on Messages About Health

    PubMed Central

    Galesic, Mirta

    2010-01-01

    ABSTRACT BACKGROUND Patients must be informed about risks before any treatment can be implemented. Yet serious problems in communicating these risks occur because of framing effects. OBJECTIVE To investigate the effects of different information frames when communicating health risks to people with high and low numeracy and determine whether these effects can be countered or eliminated by using different types of visual displays (i.e., icon arrays, horizontal bars, vertical bars, or pies). DESIGN Experiment on probabilistic, nationally representative US (n = 492) and German (n = 495) samples, conducted in summer 2008. OUTCOME MEASURES Participants’ risk perceptions of the medical risk expressed in positive (i.e., chances of surviving after surgery) and negative (i.e., chances of dying after surgery) terms. KEY RESULTS Although low‐numeracy people are more susceptible to framing than those with high numeracy, use of visual aids is an effective method to eliminate its effects. However, not all visual aids were equally effective: pie charts and vertical and horizontal bars almost completely removed the effect of framing. Icon arrays, however, led to a smaller decrease in the framing effect. CONCLUSIONS Difficulties with understanding numerical information often do not reside in the mind, but in the representation of the problem. PMID:20737295

  10. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented where we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows the system to be decomposed into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
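
    With independent basic events, the top-event probability follows from the minimal cut sets by inclusion-exclusion. A sketch mirroring the three receptor paths described above; all basic-event probabilities are invented:

        from itertools import combinations
        from math import prod

        basic = {"cover_fails": 0.05, "clay_fails": 0.01,
                 "drain_fails": 0.02, "bund_fails": 0.03}
        cut_sets = [("cover_fails", "clay_fails"),   # karst aquifer path
                    ("cover_fails", "drain_fails"),  # stream path
                    ("bund_fails",)]                 # well protection area path

        # Inclusion-exclusion over the (small) set of minimal cut sets
        p_fail = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                events = set(e for cs in combo for e in cs)
                term = prod(basic[e] for e in events)
                p_fail += term if k % 2 == 1 else -term
        print(f"P(system failure): {p_fail:.4f}")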

  11. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  12. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  13. Assessing social and economic effects of perceived risk: Workshop summary: Draft: BWIP Repository Project. [Basalt Waste Isolation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nealey, S.M.; Liebow, E.B.

    1988-03-01

    The US Department of Energy sponsored a one-day workshop to discuss the complex dimensions of risk judgment formation and the assessment of social and economic effects of risk perceptions related to the permanent underground storage of highly radioactive waste from commercial nuclear power plants. Affected parties have publicly expressed concerns about potentially significant risk-related effects of this approach to waste management. A selective review of relevant literature in psychology, decision analysis, economics, sociology, and anthropology was completed, along with an examination of decision analysis techniques that might assist in developing suitable responses to public risk-related concerns. The workshop was organized as a forum in which a set of distinguished experts could exchange ideas and observations about the problems of characterizing the effects of risk judgments. Out of the exchange emerged the following issues and themes: problems with probabilistic risk assessment techniques are evident; differences exist in the way experts and laypersons view risk, and this leads to higher levels of public concern than experts feel are justified; experts, risk managers, and decision-makers sometimes err in assessing risk and in dealing with the public; credibility and trust are important contributing factors in the formation of risk judgments; social and economic consequences of perceived risk should be properly anticipated; improvements can be made in informing the public about risk; the role of the public in risk assessment, risk management, and decisions about risk should be reconsidered; and mitigation and compensation are central to resolving conflicts arising from divergent risk judgments.

  14. Shuttle Risk Progression by Flight

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri; Kahn, Joe; Thigpen, Eric; Zhu, Tony; Lo, Yohon

    2011-01-01

    Understanding the early mission risk and the progression of risk as a vehicle gains insights through flight is important: (a) to the Shuttle Program, to understand the impact of re-designs and operational changes on risk; and (b) to new programs, to understand reliability growth and first-flight risk. Estimation of the Shuttle risk progression by flight: (a) uses the Shuttle Probabilistic Risk Assessment (SPRA) and current knowledge to calculate early vehicle risk; (b) shows the impact of major Shuttle upgrades; and (c) can be used to understand first-flight risk for new programs.

  15. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans, with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver, were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10%, in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources, including the current effort, even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.
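
    A minimal sketch of the probabilistic potency-factor step, assuming a hypothetical lognormal ED(10) distribution and an arbitrary dosimetric scaling (neither reflects the reported posteriors):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 12_500  # matches the minimum iteration count cited above

        # Hypothetical ED10 distribution (mg metabolized/L tissue/day); the paper's
        # actual posterior distributions are not reproduced here.
        ed10 = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)

        potency = 0.1 / ed10  # potency factor 0.1/ED(10), as in the abstract
        urf = potency * 1e-3  # hypothetical dosimetric scaling to a unit risk factor

        mean_urf = urf.mean()
        p95_urf = np.percentile(urf, 95)
        print(f"mean URF {mean_urf:.2e}, 95th pct {p95_urf:.2e}, "
              f"ratio {p95_urf / mean_urf:.1f}")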

  16. Development, Application, and Implementation of RAMCAP to Characterize Nuclear Power Plant Risk From Terrorism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaertner, John P.; Teagarden, Grant A.

    2006-07-01

    In response to increased interest in risk-informed decision making regarding terrorism, EPRI and ERIN Engineering were selected by U.S. DHS and ASME to develop and demonstrate the RAMCAP method for nuclear power plant (NPP) risk assessment. The objective is to characterize plant-specific NPP risk for risk management opportunities and to provide consistent information for DHS decision making. This paper is an update of this project presented at the American Nuclear Society (ANS) International Topical Meeting on Probabilistic Safety Analysis (PSA05) in September 2005. The method uses a characterization of risk as a function of Consequence, Vulnerability, and Threat. For each site, worst-case scenarios are developed for each of sixteen benchmark threats. Nuclear RAMCAP hypothesizes that the intent of the perpetrator is to cause offsite radiological consequences. Specific targets are the reactor core, the spent fuel pool, and nuclear spent fuel in a dry storage facility (ISFSI). Results for each scenario are presented as conditional risk for financial loss, early fatalities, and early injuries. Expected consequences for each scenario are quantified, while vulnerability is estimated on a relative likelihood scale. Insights for other societal risks are provided. Although threat frequencies are not provided, target attractiveness and threat deterrence are estimated. To assure efficiency, completeness, and consistency, results are documented using standard RAMCAP Evaluator software. Trial applications were successfully performed at four plant sites. Implementation at all other U.S. commercial sites is underway, supported by the Nuclear Sector Coordinating Council (NSCC). Insights from RAMCAP results at the 23 U.S. plants completed to date have been compiled and presented to the NSCC. Results are site-specific. Physical security barriers, an armed security force, preparedness for design-basis threats, rugged design against natural hazards, multiple barriers between fuel and environment, accident mitigation capability, severe accident management procedures, and offsite emergency plans are risk-beneficial against all threat types.

  17. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  18. Risk assessment of CST-7 proposed waste treatment and storage facilities Volume I: Limited-scope probabilistic risk assessment (PRA) of proposed CST-7 waste treatment & storage facilities. Volume II: Preliminary hazards analysis of proposed CST-7 waste storage & treatment facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, K.

    1994-06-01

    In FY 1993, the Los Alamos National Laboratory Waste Management Group [CST-7 (formerly EM-7)] requested the Probabilistic Risk and Hazards Analysis Group [TSA-11 (formerly N-6)] to conduct a study of the hazards associated with several CST-7 facilities. Among these facilities are the Hazardous Waste Treatment Facility (HWTF), the HWTF Drum Storage Building (DSB), and the Mixed Waste Receiving and Storage Facility (MWRSF), which are proposed for construction beginning in 1996. These facilities are needed to upgrade the Laboratory's storage capability for hazardous and mixed wastes and to provide treatment capabilities for wastes in cases where offsite treatment is not available or desirable. These facilities will assist Los Alamos in complying with federal and state regulations.

  19. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.

    PubMed

    Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-02-24

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
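
    For reference, the hit rate quoted above is the fraction of observed high-risk events that were forecast; a small sketch with hypothetical contingency counts chosen only to reproduce the reported percentages:

        def hit_rate(hits: int, misses: int) -> float:
            """Fraction of observed events that were forecast: hits / (hits + misses)."""
            return hits / (hits + misses)

        # Hypothetical contingency counts (illustrative, not the study's actual table)
        print(f"forecast model: {hit_rate(57, 43):.0%}")  # 57%
        print(f"null model:     {hit_rate(33, 67):.0%}")  # 33%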

  20. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
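
    A toy version of the simulation chain described above (wind-field sampling, missile injection, flight, and impact), with all distributions and thresholds hypothetical:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Hypothetical tornado wind speeds (m/s) and a 50 m/s injection threshold
        wind = rng.weibull(2.0, n) * 40.0
        injected = wind > 50.0

        # Toy flight model: range grows with wind speed, with lognormal scatter
        flight_range = np.where(injected, wind * rng.lognormal(0.0, 0.5, n), 0.0)

        # Impact if the missile is lofted and travels at least 200 m to the target
        impacts = injected & (flight_range > 200.0)
        print(f"P(missile impact | tornado) ~ {impacts.mean():.2e}")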

  1. ECOFRAM Terrestrial Draft Report

    EPA Pesticide Factsheets

    ECOFRAM report describing the concepts for moving from deterministic to probabilistic ecological risk assessments. ECOFRAM included scientific experts from government, academia, contract laboratories, environmental advocacy groups and industry.

  2. Safety Issues with Hydrogen as a Vehicle Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, Lee Charles; Herring, James Stephen

    1999-10-01

    This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.

  3. Safety Issues with Hydrogen as a Vehicle Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. C. Cadwallader; J. S. Herring

    1999-09-01

    This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.

  4. Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks.

    PubMed

    Saad, E W; Prokhorov, D V; Wunsch, D C

    1998-01-01

    Three networks are compared for low false alarm stock trend predictions. Short-term trends, particularly attractive for neural network analysis, can be used profitably in scenarios such as option trading, but only with significant risk. Therefore, we focus on limiting false alarms, which improves the risk/reward ratio by preventing losses. To predict stock trends, we exploit time delay, recurrent, and probabilistic neural networks (TDNN, RNN, and PNN, respectively), utilizing conjugate gradient and multistream extended Kalman filter training for TDNN and RNN. We also discuss different predictability analysis techniques and perform an analysis of predictability based on a history of daily closing price. Our results indicate that all the networks are feasible, the primary preference being one of convenience.

  5. Enhancing Cost Realism through Risk-Driven Contracting: Designing Incentive Fees Based on Probabilistic Cost Estimates

    DTIC Science & Technology

    2012-04-01

    Comparison of Management Practices in the Army, Navy, and Air Force (Defense ARJ, April 2012, Vol. 19 No. 2: 133-160). It appears the pendulum may be ... the cost risk for requiring greater innovation. However, this natural flattening trend also leads to a potential drawback of the risk-driven ...

  6. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritterbusch, Stanley; Golay, Michael; Duran, Felicia

    2003-01-29

    Summary of methods proposed for risk-informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment as the primary decision-making tool.

  7. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  8. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. The performance relative to the current allocation process is assessed in the context of whether such a model could support the proposed short-term contract based participatory process. A synthetic forecasting example is also used to explore the relative roles of forecast skill and reservoir storage in this framework.
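
    A minimal sketch of a single-period multiuser allocation in this spirit, maximizing stated willingness to pay subject to the supply available at a chosen forecast reliability; the users, prices, and supply figure below are hypothetical, not from the Jaguaribe case:

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical users: willingness to pay ($/unit) and requested quantities
        pay = np.array([120.0, 80.0, 50.0])      # urban, industry, irrigation
        request = np.array([30.0, 40.0, 60.0])

        # Supply available at the chosen forecast reliability (e.g., 90% exceedance)
        supply_90 = 80.0

        # Maximize total willingness to pay (linprog minimizes, so negate the objective)
        res = linprog(c=-pay,
                      A_ub=np.ones((1, 3)), b_ub=[supply_90],
                      bounds=[(0, r) for r in request])
        print("allocations:", res.x)  # higher-value users are served first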

  9. A screening level probabilistic ecological risk assessment of PAHs in sediments of San Francisco Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Febbo, E.J.; Arnold, W.R.; Biddinger, G.R.

    1995-12-31

    As part of the Regional Monitoring Program administered by the San Francisco Estuary Institute (SFEI), sediment samples were collected at 20 stations in San Francisco Bay and analyzed to determine concentrations of 43 PAHs. These data were obtained from SFEI and used to calculate the potential risk to aquatic organisms using probabilistic modeling and Monte Carlo statistical procedures. Sediment chemistry data were used in conjunction with a sediment equilibrium model, a bioconcentration model, biota-sediment accumulation factors, and critical body burden effects concentrations to assess potential risk to bivalves. Bivalves were the chosen receptors because they lack a well-developed enzymatic system for metabolizing PAHs. Thus, they more readily accumulate PAHs and represent a species at greater risk than other taxa, such as fish and crustaceans. PAHs considered in this study span a broad range of octanol-water partition coefficients. Results indicate that risk of non-polar narcotic effects from PAHs was low in the Northern Bay Area, but higher in the South Bay near the more urbanized sections of the drainage basin.

  10. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence or in failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on patient risk. We first develop a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist," characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
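
    A minimal sketch of the structure described above, with hypothetical anesthesiologist-state probabilities standing in for the performance shaping factors:

        # Hypothetical performance-shaping structure: P(state) of the anesthesiologist,
        # P(accident | state) per operation, and P(death or brain damage | accident).
        states = {"alert_competent": 0.90, "fatigued": 0.08, "inexperienced": 0.02}
        p_accident = {  # illustrative values only
            "alert_competent": 1e-5,
            "fatigued": 8e-5,
            "inexperienced": 2e-4,
        }
        p_severe_given_accident = 0.3

        risk = sum(p_state * p_accident[s] * p_severe_given_accident
                   for s, p_state in states.items())
        print(f"patient risk per operation ~ {risk:.2e}")

        # A risk-management option can be scored as the change in this sum, e.g.
        # closer supervision shifting probability mass out of 'inexperienced'.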

  11. THE PATTERN OF LONGITUDINAL CHANGE IN SERUM CREATININE AND NINETY-DAY MORTALITY AFTER MAJOR SURGERY

    PubMed Central

    Hobson, Charles E; Pardalos, Panos

    2016-01-01

    Objective: Calculate mortality risk that accounts for both severity and recovery of postoperative kidney dysfunction using the pattern of longitudinal change in creatinine. Summary Background Data: Although the importance of renal recovery after acute kidney injury (AKI) is increasingly recognized, the complex association that accounts for longitudinal creatinine changes and mortality is not fully described. Methods: We used routinely collected clinical information for 46,299 adult patients undergoing major surgery to develop a multivariable probabilistic model optimized for non-linearity of serum creatinine time series that calculates the risk function for ninety-day mortality. We performed a 70/30 cross-validation analysis to assess the accuracy of the model. Results: All creatinine time series exhibited a nonlinear risk function in relation to ninety-day mortality, and their addition to other clinical factors improved the model discrimination. For any given severity of AKI, patients with complete renal recovery, as manifested by the return of the discharge creatinine to the baseline value, experienced a significant decrease in the odds of dying within ninety days of admission compared to patients with partial recovery. Yet, for any severity of AKI, even complete renal recovery did not entirely mitigate the increased odds of dying, as patients with mild AKI and complete renal recovery still had significantly increased odds of dying compared to patients without AKI (odds ratio 1.48, 95% confidence interval 1.30-1.68). Conclusions: We demonstrate the nonlinear relationship between both severity and recovery of renal dysfunction and ninety-day mortality after major surgery. We have developed an easily applicable computer algorithm that calculates this complex relationship. PMID:26181482

  12. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the in-flight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high-salt diet and high concentrations of calcium in the blood (caused by bone depletion from unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, repeatedly sampling at random the probability distributions of the electrolyte concentrations and system parameters that are inputs to the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA STD 7009 requirements.
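
    A compact sketch of the probabilistic-deterministic coupling described above, with hypothetical urine-chemistry distributions, a toy stand-in for the deterministic growth model, and an invented logistic regression (none of these reflect the RSFM's actual parameterization):

        import numpy as np

        rng = np.random.default_rng(11)
        n = 5_000

        # 1) Probabilistic inputs: hypothetical urine-chemistry distributions
        calcium = rng.lognormal(np.log(5.0), 0.3, n)    # mmol/day
        oxalate = rng.lognormal(np.log(0.35), 0.4, n)   # mmol/day
        volume = rng.normal(1.5, 0.3, n).clip(0.5)      # L/day

        # 2) Deterministic stand-in for the physicochemical growth model:
        #    relative supersaturation drives maximum stone width (toy relation)
        rs = (calcium * oxalate) / volume**2
        width_mm = 0.5 * rs**0.8

        # 3) Toy logistic regression from model outputs to clinical likelihood
        logit = -6.0 + 1.2 * width_mm + 0.8 * np.log(rs)
        p_symptomatic = 1.0 / (1.0 + np.exp(-logit))

        print(f"mean P(symptomatic stone): {p_symptomatic.mean():.3f}")
        print(f"95th percentile:           {np.percentile(p_symptomatic, 95):.3f}")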

  13. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDF) and probability density functions (PDF) of the fatigue life of a wing type composite structure under different hygrothermal environments subjected to random pressure. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  14. Dynamic Uncertain Causality Graph for Knowledge Representation and Probabilistic Reasoning: Directed Cyclic Graph and Joint Probability Distribution.

    PubMed

    Zhang, Qin

    2015-07-01

    Probabilistic graphical models (PGMs) such as Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. Dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs, which can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been previously presented, in which only the directed acyclic graph (DAG) was addressed. However, the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BN does not allow DCGs, as otherwise the conditional independence will not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG with or without DCGs is proved to be a joint probability distribution (JPD) over a set of random variables. The incomplete DUCG as a part of a complete DUCG may represent a part of JPD. Examples are provided to illustrate the methodology.

  15. Optimization of the kernel functions in a probabilistic neural network analyzing the local pattern distribution.

    PubMed

    Galleske, I; Castellanos, J

    2002-05-01

    This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, can be calculated by analyzing the local environment of each training pattern. Their combination forms the covariance matrix of each training pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it results in a network with better generalization ability than the original model. A variation of the famous two-spiral problem and real-world examples from the UCI Machine Learning Repository show not only a classification rate better than that of the original probabilistic neural network but also that this model can outperform other well-known classification techniques.
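
    A sketch of the local-environment idea, estimating a covariance matrix for each training pattern from its k nearest neighbors; the paper's exact construction may differ:

        import numpy as np

        def local_covariances(X: np.ndarray, k: int = 5, reg: float = 1e-6):
            """Per-pattern Gaussian-kernel covariance matrices estimated from each
            training pattern's k nearest neighbors (a sketch, not the paper's rule)."""
            n, d = X.shape
            covs = np.empty((n, d, d))
            for i in range(n):
                dist = np.linalg.norm(X - X[i], axis=1)
                nbrs = X[np.argsort(dist)[1:k + 1]]   # skip the pattern itself
                centered = nbrs - nbrs.mean(axis=0)
                covs[i] = centered.T @ centered / k + reg * np.eye(d)
            return covs

        X = np.random.default_rng(5).normal(size=(100, 2))
        covs = local_covariances(X)

        # The rotation matrix and per-axis variances follow from an eigendecomposition
        variances, rotation = np.linalg.eigh(covs[0])
        print("variances:", variances, "\nrotation:\n", rotation)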

  16. Estimating and controlling workplace risk: an approach for occupational hygiene and safety professionals.

    PubMed

    Toffel, Michael W; Birkner, Lawrence R

    2002-07-01

    The protection of people and physical assets is the objective of health and safety professionals and is accomplished through the paradigm of anticipation, recognition, evaluation, and control of risks in the occupational environment. Risk assessment concepts are not only used by health and safety professionals, but also by business and financial planners. Since meeting health and safety objectives requires financial resources provided by business and governmental managers, the hypothesis addressed here is that health and safety risk decisions should be made with probabilistic processes used in financial decision-making and which are familiar and recognizable to business and government planners and managers. This article develops the processes and demonstrates the use of incident probabilities, historic outcome information, and incremental impact analysis to estimate risk of multiple alternatives in the chemical process industry. It also analyzes how the ethical aspects of decision-making can be addressed in formulating health and safety risk management plans. It is concluded that certain, easily understood, and applied probabilistic risk assessment methods used by business and government to assess financial and outcome risk have applicability to improving workplace health and safety in three ways: 1) by linking the business and health and safety risk assessment processes to securing resources, 2) by providing an additional set of tools for health and safety risk assessment, and 3) by requiring the risk assessor to consider multiple risk management alternatives.

  17. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied to estimate the human error probability (HEP) for the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential usefulness for quantifying model uncertainty as sensitivity analysis in the PRA model.
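
    The reliability physics model reduces to the probability that the operators' performance time exceeds the phenomenological time; a minimal Monte Carlo sketch with hypothetical lognormal distributions (the study's fitted distributions are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Phenomenological time: window before core damage (hypothetical lognormal, min)
        t_phenom = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=n)

        # Performance time: operator diagnosis + action (hypothetical lognormal, min)
        t_perform = rng.lognormal(mean=np.log(15.0), sigma=0.6, size=n)

        # Human error probability = chance the operators are too slow
        hep = np.mean(t_perform >= t_phenom)
        print(f"HEP ~ {hep:.2e}")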

  18. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply, respectively, a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  19. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Since the MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
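
    A minimal illustration of the reuse idea behind RPI: a single baseline set of crack histories is scored under different maintenance plans without rerunning the simulation (the crack-size distributions and POD curve below are hypothetical):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        # Baseline MCS, run once: hypothetical crack sizes (mm) at the inspection time
        # and growth factors over the following interval (illustrative distributions)
        a_insp = rng.lognormal(np.log(0.5), 0.5, n)
        growth = rng.lognormal(np.log(4.0), 0.3, n)
        a_crit = 10.0  # failure when the crack exceeds this size

        def pod(a):
            """Hypothetical probability-of-detection curve for an inspection."""
            return 1.0 - np.exp(-a / 2.0)

        def failure_prob(inspect: bool) -> float:
            # Detected cracks are repaired (reset); undetected ones keep growing.
            detected = (rng.random(n) < pod(a_insp)) if inspect else np.zeros(n, bool)
            a_final = np.where(detected, a_insp, a_insp * growth)
            return float(np.mean(a_final > a_crit))

        # The same baseline histories score both maintenance plans -- no MCS rerun,
        # which is the efficiency idea behind RPI.
        print(f"no inspection:   {failure_prob(False):.2e}")
        print(f"with inspection: {failure_prob(True):.2e}")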

  20. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and that are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5, and 1x10^-5.

  1. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and that are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5, and 1x10^-5.

  2. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. Post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms and on the application of these algorithms to different databases.

  3. Bayesian Monte Carlo and Maximum Likelihood Approach for Uncertainty Estimation and Risk Management: Application to Lake Oxygen Recovery Model

    EPA Science Inventory

    Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...

  4. A systematic risk management approach employed on the CloudSat project

    NASA Technical Reports Server (NTRS)

    Basilio, R. R.; Plourde, K. S.; Lam, T.

    2000-01-01

    The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.

  5. A PROBABILISTIC ANALYSIS TO DETERMINE ECOLOGICAL RISK DRIVERS, 10TH VOLUME ASTM STP 1403

    EPA Science Inventory

    A probabilistic analysis of exposure and effect data was used to identify chemicals most likely responsible for ecological risk. The mean and standard deviation of the natural log-transformed chemical data were used to estimate the probability of exposure for an area of concern a...

  6. Minimizing risks from spilled oil to ecosystem services using influence diagrams: The Deepwater Horizon spill response

    EPA Science Inventory

    Making inferences on risks to ecosystem services (ES) from ecological crises may be improved using decision science tools. Influence diagrams (IDs) are probabilistic networks that explicitly represent the decisions related to a problem and evidence of their influence on desired o...

  7. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future, with the objective of preventing the highest-ranked human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: risks of radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.

  8. Development and application of a probabilistic method for wildfire suppression cost modeling

    Treesearch

    Matthew P. Thompson; Jessica R. Haas; Mark A. Finney; David E. Calkin; Michael S. Hand; Mark J. Browne; Martin Halek; Karen C. Short; Isaac C. Grenfell

    2015-01-01

    Wildfire activity and escalating suppression costs continue to threaten the financial health of federal land management agencies. In order to minimize and effectively manage the cost of financial risk, agencies need the ability to quantify that risk. A fundamental aim of this research effort, therefore, is to develop a process for generating risk-based metrics for...

  9. Probabilistic Assessment of Radiation Risk for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2008-01-01

    For long-duration missions outside of the protection of the Earth's magnetic field, exposure to solar particle events (SPEs) is a major safety concern for crew members during extravehicular activities (EVAs) on the lunar surface or during Earth-to-Moon or Earth-to-Mars transit. The large majority (90%) of SPEs have small or no health consequences because the doses are low and the particles do not penetrate to organ depths. However, there is an operational challenge in responding to events of unknown size and duration. We have developed a probabilistic approach to SPE risk assessment in support of mission design and operational planning. Using the historical database of proton measurements during the past five solar cycles, the functional form of the hazard function of SPE occurrence per cycle was found for a nonhomogeneous Poisson model. A typical hazard function was defined as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions of particle fluences for a specified mission period were simulated, ranging from the 5th to the 95th percentile. Organ doses from large SPEs were assessed using NASA's baryon transport model, BRYNTRN. The SPE risk was analyzed with the organ dose distribution for the given particle fluences during a mission period. In addition to the total particle fluences of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the cancer risk associated with energetic particles for large events. The probability of exceeding the NASA 30-day limit on blood-forming organ (BFO) dose inside a typical spacecraft was calculated for various SPE sizes. This probabilistic approach to SPE protection will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks in future work.
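
    A minimal sketch of the nonhomogeneous Poisson occurrence model, simulating SPE times over a 4000-day cycle by thinning with a hypothetical hazard function (the fitted hazard from the historical database is not reproduced here):

        import numpy as np

        rng = np.random.default_rng(3)

        def hazard(t):
            """Hypothetical SPE hazard (events/day) peaking mid solar cycle."""
            return 0.02 * np.exp(-0.5 * ((t - 2000.0) / 700.0) ** 2)

        T, lam_max = 4000.0, 0.02  # cycle length (days) and hazard upper bound

        def simulate_spe_times():
            """Nonhomogeneous Poisson process by thinning (Lewis-Shedler)."""
            t, events = 0.0, []
            while True:
                t += rng.exponential(1.0 / lam_max)
                if t > T:
                    return np.array(events)
                if rng.random() < hazard(t) / lam_max:
                    events.append(t)

        # Probability that a 180-day mission window sees at least one SPE
        start = 1900.0
        hits = sum(np.any((ts >= start) & (ts <= start + 180.0))
                   for ts in (simulate_spe_times() for _ in range(2000)))
        print(f"P(>=1 SPE in mission window) ~ {hits / 2000:.2f}")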

  10. Flood Risk and Asset Management

    DTIC Science & Technology

    2012-09-01

    Use by third parties of results or methods presented in this report... The Company also stresses that various sections of this report rely on data... inundation probability; levee contribution to risk. The methods used in FRE have been applied to establish the National Flood Risk in England and... be noted that when undertaking high-level probabilistic risk assessments in the UK, if a defence's condition is unknown, grade 3 is applied with...

  11. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits, and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided, and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement, and application to other organism groups such as marine mammals. PMID:25730314
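
    A minimal sketch of the fault-tree logic for a single turbine passage, with hypothetical branch probabilities chosen only to illustrate the size effect reported above (the published model is parameterized from field data):

        # Collision = encounter AND (not detected OR detected-but-not-evaded) AND strike
        def collision_risk(p_encounter, p_detect, p_evade, p_strike):
            p_no_avoid = (1 - p_detect) + p_detect * (1 - p_evade)
            return p_encounter * p_no_avoid * p_strike

        # Hypothetical branch probabilities per passage; detection and evasion
        # degrade for large fish at large rotors in low visibility.
        small_fish = collision_risk(p_encounter=0.05, p_detect=0.9,
                                    p_evade=0.95, p_strike=0.2)
        large_fish = collision_risk(p_encounter=0.05, p_detect=0.4,
                                    p_evade=0.7, p_strike=0.6)
        print(f"small fish: {small_fish:.1e}, large fish: {large_fish:.1e}")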

  12. Comparing probabilistic microbial risk assessments for drinking water against daily rather than annualised infection probability targets.

    PubMed

    Signor, R S; Ashbolt, N J

    2009-12-01

    Some national drinking water guidelines provide guidance on how to define 'safe' drinking water. Regarding microbial water quality, a common position is that the chance of an individual becoming infected by some reference waterborne pathogen (e.g., Cryptosporidium) present in the drinking water should be less than 10^-4 in any year. However, the instantaneous levels of risk to a water consumer vary over the course of a year, and waterborne disease outbreaks have been associated with shorter-duration periods of heightened risk. Performing probabilistic microbial risk assessments is becoming commonplace to capture the impacts of temporal variability on overall infection risk levels. A case is presented here for adoption of a shorter-duration (i.e., daily) reference-period infection probability target over which to assess, report, and benchmark such risks. A daily infection probability benchmark may provide added incentive and guidance for exercising control over short-term adverse risk fluctuation events and their causes. Management planning could involve outlining measures so that the daily target is met under a variety of pre-identified event scenarios. Other benefits of a daily target could include providing a platform for managers to design and assess management initiatives, as well as simplifying the technical components of the risk assessment process.
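
    The daily benchmark follows from the annual one by assuming independent daily risks; a one-line conversion:

        # Daily infection-probability target consistent with the 1e-4 per-year
        # benchmark, assuming independence: 1 - (1 - p_day)**365 = 1e-4
        p_year = 1e-4
        p_day = 1 - (1 - p_year) ** (1 / 365)
        print(f"equivalent daily target ~ {p_day:.2e}")  # roughly 2.7e-7 per day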

  13. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we present a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  14. Probabilistic Risk Assessment of a Turbine Disk

    NASA Astrophysics Data System (ADS)

    Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted

    Current Federal Aviation Administration (FAA) rotor design certification practice performs risk assessment using a probabilistic framework focused only on the life-limiting defect location of a component. This method generates conservative approximations of the operational risk. The first section of this article covers a discretization method, which allows for a transition from this relative risk to an absolute risk by discretizing the component into regions called zones. General guidelines were established for the zone-refinement process based on the stress gradient topology in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach combining crack initiation life and propagation life while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life to crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of current approaches while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.

  15. Bayesian probabilistic population projections for all countries.

    PubMed

    Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K

    2012-08-28

    Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries at different demographic stages, on different continents, and of different sizes. The method is validated by an out-of-sample experiment in which data from 1950-1990 are used for estimation and applied to predict 1990-2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high-fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.
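
    The cohort component projection step at the core of this method can be sketched compactly. A minimal illustration, using made-up survival and fertility rates for five coarse age groups rather than UN data or the paper's Bayesian hierarchical rate models:

```python
import numpy as np

# Minimal cohort-component sketch: propagate an age-structured population
# one 5-year step forward under sampled survival and fertility rates.
# All rates and population numbers below are illustrative assumptions.
rng = np.random.default_rng(0)

pop = np.array([100., 95., 90., 80., 60.])  # thousands, five 5-year groups

def project_step(pop, surv, fert):
    """One cohort-component step: age the population, add births."""
    new = np.empty_like(pop)
    new[1:] = pop[:-1] * surv[:-1]        # survivors move up one group
    new[-1] += pop[-1] * surv[-1]         # open-ended oldest group
    new[0] = (pop * fert).sum()           # births from age-specific fertility
    return new

# Probabilistic projection: sample rate trajectories, collect outcomes.
totals = []
for _ in range(1000):
    surv = np.clip(rng.normal([0.99, 0.98, 0.97, 0.90, 0.70], 0.01), 0, 1)
    fert = np.clip(rng.normal([0.00, 0.10, 0.08, 0.01, 0.00], 0.01), 0, None)
    totals.append(project_step(pop, surv, fert).sum())

print(np.percentile(totals, [2.5, 50, 97.5]))  # projection interval
```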

  16. Using probabilistic terrorism risk modeling for regulatory benefit-cost analysis: application to the Western hemisphere travel initiative in the land environment.

    PubMed

    Willis, Henry H; LaTourrette, Tom

    2008-04-01

    This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of the regulation and in how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate of the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that halves the annualized terrorism loss would double the critical risk reduction (14-26%), and basing them on a higher risk level that doubles the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimation of the costs of casualties from terrorism events.
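
    The critical risk reduction itself is a simple ratio: the annualized regulatory cost divided by the annualized terrorism loss. A worked sketch with a hypothetical cost figure (the abstract does not quote the study's baseline cost) that reproduces the reported 7-13% range:

```python
# Sketch of the critical-risk-reduction arithmetic described above:
# benefit = (risk reduction) x (annualized loss), so the critical risk
# reduction is where benefit equals annualized regulatory cost.
# The cost figure below is illustrative, not the study's baseline value.

annual_cost = 0.35e9              # hypothetical annualized cost of WHTI-L ($)
for annual_loss in (2.7e9, 5.2e9):
    critical = annual_cost / annual_loss
    print(f"loss ${annual_loss/1e9:.1f}B -> critical reduction {critical:.0%}")

# Halving the assumed loss doubles the critical risk reduction and
# doubling it halves the reduction, matching the ranges in the abstract.
```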

  17. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies focus on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques have been gaining importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mer representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied to DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method achieves results very similar to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra-short sequences and exhibits a smooth decrease in performance, at every taxonomic level, as the sequence length decreases. PMID:25916734
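
    The pipeline maps naturally onto standard text-mining tooling: treat each sequence as a document of overlapping k-mers, then fit LDA. A minimal sketch assuming scikit-learn and toy sequences, not the authors' implementation or the RDP data:

```python
# Sketch of the k-mer + topic-model pipeline with toy stand-in sequences.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def kmers(seq, k=4):
    """Represent a DNA sequence as a 'document' of overlapping k-mers."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

seqs = ["ACGTACGGACGT", "ACGTACGTTGCA", "TTGCAAGGTTGC"]  # toy sequences
docs = [kmers(s) for s in seqs]

vec = CountVectorizer()                  # k-mer frequency matrix
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)             # per-sequence topic proportions
print(theta.round(2))                    # features for a downstream classifier
```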

  18. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, and a minor percentage is represented by all the other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kind of potential tsunamigenic sources which characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e., pyroclastic flows and flank collapses, as predominant in probabilistic tsunami hazard assessment (PTHA), partly because of the difficulty of estimating the corresponding recurrence times. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for the city in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  19. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  20. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  1. And the Human Saves the Day or Maybe They Ruin It, The Importance of Humans in the Loop

    NASA Technical Reports Server (NTRS)

    DeMott, Diana L.; Boyer, Roger L.

    2017-01-01

    Flying a mission in space requires a massive commitment of resources, and without the talent and commitment of the people involved in this effort we would never leave the atmosphere of Earth as safely as we have. When we use the phrase "humans in the loop", it could apply to almost any endeavor, since everything starts with humans developing a concept, completing the design process, building or implementing a product, and using the product to achieve a goal or purpose. Narrowing the focus to spaceflight, there are a variety of individuals involved throughout the preparations for flight and the flight itself. All of the humans involved add value and support program success. The paper discusses the concepts of human involvement in technological programs, how a Probabilistic Risk Assessment (PRA) accounts for the human in the loop for potential missions using a technique called Human Reliability Analysis (HRA), and the tradeoffs between having a human in the loop or not. Human actions can increase or decrease the overall risk by initiating or mitigating events, so removing the human from the loop does not always lower the risk.

  2. Probabilistic framework for the estimation of the adult and child toxicokinetic intraspecies uncertainty factors.

    PubMed

    Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S

    2003-12-01

    Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probabilistic phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population, which is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue or organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and child intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
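
    The central computation, deriving the data-based UF as the ratio of the 95th to the 50th percentile of a simulated tissue-dose distribution, can be sketched with a one-compartment stand-in for the physiologically based model; the parameter distribution below is an illustrative assumption:

```python
import numpy as np

# Sketch of a data-derived intraspecies toxicokinetic UF: sample a kinetic
# parameter from a population distribution, compute target-tissue dose,
# and take UF = P95 / P50 of the dose distribution. The one-compartment
# model and the lognormal clearance parameters are illustrative, not the
# study's physiologically based model.
rng = np.random.default_rng(1)
n = 100_000

dose_rate = 1.0                                                 # mg/kg-day
clearance = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)  # L/h/kg

c_ss = dose_rate / clearance        # steady-state tissue concentration

uf_tk = np.percentile(c_ss, 95) / np.percentile(c_ss, 50)
print(f"data-derived UF_H-TK ~ {uf_tk:.2f}")  # compare to the default 3.2
```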

  3. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of the forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  4. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of the forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
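
    The scenario-generation chain can be sketched briefly. Where the paper inverts the analytically deduced CDF by Monte Carlo sampling, the sketch below draws directly from the fitted mixture (statistically equivalent), uses toy data in place of real forecast errors, and ignores the temporal correlation of errors:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Sketch: fit a Gaussian mixture to historical forecast errors, sample
# error scenarios, and superimpose them on the point forecast.
rng = np.random.default_rng(0)

errors = rng.standard_t(df=4, size=2000) * 0.05     # stand-in error history
gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))

n_scen, horizon = 1000, 24
point_fcst = 0.6 + 0.2 * np.sin(np.linspace(0, np.pi, horizon))  # toy forecast

err_scen = gmm.sample(n_scen * horizon)[0].reshape(n_scen, horizon)
scenarios = np.clip(point_fcst + err_scen, 0.0, 1.0)  # per-unit wind power

# Ramps would then be extracted from each scenario (e.g., with a swinging
# door algorithm) to build probabilistic ramp duration/start-time forecasts.
print(scenarios.shape)
```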

  5. New probabilistic risk assessment of ethylhexyl methoxycinnamate: Comparing the genotoxic effects of trans- and cis-EHMC.

    PubMed

    Nečasová, Anežka; Bányiová, Katarína; Literák, Jaromír; Čupr, Pavel

    2017-02-01

    Ethylhexyl methoxycinnamate (EHMC) is a widely used UV filter present in a large number of personal care products (PCPs). Under normal conditions, EHMC occurs as a mixture of two isomers, trans-EHMC and cis-EHMC, in a ratio of 99:1. When exposed to sunlight, the trans isomer is transformed into the less stable cis isomer and the efficiency of the UV filter is reduced. To date, the toxicological effects of the cis-EHMC isomer remain largely unknown. We developed a completely new method for preparing cis-EHMC. An EHMC technical mixture was irradiated using a UV lamp and 98% pure cis-EHMC was isolated from the irradiated solution using column chromatography. The genotoxic effects of the isolated cis-EHMC isomer and the nonirradiated trans-EHMC were subsequently measured using two bioassays (SOS chromotest and UmuC test). In the case of trans-EHMC, significant genotoxicity was observed using both bioassays at the highest concentrations (0.5-4 mg mL-1). In the case of cis-EHMC, significant genotoxicity was only detected using the UmuC test, at concentrations of 0.25-1 mg mL-1. Based on these results, the NOEC was calculated for both cis- and trans-EHMC: 0.038 and 0.064 mg mL-1, respectively. Risk assessment of dermal, oral and inhalation exposure to PCPs containing EHMC was carried out for a female population using probabilistic simulation and quantitative in vitro to in vivo extrapolation (QIVIVE). The risk of cis-EHMC was found to be ∼1.7 times greater than that of trans-EHMC. In the case of cis-EHMC, a hazard index of 1 was exceeded at the 92nd percentile. Based on the observed differences between the isomers, EHMC application in PCPs requires detailed reassessment. Further exploration of the toxicological effects and properties of cis-EHMC is needed in order to correctly predict the risks posed to humans and the environment. © 2016 Wiley Periodicals, Inc. Environ Toxicol 32: 569-580, 2017.

  6. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  7. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth, taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event-initiated accident sequences, etc. Based on the above goals, external event analysis may be considered a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical releases are described.

  8. 75 FR 78777 - Advisory Committee On Reactor Safeguards; Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... Committee includes individuals experienced in reactor operations, management; probabilistic risk assessment...: December 10, 2010. Andrew L. Bates, Advisory Committee Management Officer. [FR Doc. 2010-31590 Filed 12-15...

  9. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  10. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  11. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g., whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output in order to optimize the power available for experiments.

  12. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of the risk of failure of solid rocket motors. This probabilistic methodology employs the best available engineering models and information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on the application of this probabilistic approach to large solid rocket boosters, such as the Advanced Solid Rocket Motor for the Space Shuttle, is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. The impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  13. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and the impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and to use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in the population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates, to 2100 or beyond, annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. The atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), and another is fixed within each CCRAF variant and represents essential constants within the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed based on historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As uncertainty for some variables evolves over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
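
    The two kinds of random components are worth making concrete. In the sketch below (illustrative values, not CCRAF's regression-derived coefficients), one quantity is drawn once per variant and held fixed, while the other is redrawn every simulated year:

```python
import numpy as np

# Sketch of the two random-component types: a constant drawn once per
# "world" (variant), e.g. climate sensitivity, versus year-to-year noise
# redrawn annually, e.g. around expected GDP growth. Toy values only.
rng = np.random.default_rng(0)
n_variants, n_years = 5000, 100

# Type 1: fixed within each variant.
climate_sens = rng.normal(3.0, 0.8, size=n_variants)   # degC per CO2 doubling

# Type 2: redrawn every year within every variant.
gdp_growth = 0.02 + rng.normal(0.0, 0.01, size=(n_variants, n_years))
gdp = np.cumprod(1.0 + gdp_growth, axis=1)             # GDP index paths

# A toy outcome using the fixed draw: warming under an assumed single
# CO2 doubling by the final year.
warming = climate_sens * 1.0
print(np.percentile(gdp[:, -1], [5, 50, 95]))   # spread from annual noise
print(np.percentile(warming, [5, 50, 95]))      # spread from per-variant draws
```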

  14. Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions

    NASA Technical Reports Server (NTRS)

    Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.

    2014-01-01

    Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.

  15. Cancer risk from incidental ingestion exposures to PAHs associated with coal-tar-sealed pavement

    USGS Publications Warehouse

    Williams, E. Spencer; Mahler, Barbara J.; Van Metre, Peter C.

    2012-01-01

    Recent (2009-10) studies documented significantly higher concentrations of polycyclic aromatic hydrocarbons (PAHs) in settled house dust in living spaces and soil adjacent to parking lots sealed with coal-tar-based products. To date, no studies have examined the potential human health effects of PAHs from these products in dust and soil. Here we present the results of an analysis of potential cancer risk associated with incidental ingestion exposures to PAHs in settings near coal-tar-sealed pavement. Exposures to benzo[a]pyrene equivalents were characterized across five scenarios. The central tendency estimate of excess cancer risk resulting from lifetime exposures to soil and dust from nondietary ingestion in these settings exceeded 1 × 10–4, as determined using deterministic and probabilistic methods. Soil was the primary driver of risk, but according to probabilistic calculations, reasonable maximum exposure to affected house dust in the first 6 years of life was sufficient to generate an estimated excess lifetime cancer risk of 6 × 10–5. Our results indicate that the presence of coal-tar-based pavement sealants is associated with significant increases in estimated excess lifetime cancer risk for nearby residents. Much of this calculated excess risk arises from exposures to PAHs in early childhood (i.e., 0–6 years of age).

  16. Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMiT) after wildfires

    Treesearch

    P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner

    2011-01-01

    The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...

  17. Spatially explicit risk assessment of an estuarine fish in Barataria Bay, Louisiana, following the Deepwater Horizon Oil spill: evaluating tradeoffs in model complexity and parsimony

    EPA Science Inventory

    As ecological risk assessments (ERA) move beyond organism-based determinations towards probabilistic population-level assessments, model complexity must be evaluated against the goals of the assessment, the information available to parameterize components with minimal dependence ...

  18. 77 FR 38856 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... discussion on defense-in-depth. Specifically, the SRM stated, Because the statements in Regulatory Guide 1... language to assure that the defense-in-depth philosophy is interpreted and implemented consistently. To the extent that other regulatory guidance refers to defense in depth, the relevant documents should be...

  19. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... revise the discussion on defense-in-depth. Specifically, the SRM stated, Because the statements in... precise language to assure that the defense-in-depth philosophy is interpreted and implemented consistently. To the extent that other regulatory guidance refers to defense in depth, the relevant documents...

  20. Impaired risk evaluation in people with Internet gaming disorder: fMRI evidence from a probability discounting task.

    PubMed

    Lin, Xiao; Zhou, Hongli; Dong, Guangheng; Du, Xiaoxia

    2015-01-02

    This study examined how Internet gaming disorder (IGD) subjects modulate reward and risk at the neural level during a probability-discounting task, using functional magnetic resonance imaging (fMRI). Behavioral and imaging data were collected from 19 IGD subjects (22.2 ± 3.08 years) and 21 healthy controls (HC, 22.8 ± 3.5 years). Behavioral results showed that IGD subjects preferred probabilistic options to fixed ones and had shorter reaction times compared to HC. The fMRI results revealed that IGD subjects show decreased activation in the inferior frontal gyrus and the precentral gyrus when choosing probabilistic options relative to HC. Correlations were also calculated between behavioral performance and brain activity in the relevant brain regions. Both the behavioral and fMRI results indicate that people with IGD show impaired risk evaluation, which might be the reason why IGD subjects continue playing online games despite the widely known risk of negative consequences. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Overview of the SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division Activities and Technical Projects

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for the development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the RMSL area have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on the assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable to which problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data, and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true: there are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - a literature search on probabilistic reliability, and on what differentiates it from statistical reliability, has been completed. Work on the computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - correct reliability computations at both the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - a draft of Volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real-world applications of probabilistic methods with essential information demonstrating application type and time/cost savings from the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - too often, we say, 'the proof is in the pudding'. With help from many contributors, we hope to produce such a document. The problem is that few people are coming forward, due to the proprietary nature of the material, so we are asking contributors to document only minimum information, including the problem description, the method used, whether it resulted in any savings, and how much; (5) Software Reliability - the software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.

  2. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
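
    The probabilistic-sampling layer of such a model can be illustrated with a deliberately crude sketch: draw asteroid properties from assumed distributions and push them through a placeholder energy-to-damage scaling, which stands in for (and is much simpler than) the ERA team's entry, breakup, and impact physics:

```python
import numpy as np

# Sketch of probabilistic parameter sampling for impact risk. All
# distributions and the damage-radius scaling are placeholders chosen
# for illustration, not the ERA model's physics-based analyses.
rng = np.random.default_rng(42)
n = 100_000

diameter = rng.lognormal(np.log(50), 0.5, n)   # m
density = rng.uniform(1500, 3500, n)           # kg/m^3
velocity = rng.normal(20_000, 3000, n)         # m/s

# Kinetic energy of a sphere: E = 0.5 * (pi/6 * d^3 * rho) * v^2
energy = (np.pi / 12) * density * diameter**3 * velocity**2   # J
energy_mt = energy / 4.184e15                                 # megatons TNT

# Placeholder damage-radius scaling ~ cube root of yield:
damage_radius_km = 2.0 * energy_mt ** (1 / 3)

print(np.percentile(damage_radius_km, [5, 50, 95]))  # spread of outcomes
```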

  3. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Ronald L. Boring

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  4. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.

  5. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil

    PubMed Central

    Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-01-01

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315

  6. Probabilistic risk assessment of the effect of acidified seawater on development stages of sea urchin (Strongylocentrotus droebachiensis).

    PubMed

    Chen, Wei-Yu; Lin, Hsing-Chieh

    2018-05-01

    Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larval growth, and larval mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification (EC50-SW) was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH (EC50-PBC) were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions of seawater and PBC pH levels in 2100. Unsuccessful fertilization had a 50% probability of reaching 5.07-24.51% (95% CI) and a 90% probability of reaching 0-6.95%. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.
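
    The reported effect levels imply a dose-response relationship over the drop in seawater pH. As a hedged illustration, the sketch below assumes a logistic form and slope; only the fertilization EC50 of 0.55 pH units is taken from the abstract:

```python
import numpy as np

# Illustrative logistic dose-response over pH displacement. The logistic
# form and slope are assumptions for the sketch, not the study's fitted
# model; only the EC50 of 0.55 pH units comes from the abstract.

def effect_prob(delta_ph, ec50=0.55, slope=4.0):
    """Logistic effect model: 50% effect at delta_ph == ec50."""
    return 1.0 / (1.0 + np.exp(-slope * (delta_ph - ec50)))

for d in (0.1, 0.3, 0.55, 0.8):
    print(f"pH drop {d:.2f} -> effect probability {effect_prob(d):.2f}")
```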

  7. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes a spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objectives of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to return to the Moon in the next decade, venture further into the solar system, and ultimately send humans to Mars and beyond, LRO is highly visible to the public but has limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects. As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique the LRO project office uses to perform the PRA relies on the application of a component failure database to quantify potential mission success risks. To ensure mission success in an efficient manner, at low cost and on a tight schedule, traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform the PRA for the large LRO system, with more than 14,000 piece parts and over 120 purchased or contractor-built components.
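
    At its core, a limited-scope PRA of this kind combines component failure probabilities from a database through fault-tree gates. A minimal sketch of the AND/OR arithmetic with hypothetical, independent failure probabilities, not LRO's actual model or data:

```python
# Sketch of fault-tree gate arithmetic for a PRA building block.
# All failure probabilities below are hypothetical and assumed independent.

def p_and(*ps):
    """Top event occurs only if all (independent) inputs fail."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """Top event occurs if any (independent) input fails."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_transmitter = 0.02      # per-mission failure probabilities (assumed)
p_receiver = 0.015
p_backup_comm = 0.05

# Comm subsystem fails if the primary chain fails AND the backup fails.
p_primary = p_or(p_transmitter, p_receiver)
p_comm = p_and(p_primary, p_backup_comm)
print(f"P(loss of comm) = {p_comm:.5f}")
```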

  8. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers manage their patients effectively by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging with existing classification methods, due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models, such as those given by CPXR(Log), when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
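
    The pattern-aided structure, a global logistic model supplemented by local models on pattern-defined subgroups with prediction routed by pattern membership, can be shown in miniature. In the sketch below the pattern is hand-picked rather than mined, and CPXR(Log)'s probabilistic loss function is not reproduced:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy illustration of pattern-aided prediction: a global logistic model
# plus a local model for one subgroup matching a (hand-picked) pattern.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
pattern = (X[:, 0] > 1.0) & (X[:, 1] < 0.0)        # stand-in "contrast pattern"

# Simulate labels whose true relationship differs inside the subgroup.
logits = np.where(pattern, 2.0 * X[:, 2] - 1.0, -1.5 * X[:, 3])
y = rng.random(2000) < 1 / (1 + np.exp(-logits))

global_model = LogisticRegression().fit(X, y)
local_model = LogisticRegression().fit(X[pattern], y[pattern])

def predict_proba(x_row, matches_pattern):
    """Route prediction to the local model when the pattern matches."""
    model = local_model if matches_pattern else global_model
    return model.predict_proba(x_row.reshape(1, -1))[0, 1]

print(predict_proba(X[0], pattern[0]))
```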

  9. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    Cascading uncertainty through meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step towards improving the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts, based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches, to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to extracting meaningful information from probabilistic forecasts and testing its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; and 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focused on sharing how forecasters interpret, describe and perceive the forecast scenarios output by the models, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. The questions that remain unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission; the suspense should instead act as a catalyst for overcoming the remaining challenges.

  10. MaRS Project

    NASA Technical Reports Server (NTRS)

    Aruljothi, Arunvenkatesh

    2016-01-01

    The Space Exploration Division of the Safety and Mission Assurance Directorate is responsible for reducing the risk to Human Space Flight Programs by providing system safety, reliability, and risk analysis. The Risk & Reliability Analysis branch plays a part in this by utilizing Probabilistic Risk Assessment (PRA) and Reliability and Maintainability (R&M) tools to identify possible types of failure and effective solutions. A continuous effort of this branch is MaRS, or the Mass and Reliability System, the tool that was the focus of this internship. Future long-duration space missions will have to find a balance between the mass and reliability of their spare parts. They will be unable to take spares of everything and will have to determine what is most likely to require maintenance and spares. Currently there is no database that combines mass and reliability data for low-level space-grade components; MaRS aims to be the first database to do this. The data in MaRS is based on the hardware flown on the International Space Station (ISS). The components on the ISS have a long history and are well documented, making them the perfect source. Currently, MaRS is a functioning Excel workbook database; the back end is complete and only requires optimization. MaRS has been populated with all the assemblies, and their components, that are used on the ISS; the failures of these components are updated regularly. This project was a continuation of the efforts of previous intern groups. Once complete, R&M engineers working on future space flight missions will be able to quickly access failure and mass data on assemblies and components, allowing them to make important decisions and tradeoffs.

  11. A Targeted Health Risk Assessment Following the Deepwater Horizon Oil Spill: Polycyclic Aromatic Hydrocarbon Exposure in Vietnamese-American Shrimp Consumers

    PubMed Central

    Frickel, Scott; Nguyen, Daniel; Bui, Tap; Echsner, Stephen; Simon, Bridget R.; Howard, Jessi L.; Miller, Kent; Wickliffe, Jeffrey K.

    2014-01-01

    Background: The Deepwater Horizon oil spill of 2010 prompted concern about health risks among seafood consumers exposed to polycyclic aromatic hydrocarbons (PAHs) via consumption of contaminated seafood. Objective: The objective of this study was to conduct population-specific probabilistic health risk assessments based on consumption of locally harvested white shrimp (Litopenaeus setiferus) among Vietnamese Americans in southeast Louisiana. Methods: We conducted a survey of Vietnamese Americans in southeast Louisiana to evaluate shrimp consumption, preparation methods, and body weight among shrimp consumers in the disaster-impacted region. We also collected and chemically analyzed locally harvested white shrimp for 81 individual PAHs. We combined the PAH levels (with accepted reference doses) found in the shrimp with the survey data to conduct Monte Carlo simulations for probabilistic noncancer health risk assessments. We also conducted probabilistic cancer risk assessments using relative potency factors (RPFs) to estimate cancer risks from the intake of PAHs from white shrimp. Results: Monte Carlo simulations were used to generate hazard quotient distributions for noncancer health risks, reported as mean ± SD, for naphthalene (1.8 × 10^-4 ± 3.3 × 10^-4), fluorene (2.4 × 10^-5 ± 3.3 × 10^-5), anthracene (3.9 × 10^-6 ± 5.4 × 10^-6), pyrene (3.2 × 10^-5 ± 4.3 × 10^-5), and fluoranthene (1.8 × 10^-4 ± 3.3 × 10^-4). A cancer risk distribution, based on RPF-adjusted PAH intake, was also generated (2.4 × 10^-7 ± 3.9 × 10^-7). Conclusions: The risk assessment results show no acute health risks or excess cancer risk associated with consumption of shrimp containing the levels of PAHs detected in our study, even among frequent shrimp consumers. Citation: Wilson MJ, Frickel S, Nguyen D, Bui T, Echsner S, Simon BR, Howard JL, Miller K, Wickliffe JK. 2015. A targeted health risk assessment following the Deepwater Horizon Oil Spill: polycyclic aromatic hydrocarbon exposure in Vietnamese-American shrimp consumers. Environ Health Perspect 123:152–159; http://dx.doi.org/10.1289/ehp.1408684 PMID:25333566
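
    The Monte Carlo hazard-quotient computation the abstract describes can be sketched in a few lines: sample concentration, intake, and body weight from distributions, form the chronic daily intake, and divide by a reference dose. All distributions and parameter values below are hypothetical placeholders, not the study's survey data:

    ```python
    # Monte Carlo hazard quotient sketch: HQ = chronic daily intake / reference dose.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    conc = rng.lognormal(mean=np.log(2e-3), sigma=1.0, size=n)  # PAH in shrimp, mg/kg
    intake = rng.lognormal(mean=np.log(30), sigma=0.5, size=n)  # shrimp eaten, g/day
    bw = rng.normal(70, 12, size=n).clip(min=40)                # body weight, kg
    rfd = 0.02                                                  # reference dose, mg/kg-day

    cdi = conc * (intake / 1000.0) / bw     # chronic daily intake, mg/kg-day
    hq = cdi / rfd
    print(f"HQ mean = {hq.mean():.2e}, SD = {hq.std():.2e}")
    ```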

  12. Probabilistic analysis of the influence of the bonding degree of the stem-cement interface in the performance of cemented hip prostheses.

    PubMed

    Pérez, M A; Grasa, J; García-Aznar, J M; Bea, J A; Doblaré, M

    2006-01-01

    The long-term behavior of the stem-cement interface is one of the most frequent topics of discussion in the design of cemented total hip replacements, especially with regard to the process of damage accumulation in the cement layer. This effect is analyzed here by comparing two different situations of the interface: completely bonded, and debonded with friction. This comparative analysis is performed using a probabilistic computational approach that considers the variability and uncertainty of the determinant factors that directly influence damage accumulation in the cement mantle. This stochastic technique is based on the combination of probabilistic finite elements (PFEM) and a cumulative damage approach known as the B-model. Three random variables were considered: muscle and joint contact forces at the hip (both for walking and stair climbing), cement damage, and fatigue properties of the cement. The results predicted that the regions with higher failure probability in the bulk cement are completely different depending on the stem-cement interface characteristics. For a bonded interface, critical sites appeared at the distal and medial parts of the cement, while for debonded interfaces, the critical regions were found distally and proximally. In bonded interfaces, the failure probability was higher than in debonded ones. The same conclusion may be established for stair climbing in comparison with walking activity.

  13. Developing a synthetic national population to investigate the impact of different cardiovascular disease risk management strategies: A derivation and validation study

    PubMed Central

    Jackson, Rod

    2017-01-01

    Background Many national cardiovascular disease (CVD) risk factor management guidelines now recommend that drug treatment decisions should be informed primarily by patients’ multi-variable predicted risk of CVD, rather than on the basis of single risk factor thresholds. To investigate the potential impact of treatment guidelines based on CVD risk thresholds at a national level requires individual-level data representing the multi-variable CVD risk factor profiles of a country’s total adult population. As these data are seldom, if ever, available, we aimed to create a synthetic population representing the joint CVD risk factor distributions of the adult New Zealand population. Methods and results A synthetic population of 2,451,278 individuals, representing the actual age, gender, ethnicity and social deprivation composition of people aged 30–84 years who completed the 2013 New Zealand census, was generated using Monte Carlo sampling. Each ‘synthetic’ person was then probabilistically assigned values of the remaining cardiovascular disease (CVD) risk factors required for predicting their CVD risk, based on data from the national census, national hospitalisation and drug dispensing databases, and a large regional cohort study, using Monte Carlo sampling and multiple imputation. Where possible, the synthetic population CVD risk distributions for each non-demographic risk factor were validated against independent New Zealand data sources. Conclusions We were able to develop a synthetic national population with realistic multi-variable CVD risk characteristics. The construction of this population is the first step in the development of a micro-simulation model intended to investigate the likely impact of a range of national CVD risk management strategies that will inform CVD risk management guideline updates in New Zealand and elsewhere. PMID:28384217

  14. Developing a synthetic national population to investigate the impact of different cardiovascular disease risk management strategies: A derivation and validation study.

    PubMed

    Knight, Josh; Wells, Susan; Marshall, Roger; Exeter, Daniel; Jackson, Rod

    2017-01-01

    Many national cardiovascular disease (CVD) risk factor management guidelines now recommend that drug treatment decisions should be informed primarily by patients' multi-variable predicted risk of CVD, rather than on the basis of single risk factor thresholds. To investigate the potential impact of treatment guidelines based on CVD risk thresholds at a national level requires individual-level data representing the multi-variable CVD risk factor profiles of a country's total adult population. As these data are seldom, if ever, available, we aimed to create a synthetic population representing the joint CVD risk factor distributions of the adult New Zealand population. A synthetic population of 2,451,278 individuals, representing the actual age, gender, ethnicity and social deprivation composition of people aged 30-84 years who completed the 2013 New Zealand census, was generated using Monte Carlo sampling. Each 'synthetic' person was then probabilistically assigned values of the remaining cardiovascular disease (CVD) risk factors required for predicting their CVD risk, based on data from the national census, national hospitalisation and drug dispensing databases, and a large regional cohort study, using Monte Carlo sampling and multiple imputation. Where possible, the synthetic population CVD risk distributions for each non-demographic risk factor were validated against independent New Zealand data sources. We were able to develop a synthetic national population with realistic multi-variable CVD risk characteristics. The construction of this population is the first step in the development of a micro-simulation model intended to investigate the likely impact of a range of national CVD risk management strategies that will inform CVD risk management guideline updates in New Zealand and elsewhere.
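
    The two-stage construction described above can be illustrated with a toy Monte Carlo: draw synthetic individuals from a census-style joint demographic distribution, then probabilistically assign a remaining risk factor conditional on demographics. All strata, proportions, and probabilities here are invented for illustration, not the New Zealand data:

    ```python
    # Two-stage synthetic population sketch (hypothetical inputs).
    import numpy as np

    rng = np.random.default_rng(1)
    # Stage 1: joint demographic strata with census-style proportions.
    strata = [("30-44", "F"), ("30-44", "M"), ("45-64", "F"), ("45-64", "M")]
    props = np.array([0.28, 0.27, 0.23, 0.22])
    idx = rng.choice(len(strata), size=10_000, p=props / props.sum())

    # Stage 2: assign smoking status with a stratum-specific probability,
    # standing in for imputation from linked health databases.
    p_smoke = {("30-44", "F"): 0.14, ("30-44", "M"): 0.18,
               ("45-64", "F"): 0.11, ("45-64", "M"): 0.16}
    smoker = np.array([rng.random() < p_smoke[strata[i]] for i in idx])
    print(f"synthetic smoking prevalence: {smoker.mean():.3f}")
    ```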

  15. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site, and to limit or eliminate damage during earthquake-induced liquefaction, is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, in evaluating liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  16. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary W.

    2014-01-01

    The International Space Station (ISS) end-of-life (EOL) cycle is currently scheduled for 2020, although there are ongoing efforts to extend the ISS life cycle through 2028. The EOL of the ISS will require deorbiting the station. It will be the largest manmade object ever to be deorbited; safely deorbiting the station will therefore be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered to accomplish this, such as target corridors, orbits, altitude, drag, maneuvering capabilities, etc. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the deorbit maneuvering.
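
    A hedged sketch of the kind of calculation such a PRA aggregates: the probability that enough independent thrusters survive to complete a deorbit burn, in series with a controller. The failure probabilities and the k-of-n requirement below are made-up illustrative numbers, not ISS values:

    ```python
    # k-of-n reliability building block for a deorbit-burn PRA (hypothetical numbers).
    from math import comb

    def k_of_n(k: int, n: int, p_fail: float) -> float:
        """P(at least k of n identical, independent units succeed)."""
        p = 1.0 - p_fail
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    p_thrusters = k_of_n(k=6, n=8, p_fail=0.02)   # need 6 of 8 thrusters to fire
    p_controller = 1 - 1e-3                        # assumed control-system reliability
    print(f"P(successful deorbit burn) ~ {p_thrusters * p_controller:.5f}")
    ```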

  17. Potential Impacts of Accelerated Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. R.; Vail, L. W.

    2016-05-31

    This research project is part of the U.S. Nuclear Regulatory Commission’s (NRC’s) Probabilistic Flood Hazard Assessment (PFHA) Research plan in support of developing a risk-informed licensing framework for flood hazards and design standards at proposed new facilities, and significance determination tools for evaluating potential deficiencies related to flood protection at operating facilities. The PFHA plan aims to build upon recent advances in deterministic, probabilistic, and statistical modeling of extreme precipitation events to develop regulatory tools and guidance for NRC staff with regard to PFHA for nuclear facilities. The tools and guidance developed under the PFHA plan will support and enhance NRC’s capacity to perform thorough and efficient reviews of license applications and license amendment requests. They will also support risk-informed significance determination of inspection findings, unusual events, and other oversight activities.

  18. Expectancy Learning from Probabilistic Input by Infants

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2013-01-01

    Across the first few years of life, infants readily extract many kinds of regularities from their environment, and this ability is thought to be central to development in a number of domains. Numerous studies have documented infants’ ability to recognize deterministic sequential patterns. However, little is known about the processes infants use to build and update representations of structure in time, and how infants represent patterns that are not completely predictable. The present study investigated how infants’ expectations for a simple structure develop over time, and how infants update their representations with new information. We measured 12-month-old infants’ anticipatory eye movements to targets that appeared in one of two possible locations. During the initial phase of the experiment, infants either saw targets that appeared consistently in the same location (Deterministic condition) or probabilistically in either location, with one side more frequent than the other (Probabilistic condition). After this initial divergent experience, both groups saw the same sequence of trials for the rest of the experiment. The results show that infants readily learn from both deterministic and probabilistic input, with infants in both conditions reliably predicting the most likely target location by the end of the experiment. Local context had a large influence on behavior: infants adjusted their predictions to reflect changes in the target location on the previous trial. This flexibility was particularly evident in infants with more variable prior experience (the Probabilistic condition). The results provide some of the first data showing how infants learn in real time. PMID:23439947

  19. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo, using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out-of-sample experiment in which data from 1950–1990 are used for estimation and the period 1990–2010 is predicted. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest-old from about 2050, and that they underestimate uncertainty for high-fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249

  20. Probabilistic characterization of wind turbine blades via aeroelasticity and spinning finite element formulation

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, R. Andrew

    2012-04-01

    Wind energy is an increasingly important component of this nation's renewable energy portfolio; however, safe and economical wind turbine operation is critical to ensure continued adoption. Safe operation of wind turbine structures requires information regarding not only their condition but also their operational environment. Given the difficulty inherent in structural health monitoring (SHM) processes for wind turbines (damage detection, location, and characterization), some uncertainty in condition assessment is expected. Furthermore, given the stochastic nature of the loading on turbine structures, a probabilistic framework is appropriate to characterize their risk of failure at a given time. Such information will be invaluable to turbine controllers, allowing them to operate the structures within acceptable risk profiles. This study explores the characterization of the turbine loading and response envelopes for critical failure modes of the turbine blade structures. A framework is presented to develop an analytical estimation of the loading environment (including loading effects) based on the dynamic behavior of the blades. This is influenced by behaviors including along-wind and across-wind aeroelastic effects, wind shear gradient, tower shadow effects, and centrifugal stiffening effects. The proposed solution includes methods that are based on modal decomposition of the blades and require frequent updates to the estimated modal properties to account for the time-varying nature of the turbine and its environment. The estimated demand statistics are compared to a code-based resistance curve to determine a probabilistic estimate of the risk of blade failure given the loading environment.

  1. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
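
    Separating uncertainty from variability is commonly done with a two-dimensional (nested) Monte Carlo, which is one way to sketch the idea above: the outer loop samples uncertain, imperfectly known parameters, while the inner loop samples variability, so the spread in the concentration ratio can be attributed to each source. All distributions below are illustrative, not the paper's:

    ```python
    # Two-dimensional Monte Carlo: uncertainty (outer) vs. variability (inner).
    import numpy as np

    rng = np.random.default_rng(7)
    n_outer, n_inner = 200, 2000
    exceed = np.empty(n_outer)
    for i in range(n_outer):
        mu_exp = rng.normal(np.log(0.5), 0.3)   # uncertain mean log-exposure
        mu_eff = rng.normal(np.log(5.0), 0.3)   # uncertain mean log-effect conc.
        exposure = rng.lognormal(mu_exp, 0.8, n_inner)   # variability
        effect = rng.lognormal(mu_eff, 0.6, n_inner)
        exceed[i] = np.mean(exposure / effect > 1.0)     # P(concentration ratio > 1)

    print(f"risk: median {np.median(exceed):.4f}, "
          f"90% uncertainty interval [{np.quantile(exceed, 0.05):.4f}, "
          f"{np.quantile(exceed, 0.95):.4f}]")
    ```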

  2. Subsea release of oil from a riser: an ecological risk assessment.

    PubMed

    Nazir, Muddassir; Khan, Faisal; Amyotte, Paul; Sadiq, Rehan

    2008-10-01

    This study illustrates a newly developed methodology, as a part of the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater release of oil and gas. It combines the hydrodynamics of underwater blowout, weathering algorithms, and multimedia fate and transport to measure the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties are accounted for in multimedia input parameters in the analysis. The 95th percentile of the exposure concentration (EC(95%)) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is utilized to characterize EC(95%) and its associated uncertainty. The toxicity data of 19 species available in the literature are used to calculate the 5th percentile of the predicted no observed effect concentration (PNEC(5%)) by employing the bootstrapping method. The risk is characterized by transforming the risk quotient (RQ), which is the ratio of EC(95%) to PNEC(5%), into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision-making viewpoints. Two case studies, one of an underwater oil and gas mixture release and one of an oil release with no gaseous mixture, are used to show the systematic implementation of the methodology, the elements of ERA, and the probabilistic method in assessing and characterizing the risk.
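
    The bootstrap step described above can be sketched as follows: resample exposure concentrations to characterize EC(95%), resample species toxicity values to characterize PNEC(5%), then form the risk quotient RQ = EC(95%)/PNEC(5%). The input arrays are synthetic stand-ins for the modelled concentrations and the 19-species toxicity data:

    ```python
    # Bootstrap characterization of EC95, PNEC5, and the risk quotient RQ.
    import numpy as np

    rng = np.random.default_rng(3)
    exposure = rng.lognormal(np.log(1e-3), 1.0, size=500)   # modelled conc., mg/L
    toxicity = rng.lognormal(np.log(0.5), 1.2, size=19)     # species NOECs, mg/L

    B = 5000
    ec95 = np.array([np.quantile(rng.choice(exposure, exposure.size), 0.95)
                     for _ in range(B)])                    # resample with replacement
    pnec5 = np.array([np.quantile(rng.choice(toxicity, toxicity.size), 0.05)
                      for _ in range(B)])
    rq = ec95 / pnec5
    print(f"P(RQ > 1) ~ {(rq > 1).mean():.3f}")
    ```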

  3. Environmental risk assessment of white phosphorus from the use of munitions - a probabilistic approach.

    PubMed

    Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager

    2010-03-15

    White phosphorus (P(4)) is a highly toxic compound used in various pyrotechnic products. Munitions containing P(4) are widely used in military training areas, where the unburned products of P(4) contaminate soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P(4) in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P(4) by ingestion, for example. The approach presented here to assess the environmental risk from the use of P(4) proposes a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P(4). Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool, in which the risk of P(4) poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P(4) was detected in several craters on the range at concentrations up to 5.7 g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.

  4. Environmental risk assessment of engineered nano-SiO2, nano iron oxides, nano-CeO2, nano-Al2O3, and quantum dots.

    PubMed

    Wang, Yan; Nowack, Bernd

    2018-05-01

    Many research studies have endeavored to investigate the ecotoxicological hazards of engineered nanomaterials (ENMs). However, little is known regarding the actual environmental risks of ENMs, combining both hazard and exposure data. The aim of the present study was to quantify the environmental risks for nano-Al2O3, nano-SiO2, nano iron oxides, nano-CeO2, and quantum dots by comparing the predicted environmental concentrations (PECs) with the predicted-no-effect concentrations (PNECs). The PEC values of these 5 ENMs in freshwaters in 2020 for northern Europe and southeastern Europe were taken from a published dynamic probabilistic material flow analysis model. The PNEC values were calculated using probabilistic species sensitivity distribution (SSD). The order of the PNEC values was quantum dots < nano-CeO2 < nano iron oxides < nano-Al2O3 < nano-SiO2. The risks posed by these 5 ENMs were demonstrated to be in the reverse order: nano-Al2O3 > nano-SiO2 > nano iron oxides > nano-CeO2 > quantum dots. However, all risk characterization values are 4 to 8 orders of magnitude lower than 1, and no risk was therefore predicted for any of the investigated ENMs at the estimated release level in 2020. Compared to static models, the dynamic material flow model allowed us to use PEC values based on a more complex parameterization, considering a dynamic input over time and time-dependent release of ENMs. The probabilistic SSD approach makes it possible to include all available data to estimate hazards of ENMs by considering the whole range of variability between studies and material types. The risk-assessment approach is therefore able to handle the uncertainty and variability associated with the collected data. The results of the present study provide a scientific foundation for risk-based regulatory decisions of the investigated ENMs. Environ Toxicol Chem 2018;37:1387-1395. © 2018 SETAC.
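
    A minimal species sensitivity distribution (SSD) sketch of the PEC/PNEC comparison above: fit a log-normal SSD to species endpoint values, take the 5th percentile (HC5) as the basis for a PNEC, and compare with a PEC. The endpoints, assessment factor, and PEC are invented numbers, not the nanomaterial data used in the study:

    ```python
    # Log-normal SSD: HC5 and a simple risk characterization ratio.
    import numpy as np

    endpoints = np.array([0.8, 1.5, 2.2, 3.9, 5.0, 7.4, 12.0, 18.5])  # species data, mg/L
    logs = np.log(endpoints)
    mu, sigma = logs.mean(), logs.std(ddof=1)

    # HC5 of a log-normal: exp(mu + sigma * z_0.05), with z_0.05 ~ -1.645.
    hc5 = np.exp(mu - 1.645 * sigma)
    pnec = hc5 / 10.0          # assessment factor of 10 (assumed)
    pec = 1e-4                 # predicted environmental concentration, mg/L
    print(f"HC5 = {hc5:.3f} mg/L, risk characterization ratio = {pec / pnec:.2e}")
    ```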

  5. APPLICATION AND EVALUATION OF AN AGGREGATE PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL FOR QUANTIFYING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS

    EPA Science Inventory

    Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...

  6. 77 FR 58590 - Determining Technical Adequacy of Probabilistic Risk Assessment for Risk-Informed License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... reactors or for activities associated with review of applications for early site permits and combined licenses (COL) for the Office of New Reactors (NRO). DATES: The effective date of this SRP update is... Rulemaking Web Site: Go to http://www.regulations.gov and search for Docket ID NRC-2010-0138. Address...

  7. Validation of a probabilistic post-fire erosion model

    Treesearch

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller

    2016-01-01

    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  8. Quick probabilistic binary image matching: changing the rules of the game

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2016-09-01

    A Probabilistic Matching Model for Binary Images (PMMBI) is presented that predicts the probability of matching binary images with any level of similarity. The model relates the number of mappings, the amount of similarity between the images and the detection confidence. We show the advantage of using a probabilistic approach to matching in similarity space as opposed to a linear search in size space. With PMMBI a complete model is available to predict the quick detection of dissimilar binary images. Furthermore, the similarity between the images can be measured to a good degree if the images are highly similar. PMMBI shows that only a few pixels need to be compared to detect dissimilarity between images, as low as two pixels in some cases. PMMBI is image size invariant; images of any size can be matched at the same quick speed. Near-duplicate images can also be detected without much difficulty. We present tests on real images that show the prediction accuracy of the model.
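
    A back-of-the-envelope version of the core idea (our reading, not the paper's exact model): if two binary images agree on a fraction s of their pixels, then k randomly chosen pixel comparisons all match with probability s^k, so dissimilar images are rejected after very few probes regardless of image size:

    ```python
    # Probabilistic early rejection of dissimilar binary images.
    def p_all_match(s: float, k: int) -> float:
        """P(k independent random pixel probes all agree) for similarity s."""
        return s ** k

    for s in (0.5, 0.9, 0.99):
        # probes needed so that P(no mismatch seen yet) drops below 1%
        k = 1
        while p_all_match(s, k) > 0.01:
            k += 1
        print(f"s={s}: {k} probes suffice to detect dissimilarity w.p. 99%")
    ```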

  9. Identifying a Probabilistic Boolean Threshold Network From Samples.

    PubMed

    Melkman, Avraham A; Cheng, Xiaoqing; Ching, Wai-Ki; Akutsu, Tatsuya

    2018-04-01

    This paper studies the problem of exactly identifying the structure of a probabilistic Boolean network (PBN) from a given set of samples, where PBNs are probabilistic extensions of Boolean networks. Cheng et al. studied the problem while focusing on PBNs consisting of pairs of AND/OR functions. This paper considers PBNs consisting of Boolean threshold functions, focusing on those threshold functions that have unit coefficients. The treatment of Boolean threshold functions, and of triplets and larger tuples of such functions, necessitates a deepening of the theoretical analyses. It is shown that wide classes of PBNs with such threshold functions can be exactly identified from samples under reasonable constraints, which include: 1) PBNs in which any number of threshold functions can be assigned, provided that all have the same number of input variables; and 2) PBNs consisting of pairs of threshold functions with different numbers of input variables. It is also shown that the problem of deciding the equivalence of two Boolean threshold functions is solvable in pseudopolynomial time but remains co-NP complete.

  10. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state

    PubMed Central

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-01-01

    Background Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning: the "jumping to conclusion" bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). Methods We administered the classical beads task and fMRI among ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks request probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and the process of final decision-making. Results We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate JTC bias on the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Limitations Owing to the cross-sectional design of the study, results are constrained to a better insight into the neurobiology of risk constellations, but not pre-psychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. Conclusion As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk. PMID:25622039

  11. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state.

    PubMed

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-05-01

    Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning: the "jumping to conclusion" bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). We administered the classical beads task and fMRI among ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks request probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and the process of final decision-making. We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate JTC bias on the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Owing to the cross-sectional design of the study, results are constrained to a better insight into the neurobiology of risk constellations, but not prepsychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk.

  12. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.

  13. MODELED ESTIMATES OF CHLORPYRIFOS EXPOSURE AND DOSE FOR THE MINNESOTA AND ARIZONA NHEXAS POPULATIONS

    EPA Science Inventory

    This paper presents a probabilistic, multimedia, multipathway exposure model and assessment for chlorpyrifos developed as part of the National Human Exposure Assessment Survey (NHEXAS). The model was constructed using available information prior to completion of the NHEXAS stu...

  14. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are also commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantifying water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach combining risk assessment with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of the risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets, and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
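
    A toy version of step (2) above, combining quantified risk reduction with CEA: each alternative's cost-effectiveness ratio is its annualized cost divided by the risk reduction it achieves on the fault-tree top event. The alternatives, risks, and costs are hypothetical:

    ```python
    # Cost-effectiveness ranking of risk-reduction alternatives (invented numbers).
    baseline_risk = 0.040          # P(failing water safety target), per year
    alternatives = {
        "new UV barrier":         {"risk": 0.012, "cost": 250_000},  # cost per year
        "source protection":      {"risk": 0.025, "cost": 60_000},
        "network rehabilitation": {"risk": 0.030, "cost": 40_000},
    }
    for name, a in alternatives.items():
        reduction = baseline_risk - a["risk"]        # risk reduction achieved
        cer = a["cost"] / reduction                  # cost per unit risk reduced
        print(f"{name:24s} risk reduction {reduction:.3f}/yr, "
              f"cost-effectiveness {cer:,.0f}")
    ```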

  15. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  16. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    PubMed

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  17. Assessment of uncertainty in discrete fracture network modeling using probabilistic distribution method.

    PubMed

    Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan

    2017-11-01

    There have been widespread concerns about solute transport problems in fractured media, e.g. the disposal of high-level radioactive waste in geological fractured rocks. Numerical simulation of particle tracking is gradually being employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider one particular realization of the fracture distribution based on fracture statistics; this significantly underestimates the uncertainty in evaluating the risk of a radioactive waste repository. To adequately assess the uncertainty of DFN modeling at a potential site for the disposal of high-level radioactive waste, this paper utilized the probabilistic distribution method (PDM). The method was applied to evaluate the risk of a nuclear waste repository in Beishan, China. Moreover, the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and of multiple realizations were demonstrated. Probabilistic distributions of 20 realizations at different times were also obtained. The results showed that the employed PDM can be used to describe the ranges of contaminant particle transport. After 5 × 10^6 days, the high-probability contaminated areas, covering 25,400 m^2, were concentrated near the release point rather than in more distant areas.
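
    A simplified illustration of the probabilistic distribution method: run many realizations, track where particles end up, and report the fraction of realizations contaminating each cell. The random-walk "transport" below is a placeholder for a real DFN particle tracker, and all parameters are invented:

    ```python
    # Contamination probability map across DFN-style realizations (toy model).
    import numpy as np

    rng = np.random.default_rng(11)
    n_real, n_part, n_steps, n_cells = 20, 200, 50, 30
    hit = np.zeros((n_real, n_cells), dtype=bool)
    for r in range(n_real):
        drift = rng.uniform(0.1, 0.4)                  # realization-specific network
        x = np.zeros(n_part)
        for _ in range(n_steps):
            x += drift + rng.normal(0, 0.5, n_part)    # advection + dispersion
        cells = np.clip(x.astype(int), 0, n_cells - 1)
        hit[r, np.unique(cells)] = True                # cells reached by any particle

    prob = hit.mean(axis=0)        # P(cell contaminated) across realizations
    print("max cell contamination probability:", prob.max())
    ```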

  18. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study of Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. The proposed approach is therefore an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
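
    A minimal probabilistic OWA aggregation in the spirit described above: criteria weights are sampled to represent uncertainty, weighted scores are reordered and combined with order weights expressing the analyst's attitude, and Monte Carlo repetition yields a distribution of susceptibility scores per location. All inputs are synthetic, and this simplifies the paper's local OWA variant:

    ```python
    # Monte Carlo OWA aggregation over uncertain criteria weights.
    import numpy as np

    rng = np.random.default_rng(5)
    scores = rng.random((100, 6))          # 100 locations x 6 standardized criteria
    order_w = np.array([0.30, 0.25, 0.18, 0.12, 0.09, 0.06])  # order weights
    # (more weight on the largest weighted scores: an optimistic attitude)

    n_sim = 1000
    result = np.empty((n_sim, scores.shape[0]))
    for i in range(n_sim):
        w = rng.dirichlet(np.ones(6))                    # sampled criteria weights
        weighted = scores * w                            # apply criteria weights
        ordered = np.sort(weighted, axis=1)[:, ::-1]     # reorder, largest first
        result[i] = ordered @ order_w                    # OWA combination
    susceptibility = result.mean(axis=0)
    print("most susceptible location:", susceptibility.argmax())
    ```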

  19. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562

  20. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4.

  1. Methodology to identify risk-significant components for inservice inspection and testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues.
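
    A sketch of the kind of risk-based ranking described: relative risk is taken as failure frequency times a consequence measure, and components are inspected in descending order. The components, frequencies, and consequence scores below are illustrative only:

    ```python
    # Risk-based component ranking: risk = failure frequency x consequence.
    components = {
        "primary coolant pump": (2e-3, 9.0),   # (failures/yr, consequence score)
        "relief valve":         (5e-4, 8.0),
        "instrument air line":  (1e-2, 2.0),
        "spray nozzle":         (3e-3, 1.5),
    }
    ranked = sorted(components.items(),
                    key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    for name, (freq, cons) in ranked:
        print(f"{name:22s} relative risk = {freq * cons:.2e}")
    ```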

  2. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following an idea by Festger and Walter, 2002. These quasi steady-state flow fields are cast into a geostatistical Monte Carlo framework to admit and evaluate the influence of parameter uncertainty on the delineation process. Furthermore, this framework enables conditioning on observed data with any conditioning scheme, such as rejection sampling, Ensemble Kalman Filters, etc. To further reduce the computational load, we use the reverse formulation of advective-dispersive transport. We simulate the reverse transport by particle tracking random walk in order to avoid numerical dispersion to account for well arrival times.

  3. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
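
    A minimal Monte Carlo event-tree sketch of such an ascent/abort simulation: each run samples whether a propulsion failure occurs and when, picks the abort mode available in that time window, and samples its success. The probabilities and time windows are invented for demonstration, exactly as the abstract cautions for its own results:

    ```python
    # Monte Carlo event tree for ascent/abort outcomes (hypothetical numbers).
    import random

    random.seed(0)
    P_FAIL = 0.01                      # P(propulsion failure during ascent)
    ABORT_MODES = [                    # (window end in s, mode name, P(success))
        (120, "RTLS", 0.90),
        (300, "TAL",  0.95),
        (520, "ATO",  0.99),
    ]

    def one_ascent():
        if random.random() > P_FAIL:
            return "nominal"
        t = random.uniform(0, 520)     # failure time during the ascent
        for t_max, mode, p_ok in ABORT_MODES:
            if t <= t_max:
                return mode if random.random() < p_ok else "loss"
        return "loss"

    runs = [one_ascent() for _ in range(100_000)]
    print({m: runs.count(m) for m in set(runs)})
    ```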

  4. Time-varying loss forecast for an earthquake scenario in Basel, Switzerland

    NASA Astrophysics Data System (ADS)

    Herrmann, Marcus; Zechar, Jeremy D.; Wiemer, Stefan

    2014-05-01

    When an unexpected earthquake occurs, people suddenly want advice on how to cope with the situation. The 2009 L'Aquila quake highlighted the significance of public communication and pushed the usage of scientific methods to drive alternative risk mitigation strategies. For instance, van Stiphout et al. (2010) suggested a new approach for objective short-term evacuation decisions: probabilistic risk forecasting combined with cost-benefit analysis. In the present work, we apply this approach to an earthquake sequence that simulated a repeat of the 1356 Basel earthquake, one of the most damaging events in Central Europe. A recent development to benefit society in case of an earthquake is the probabilistic forecasting of aftershock occurrence, but seismic risk delivers a more direct expression of the socio-economic impact. To forecast seismic risk in the short term, we translate aftershock probabilities into time-varying seismic hazard and combine this with time-invariant loss estimation. Compared with van Stiphout et al. (2010), we use an advanced aftershock forecasting model and detailed settlement data, which allow spatial forecasts and settlement-specific decision-making. We quantify the risk forecast probabilistically in terms of human loss. For instance, one minute after the M6.6 mainshock, the probability for an individual to die within the next 24 hours is 41,000 times higher than the long-term average, but the absolute value remains a minor 0.04%. The final cost-benefit analysis adds value beyond a pure statistical approach: it provides objective statements that may justify evacuations. To deliver supportive information in a simple form, we propose a warning approach in terms of alarm levels. Our results do not justify evacuations prior to the M6.6 mainshock, but they do in certain districts afterwards. The ability to forecast short-term seismic risk at any time, and with sufficient data anywhere, is the first step of personal decision-making and raising risk awareness among the public. Reference: Van Stiphout, T., S. Wiemer, and W. Marzocchi (2010). 'Are short-term evacuations warranted? Case of the 2009 L'Aquila earthquake'. Geophysical Research Letters 37.6, pp. 1-5. http://onlinelibrary.wiley.com/doi/10.1029/2009GL042352/abstract
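
    A toy version of the cost-benefit rule behind such alarm levels: evacuate a district when the expected loss avoided exceeds the evacuation cost. The 24-hour individual death probability is the forecast quantity; the monetary values are placeholders, not those used in the study:

    ```python
    # Cost-benefit evacuation rule (hypothetical monetary values).
    def evacuate(p_death_24h: float, population: int,
                 value_per_life: float = 5e6, cost_per_person: float = 100.0) -> bool:
        """True if expected fatalities avoided are worth the evacuation cost."""
        expected_loss_avoided = p_death_24h * population * value_per_life
        evacuation_cost = population * cost_per_person
        return expected_loss_avoided > evacuation_cost

    # e.g. the post-mainshock probability quoted above, 0.04% over 24 hours:
    print(evacuate(p_death_24h=4e-4, population=20_000))   # -> True
    ```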

  5. Science Goals in Radiation Protection for Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2008-01-01

    Space radiation presents major challenges to future missions to the Earth's moon or Mars. Health risks of concern include cancer, degenerative and performance risks to the central nervous system, heart and lens, and the acute radiation syndromes. The galactic cosmic rays (GCR) contain high energy and charge (HZE) nuclei, which have been shown to cause qualitatively distinct biological damage compared to terrestrial radiation, such as X-rays or gamma-rays, causing risk estimates to be highly uncertain. The biological effects of solar particle events (SPE) are similar to terrestrial radiation except for their biological dose-rate modifiers; however, the onset and size of SPEs are difficult to predict. The high energies of GCR reduce the effectiveness of shielding, and while SPEs can be shielded, the current gap in radiobiological knowledge hinders optimization. Methods used to project risks on Earth must be modified because of the large uncertainties in projecting health risks from space radiation, and thus impact mission requirements and costs. We describe NASA's unique approach to radiation safety that applies probabilistic risk assessments and uncertainty-based criteria within the occupational health program for astronauts and to mission design. The two terrestrial criteria of a point estimate of maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in radiation risk projection models. Exploration science goals in radiation protection are centered on ground-based research to achieve the necessary biological knowledge, and on the development of new technologies to improve SPE monitoring and optimize shielding. Radiobiology research is centered on a ground-based program investigating the radiobiology of high-energy protons and HZE nuclei at the NASA Space Radiation Laboratory (NSRL) located at DOE's Brookhaven National Laboratory in Upton, NY. We describe recent NSRL results that are closing the knowledge gap in HZE radiobiology and improving exploration risk estimates. Linking probabilistic risk assessment to research goals makes it possible to express risk management objectives in terms of quantitative metrics, which include the number of days in space without exceeding a given risk level within well-defined confidence limits, and probabilistic assessments of the effectiveness of design trade spaces such as material type, mass, solar cycle, crew selection criteria, and biological countermeasures. New research in SPE alert and risk assessment, individual radiation sensitivity, and biological countermeasure development is described.

  6. 76 FR 70768 - Biweekly Notice; Applications and Amendments to Facility Operating Licenses Involving No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... perform a probabilistic risk evaluation using the guidance contained in NRC approved NEI [Nuclear Energy... Issue Summary 2003-18, Supplement 2, "Use of Nuclear Energy Institute (NEI) 99-01, Methodology for...

  7. Ranking of sabotage/tampering avoidance technology alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, W.B.; Tabatabai, A.S.; Powers, T.B.

    1986-01-01

    Pacific Northwest Laboratory conducted a study to evaluate alternatives to the design and operation of nuclear power plants, emphasizing a reduction of their vulnerability to sabotage. Estimates of core melt accident frequency during normal operations and from sabotage/tampering events were used to rank the alternatives. Core melt frequency for normal operations was estimated using sensitivity analysis of results of probabilistic risk assessments. Core melt frequency for sabotage/tampering was estimated by developing a model based on probabilistic risk analyses, historic data, engineering judgment, and safeguards analyses of plant locations where core melt events could be initiated. Results indicate the most effective alternatives focus on large areas of the plant, increase safety system redundancy, and reduce reliance on single locations for mitigation of transients. Less effective options focus on specific areas of the plant, reduce reliance on some plant areas for safe shutdown, and focus on less vulnerable targets.

  8. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary

    2014-01-01

    Although there are ongoing efforts to extend the International Space Station (ISS) life cycle through 2028, the station's end-of-life (EOL) is currently scheduled for 2020. The EOL for the ISS will require de-orbiting the station. This will be the largest manmade object ever to be de-orbited; therefore, safely de-orbiting the station will be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered to accomplish this, such as target corridors, orbits, altitude, drag, maneuvering capabilities, debris mapping, etc. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the de-orbit maneuvering.

  9. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and life loss for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information for the evaluation of the hazards, the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, sensitive elements such as schools and hospitals) and the process-specific vulnerability, and due to a lack of knowledge of the processes themselves (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, as for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., depth of water during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in all three cases around 30% of the expected value. Each of the models, nevertheless, requires different assumptions and computational efforts, and provides results with different levels of detail.
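
    For readers unfamiliar with the two families of methods, the sketch below propagates uncertainty through a toy risk product both by Monte Carlo and by first-order second-moment (FOSM) linearization. The model and its moments are invented for illustration, not the Brescia inputs.

    ```python
    import numpy as np

    # Toy risk model: risk = p_event * exposed_value * vulnerability.
    def f(p, v, w):
        return p * v * w

    mu = np.array([0.01, 1e6, 0.2])    # assumed means of the three inputs
    sd = np.array([0.003, 2e5, 0.05])  # assumed standard deviations

    # Monte Carlo: sample independent normals and propagate through f.
    rng = np.random.default_rng(1)
    samples = rng.normal(mu, sd, size=(100_000, 3))
    risk_mc = f(*samples.T)
    print(f"MC   mean={risk_mc.mean():.1f}  sd={risk_mc.std():.1f}")

    # FOSM: linearize f around the mean; the gradient of a product is analytic.
    grad = np.array([mu[1] * mu[2], mu[0] * mu[2], mu[0] * mu[1]])
    sd_fosm = np.sqrt(np.sum((grad * sd) ** 2))
    print(f"FOSM mean={f(*mu):.1f}  sd={sd_fosm:.1f}")
    ```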

  10. System Maturity Indices for Decision Support in the Defense Acquisition Process

    DTIC Science & Technology

    2008-04-23

    technologies, but was to be used as ontology for contracting support (Sadin, Povinelli, & Rosen, 1989), thus TRL does not address: A complete...via probabilistic solution discovery. Reliability Engineering & System Safety. In press. Sadin, S.R., Povinelli, F.P., & Rosen, R. (1989). The NASA

  11. Research Analysis on MOOC Course Dropout and Retention Rates

    ERIC Educational Resources Information Center

    Gomez-Zermeno, Marcela Georgina; Aleman de La Garza, Lorena

    2016-01-01

    This research's objective was to identify the terminal efficiency of the Massive Online Open Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…

  12. Impact of refining the assessment of dietary exposure to cadmium in the European adult population.

    PubMed

    Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan

    2013-01-01

    Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
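
    A minimal sketch of this style of probabilistic exposure assessment follows; the food groups, distributions, and parameters are assumptions for illustration, not the EFSA occurrence or consumption data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000                                    # simulated individuals

    bw = rng.normal(70, 12, n).clip(40, 120)      # body weight, kg (assumed)
    # Two hypothetical food groups: weekly intake (kg) and occurrence (ug/kg).
    intake = rng.lognormal(mean=[0.0, -1.0], sigma=0.4, size=(n, 2))
    occurrence = rng.lognormal(mean=[3.0, 4.0], sigma=0.6, size=(n, 2))

    # Weekly exposure per kg body weight, summed over food groups.
    exposure = (intake * occurrence).sum(axis=1) / bw   # ug / kg bw / week
    twi = 2.5                                           # tolerable weekly intake

    print(f"mean={exposure.mean():.2f}, P95={np.percentile(exposure, 95):.2f}")
    print(f"fraction above TWI: {(exposure > twi).mean():.1%}")
    ```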

  13. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight Optimization System program) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain some insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  14. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure-time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top-level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (e.g., Quantitative Risk Assessment Software (QRAS)). Hypothetical PSA results for a number of structural components, along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure, were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) are also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults, and other upgrades were evaluated in determining overall turbine engine reliability.
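
    The combination step the paper describes can be sketched as follows: draw failure times from per-component distributions, apply a mitigation factor that keeps some failures from propagating to loss of mission, and take the system failure time as the minimum. All distributions and parameters below are hypothetical, not the paper's QRAS model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    mission_hours = 5_000.0

    # Hypothetical components: Weibull structural wear-out modes and a
    # constant-hazard (exponential) electronic controller.
    t_blade = 8_000 * rng.weibull(2.5, n)   # fatigue-driven blade failures
    t_disk = 12_000 * rng.weibull(3.0, n)
    t_fadec = rng.exponential(40_000, n)    # electronics, constant hazard

    # Assume only 30% of blade failures propagate to loss of mission.
    propagates = rng.random(n) < 0.30
    t_blade_lom = np.where(propagates, t_blade, np.inf)

    # System LOM time: earliest propagating failure across components.
    t_lom = np.minimum.reduce([t_blade_lom, t_disk, t_fadec])
    print(f"P(LOM within mission) = {(t_lom < mission_hours).mean():.4f}")
    ```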

  15. Uncertainty and probability in wildfire management decision support: An example from the United States [Chapter 4

    Treesearch

    Matthew Thompson; David Calkin; Joe H. Scott; Michael Hand

    2017-01-01

    Wildfire risk assessment is increasingly being adopted to support federal wildfire management decisions in the United States. Existing decision support systems, specifically the Wildland Fire Decision Support System (WFDSS), provide a rich set of probabilistic and risk‐based information to support the management of active wildfire incidents. WFDSS offers a wide range...

  16. A simulation of probabilistic wildfire risk components for the continental United States

    Treesearch

    Mark A. Finney; Charles W. McHugh; Isaac C. Grenfell; Karin L. Riley; Karen C. Short

    2011-01-01

    This simulation research was conducted in order to develop a large-fire risk assessment system for the contiguous land area of the United States. The modeling system was applied to each of 134 Fire Planning Units (FPUs) to estimate burn probabilities and fire size distributions. To obtain stable estimates of these quantities, fire ignition and growth were simulated for...

  17. Constellation Probabilistic Risk Assessment (PRA): Design Consideration for the Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis

    2010-01-01

    Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascent from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
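
    As a reminder of the arithmetic behind a phased-mission model, the snippet below chains hypothetical per-phase failure probabilities into a loss-of-mission figure. The phase list mirrors the mission above, but the numbers are placeholders, not results of the CEV PRA.

    ```python
    # Toy phased-mission calculation: each phase must succeed in sequence.
    phases = {
        "launch": 0.002, "rendezvous": 0.001, "translunar": 0.0015,
        "surface_ops": 0.001, "ascent": 0.002, "return": 0.001,
    }

    p_success = 1.0
    for name, p_fail in phases.items():
        p_success *= (1.0 - p_fail)   # survive this phase, then the next

    print(f"P(mission success) = {p_success:.5f}")
    print(f"P(loss of mission) = {1 - p_success:.5f}")
    ```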

  18. Health risk assessment of ochratoxin A for all age-sex strata in a market economy.

    PubMed

    Kuiper-Goodman, T; Hilts, C; Billiard, S M; Kiparissis, Y; Richard, I D K; Hayward, S

    2010-02-01

    In order to manage risk of ochratoxin A (OTA) in foods, we re-evaluated the tolerable daily intake (TDI), derived the negligible cancer risk intake (NCRI), and conducted a probabilistic risk assessment. A new approach was developed to derive 'usual' probabilistic exposure in the presence of highly variable occurrence data, such as encountered with low levels of OTA. Canadian occurrence data were used for various raw food commodities or finished foods and were combined with US Department of Agriculture (USDA) food consumption data, which included data on infants and young children. Both variability and uncertainty in input data were considered in the resulting exposure estimates for various age/sex strata. Most people were exposed to OTA on a daily basis. Mean adjusted exposures for all age-sex groups were generally below the NCRI of 4 ng OTA kg⁻¹ bw, except for 1-4-year-olds as a result of their lower body weight. For children, the major contributors of OTA were wheat-based foods followed by oats, rice, and raisins. Beer, coffee, and wine also contributed to total OTA exposure in older individuals. Predicted exposure to OTA decreased when European Commission maximum limits were applied to the occurrence data. The impact on risk for regular eaters of specific commodities was also examined.

  19. Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication

    NASA Astrophysics Data System (ADS)

    Thompson, Kimberly M.

    Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present insights from efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.

  20. A probabilistic risk assessment for deployed military personnel after the implementation of the "Leishmaniasis Control Program" at Tallil Air Base, Iraq.

    PubMed

    Schleier, Jerome J; Davis, Ryan S; Barber, Loren M; Macedo, Paula A; Peterson, Robert K D

    2009-05-01

    Leishmaniasis has been of concern to the U.S. military and has re-emerged in importance because of recent deployments to the Middle East. We conducted a retrospective probabilistic risk assessment for military personnel potentially exposed to insecticides during the "Leishmaniasis Control Plan" (LCP) undertaken in 2003 at Tallil Air Base, Iraq. We estimated acute and subchronic risks from resmethrin, malathion, piperonyl butoxide (PBO), and pyrethrins applied using a truck-mounted ultra-low-volume (ULV) sprayer and lambda-cyhalothrin, cyfluthrin, bifenthrin, chlorpyrifos, and cypermethrin used for residual sprays. We used the risk quotient (RQ) method for our risk assessment (estimated environmental exposure/toxic endpoint) and set the RQ level of concern (LOC) at 1.0. Acute RQs for truck-mounted ULV and residual sprays ranged from 0.00007 to 33.3 at the 95th percentile. Acute exposure to lambda-cyhalothrin, bifenthrin, and chlorpyrifos exceeded the RQ LOC. Subchronic RQs for truck-mounted ULV and residual sprays ranged from 0.00008 to 32.8 at the 95th percentile. Subchronic exposures to lambda-cyhalothrin and chlorpyrifos exceeded the LOC. However, estimated exposures to lambda-cyhalothrin, bifenthrin, and chlorpyrifos did not exceed their respective no observed adverse effect levels.
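
    The RQ calculation itself is a one-liner; with a probabilistic exposure term it becomes a percentile of a distribution. A sketch with invented dose parameters (not the study's exposure model):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Hypothetical dose distribution after a ULV application, mg/kg/day.
    dose = rng.lognormal(mean=-4.0, sigma=1.0, size=100_000)
    toxic_endpoint = 0.05        # assumed NOAEL-based endpoint, mg/kg/day

    # RQ = estimated environmental exposure / toxic endpoint.
    rq = dose / toxic_endpoint
    rq95 = np.percentile(rq, 95)
    print(f"RQ (95th pct) = {rq95:.2f}; exceeds LOC of 1.0: {rq95 > 1.0}")
    ```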

  1. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
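
    Crossing fragility curves with PSHA output reduces, in its simplest form, to integrating a lognormal fragility against an annual hazard curve. The sketch below uses invented fragility parameters and hazard ordinates, not those of the south Italy test site.

    ```python
    import numpy as np
    from scipy.stats import norm

    def p_fail_given_pga(pga, median=0.4, beta=0.5):
        """Lognormal fragility: P(failure | PGA) for a hypothetical tank."""
        return norm.cdf(np.log(pga / median) / beta)

    # Hypothetical annual exceedance frequencies for a few PGA levels (1/yr).
    pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
    annual_freq = np.array([1e-2, 3e-3, 1e-3, 4e-4, 2e-4])

    # Convert exceedance frequencies to per-bin occurrence frequencies and
    # integrate: annual frequency of seismically induced loss of containment.
    occ = -np.diff(np.append(annual_freq, 0.0))
    lambda_loc = np.sum(occ * p_fail_given_pga(pga))
    print(f"annual frequency of seismic LOC ~ {lambda_loc:.2e} / yr")
    ```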

  2. Probabilistic fiber tracking of the language and motor white matter pathways of the supplementary motor area (SMA) in patients with brain tumors.

    PubMed

    Jenabi, Mehrnaz; Peck, Kyung K; Young, Robert J; Brennan, Nicole; Holodny, Andrei I

    2014-12-01

    Accurate localization of anatomically and functionally separate SMA tracts is important to improve planning prior to neurosurgery. Using fMRI and probabilistic DTI techniques, we assessed the connectivity between the frontal language area (Broca's area) and the rostral pre-SMA (language SMA) and caudal SMA proper (motor SMA). Twenty brain tumor patients completed motor and language fMRI paradigms and DTI. Peaks of functional activity in the language SMA, motor SMA and Broca's area were used to define seed regions for probabilistic tractography. fMRI and probabilistic tractography identified separate and unique pathways connecting the SMA to Broca's area - the language SMA pathway and the motor SMA pathway. For all subjects, the language SMA pathway had a larger number of voxels (P<0.0001) and higher connectivity (P<0.0001) to Broca's area than did the motor SMA pathway. In each patient, the number of voxels was greater in the language and motor SMA pathways than in background pathways (P<0.0001). No differences were found between patients with ipsilateral and those with contralateral tumors for either the language SMA pathway (degree of connectivity: P<0.36; number of voxels: P<0.35) or the motor SMA pathway (degree of connectivity, P<0.28; number of voxels, P<0.74). Probabilistic tractography can identify unique white matter tracts that connect language SMA and motor SMA to Broca's area. The language SMA is more significantly connected to Broca's area than is the motor subdivision of the SMA proper. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  3. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  4. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Wahl, Thomas; Plant, Nathaniel G.; Long, Joseph W.

    2016-05-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with few adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
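
    The elliptical-copula step can be illustrated with two of the six drivers: sample correlated uniforms through a Gaussian copula, then map them onto fitted marginals. Marginals, correlation, and the total-water-level proxy below are all invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm, genextreme

    rng = np.random.default_rng(5)
    n = 10_000                                   # simulated sea storms

    # Gaussian (elliptical) copula: correlated normals -> correlated uniforms.
    rho = 0.7                                    # assumed dependence
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = norm.cdf(z)

    # Transform to hypothetical GEV marginals (shape, location, scale).
    surge = genextreme.ppf(u[:, 0], c=-0.1, loc=0.5, scale=0.2)   # metres
    hs = genextreme.ppf(u[:, 1], c=-0.1, loc=2.0, scale=0.8)      # metres

    total = surge + 0.2 * hs     # crude proxy for total water level
    print(f"upper tail (99.9th pct): {np.percentile(total, 99.9):.2f} m")
    ```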

  5. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    USGS Publications Warehouse

    Plant, Nathaniel G.; Wahl, Thomas; Long, Joseph W.

    2016-01-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with few adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.

  6. Noradrenergic modulation of risk/reward decision making.

    PubMed

    Montes, David R; Stopper, Colin M; Floresco, Stan B

    2015-08-01

    Catecholamine transmission modulates numerous cognitive and reward-related processes that can subserve more complex functions such as cost/benefit decision making. Dopamine has been shown to play an integral role in decisions involving reward uncertainty, yet there is a paucity of research investigating the contributions of noradrenaline (NA) transmission to these functions. The present study was designed to elucidate the contribution of NA to risk/reward decision making in rats, assessed with a probabilistic discounting task. We examined the effects of reducing noradrenergic transmission with the α2 agonist clonidine (10-100 μg/kg), and increasing activity at α2A receptor sites with the agonist guanfacine (0.1-1 mg/kg), the α2 antagonist yohimbine (1-3 mg/kg), and the noradrenaline transporter (NET) inhibitor atomoxetine (0.3-3 mg/kg) on probabilistic discounting. Rats chose between a small/certain reward and a larger/risky reward, wherein the probability of obtaining the larger reward either decreased (100-12.5 %) or increased (12.5-100 %) over a session. In well-trained rats, clonidine reduced risky choice by decreasing reward sensitivity, whereas guanfacine did not affect choice behavior. Yohimbine impaired adjustments in decision biases as reward probability changed within a session by altering negative feedback sensitivity. In a subset of rats that displayed prominent discounting of probabilistic rewards, the lowest dose of atomoxetine increased preference for the large/risky reward when this option had greater long-term utility. These data highlight an important and previously uncharacterized role for noradrenergic transmission in mediating different aspects of risk/reward decision making and mediating reward and negative feedback sensitivity.

  7. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    NASA Astrophysics Data System (ADS)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which produce probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm they better meet the needs of users.

  8. Applying adverse outcome pathways and species sensitivity-weighted distribution to predicted-no-effect concentration derivation and quantitative ecological risk assessment for bisphenol A and 4-nonylphenol in aquatic environments: A case study on Tianjin City, China.

    PubMed

    Wang, Ying; Na, Guangshui; Zong, Humin; Ma, Xindong; Yang, Xianhai; Mu, Jingli; Wang, Lijun; Lin, Zhongsheng; Zhang, Zhifeng; Wang, Juying; Zhao, Jinsong

    2018-02-01

    Adverse outcome pathways (AOPs) are a novel concept that effectively considers the toxic modes of action and guides the ecological risk assessment of chemicals. To better use toxicity data including biochemical or molecular responses and mechanistic data, we further developed a species sensitivity-weighted distribution (SSWD) method for bisphenol A and 4-nonylphenol. Their aquatic predicted-no-effect concentrations (PNECs) were derived using the log-normal statistical extrapolation method. We calculated aquatic PNECs of bisphenol A and 4-nonylphenol with values of 4.01 and 0.721 µg/L, respectively. The ecological risk of each chemical in different aquatic environments near Tianjin, China, a coastal municipality along the Bohai Sea, was characterized by hazard quotient and probabilistic risk quotient assessment techniques. Hazard quotients of 7.02 and 5.99 at 2 municipal sewage sites using all of the endpoints were observed for 4-nonylphenol, indicating high ecological risks posed by 4-nonylphenol to aquatic organisms, especially endocrine-disrupting effects. Moreover, a high ecological risk of 4-nonylphenol was indicated based on the probabilistic risk quotient method. The present results show that combining the SSWD method and the AOP concept could better protect aquatic organisms from adverse effects such as endocrine disruption and could decrease uncertainty in ecological risk assessment. Environ Toxicol Chem 2018;37:551-562. © 2017 SETAC.

  9. Space Radiation Cancer Risks and Uncertainties for Different Mission Time Periods

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2012-01-01

    Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high-energy protons and high charge and energy (HZE) nuclei. For long-duration missions, space radiation presents significant health risks including cancer mortality. Probabilistic risk assessment (PRA) is essential for radiation protection of crews on long-term space missions outside of the protection of the Earth's magnetic field and for optimization of mission planning and costs. For the assessment of organ dosimetric quantities and cancer risks, the particle spectra at each critical body organ must be characterized. In implementing a PRA approach, a statistical model of SPE fluence was developed, because the individual SPE occurrences themselves are random in nature while the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. An overall cumulative probability of a GCR environment for a specified mission period was estimated for the temporal characterization of the GCR environment, represented by the deceleration potential (theta). Finally, this probabilistic approach to space radiation cancer risk was coupled with a model of the radiobiological factors and uncertainties in projecting cancer risks. Probabilities of fatal cancer risk and 95% confidence intervals will be reported for various periods of space missions.

  10. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
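
    A frequency/severity Monte Carlo of the kind described might look like the following sketch; the Poisson rate and lognormal severity parameters are assumptions for illustration, not the Lodi estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_years = 20_000                 # simulated years of claims experience

    lam = 25                         # assumed mean number of claims per year
    n_claims = rng.poisson(lam, n_years)

    # Aggregate annual loss: sum of lognormal claim severities per year.
    annual_loss = np.array([
        rng.lognormal(mean=9.0, sigma=1.2, size=k).sum() for k in n_claims
    ])

    expected = annual_loss.mean()                 # expected loss
    p99 = np.percentile(annual_loss, 99)          # 99th percentile of loss
    print(f"expected annual loss = {expected:,.0f}")
    print(f"99th percentile = {p99:,.0f}; unexpected loss = {p99 - expected:,.0f}")
    ```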

  11. Advanced quality systems : probabilistic optimization for profit (Prob.O.Prof) software

    DOT National Transportation Integrated Search

    2009-04-01

    Contractors constantly have to make decisions regarding how to maximize profit and minimize risk on paving projects. With more and more States adopting incentive/disincentive pay adjustment provisions for quality, as measured by various acceptance qu...

  12. Reduced activation in ventral striatum and ventral tegmental area during probabilistic decision-making in schizophrenia.

    PubMed

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Esslinger, Christine; Schilling, Claudia; Schirmbeck, Frederike; Englisch, Susanne; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2014-07-01

    Patients with schizophrenia suffer from deficits in monitoring and controlling their own thoughts. Within these so-called metacognitive impairments, alterations in probabilistic reasoning might be one cognitive phenomenon predisposing to delusions. However, so far little is known about alterations in associated brain functionality. A previously established task for functional magnetic resonance imaging (fMRI), which requires a probabilistic decision after a variable number of stimuli, was applied to 23 schizophrenia patients and 28 healthy controls matched for age, gender and educational levels. We compared activation patterns during decision-making under conditions of certainty versus uncertainty and evaluated the process of final decision-making in ventral striatum (VS) and ventral tegmental area (VTA). We replicated a pre-described extended cortical activation pattern during probabilistic reasoning. During final decision-making, activations in several fronto- and parietocortical areas, as well as in VS and VTA, became apparent. In both of these regions, schizophrenia patients showed significantly reduced activation. These results further define the network underlying probabilistic decision-making. The observed hypo-activation in regions commonly associated with dopaminergic neurotransmission fits into current concepts of disrupted prediction error signaling in schizophrenia and suggests functional links to reward anticipation. Forthcoming studies with patients at risk for psychosis and drug-naive first episode patients are necessary to elucidate the development of these findings over time and the interplay with associated clinical symptoms. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. INTEGRATION OF RELIABILITY WITH MECHANISTIC THERMALHYDRAULICS: REPORT ON APPROACH AND TEST PROBLEM RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. S. Schroeder; R. W. Youngblood

    The Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability Program is developing simulation-based methods and tools for analyzing safety margin from a modern perspective [1]. There are multiple definitions of 'margin.' One class of definitions defines margin in terms of the distance between a point estimate of a given performance parameter (such as peak clad temperature) and a point-value acceptance criterion defined for that parameter (such as 2200 F). The present perspective on margin is that it relates to the probability of failure, and not just the distance between a nominal operating point and a criterion. In this work, margin is characterized through a probabilistic analysis of the 'loads' imposed on systems, structures, and components, and their 'capacity' to resist those loads without failing. Given the probabilistic load and capacity spectra, one can assess the probability that load exceeds capacity, leading to component failure. Within the project, we refer to a plot of these probabilistic spectra as 'the logo.' Refer to Figure 1 for a notional illustration. The implications of referring to 'the logo' are (1) RISMC is focused on being able to analyze loads and spectra probabilistically, and (2) calling it 'the logo' tacitly acknowledges that it is a highly simplified picture: meaningful analysis of a given component failure mode may require development of probabilistic spectra for multiple physical parameters, and in many practical cases, 'load' and 'capacity' will not vary independently.
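
    The probability that load exceeds capacity can be estimated directly once both spectra are available. A notional sketch with invented peak-clad-temperature distributions, including an analytic cross-check valid only for independent normals:

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    n = 1_000_000

    # Invented spectra: simulated peak clad temperature (load) vs the
    # temperature the cladding can resist without failing (capacity), in F.
    load = rng.normal(1800.0, 150.0, n)
    capacity = rng.normal(2200.0, 60.0, n)

    p_fail = (load > capacity).mean()
    print(f"Monte Carlo P(load > capacity) = {p_fail:.2e}")

    # Closed form: load - capacity is normal with known mean and variance.
    sd = np.hypot(150.0, 60.0)
    print(f"analytic cross-check           = {norm.sf((2200.0 - 1800.0) / sd):.2e}")
    ```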

  14. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the pre-existing flaw distribution in the ceramic part to compute a probability of failure of the part under the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: the Silex telescope structure, the Seviri primary mirror, the Herschel telescope, the Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, the technical state of the art from international experts, and Astrium's experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
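
    The Weibull approach referred to here, in its simplest uniaxial, unit-volume form, together with the proof-testing risk-reduction logic, can be written down in a few lines; the parameters are invented for illustration.

    ```python
    import numpy as np

    # Two-parameter Weibull strength model (simplest form, no volume integral):
    #   P_f(sigma) = 1 - exp(-(sigma / sigma_0) ** m)
    def p_fail(sigma, sigma_0=180.0, m=10.0):   # hypothetical MPa scale, modulus
        return 1.0 - np.exp(-(sigma / sigma_0) ** m)

    service = 80.0      # assumed service stress, MPa
    proof = 120.0       # assumed proof-test stress, MPa

    # Proof testing truncates the flaw population: survivors of the proof load
    # fail in service only with the conditional probability below. In this
    # idealized model a proof stress above the service stress screens out all
    # parts that would otherwise fail in service.
    p_service = p_fail(service)
    p_after_proof = max(p_service - p_fail(proof), 0.0) / (1.0 - p_fail(proof))
    print(f"P_f without proof test: {p_service:.2e}")
    print(f"P_f after proof test  : {p_after_proof:.2e}")
    ```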

  15. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    PubMed

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  16. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    PubMed Central

    Kaufman, Leyla V.; Wright, Mark G.

    2017-01-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments. PMID:28686180

  17. The Availability Heuristic: A Redux

    ERIC Educational Resources Information Center

    Rubel, Laurie H.

    2007-01-01

    This article reports on a subset of results from a larger study which examined middle and high school students' probabilistic reasoning. Students in grades 5, 7, 9, and 11 at a boys' school (n = 173) completed a Probability Inventory, which required students to answer and justify their responses to ten items. Supplemental clinical interviews…

  18. DYT1 dystonia increases risk taking in humans.

    PubMed

    Arkadir, David; Radulescu, Angela; Raymond, Deborah; Lubarr, Naomi; Bressman, Susan B; Mazzoni, Pietro; Niv, Yael

    2016-06-01

    It has been difficult to link synaptic modification to overt behavioral changes. Rodent models of DYT1 dystonia, a motor disorder caused by a single gene mutation, demonstrate increased long-term potentiation and decreased long-term depression in corticostriatal synapses. Computationally, such asymmetric learning predicts risk taking in probabilistic tasks. Here we demonstrate abnormal risk taking in DYT1 dystonia patients, which is correlated with disease severity, thereby supporting striatal plasticity in shaping choice behavior in humans.

  19. Interim reliability evaluation program, Browns Ferry 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1981-01-01

    Probabilistic risk analysis techniques, i.e., event tree and fault tree analysis, were utilized to provide a risk assessment of the Browns Ferry Nuclear Plant Unit 1. Browns Ferry 1 is a General Electric boiling water reactor of the BWR 4 product line with a Mark 1 (drywell and torus) containment. Within the guidelines of the IREP Procedure and Schedule Guide, dominant accident sequences that contribute to public health and safety risks were identified and grouped according to release categories.

  20. Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    2011-01-01

    An effective risk assessment system is needed to address the threat posed by an active or passive insider who, acting alone or in collusion, could attempt diversion or theft of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) is a self-assessment or inspection tool utilizing probabilistic risk assessment (PRA) methodology to calculate the system effectiveness of a nuclear facility's material protection, control, and accountability (MPC&A) system. The MSET process is divided into four distinct and separate parts: (1) Completion of the questionnaire that assembles information about the operations of every aspect of the MPC&A system; (2) Conversion of questionnaire data into numeric values associated with risk; (3) Analysis of the numeric data utilizing the MPC&A fault tree and the SAPHIRE computer software; and (4) Self-assessment using the MSET reports to perform the effectiveness evaluation of the facility's MPC&A system. The process should lead to confirmation that mitigating features of the system effectively minimize the threat, or it could lead to the conclusion that system improvements or upgrades are necessary to achieve acceptable protection against the threat. If the need for system improvements or upgrades is indicated when the system is analyzed, MSET provides the capability to evaluate potential or actual system improvements or upgrades. A facility's MC&A system can be evaluated at a point in time. The system can be reevaluated after upgrades are implemented or after other system changes occur. The total system or specific subareas within the system can be evaluated. Areas of potential system improvement can be assessed to determine where the most beneficial and cost-effective improvements should be made. Analyses of risk importance factors show that sustainability is essential for optimal performance and reveal where performance degradation has the greatest impact on total system risk. The risk importance factors show the amount of risk reduction achievable with potential upgrades and the amount of risk reduction achieved after upgrades are completed. Applying the risk assessment tool gives support to budget prioritization by showing where budget support levels must be sustained for MC&A functions most important to risk. Results of the risk assessment are also useful in supporting funding justifications for system improvements that significantly reduce system risk. The functional model, the system risk assessment tool, and the facility evaluation questionnaire are valuable educational tools for MPC&A personnel. These educational tools provide a framework for ongoing dialogue between organizations regarding the design, development, implementation, operation, assessment, and sustainability of MPC&A systems. An organization considering the use of MSET as an analytical tool for evaluating the effectiveness of its MPC&A system will benefit from conducting a complete MSET exercise at an existing nuclear facility.
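
    Risk importance factors of the kind MSET reports can be illustrated on a toy two-out-of-three protection function; the element names and probabilities below are placeholders, and the real tool evaluates a full MPC&A fault tree in SAPHIRE.

    ```python
    # Top-event probability for a system that fails when at least two of
    # three independent protection elements fail.
    def top(p1, p2, p3):
        return p1 * p2 + p1 * p3 + p2 * p3 - 2 * p1 * p2 * p3

    base = [0.05, 0.05, 0.10]     # hypothetical element failure probabilities
    p_top = top(*base)

    for i, name in enumerate(["guard_a", "guard_b", "vault_sensor"]):
        hi = base.copy(); hi[i] = 1.0
        lo = base.copy(); lo[i] = 0.0
        birnbaum = top(*hi) - top(*lo)           # sensitivity to element i
        fussell_vesely = 1.0 - top(*lo) / p_top  # risk fraction involving i
        print(f"{name}: Birnbaum={birnbaum:.3f}, FV={fussell_vesely:.3f}")
    ```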

  1. Comparative Effectiveness of Personalized Lifestyle Management Strategies for Cardiovascular Disease Risk Reduction.

    PubMed

    Chu, Paula; Pandya, Ankur; Salomon, Joshua A; Goldie, Sue J; Hunink, M G Myriam

    2016-03-29

    Evidence shows that healthy diet, exercise, smoking interventions, and stress reduction reduce cardiovascular disease risk. We aimed to compare the effectiveness of these lifestyle interventions for individual risk profiles and determine their rank order in reducing 10-year cardiovascular disease risk. We computed risks using the American College of Cardiology/American Heart Association Pooled Cohort Equations for a variety of individual profiles. Using published literature on risk factor reductions through diverse lifestyle interventions-group therapy for stopping smoking, Mediterranean diet, aerobic exercise (walking), and yoga-we calculated the risk reduction through each of these interventions to determine the strategy associated with the maximum benefit for each profile. Sensitivity analyses were conducted to test the robustness of the results. In the base-case analysis, yoga was associated with the largest 10-year cardiovascular disease risk reductions (maximum absolute reduction 16.7% for the highest-risk individuals). Walking generally ranked second (max 11.4%), followed by Mediterranean diet (max 9.2%), and group therapy for smoking (max 1.6%). If the individual was a current smoker and successfully quit smoking (i.e., achieved complete smoking cessation), then stopping smoking yielded the largest reduction. Probabilistic and 1-way sensitivity analyses confirmed the demonstrated trend. This study reports the comparative effectiveness of several forms of lifestyle modifications and found smoking cessation and yoga to be the most effective forms of cardiovascular disease prevention. Future research should focus on patient adherence to personalized therapies, cost-effectiveness of these strategies, and the potential for enhanced benefit when interventions are performed simultaneously rather than as single measures. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  2. Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowlen, Steven Patrick; Hyslop, J. S.

    2010-04-01

    Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas discussed in the paper that could benefit from more complete event reporting include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.

  3. A probabilistic seismic model for the European Arctic

    NASA Astrophysics Data System (ADS)

    Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes

    2011-01-01

    The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.
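
    A minimal Metropolis sampler illustrates the prior-to-posterior workflow described above, here for a one-parameter toy problem rather than the paper's crust and upper mantle parameterization; the prior bounds, proposal width, and data are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        data = 3.0 + 0.1 * rng.standard_normal(20)   # synthetic observations
        sigma = 0.1                                  # assumed data uncertainty

        def log_likelihood(v):
            return -0.5 * np.sum((data - v) ** 2) / sigma**2

        v, samples = 2.0, []
        for _ in range(5000):
            v_new = v + 0.05 * rng.standard_normal()    # random-walk proposal
            if 1.0 < v_new < 6.0:                       # uniform prior bounds
                log_alpha = log_likelihood(v_new) - log_likelihood(v)
                if np.log(rng.uniform()) < log_alpha:
                    v = v_new                           # accept the candidate
            samples.append(v)                           # otherwise keep current model

        posterior = np.array(samples[1000:])            # discard burn-in
        print(f"posterior mean {posterior.mean():.3f} +/- {posterior.std():.3f}")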

  4. Information and problem report usage in system safety engineering division

    NASA Technical Reports Server (NTRS)

    Morrissey, Stephen J.

    1990-01-01

    Five basic problems or question areas are examined: (1) evaluate the adequacy of the current problem/performance database; (2) evaluate methods of performing trend analysis; (3) identify methods and sources of data for probabilistic risk assessment; and (4) determine how risk assessment documentation is upgraded and/or updated. The fifth task was to provide recommendations for each of the above four areas.

  5. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  6. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer of scientific knowledge into land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool can visualize the hazard curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them into ensemble models. The pyPHaz software stores and accesses all data through a MySQL database and reads as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format is easy to extend to any other kind of hazard, as shown in the example applications of pyPHaz, which focus on Probabilistic Volcanic Hazard Assessment (PVHA) for tephra dispersal and fallout applied to the municipality of Naples.
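
    The ensemble summary that such a viewer displays can be sketched in a few lines: given hazard curves from several models, form the mean and percentile curves at each intensity level. The curves below are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        intensity = np.linspace(0.1, 2.0, 50)     # e.g., tephra load thresholds
        # Synthetic ensemble: 100 epistemically different hazard curves
        # (annual exceedance probability vs. intensity).
        curves = np.exp(-intensity[None, :] * rng.uniform(2.0, 5.0, size=(100, 1)))

        mean_curve = curves.mean(axis=0)
        p16, p84 = np.percentile(curves, [16, 84], axis=0)
        print(f"at intensity {intensity[10]:.2f}: mean {mean_curve[10]:.3f}, "
              f"16th-84th band ({p16[10]:.3f}, {p84[10]:.3f})")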

  7. 76 FR 55717 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment. The ACRS Subcommittee on Reliability and PRA will hold a meeting...

  8. Influence diagrams as oil spill decision science tools

    EPA Science Inventory

    Making inferences on risks to ecosystem services (ES) from ecological crises can be more reliably handled using decision science tools. Influence diagrams (IDs) are probabilistic networks that explicitly represent the decisions related to a problem and evidence of their influence...

  9. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    This paper proposes a novel probabilistic potential-function neural network classifier algorithm for classes that are multimodally distributed and formed from sets of disjoint pattern clusters. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and pseudocode is presented, along with a simulation analysis of the proposed neuroclassifier on a set of benchmark problems. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease, and Thyroid Gland Disease. Simulation results indicate that the proposed neuroclassifier performs consistently better on a subset of problems for which other neural classifiers perform relatively poorly.
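
    A minimal classifier in this family sums kernel (potential) functions centered on each class's training patterns and assigns the class with the largest estimated density, which handles multimodal classes naturally. The data and kernel width below are toy choices, not the authors' formulation.

        import numpy as np

        rng = np.random.default_rng(2)
        # Two classes; class 1 is bimodal (two disjoint clusters).
        class0 = rng.normal([0.0, 0.0], 0.3, size=(40, 2))
        class1 = np.vstack([rng.normal([2.0, 2.0], 0.3, size=(20, 2)),
                            rng.normal([-2.0, 2.0], 0.3, size=(20, 2))])

        def class_density(x, patterns, h=0.5):
            # Mean of Gaussian potential functions of width h over the patterns.
            d2 = np.sum((patterns - x) ** 2, axis=1)
            return np.mean(np.exp(-d2 / (2.0 * h**2)))

        x = np.array([-1.8, 2.1])    # query point near class 1's second cluster
        scores = [class_density(x, class0), class_density(x, class1)]
        print("predicted class:", int(np.argmax(scores)))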

  10. Complete mechanical characterization of an external hexagonal implant connection: in vitro study, 3D FEM, and probabilistic fatigue.

    PubMed

    Prados-Privado, María; Gehrke, Sérgio A; Rojo, Rosa; Prados-Frutos, Juan Carlos

    2018-06-11

    The aim of this study was to fully characterize the mechanical behavior of an external hexagonal implant connection (ø3.5 mm, 10-mm length) with an in vitro study, a three-dimensional finite element analysis, and a probabilistic fatigue study. Ten implant-abutment assemblies were randomly divided into two groups: five were subjected to a fracture test to obtain the maximum fracture load, and the remaining five were exposed to a fatigue test with 360,000 cycles of 150 ± 10 N. After mechanical cycling, all samples were attached to the torque-testing machine and the removal torque was measured in Newton centimeters. A finite element analysis (FEA) was then executed in ANSYS® to verify all results obtained in the mechanical tests. Finally, due to the randomness of the fatigue phenomenon, a probabilistic fatigue model was computed to obtain the probability of failure associated with each cycle load. FEA demonstrated that the fracture corresponded with the maximum stress of 2454 MPa obtained in the in vitro fracture test, and the mean life estimates obtained by the FEA, the in vitro test, and the probabilistic approach were in accordance. Under these conditions, no mechanical etiology failure is expected to occur up to 100,000 cycles.

  11. Probabilistic Neighborhood-Based Data Collection Algorithms for 3D Underwater Acoustic Sensor Networks.

    PubMed

    Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo

    2017-02-08

    Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy to acquire marine environment information conveniently, efficiently, and accurately. However, the propagation characteristics of the acoustic communication channel mean that the probability of successful information delivery decreases with increasing distance. Therefore, we investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs which are based on a probabilistic acoustic communication model instead of the traditional deterministic acoustic communication model. An autonomous underwater vehicle (AUV) is employed to traverse along the designed path to collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, partitioning the network into grids allows the AUV to visit the central location of each grid for data collection. For 3D UASNs in which the deployment knowledge is known in advance, the AUV only needs to visit several selected locations, chosen by constructing a minimum probabilistic neighborhood covering set, to reduce data latency. Furthermore, by increasing the number of transmission rounds, our proposed algorithms can provide a tradeoff between data collection latency and information gain. These algorithms are compared with a basic nearest-neighbor heuristic algorithm via simulations. Simulation analyses show that our proposed algorithms can efficiently reduce the average data collection completion time and thus the data latency.

  12. Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging

    NASA Astrophysics Data System (ADS)

    Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.

    2017-10-01

    Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.

  13. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
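
    The confidence-level step can be sketched as a per-energy percentile over an ensemble of fitted spectra; the power-law spectra below are synthetic stand-ins for the surveyed SPE measurements.

        import numpy as np

        rng = np.random.default_rng(3)
        energy = np.logspace(1, 3, 30)                # MeV/nucleon grid
        # 200 synthetic events: fluence ~ A * E^-gamma with random A and gamma.
        A = rng.lognormal(mean=10.0, sigma=1.0, size=(200, 1))
        gamma = rng.uniform(1.5, 3.0, size=(200, 1))
        spectra = A * energy[None, :] ** (-gamma)

        confidence = 0.90                             # user-specified level
        worst_case = np.percentile(spectra, 100 * confidence, axis=0)
        print(worst_case[:3])                         # design spectrum, first bins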

  14. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    PubMed

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    Canadian Arctic and Subarctic regions are experiencing a rapid decrease in sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from a present occasional-transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic in marine spatial planning for these emerging noise issues. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  15. Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

    USGS Publications Warehouse

    Sojda, Richard S.; Towler, Erin; Roberts, Mike; Rajagopalan, Balaji

    2013-01-01

    Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five-step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very “sharp”). Synthetic forecasts show that a modest “sharpening” can strongly impact risk and improve skill. We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.
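
    Steps 2-4 of the framework can be sketched as follows: fit a statistical model of a low-flow statistic on a climate predictor, then push a forecast ensemble through the fit to estimate the risk of breaching an environmental flow. A plain linear fit stands in for the paper's GLMs, and all numbers are synthetic.

        import numpy as np

        rng = np.random.default_rng(4)
        precip = rng.gamma(4.0, 25.0, size=30)                   # seasonal precip, mm
        low_flow = 1.0 + 0.05 * precip + rng.normal(0, 0.5, 30)  # observed low flow, cms

        # Step 2: fit the predictive model (linear stand-in for a GLM).
        slope, intercept = np.polyfit(precip, low_flow, 1)

        # Step 3: probabilistic seasonal precipitation forecast ensemble.
        forecast_precip = rng.gamma(3.0, 25.0, size=1000)

        # Step 4: streamflow ensemble and risk of breaching the required flow.
        flow_ensemble = slope * forecast_precip + intercept
        threshold = 3.0                                          # environmental flow, cms
        print(f"P(low flow < {threshold}) = {np.mean(flow_ensemble < threshold):.2f}")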

  16. Probabilistic framework for assessing the arsenic exposure risk from cooked fish consumption.

    PubMed

    Ling, Min-Pei; Wu, Chiu-Hua; Chen, Szu-Chieh; Chen, Wei-Yu; Chio, Chia-Pin; Cheng, Yi-Hsien; Liao, Chung-Min

    2014-12-01

    Geogenic arsenic (As) contamination of groundwater is a major ecological and human health problem in southwestern and northeastern coastal areas of Taiwan. Here, we present a probabilistic framework for assessing the human health risks from consuming raw and cooked fish that were cultured in groundwater As-contaminated ponds in Taiwan by linking a physiologically based pharmacokinetics model and a Weibull dose-response model. Results indicate that As levels in baked, fried, and grilled fish were higher than those of raw fish. Frying resulted in the greatest increase in As concentration, followed by grilling, with baking affecting the As concentration the least. Simulation results show that, following consumption of baked As-contaminated fish, the health risk to humans is below the 10⁻⁶ excess bladder cancer risk level for lifetime exposure; as the incidence ratios of liver and lung cancers are generally acceptable at risks ranging from 10⁻⁶ to 10⁻⁴, the consumption of baked As-contaminated fish is unlikely to pose a significant risk to human health. However, contaminated fish cooked by frying resulted in significant health risks, showing the highest cumulative incidence ratios of liver cancer. We also show that males have a higher cumulative incidence ratio of liver cancer than females. We found that although cooking resulted in an increase in As levels in As-contaminated fish, the risk to human health of consuming baked fish is nevertheless acceptable. We suggest the adoption of baking as a cooking method and warn against frying As-contaminated fish. We conclude that the concentration of contaminants after cooking should be taken into consideration when assessing the risk to human health.
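
    A Weibull dose-response model of the kind linked to the pharmacokinetic output takes the form P(response) = 1 - exp(-(dose/k)^m); the parameter values below are illustrative, not the study's fitted liver, lung, or bladder models.

        import numpy as np

        def weibull_risk(dose, k, m):
            # Probability of a carcinogenic response at a given lifetime dose.
            return 1.0 - np.exp(-((dose / k) ** m))

        for dose in (0.1, 0.5, 1.0, 2.0):   # ug/kg/day, hypothetical doses
            print(f"dose {dose:4.1f}: risk {weibull_risk(dose, k=10.0, m=1.5):.2e}")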

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  18. Probability concepts in quality risk management.

    PubMed

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
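
    A toy example of combining the two meanings of probability the review contrasts: a "degree of belief" prior over a defect rate updated with frequency data from batch records via Bayes' rule. All numbers are invented.

        # Beta prior (degree of belief) updated with batch-record counts (frequency).
        alpha_prior, beta_prior = 2.0, 50.0   # prior belief: defect rate near 4%
        defects, batches = 3, 40              # observed frequency data
        alpha_post = alpha_prior + defects
        beta_post = beta_prior + (batches - defects)
        posterior_mean = alpha_post / (alpha_post + beta_post)
        print(f"posterior mean defect rate: {posterior_mean:.3f}")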

  19. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    NASA Astrophysics Data System (ADS)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.

  20. Portfolios of quantum algorithms.

    PubMed

    Maurer, S M; Hogg, T; Huberman, B A

    2001-12-17

    Quantum computation holds promise for the solution of many intractable problems. However, since many quantum algorithms are stochastic in nature, they can find the solution of hard problems only probabilistically. Thus the efficiency of the algorithms has to be characterized by both the expected time to completion and the associated variance. In order to minimize both the running time and its uncertainty, we show that portfolios of quantum algorithms analogous to those of finance can outperform single algorithms when applied to NP-complete problems such as 3-satisfiability.
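
    The portfolio effect is easy to simulate: run several stochastic algorithms side by side and stop at the first success, so the portfolio's completion time is the minimum of the members' times. The per-step success probabilities below are invented, and this toy ignores the division of computational effort among members that the paper optimizes.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000
        # Two hypothetical stochastic algorithms with geometric completion times.
        t_a = rng.geometric(p=0.05, size=n)
        t_b = rng.geometric(p=0.02, size=n)

        portfolio = np.minimum(t_a, t_b)    # first member to succeed wins
        for name, t in (("A alone", t_a), ("B alone", t_b), ("portfolio", portfolio)):
            print(f"{name:>9s}: mean {t.mean():6.1f}, std {t.std():6.1f}")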

  1. Numeracy of multiple sclerosis patients: A comparison of patients from the PERCEPT study to a German probabilistic sample.

    PubMed

    Gaissmaier, Wolfgang; Giese, Helge; Galesic, Mirta; Garcia-Retamero, Rocio; Kasper, Juergen; Kleiter, Ingo; Meuth, Sven G; Köpke, Sascha; Heesen, Christoph

    2018-01-01

    A shared decision-making approach is suggested for multiple sclerosis (MS) patients. To properly evaluate benefits and risks of different treatment options accordingly, MS patients require sufficient numeracy - the ability to understand quantitative information. It is unknown whether MS affects numeracy. Therefore, we investigated whether patients' numeracy was impaired compared to a probabilistic national sample. As part of the larger prospective, observational, multicenter study PERCEPT, we assessed numeracy for a clinical study sample of German MS patients (N=725) with a standard test and compared them to a German probabilistic sample (N=1001), controlling for age, sex, and education. Within patients, we assessed whether disease variables (disease duration, disability, annual relapse rate, cognitive impairment) predicted numeracy beyond these demographics. MS patients showed a comparable level of numeracy as the probabilistic national sample (68.9% vs. 68.5% correct answers, P=0.831). In both samples, numeracy was higher for men and the highly educated. Disease variables did not predict numeracy beyond demographics within patients, and predictability was generally low. This sample of MS patients understood quantitative information on the same level as the general population. There is no reason to withhold quantitative information from MS patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Management of Risk and Uncertainty in Systems Acquisition: Proceedings of the 1983 Defense Risk and Uncertainty Workshop Held at Fort Belvoir, Virginia on 13-15 July 1983

    DTIC Science & Technology

    1983-07-15

    categories, however, represent the reality in major acquisition and are often overlooked. Although Figure 1 does not reflect the dynamics and interactions...networking and improved computer capabilities, probabilistic network simulation became a reality. The Naval Sea Systems Command became involved in...reasons for using the WBS are plain: 1. Virtually all risk-prone activities are performed by the contractor, not Government. Government is responsible

  3. A review of post-nuclear-catastrophe management

    NASA Astrophysics Data System (ADS)

    Nifenecker, Hervé

    2015-07-01

    The purpose of this paper is to make radioactive risk more generally understandable. To that end, we compare it to the risk of smoking tobacco. Further, we show that the concept of loss of life expectancy permits a quantitative comparison between various harmful exposures. The demystification of radioactive risk should lead to basic changes in post-catastrophe management, allowing victims to choose whether or not to leave contaminated areas. A less emotional appreciation of radioactive risks should lead to the adaptation of legal practices when dealing with probabilistic situations.

  4. Risk assessment considerations with regard to the potential impacts of pesticides on endangered species.

    PubMed

    Brain, Richard A; Teed, R Scott; Bang, JiSu; Thorbek, Pernille; Perine, Jeff; Peranginangin, Natalia; Kim, Myoungwoo; Valenti, Ted; Chen, Wenlin; Breton, Roger L; Rodney, Sara I; Moore, Dwayne R J

    2015-01-01

    Simple, deterministic screening-level assessments that are highly conservative by design facilitate a rapid initial screening to determine whether a pesticide active ingredient has the potential to adversely affect threatened or endangered species. If a worst-case estimate of pesticide exposure is below a very conservative effects metric (e.g., the no observed effects concentration of the most sensitive tested surrogate species), then the potential risks are considered de minimis and unlikely to jeopardize the existence of a threatened or endangered species. Thus, by design, such compounded layers of conservatism are intended to minimize potential Type II errors (failure to reject a false null hypothesis of de minimis risk), but correspondingly increase Type I errors (falsely rejecting a true null hypothesis of de minimis risk). Because of the conservatism inherent in screening-level risk assessments, higher-tier scientific information and analyses that provide additional environmental realism can be applied in cases where a potential risk has been identified. This information includes community-level effects data, environmental fate and exposure data, monitoring data, geospatial location and proximity data, species biology data, and probabilistic exposure and population models. Given that the definition of "risk" includes likelihood and magnitude of effect, higher-tier risk assessments should use probabilistic techniques that more accurately and realistically characterize risk. Moreover, where possible and appropriate, risk assessments should focus on effects at the population and community levels of organization rather than the more traditional focus on the organism level. This document provides a review of some types of higher-tier data and assessment refinements available to more accurately and realistically evaluate potential risks of pesticide use to threatened and endangered species. © 2014 SETAC.

  5. Assessing the inhalation cancer risk of particulate matter bound polycyclic aromatic hydrocarbons (PAHs) for the elderly in a retirement community of a mega city in North China.

    PubMed

    Han, Bin; Liu, Yating; You, Yan; Xu, Jia; Zhou, Jian; Zhang, Jiefeng; Niu, Can; Zhang, Nan; He, Fei; Ding, Xiao; Bai, Zhipeng

    2016-10-01

    Assessment of the health risks resulting from exposure to ambient polycyclic aromatic hydrocarbons (PAHs) is limited by the lack of environmental exposure data among different subpopulations. To assess the cancer risk from exposure to particulate carcinogenic PAH pollution for the elderly, this study conducted a personal exposure measurement campaign for particulate PAHs in a community of Tianjin, a city in northern China. Personal exposure samples were collected from the elderly in non-heating (August-September 2009) and heating (November-December 2009) periods, and 12 individual PAHs were analyzed for risk estimation. A questionnaire and time-activity log were also recorded for each person. The probabilistic risk assessment model was integrated with Toxic Equivalent Factors (TEFs). Because the estimated applied dose for a given air pollutant depends on the inhalation rate, the inhalation rate from the EPA Exposure Factors Handbook was applied to calculate the carcinogenic risk in this study. Monte Carlo simulation was used as a probabilistic risk assessment model, and risk simulation results indicated that the inhalation ILCR values for male and female subjects followed lognormal distributions with means of 4.81 × 10⁻⁶ and 4.57 × 10⁻⁶, respectively. Furthermore, the 95% probability lung cancer risks were greater than the USEPA acceptable level of 10⁻⁶ for both men and women through the inhalation route, revealing that exposure to PAHs posed an unacceptable potential cancer risk for the elderly in this study. As a result, measures should be taken to reduce PAH pollution and exposure levels to decrease the cancer risk for the general population, and especially for the elderly.
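
    The Monte Carlo step can be sketched as sampling an incremental lifetime cancer risk, ILCR = C × IR × ED × SF / (BW × AT), over distributions of the inputs; all parameter values below are illustrative rather than the study's measured distributions.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 100_000
        conc = rng.lognormal(np.log(5e-6), 0.8, n)   # BaP-equivalent conc., mg/m3
        ir = rng.normal(13.0, 2.0, n)                # inhalation rate, m3/day
        bw = rng.normal(62.0, 10.0, n)               # body weight, kg
        sf = 3.9                                     # slope factor, (mg/kg/day)^-1
        ed, at = 70 * 365, 70 * 365                  # exposure duration = averaging time

        ilcr = conc * ir * ed * sf / (bw * at)
        print(f"mean ILCR {ilcr.mean():.2e}, P(ILCR > 1e-6) = {np.mean(ilcr > 1e-6):.2f}")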

  6. Two approaches to incorporate clinical data uncertainty into multiple criteria decision analysis for benefit-risk assessment of medicinal products.

    PubMed

    Wen, Shihua; Zhang, Lanju; Yang, Bo

    2014-07-01

    The Problem formulation, Objectives, Alternatives, Consequences, Trade-offs, Uncertainties, Risk attitude, and Linked decisions (PrOACT-URL) framework and multiple criteria decision analysis (MCDA) have been recommended by the European Medicines Agency for structured benefit-risk assessment of medicinal products undergoing regulatory review. The objective of this article was to provide solutions to incorporate the uncertainty from clinical data into the MCDA model when evaluating the overall benefit-risk profiles among different treatment options. Two statistical approaches, the δ-method approach and the Monte-Carlo approach, were proposed to construct the confidence interval of the overall benefit-risk score from the MCDA model as well as other probabilistic measures for comparing the benefit-risk profiles between treatment options. Both approaches can incorporate the correlation structure between clinical parameters (criteria) in the MCDA model and are straightforward to implement. The two proposed approaches were applied to a case study to evaluate the benefit-risk profile of an add-on therapy for rheumatoid arthritis (drug X) relative to placebo. This demonstrated a straightforward way to quantify the impact of the uncertainty from clinical data on the benefit-risk assessment and enabled statistical inference when evaluating the overall benefit-risk profiles among different treatment options. The δ-method approach provides a closed form to quantify the variability of the overall benefit-risk score in the MCDA model, whereas the Monte-Carlo approach is more computationally intensive but can yield the true sampling distribution for statistical inference. The confidence intervals and other probabilistic measures obtained from the two approaches enhance benefit-risk decision making for medicinal products. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
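
    For a linear MCDA score S = Σᵢ wᵢxᵢ, both approaches are compact: the delta method propagates the covariance of the clinical estimates analytically (for a linear score the gradient is just the weight vector), while Monte Carlo samples the estimates and recomputes the score. The weights, estimates, and covariance below are invented.

        import numpy as np

        w = np.array([0.5, 0.3, 0.2])        # criterion weights (sum to 1)
        x = np.array([0.70, 0.40, 0.90])     # normalized clinical estimates
        cov = np.diag([0.02, 0.03, 0.01])    # their estimated covariance

        # Delta method: for a linear score the gradient is the weight vector.
        score = w @ x
        se = np.sqrt(w @ cov @ w)
        print(f"delta method: {score:.3f} +/- {1.96 * se:.3f}")

        # Monte Carlo: sample the clinical estimates and recompute the score.
        rng = np.random.default_rng(7)
        s = rng.multivariate_normal(x, cov, size=50_000) @ w
        lo, hi = np.percentile(s, [2.5, 97.5])
        print(f"monte carlo : mean {s.mean():.3f}, 95% CI ({lo:.3f}, {hi:.3f})")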

  7. Medical Updates Number 5 to the International Space Station Probability Risk Assessment (PRA) Model Using the Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Doug; Bauman, David; Johnson-Throop, Kathy

    2011-01-01

    The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts on the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meet the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.

  8. Exposure Assessment Tools by Tiers and Types - Deterministic and Probabilistic Assessments

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  9. Bayesian networks improve causal environmental assessments for evidence-based policy

    EPA Science Inventory

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the p...

  10. PROBABILISTIC RISK ASSESSMENT FOR THE EFFECTS OF SOLAR ULTRAVIOLET RADIATION ON AMPHIBIANS

    EPA Science Inventory

    Several studies have demonstrated that exposure to solar ultraviolet (UV) radiation can cause elevated mortality and an increased prevalence of eye and limb malformations in developing amphibian larvae. From these observations scientists have hypothesized that recent increases in...

  11. [Health insurance for infants and infant vaccination related to forced-displacement in Colombia].

    PubMed

    Ruiz-Rodríguez, Myriam; Vera-Cala, Lina M; López-Barbosa, Nahyr

    2008-01-01

    Determining vaccination coverage amongst children (<5 years old) having multiple socio-economic risk factors and their relationship to insurance status. This was a cross-sectional study of 514 families from urban settlements receiving people displaced by the armed conflict in 4 municipalities in the Santander department (Colombia). The households were selected by probabilistic sampling, using proportional modelling by municipality. Immunisation data was collected from vaccination cards; interviews provided socio-demographic data. The dependent variable consisted of having the complete vaccination scheme by age according to the official Ministry of Social Protection's programme. The probability of being vaccinated was modelled by logistic regression, adjusted for sociodemographic variables. 369 children were studied, of whom 48.8% belonged to families displaced by the armed conflict. 46.1% of the people interviewed presented their vaccination cards. Contrary to what had been expected, only 21.2% of those having a vaccination card were insured and 22.9% of them had a complete vaccination scheme for their age. The probability of having a complete vaccination scheme for those individuals who were covered by the subsidised health system was 2.4 times higher when compared to those who were not insured (p=0.042). Low vaccination coverage indicated barriers to obtaining health services for people living in conditions of poverty and displacement, and low insurance coverage suggested faults in health insurance policies addressing similar populations.

  12. Integrated Risk-Informed Decision-Making for an ALMR PRISM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muhlheim, Michael David; Belles, Randy; Denning, Richard S.

    Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment of the relationship between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory, which allow decisions to incorporate the health of components. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate the current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of deterministic calculations using multi-physics analyses with probabilistic safety calculations allows for the examination and quantification of margin recovery strategies, and the thermal-hydraulics analyses validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.

  13. A Bayesian Framework for Analysis of Pseudo-Spatial Models of Comparable Engineered Systems with Application to Spacecraft Anomaly Prediction Based on Precedent Data

    NASA Astrophysics Data System (ADS)

    Ndu, Obibobi Kamtochukwu

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.

  14. Risk of DDT residue in maize consumed by infants as complementary diet in southwest Ethiopia.

    PubMed

    Mekonen, Seblework; Lachat, Carl; Ambelu, Argaw; Steurbaut, Walter; Kolsteren, Patrick; Jacxsens, Liesbeth; Wondafrash, Mekitie; Houbraken, Michael; Spanoghe, Pieter

    2015-04-01

    Infants in Ethiopia consume food items such as maize as a complementary diet. However, this may expose infants to toxic contaminants like DDT. Maize samples were collected from the households visited during a consumption survey and from markets in Jimma zone, southwestern Ethiopia. The residues of total DDT and its metabolites were analyzed using the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method combined with dispersive solid phase extraction cleanup (d-SPE). Deterministic and probabilistic methods of analysis were applied to determine the consumer exposure of infants to total DDT, and the results of the exposure assessment were compared with the health-based guidance value, in this case the provisional tolerable daily intake (PTDI). All maize samples (n=127) were contaminated by DDT, with a mean concentration of 1.770 mg/kg, which was far above the maximum residue limit (MRL). The mean and 97.5th percentile (P97.5) estimated daily intakes of total DDT for consumers were, respectively, 0.011 and 0.309 mg/kg bw/day for the deterministic and 0.011 and 0.083 mg/kg bw/day for the probabilistic exposure assessment. For the total infant population (consumers and non-consumers), the 97.5th percentile estimated daily intakes were 0.265 and 0.032 mg/kg bw/day from the deterministic and probabilistic exposure assessments, respectively. Health risk estimation revealed that the mean and 97.5th percentile intakes for consumers, and the 97.5th percentile estimated daily intake of total DDT for the total population, were above the PTDI. Therefore, in Ethiopia, the use of maize as complementary food for infants may pose a health risk due to DDT residue. Copyright © 2014 Elsevier B.V. All rights reserved.
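
    The exposure arithmetic is EDI = concentration × intake / body weight, computed deterministically at point values or probabilistically by sampling each input; apart from the mean maize concentration reported above, the distribution parameters below are illustrative.

        import numpy as np

        rng = np.random.default_rng(8)
        ptdi = 0.01                          # mg/kg bw/day, assumed PTDI for DDT

        # Deterministic: point estimates.
        conc, intake, bw = 1.77, 0.05, 8.0   # mg/kg; kg maize/day; kg body weight
        print(f"deterministic EDI: {conc * intake / bw:.4f} mg/kg bw/day")

        # Probabilistic: sample each input distribution.
        n = 100_000
        conc_s = rng.lognormal(np.log(1.0), 0.9, n)
        intake_s = rng.normal(0.05, 0.015, n).clip(min=0)
        bw_s = rng.normal(8.0, 1.5, n).clip(min=3)
        edi = conc_s * intake_s / bw_s
        print(f"P97.5 EDI: {np.percentile(edi, 97.5):.4f}; "
              f"P(EDI > PTDI) = {np.mean(edi > ptdi):.2f}")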

  15. The Effect of Forest Management Strategy on Carbon Storage and Revenue in Western Washington: A Probabilistic Simulation of Tradeoffs.

    PubMed

    Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J

    2017-01-01

    The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO₂e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation. © 2016 Society for Risk Analysis.
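
    The probabilistic tradeoff can be sketched by simulating the NPV of carbon revenue under uncertain credit volume and price and comparing it with a timber-management NPV; all inputs below are invented, not the study's FVS-based estimates.

        import numpy as np

        rng = np.random.default_rng(9)
        n, discount = 50_000, 0.05
        years = np.arange(1, 41)

        credits = rng.normal(4.0, 1.0, size=(n, 1)).clip(min=0)  # Mg CO2e/ha/yr
        price = rng.lognormal(np.log(10.0), 0.4, size=(n, 1))    # $/Mg CO2e
        carbon_npv = (credits * price / (1 + discount) ** years).sum(axis=1)

        timber_npv = 1200.0                                      # $/ha comparator
        print(f"P(carbon beats timber) = {np.mean(carbon_npv > timber_npv):.2f}")

        # Deterministic break-even price at the mean credit volume.
        df_sum = (4.0 / (1 + discount) ** years).sum()
        print(f"break-even price: ${timber_npv / df_sum:.2f}/Mg CO2e")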

  16. Anemia risk in relation to lead exposure in lead-related manufacturing.

    PubMed

    Hsieh, Nan-Hung; Chung, Shun-Hui; Chen, Szu-Chieh; Chen, Wei-Yu; Cheng, Yi-Hsien; Lin, Yi-Jun; You, Su-Han; Liao, Chung-Min

    2017-05-05

    Lead-exposed workers may suffer adverse health effects under the currently regulated blood lead (BPb) levels. However, a probabilistic assessment of lead exposure-associated anemia risk is lacking. The goal of this study was to examine the association between lead exposure and anemia risk among factory workers in Taiwan. We first collated BPb and indicators of hematopoietic function data via health examination records that included 533 male and 218 female lead-exposed workers between 2012 and 2014. We used benchmark dose (BMD) modeling to estimate the critical effect doses for detection of abnormal indicators. A risk-based probabilistic model was used to characterize the potential hazard of lead poisoning for job-specific workers by hazard index (HI). We applied Bayesian decision analysis to determine whether BMD could be implicated as a suitable BPb standard. Our results indicated that the HI for all lead-exposed workers was 0.78 (95% confidence interval: 0.50-1.26) with a risk occurrence probability of 11.1%. The abnormal risk of anemia indicators for male and female workers could be reduced by 67-77% and 86-95%, respectively, by adopting the suggested BPb standards of 25 and 15 μg/dL. We conclude that cumulative exposure to lead in the workplace was significantly associated with anemia risk. This study suggests that the current BPb standard needs to be better understood so that lead-exposed populations can be protected in different scenarios and a novel health management standard provided. Low-level lead exposure is an occupational and public health problem that deserves more attention.

  17. Aquatic predicted no-effect concentrations of 16 polycyclic aromatic hydrocarbons and their ecological risks in surface seawater of Liaodong Bay, China.

    PubMed

    Wang, Ying; Wang, Juying; Mu, Jingli; Wang, Zhen; Cong, Yi; Yao, Ziwei; Lin, Zhongsheng

    2016-06-01

    Polycyclic aromatic hydrocarbons (PAHs), a class of ubiquitous pollutants in marine environments, exhibit moderate to high adverse effects on aquatic organisms and humans. However, the lack of PAH toxicity data for aquatic organisms has limited evaluation of their ecological risks. In the present study, aquatic predicted no-effect concentrations (PNECs) of 16 priority PAHs were derived based on species sensitivity distribution models, and their probabilistic ecological risks in seawater of Liaodong Bay, Bohai Sea, China, were assessed. A quantitative structure-activity relationship method was adopted to achieve the predicted chronic toxicity data for the PNEC derivation. Good agreement for aquatic PNECs of 8 PAHs based on predicted and experimental chronic toxicity data was observed (R² = 0.746), and the calculated PNECs ranged from 0.011 µg/L to 205.3 µg/L. A significant log-linear relationship also existed between the octanol-water partition coefficient and PNECs derived from experimental toxicity data (R² = 0.757). A similar order of ecological risks for the 16 PAH species in seawater of Liaodong Bay was found by the probabilistic risk quotient and joint probability curve methods. The individual high ecological risk of benzo[a]pyrene, benzo[b]fluoranthene, and benz[a]anthracene needs to be determined. The combined ecological risk of PAHs in seawater of Liaodong Bay calculated by the joint probability curve method was 13.9%, indicating a high risk as a result of co-exposure to PAHs. Environ Toxicol Chem 2016;35:1587-1593. © 2015 SETAC.
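
    An SSD-based PNEC derivation fits a distribution to chronic toxicity values, takes the 5th percentile (HC5), and divides by an assessment factor; the toxicity values and factor below are invented.

        import numpy as np
        from scipy import stats

        chronic_noec = np.array([2.1, 5.5, 0.8, 12.0, 3.3, 7.9, 1.5, 4.2])  # ug/L
        mu, sd = stats.norm.fit(np.log10(chronic_noec))     # lognormal SSD fit

        hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sd)  # 5th-percentile conc.
        assessment_factor = 5.0
        print(f"HC5 = {hc5:.3f} ug/L, PNEC = {hc5 / assessment_factor:.3f} ug/L")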

  18. A Study of Quasar Selection in the Supernova Fields of the Dark Energy Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tie, S. S.; Martini, P.; Mudd, D.

    In this paper, we present a study of quasar selection using the supernova fields of the Dark Energy Survey (DES). We used a quasar catalog from an overlapping portion of the SDSS Stripe 82 region to quantify the completeness and efficiency of selection methods involving color, probabilistic modeling, variability, and combinations of color/probabilistic modeling with variability. In all cases, we considered only objects that appear as point sources in the DES images. We examine color selection methods based on the Wide-field Infrared Survey Explorer (WISE) mid-IR W1-W2 color, a mixture of WISE and DES colors (g - i and i-W1), and a mixture of Vista Hemisphere Survey and DES colors (g - i and i - K). For probabilistic quasar selection, we used XDQSO, an algorithm that employs an empirical multi-wavelength flux model of quasars to assign quasar probabilities. Our variability selection uses the multi-band χ²-probability that sources are constant in the DES Year 1 griz-band light curves. The completeness and efficiency are calculated relative to an underlying sample of point sources that are detected in the required selection bands and pass our data quality and photometric error cuts. We conduct our analyses at two magnitude limits, i < 19.8 mag and i < 22 mag. For the subset of sources with W1 and W2 detections, the W1-W2 color or XDQSOz method combined with variability gives the highest completenesses of >85% for both i-band magnitude limits and efficiencies of >80% to the bright limit and >60% to the faint limit; however, the giW1 and giW1+variability methods give the highest quasar surface densities. The XDQSOz method and combinations of W1W2/giW1/XDQSOz with variability are among the better selection methods when both high completeness and high efficiency are desired. We also present the OzDES Quasar Catalog of 1263 spectroscopically confirmed quasars from three years of OzDES observation in the 30 deg² of the DES supernova fields. Finally, the catalog includes quasars with redshifts up to z ~ 4 and brighter than i = 22 mag, although the catalog is not complete up to this magnitude limit.
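
    The two selection metrics quantified above reduce to completeness = (true quasars selected)/(all true quasars) and efficiency = (true quasars selected)/(all selected); the sketch below drives them with a simple chi-square variability cut on synthetic light curves.

        import numpy as np

        rng = np.random.default_rng(10)
        n_src, n_epochs, err = 5000, 20, 0.05
        is_quasar = rng.random(n_src) < 0.1           # synthetic true labels

        flux = 1.0 + err * rng.standard_normal((n_src, n_epochs))
        flux[is_quasar] += 0.1 * rng.standard_normal((is_quasar.sum(), n_epochs))

        # Reduced chi-square against a constant-flux model.
        chi2 = np.sum((flux - flux.mean(axis=1, keepdims=True)) ** 2, axis=1)
        chi2 /= err**2 * (n_epochs - 1)
        selected = chi2 > 2.0                         # variability cut

        completeness = np.mean(selected[is_quasar])
        efficiency = np.mean(is_quasar[selected])
        print(f"completeness {completeness:.2f}, efficiency {efficiency:.2f}")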

  20. The Effects of Climate Model Similarity on Local, Risk-Based Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, S.; Brown, C. M.

    2014-12-01

    The climate science community has recently proposed techniques to develop probabilistic projections of climate change from ensemble climate model output. These methods provide a means to incorporate the formal concept of risk, i.e., the product of impact and probability, into long-term planning assessments for local systems under climate change. However, approaches for developing such probability density functions often assume that different climate models provide independent information for the estimation of probabilities, despite model similarities that stem from a common genealogy. Here we utilize an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to develop probabilistic climate information, with and without an accounting of inter-model correlations, and use it to estimate climate-related risks to a local water utility in Colorado, U.S. We show that the tail risk of extreme climate changes in both mean precipitation and temperature is underestimated if model correlations are ignored. When coupled with impact models of the hydrology and infrastructure of the water utility, the underestimation of extreme climate changes substantially alters the quantification of risk for water supply shortages by mid-century. We argue that progress in climate change adaptation for local systems requires the recognition that there is less information in multi-model climate ensembles than previously thought. Importantly, adaptation decisions cannot be limited to the spread in one generation of climate models.
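
    The effect described here can be reproduced in miniature: when model errors share a common component, the multi-model mean carries far less information than independence would suggest. The spread, correlation, and threshold below are invented for illustration.

```python
# Tail risk under independent vs. correlated model errors (toy numbers).
import numpy as np

rng = np.random.default_rng(1)
n_models, n_draws = 20, 100_000
sigma = 1.0   # per-model error in projected temperature change (degC)
rho = 0.5     # assumed inter-model error correlation (shared genealogy)

cov_indep = sigma**2 * np.eye(n_models)
cov_corr = sigma**2 * (rho * np.ones((n_models, n_models))
                       + (1 - rho) * np.eye(n_models))

for label, cov in [("independent", cov_indep), ("correlated", cov_corr)]:
    draws = rng.multivariate_normal(np.zeros(n_models), cov, size=n_draws)
    ens_mean = draws.mean(axis=1)   # error of the multi-model mean
    print(f"{label}: sd of ensemble mean = {ens_mean.std():.2f}, "
          f"P(|error| > 1 degC) = {np.mean(np.abs(ens_mean) > 1.0):.3f}")
```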

  1. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    NASA Technical Reports Server (NTRS)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.
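
    The quantitative core of such a PRA can be sketched with simple fault-tree gate algebra. The tree structure and probabilities below are invented for illustration and are not taken from the NASA analysis.

```python
# Toy fault tree for loss of position (hypothetical structure/numbers).
def and_gate(*probs):
    """All independent inputs occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """At least one independent input occurs."""
    p_none = 1.0
    for q in probs:
        p_none *= 1.0 - q
    return 1.0 - p_none

p_thruster = 1e-2       # one thruster string fails during the operation
p_power = 5e-3          # main power bus fails
p_backup_power = 2e-2   # backup generation fails on demand
p_human = 2e-3          # operator error during DP operations

top_event = or_gate(
    and_gate(p_thruster, p_thruster),    # both redundant thruster strings
    and_gate(p_power, p_backup_power),   # loss of main and backup power
    p_human,
)
print(f"P(loss of position requiring emergency disconnect) ~ {top_event:.2e}")
```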

  2. Health risk assessment of ochratoxin A for all age-sex strata in a market economy

    PubMed Central

    Kuiper-Goodman, T.; Hilts, C.; Billiard, S.M.; Kiparissis, Y.; Richard, I.D.K.; Hayward, S.

    2009-01-01

    In order to manage risk of ochratoxin A (OTA) in foods, we re-evaluated the tolerable daily intake (TDI), derived the negligible cancer risk intake (NCRI), and conducted a probabilistic risk assessment. A new approach was developed to derive 'usual' probabilistic exposure in the presence of highly variable occurrence data, such as encountered with low levels of OTA. Canadian occurrence data were used for various raw food commodities or finished foods and were combined with US Department of Agriculture (USDA) food consumption data, which included data on infants and young children. Both variability and uncertainty in input data were considered in the resulting exposure estimates for various age/sex strata. Most people were exposed to OTA on a daily basis. Mean adjusted exposures for all age-sex groups were generally below the NCRI of 4 ng OTA kg bw⁻¹, except for 1-4-year-olds as a result of their lower body weight. For children, the major contributors of OTA were wheat-based foods, followed by oats, rice, and raisins. Beer, coffee, and wine also contributed to total OTA exposure in older individuals. Predicted exposure to OTA decreased when European Commission maximum limits were applied to the occurrence data. The impact on risk for regular eaters of specific commodities was also examined. PMID:20013446
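
    The 'usual' probabilistic exposure calculation described above follows the generic pattern concentration × consumption / body weight, propagated by Monte Carlo. The distributions below are hypothetical placeholders, not the Canadian occurrence or USDA consumption inputs.

```python
# Sketch of probabilistic dietary exposure to OTA (hypothetical inputs).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# OTA concentration in a wheat-based food (ng/g), left-censored at LOD
lod = 0.05
conc = rng.lognormal(mean=np.log(0.2), sigma=1.0, size=n)
conc = np.where(conc < lod, lod / 2, conc)   # substitute LOD/2 for non-detects

grams_per_day = rng.lognormal(np.log(150), 0.4, n)  # consumption
body_weight = rng.normal(17, 3, n).clip(8, 30)      # 1-4 year olds, kg

intake = conc * grams_per_day / body_weight         # ng/kg bw/day
ncri = 4.0
print(f"mean intake = {intake.mean():.2f} ng/kg bw/day")
print(f"P(intake > NCRI of {ncri} ng/kg bw) = {(intake > ncri).mean():.3f}")
```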

  3. Robust Decision Making Approach to Managing Water Resource Risks (Invited)

    NASA Astrophysics Data System (ADS)

    Lempert, R.

    2010-12-01

    The IPCC and the US National Academies of Sciences have recommended iterative risk management as the best approach for water management and many other types of climate-related decisions. Such an approach does not rely on a single set of judgments at any one time but rather actively updates and refines strategies as new information emerges. In addition, the approach emphasizes that a portfolio of different types of responses, rather than any single action, often provides the best means to manage uncertainty. Implementing an iterative risk management approach can, however, prove difficult in actual decision support applications. This talk will suggest that robust decision making (RDM) provides a particularly useful set of quantitative methods for implementing iterative risk management. The RDM approach is currently being used in a wide variety of water management applications. RDM employs three key concepts that differentiate it from most types of probabilistic risk analysis: 1) characterizing uncertainty with multiple views of the future (which can include sets of probability distributions) rather than a single probabilistic best estimate, 2) employing a robustness rather than an optimality criterion to assess alternative policies, and 3) organizing the analysis with a vulnerability-and-response-option framework, rather than a predict-then-act framework. This talk will summarize the RDM approach, describe its use in several different types of water management applications, and compare the results to those obtained with other methods.

  4. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.
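
    The probabilistic piece of such an assessment is essentially the chance that the peanut dose from an eating occasion exceeds an individual's reaction threshold. The contamination and threshold distributions below are invented, not the study's survey data.

```python
# P(reaction) = P(dose > individual threshold); hypothetical parameters.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

serving_g = 50.0
detect_rate = 0.086                                   # products with residue
ppm_when_present = rng.lognormal(np.log(10), 1.0, n)  # mg peanut / kg product
present = rng.random(n) < detect_rate

dose_mg = np.where(present, ppm_when_present * serving_g / 1000.0, 0.0)

# Log-normal individual threshold distribution (mg whole peanut)
threshold_mg = rng.lognormal(np.log(10), 1.5, n)

p_reaction = np.mean(dose_mg > threshold_mg)
print(f"predicted reactions per 10,000 eating occasions: {p_reaction * 1e4:.1f}")
```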

  5. Identification of water quality management policy of watershed system with multiple uncertain interactions using a multi-level-factorial risk-inference-based possibilistic-probabilistic programming approach.

    PubMed

    Liu, Jing; Li, Yongping; Huang, Guohe; Fu, Haiyan; Zhang, Junlong; Cheng, Guanhui

    2017-06-01

    In this study, a multi-level-factorial risk-inference-based possibilistic-probabilistic programming (MRPP) method is proposed for supporting water quality management under multiple uncertainties. The MRPP method can handle uncertainties expressed as fuzzy-random-boundary intervals, probability distributions, and interval numbers, and analyze the effects of uncertainties as well as their interactions on modeling outputs. It is applied to plan water quality management in the Xiangxihe watershed. Results reveal that a lower probability of satisfying the objective function (θ) as well as a higher probability of violating environmental constraints (q_i) would correspond to a higher system benefit with an increased risk of violating system feasibility. Chemical plants are the major contributors to biological oxygen demand (BOD) and total phosphorus (TP) discharges; total nitrogen (TN) would be mainly discharged by crop farming. It is also discovered that optimistic decision makers should pay more attention to the interactions between chemical plant and water supply, while decision makers who possess a risk-averse attitude would focus on the interactive effect of q_i and the benefit of water supply. The findings can help enhance the model's applicability and identify a suitable water quality management policy for environmental sustainability according to the practical situations.

  6. Adequacy of the default values for skin surface area used for risk assessment and French anthropometric data by a probabilistic approach.

    PubMed

    Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C

    2017-08-01

    The Notes of Guidance for the Testing of Cosmetic Ingredients and their Safety Evaluation by the Scientific Committee on Consumer Safety (SCCS) is a document dedicated to ensuring the safety of European consumers. It contains useful data for risk assessment, such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variations. The default SSA value was derived from a study of the Dutch population, which is known to be one of the tallest in the world. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data, and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to the SCCS, suggest that the default value for SSA used in the quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.
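
    One way to make the comparison concrete is to simulate an SSA distribution from national height/weight statistics and ask what share of the population falls below the default. The Du Bois formula is one common anthropometric choice; the French female height/weight parameters and the default value below are assumptions for illustration only.

```python
# Simulated SSA distribution vs. a fixed default (placeholder inputs).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
height_cm = rng.normal(162.5, 6.5, n)               # placeholder mean/sd
weight_kg = rng.lognormal(np.log(63.0), 0.18, n)    # placeholder

# Du Bois & Du Bois formula: SSA in m^2
ssa = 0.007184 * weight_kg**0.425 * height_cm**0.725

default_ssa = 1.75   # placeholder default (m^2)
print(f"median SSA = {np.median(ssa):.2f} m^2; "
      f"{np.mean(ssa < default_ssa):.0%} of simulated women below the default")
```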

  7. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
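
    In the fatigue crack growth application, the PFA idea amounts to propagating parameter uncertainty through a deterministic crack-growth model. The sketch below uses a Paris-law model with notional parameter values; it illustrates the statistical structure, not the documented engineering models themselves.

```python
# Monte Carlo failure probability via Paris-law crack growth (notional).
import numpy as np

rng = np.random.default_rng(11)
n = 20_000
design_life_cycles = 2.0e5

delta_sigma = 200.0   # stress range in MPa (so dK is in MPa*sqrt(m))
geometry_Y = 1.12
a_crit = 0.02         # critical crack length, m

m = 3.0
C = np.exp(rng.normal(np.log(1e-11), 0.3, n))   # Paris C, log-space scatter
a0 = rng.lognormal(np.log(2e-4), 0.5, n)        # initial flaw size, m

# Closed-form cycles from a0 to a_crit for m != 2:
#   N = (a_crit^(1-m/2) - a0^(1-m/2)) / ((1-m/2) * C * (Y*dS*sqrt(pi))^m)
k = (geometry_Y * delta_sigma * np.sqrt(np.pi)) ** m
e = 1.0 - m / 2.0
N_fail = (a_crit**e - a0**e) / (e * C * k)

print(f"P(failure before {design_life_cycles:.0e} cycles) = "
      f"{np.mean(N_fail < design_life_cycles):.3f}")
```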

  8. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  9. Accounting for pH heterogeneity and variability in modelling human health risks from cadmium in contaminated land.

    PubMed

    Gay, J Rebecca; Korre, Anna

    2009-07-01

    The authors have previously published a methodology which combines quantitative probabilistic human health risk assessment and spatial statistical methods (geostatistics) to produce an assessment, incorporating uncertainty, of risks to human health from exposure to contaminated land. The model assumes a constant soil-to-plant concentration factor (CFveg) when calculating intake of contaminants. This model is modified here to enhance its use in a situation where CFveg varies according to soil pH, as is the case for cadmium. The original methodology uses sequential indicator simulation (SIS) to map soil concentration estimates for one contaminant across a site. A real, age-stratified population is mapped across the contaminated area, and intake of soil contaminants by individuals is calculated probabilistically using an adaptation of the Contaminated Land Exposure Assessment (CLEA) model. The proposed improvement involves not only the geostatistical estimation of the contaminant concentration, but also that of soil pH, which in turn leads to a variable CFveg estimate which influences the human intake results. The results presented demonstrate that taking pH into account can influence the outcome of the risk assessment greatly. It is proposed that a similar adaptation could be used for other combinations of soil variables which influence CFveg.
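
    The proposed modification can be sketched in a few lines: draw co-simulated soil concentration and pH, pass pH through a CFveg relation, and compare intakes against a fixed-CFveg calculation. The regression coefficients and exposure parameters below are placeholders, not the published model's values.

```python
# pH-dependent CFveg vs. constant CFveg in a toy intake calculation.
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

soil_cd = rng.lognormal(np.log(1.5), 0.6, n)   # mg Cd/kg soil (simulated)
soil_ph = rng.normal(6.0, 0.7, n)

def cf_veg(ph):
    """Hypothetical log-linear decrease of CFveg with pH."""
    return 10 ** (0.7 - 0.25 * (ph - 4.0))

veg_intake_kg_day = 0.05
body_weight = 70.0
adi = 1e-3                                     # mg/kg bw/day, placeholder

dose_var = soil_cd * cf_veg(soil_ph) * veg_intake_kg_day / body_weight
dose_fix = soil_cd * cf_veg(6.0) * veg_intake_kg_day / body_weight
print(f"P(dose > ADI), pH-dependent CFveg: {(dose_var > adi).mean():.3f}")
print(f"P(dose > ADI), constant CFveg:     {(dose_fix > adi).mean():.3f}")
```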

  10. Segmentation of risk structures for otologic surgery using the Probabilistic Active Shape Model (PASM)

    NASA Astrophysics Data System (ADS)

    Becker, Meike; Kirschner, Matthias; Sakas, Georgios

    2014-03-01

    Our research project investigates a multi-port approach for minimally invasive otologic surgery. For planning such a surgery, an accurate segmentation of the risk structures is crucial. However, the segmentation of these risk structures is a challenging task: the anatomical structures are very small, and some have a complex shape, low contrast, and vary both in shape and appearance. Therefore, prior knowledge is needed, which is why we apply model-based approaches. In the present work, we use the Probabilistic Active Shape Model (PASM), which is a more flexible and specific variant of the Active Shape Model (ASM), to segment the following risk structures: cochlea, semicircular canals, facial nerve, chorda tympani, ossicles, internal auditory canal, external auditory canal and internal carotid artery. For the evaluation we trained and tested the algorithm on 42 computed tomography data sets using leave-one-out tests. Visual assessment of the results shows, in general, good agreement between manual and algorithmic segmentations. Further, we achieve a good Average Symmetric Surface Distance, while the maximum error is comparatively large due to low contrast at start and end points. Lastly, we compare the PASM to the standard ASM and show that the PASM leads to higher accuracy.

  11. Chemical and toxicological characteristics of conventional and low-TSNA moist snuff tobacco products.

    PubMed

    Song, Min-Ae; Marian, Catalin; Brasky, Theodore M; Reisinger, Sarah; Djordjevic, Mirjana; Shields, Peter G

    2016-03-14

    Use of smokeless tobacco products (STPs) is associated with oral cavity cancer and other health risks. Comprehensive analysis of chemical composition and toxicity is needed to compare conventional STPs with newer products that have lower tobacco-specific nitrosamine (TSNA) yields. Seven conventional and 12 low-TSNA moist snuff products purchased in the U.S., Sweden, and South Africa were analyzed for 18 chemical constituents (International Agency for Research on Cancer classified carcinogens), pH, nicotine, and free nicotine. Chemicals were compared in each product using the Wilcoxon rank-sum test and principal component analysis (PCA). Conventional moist snuff products had higher ammonia, benzo[a]pyrene, cadmium, nickel, nicotine, nitrate, and TSNAs, and lower arsenic, than low-TSNA products, both in dry weight content and per mg nicotine. Lead and chromium were significantly higher in low-TSNA moist snuff products. PCA showed a clear difference in constituents between conventional and low-TSNA moist snuff products. Differences among products were reduced when considered on a per mg nicotine basis. As one way to contextualize differences in constituent levels, probabilistic lifetime cancer risk was estimated for chemicals included in the University of California's Carcinogenic Potency Database (CPDB). Estimated probabilistic cancer risks were 3.77-fold or 3-fold higher in conventional compared to low-TSNA moist snuff products on a dry weight or per mg nicotine basis, respectively. In vitro testing of the STPs indicated low-level toxicity and no substantial differences. The comprehensive chemical characterization of both conventional and low-TSNA moist snuff products from this study provides a broader assessment of the differences in carcinogenic potential of the products. In addition, the high levels and probabilistic cancer risk estimates for certain chemical constituents of smokeless tobacco products will further inform regulatory decision makers and aid them in their efforts to reduce carcinogen exposure in smokeless tobacco products. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Predicting Progression from Mild Cognitive Impairment to Alzheimer's Dementia Using Clinical, MRI, and Plasma Biomarkers via Probabilistic Pattern Classification

    PubMed Central

    Korolev, Igor O.; Symonds, Laura L.; Bozoki, Andrea C.

    2016-01-01

    Background: Individuals with mild cognitive impairment (MCI) have a substantially increased risk of developing dementia due to Alzheimer's disease (AD). In this study, we developed a multivariate prognostic model for predicting MCI-to-dementia progression at the individual patient level. Methods: Using baseline data from 259 MCI patients and a probabilistic, kernel-based pattern classification approach, we trained a classifier to distinguish between patients who progressed to AD-type dementia (n = 139) and those who did not (n = 120) during a three-year follow-up period. More than 750 variables across four data sources were considered as potential predictors of progression. These data sources included risk factors, cognitive and functional assessments, structural magnetic resonance imaging (MRI) data, and plasma proteomic data. Predictive utility was assessed using a rigorous cross-validation framework. Results: Cognitive and functional markers were most predictive of progression, while plasma proteomic markers had limited predictive utility. The best performing model incorporated a combination of cognitive/functional markers and morphometric MRI measures and predicted progression with 80% accuracy (83% sensitivity, 76% specificity, AUC = 0.87). Predictors of progression included scores on the Alzheimer's Disease Assessment Scale, Rey Auditory Verbal Learning Test, and Functional Activities Questionnaire, as well as volume/cortical thickness of three brain regions (left hippocampus, middle temporal gyrus, and inferior parietal cortex). Calibration analysis revealed that the model is capable of generating probabilistic predictions that reliably reflect the actual risk of progression. Finally, we found that the predictive accuracy of the model varied with patient demographic, genetic, and clinical characteristics and could be further improved by taking into account the confidence of the predictions. Conclusions: We developed an accurate prognostic model for predicting MCI-to-dementia progression over a three-year period. The model utilizes widely available, cost-effective, non-invasive markers and can be used to improve patient selection in clinical trials and identify high-risk MCI patients for early treatment. PMID:26901338
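
    A pipeline of this kind (a probabilistic kernel classifier with cross-validated probability estimates) can be sketched with scikit-learn. The RBF-kernel SVM with Platt-scaled probabilities below is a stand-in for the paper's classifier, applied to synthetic data rather than the study's clinical/MRI/plasma features.

```python
# Cross-validated probabilistic classification (illustrative stand-in).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=259, n_features=50, n_informative=10,
                           random_state=0)   # stand-in for MCI baseline data

clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", C=1.0, probability=True))

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
proba = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]

print(f"cross-validated AUC = {roc_auc_score(y, proba):.2f}")
print("example per-patient progression risks:", np.round(proba[:5], 2))
```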

  13. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994, on the topic of risk assessment of medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  14. Probabilistic human health risk assessment of degradation-related chemical mixtures in heterogeneous aquifers: Risk statistics, hot spots, and preferential channels

    NASA Astrophysics Data System (ADS)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2015-06-01

    The increasing presence of toxic chemicals released in the subsurface has led to a rapid growth of social concerns and the need to develop and employ models that can predict the impact of groundwater contamination on human health risk under uncertainty. Monitored natural attenuation is a common remediation action in many contamination cases. However, natural attenuation can lead to the production of daughter species of distinct toxicity that may pose challenges in pollution management strategies. The actual threat that these contaminants pose to human health depends on the interplay between the complex structure of the geological media and the toxicity of each pollutant byproduct. This work addresses human health risk for chemical mixtures resulting from the sequential degradation of a contaminant (such as a chlorinated solvent) under uncertainty through high-resolution three-dimensional numerical simulations. We systematically investigate the interaction between aquifer heterogeneity, flow connectivity, contaminant injection model, and chemical toxicity in the probabilistic characterization of health risk. We illustrate how chemical-specific travel times control the regime of the expected risk and its corresponding uncertainties. Results indicate conditions where preferential flow paths can favor the reduction of the overall risk of the chemical mixture. The overall human risk response to aquifer connectivity is shown to be nontrivial for multispecies transport. This nontriviality is a result of the interaction between aquifer heterogeneity and chemical toxicity. To quantify the joint effect of connectivity and toxicity in health risk, we propose a toxicity-based Damköhler number. Furthermore, we provide a statistical characterization in terms of low-order moments and the probability density function of the individual and total risks.

  15. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  16. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.
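
    A toy version of this model class (continuous time, discrete crack-length states) is a pure-birth Markov process whose state-dependent growth rates are invented here for illustration; inspection decisions would then be judged against the simulated time to reach a critical state.

```python
# Continuous-time, discrete-state crack growth (illustrative rates).
import numpy as np

rng = np.random.default_rng(9)

def time_to_critical(rates):
    """Sum of exponential holding times through states 0..K-1."""
    return sum(rng.exponential(1.0 / r) for r in rates)

# Growth accelerates with crack size: rates increase geometrically
n_states = 10
rates = 0.01 * 1.5 ** np.arange(n_states)   # transitions per flight hour

samples = np.array([time_to_critical(rates) for _ in range(20_000)])
inspection_interval = 300.0                 # flight hours, placeholder
print(f"P(critical crack before next inspection) = "
      f"{(samples < inspection_interval).mean():.3f}")
```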

  17. Risk assessments of regional climate change over Europe: generation of probabilistic ensembles and uncertainty assessment for EURO-CORDEX

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Kopp, R. E.

    2017-12-01

    Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are that the CMIP5 model runs, which drive the EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections, and that climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue, where a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) deviating from the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g. the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX runs are small in winter. In summer, the temperatures of EURO-CORDEX are generally lower than those of the GCMs, while the drying trends in precipitation of EURO-CORDEX are smaller than those of the GCMs. Climate indices are significantly affected by the bias-correction and downscaling process. Our study provides valuable information for selecting climate indices in different regions of Europe.

  18. High Throughput Exposure Prioritization of Chemicals Using a Screening-Level Probabilistic SHEDS-Lite Exposure Model

    EPA Science Inventory

    These novel modeling approaches for screening, evaluating and classifying chemicals based on the potential for biologically-relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. The new modeling approach is derived from the Stocha...

  19. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  20. 75 FR 64366 - Advisory Committee on Reactor Safeguards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ... NUREG/CR-6997, "Modeling a Digital Feedwater Control System Using Traditional Probabilistic Risk..." .../reading-rm/adams.html or http://www.nrc.gov/reading-rm/doc-collections/ACRS/. Video teleconferencing... line charges and for providing the equipment and facilities that they use to establish the video...

  1. Global probabilistic projections of extreme sea levels show intensification of coastal flood hazard.

    PubMed

    Vousdoukas, Michalis I; Mentaschi, Lorenzo; Voukouvalas, Evangelos; Verlaan, Martin; Jevrejeva, Svetlana; Jackson, Luke P; Feyen, Luc

    2018-06-18

    Global warming is expected to drive increasing extreme sea levels (ESLs) and flood risk along the world's coastlines. In this work we present probabilistic projections of ESLs for the present century taking into consideration changes in mean sea level, tides, wind-waves, and storm surges. Between the year 2000 and 2100 we project a very likely increase of the global average 100-year ESL of 34-76 cm under a moderate-emission-mitigation-policy scenario and of 58-172 cm under a business as usual scenario. Rising ESLs are mostly driven by thermal expansion, followed by contributions from ice mass-loss from glaciers, and ice-sheets in Greenland and Antarctica. Under these scenarios ESL rise would render a large part of the tropics exposed annually to the present-day 100-year event from 2050. By the end of this century this applies to most coastlines around the world, implying unprecedented flood risk levels unless timely adaptation measures are taken.

  2. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  3. A probabilistic safety analysis of incidents in nuclear research reactors.

    PubMed

    Lopes, Valdir Maciel; Agostinho Angelo Sordi, Gian Maria; Moralles, Mauricio; Filho, Tufic Madi

    2012-06-01

    This work aims to evaluate the potential risks of incidents in nuclear research reactors. For its development, two databases of the International Atomic Energy Agency (IAEA) were used: the Research Reactor Data Base (RRDB) and the Incident Report System for Research Reactors (IRSRR). For this study, probabilistic safety analysis (PSA) was used. To obtain the results of the probability calculations for the PSA, the theory and equations in IAEA TECDOC-636 were used. A specific program to analyse the probabilities was developed within Scilab 5.1.1 for two distributions, Fisher and chi-square, both at the 90% confidence level. Using the Sordi equations, the maximum admissible doses to compare with the risk limits established by the International Commission on Radiological Protection (ICRP) were obtained. All results achieved with this probability analysis led to the conclusion that the incidents which occurred had radiation doses within the stochastic effects reference interval established by ICRP-64.
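
    The chi-square part of such an analysis typically places confidence bounds on a Poisson incident rate from an observed event count and cumulative operating time; a minimal sketch, with an invented count and exposure, follows.

```python
# Chi-squared (Poisson) confidence bounds on an incident rate.
from scipy.stats import chi2

n_events = 3     # hypothetical incidents reported
T = 1200.0       # hypothetical cumulative reactor-years
conf = 0.90

lam_lo = chi2.ppf((1 - conf) / 2, 2 * n_events) / (2 * T)
lam_hi = chi2.ppf(1 - (1 - conf) / 2, 2 * (n_events + 1)) / (2 * T)
print(f"incident rate {conf:.0%} CI: "
      f"[{lam_lo:.2e}, {lam_hi:.2e}] per reactor-year")
```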

  4. Cadmium Accumulation Risk in Vegetables and Rice in Southern China: Insights from Solid-Solution Partitioning and Plant Uptake Factor.

    PubMed

    Yang, Yang; Wang, Meie; Chen, Weiping; Li, Yanling; Peng, Chi

    2017-07-12

    The solid-solution partitioning coefficient (Kd) and plant uptake factor (PUF) largely determine the solubility and mobility of soil Cd to food crops. A four-year regional investigation was conducted in contaminated vegetable and paddy fields of southern China to quantify the variability in Kd and PUF. The distributions of Kd and PUF characterizing transfers of Cd from soil to vegetable and rice are probabilistic in nature. Dynamics in soil pH and soil Zn greatly affected the variations of Kd. In addition to soil pH, soil organic matter had a major influence on PUF variations in vegetables. Heavy leaching of soil Mn caused higher Cd accumulation in rice grain. Dietary ingestion of 85.5% of the locally produced vegetables and rice would pose adverse health risks, with rice consumption contributing 97.2% of the risk. A probabilistic risk analysis based on the derived transfer function reveals that the amorphous Mn oxide content exerts a major influence on Cd accumulation in rice at pH below 5.5. Risk estimation and field experiments show that to limit the Cd concentration in rice grains, soil management strategies should include raising the pH and soil Mn concentration to around 6.0 and 345 mg kg⁻¹, respectively. Our work illustrates that re-establishing a balance of trace elements in the soil's labile pool provides an effective risk-based approach for safer crop practices.

  5. Probabilistic risk assessment of exposure to leucomalachite green residues from fish products.

    PubMed

    Chu, Yung-Lin; Chimeddulam, Dalaijamts; Sheen, Lee-Yan; Wu, Kuen-Yuh

    2013-12-01

    To assess the potential risk of human exposure to carcinogenic leucomalachite green (LMG) due to fish consumption, a probabilistic risk assessment was conducted for adolescent, adult and senior adult consumers in Taiwan. The residues of LMG, with a mean concentration of 13.378 ± 20.56 μg kg⁻¹ in fish (BFDA, 2009), were converted into dose, considering the fish intake reported for the three consumer groups by NAHSIT (1993-1996) and the body weight of an average individual of each group. The lifetime average and high 95th percentile dietary intakes of LMG from fish consumption for Taiwanese consumers were estimated at up to 0.0135 and 0.0451 μg kg-bw⁻¹ day⁻¹, respectively. A human equivalent dose (HED) of 2.875 mg kg-bw⁻¹ day⁻¹, obtained from a lower-bound benchmark dose (BMDL10) in mice by interspecies extrapolation, was linearly extrapolated to an oral cancer slope factor (CSF) of 0.035 (mg kg-bw⁻¹ day⁻¹)⁻¹ for humans. Although the assumptions and methods differ, the lifetime cancer risks, varying from 3×10⁻⁷ to 1.6×10⁻⁶, were comparable to the margins of exposure (MOEs), which varied from 410,000 to 4,800,000. In conclusion, Taiwanese fish consumers at the 95th percentile LADD of LMG have a greater risk of liver cancer, and risk management action is needed in Taiwan. Copyright © 2013 Elsevier Ltd. All rights reserved.
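
    The arithmetic chain in this abstract (concentration × intake / body weight gives the LADD; LADD × slope factor gives lifetime risk; HED / LADD gives the margin of exposure) is easy to reproduce. Concentration, CSF, and HED follow the abstract; intake and body weight below are placeholders, so the outputs will not match the paper's exactly.

```python
# LADD -> cancer risk and MOE, following the abstract's constants.
mean_conc_ug_kg = 13.378      # LMG in fish, ug/kg (abstract)
fish_intake_g_day = 70.0      # hypothetical adult consumption
body_weight_kg = 65.0         # hypothetical adult body weight

ladd_ug = mean_conc_ug_kg * (fish_intake_g_day / 1000.0) / body_weight_kg
ladd_mg = ladd_ug / 1000.0    # convert ug -> mg before applying the CSF

csf = 0.035                   # (mg/kg bw/day)^-1 (abstract)
risk = ladd_mg * csf

hed_mg = 2.875                # BMDL10-based human equivalent dose (abstract)
moe = hed_mg / ladd_mg

print(f"LADD = {ladd_ug:.4f} ug/kg bw/day")
print(f"lifetime cancer risk ~ {risk:.1e}; MOE ~ {moe:,.0f}")
```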

  6. Naples between two fires: eruptive scenarios for the next eruptions by an integrated volcanological-probabilistic approach.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.

    2009-04-01

    Probabilistic approaches based on available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius are assembled into a comprehensive assessment of volcanic hazards for the Neapolitan area. This allows comparison of the volcanic hazards related to the different types of events, which can be used to evaluate the conditional probability of flow and fall hazards in case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations produced using field and laboratory data as input parameters, covering a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazard related to pyroclastic fallout and pyroclastic density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because of dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills form an effective barrier against propagation into the central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, whereas even VEI>3 eruptions pose a moderate fallout hazard there.

  7. Implications of Big Data Analytics on Population Health Management.

    PubMed

    Bradley, Paul S

    2013-09-01

    As healthcare providers transition to outcome-based reimbursements, it is imperative that they make the transition to population health management to stay viable. Providers already have big data assets in the form of electronic health records and financial billing systems. Integrating these disparate sources into patient-centered datasets provides the foundation for probabilistic modeling of their patient populations. These models are the core technology for computing and tracking the health and financial risk status of the patient population being served. We show how the probabilistic formulation allows for straightforward, early identification of a change in health and risk status. Knowing when a patient is likely to shift to a less healthy, higher-risk category allows the provider to intervene to avert or delay the shift. These automated, proactive alerts are critical in maintaining and improving the health of a population of patients. We discuss results of leveraging these models with an urban healthcare provider to track and monitor type 2 diabetes patients. When intervention outcome data are available, data mining and predictive modeling technology are primed to recommend the type of intervention (prescriptions, physical therapy, discharge protocols, etc.) with the best likely outcome.

  8. Estimating earthquake-induced failure probability and downtime of critical facilities.

    PubMed

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities - a primary and backup - would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, seismic risk of both the failure probability and probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results validated in several ways.
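
    The primary/backup point is worth making concrete: because one earthquake shakes both sites, failures are correlated through the shared ground motion, and the joint failure probability can far exceed the product of the marginals. The ground-motion and fragility parameters below are illustrative.

```python
# Correlated primary/backup failure in a single event (toy numbers).
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
n = 200_000

# Lognormal spectral accelerations at the two sites, correlated (rho=0.8)
ln_sa = rng.multivariate_normal([np.log(0.3), np.log(0.25)],
                                [[0.25, 0.20], [0.20, 0.25]], size=n)
sa_primary, sa_backup = np.exp(ln_sa).T

def p_fail(sa, median=0.6, beta=0.4):
    """Lognormal fragility curve: P(failure | Sa)."""
    return stats.norm.cdf(np.log(sa / median) / beta)

fail_p = rng.random(n) < p_fail(sa_primary)
fail_b = rng.random(n) < p_fail(sa_backup, median=0.7)

print(f"P(primary fails)      = {fail_p.mean():.4f}")
print(f"P(both fail)          = {(fail_p & fail_b).mean():.4f}")
print(f"product of marginals  = {fail_p.mean() * fail_b.mean():.6f}")
```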

  9. Station Blackout: A case study in the interaction of mechanistic and probabilistic safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; Diego Mandelli; Cristian Rabiti

    2013-11-01

    The ability to better characterize and quantify safety margins is important to improved decision making about nuclear power plant design, operation, and plant life extension. As research and development (R&D) in the light-water reactor (LWR) Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway R&D is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a "station blackout" wherein offsite power and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario.

  10. Pertuzumab for the Neoadjuvant Treatment of Early-Stage HER2-Positive Breast Cancer: An Evidence Review Group Perspective of a NICE Single Technology Appraisal.

    PubMed

    Squires, Hazel; Pandor, Abdullah; Thokala, Praveen; Stevens, John W; Kaltenthaler, Eva; Clowes, Mark; Coleman, Robert; Wyld, Lynda

    2018-01-01

    As part of its single technology appraisal process, the National Institute for Health and Care Excellence invited the manufacturer of pertuzumab (Perjeta®; Roche Products Limited) to submit evidence of its clinical and cost-effectiveness for the neoadjuvant treatment of women with high-risk, early-stage, HER2-positive breast cancer when used in combination with trastuzumab and chemotherapy. High-risk women included those with locally advanced (including inflammatory) breast cancer and women with high-risk early-stage breast cancer (classified as T2/3 or N1). The School of Health and Related Research Technology Appraisal Group at the University of Sheffield was commissioned to act as the independent Evidence Review Group. This article presents the critical review of the company's submission by the Evidence Review Group and the outcome of the National Institute for Health and Care Excellence guidance. The clinical data were mainly taken from a phase II, randomised, open-label, active controlled study (NeoSphere), which reported a significant advantage in terms of pathological complete response rates of pertuzumab in combination with trastuzumab and chemotherapy, compared with trastuzumab alone with chemotherapy (45.8 vs. 29.0%, p = 0.0141). The company did not make any indirect comparisons. A meta-analysis of 12 neoadjuvant studies investigating the relationship between pathological complete response and event-free survival was used to extrapolate the outcomes reported in the NeoSphere study. A cardiac safety study (TRYPHAENA) demonstrated the safety of pertuzumab. The company undertook a model-based economic evaluation of neoadjuvant pertuzumab plus trastuzumab and docetaxel compared with neoadjuvant trastuzumab and docetaxel over a lifetime horizon from the National Health Service and Personal Social Services perspective. The probabilistic incremental cost-effectiveness ratio was estimated to be £20,104 per quality-adjusted life-year gained for pertuzumab alongside trastuzumab and docetaxel compared with trastuzumab and docetaxel, which was revised to £21,869 per quality-adjusted life-year gained following the clarification process. The Evidence Review Group corrected an error in the digitisation of the survivor functions and modified the clinically inappropriate assumption that recurrence is zero after 7 years. The Evidence Review Group's probabilistic base case was £23,962 per quality-adjusted life-year gained. During the appraisal, to mitigate the uncertainties associated with the evidence, the company offered a patient access scheme, which led to the National Institute for Health and Care Excellence Appraisal Committee recommending pertuzumab in this patient group, subject to the company providing the agreed discount in the patient access scheme.

  11. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help in choosing better designs, making them more robust, and deciding on critical tests that demonstrate key reliability issues, thereby improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  12. Near Earth Asteroid Characterization for Threat Assessment

    NASA Technical Reports Server (NTRS)

    Dotson, Jessie; Mathias, Donovan; Wheeler, Lorien; Wooden, Diane; Bryson, Kathryn; Ostrowski, Daniel

    2017-01-01

    Physical characteristics of NEAs are an essential input to modeling behavior during atmospheric entry and to assessing the risk of impact, but determining these properties requires a non-trivial investment of time and resources. The characteristics relevant to these models include size, density, strength and ablation coefficient. Some of these characteristics cannot be directly measured, but rather must be inferred from related measurements of asteroids and/or meteorites. Furthermore, for the majority of NEAs, only the most basic measurements exist, so properties must often be inferred from statistics of the population of more completely characterized objects. The Asteroid Threat Assessment Project at NASA Ames Research Center has developed a probabilistic asteroid impact risk (PAIR) model in order to assess the risk of asteroid impact. Our PAIR model and its use to develop probability distributions of impact risk are discussed in other contributions to PDC 2017 (e.g., Mathias et al.). Here we utilize PAIR to investigate which NEA characteristics are important for assessing the impact threat by examining how changes in these characteristics alter the damage predicted by PAIR. We will also provide an assessment of the current state of knowledge of the NEA characteristics of importance for asteroid threat assessment. The relative importance of different properties as identified using PAIR will be combined with our assessment of the current state of knowledge to identify potential high-impact investigations. In addition, we will discuss an ongoing effort to collate the existing measurements of NEA properties of interest to the planetary defense community into a readily accessible database.

  13. Adequacy assessment of composite generation and transmission systems incorporating wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Gao, Yi

    The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
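
    The correlated-uniforms step mentioned above can be sketched with a Gaussian-copula (NORTA-style) construction, which is assumed here rather than taken from the thesis; the uniforms are then mapped to wind-farm capacity states by inverse-CDF lookup.

```python
# Correlated U(0,1) streams for state-sampling of two wind farms.
import numpy as np
from scipy import stats

def correlated_uniforms(rho, n, rng):
    """Two U(0,1) streams built from correlated standard normals."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return stats.norm.cdf(z)   # probability integral transform

rng = np.random.default_rng(17)
u = correlated_uniforms(rho=0.8, n=100_000, rng=rng)

# Map uniforms to discrete capacity states by inverse CDF (toy model)
capacity_states = np.array([0.0, 0.2, 0.5, 0.8, 1.0])   # per-unit output
state_probs = np.array([0.35, 0.25, 0.20, 0.12, 0.08])
edges = np.cumsum(state_probs)
edges[-1] = 1.0   # guard against floating-point round-off
farm_a = capacity_states[np.searchsorted(edges, u[:, 0])]
farm_b = capacity_states[np.searchsorted(edges, u[:, 1])]

print(f"correlation of uniforms:     {np.corrcoef(u.T)[0, 1]:.3f}")
print(f"correlation of farm outputs: {np.corrcoef(farm_a, farm_b)[0, 1]:.3f}")
```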

  14. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to small craft marinas in California has become an important concern. This talk outlines a tool for assessing the tsunami hazard to small craft harbors. The methodology is based on the demand on, and structural capacity of, the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology: a probabilistic computational approach in which the governing equations may be well known, but the independent input variables (demand) as well as the resisting structural components (capacity) are not completely known. The Monte Carlo approach assigns a distribution to each variable, draws a random value from each distribution to generate a single realization, and then repeats the process hundreds or thousands of times. The numerical model Method of Splitting Tsunamis (MOST) was used to determine the inputs for the small craft harbors within California; hydrodynamic model results for current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors were inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables that incorporate the basic specifications of each component (e.g., bolt size and configuration) and a reduction factor (accounting for the loss of capacity with age) to estimate in situ capacities. Like the demand term, these capacities enter the model probabilistically. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once calibrated, the model was able to hindcast the damage produced in Santa Cruz Harbor during the 2010 Chile and 2011 Japan events. Results of the Santa Cruz analysis will be presented and discussed.
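
    The demand-versus-capacity sampling loop reduces to a few lines. In the sketch below every distribution and parameter value is invented for illustration; in the actual methodology the demand distributions come from MOST output and the capacities from the inspection-based tables.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000  # Monte Carlo trials

      # Demand: drag force on a moored component, F = 0.5 * rho * Cd * A * U^2.
      # The current-speed distribution here is a made-up lognormal; in the
      # actual methodology it would come from MOST hydrodynamic output.
      rho_w, Cd, A = 1025.0, 1.0, 8.0  # seawater density (kg/m^3), drag coeff., area (m^2)
      U = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=N)  # current speed (m/s)
      demand = 0.5 * rho_w * Cd * A * U**2  # force (N)

      # Capacity: nominal strength times an age-related reduction factor; both
      # distributions are illustrative, not the inspection-derived tables.
      nominal = rng.normal(loc=40_000.0, scale=5_000.0, size=N)  # strength (N)
      reduction = rng.uniform(0.6, 1.0, size=N)  # in-situ degradation factor
      capacity = nominal * reduction

      p_fail = np.mean(demand > capacity)
      print(f"estimated component failure probability: {p_fail:.4f}")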

  15. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    NASA Technical Reports Server (NTRS)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of carbon fiber risk assessment modeling for fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated, and statistical procedures are utilized and/or developed to control them wherever possible.
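
    Of the three, Monte Carlo sampling error is the most mechanical to control: the standard error of an estimated probability falls as 1/sqrt(n), so it can be driven below any tolerance by running more trials. A minimal illustration, with synthetic Bernoulli outcomes standing in for full model runs:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      # Synthetic outcomes standing in for model runs, each indicating
      # whether a simulated fire accident exceeds some consequence threshold.
      hits = rng.random(n) < 0.02
      p_hat = hits.mean()
      # Standard error sqrt(p(1-p)/n) quantifies source (3): it shrinks as
      # 1/sqrt(n), so quadrupling the trials halves the sampling error.
      se = np.sqrt(p_hat * (1.0 - p_hat) / n)
      print(f"p_hat = {p_hat:.4f} +/- {1.96 * se:.4f} (95% CI half-width)")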

  16. Dynamic Cost Risk Assessment for Controlling the Cost of Naval Vessels

    DTIC Science & Technology

    2008-04-23

    ...for each individual RRA at the start of the project are depicted in Figures 2a and 2b, respectively. The PDFs are multimodal and cannot be...underestimates cost. Probabilistic cost analysis: a physician metaphor (adapted from Yacov Y. Haimes, NPS 2007). Dynamic cost risk management: a physician metaphor (adapted from Yacov Y. Haimes, NPS 2007). Sources of cost uncertainty: macroscopic analysis (economic, materials & labor, learning rates).

  17. An Assessment of the Effectiveness of Air Force Risk Management Practices in Program Acquisition Using Survey Instrument Analysis

    DTIC Science & Technology

    2015-06-18

    ...Engineering Effectiveness Survey, CMU/SEI-2012-SR-009, Carnegie Mellon University, November 2012. Field, Andy, Discovering Statistics Using SPSS, 3rd...enough into the survey to begin answering questions on risk practices. All of the statistical analysis will be performed using SPSS. Prior to...probabilistically using distributions for likelihood and impact. Statistical methods like Monte Carlo can more comprehensively evaluate the cost and...

  18. Threatened species and the potential loss of phylogenetic diversity: conservation scenarios based on estimated extinction probabilities and phylogenetic risk analysis.

    PubMed

    Faith, Daniel P

    2008-12-01

    New species conservation strategies, including the EDGE of Existence (EDGE) program, have expanded threatened species assessments by integrating information about species' phylogenetic distinctiveness. Distinctiveness has been measured through simple scores that assign shared credit among species for the evolutionary heritage represented by the deeper phylogenetic branches. A species with a high score combined with a high extinction probability receives high priority for conservation efforts. Simple hypothetical scenarios for phylogenetic trees and extinction probabilities demonstrate how such scoring approaches can produce inefficient conservation priorities. An existing probabilistic framework derived from the phylogenetic diversity measure (PD) properly captures the idea of shared responsibility for the persistence of evolutionary history. It avoids static scores, takes into account the status of close relatives through their extinction probabilities, and allows for the necessary updating of priorities in light of changes in species threat status. A hypothetical phylogenetic tree illustrates how changes in the extinction probabilities of one or more species translate into changes in expected PD. The probabilistic PD framework provided a range of strategies that moved beyond expected PD to better consider worst-case PD losses. In another example, risk aversion gave higher priority to a conservation program that provided a smaller, but less risky, gain in expected PD. The EDGE program could continue to promote a list of top species conservation priorities through application of probabilistic PD and simple estimates of current extinction probability. The list might be a dynamic one, with all priority scores updated as extinction probabilities change. Recent studies suggest that extinction probabilities derived from Red List criteria linked to changes in species range sizes may be obtainable for many different species. Probabilistic PD thus provides a framework for single-species assessment that integrates well with broader measurement of impacts on PD owing to climate change and other factors.
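
    The expected-PD calculation underlying this framework is simple: each branch contributes its length weighted by the probability that at least one descendant species survives, i.e., one minus the product of the descendants' extinction probabilities under an independence assumption. A toy sketch with a made-up tree and probabilities:

      from math import prod

      # Hypothetical tree: each branch is (length, descendant species), and
      # each species has an extinction probability; survival events are
      # treated as independent.
      branches = [
          (5.0, ["A", "B", "C"]),
          (4.0, ["A", "B"]),
          (2.0, ["A"]),
          (2.0, ["B"]),
          (3.0, ["C"]),
      ]
      p_ext = {"A": 0.8, "B": 0.3, "C": 0.1}

      def expected_pd(branches, p_ext):
          # A branch persists if at least one descendant species survives.
          return sum(
              length * (1.0 - prod(p_ext[s] for s in species))
              for length, species in branches
          )

      before = expected_pd(branches, p_ext)
      p_ext["A"] = 0.2  # e.g., a successful conservation program for A
      print(before, expected_pd(branches, p_ext))  # expected PD increases

    Because close relatives enter through the same branch terms, lowering one species' extinction probability changes the marginal value of protecting its relatives, which is exactly the updating behavior the abstract describes.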

  19. Probabilistic Neighborhood-Based Data Collection Algorithms for 3D Underwater Acoustic Sensor Networks

    PubMed Central

    Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo

    2017-01-01

    Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy for acquiring marine environment information conveniently, efficiently and accurately. However, the propagation characteristics of the acoustic communication channel mean that the probability of successful information delivery decreases with distance. We therefore investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs that are based on a probabilistic acoustic communication model instead of the traditional deterministic one. An autonomous underwater vehicle (AUV) traverses a designed path to collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, the network is partitioned into grids and the AUV visits the central location of each grid for data collection. For 3D UASNs in which deployment knowledge is known in advance, the AUV only needs to visit several selected locations, chosen by constructing a minimum probabilistic neighborhood covering set, which reduces data latency. In addition, by increasing the number of transmission rounds, the proposed algorithms provide a tradeoff between data collection latency and information gain. The algorithms are compared with a basic nearest-neighbor heuristic via simulations, which show that they efficiently reduce the average data collection completion time and thus data latency. PMID:28208735
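
    The covering-set construction can be sketched with a greedy rule. Below, the delivery probability is assumed to decay exponentially with distance and AUV stops are chosen from a candidate grid; both the decay model and the greedy selection are illustrative assumptions, since the abstract does not give the exact formulation.

      import math

      def delivery_prob(d, alpha=0.05):
          # Assumed channel model: success probability decays exponentially
          # with distance (the paper's model may differ).
          return math.exp(-alpha * d)

      def dist(a, b):
          return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

      def greedy_covering(sensors, candidates, threshold):
          # Greedy sketch of a minimum probabilistic neighborhood covering
          # set: add AUV stops until every sensor has some stop from which
          # the delivery probability meets the threshold.
          uncovered = set(range(len(sensors)))
          stops = []
          while uncovered:
              best = max(candidates, key=lambda c: sum(
                  1 for i in uncovered
                  if delivery_prob(dist(c, sensors[i])) >= threshold))
              covered = {i for i in uncovered
                         if delivery_prob(dist(best, sensors[i])) >= threshold}
              if not covered:
                  raise ValueError("threshold unreachable for remaining sensors")
              stops.append(best)
              uncovered -= covered
          return stops

      sensors = [(0, 0, 10), (50, 20, 30), (80, 80, 15)]  # (x, y, depth), made up
      candidates = [(x, y, 0) for x in range(0, 101, 25) for y in range(0, 101, 25)]
      print(greedy_covering(sensors, candidates, threshold=0.2))

    Raising the threshold forces more stops (lower latency per sensor, longer AUV tour), which is one way to see the latency/information-gain tradeoff the abstract mentions.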

  20. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which depends, among other factors, on both the quantity and quality of the available data and on an optimum storage mechanism. This requires purpose-built databases that take into account data format and availability, afford easy data storage and sharing, and support a more complete risk assessment that combines different analyses while avoiding any duplication of information. Data contained in such a database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. A central design goal is to ensure that VERDI serves as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure, and the possibilities it offers for data organization, are demonstrated through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool to assist in conducting volcanic risk assessment and management.
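
    To give a rough feel for the kind of relational organization such a design implies, here is a hypothetical miniature schema; every table and column name is invented for illustration and is not the published VERDI structure.

      import sqlite3

      # Hypothetical miniature schema, invented for illustration only.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE vent (
          vent_id       INTEGER PRIMARY KEY,
          lon REAL, lat REAL,        -- spatial fields, exportable to GIS
          eruption_date TEXT
      );
      CREATE TABLE hazard_model (
          model_id    INTEGER PRIMARY KEY,
          vent_id     INTEGER REFERENCES vent(vent_id),
          hazard_type TEXT,          -- e.g. lava flow, ashfall
          probability REAL           -- from probabilistic hazard simulation
      );
      CREATE TABLE exposure (
          asset_id    INTEGER PRIMARY KEY,
          lon REAL, lat REAL,
          population  INTEGER,       -- socio-economic attributes
          asset_value REAL
      );
      """)
      print(conn.execute("SELECT name FROM sqlite_master").fetchall())

    Keeping hazard outputs and socio-economic exposure in linked tables with shared spatial fields is what lets one database feed both hazard simulation and impact assessment without duplicating records.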
