Sample records for dynamic probabilistic risk

  1. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
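
    The pivot-node idea can be seen in miniature: a dynamic gate such as priority-AND (PAND) depends on the order of failures, which static cut-set algebra cannot express, so its probability is naturally estimated from sampled failure times. The Python sketch below only illustrates that point; it is not DEFT's actual solution algorithm, and the gate structure, rates, and initiator frequency are invented.

    ```python
    import random

    def pand_probability(rate_a, rate_b, mission_time, n=100_000, seed=0):
        """Monte Carlo estimate of a priority-AND (PAND) gate with exponential
        basic events: the gate fires only if A fails before B and both
        failures occur within the mission time."""
        rng = random.Random(seed)
        fires = 0
        for _ in range(n):
            t_a = rng.expovariate(rate_a)
            t_b = rng.expovariate(rate_b)
            if t_a < t_b <= mission_time:
                fires += 1
        return fires / n

    # Use the gate result as a pivot-node probability in an event-tree sequence.
    p_initiator = 1e-3                                    # illustrative frequency
    p_pivot = pand_probability(1e-4, 2e-4, mission_time=1_000.0)
    print(f"sequence frequency ≈ {p_initiator * p_pivot:.3e}")
    ```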

  2. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic finite element analysis computer code.

  3. Mastodon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh; Veeraraghavan, Swetha; Bolisetti, Chandrakanth

    MASTODON has the capability to model stochastic nonlinear soil-structure interaction (NLSSI) in a dynamic probabilistic risk assessment framework. The NLSSI simulations include structural dynamics, time integration, dynamic porous media flow, nonlinear hysteretic soil constitutive models, and geometric nonlinearities (gapping, sliding, and uplift). MASTODON is also the MOOSE-based master application for dynamic PRA of external hazards.

  4. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner that is fully compatible with, and integrated into, the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  5. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Using a Monte Carlo method, thousands of instances of the model are run and the outcomes are collected. Traditional PRA is static, using probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies that unfolds along a timeline. Events are placed in a single timeline, with each event added to a queue managed by a planner. Progression along the timeline is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements, and the design is then derived from the requirements.
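
    As a sketch of the timeline/queue mechanics described above, the Python fragment below runs one Monte Carlo trial by popping events off a priority queue and rescheduling them, a toy stand-in for the IMM planner and scheduler roles; the event names and rates are invented.

    ```python
    import heapq
    import random

    def simulate_mission(duration_days, event_rates, rng):
        # One dPRA trial: seed the queue with a first occurrence time per
        # event, then walk the timeline; each fired event is rescheduled.
        queue = [(rng.expovariate(rate), name, rate) for name, rate in event_rates.items()]
        heapq.heapify(queue)
        history = []
        while queue:
            t, name, rate = heapq.heappop(queue)
            if t > duration_days:
                break
            history.append((t, name))
            heapq.heappush(queue, (t + rng.expovariate(rate), name, rate))
        return history

    rng = random.Random(42)
    rates = {"minor_medical_event": 1 / 90, "equipment_issue": 1 / 180}  # per day, invented
    counts = [len(simulate_mission(365, rates, rng)) for _ in range(10_000)]
    print(f"mean events per mission ≈ {sum(counts) / len(counts):.1f}")
    ```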

  6. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data, and post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e., to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
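
    As a minimal illustration of this kind of post-processing, the sketch below clusters the end-states of many sampled runs with scikit-learn's KMeans and summarizes each cluster. The two-dimensional scenario data are synthetic stand-ins, not RAVEN output, and the feature names are invented.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Stand-in for dynamic-PRA scenario outputs, e.g. (peak temperature in K,
    # time of peak in s); real data would come from the physics-code runs.
    scenarios = np.vstack([
        rng.normal([900.0, 120.0], [40.0, 15.0], size=(500, 2)),   # benign branch
        rng.normal([1400.0, 60.0], [60.0, 10.0], size=(80, 2)),    # challenging branch
    ])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scenarios)
    for k in range(2):
        members = scenarios[labels == k]
        print(f"cluster {k}: {len(members)} runs, mean peak T = {members[:, 0].mean():.0f} K")
    ```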

  7. MASTODON: Multi-hazard Analysis for STOchastic time-DOmaiN phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  8. Dynamic Cost Risk Assessment for Controlling the Cost of Naval Vessels

    DTIC Science & Technology

    2008-04-23

    [Fragments extracted from briefing slides:] ...for each individual RRA at the start of the project are depicted in Figures 2a and 2b, respectively. The PDFs are multimodal and cannot be... underestimates cost. Recoverable slide titles: "Probabilistic cost analysis: a physician metaphor" and "Dynamic cost risk management: a physician metaphor" (both adapted from Yacov Y. Haimes, NPS 2007); "Sources of cost uncertainty: macroscopic analysis (economic, materials & labor, learning rates)".

  9. Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication

    NASA Astrophysics Data System (ADS)

    Thompson, Kimberly M.

    Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present the insights of efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.

  10. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    NASA Technical Reports Server (NTRS)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.
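
    For readers unfamiliar with the mechanics, the loss-of-position top event reduces, in its simplest form, to AND/OR gate algebra over independent basic events. The sketch below is a generic toy tree; its structure and all basic-event probabilities are invented and are not the MODU study's values.

    ```python
    def or_gate(*probs):
        # At least one independent input fails.
        q = 1.0
        for p in probs:
            q *= 1.0 - p
        return 1.0 - q

    def and_gate(*probs):
        # All redundant inputs fail (assumes independence, i.e. no common cause).
        out = 1.0
        for p in probs:
            out *= p
        return out

    # Invented per-operation basic-event probabilities (not from the paper):
    p_generator = 0.02     # one generator fails
    p_thruster = 0.01      # one thruster fails
    p_dp_computer = 1e-3   # DP control computer fails
    p_loss_of_position = or_gate(
        and_gate(p_generator, p_generator),   # both redundant generators
        and_gate(p_thruster, p_thruster),     # both redundant thrusters
        p_dp_computer,
    )
    print(f"P(loss of position) ≈ {p_loss_of_position:.2e}")
    ```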

  11. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    PubMed Central

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background: Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. Methods: This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors (device, structure, load, and special operation), a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion: Distribution networks are exposed systems that can be affected by many external factors. The topology and the operating mode of a distribution network are dynamic, so faults and their consequences are probabilistic. PMID:25789859
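
    A toy version of the entropy machinery described above: discretize a raw signal into symbols against quantile boundaries taken from a reference regime, then score how far an observed regime drifts from that reference using Kullback-Leibler relative entropy. The load series and bin count below are synthetic assumptions; the paper's operators and indicator hierarchy are not reproduced.

    ```python
    import numpy as np

    def distribution(symbols, n_bins):
        # Empirical symbol distribution with a small floor to avoid log(0).
        counts = np.bincount(symbols, minlength=n_bins).astype(float) + 1e-12
        return counts / counts.sum()

    def kl_divergence(p, q):
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(1)
    normal_load = rng.normal(1.00, 0.05, 5000)      # stand-in feeder load signal
    stressed_load = rng.normal(1.10, 0.12, 5000)

    n_bins = 8
    edges = np.quantile(normal_load, np.linspace(0, 1, n_bins + 1)[1:-1])
    p = distribution(np.digitize(normal_load, edges), n_bins)
    q = distribution(np.digitize(stressed_load, edges), n_bins)
    print(f"KL(normal || stressed) = {kl_divergence(p, q):.3f}")
    ```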

  12. Probabilistic assessment of roadway departure risk in a curve

    NASA Astrophysics Data System (ADS)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though strict policy against speeding has helped reduce accidents, other factors clearly remain. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamic model is developed in which some parameters are modelled by random variables. These parameters are deduced from a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Then, structural reliability methods are employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system that alerts the driver in case of a dangerous situation.
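
    The reliability computation can be caricatured with a point-mass limit state: the vehicle departs the curve when the demanded lateral friction v^2/(gR) exceeds the available friction. The Monte Carlo below is a crude stand-in for the article's full vehicle-dynamics model and structural reliability methods; every distribution is assumed.

    ```python
    import numpy as np

    def departure_probability(speed_mps, radius_m, n=200_000, seed=0):
        # Limit state g = mu - v^2 / (g R); departure when g < 0.
        rng = np.random.default_rng(seed)
        mu = rng.normal(0.8, 0.1, n)          # available friction, illustrative
        v = rng.normal(speed_mps, 2.0, n)     # entry-speed measurement scatter
        demand = v**2 / (9.81 * radius_m)
        return float(np.mean(demand > mu))

    print(f"P(departure) ≈ {departure_probability(speed_mps=22.0, radius_m=80.0):.3f}")
    ```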

  13. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.

  14. Environmental risk assessment of engineered nano-SiO2, nano iron oxides, nano-CeO2, nano-Al2O3, and quantum dots.

    PubMed

    Wang, Yan; Nowack, Bernd

    2018-05-01

    Many research studies have endeavored to investigate the ecotoxicological hazards of engineered nanomaterials (ENMs). However, little is known regarding the actual environmental risks of ENMs, combining both hazard and exposure data. The aim of the present study was to quantify the environmental risks for nano-Al2O3, nano-SiO2, nano iron oxides, nano-CeO2, and quantum dots by comparing the predicted environmental concentrations (PECs) with the predicted-no-effect concentrations (PNECs). The PEC values of these 5 ENMs in freshwaters in 2020 for northern Europe and southeastern Europe were taken from a published dynamic probabilistic material flow analysis model. The PNEC values were calculated using probabilistic species sensitivity distribution (SSD). The order of the PNEC values was quantum dots < nano-CeO2 < nano iron oxides < nano-Al2O3 < nano-SiO2. The risks posed by these 5 ENMs were demonstrated to be in the reverse order: nano-Al2O3 > nano-SiO2 > nano iron oxides > nano-CeO2 > quantum dots. However, all risk characterization values are 4 to 8 orders of magnitude lower than 1, and no risk was therefore predicted for any of the investigated ENMs at the estimated release level in 2020. Compared to static models, the dynamic material flow model allowed us to use PEC values based on a more complex parameterization, considering a dynamic input over time and time-dependent release of ENMs. The probabilistic SSD approach makes it possible to include all available data to estimate hazards of ENMs by considering the whole range of variability between studies and material types. The risk-assessment approach is therefore able to handle the uncertainty and variability associated with the collected data. The results of the present study provide a scientific foundation for risk-based regulatory decisions of the investigated ENMs. Environ Toxicol Chem 2018;37:1387-1395. © 2018 SETAC.
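
    The risk characterization step is a ratio of two uncertain quantities, which a short Monte Carlo makes concrete. A minimal sketch, assuming lognormal shapes with invented parameters rather than the paper's modeled PEC and PNEC distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    # Illustrative lognormal parameterizations; NOT the paper's PEC/PNEC values.
    pec = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)    # µg/L
    pnec = rng.lognormal(mean=np.log(1e1), sigma=0.8, size=n)    # µg/L
    rcr = pec / pnec                                             # risk characterization ratio
    print(f"median RCR = {np.median(rcr):.1e}, 95th pct = {np.percentile(rcr, 95):.1e}")
    ```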

  15. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  16. Probabilistic risk assessment of the effect of acidified seawater on development stages of sea urchin (Strongylocentrotus droebachiensis).

    PubMed

    Chen, Wei-Yu; Lin, Hsing-Chieh

    2018-05-01

    Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larvae growth, and larvae mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification (EC50SW) was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH (EC50PBC) were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions for seawater and PBC pH levels in 2100. The level of unsuccessful fertilization had 50 and 90% probability risks of 5.07-24.51% (95% CI) and 0-6.95%, respectively. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.
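
    EC50 values of this kind come from fitting a response curve to effect data. A minimal sketch with SciPy, assuming a logistic model in the pH shift and wholly hypothetical response data (not the study's measurements):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(delta_ph, ec50, slope):
        # Effect fraction (0 to 1) as a function of the pH drop below control.
        return 1.0 / (1.0 + np.exp(-slope * (delta_ph - ec50)))

    # Hypothetical fractions of unsuccessful fertilizations at increasing
    # seawater pH reductions; invented for illustration only.
    delta_ph = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    effect = np.array([0.02, 0.15, 0.35, 0.55, 0.75, 0.90])

    (ec50, slope), _ = curve_fit(logistic, delta_ph, effect, p0=[0.5, 4.0])
    print(f"fitted EC50 ≈ {ec50:.2f} pH units, slope ≈ {slope:.1f}")
    ```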

  17. Management of Risk and Uncertainty in Systems Acquisition: Proceedings of the 1983 Defense Risk and Uncertainty Workshop Held at Fort Belvoir, Virginia on 13-15 July 1983

    DTIC Science & Technology

    1983-07-15

    categories, however, represent the reality in major acquisition and are often overlooked. Although Figure 1 does not reflect the dynamics and interactions... networking and improved computer capabilities, probabilistic network simulation became a reality. The Naval Sea Systems Command became involved in... reasons for using the WBS are plain: 1. Virtually all risk-prone activities are performed by the contractor, not Government. Government is responsible

  18. Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie

    2006-01-01

    A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
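
    The Monte Carlo core of such a tool is easy to sketch: sample success or failure of each mission phase and count surviving trials. The phase list and per-phase probabilities below are invented placeholders, not SAFE inputs or Saturn V/Apollo heritage data.

    ```python
    import random

    # Invented per-phase failure probabilities for a notional launch architecture.
    PHASES = [
        ("ascent_stage1", 0.010),
        ("ascent_stage2", 0.008),
        ("orbit_ops", 0.002),
        ("entry", 0.004),
    ]

    def mission_succeeds(rng):
        # A trial survives only if every phase avoids failure.
        return all(rng.random() > p_fail for _, p_fail in PHASES)

    rng = random.Random(7)
    n = 200_000
    successes = sum(mission_succeeds(rng) for _ in range(n))
    print(f"estimated mission reliability ≈ {successes / n:.4f}")
    ```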

  19. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower is the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio and the fiber longitudinal modulus, dynamic load and loading rate are the dominant uncertainties, in that order.

  20. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of these models. Details of the development of the mathematical risk model are presented. This includes discussion of the processes included in the model and the identification of significant interprocess interactions. This is followed by analysis of the model that demonstrates that its dynamical evolution displays characteristics that have been observed at commercially operating plants. The model is analyzed using the previously described techniques from dynamical systems theory. From this analysis, several significant insights are obtained with respect to the effective control of nuclear safety risk. Finally, we present conclusions and recommendations for further research.

  1. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 5: Auxiliary shuttle risk analyses

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.

  2. Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.

    PubMed

    Avramenko, M; Bolyatko, V; Kosterev, V

    2005-01-01

    Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.

  3. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine regression (SR), called DCSRM, is proposed by integrating the distributed collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is carried out to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  4. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk become evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
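
    The common thread of these probabilistic theories is a stochastic choice rule layered on a deterministic valuation core. A minimal sketch, assuming a power utility function and a logit (softmax) rule; the exponent and sensitivity values are illustrative, not parameters fitted in the article:

    ```python
    import math

    def expected_utility(gamble, alpha=0.88):
        # Power utility over outcomes; alpha < 1 gives diminishing sensitivity.
        return sum(p * (x ** alpha) for x, p in gamble)

    def p_choose_a(eu_a, eu_b, sensitivity=0.2):
        # Logit choice rule: deterministic EU maximization is the
        # sensitivity -> infinity limit; finite sensitivity produces the
        # trial-to-trial inconsistency the article sets out to explain.
        return 1.0 / (1.0 + math.exp(-sensitivity * (eu_a - eu_b)))

    gamble_a = [(100.0, 0.5), (0.0, 0.5)]   # 50% chance of 100, else nothing
    gamble_b = [(40.0, 1.0)]                # 40 for sure
    eu_a, eu_b = expected_utility(gamble_a), expected_utility(gamble_b)
    print(f"P(choose risky gamble) = {p_choose_a(eu_a, eu_b):.2f}")
    ```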

  5. Terminal Model Of Newtonian Dynamics

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1994-01-01

    Paper presents study of theory of Newtonian dynamics of terminal attractors and repellers, focusing on issues of reversibility vs. irreversibility and deterministic evolution vs. probabilistic or chaotic evolution of dynamic systems. Theory developed called "terminal dynamics" emphasizes difference between it and classical Newtonian dynamics. Also holds promise for explaining irreversibility, unpredictability, probabilistic behavior, and chaos in turbulent flows, in thermodynamic phenomena, and in other dynamic phenomena and systems.
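
    The defining property of a terminal attractor can be checked numerically: under a fractional-power relaxation law dx/dt = -k x^(1/3), the state reaches zero in finite time t* = 3 x0^(2/3) / (2k), unlike the infinite-time decay of the classical linear law dx/dt = -k x. A small sketch under those assumptions (the abstract itself gives no numerical example):

    ```python
    def time_to_zero(x0=1.0, k=1.0, dt=1e-3, t_max=5.0):
        # Euler integration of dx/dt = -k * x**(1/3). The fractional power
        # violates the Lipschitz condition at x = 0, so the trajectory hits
        # zero in finite time and uniqueness is lost there: the opening for
        # the irreversible/probabilistic behavior the abstract describes.
        t, x = 0.0, x0
        while x > 0.0 and t < t_max:
            x = max(0.0, x - dt * k * x ** (1.0 / 3.0))
            t += dt
        return t

    print(f"terminal law reaches x = 0 at t ≈ {time_to_zero():.2f}")
    print("analytic value: 3 * x0**(2/3) / (2*k) = 1.50")
    ```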

  6. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models.

  7. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models. PMID:29062288

  8. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  9. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S... [Subject-term residue:] Deformation; Dynamic Response Analysis; Seepage, Soil Permeability and Piping; Earthquake Engineering, Seismology, Settlement and Heave; Seismic Risk Analysis

  10. Landslide Hazard Probability Derived from Inherent and Dynamic Determinants

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan

    2016-04-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2757 sq km (1064 sq mi) in northern Washington State, U.S.A.
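
    A schematic of the coupling, assuming a Monte Carlo infinite-slope stability model with invented soil and slope parameters (not the VIC-coupled model's calibration), and multiplying the inherent and dynamic terms as one simple coupling choice:

    ```python
    import numpy as np

    def dynamic_failure_probability(recharge_m, n=50_000, seed=3):
        # Monte Carlo infinite-slope factor of safety under a given recharge;
        # all parameter ranges are illustrative, not the paper's calibration.
        rng = np.random.default_rng(seed)
        phi = np.radians(rng.uniform(28.0, 38.0, n))     # friction angle
        cohesion = rng.uniform(2e3, 8e3, n)              # Pa
        slope = np.radians(35.0)
        depth, gamma, gamma_w = 2.0, 18e3, 9.8e3         # m, N/m^3, N/m^3
        m = min(max(recharge_m / depth, 0.0), 1.0)       # saturated thickness fraction
        fs = (
            (cohesion + (gamma - m * gamma_w) * depth * np.cos(slope) ** 2 * np.tan(phi))
            / (gamma * depth * np.sin(slope) * np.cos(slope))
        )
        return float(np.mean(fs < 1.0))

    p_inherent = 0.15   # stand-in for the frequency-ratio (empirical) probability
    p_dynamic = dynamic_failure_probability(recharge_m=1.2)
    print(f"combined hazard ≈ {p_inherent * p_dynamic:.3f}")
    ```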

  11. Probabilistic approach for earthquake scenarios in the Marmara region from dynamic rupture simulations

    NASA Astrophysics Data System (ADS)

    Aochi, Hideo

    2014-05-01

    The Marmara region (Turkey), along the North Anatolian fault, is known to have a high potential for large earthquakes in the coming decades. For the purpose of seismic hazard/risk evaluation, kinematic and dynamic source models have been proposed (e.g. Oglesby and Mai, GJI, 2012). In general, simulated earthquake scenarios depend on the underlying hypotheses and cannot be verified before the expected earthquake occurs. We therefore introduce a probabilistic treatment of the initial/boundary conditions so that the simulated scenarios can be analyzed statistically. We prepare different fault geometry models, tectonic loadings, and hypocenter locations. We keep the same simulation framework as for the dynamic rupture process of the adjacent 1999 Izmit earthquake (Aochi and Madariaga, BSSA, 2003), as the previous models were able to reproduce the seismological/geodetic aspects of that event. Irregularities in fault geometry play a significant role in controlling the rupture progress, and a relatively large change in geometry may act as a barrier. The variety of simulated earthquake scenarios should be useful for estimating the variability of the expected ground motion.

  12. Non-Deterministic Dynamic Instability of Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2004-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower is the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio and the fiber longitudinal modulus, dynamic load and loading rate are the dominant uncertainties, in that order.

  13. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.

  14. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Ronald L. Boring

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  15. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  16. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation & control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
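
    A Bayesian network generalizes the binary fault-tree gate to conditional probability tables over soft-causal factors. The toy network below (two parent nodes feeding an operator-error node, evaluated by brute-force enumeration, with all numbers invented) illustrates the kind of conditioning targeted; the report's actual HRA network structure and data are not reproduced.

    ```python
    # A minimal discrete Bayesian network evaluated by enumeration.
    # Structure: Stress -> Error <- GoodTraining. All numbers are invented.
    P_STRESS = {True: 0.3, False: 0.7}
    P_TRAINED = {True: 0.8, False: 0.2}
    P_ERROR = {  # P(operator error | stress, good training)
        (True, True): 0.05, (True, False): 0.30,
        (False, True): 0.01, (False, False): 0.10,
    }

    def p_error(stress=None):
        # Marginal P(error), optionally conditioned on the stress node.
        num = den = 0.0
        for s, p_s in P_STRESS.items():
            if stress is not None and s != stress:
                continue
            for t, p_t in P_TRAINED.items():
                w = p_s * p_t
                num += w * P_ERROR[(s, t)]
                den += w
        return num / den

    print(f"P(error)          = {p_error():.3f}")
    print(f"P(error | stress) = {p_error(stress=True):.3f}")
    ```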

  17. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua

    2014-11-01

    Passive systems, structures and components (SSCs) will degrade over their operating life, and this degradation may reduce the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, so the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] considers physics-based models that account for the operating conditions in the plant; however, [1] does not include effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models, and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as to RELAP5 [4]. The overall methodology aims to:
    • Address multiple aging mechanisms involving a large number of components in a computationally feasible manner, where the sequencing of events is conditioned on the physical conditions predicted in a simulation environment such as RELAP-7.
    • Identify the risk-significant passive components, their failure modes, and anticipated rates of degradation.
    • Incorporate surveillance and maintenance activities and their effects into the plant state and into component aging progress.
    • Assess aging effects in a dynamic simulation environment.
    References: 1. C. L. SMITH, V. N. SHAH, T. KAO, G. APOSTOLAKIS, "Incorporating Ageing Effects into Probabilistic Risk Assessment - A Feasibility Study Utilizing Reliability Physics Models," NUREG/CR-5632, USNRC (2001). 2. T. ALDEMIR, "A Survey of Dynamic Methodologies for Probabilistic Safety Assessment of Nuclear Power Plants," Annals of Nuclear Energy, 52, 113-124 (2013). 3. C. RABITI, A. ALFONSI, J. COGLIATI, D. MANDELLI and R. KINOSHITA, "Reactor Analysis and Virtual Control Environment (RAVEN) FY12 Report," INL/EXT-12-27351 (2012). 4. D. ANDERS et al., "RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7," INL/EXT-12-25924 (2012).

  18. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behavior, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.

  19. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  20. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A.; Weinberger, Daniel R.

    2010-01-01

    Background: While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods: A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results: Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between the percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions: These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502
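
    The reported percentages directly yield the relative-risk figures; a two-line calculation (the paper's own estimate may come from an adjusted model):

    ```python
    # Relative risk of being a "poor learner", from the abstract's percentages.
    p_healthy, p_sibling, p_patient = 0.132, 0.341, 0.481
    print(f"RR(siblings vs healthy) = {p_sibling / p_healthy:.2f}")   # ≈ 2.58
    print(f"RR(patients vs healthy) = {p_patient / p_healthy:.2f}")   # ≈ 3.64
    ```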

  1. A global empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
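
    A minimal sketch of this kind of regression-based probabilistic forecast, using synthetic data: a CO2-equivalent trend plus one mode of variability serve as predictors, and the hindcast residual spread supplies the forecast distribution. Variable names and numbers are ours, not the system's.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1961, 2014)

    co2eq = 330 + 1.8 * (years - 1961)           # stand-in CO2-eq trend
    enso = rng.normal(0, 1, years.size)          # stand-in ENSO index
    temp = 0.02 * co2eq + 0.3 * enso + rng.normal(0, 0.4, years.size)

    # Multiple linear regression on the two predictors.
    A = np.column_stack([np.ones_like(co2eq), co2eq, enso])
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    resid_sd = np.std(temp - A @ coef, ddof=A.shape[1])

    # Probabilistic forecast for a new season: Gaussian centred on the
    # regression prediction, width taken from the hindcast residuals.
    x_new = np.array([1.0, 330 + 1.8 * (2015 - 1961), 0.8])
    mu = x_new @ coef
    draws = mu + resid_sd * rng.standard_normal(100_000)
    print(f"forecast: N(mu={mu:.2f}, sd={resid_sd:.2f})")
    print(f"P(above 1961-2013 mean) = {np.mean(draws > temp.mean()):.2f}")
    ```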

  2. An empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  3. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

... Interim Staff Guidance .../COL-ISG-020, ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment'' (Agencywide Documents ...) ...

  4. WIPCast: Probabilistic Forecasting for Aviation Decision Aid Applications

    DTIC Science & Technology

    2011-06-01

... traders, or families planning an outing – manage weather-related risk. By quantifying risk, probabilistic forecasting enables optimization of actions via ... confidence interval to the user's risk tolerance helps drive highly effective and innovative decision support mechanisms for visually quantifying risk for ...

  5. Probabilistic characterization of wind turbine blades via aeroelasticity and spinning finite element formulation

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, R. Andrew

    2012-04-01

Wind energy is an increasingly important component of this nation's renewable energy portfolio; however, safe and economical wind turbine operation is critical to ensure continued adoption. Safe operation of wind turbine structures requires information regarding not only their condition but also their operational environment. Given the difficulty inherent in structural health monitoring (SHM) processes for wind turbines (damage detection, location, and characterization), some uncertainty in condition assessment is expected. Furthermore, given the stochastic nature of the loading on turbine structures, a probabilistic framework is appropriate to characterize their risk of failure at a given time. Such information will be invaluable to turbine controllers, allowing them to operate the structures within acceptable risk profiles. This study explores the characterization of the turbine loading and response envelopes for critical failure modes of the turbine blade structures. A framework is presented to develop an analytical estimation of the loading environment (including loading effects) based on the dynamic behavior of the blades. This behavior is influenced by along-wind and across-wind aeroelastic effects, wind shear gradient, tower shadow effects, and centrifugal stiffening effects. The proposed solution includes methods that are based on modal decomposition of the blades and require frequent updates to the estimated modal properties to account for the time-varying nature of the turbine and its environment. The estimated demand statistics are compared to a code-based resistance curve to determine a probabilistic estimate of the risk of blade failure given the loading environment.
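
    The final step described above, comparing demand statistics against a code-based resistance curve, reduces to a stress-strength calculation. A hedged sketch with invented distribution parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000

    # Demand: root bending stress (MPa) as might be estimated from modal
    # decomposition; lognormal captures gusts and aeroelastic amplification.
    demand = rng.lognormal(mean=np.log(60.0), sigma=0.25, size=n)

    # Resistance: code-based characteristic strength with material scatter.
    resistance = rng.normal(loc=120.0, scale=12.0, size=n)

    pf = np.mean(demand > resistance)
    print(f"estimated probability of failure per load case: {pf:.2e}")
    ```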

  6. Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves

    NASA Astrophysics Data System (ADS)

    Rošt'áková, Zuzana; Rosipal, Roman

    2018-02-01

Sleep can be characterised as a dynamic process that passes through a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. Original and time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures was evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow-wave sleep.
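
    A toy demonstration of why the alignment step matters, with an invented single-bump "state probability" curve: a time-shifted copy is re-synchronised by cross-correlation before comparison. This is illustrative only and does not reproduce the probabilistic sleep model itself.

    ```python
    import numpy as np

    t = np.arange(480) / 60.0                      # 8 h night, 1-min resolution
    curve = np.exp(-0.5 * ((t - 3.0) / 0.8) ** 2)  # invented state-probability bump

    lag = 40                                       # minutes (= samples here)
    shifted = np.roll(curve, lag)                  # same curve, 40 min later

    # Recover the lag as the argmax of the cross-correlation, then undo it.
    xcorr = np.correlate(shifted - shifted.mean(), curve - curve.mean(), "full")
    est_lag = int(xcorr.argmax()) - (curve.size - 1)
    aligned = np.roll(shifted, -est_lag)

    print(f"estimated lag (min): {est_lag}")
    print(f"mean abs error, unaligned: {np.abs(shifted - curve).mean():.4f}")
    print(f"mean abs error, aligned:   {np.abs(aligned - curve).mean():.4f}")
    ```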

  7. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Cetiner, Mustafa Sacit; Flanagan, George F.

    2014-07-30

An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (DLL) and RWB. A program written in C# successfully communicates faults to the probabilistic model through the DLL. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A second program successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  8. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists that explains stable grids in darkness for twenty minutes or longer, even though this was one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from those of rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single-cell level with attractor dynamics at the cell-ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population readout of a set of probabilistic spatial computations. PMID:27792723

  9. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems facing the risk assessment of nanoparticles is a lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Given the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
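
    The separation of variability from uncertainty described above is commonly implemented as a nested (two-dimensional) Monte Carlo. A schematic sketch, with invented distributions in place of the study's intake and hazard data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_outer, n_inner = 500, 2_000

    frac_above = np.empty(n_outer)
    for i in range(n_outer):
        # Uncertainty loop: bootstrap-like draw of the hazard threshold.
        threshold = rng.lognormal(np.log(1.0), 0.5)
        # Variability loop: intakes across individuals for this draw.
        intake = rng.lognormal(np.log(0.2), 0.8, n_inner)
        frac_above[i] = np.mean(intake > threshold)

    # Variability is summarised inside each run; uncertainty appears as the
    # spread of that summary across runs.
    print(f"median fraction of population above threshold: "
          f"{np.median(frac_above):.3f}")
    print(f"90% uncertainty interval: "
          f"[{np.percentile(frac_above, 5):.3f}, "
          f"{np.percentile(frac_above, 95):.3f}]")
    ```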

  10. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review: risk or cumulative incidence of psychosis; person-time rate of psychosis; Kaplan-Meier estimates of psychosis risk; measures of prognostic accuracy; sensitivity and specificity in receiver operating characteristic curves; positive and negative predictive values; Bayes' theorem; likelihood ratios; and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may influence risk management, especially as regards initiating or withholding treatment.
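
    Several of the quantities listed above combine into a short worked example. Assuming made-up values for sensitivity, specificity and pre-test risk, Bayes' theorem in likelihood-ratio form gives the post-test probabilities:

    ```python
    sens, spec = 0.80, 0.60   # assumed test characteristics
    pretest = 0.20            # assumed pre-test risk (cumulative incidence)

    lr_pos = sens / (1 - spec)        # likelihood ratio of a positive result
    lr_neg = (1 - sens) / spec        # likelihood ratio of a negative result

    def posttest(pretest, lr):
        """Bayes' theorem on the odds scale: post-odds = pre-odds * LR."""
        odds = pretest / (1 - pretest) * lr
        return odds / (1 + odds)

    ppv = posttest(pretest, lr_pos)      # P(psychosis | positive test)
    npv = 1 - posttest(pretest, lr_neg)  # P(no psychosis | negative test)
    print(f"LR+={lr_pos:.2f}, LR-={lr_neg:.2f}, PPV={ppv:.2f}, NPV={npv:.2f}")
    ```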

  11. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  12. Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions

    EPA Pesticide Factsheets

This factsheet presents general concepts and principles of Probabilistic Risk Assessment (PRA), describes how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing uncertainty in decision making.

  13. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    NASA Technical Reports Server (NTRS)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  14. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  15. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  16. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are also commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures the entire system from source to tap needs to be considered. There is a lack of methods for quantifying water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach to risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of the risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets, and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction.
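
    A minimal sketch of combining a fault tree with CEA, under assumptions of our own (a three-event tree, independent basic events, invented probabilities and costs):

    ```python
    # Top event "water safety target not met" = source failure OR
    # (treatment failure AND distribution backup failure).
    def top_event_prob(p_source, p_treat, p_backup):
        p_and = p_treat * p_backup                 # AND gate (independence assumed)
        return 1 - (1 - p_source) * (1 - p_and)    # OR gate

    baseline = dict(p_source=0.02, p_treat=0.05, p_backup=0.10)
    p0 = top_event_prob(**baseline)

    # Candidate risk-reduction measures: (name, changed parameters, annual cost).
    measures = [
        ("protect source",    dict(baseline, p_source=0.005), 120_000),
        ("upgrade treatment", dict(baseline, p_treat=0.01),   200_000),
        ("larger backup",     dict(baseline, p_backup=0.02),   60_000),
    ]

    for name, params, cost in measures:
        reduction = p0 - top_event_prob(**params)
        # Cost-effectiveness ratio: cost per unit of annual risk reduction.
        print(f"{name:18s} risk reduction {reduction:.4f}  "
              f"CER {cost / reduction:,.0f} per unit risk")
    ```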

  17. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  18. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted through this probabilistic modification of FMEA. Using this probabilistic modification, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
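
    The probabilistic modification reads naturally in code: occurrence and detection become relative frequencies, so the frequency of undetected failures is a product rather than a categorical score. The failure modes and numbers below are invented for illustration:

    ```python
    failure_modes = [
        # (name, P(occurrence per run), P(detection | occurrence), severity 1-10)
        ("wrong reference spectrum", 0.010, 0.95, 8),
        ("sample mix-up",            0.002, 0.50, 9),
        ("instrument drift",         0.030, 0.80, 5),
    ]

    total_undetected = 0.0
    for name, p_occ, p_det, sev in failure_modes:
        # Frequency of occurring AND escaping detection.
        p_undetected = p_occ * (1 - p_det)
        total_undetected += p_undetected
        print(f"{name:26s} P(undetected) = {p_undetected:.2e}  severity = {sev}")

    # Assuming approximate independence, the per-run frequency of at least one
    # undetected failure is bounded above by the sum of the individual terms.
    print(f"\nfull procedure: P(any undetected failure) <= {total_undetected:.2e}")
    ```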

  19. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
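
    The risk integral has a compact numerical form: the collapse rate equals the fragility integrated against the magnitude of the hazard curve's slope. The sketch below uses an invented power-law hazard and lognormal fragility; the post-mainshock adaptation would substitute an aftershock-elevated, time-decaying hazard curve and a mainshock-damaged fragility.

    ```python
    import numpy as np
    from math import erf, log, sqrt

    im = np.linspace(0.01, 3.0, 3000)          # ground-motion intensity (g)

    # Hazard curve: annual rate of exceeding each intensity (power law).
    hazard = 1e-2 * (im / 0.1) ** -2.5

    # Fragility: lognormal probability of collapse given intensity.
    median, beta = 1.0, 0.5
    fragility = np.array([0.5 * (1 + erf(log(x / median) / (beta * sqrt(2))))
                          for x in im])

    # Risk integral: lambda_collapse = -int F(im) * d(hazard)/d(im) d(im).
    d_hazard = np.gradient(hazard, im)
    integrand = fragility * d_hazard
    annual_rate = -np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(im))
    print(f"annual collapse rate: {annual_rate:.2e}")
    ```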

  20. Extravehicular Activity Probabilistic Risk Assessment Overview for Thermal Protection System Repair on the Hubble Space Telescope Servicing Mission

    NASA Technical Reports Server (NTRS)

    Bigler, Mark; Canga, Michael A.; Duncan, Gary

    2010-01-01

    The Shuttle Program initiated an Extravehicular Activity (EVA) Probabilistic Risk Assessment (PRA) to assess the risks associated with performing a Shuttle Thermal Protection System (TPS) repair during the Space Transportation System (STS)-125 Hubble repair mission as part of risk trades between TPS repair and crew rescue.

  1. Envisioning Nano Release Dynamics in a Changing World: Using Dynamic Probabilistic Modeling to Assess Future Environmental Emissions of Engineered Nanomaterials.

    PubMed

    Sun, Tian Yin; Mitrano, Denise M; Bornhöft, Nikolaus A; Scheringer, Martin; Hungerbühler, Konrad; Nowack, Bernd

    2017-03-07

The need for an environmental risk assessment for engineered nanomaterials (ENM) necessitates knowledge of their environmental emissions. Material flow analysis (MFA) models have been used to provide predicted environmental emissions, but most current nano-MFA models consider neither the rapid development of ENM production nor the fact that a large proportion of ENM enter an in-use stock and are released from products over time (i.e., have a lag phase). Here we use dynamic probabilistic material flow modeling to predict scenarios of the future flows of four ENM (nano-TiO2, nano-ZnO, nano-Ag and CNT) to environmental compartments and to quantify their amounts in (temporary) sinks such as the in-use stock and ("final") environmental sinks such as soil and sediment. In these scenarios, we estimate likely future amounts if the use and distribution of ENM in products continue along current trends (i.e., a business-as-usual approach) and predict the effect of hypothetical trends in the market development of nanomaterials, such as the emergence of a new widely used product or a ban on certain substances, on the flows of nanomaterials to the environment in years to come. We show that, depending on the scenario and the product type affected, significant changes in the flows occur over time, driven by the growth of stocks and delayed release dynamics.
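
    A toy dynamic material flow calculation in the spirit of the model described above: part of each year's production is released immediately, the rest enters an in-use stock that releases with a lag. The growth rate, split, and lifetime are invented, not the paper's values.

    ```python
    import numpy as np

    years = np.arange(2017, 2031)
    production = 100 * 1.15 ** (years - years[0])  # t/yr, assumed 15% growth

    immediate_release_frac = 0.3                   # e.g., washed-out cosmetics
    mean_lifetime = 5.0                            # years in the in-use stock

    stock = 0.0
    for year, prod in zip(years, production):
        to_stock = (1 - immediate_release_frac) * prod
        # Exponential release from the in-use stock (the lag phase).
        released_from_stock = stock * (1 - np.exp(-1 / mean_lifetime))
        stock = stock + to_stock - released_from_stock
        total_release = immediate_release_frac * prod + released_from_stock
        print(f"{year}: release {total_release:7.1f} t, "
              f"in-use stock {stock:8.1f} t")
    ```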

  2. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  3. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk-reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multiple hazards. Carrying out this framework involves several steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment; and (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also an important aspect that needs to be studied.

  4. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  5. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.

  6. Cadmium Accumulation Risk in Vegetables and Rice in Southern China: Insights from Solid-Solution Partitioning and Plant Uptake Factor.

    PubMed

    Yang, Yang; Wang, Meie; Chen, Weiping; Li, Yanling; Peng, Chi

    2017-07-12

The solid-solution partitioning coefficient (Kd) and plant uptake factor (PUF) largely determine the solubility and mobility of soil Cd into food crops. A four-year regional investigation was conducted in contaminated vegetable and paddy fields of southern China to quantify the variability in Kd and PUF. The distributions of Kd and PUF characterizing transfers of Cd from soil to vegetables and rice are probabilistic in nature. Dynamics in soil pH and soil Zn greatly affected the variations of Kd. In addition to soil pH, soil organic matter had a major influence on PUF variations in vegetables. Heavy leaching of soil Mn caused higher Cd accumulation in rice grain. Dietary ingestion of 85.5% of the locally produced vegetables and rice would pose adverse health risks, with rice consumption contributing 97.2% of the risk. A probabilistic risk analysis based on the derived transfer function reveals that the amorphous Mn oxide content exerts a major influence on Cd accumulation in rice under pH conditions below 5.5. Risk estimation and field experiments show that, to limit the Cd concentration in rice grains, soil management strategies should include raising the pH and soil Mn concentration to around 6.0 and 345 mg/kg, respectively. Our work illustrates that re-establishing a balance of trace elements in the soil's labile pool provides an effective risk-based approach for safer crop practices.

  7. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: "first context and only then probability". We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a "prespace". The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the prespace dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems; in particular, it was recently found in some psychological experiments.
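
    For readers unfamiliar with the formula at the heart of this line of work, a standard rendering of the interference-perturbed law of total probability is sketched below; the notation is ours, not a verbatim quote from the paper.

    ```latex
    % Contextual form of the law of total probability with an interference term:
    \[
      P(a \mid C) \;=\; \sum_{i=1,2} P(b_i \mid C)\, P(a \mid b_i)
      \;+\; 2\lambda \sqrt{P(b_1 \mid C)\, P(a \mid b_1)\,
                            P(b_2 \mid C)\, P(a \mid b_2)} .
    \]
    % For trigonometric interference $\lambda = \cos\theta$; setting
    % $\lambda = 0$ recovers the classical Kolmogorov formula, while a complex
    % amplitude $\psi$ with $|\psi|^2 = P$ reproduces Born's rule.
    ```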

  8. 76 FR 55717 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

... The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on September 20, 2011, in Room T-2B1, 11545 Rockville Pike ... Memorandum on Modifying the Risk-Informed Regulatory Guidance for New Reactors. The Subcommittee will hear ...

  9. 75 FR 18205 - Notice of Peer Review Meeting for the External Peer Review Drafts of Two Documents on Using...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

... Role of Risk Analysis in Decision-Making. AGENCY: Environmental Protection Agency (EPA). ACTION: Notice ... draft documents entitled ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and ...

  10. 76 FR 28102 - Notice of Issuance of Regulatory Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-13

... Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S. ... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and ... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its ...

  11. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events and the role reinsurance can play in mitigating the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry and the development of a new, innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms in both time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  12. Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
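
    A rough sketch of the core idea: treat measured modal parameters as the random inputs and Monte Carlo sample a single-mode frequency response, rather than re-deriving the response from uncertain geometry or material data. The modal statistics and drive frequency are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 50_000

    # Modal statistics as might be estimated from repeated modal tests.
    fn = rng.normal(42.0, 1.5, n)                  # natural frequency (Hz)
    zeta = rng.lognormal(np.log(0.02), 0.3, n)     # damping ratio

    f_drive = 40.0                                 # excitation frequency (Hz)
    r = f_drive / fn
    # Dynamic amplification of a single-DOF mode at the drive frequency.
    amp = 1.0 / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

    print(f"median amplification: {np.median(amp):.1f}")
    print(f"99th percentile:      {np.percentile(amp, 99):.1f}")
    ```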

  13. Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  14. Probabilistic DHP adaptive critic for nonlinear stochastic control systems.

    PubMed

    Herzallah, Randa

    2013-06-01

Following the recently developed algorithms for fully probabilistic control design for general dynamic stochastic systems (Herzallah & Kárný, 2011; Kárný, 1996), this paper presents the solution to the probabilistic dual heuristic programming (DHP) adaptive critic method (Herzallah & Kárný, 2011) and a randomized control algorithm for stochastic nonlinear dynamical systems. The purpose of the randomized control input design is to make the joint probability density function of the closed-loop system as close as possible to a predetermined ideal joint probability density function. This paper completes the previous work (Herzallah & Kárný, 2011; Kárný, 1996) by formulating and solving the fully probabilistic control design problem for the more general case of nonlinear stochastic discrete-time systems. A simulated example is used to demonstrate the use of the algorithm, and encouraging results have been obtained.

  15. Spatial forecasting of disease risk and uncertainty

    USGS Publications Warehouse

    De Cola, L.

    2002-01-01

Because maps typically represent the value of a single variable over 2-dimensional space, cartographers must simplify the display of multiscale complexity, temporal dynamics, and underlying uncertainty. A choropleth disease risk map based on data for polygonal regions might depict incidence (cases per 100,000 people) within each polygon for a year but ignore the uncertainty that results from finer-scale variation, generalization, misreporting, small numbers, and future unknowns. In response to such limitations, this paper reports on the bivariate mapping of data "quantity" and "quality" for Lyme disease forecasts for states of the United States. Historical state data for 1990-2000 are used in an autoregressive model to forecast 2001-2010 disease incidence and a probability index of confidence, each of which is then kriged to provide two spatial grids representing continuous values over the nation. A single bivariate map is produced from the combination of the incidence grid (using a blue-to-red hue spectrum) and a probabilistic confidence grid (used to control the saturation of the hue at each grid cell). The resultant maps are easily interpretable, and the approach may be applied to such problems as detecting unusual disease occurrences, visualizing past and future incidence, and assembling a consistent regional disease atlas showing patterns of forecasted risks in light of probabilistic confidence.

  16. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, and Monte Carlo simulation is added as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.

  17. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  18. [Theoretical evaluation of the risk of decompression illness during simulated extravehicular activity].

    PubMed

    Nikolaev, V P

    2008-01-01

A theoretical analysis of the risk of decompression illness (DI) during extravehicular activity following the Russian and NASA decompression protocols (D-R and D-US, respectively) was performed. In contrast to the traditional approach of evaluating decompression stress by the degree of tissue supersaturation with nitrogen, our probabilistic theory of decompression safety provides a fully reasoned evaluation and comparison of the levels of hazard of these decompression protocols. According to this theory, the function of cumulative DI risk is equal to the sum of the functions of cumulative risk of lesion of all body tissues by gas bubbles and of their supersaturation by dissolved gases. Based on modeling of the dynamics of these functions, the growth of cumulative DI risk in the course of D-R and D-US follows essentially similar trajectories within a time-frame of up to 330 minutes. However, further extension of D-US, but not D-R, raises the risk of DI drastically.

  19. Probabilistic reversal learning is impaired in Parkinson's disease

    PubMed Central

    Peterson, David A.; Elliott, Christian; Song, David D.; Makeig, Scott; Sejnowski, Terrence J.; Poizner, Howard

    2009-01-01

In many everyday settings, the relationship between our choices and their potentially rewarding outcomes is probabilistic and dynamic. In addition, the difficulty of the choices can vary widely. Although a large body of theoretical and empirical evidence suggests that dopamine mediates rewarded learning, the influence of dopamine in probabilistic and dynamic rewarded learning remains unclear. We adapted a probabilistic rewarded learning task originally used to study firing rates of dopamine cells in primate substantia nigra pars compacta (Morris et al. 2006) for use as a reversal learning task with humans. We sought to investigate how the dopamine depletion in Parkinson's disease (PD) affects probabilistic reward learning and adaptation to a reversal in reward contingencies. Over the course of 256 trials, subjects learned to choose the more favorable from among pairs of images with small or large differences in reward probabilities. During a subsequent, otherwise identical reversal phase, the reward probability contingencies for the stimuli were reversed. Seventeen PD patients of mild to moderate severity were studied off their dopaminergic medications and compared to 15 age-matched controls. Compared to controls, PD patients had distinct pre- and post-reversal deficiencies depending upon the difficulty of the choices they had to learn. The patients also exhibited compromised adaptability to the reversal. A computational model of the subjects' trial-by-trial choices demonstrated that the adaptability was sensitive to the gain with which patients weighted pre-reversal feedback. Collectively, the results implicate the nigral dopaminergic system in learning to make choices in environments with probabilistic and dynamic reward contingencies. PMID:19628022
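
    A schematic stand-in for the trial-by-trial model described above: a delta-rule learner whose feedback-weighting gain controls how quickly post-reversal evidence overrides pre-reversal learning. This is our own minimal construction, not the authors' fitted model.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def simulate(gain, n_trials=256, p_reward=(0.8, 0.2), beta=5.0):
        q = np.zeros(2)                    # action values for the two images
        correct = []
        for t in range(n_trials):
            if t == n_trials // 2:         # reversal halfway through
                p_reward = p_reward[::-1]
            # Softmax choice between the two options.
            p_choose_0 = 1 / (1 + np.exp(-beta * (q[0] - q[1])))
            a = 0 if rng.random() < p_choose_0 else 1
            r = float(rng.random() < p_reward[a])
            q[a] += gain * (r - q[a])      # delta-rule update of chosen value
            correct.append(a == int(p_reward[1] > p_reward[0]))
        return np.mean(correct[n_trials // 2:])   # post-reversal accuracy

    for gain in (0.05, 0.2, 0.5):
        print(f"feedback gain {gain:.2f}: post-reversal accuracy "
              f"{simulate(gain):.2f}")
    ```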

  20. ASSESSMENT OF DYNAMIC PRA TECHNIQUES WITH INDUSTRY AVERAGE COMPONENT PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yadav, Vaibhav; Agarwal, Vivek; Gribok, Andrei V.

In the nuclear industry, risk monitors are intended to provide a point-in-time estimate of the system risk given the current plant configuration. Current risk monitors are limited in that they do not properly take into account the deteriorating states of plant equipment, which are unit-specific. Current approaches to computing risk monitors use probabilistic risk assessment (PRA) techniques, but the assessment is typically a snapshot in time. Living PRA models attempt to address limitations of traditional PRA models in a limited sense by including temporary changes in plant and system configurations. However, information on plant component health is not considered. This often leaves risk monitors using living PRA models incapable of conducting evaluations with dynamic degradation scenarios evolving over time. There is a need to develop enabling approaches that solidify risk monitors to provide time- and condition-dependent risk by integrating traditional PRA models with condition monitoring and prognostic techniques. This paper presents estimation of system risk evolution over time by integrating plant risk monitoring data with dynamic PRA methods incorporating aging and degradation. Several online, non-destructive approaches have been developed for diagnosing plant component conditions in the nuclear industry, e.g., condition indication indices based on vibration analysis, current signatures, and operational history [1]. In this work, component performance measures at U.S. commercial nuclear power plants (NPPs) [2] are incorporated within various dynamic PRA methodologies [3] to provide better estimates of the probability of failure. Aging and degradation are modeled within the Level-1 PRA framework and applied to several failure modes of pumps; the approach can be extended to a range of components, viz. valves, generators, batteries, and pipes.
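
    One hedged way to picture the integration: a condition-informed aging model updates a component failure probability that then feeds an ordinary fault-tree evaluation. The Weibull parameters and the health-index mapping below are hypothetical.

    ```python
    import numpy as np

    def weibull_unreliability(t_years, shape=2.5, scale=30.0):
        """Time-dependent failure probability with aging (shape > 1)."""
        return 1 - np.exp(-(t_years / scale) ** shape)

    def effective_age(t_years, health_index):
        """Degraded components (low health index) behave 'older' than calendar age."""
        return t_years / max(health_index, 1e-6)

    # Two redundant pumps feeding an AND gate; pump B shows degradation.
    t = 15.0
    p_a = weibull_unreliability(effective_age(t, health_index=1.0))
    p_b = weibull_unreliability(effective_age(t, health_index=0.7))
    print(f"pump A: {p_a:.3f}, pump B (degraded): {p_b:.3f}")
    print(f"system (both pumps fail): {p_a * p_b:.4f}")
    ```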

  1. Finite element probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacvarov, D.C.

    1981-01-01

A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for analysis of three-dimensional spaces in which the parameters, such as the trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, yielding high accuracy in critical regions, such as the area of the low-probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem at a manageably low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very low probability events.

  2. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
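
    A small numerical companion to the occurrence model: for a non-homogeneous Poisson process, the probability of at least one SPE in a mission window is one minus the exponential of minus the integrated intensity. The intensity curve below is a made-up stand-in for the fitted historical model.

    ```python
    import numpy as np

    # Assumed SPE intensity over an 11-year solar cycle (events per month).
    t = np.linspace(0, 132, 1321)                  # months
    intensity = 0.05 + 0.45 * np.sin(np.pi * t / 132) ** 2

    def p_at_least_one(t0, t1):
        mask = (t >= t0) & (t <= t1)
        # Trapezoidal integral of the intensity over the mission window.
        mu = np.sum(0.5 * (intensity[mask][1:] + intensity[mask][:-1])
                    * np.diff(t[mask]))
        return 1 - np.exp(-mu), mu

    for t0, dur in [(0, 6), (60, 6), (60, 30)]:    # near minimum vs maximum
        p, mu = p_at_least_one(t0, t0 + dur)
        print(f"start month {t0:3d}, {dur:2d}-month mission: "
              f"expected events {mu:4.2f}, P(>=1 SPE) = {p:.2f}")
    ```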

  3. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks, including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than several hundred MeV), and galactic cosmic rays (GCRs), which include high-energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model fitted to the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons, especially at high energies, are important for assessing the cancer risk associated with energetic particles in large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation supports mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation environment will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  4. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  5. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, their ability to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
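
    Read multiplicatively, the WMO form amounts to an expected-loss calculation; a minimal sketch with invented numbers:

```python
def risk(hazard, vulnerability, exposure):
    """Crichton-style multiplicative risk triangle (the WMO form).

    hazard:        annual probability of the damaging event (0..1)
    vulnerability: fraction of exposed value lost given the event (0..1)
    exposure:      value at risk (e.g., dollars)
    """
    return hazard * vulnerability * exposure

# A 1%-per-year event that destroys 30% of a $10M asset:
print(risk(0.01, 0.30, 10e6))   # expected annual loss: $30,000
```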

  6. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
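
    The integration step, combining a probabilistic exposure distribution with an exposure-response relationship, can be sketched with Monte Carlo; the logistic response curve and exposure distribution below are hypothetical placeholders, not the report's fitted models:

```python
import math, random

random.seed(2)

# Hypothetical logistic exposure-response for an acute endpoint at
# moderate exertion (parameters are illustrative only).
def response_prob(ozone_ppm):
    return 1.0 / (1.0 + math.exp(-(ozone_ppm - 0.12) / 0.02))

# Hypothetical population exposure distribution under a given standard:
# triangular around 0.08 ppm, with occasional high-end exposures.
def sample_exposure():
    return random.triangular(0.02, 0.16, 0.08)

N = 200_000
population = 1_000_000          # hypothetical exposed population
mean_risk = sum(response_prob(sample_exposure()) for _ in range(N)) / N
print(f"expected cases ~ {mean_risk * population:,.0f}")
```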

  7. A mediation model to explain decision making under conditions of risk among adolescents: the role of fluid intelligence and probabilistic reasoning.

    PubMed

    Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina

    2014-01-01

This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) was explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability, which acted as a mediator. Specifically, fluid intelligence may enhance the ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by strengthening the mathematical prerequisites needed to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
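
    The reported analysis can be mimicked with a product-of-coefficients mediation estimate and a percentile bootstrap on synthetic data; all effect sizes below are invented, not the study's estimates:

```python
import random, statistics as st

random.seed(3)

def cov(u, v):
    mu, mv = st.fmean(u), st.fmean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def indirect_effect(X, M, Y):
    """Product-of-coefficients a*b for the mediation path X -> M -> Y."""
    a = cov(X, M) / cov(X, X)                                   # X -> M
    det = cov(X, X) * cov(M, M) - cov(X, M) ** 2
    b = (cov(M, Y) * cov(X, X) - cov(X, Y) * cov(X, M)) / det   # M -> Y | X
    return a * b

# Synthetic cohort mimicking the design: fluid intelligence (X) raises
# probabilistic reasoning (M), which raises advantageous choices (Y).
n = 282
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.6 * x + random.gauss(0, 0.8) for x in X]
Y = [0.5 * m + 0.1 * x + random.gauss(0, 0.9) for x, m in zip(X, M)]

# Percentile bootstrap for a 95% confidence interval (same resampled
# indices applied to all three variables).
boots = []
for _ in range(2000):
    idx = [random.randrange(n) for _ in range(n)]
    boots.append(indirect_effect([X[i] for i in idx],
                                 [M[i] for i in idx],
                                 [Y[i] for i in idx]))
boots.sort()
print(f"indirect effect {indirect_effect(X, M, Y):.3f}, "
      f"95% CI [{boots[49]:.3f}, {boots[1949]:.3f}]")
```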

  8. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  9. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
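
    The dynamic event tree idea, branching on component behavior at times dictated by the simulated physics, can be sketched with a toy model; the temperature response, branching probabilities, and damage criterion below are invented stand-ins for SAS4A/SASSYS-1 and ADAPT:

```python
# Each branching point asks the "simulator" (here a toy temperature model)
# for the plant state, then splits the scenario on component behavior.
# All physics and probabilities are invented for illustration.

def temperature(t, pump_ok):
    return 400 + (0.5 if pump_ok else 2.0) * t   # deg C, toy model

def expand(t, pump_ok, prob, history, leaves, dt=10.0, t_end=60.0):
    """Depth-first dynamic event tree: branch on pump state every dt."""
    if temperature(t, pump_ok) > 500:            # damage criterion
        leaves.append((prob, history + [f"damage@{t:.0f}s"]))
        return
    if t >= t_end:
        leaves.append((prob, history + ["ok"]))
        return
    if pump_ok:
        expand(t + dt, True, prob * 0.999, history + ["pump runs"], leaves)
        expand(t + dt, False, prob * 0.001, history + ["pump fails"], leaves)
    else:
        expand(t + dt, False, prob, history + ["pump down"], leaves)

leaves = []
expand(0.0, True, 1.0, [], leaves)
p_damage = sum(p for p, h in leaves if h[-1].startswith("damage"))
print(f"{len(leaves)} end states, P(damage) = {p_damage:.2e}")
```

    Unlike a static event tree, the branch outcomes here depend on when the failure occurs, which is the property that makes the order and timing of events matter.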

  10. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  11. Diversity in times of adversity: probabilistic strategies in microbial survival games.

    PubMed

    Wolf, Denise M; Vazirani, Vijay V; Arkin, Adam P

    2005-05-21

    Population diversification strategies are ubiquitous among microbes, encompassing random phase-variation (RPV) of pathogenic bacteria, viral latency as observed in some bacteriophage and HIV, and the non-genetic diversity of bacterial stress responses. Precise conditions under which these diversification strategies confer an advantage have not been well defined. We develop a model of population growth conditioned on dynamical environmental and cellular states. Transitions among cellular states, in turn, may be biased by possibly noisy readings of the environment from cellular sensors. For various types of environmental dynamics and cellular sensor capability, we apply game-theoretic analysis to derive the evolutionarily stable strategy (ESS) for an organism and determine when that strategy is diversification. We find that: (1) RPV, effecting a sort of Parrondo paradox wherein random alternations between losing strategies produce a winning strategy, is selected when transitions between different selective environments cannot be sensed, (2) optimal RPV cell switching rates are a function of environmental lifecycle asymmetries and environmental autocorrelation, (3) probabilistic diversification upon entering a new environment is selected when sensors can detect environmental transitions but have poor precision in identifying new environments, and (4) in the presence of excess additive noise, low-pass filtering is required for evolutionary stability. We show that even when RPV is not the ESS, it may minimize growth rate variance and the risk of extinction due to 'unlucky' environmental dynamics.
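
    Finding (1), that unsensed environment switching favors random phase-variation, can be reproduced qualitatively with a two-phenotype growth simulation; the growth rates and flip probabilities below are illustrative only:

```python
import math, random

random.seed(4)

# Two phenotypes, two environments. Each phenotype grows well only in
# "its" environment; the environment flips without warning (no sensing),
# the regime in which random phase-variation is favored. Numbers invented.
GROWTH = {("A", 1): 2.0, ("A", 2): 0.1, ("B", 1): 0.1, ("B", 2): 2.0}

def long_run_growth(switch_rate, steps=20000, p_env_flip=0.05):
    env, frac_a, log_growth = 1, 0.5, 0.0
    for _ in range(steps):
        if random.random() < p_env_flip:
            env = 3 - env                          # flip environment 1 <-> 2
        wa = GROWTH[("A", env)] * frac_a
        wb = GROWTH[("B", env)] * (1 - frac_a)
        log_growth += math.log(wa + wb)
        frac_a = wa / (wa + wb)
        # Random phase variation: a fraction of each lineage switches type.
        frac_a = frac_a * (1 - switch_rate) + (1 - frac_a) * switch_rate
    return log_growth / steps

for r in (0.0, 0.01, 0.05, 0.2):
    print(f"switch rate {r:4.2f}: long-run growth {long_run_growth(r):+.3f}")
```

    Moderate switching rates, roughly matched to how often the environment flips, should outperform both no switching and excessive switching, echoing the paper's point (2) about optimal RPV rates.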

  12. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report presents new methods based on synergetic, physical, and computational models, and concentrates on four approaches. The first is the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered in relation to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns dynamic probabilistic risk analysis of TPS elements with complex damage characterizations, using a physical model of the TPS system and a predicted level of ionizing radiation and space weather. The focus is mainly on the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering both deterministic and stochastic factors. The fourth approach presents experimental results on the temperature distribution over the surface of a 150 x 150 x 20 mm honeycomb sandwich panel during diffusion welding in vacuum. Equipment that equalizes temperature fields in the workpiece to form welded joints of uniform strength is also considered. Many tasks in computational materials science can be posed as optimization problems, including the generation of realizations of materials with specified but limited microstructural information: an intriguing inverse problem of both fundamental and practical importance. Computational models based upon the theories of molecular dynamics or quantum mechanics would enable the prediction and modification of fundamental materials properties. This problem is solved using deterministic and stochastic optimization techniques. The main optimization approaches in the frame of the LIGHT-TPS project are discussed, including an optimization approach to alloys for obtaining materials with required properties using modeling techniques and experimental data.

  13. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system, which also includes probabilistic structural analyses and reliability and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  14. Systems Reliability Framework for Surface Water Sustainability and Risk Management

    NASA Astrophysics Data System (ADS)

    Myers, J. R.; Yeghiazarian, L.

    2016-12-01

With microbial contamination posing a serious threat to the availability of clean water across the world, it is necessary to develop a framework that evaluates the safety and sustainability of water systems with respect to non-point source fecal microbial contamination. The concept of water safety is closely related to the concept of failure in reliability theory. In water quality problems, the event of failure can be defined as the concentration of microbial contamination exceeding a certain standard for usability of water. It is pertinent in watershed management to know the likelihood of such an event of failure occurring at a particular point in space and time. Microbial fate and transport are driven by environmental processes taking place in complex, multi-component, interdependent environmental systems that are dynamic and spatially heterogeneous, which means these processes and therefore their influences upon microbial transport must be considered stochastic and variable through space and time. A physics-based stochastic model of microbial dynamics is presented that propagates uncertainty using a unique sampling method based on artificial neural networks to produce a correlation between watershed characteristics and spatial-temporal probabilistic patterns of microbial contamination. These results are used to address the question of water safety through several sustainability metrics: reliability, vulnerability, resilience, and a composite sustainability index. System reliability is described uniquely through the temporal evolution of risk along watershed points or pathways. Probabilistic resilience describes how long the system is above a certain probability of failure, and the vulnerability metric describes how the temporal evolution of risk changes throughout a hierarchy of failure levels. Additionally, our approach allows for the identification of contributions to microbial contamination and uncertainty from specific pathways and sources. We expect that this framework will significantly improve the efficiency and precision of sustainable watershed management strategies by providing a better understanding of how watershed characteristics and environmental parameters affect surface water quality and sustainability.
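
    One conventional way to compute such metrics from a probability-of-failure time series is sketched below; these are generic Loucks-style definitions with invented numbers, and the paper's exact formulations may differ:

```python
# Toy daily series of P(failure): the probability that microbial
# concentration exceeds the water-quality standard at a watershed point.
# Values are invented; the framework derives them from the stochastic model.
p_fail = [0.02, 0.05, 0.30, 0.65, 0.40, 0.15, 0.08, 0.03, 0.02, 0.01]
THRESHOLD = 0.25          # "failed" when exceedance probability tops this

failed = [p > THRESHOLD for p in p_fail]

reliability = 1 - sum(failed) / len(failed)        # fraction of safe days
vulnerability = max(p_fail)                        # worst-case risk level
# Resilience: inverse of the longest unbroken failure run (shorter = better).
longest = run = 0
for f in failed:
    run = run + 1 if f else 0
    longest = max(longest, run)
resilience = 1 / longest if longest else 1.0
# Composite index as a geometric mean (one common convention).
sustainability = (reliability * (1 - vulnerability) * resilience) ** (1 / 3)

print(f"reliability={reliability:.2f} vulnerability={vulnerability:.2f} "
      f"resilience={resilience:.2f} index={sustainability:.2f}")
```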

  15. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring

    2015-07-01

The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk-margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  16. System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian

In response to the expansion of nuclear fuel cycle (NFC) activities -- and the associated suite of risks -- around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal and multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that interdependency between safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is independently evaluated. Two novel system-theoretic analysis techniques -- dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA) -- provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need -- and provide a way -- to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.

  17. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  18. 76 FR 11525 - Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA), Room T-2B1, 11545 Rockville Pike, Rockville, Maryland...

  19. 76 FR 22934 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on May 11, 2011, Room T-2B3, 11545...

  20. 76 FR 71609 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-18

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on December 14, 2011, Room T-2B3...

  1. Adolescents' Heightened Risk-Seeking in a Probabilistic Gambling Task

    ERIC Educational Resources Information Center

    Burnett, Stephanie; Bault, Nadege; Coricelli, Giorgio; Blakemore, Sarah-Jayne

    2010-01-01

This study investigated adolescent males' decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 pre-adolescents (aged 9-11), 26 young adolescents (aged 12-15), 20 mid-adolescents (aged 15-18) and 17…

  2. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part II: Inundation Modelling and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.

    2013-09-01

Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the hazard from regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events was modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale were determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.

  3. Assessing and Mitigating Hurricane Storm Surge Risk in a Changing Environment

    NASA Astrophysics Data System (ADS)

    Lin, N.; Shullman, E.; Xian, S.; Feng, K.

    2017-12-01

    Hurricanes have induced devastating storm surge flooding worldwide. The impacts of these storms may worsen in the coming decades because of rapid coastal development coupled with sea-level rise and possibly increasing storm activity due to climate change. Major advances in coastal flood risk management are urgently needed. We present an integrated dynamic risk analysis for flooding task (iDraft) framework to assess and manage coastal flood risk at the city or regional scale, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. We apply the framework to New York City. First, we combine climate-model projected storm surge climatology and sea-level rise with engineering- and social/economic-model projected coastal exposure and vulnerability to estimate the flood damage risk for the city over the 21st century. We derive temporally-varying risk measures such as the annual expected damage as well as temporally-integrated measures such as the present value of future losses. We also examine the individual and joint contributions to the changing risk of the three dynamic factors (i.e., sea-level rise, storm change, and coastal development). Then, we perform probabilistic cost-benefit analysis for various coastal flood risk mitigation strategies for the city. Specifically, we evaluate previously proposed mitigation measures, including elevating houses on the floodplain and constructing flood barriers at the coast, by comparing their estimated cost and probability distribution of the benefit (i.e., present value of avoided future losses). We also propose new design strategies, including optimal design (e.g., optimal house elevation) and adaptive design (e.g., flood protection levels that are designed to be modified over time in a dynamic and uncertain environment).
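
    The temporally-integrated risk measure can be sketched as a discounted sum of annual expected damages; the damage growth path and discount rate below are placeholders, not the study's projections:

```python
# Annual expected damage grows as sea-level rise and coastal development
# compound; the base value, growth rate, and discount rate are invented.
def expected_annual_damage(year):
    base, growth = 100e6, 0.03            # $100M in 2020, 3%/yr escalation
    return base * (1 + growth) ** (year - 2020)

def present_value(first=2020, last=2100, discount=0.04):
    return sum(expected_annual_damage(y) / (1 + discount) ** (y - first)
               for y in range(first, last + 1))

print(f"PV of 2020-2100 losses: ${present_value() / 1e9:.1f}B")
# A mitigation option looks attractive when its cost falls below the PV of
# avoided losses; the paper compares against the benefit's full distribution.
```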

  4. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri, L.

    2011-01-01

This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for shuttle design and operation. PRA is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? And what are the consequences that could result if it occurs? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  5. Probabilistic Description of the Hydrologic Risk in Agriculture

    NASA Astrophysics Data System (ADS)

    Vico, G.; Porporato, A. M.

    2011-12-01

Supplemental irrigation represents one of the main strategies to mitigate the effects of climatic variability on agroecosystem productivity and profitability, at the expense of increasing water requirements for irrigation purposes. Optimizing water allocation for crop yield preservation and sustainable development needs to account for hydro-climatic variability, which is by far the main source of uncertainty affecting crop yields and irrigation water requirements. In this contribution, a widely applicable probabilistic framework is proposed to quantitatively define the hydrologic risk of yield reduction for both rainfed and irrigated agriculture. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season. Based on these linkages, long-term and real-time yield reduction risk indices are defined as a function of climate, soil, and crop parameters, as well as irrigation strategy. The former risk index is suitable for long-term irrigation strategy assessment and investment planning, while the latter provides a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season. This probabilistic framework also allows assessing the impact of limited water availability on crop yield, thus guiding the optimal allocation of water resources for human and environmental needs. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios, facilitating the assessment of the impact of increasingly frequent water shortages on agricultural productivity, profitability, and sustainability.

  6. The benefits of probabilistic exposure assessment: three case studies involving contaminated air, water, and soil.

    PubMed

    Finley, B; Paustenbach, D

    1994-02-01

Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
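
    The contrast between stacked point estimates and Monte Carlo percentiles is easy to reproduce in miniature; the distributions and potency value below are illustrative stand-ins, not the case-study inputs:

```python
import random

random.seed(5)

# Risk = potency x concentration x intake / body weight. Point-estimate
# practice stacks upper-bound values for every input at once; Monte Carlo
# samples each variable's distribution instead. All values are invented.
POTENCY = 5e-6                                  # (mg/kg-day)^-1, held fixed

def sampled_risk():
    conc = random.lognormvariate(0, 0.5)        # mg/L in tapwater
    intake = random.triangular(0.5, 3.0, 1.4)   # L/day
    weight = random.gauss(70, 12)               # kg
    return POTENCY * conc * intake / max(weight, 30)

risks = sorted(sampled_risk() for _ in range(100_000))
p95 = risks[int(0.95 * len(risks))]

# "Point estimate": an upper-bound value for every input simultaneously.
point = POTENCY * 2.3 * 2.7 / 50

print(f"Monte Carlo 95th pct: {p95:.2e}, stacked point estimate: {point:.2e}, "
      f"ratio {point / p95:.1f}x")
```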

  7. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits, and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g., fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks by a large margin ($33 million). Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrates the relevance of accounting for the full range of flood events and their relation to both potential damages and benefits in risk assessments. Management measures may thus be designed to reflect local contexts and support benefits of natural hydrologic processes, while minimizing flood damage.

  8. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  9. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  10. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2007-01-01

A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  11. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
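
    A minimal example in the spirit of such guidance is the conjugate beta-binomial update for a failure-on-demand probability; the Jeffreys prior choice and the counts below are hypothetical:

```python
import random

random.seed(6)

# Conjugate beta-binomial update for a component failure-on-demand
# probability, a workhorse data-analysis pattern in PRA. Counts invented.
a0, b0 = 0.5, 0.5                  # Jeffreys prior, Beta(1/2, 1/2)
failures, demands = 2, 487         # hypothetical plant-specific evidence
a1, b1 = a0 + failures, b0 + (demands - failures)

print(f"posterior mean: {a1 / (a1 + b1):.2e}")

# 90% credible interval by sampling the Beta posterior.
draws = sorted(random.betavariate(a1, b1) for _ in range(20000))
print(f"90% credible interval: [{draws[1000]:.2e}, {draws[19000]:.2e}]")
```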

  12. Threatened species and the potential loss of phylogenetic diversity: conservation scenarios based on estimated extinction probabilities and phylogenetic risk analysis.

    PubMed

    Faith, Daniel P

    2008-12-01

    New species conservation strategies, including the EDGE of Existence (EDGE) program, have expanded threatened species assessments by integrating information about species' phylogenetic distinctiveness. Distinctiveness has been measured through simple scores that assign shared credit among species for evolutionary heritage represented by the deeper phylogenetic branches. A species with a high score combined with a high extinction probability receives high priority for conservation efforts. Simple hypothetical scenarios for phylogenetic trees and extinction probabilities demonstrate how such scoring approaches can provide inefficient priorities for conservation. An existing probabilistic framework derived from the phylogenetic diversity measure (PD) properly captures the idea of shared responsibility for the persistence of evolutionary history. It avoids static scores, takes into account the status of close relatives through their extinction probabilities, and allows for the necessary updating of priorities in light of changes in species threat status. A hypothetical phylogenetic tree illustrates how changes in extinction probabilities of one or more species translate into changes in expected PD. The probabilistic PD framework provided a range of strategies that moved beyond expected PD to better consider worst-case PD losses. In another example, risk aversion gave higher priority to a conservation program that provided a smaller, but less risky, gain in expected PD. The EDGE program could continue to promote a list of top species conservation priorities through application of probabilistic PD and simple estimates of current extinction probability. The list might be a dynamic one, with all the priority scores updated as extinction probabilities change. Results of recent studies suggest that estimation of extinction probabilities derived from the red list criteria linked to changes in species range sizes may provide estimated probabilities for many different species. Probabilistic PD provides a framework for single-species assessment that is well-integrated with a broader measurement of impacts on PD owing to climate change and other factors.
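
    The expected-PD idea can be sketched on a toy tree: a branch's length is retained if at least one descendant species survives, so expected PD sums branch lengths weighted by those survival probabilities. The tree, lengths, and extinction probabilities below are invented:

```python
# Each branch: (length in Myr, species whose survival keeps the branch).
BRANCHES = [
    (10.0, ["A"]), (10.0, ["B"]), (25.0, ["A", "B"]),   # cherry (A, B)
    (40.0, ["C"]),                                       # distinct lineage
]
P_EXTINCT = {"A": 0.8, "B": 0.7, "C": 0.4}

def expected_pd(p_extinct):
    total = 0.0
    for length, clade in BRANCHES:
        p_all_lost = 1.0
        for sp in clade:
            p_all_lost *= p_extinct[sp]
        total += length * (1 - p_all_lost)   # branch kept if any sp survives
    return total

base = expected_pd(P_EXTINCT)
print(f"E[PD] = {base:.1f} Myr")
# Compare conservation actions that cut a species' extinction risk to 0.1:
for sp in ("A", "C"):
    scenario = dict(P_EXTINCT, **{sp: 0.1})
    print(sp, f"-> gain {expected_pd(scenario) - base:.1f} Myr")
```

    Note how the gain from protecting a species depends on its relatives' fates, which is exactly what static distinctiveness scores miss.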

  13. 76 FR 18586 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on April 20, 2011, Room T-2B1, 11545...

  14. Lossed in translation: an off-the-shelf method to recover probabilistic beliefs from loss-averse agents.

    PubMed

    Offerman, Theo; Palley, Asa B

    2016-01-01

Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of 1/2, and (ii) for moderate beliefs agents simply report 1/2. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
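
    The baseline incentive logic, that a risk-neutral agent maximizes a proper scoring rule by reporting truthfully while a loss-averse agent shades toward 1/2, can be checked numerically; the kinked utility below is a crude prospect-theory-style stand-in, not the authors' model:

```python
# Expected payoff of reporting r when the true belief is p, under the
# quadratic scoring rule S(r, outcome) = 1 - (outcome - r)^2.
def expected_payoff(r, p, utility=lambda s: s):
    return p * utility(1 - (1 - r) ** 2) + (1 - p) * utility(1 - r ** 2)

def best_report(p, utility=lambda s: s):
    grid = [k / 1000 for k in range(1001)]
    return max(grid, key=lambda r: expected_payoff(r, p, utility))

p = 0.8
print("risk-neutral report:", best_report(p))          # -> 0.8 (truthful)

# Kinked utility: scores below a 0.75 reference point weigh 2.25x more
# (illustrative loss-aversion parameters, not fitted values).
kink = lambda s: (s - 0.75) if s >= 0.75 else 2.25 * (s - 0.75)
print("loss-averse report:", best_report(p, kink))     # shaded toward 1/2
```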

  15. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.

  16. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.

  17. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break-LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and data bases and special applications.

  18. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.

  19. Dynamic Modeling of Ascent Abort Scenarios for Crewed Launches

    NASA Technical Reports Server (NTRS)

    Bigler, Mark; Boyer, Roger L.

    2015-01-01

For the last 30 years, the United States' human space program has been focused on low Earth orbit exploration and operations with the Space Shuttle and International Space Station programs. After over 40 years, the U.S. is again working to return humans beyond Earth orbit. To do so, NASA is developing a new launch vehicle and spacecraft to provide this capability. The launch vehicle is referred to as the Space Launch System (SLS) and the spacecraft is called Orion. The new launch system is being developed with an abort system that will enable the crew to escape launch failures that would otherwise be catastrophic, as well as probabilistic design requirements set for the probability of loss of crew (LOC) and loss of mission (LOM). In order to optimize the risk associated with designing this new launch system, as well as verifying the associated requirements, NASA has developed a comprehensive Probabilistic Risk Assessment (PRA) of the integrated ascent phase of the mission that includes the launch vehicle, spacecraft, and ground launch facilities. Given the dynamic nature of rocket launches and the potential for things to go wrong, developing a PRA to assess the risk can be a very challenging effort. Prior to launch and after the crew has boarded the spacecraft, the risk exposure time can be on the order of three hours. During this time, events may initiate from the spacecraft, the launch vehicle, or the ground systems, thus requiring an emergency egress from the spacecraft to a safe ground location or a pad abort via the spacecraft's launch abort system. Following launch, again either the spacecraft or the launch vehicle can initiate the need for the crew to abort the mission and return home. Obviously, there are thousands of scenarios whose outcome depends on when the abort is initiated during ascent and how the abort is performed. This includes modeling the risk associated with explosions and benign system failures that require aborting a spacecraft under very dynamic conditions, particularly in the lower atmosphere, and returning the crew home safely. This paper will provide an overview of the PRA model that has been developed of this new launch system, including some of the challenges that are associated with this effort.

  20. Dynamic Modeling of Ascent Abort Scenarios for Crewed Launches

    NASA Technical Reports Server (NTRS)

    Bigler, Mark; Boyer, Roger L.

    2015-01-01

For the last 30 years, the United States' human space program has been focused on low Earth orbit exploration and operations with the Space Shuttle and International Space Station programs. After nearly 50 years, the U.S. is again working to return humans beyond Earth orbit. To do so, NASA is developing a new launch vehicle and spacecraft to provide this capability. The launch vehicle is referred to as the Space Launch System (SLS) and the spacecraft is called Orion. The new launch system is being developed with an abort system that will enable the crew to escape launch failures that would otherwise be catastrophic, as well as probabilistic design requirements set for the probability of loss of crew (LOC) and loss of mission (LOM). In order to optimize the risk associated with designing this new launch system, as well as verifying the associated requirements, NASA has developed a comprehensive Probabilistic Risk Assessment (PRA) of the integrated ascent phase of the mission that includes the launch vehicle, spacecraft and ground launch facilities. Given the dynamic nature of rocket launches and the potential for things to go wrong, developing a PRA to assess the risk can be a very challenging effort. Prior to launch and after the crew has boarded the spacecraft, the risk exposure time can be on the order of three hours. During this time, events may initiate from the spacecraft, the launch vehicle, or the ground systems, thus requiring an emergency egress from the spacecraft to a safe ground location or a pad abort via the spacecraft's launch abort system. Following launch, again either the spacecraft or the launch vehicle can initiate the need for the crew to abort the mission and return home. Obviously, there are thousands of scenarios whose outcome depends on when the abort is initiated during ascent and how the abort is performed. This includes modeling the risk associated with explosions and benign system failures that require aborting a spacecraft under very dynamic conditions, particularly in the lower atmosphere, and returning the crew home safely. This paper will provide an overview of the PRA model that has been developed of this new launch system, including some of the challenges that are associated with this effort. Key Words: PRA, space launches, human space program, ascent abort, spacecraft, launch vehicles

  1. Impact of Atmospheric Aerosols on Solar Photovoltaic Electricity Generation in China

    NASA Astrophysics Data System (ADS)

    Li, X.; Mauzerall, D. L.; Wagner, F.; Peng, W.; Yang, J.

    2016-12-01


  2. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method that accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta=25%)-closest to the source area-followed by the inner estuary (delta=4%) and the Floodvale tributary (delta=2%), with the lowest risk in the outer estuary (delta=0.1%), farthest from the source area. Going from the screening level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
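
    The joint probability curve method integrates the exposure concentration distribution against the species sensitivity (toxicity) distribution, so the risk delta is the expected fraction of species affected. A minimal sketch under assumed lognormal distributions (all parameters invented for illustration, not taken from the study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical lognormal exposure distribution (VCH concentration, ug/L)
exposure = stats.lognorm(s=1.2, scale=50.0)
# Hypothetical lognormal species sensitivity distribution built from NOEC data
ssd = stats.lognorm(s=0.8, scale=400.0)

# delta = integral of P(species affected at concentration c) over the exposure
# pdf, estimated here by Monte Carlo sampling of exposure concentrations.
c = exposure.rvs(size=100_000, random_state=rng)
delta = ssd.cdf(c).mean()
print(f"risk delta = {100 * delta:.1f}%")
```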

  3. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with environmental pollutants present. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entry of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous trend. However, the current levels of pollution do not constitute a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to take into account in risk assessment processes.

  4. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
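
    The modularized simulation described can be caricatured in a few lines: Poisson impact counts over the period, power-law diameters, and casualties from an assumed damage-area and population-density model, yielding a casualty distribution rather than a point estimate. Every number below is a placeholder, not a value from the article:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sim, horizon = 20_000, 100             # 100-year simulation window

rate = 0.2                               # assumed impacts >10 m per year, worldwide
casualties = np.zeros(n_sim)
for i in range(n_sim):
    n = rng.poisson(rate * horizon)                  # impact count over horizon
    d = 10.0 * (1 - rng.random(n)) ** (-1 / 2.7)     # diameters (m), Pareto-like
    area = 0.5 * d ** 2                              # hypothetical damage area (km^2)
    density = rng.lognormal(np.log(50), 1.5, n)      # people/km^2 at impact site
    casualties[i] = (area * density).sum()

# Distributional risk metrics rather than a single mean value
print("median / 90th / 99th percentile casualties:",
      np.percentile(casualties, [50, 90, 99]))
```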

  5. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  6. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probability distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproductive effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect the survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rats, white-throated woodrats, deer, and milfoil, the body burden concentrations observed fall within the distributions simulated at both sites.

  7. Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires

    Treesearch

    Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes

    2010-01-01

    Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...

  8. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of the fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, the very hard multivariate integration and the approximate interpolation of the multivariate functions involved. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
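
    The FPD cost referred to here is the Kullback-Leibler divergence of the closed-loop description from the desired one; for discretized distributions it is a one-liner. A sketch with invented distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i ln(p_i / q_i); the quantity FPD minimizes."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                       # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical discretized closed-loop state distributions
desired = np.array([0.70, 0.20, 0.08, 0.02])     # ideal closed loop
achieved = np.array([0.55, 0.25, 0.15, 0.05])    # under a candidate controller
print(f"KL(closed loop || desired) = {kl_divergence(achieved, desired):.4f} nats")
```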

  9. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  10. Evidence-based risk communication: a systematic review.

    PubMed

    Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A

    2014-08-19

    Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
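
    The absolute and relative risk formats compared in these studies are related by simple arithmetic, which also shows why relative risk reductions tend to look more persuasive. A worked toy example with hypothetical event rates:

```python
# Hypothetical trial outcome: event rate 20% (control) vs 15% (treated).
p_control, p_treated = 0.20, 0.15

arr = p_control - p_treated          # absolute risk reduction: 0.05
rrr = arr / p_control                # relative risk reduction: 0.25
nnt = 1 / arr                        # number needed to treat: 20

print(f"ARR = {arr:.0%}  (5 fewer events per 100 patients)")
print(f"RRR = {rrr:.0%}  (sounds larger, yet same underlying effect)")
print(f"NNT = {nnt:.0f}   (treat 20 patients to prevent one event)")
```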

  11. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is no longer simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) a lack of knowledge of the input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of the random and non-random uncertainties associated with the input parameters of a health risk model. The LandSim 2.5 landfill simulator has been used to simulate the Turbhe landfill (Navi Mumbai, India) activities for various time horizons. The LandSim-simulated concentrations of six heavy metals in ground water have then been used in the health risk model. The water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and the uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female populations has been quantified and found to be high (HI>1) for all the considered time horizons, which clearly indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
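
    The hybrid propagation described can be sketched as follows: fuzzy inputs are reduced to intervals at a chosen alpha-cut, probabilistic inputs are sampled, and each Monte Carlo draw then yields an interval-valued hazard quotient, so the exceedance probability comes out as a bound pair. All distributions and the reference dose below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 0.5   # chosen alpha-cut level for the fuzzy variables

# A triangular fuzzy number (low, mode, high) reduced to its alpha-cut interval.
def alpha_cut(low, mode, high, a):
    return low + a * (mode - low), high - a * (high - mode)

intake_lo, intake_hi = alpha_cut(1.0, 2.0, 3.0, alpha)   # L/day, fuzzy
rfd = 0.02                                               # mg/kg-day, assumed

n = 50_000
conc = rng.lognormal(np.log(0.05), 0.6, n)               # mg/L, probabilistic
weight = rng.normal(60, 8, n)                            # kg, probabilistic

# Each Monte Carlo draw propagates the fuzzy interval -> interval-valued HI
hi_lo = conc * intake_lo / (weight * rfd)
hi_hi = conc * intake_hi / (weight * rfd)
print(f"P(HI > 1) lies between {np.mean(hi_lo > 1):.3f} and {np.mean(hi_hi > 1):.3f}")
```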

  12. On a true value of risk

    NASA Astrophysics Data System (ADS)

    Kozine, Igor

    2018-04-01

    The paper suggests looking at probabilistic risk quantities and concepts through the prism of accepting one of two views: whether a true value of risk exists or not. It is argued that discussions until now have been primarily focused on closely related topics that are different from the topic of the current paper. The paper examines the operational consequences of adhering to each of the views and contrasts them. It is demonstrated that there are tangible operational differences in which probabilistic measures can be assessed, how they can be assessed, and how they can be interpreted. In particular, this concerns prediction intervals, the use of Bayes' rule, models of complete ignorance, hierarchical models of uncertainty, the assignment of probabilities over a possibility space, and the interpretation of derived probabilistic measures. Behavioural implications of favouring either view are also briefly described.

  13. A regional-scale ecological risk framework for environmental flow evaluations

    NASA Astrophysics Data System (ADS)

    O'Brien, Gordon C.; Dickens, Chris; Hines, Eleanor; Wepener, Victor; Stassen, Retha; Quayle, Leo; Fouchy, Kelly; MacKenzie, James; Graham, P. Mark; Landis, Wayne G.

    2018-02-01

    Environmental flow (E-flow) frameworks advocate holistic, regional-scale, probabilistic E-flow assessments that consider flow and non-flow drivers of change in a socio-ecological context as best practice. Regional-scale ecological risk assessments of multiple stressors to social and ecological endpoints, which address ecosystem dynamism, have been undertaken internationally at different spatial scales using the relative-risk model since the mid-1990s. With the recent incorporation of Bayesian belief networks into the relative-risk model, a robust regional-scale ecological risk assessment approach is available that can contribute to achieving the best practice recommendations of E-flow frameworks. PROBFLO is a holistic E-flow assessment method that incorporates the relative-risk model and Bayesian belief networks (BN-RRM) into a transparent probabilistic modelling tool that addresses uncertainty explicitly. PROBFLO has been developed to evaluate the socio-ecological consequences of historical, current and future water resource use scenarios and to generate E-flow requirements on regional spatial scales. The approach has been implemented in two regional-scale case studies in Africa where its flexibility and functionality has been demonstrated. In both case studies the evidence-based outcomes facilitated informed environmental management decision making, with trade-off considerations in the context of social and ecological aspirations. This paper presents the PROBFLO approach as applied to the Senqu River catchment in Lesotho, and further developments and application in the Mara River catchment in Kenya and Tanzania. The 10 BN-RRM procedural steps incorporated in PROBFLO are demonstrated with examples from both case studies. PROBFLO can contribute to the adaptive management of water resources, guide the allocation of resources for their sustainable use, and address protection requirements.

  14. Differential equation models for sharp threshold dynamics.

    PubMed

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.

  15. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
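
    A probabilistic collocation fit of a chaos expansion can be sketched in a few lines: evaluate the model at sampled standard-normal points, regress onto probabilists' Hermite polynomials, and read the mean and variance off the coefficients. The "model" below is a stand-in toy function, not the hydrological model of the study:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(4)

# Stand-in "hydrological model": output as a nonlinear function of an
# uncertain parameter xi ~ N(0, 1).
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

# Probabilistic collocation: evaluate the model at sample points and fit
# probabilists' Hermite polynomials He_0..He_4 by least squares.
xi = rng.standard_normal(200)
A = hermevander(xi, 4)                       # design matrix of He_k(xi)
coef, *_ = np.linalg.lstsq(A, model(xi), rcond=None)

# PCE mean is the He_0 coefficient; variance = sum_{k>=1} k! * c_k^2
var = sum(factorial(k) * coef[k] ** 2 for k in range(1, 5))
print(f"PCE mean ~ {coef[0]:.4f}, PCE variance ~ {var:.4f}")
```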

  16. Costing the satellite power system

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1978-01-01

    The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
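
    A Monte Carlo cost-risk rollup of the kind described replaces point estimates with sampled component costs. A minimal sketch; the components and their triangular distributions are invented, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical cost components as triangular distributions (low, likely, high; $B).
components = {"satellite": (8, 10, 15), "launch": (4, 5, 9), "ground": (2, 3, 5)}

n = 100_000
total = sum(rng.triangular(*params, n) for params in components.values())
print(f"mean total ${total.mean():.1f}B, "
      f"90th percentile ${np.percentile(total, 90):.1f}B")
```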

  17. Comparative assessment of analytical approaches to quantify the risk for introduction of rare animal diseases: the example of avian influenza in Spain.

    PubMed

    Sánchez-Vizcaíno, Fernando; Perez, Andrés; Martínez-López, Beatriz; Sánchez-Vizcaíno, José Manuel

    2012-08-01

    Trade of animals and animal products imposes an uncertain and variable risk for the introduction of exotic animal diseases into importing countries. Risk analysis provides importing countries with an objective, transparent, and internationally accepted method for assessing that risk. Over the last decades, European Union countries have conducted probabilistic risk assessments quite frequently to quantify the risk of rare animal disease introduction into their territories. Most probabilistic animal health risk assessments have typically been classified into one-level and multilevel binomial models. One-level models are simpler than multilevel models because they assume that animals or products originate from a single population. However, it is unknown whether such simplification may produce substantially different results from those obtained through the use of multilevel models. Here, data used in a probabilistic multilevel binomial model formulated to assess the risk of highly pathogenic avian influenza introduction into Spain were reanalyzed using a one-level binomial model and their outcomes were compared. An alternative ordinal model is also proposed here, which makes use of simpler assumptions and less information than those required by traditional one-level and multilevel approaches. Results suggest that, at least under certain circumstances, the results of the one-level and ordinal approaches are similar to those obtained using multilevel models. Consequently, we argue that, when data are insufficient to run traditional probabilistic models, the ordinal approach presented here may be a suitable alternative for ranking exporting countries in terms of the risk that they impose for the spread of rare animal diseases into disease-free countries. © 2012 Society for Risk Analysis.
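
    The one-level versus multilevel distinction can be made concrete with a toy import-risk calculation. The prevalences and shipment sizes below are assumptions, chosen so that the two models share the same overall animal-level prevalence:

```python
import numpy as np

rng = np.random.default_rng(5)

p_infected = 1e-4        # assumed animal-level prevalence (= 0.02 * 0.005)
n_animals = 5_000        # animals imported per year

# One-level model: all animals drawn from a single homogeneous population.
p_intro_one = 1 - (1 - p_infected) ** n_animals

# Multilevel model: animals come from herds; only diseased herds
# (hypothetical 2% herd-level prevalence) can contribute infected animals.
herd_prev, animals_per_herd, within = 0.02, 100, 0.005
n_herds = n_animals // animals_per_herd
diseased_herds = rng.binomial(n_herds, herd_prev, size=100_000)
p_intro_multi = np.mean(1 - (1 - within) ** (diseased_herds * animals_per_herd))

print(f"one-level: {p_intro_one:.4f}, multilevel: {p_intro_multi:.4f}")
```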

  18. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. A unified probabilistic approach to improve spelling in an event-related potential-based brain-computer interface.

    PubMed

    Kindermans, Pieter-Jan; Verschore, Hannes; Schrauwen, Benjamin

    2013-10-01

    In recent years, in an attempt to maximize performance, machine learning approaches for event-related potential (ERP) spelling have become more and more complex. In this paper, we have taken a step back, as we wanted to improve the performance without building an overly complex model that cannot be used by the community. Our research resulted in a unified probabilistic model for ERP spelling, which is based on only three assumptions and incorporates language information. On top of that, the probabilistic nature of our classifier yields a natural dynamic stopping strategy. Furthermore, our method uses the same parameters across 25 subjects from three different datasets. We show that our classifier, when enhanced with language models and dynamic stopping, improves the spelling speed and accuracy drastically. Additionally, we would like to point out that as our model is entirely probabilistic, it can easily be used as the foundation for complex systems in future work. All our experiments are executed on publicly available datasets to allow for future comparison with similar techniques.
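
    A probabilistic classifier with a language prior and dynamic stopping can be caricatured as sequential Bayesian updating that halts once the posterior is confident enough. The flash "scores" below are simulated stand-ins, not real ERP features, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(6)
symbols = list("ABCDEF")                        # toy 6-symbol speller

# Hypothetical language-model prior over the next symbol
prior = np.array([0.30, 0.25, 0.20, 0.10, 0.10, 0.05])
posterior = prior.copy()
true_idx, threshold = 2, 0.95                   # stop once max posterior > 0.95

for flash in range(1, 100):
    # Simulated classifier score: higher mean when the target is flashed
    scores = rng.normal(0.0, 1.0, len(symbols))
    scores[true_idx] += 1.0
    posterior *= np.exp(scores)                 # softmax-style evidence update
    posterior /= posterior.sum()
    if posterior.max() > threshold:             # dynamic stopping rule
        break

print(f"decided '{symbols[int(posterior.argmax())]}' after {flash} flashes")
```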

  20. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in the face of projected alterations of rainfall patterns and increases in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and the corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability, are obtained analytically as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.
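
    Given a probability density function of yield under a chosen irrigation strategy, the long-term risk index is simply the probability mass below the target yield. A sketch with an assumed lognormal yield distribution (all values hypothetical):

```python
from scipy import stats

# Hypothetical yield pdf under a given irrigation strategy (t/ha)
yield_dist = stats.lognorm(s=0.35, scale=6.0)
target = 5.0                                  # target yield, t/ha

# Long-term yield-reduction risk: P(yield < target) = CDF at the target
risk = yield_dist.cdf(target)
print(f"P(yield < {target} t/ha) = {risk:.2f}")
```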

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  2. Markov Chain Model with Catastrophe to Determine Mean Time to Default of Credit Risky Assets

    NASA Astrophysics Data System (ADS)

    Dharmaraja, Selvamuthu; Pasricha, Puneet; Tardelli, Paola

    2017-11-01

    This article deals with the problem of probabilistic prediction of the time to default for a firm. To model the credit risk, the dynamics of an asset is described as a function of a homogeneous discrete-time Markov chain subject to a catastrophe, the default. The behaviour of the Markov chain is investigated and the mean time to default is expressed in closed form. The methodology to estimate the parameters is given. Numerical results are provided to illustrate the applicability of the proposed model on real data, and their analysis is discussed.
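
    For a discrete-time chain with an absorbing default state, the mean time to default follows from the standard fundamental-matrix identity t = (I - Q)^{-1} 1, where Q collects the transitions among the transient states. A sketch with an invented rating chain (numbers are placeholders, not estimates from the article):

```python
import numpy as np

# Hypothetical credit-rating chain: transient states A, B, C plus an
# absorbing default state D. Q holds transitions among A, B, C only; each
# row's missing mass is the one-step default probability.
Q = np.array([[0.90, 0.08, 0.01],
              [0.05, 0.85, 0.08],
              [0.00, 0.10, 0.80]])

# Mean steps to absorption (default) from each transient state: t = (I - Q)^{-1} 1
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
for state, steps in zip("ABC", t):
    print(f"mean time to default from {state}: {steps:.1f} periods")
```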

  3. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  4. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  5. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  6. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). This framework considers the following areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  7. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  8. Approximate probabilistic cellular automata for the dynamics of single-species populations under discrete logisticlike growth with and without weak Allee effects.

    PubMed

    Mendonça, J Ricardo G; Gevorgyan, Yeva

    2017-05-01

    We investigate one-dimensional elementary probabilistic cellular automata (PCA) whose dynamics in first-order mean-field approximation yields discrete logisticlike growth models for a single-species unstructured population with nonoverlapping generations. Beginning with a general six-parameter model, we find constraints on the transition probabilities of the PCA that guarantee that the ensuing approximations make sense in terms of population dynamics and classify the valid combinations thereof. Several possible models display a negative cubic term that can be interpreted as a weak Allee factor. We also investigate the conditions under which a one-parameter PCA derived from the more general six-parameter model can generate valid population growth dynamics. Numerical simulations illustrate the behavior of some of the PCA found.
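
    The first-order mean-field approximation of such a PCA is a cubic map in the occupation density; iterating it exposes the logistic-like growth and the role of the cubic (weak Allee) term. The coefficients below are invented, chosen only so that densities stay in [0, 1]:

```python
import numpy as np

# Hypothetical mean-field map rho_{t+1} = a*rho + b*rho^2 + c*rho^3, whose
# coefficients would come from the PCA transition probabilities; c < 0 acts
# like a weak Allee correction to the quadratic (logistic) term.
a, b, c = 2.6, -1.9, -0.3

def mean_field(rho):
    return a * rho + b * rho ** 2 + c * rho ** 3

rho = 0.1
for _ in range(200):              # iterate toward the stationary density
    rho = float(np.clip(mean_field(rho), 0.0, 1.0))
print(f"stationary density ~ {rho:.4f}")   # fixed point near 0.75 for these values
```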

  9. Probabilistic analysis of mean-response along-wind induced vibrations on wind turbine towers using wireless network data sensors

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, Raymond A.

    2011-04-01

    Wind turbine systems are attracting considerable attention due to concerns regarding global energy consumption as well as sustainability. Advances in wind turbine technology promote the tendency to improve efficiency in the structures that support and produce this renewable power source, tending toward more slender and taller towers, larger gear boxes, and larger, lighter blades. The structural design optimization process must account for uncertainties and nonlinear effects (such as wind-induced vibrations, unmeasured disturbances, and material and geometric variabilities). In this study, a probabilistic monitoring approach is developed that measures the response of the turbine tower to stochastic loading and estimates peak demand and structural resistance (in terms of serviceability). The proposed monitoring system can provide a real-time estimate of the probability of exceedance of design serviceability conditions based on data collected in situ. Special attention is paid to wind and aerodynamic characteristics that are intrinsically present (although sometimes neglected in health monitoring analysis) and derived from observations or experiments. In particular, little attention has been devoted to buffeting, which is usually non-catastrophic but directly impacts the serviceability of the operating wind turbine. Modal-based analysis methods for the study of flutter instability and buffeting response have been successfully applied to the assessment of the susceptibility of high-rise slender structures, including wind turbine towers. A detailed finite element model has been developed to generate data (calibrated to published experimental and analytical results). Risk assessment is performed for the effects of along-wind forces in a framework of quantitative risk analysis. Both structural resistance and wind load demands were considered probabilistic, with the latter assessed by dynamic analyses.

  10. Seismic Risk Assessment for the Kyrgyz Republic

    NASA Astrophysics Data System (ADS)

    Pittore, Massimiliano; Sousa, Luis; Grant, Damian; Fleming, Kevin; Parolai, Stefano; Fourniadis, Yannis; Free, Matthew; Moldobekov, Bolot; Takeuchi, Ko

    2017-04-01

    The Kyrgyz Republic is one of the most socially and economically dynamic countries in Central Asia, and one of the most endangered by earthquake hazard in the region. In order to support the government of the Kyrgyz Republic in the development of a country-level Disaster Risk Reduction strategy, a comprehensive seismic risk study has been developed with the support of the World Bank. As part of this project, state-of-the-art hazard, exposure and vulnerability models have been developed and combined into the assessment of direct physical and economic risk on residential, educational and transportation infrastructure. The seismic hazard has been modelled with three different approaches, in order to provide a comprehensive overview of the possible consequences. A probabilistic seismic hazard assessment (PSHA) approach has been used to quantitatively evaluate the distribution of expected ground shaking intensity, as constrained by the compiled earthquake catalogue and associated seismic source model. A set of specific seismic scenarios based on events generated from known fault systems have been also considered, in order to provide insight on the expected consequences in case of strong events in proximity of densely inhabited areas. Furthermore, long-span catalogues of events have been generated stochastically and employed in the probabilistic analysis of expected losses over the territory of the Kyrgyz Republic. Damage and risk estimates have been computed by using an exposure model recently developed for the country, combined with the assignment of suitable fragility/vulnerability models. The risk estimation has been carried out with spatial aggregation at the district (rayon) level. The obtained results confirm the high level of seismic risk throughout the country, also pinpointing the location of several risk hotspots, particularly in the southern districts, in correspondence with the Ferghana valley. The outcome of this project will further support the local decision makers in implementing specific prevention and mitigation measures that are consistent with a broad risk reduction strategy.

  11. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

  12. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
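
    The interval propagation at the heart of the possibilistic treatment can be sketched on a toy fault tree: at each alpha-cut the epistemically uncertain basic-event probability becomes an interval, and (by monotonicity of the tree function) the top-event probability becomes an interval too. The tree structure and all numbers below are invented:

```python
# Toy fault tree: TOP = (A AND B) OR C. Probabilities for A and B are precise;
# the epistemically uncertain probability of C is a triangular possibility
# distribution (all values hypothetical).
p_a, p_b = 0.01, 0.02
c_low, c_mode, c_high = 0.001, 0.005, 0.02

def top(p_c):
    p_ab = p_a * p_b                     # AND gate (independence assumed)
    return p_ab + p_c - p_ab * p_c       # OR gate

# Propagate interval bounds at each alpha-cut; since `top` is monotone in
# p_c, interval endpoints map to interval endpoints.
for alpha in (0.0, 0.5, 1.0):
    lo = c_low + alpha * (c_mode - c_low)
    hi = c_high - alpha * (c_high - c_mode)
    print(f"alpha={alpha:.1f}: TOP probability in [{top(lo):.5f}, {top(hi):.5f}]")
```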

  13. Compiling probabilistic, bio-inspired circuits on a field programmable analog array

    PubMed Central

    Marr, Bo; Hasler, Jennifer

    2014-01-01

    A field programmable analog array (FPAA) is presented as an energy and computational efficiency engine: a mixed-mode processor for which functions can be compiled at significantly lower energy cost using probabilistic computing circuits. More specifically, it will be shown that the core computation of any dynamical system can be computed on the FPAA at significantly less energy per operation than a digital implementation. A stochastic system that is dynamically controllable via voltage-controlled amplifier and comparator thresholds is implemented, which computes Bernoulli random variables. From Bernoulli variables, it is shown that exponentially distributed random variables, and random variables of arbitrary distributions, can be computed. The Gillespie algorithm is simulated to show the utility of this system by calculating the trajectory of a biological system computed stochastically with this probabilistic hardware, where over a 127X performance improvement over current software approaches is shown. The relevance of this approach is extended to any dynamical system. The initial circuits and ideas for this work were generated at the 2008 Telluride Neuromorphic Workshop. PMID:24847199
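
    The Bernoulli-to-exponential construction mentioned here has a simple software analogue: the trial count to first success of a Bernoulli stream is geometric, and p·Geom(p) approaches Exponential(1) as p shrinks, which is exactly the waiting-time ingredient a Gillespie step consumes. A sketch with hypothetical propensities:

```python
import numpy as np

rng = np.random.default_rng(7)

# Exponential variates from a Bernoulli stream: the number of trials until
# the first success is Geometric(p), and p * Geom(p) -> Exponential(1) as
# p -> 0 (the same trick the analog hardware exploits).
def exponential_from_bernoulli(p=1e-3, size=10_000):
    return p * rng.geometric(p, size=size)

x = exponential_from_bernoulli()
print(f"sample mean ~ {x.mean():.3f} (Exponential(1) has mean 1)")

# One Gillespie step for reactions with propensities a_i: the waiting time
# is Exponential(sum a_i) and the reaction index is drawn proportionally.
a = np.array([0.5, 1.5, 3.0])                   # hypothetical propensities
tau = rng.exponential(1.0 / a.sum())
reaction = rng.choice(len(a), p=a / a.sum())
print(f"next reaction {reaction} fires after tau = {tau:.4f}")
```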

  14. Extended applications of track irregularity probabilistic model and vehicle-slab track coupled model on dynamics of railway systems

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Zhai, Wanming; Gao, Jianmin

    2017-11-01

    Track irregularities are inevitably in a process of stochastic evolution due to the uncertainty and continuity of wheel-rail interactions. To depict thoroughly the dynamic behaviour of the vehicle-track coupled system under random track irregularities, it is necessary to develop a track irregularity probabilistic model that simulates rail surface irregularities with ergodic properties in amplitude, wavelength and probability, and to build a three-dimensional vehicle-track coupled model that properly considers the wheel-rail nonlinear contact mechanisms. In the present study, the vehicle-track coupled model is first programmed by combining the finite element method with a wheel-rail coupling model. Then, in light of the capability of power spectral density (PSD) in characterising the amplitudes and wavelengths of stationary random signals, a track irregularity probabilistic model is presented to reveal and simulate the whole characteristics of the track irregularity PSD. Finally, extended applications in three areas, namely extreme analysis, reliability analysis and response relationships between dynamic indices, are conducted for the evaluation and application of the proposed models.
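
    A track irregularity realization with prescribed amplitude and wavelength content can be drawn from a PSD by the standard spectral-representation method (a sum of cosines with random phases). The PSD form and all parameters below are placeholders, not the probabilistic model of the paper:

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed one-sided track-irregularity PSD S(Omega), m^2/(rad/m)
def psd(omega):
    return 1e-6 / (omega ** 2 + 0.01)

omega = np.linspace(0.01, 3.0, 400)          # spatial frequencies (rad/m)
d_omega = omega[1] - omega[0]
phase = rng.uniform(0, 2 * np.pi, omega.size)
amp = np.sqrt(2 * psd(omega) * d_omega)      # cosine amplitudes per frequency bin

x = np.linspace(0, 500, 5000)                # track coordinate (m)
profile = (amp * np.cos(np.outer(x, omega) + phase)).sum(axis=1)
print(f"simulated irregularity std ~ {profile.std() * 1e3:.2f} mm")
```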

  15. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden leads to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This involves close collaboration between government agencies and the research and stakeholder communities. Three examples are cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  16. Towards a Proactive Risk Mitigation Strategy at La Fossa Volcano, Vulcano Island

    NASA Astrophysics Data System (ADS)

    Biass, S.; Gregg, C. E.; Frischknecht, C.; Falcone, J. L.; Lestuzzi, P.; di Traglia, F.; Rosi, M.; Bonadonna, C.

    2014-12-01

    A comprehensive risk assessment framework was built to develop proactive risk reduction measures for Vulcano Island, Italy. This framework includes identification of eruption scenarios, probabilistic hazard assessment, quantification of hazard impacts on the built environment, accessibility assessment on the island, and a risk perception study. Vulcano, a 21 km2 island with two primary communities home to 900 permanent residents and up to 10,000 visitors during summer, shows a strong dependency on the mainland for basic needs (water, energy) and relies on a ~2 month tourism season for its economy. The recent stratigraphy reveals a dominance of vulcanian and subplinian eruptions, producing a range of hazards acting at different time scales. We developed new methods to probabilistically quantify the hazard related to ballistics, lahars and tephra for all eruption styles. We also elaborated field- and GIS-based methods to assess the physical vulnerability of the built environment and created dynamic models of accessibility. Results outline the differences in hazard between short and long-lasting eruptions. A subplinian eruption has a 50% probability of impacting ~30% of the buildings within days after the eruption, but the year-long damage resulting from a long-lasting vulcanian eruption is similar if tephra is not removed from rooftops. Similarly, a subplinian eruption results in a volume of 7x105 m3 of material potentially remobilized into lahars soon after the eruption. Similar volumes are expected for vulcanian activity over years, increasing the hazard of small lahars. Preferential lahar paths affect critical infrastructures lacking redundancy, such as the road network, communications systems, the island's only gas station, and access to the island's two evacuation ports. Such results from hazard, physical and systemic vulnerability assessments help establish proactive volcanic risk mitigation strategies and may be applicable in other island settings.

  17. Don't Fear Optimality: Sampling for Probabilistic-Logic Sequence Models

    NASA Astrophysics Data System (ADS)

    Thon, Ingo

    One of the current challenges in artificial intelligence is modeling dynamic environments that change due to the actions or activities undertaken by people or agents. The task of inferring hidden states, e.g. the activities or intentions of people, based on observations is called filtering. Standard probabilistic models such as Dynamic Bayesian Networks are able to solve this task efficiently using approximate methods such as particle filters. However, these models do not support logical or relational representations. The key contribution of this paper is the upgrade of a particle filter algorithm for use with a probabilistic logical representation through the definition of a proposal distribution. The performance of the algorithm depends largely on how well this distribution fits the target distribution. We adopt the idea of logical compilation into Binary Decision Diagrams for sampling. This allows us to use the optimal proposal distribution, which is normally prohibitively slow.
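    The filtering loop described above can be illustrated with a minimal bootstrap particle filter for a small discrete-state model. This is a generic sketch, not the paper's probabilistic-logic representation or its BDD-based optimal proposal; the transition and observation matrices are made up.

      # Minimal bootstrap particle filter for a discrete hidden state (NumPy).
      # The optimal proposal discussed above would also condition particle
      # moves on the current observation; here we use the transition prior.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 1000                           # number of particles
      T = np.array([[0.8, 0.1, 0.1],     # T[i, j] = P(next state j | state i)
                    [0.2, 0.7, 0.1],
                    [0.1, 0.2, 0.7]])
      E = np.array([[0.90, 0.05, 0.05],  # E[i, k] = P(observation k | state i)
                    [0.10, 0.80, 0.10],
                    [0.05, 0.15, 0.80]])

      def filter_states(observations):
          particles = rng.integers(0, 3, size=N)
          for obs in observations:
              # propagate particles through the transition model (the proposal)
              cum = T[particles].cumsum(axis=1)
              particles = (rng.random(N)[:, None] < cum).argmax(axis=1)
              # weight by the observation likelihood, then resample
              w = E[particles, obs]
              particles = rng.choice(particles, size=N, p=w / w.sum())
          return np.bincount(particles, minlength=3) / N

      print(filter_states([0, 0, 1, 2, 2]))  # filtered state distribution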

  18. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
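    As a toy illustration of the fault-tree step described above, the sketch below combines hypothetical, independent basic-event probabilities through OR/AND gates to estimate a top-event probability; the component names and numbers are invented, not taken from the paper.

      # Toy fault-tree quantification with independent basic events.
      def p_or(*ps):   # gate output occurs if ANY input event occurs
          out = 1.0
          for p in ps:
              out *= 1.0 - p
          return 1.0 - out

      def p_and(*ps):  # gate output occurs only if ALL input events occur
          out = 1.0
          for p in ps:
              out *= p
          return out

      p_source = 0.05  # contaminant source present (hypothetical)
      p_filter = 0.02  # ventilation filtration fails (hypothetical)
      p_remed  = 0.10  # remediation ineffective (hypothetical)
      p_top = p_and(p_source, p_or(p_filter, p_remed))
      print(f"P(building contamination) = {p_top:.2e}")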

  19. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume that the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which reduces the computational complexity.

  20. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of ΣPAH8 in the surface waters of Taihu Lake, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) calculated using fuzzy set theory, and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method has its own advantages and limitations, being based on a different theory; the appropriate method should therefore be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.
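    The MCS/LHS comparison can be reproduced in miniature with SciPy: sample an exposure and a toxicity value, form the hazard quotient HQ = exposure/toxicity, and compare the resulting distributions. The lognormal parameters below are placeholders, not the Taihu Lake data.

      # Hazard-quotient uncertainty via Monte Carlo vs. Latin hypercube sampling.
      import numpy as np
      from scipy.stats import lognorm, qmc

      n = 10_000
      exposure = lognorm(s=1.0, scale=0.05)  # hypothetical concentration (ug/L)
      toxicity = lognorm(s=0.8, scale=0.50)  # hypothetical effect level (ug/L)

      rng = np.random.default_rng(1)
      hq_mc = exposure.rvs(n, random_state=rng) / toxicity.rvs(n, random_state=rng)

      # LHS: stratified uniforms pushed through the inverse CDFs
      u = qmc.LatinHypercube(d=2, seed=1).random(n)
      hq_lhs = exposure.ppf(u[:, 0]) / toxicity.ppf(u[:, 1])

      for name, hq in (("MCS", hq_mc), ("LHS", hq_lhs)):
          lo, hi = np.percentile(hq, [5, 95])
          print(f"{name}: mean={hq.mean():.2f} 90% CI=({lo:.5f}, {hi:.2f}) "
                f"P(HQ>1)={(hq > 1).mean():.2%}")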

  1. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that supports a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, which allows different probabilistic models of earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
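    Steps (1)-(4) reduce to a small calculation once classes, frequencies, and conditional exceedance probabilities are in hand. The sketch below uses invented values to show how the lifetime criterion in (3) enters; it is not the author's data.

      # Class-based seismic risk roll-up with hypothetical numbers.
      import math

      classes = [
          # (magnitude class, annual frequency, P(exceed design basis | event))
          ("M5.0-5.9", 1e-2, 0.001),
          ("M6.0-6.9", 1e-3, 0.02),
          ("M7.0+",    1e-4, 0.15),
      ]
      lifetime = 40  # residual lifetime of the plant, years

      lam = sum(f * p for _, f, p in classes)   # annual exceedance frequency
      p_life = 1.0 - math.exp(-lam * lifetime)  # Poisson occurrence model
      print(f"annual exceedance frequency = {lam:.2e}/yr")
      print(f"P(exceedance within {lifetime} yr) = {p_life:.2%}")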

  2. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potentially life-threatening outcome when engaging in risky behaviors is stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle-aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The role of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  3. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used to support decisions regarding safety upgrades for launch vehicles. Another area that was given considerable emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps in understanding failure mechanisms and improving component and system design. PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk-informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
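    The three questions map directly onto event-tree arithmetic: scenario frequencies are the initiating-event frequency multiplied down each branch. A toy quantification with invented numbers:

      # Toy event tree: initiating event filtered through two mitigating systems.
      init_freq = 1e-2        # initiating events per year (hypothetical)
      p_a_fail = 0.01         # system A branch probability (hypothetical)
      p_b_fail = 0.05         # system B branch probability, given A failed

      scenarios = {
          "A succeeds":       init_freq * (1 - p_a_fail),
          "A fails, B works": init_freq * p_a_fail * (1 - p_b_fail),
          "A and B fail":     init_freq * p_a_fail * p_b_fail,  # top event
      }
      for path, freq in scenarios.items():
          print(f"{path:18s} {freq:.2e}/yr")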

  4. Phase transitions in coupled map lattices and in associated probabilistic cellular automata.

    PubMed

    Just, Wolfram

    2006-10-01

    Analytical tools are applied to investigate piecewise linear coupled map lattices in terms of probabilistic cellular automata. The so-called disorder condition of probabilistic cellular automata is closely related to attracting sets in coupled map lattices. The importance of this condition for the suppression of phase transitions is illustrated by spatially one-dimensional systems. Invariant densities and temporal correlations are calculated explicitly. Ising-type phase transitions are found for one-dimensional coupled map lattices acting on repelling sets and for a spatially two-dimensional Miller-Huse-like system with stable long-time dynamics. Critical exponents are calculated within a finite-size scaling approach. The relevance of detailed balance of the resulting probabilistic cellular automaton for the critical behavior is pointed out.
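    A diffusively coupled lattice of piecewise linear (tent) maps of the kind studied here takes only a few lines of NumPy; the coupling strength and map slope below are arbitrary choices for experimentation, not the paper's parameter values.

      # 1D coupled map lattice: tent maps with nearest-neighbour diffusive coupling.
      import numpy as np

      def tent(x, a=1.8):
          return a * np.minimum(x, 1.0 - x)

      def step(x, eps=0.3):
          fx = tent(x)
          # periodic boundaries via np.roll
          return (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

      x = np.random.default_rng(2).random(256)
      for _ in range(1000):
          x = step(x)
      print(f"spatial mean = {x.mean():.4f}, spatial std = {x.std():.4f}")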

  5. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  6. Assessing dengue infection risk in the southern region of Taiwan: implications for control.

    PubMed

    Liao, C-M; Huang, T-L; Cheng, Y-H; Chen, W-Y; Hsieh, N-H; Chen, S-C; Chio, C-P

    2015-04-01

    Dengue, one of the most important mosquito-borne diseases, is a major international public health concern. This study aimed to assess potential dengue infection risk from Aedes aegypti in Kaohsiung and the implications for vector control. Here we investigated the impact of dengue transmission on human infection risk using a well-established dengue-mosquito-human transmission dynamics model. A basic reproduction number (R0)-based probabilistic risk model was also developed to estimate dengue infection risk. Our findings confirm that the biting rate plays a crucial role in shaping R0 estimates. We demonstrated a 50% risk probability that the increase in dengue incidence rate exceeds 0.5-0.8 wk⁻¹ for temperatures ranging from 26°C to 32°C. We further demonstrated that the weekly increase in dengue incidence rate can be reduced to zero if vector control efficiencies reach 30-80% at temperatures of 19-32°C. We conclude that our analysis of dengue infection risk and control implications in Kaohsiung provides crucial information for policy-making on disease control.
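    An R0-based probabilistic risk estimate of this flavour can be sketched by sampling uncertain transmission parameters and asking how often R0 exceeds 1 under different control efficiencies. The Ross-Macdonald-style expression and all distributions below are illustrative stand-ins for the paper's calibrated model.

      # P(R0 > 1) under vector control, with hypothetical parameter distributions.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      a  = rng.normal(0.30, 0.05, n)            # biting rate (1/day)
      b  = rng.uniform(0.3, 0.75, n)            # mosquito-to-human transmission
      c  = rng.uniform(0.3, 0.75, n)            # human-to-mosquito transmission
      m  = rng.lognormal(np.log(2.0), 0.3, n)   # mosquitoes per person
      mu = 1.0 / rng.uniform(8, 30, n)          # mosquito mortality (1/day)
      r  = 1.0 / 7.0                            # human recovery rate (1/day)

      def r0(control=0.0):  # control scales mosquito density down
          return a**2 * b * c * m * (1 - control) / (mu * r)

      for eff in (0.0, 0.3, 0.6):
          print(f"control={eff:.0%}: P(R0 > 1) = {(r0(eff) > 1).mean():.1%}")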

  7. Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads

    NASA Technical Reports Server (NTRS)

    Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)

    2002-01-01

    Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures according to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. Thus a shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used for solving both static and dynamic problems. In the first-order shear-deformable model used here we have employed a more accurate method to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in its nondimensional form, and the element tangent stiffness and mass matrices are obtained using analytical integration. The stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. In order to study the dynamic behavior with uncertainties included in the problem, three models were developed: exact Monte Carlo simulation, sensitivity-based Monte Carlo simulation, and probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analyses have also been used to study nonconservative problems, as well as the stability analysis, using the dynamic criterion.

  8. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during extravehicular activities (EVAs) on the lunar surface and in Earth-to-Lunar transit. Up to 15% of crew time may be spent on EVA, with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for NASA mission planning. We apply probabilistic risk assessment (PRA) for the radiation protection of crews and the optimization of lunar mission planning.

  9. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    This study presents a risk assessment methodology, and its application, for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic model for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely ultrasonic inspection with an identified flaw indication and ultrasonic inspection without a flaw indication, are considered in the derivation. To estimate fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application to a steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
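    The flaw-sizing idea, inferring the PDF of the true flaw size from an indication, can be sketched with Bayes' rule on a grid. The linear calibration model and exponential prior below are assumptions for illustration, not the paper's model-assisted POD results.

      # Posterior PDF of true flaw size a, given a measured indication a_hat,
      # under a_hat = b0 + b1*a + eps with Gaussian sizing error eps.
      import numpy as np
      from scipy import stats

      b0, b1, sigma = 0.1, 0.95, 0.4      # assumed calibration parameters
      prior = stats.expon(scale=1.0)      # assumed prior on flaw size (mm)
      a_hat = 2.0                         # measured indication size (mm)

      a = np.linspace(1e-3, 8.0, 2000)
      da = a[1] - a[0]
      post = stats.norm.pdf(a_hat, loc=b0 + b1 * a, scale=sigma) * prior.pdf(a)
      post /= post.sum() * da             # normalise to a proper PDF
      print(f"posterior mean flaw size = {(a * post).sum() * da:.2f} mm")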

  10. Design for Reliability and Safety Approach for the NASA New Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal, M.; Weldon, Danny M.

    2007-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program, called Constellation, intended to send crew and cargo to the International Space Station (ISS), to the moon, and beyond. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all phases of the program's life cycle. This paper discusses the "Design for Reliability and Safety" approach for the new NASA crew launch vehicle, ARES I. The ARES I is being developed by NASA Marshall Space Flight Center (MSFC) in support of the Constellation program. It consists of three major elements: a solid First Stage (FS), an Upper Stage (US), and a liquid Upper Stage Engine (USE). Stacked on top of the ARES I is the Crew Exploration Vehicle (CEV), which consists of a Launch Abort System (LAS), Crew Module (CM), Service Module (SM), and Spacecraft Adapter (SA). The CEV development is led by NASA Johnson Space Center (JSC). Designing for high reliability and safety requires a well-integrated working environment and a sound technical design approach. The "Design for Reliability and Safety" approach addressed in this paper covers both the environment and the technical process put in place to support the ARES I design. To address the integrated working environment, the ARES I project office has established a risk-based design group called the "Operability Design and Analysis" (OD&A) group, which brings the engineering, design, and safety organizations together to optimize the system design for safety, reliability, and cost. On the technical side, the ARES I project has, through the OD&A environment, implemented a probabilistic approach to analyze and evaluate design uncertainties and understand their impact on safety, reliability, and cost. This paper focuses on the various probabilistic approaches pursued by the ARES I project. Specifically, it discusses an integrated functional probabilistic analysis approach that addresses upfront some key areas to support the ARES I Design Analysis Cycle (DAC) pre-Preliminary Design (PD) phase. This functional approach is a probabilistic, physics-based approach that combines failure probabilities with system dynamics and engineering failure impact models to identify key system risk drivers and potential system design requirements. The paper also discusses other probabilistic risk assessment approaches planned by the ARES I project to support the PD phase and beyond.

  11. The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Prinn, R. G.

    2013-12-01

    The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu ). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and of physical, dynamical, and chemical processes in the atmosphere, land, ocean, and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of version guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones, created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that we need to evaluate policies based on their ability to lower risk, and to re-evaluate decisions over time as new knowledge is gained. Reference: R. G. Prinn, Development and Application of Earth System Models, Proceedings of the National Academy of Sciences, June 15, 2012, http://www.pnas.org/cgi/doi/10.1073/pnas.1107470109.

  12. Representing Causation

    ERIC Educational Resources Information Center

    Wolff, Phillip

    2007-01-01

    The dynamics model, which is based on L. Talmy's (1988) theory of force dynamics, characterizes causation as a pattern of forces and a position vector. In contrast to counterfactual and probabilistic models, the dynamics model naturally distinguishes between different cause-related concepts and explains the induction of causal relationships from…

  13. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  14. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
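    The core of the probabilistic calculus can be shown with a one-node example: a binary "impaired" hypothesis updated by likelihood ratios from independent lines of evidence. The prior and likelihoods below are invented for illustration; a full Bayesian network would also encode dependencies between the lines.

      # Odds-form Bayesian update over independent lines of evidence.
      prior = 0.5                 # P(impaired) before seeing evidence
      evidence = [                # (P(obs | impaired), P(obs | not impaired))
          (0.8, 0.3),             # biological survey
          (0.6, 0.4),             # sediment chemistry
          (0.7, 0.2),             # toxicity test
      ]
      odds = prior / (1 - prior)
      for p_h, p_not in evidence:
          odds *= p_h / p_not     # multiply in each likelihood ratio
      print(f"posterior P(impaired) = {odds / (1 + odds):.3f}")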

  15. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients; consideration of a PRA context; incorporation of a solid psychological basis for operator performance; and demonstration of a functional dynamic model of a plant upset condition and appropriate operator response. This report outlines these efforts and presents a case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  16. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.

  17. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for the evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils), and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules: (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  18. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. The management of such projects demands the application of special probabilistic methods, owing to the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments in projects. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures enables a quantitative assessment of the main risk indicators of projects when managing innovation-investment projects. Any competent specialist can calculate the damage from the onset of a risk event; assessing the probability of occurrence of a risk event, however, requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.

  19. Probabilistic Assessment of Cancer Risk for Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2009-01-01

    During future lunar missions, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon transit. NASA's new lunar program anticipates that up to 15% of crew time may be on EVA, with minimal radiation shielding. Given the operational challenge of responding to events of unknown size and duration, a probabilistic risk assessment approach is essential for mission planning and design. Using the historical database of proton measurements during the past 5 solar cycles, a typical hazard function for SPE occurrence was defined using a non-homogeneous Poisson model as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions ranging from the 5th to the 95th percentile of particle fluences for a specified mission period were simulated. Organ doses corresponding to particle fluences at the median and at the 95th percentile for a specified mission period were assessed using NASA's baryon transport model, BRYNTRN. The cancer fatality risk for astronauts was then analyzed as a function of age, gender, and solar cycle activity. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated. Future work will involve using this probabilistic risk assessment approach to SPE forecasting, combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
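    A non-homogeneous Poisson occurrence model of the kind described reduces to integrating a hazard function over the mission window. The intensity below is a made-up stand-in for the cycle-dependent hazard fitted from the five solar cycles of proton data.

      # P(at least one SPE) in a mission window under a cycle-varying intensity.
      import numpy as np

      CYCLE = 4000.0  # days in the modelled solar cycle

      def spe_rate(t):  # hypothetical intensity: peaks mid-cycle (events/day)
          return 0.002 + 0.01 * np.sin(np.pi * t / CYCLE) ** 2

      def p_at_least_one(t0, duration, n=2000):
          t = np.linspace(t0, t0 + duration, n)
          expected = np.mean(spe_rate(t)) * duration  # integral of the intensity
          return 1.0 - np.exp(-expected)              # Poisson: P(N >= 1)

      for start in (0.0, 1500.0):
          print(f"90-day mission from day {start:.0f}: "
                f"P(>=1 SPE) = {p_at_least_one(start, 90):.1%}")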

  20. Disease elimination and re-emergence in differential-equation models.

    PubMed

    Greenhalgh, Scott; Galvani, Alison P; Medlock, Jan

    2015-12-21

    Traditional differential equation models of disease transmission are often used to predict disease trajectories and evaluate the effectiveness of alternative intervention strategies. However, such models cannot account explicitly for probabilistic events, such as those that dominate dynamics when disease prevalence is low during the elimination and re-emergence phases of an outbreak. To account for the dynamics at low prevalence, i.e. the elimination and risk of disease re-emergence, without the added analytical and computational complexity of a stochastic model, we develop a novel application of control theory. We apply our approach to analyze historical data on measles elimination and re-emergence in Iceland from 1923 to 1938, predicting the temporal trajectory of local measles elimination and re-emergence as a result of disease migration from Copenhagen, Denmark. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  2. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
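    When all colonies are detected, the single-site model is a two-state Markov chain with local extinction probability e and colonization probability c, whose equilibrium occupancy is c/(c + e). A simulation sketch with invented rates (detection error, the paper's harder case, is ignored here):

      # Two-state Markov colony-site dynamics with hypothetical e and c.
      import numpy as np

      e, c = 0.20, 0.35
      rng = np.random.default_rng(4)
      occupied = rng.random(200) < 0.5          # 200 sites, random start
      for _ in range(50):                       # 50 breeding seasons
          u = rng.random(occupied.size)
          # occupied sites persist with prob 1-e; empty sites fill with prob c
          occupied = np.where(occupied, u >= e, u < c)
      print(f"simulated occupancy = {occupied.mean():.3f}, "
            f"analytic c/(c+e) = {c / (c + e):.3f}")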

  3. An equation-free probabilistic steady-state approximation: dynamic application to the stochastic simulation of biochemical reaction networks.

    PubMed

    Salis, Howard; Kaznessis, Yiannis N

    2005-12-01

    Stochastic chemical kinetics more accurately describes the dynamics of "small" chemical systems, such as biological cells. Many real systems contain dynamical stiffness, which causes the exact stochastic simulation algorithm or other kinetic Monte Carlo methods to spend the majority of their time executing frequently occurring reaction events. Previous methods have successfully applied a type of probabilistic steady-state approximation by deriving an evolution equation, such as the chemical master equation, for the relaxed fast dynamics and using the solution of that equation to determine the slow dynamics. However, because the solution of the chemical master equation is limited to small, carefully selected, or linear reaction networks, an alternate equation-free method would be highly useful. We present a probabilistic steady-state approximation that separates the time scales of an arbitrary reaction network, detects the convergence of a marginal distribution to a quasi-steady-state, directly samples the underlying distribution, and uses those samples to accurately predict the state of the system, including the effects of the slow dynamics, at future times. The numerical method produces an accurate solution of both the fast and slow reaction dynamics while, for stiff systems, reducing the computational time by orders of magnitude. The developed theory makes no approximations on the shape or form of the underlying steady-state distribution and only assumes that it is ergodic. We demonstrate the accuracy and efficiency of the method using multiple interesting examples, including a highly nonlinear protein-protein interaction network. The developed theory may be applied to any type of kinetic Monte Carlo simulation to more efficiently simulate dynamically stiff systems, including existing exact, approximate, or hybrid stochastic simulation techniques.
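    For context, the exact algorithm whose cost motivates the approximation is a short loop. Below is a minimal Gillespie direct-method simulation of a birth-death process (steady-state mean k_birth/k_death); the rate constants are arbitrary.

      # Exact stochastic simulation (Gillespie direct method), birth-death process.
      import numpy as np

      rng = np.random.default_rng(5)

      def gillespie(x0=10, k_birth=10.0, k_death=1.0, t_end=10.0):
          t, x = 0.0, x0
          while t < t_end:
              rates = (k_birth, k_death * x)     # reaction propensities
              total = rates[0] + rates[1]
              if total == 0.0:
                  break
              t += rng.exponential(1.0 / total)  # time to the next reaction
              x += 1 if rng.random() < rates[0] / total else -1
          return x

      samples = [gillespie() for _ in range(500)]
      print(f"mean copy number = {np.mean(samples):.1f} (theory: 10.0)")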

  4. Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.P.; Stover, R.L.; Hashimoto, P.S.

    1989-01-01

    Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and an earthquake experience database evaluation method, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed. 5 figs.

  5. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure than discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336

  6. Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS.

    PubMed

    Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

    2009-12-20

    This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole-effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from five waste water treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements, and precise distance measurements using Geographical Information Systems (GIS). The model was implemented in a spreadsheet, and the results were interpreted and represented using GIS to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk and revealed that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approaches reveal the presence of certain substances which might be causing sublethal effects in the aquatic species present in the Henares River.

  7. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background: When chemical health hazards have been identified, probabilistic dose–response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives: We developed a unified framework for probabilistic dose–response assessment. Methods: We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results: Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions: Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation: Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  8. Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.

    PubMed

    Herzallah, Randa

    2015-03-01

    Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper a generalised probabilistic controller design is presented for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed-loop control system and an ideal joint pdf, emphasising how the uncertainty can be systematically incorporated in the absence of reliable system models. To achieve this objective, all probabilistic models of the system are estimated from process data using mixture density networks (MDNs), where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations for the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm, and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
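    The design criterion is easiest to see in the scalar Gaussian special case, where the Kullback-Leibler divergence between the closed-loop pdf and the ideal pdf has a closed form. The toy gain-to-pdf mapping below is invented purely to show the objective decreasing; it is not the paper's MDN-based controller.

      # KL( N(mu0, s0^2) || N(mu1, s1^2) ) as a probabilistic control objective.
      import numpy as np

      def kl_gauss(mu0, s0, mu1, s1):
          return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

      mu_ideal, s_ideal = 0.0, 0.2     # ideal pdf concentrated on the setpoint
      for gain in (0.2, 0.5, 0.9):
          # toy closed loop: higher gain pulls the state pdf toward the ideal
          mu_cl, s_cl = (1 - gain) * 1.0, 0.2 + (1 - gain) * 0.5
          print(f"gain={gain}: KL = {kl_gauss(mu_cl, s_cl, mu_ideal, s_ideal):.3f}")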

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  10. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  11. Probabilistic assessment of dynamic system performance. Part 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belhadj, Mohamed

    1993-01-01

    Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or for optimization of system performance. Global analysis of dynamic systems, through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters, is also important for nuclear technology in order to understand the fault tolerance as well as the safety margins of the system under consideration and to ensure the safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure analysis and global analysis of dynamic systems necessitate different methodologies, which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and can be used for a comprehensive safety and reliability analysis of dynamic systems.

  12. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation, yet such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
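    The bagging approach is straightforward to prototype with scikit-learn: each tree in the ensemble yields one loss prediction, and the spread across trees approximates the probability distribution of loss. The synthetic damage data below stand in for the real micro-scale training data.

      # Probabilistic loss estimation with bagged regression trees (synthetic data).
      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(6)
      n = 2000
      X = np.column_stack([
          rng.uniform(0, 3, n),     # water depth (m)
          rng.uniform(0, 72, n),    # inundation duration (h)
          rng.integers(0, 2, n),    # private precaution indicator
      ])
      y = np.clip(0.2 * X[:, 0] + 0.002 * X[:, 1] - 0.1 * X[:, 2]
                  + rng.normal(0, 0.05, n), 0, 1)  # relative loss in [0, 1]

      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                               random_state=0).fit(X, y)
      x_new = np.array([[1.5, 24.0, 0]])
      preds = np.array([t.predict(x_new)[0] for t in model.estimators_])
      print(f"loss: median={np.median(preds):.3f}, "
            f"90% band=({np.percentile(preds, 5):.3f}, "
            f"{np.percentile(preds, 95):.3f})")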

  13. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and their calculation cost is high, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the object variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a lognormal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation depth, and vulnerability. Based on these distributions, we conducted Monte Carlo simulations of 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting multiple additional tsunami numerical simulations.
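    With the response-surface coefficients reported above, the Monte Carlo step is inexpensive. The fault-parameter ranges and the lognormal fragility curve below are placeholders, so the resulting damage probability will not reproduce the 22.5% figure; the sketch only shows the mechanics.

      # Monte Carlo tsunami risk using the fitted surface y = a*x1 + b*x2 + c.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      a, b, c = 0.2615, 3.1763, -1.1802        # coefficients from the abstract

      n = 1_000_000
      x1 = rng.uniform(5, 25, n)               # fault depth, hypothetical range
      x2 = rng.lognormal(np.log(1.0), 0.6, n)  # fault slip (m), hypothetical
      depth = a * x1 + b * x2 + c              # tsunami inundation depth (m)

      median, beta = 2.0, 0.5                  # hypothetical wood-building fragility
      wet = depth > 0
      p_damage = np.zeros(n)
      p_damage[wet] = norm.cdf(np.log(depth[wet] / median) / beta)
      print(f"damage probability given an earthquake = {p_damage.mean():.1%}")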

  14. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid, Factorials, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities undertaken in order to (1) start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and (2) develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity was begun this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework; this document briefly summarizes that work. The second and most important subject of this report is the development of a Dynamic Event Tree sampler named "Hybrid Dynamic Event Tree" (HDET) and its adaptive variant, the "Adaptive Hybrid Dynamic Event Tree" (AHDET). As other authors have already reported, it is possible to discern two principal types of uncertainty: aleatory and epistemic. The classical Dynamic Event Tree treats the first class (aleatory); the dependence of the probabilistic risk assessment on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo step pre-samples the input space characterized by epistemic uncertainties; the consequent Dynamic Event Tree then explores the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method but allows all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling to explore the epistemic space without any limitation; from this pre-sampling, the Dynamic Event Tree sampler starts its exploration of the aleatory space. As previously reported by the authors, the Dynamic Event Tree is a good fit for goal-oriented sampling strategies, and the DET can be used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only; this report documents how that approach has been extended to consider the epistemic space as well, interacting with the Hybrid Dynamic Event Tree methodology.
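
    The hybrid idea can be sketched in a few lines: an outer (here Latin hypercube) pre-sampling of an epistemic parameter, with a small event tree enumerated over the aleatory branchings for each epistemic sample. The toy system response, branch probabilities and failure threshold below are assumptions standing in for a real RELAP-7 calculation.

        import numpy as np

        rng = np.random.default_rng(1)

        # Outer loop: 1-D Latin hypercube pre-sampling of an epistemic
        # parameter (an uncertain heat-transfer coefficient, assumed).
        n_epistemic = 8
        strata = (np.arange(n_epistemic) + rng.random(n_epistemic)) / n_epistemic
        htc = 100.0 + 50.0 * strata  # one sample per stratum

        # Inner loop: a small event tree over two aleatory branchings
        # (component A works/fails, component B works/fails).
        p_a, p_b = 0.95, 0.90  # success probabilities (assumed)
        branches = {(1, 1): p_a * p_b, (1, 0): p_a * (1 - p_b),
                    (0, 1): (1 - p_a) * p_b, (0, 0): (1 - p_a) * (1 - p_b)}

        def peak_temperature(h, a_ok, b_ok):
            # Toy system response standing in for a RELAP-7 run.
            return 400.0 + 2000.0 / h + (0.0 if a_ok else 150.0) + (0.0 if b_ok else 230.0)

        # Each epistemic sample spawns a full event tree: a hybrid DET.
        for h in htc:
            p_fail = sum(p for (a_ok, b_ok), p in branches.items()
                         if peak_temperature(h, a_ok, b_ok) > 645.0)
            print(f"h = {h:6.1f} -> P(peak T > 645) = {p_fail:.4f}")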

  15. PREDICT: Privacy and Security Enhancing Dynamic Information Monitoring

    DTIC Science & Technology

    2015-08-03

    ...consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided local... [12]. These methods achieve high sensing coverage with low cost using cloaked locations [3]. In follow-on work, the issue of mobility is addressed.

  16. Probabilistic Perception, Empathy, and Dynamic Homeostasis: Insights in Autism Spectrum Disorders and Conduct Disorders

    PubMed Central

    Guilé, Jean Marc

    2013-01-01

    Homeostasis is not a permanent and stable state but instead results from conflicting forces. Therefore, infants have to engage in dynamic exchanges with their environment in the biological, cognitive, and affective domains. Empathy is an adaptive response to these environmental challenges, which contributes to reaching proper dynamic homeostasis and development. Empathy relies on implicit interactive processes, namely probabilistic perception and synchrony, which are reviewed in this article. Whereas typically developing neonates are fully equipped to automatically and synchronously interact with their human environment, conduct disorders (CD) and autism spectrum disorders (ASD) present with impairments in empathetic communication, e.g., in emotional arousal and facial emotion processing. In addition, sensorimotor resonance is lacking in ASD, and emotional concern and semantic empathy are impaired in CD with callous-unemotional traits. PMID:24479115

  17. Impact of Hydrologic Variability on Ecosystem Dynamics and the Sustainable Use of Soil and Water Resources

    NASA Astrophysics Data System (ADS)

    Porporato, A. M.

    2013-05-01

    We discuss the key processes by which hydrologic variability affects the probabilistic structure of soil moisture dynamics in water-controlled ecosystems. These processes in turn impact biogeochemical cycling and ecosystem structure through plant productivity and biodiversity, as well as through nitrogen availability and soil conditions. Once the long-term probabilistic structure of these processes is quantified, the results become useful for understanding the impact of climatic changes and human activities on ecosystem services, and can be used to find optimal strategies for water and soil resources management under unpredictable hydro-climatic fluctuations. Particular applications concern soil salinization, phytoremediation and optimal stochastic irrigation.

  18. Application of dynamic uncertain causality graph in spacecraft fault diagnosis: Logic cycle

    NASA Astrophysics Data System (ADS)

    Yao, Quanying; Zhang, Qin; Liu, Peng; Yang, Ping; Zhu, Ma; Wang, Xiaochen

    2017-04-01

    Intelligent diagnosis systems are applied to fault diagnosis in spacecraft. The Dynamic Uncertain Causality Graph (DUCG) is a new probabilistic graphical model with many advantages. In the knowledge representation of spacecraft fault diagnosis, feedback among variables is frequently encountered, which gives rise to directed cyclic graphs (DCGs). Probabilistic graphical models (PGMs) such as Bayesian networks (BNs) have been widely applied to uncertain causality representation and probabilistic reasoning, but BNs do not allow DCGs. In this paper, DUCG is applied to fault diagnosis in spacecraft, introducing an inference algorithm for the DUCG that deals with feedback. To date, DUCG has been tested on 16 typical faults with 100% diagnostic accuracy.

  19. A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, the typical risk management process, the origins of the risk matrix, performance measures, performance objectives, and Bayes' theorem.
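
    As a reminder of how Bayes' theorem enters risk-informed decision making, the toy calculation below (all numbers assumed) updates the probability that a component is defective after a positive inspection result:

        # Updating a failure belief with Bayes' theorem (all numbers assumed):
        # the probability that a component is defective given a positive
        # inspection result.
        p_defect = 0.01            # prior probability of a defect
        p_pos_given_defect = 0.95  # inspection sensitivity
        p_pos_given_ok = 0.05      # false-positive rate

        p_pos = p_pos_given_defect * p_defect + p_pos_given_ok * (1 - p_defect)
        posterior = p_pos_given_defect * p_defect / p_pos
        print(f"P(defect | positive result) = {posterior:.3f}")  # about 0.161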

  20. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and on how to optimize reliability through life-cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life-cycle cost optimization. The main focus is on mechanical failure modes, owing to the well-developed methods for predicting these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
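
    A minimal sketch of the recommended Monte Carlo load-resistance step, with assumed load and resistance distributions:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 1_000_000

        # Assumed distributions: load L and resistance R (e.g. stress vs.
        # strength in MPa); failure occurs when L > R.
        load = rng.normal(300.0, 40.0, n)
        resistance = rng.lognormal(np.log(450.0), 0.08, n)

        p_fail = np.mean(load > resistance)
        # Standard error of the Monte Carlo estimate.
        se = np.sqrt(p_fail * (1 - p_fail) / n)
        print(f"P(failure) = {p_fail:.2e} +/- {se:.1e}")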

  1. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  2. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments.

    PubMed

    Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A

    2017-11-06

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks faced by those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database, along with data from the Exposure Factors Handbook and other sources, to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. A substantial fraction of workers was predicted to exceed the level of concern for each of the three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
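
    The structure of such a simulation is easy to sketch; the distributions and the no-observable-effect level below are invented placeholders, not the values used in the study.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Invented, illustrative distributions: unit exposure [mg per lb of
        # active ingredient handled], amount handled [lb/day], body weight [kg].
        unit_exposure = rng.lognormal(np.log(0.02), 0.9, n)
        amount_handled = rng.uniform(40.0, 400.0, n)
        body_weight = rng.normal(80.0, 12.0, n)

        dose = unit_exposure * amount_handled / body_weight  # mg/kg/day
        noel = 0.1  # hypothetical no-observable-effect level [mg/kg/day]
        print(f"fraction of workers above the NOEL: {np.mean(dose > noel):.1%}")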

  3. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.

    2016-12-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework for Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  4. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts have produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As a vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased without enlarging the delineated area, simply by swapping delineated low-risk areas for previously non-delineated high-risk areas. We also show that further improvements are often available at only a small additional delineated area. Depending on the context, increases or reductions in delineated area translate directly into costs and benefits if the land is priced, or if land owners need to be compensated for land-use restrictions.

  5. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.

    2017-12-01

    The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework for Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  6. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn

    2017-04-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework for Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  7. On the probabilistic structure of water age: Probabilistic Water Age

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porporato, Amilcare; Calabrese, Salvatore

    We report that the age distribution of water in hydrologic systems has received renewed interest recently, especially in relation to watershed response to rainfall inputs. The purpose of this contribution is first to draw attention to existing theories of age distributions in population dynamics, fluid mechanics and stochastic groundwater, and in particular to the McKendrick-von Foerster equation and its generalizations and solutions. A second and more important goal is to clarify that, when hydrologic fluxes are modeled by means of time-varying stochastic processes, the age distributions must themselves be treated as random functions. Once their probabilistic structure is obtained, it can be used to characterize the variability of age distributions in real systems and thus help quantify the inherent uncertainty in the field determination of water age. Finally, we illustrate these concepts with reference to a stochastic storage model, which has been used as a minimalist model of soil moisture and streamflow dynamics.

  8. On the probabilistic structure of water age: Probabilistic Water Age

    DOE PAGES

    Porporato, Amilcare; Calabrese, Salvatore

    2015-04-23

    We report that the age distribution of water in hydrologic systems has received renewed interest recently, especially in relation to watershed response to rainfall inputs. The purpose of this contribution is first to draw attention to existing theories of age distributions in population dynamics, fluid mechanics and stochastic groundwater, and in particular to the McKendrick-von Foerster equation and its generalizations and solutions. A second and more important goal is to clarify that, when hydrologic fluxes are modeled by means of time-varying stochastic processes, the age distributions must themselves be treated as random functions. Once their probabilistic structure is obtained, it can be used to characterize the variability of age distributions in real systems and thus help quantify the inherent uncertainty in the field determination of water age. Finally, we illustrate these concepts with reference to a stochastic storage model, which has been used as a minimalist model of soil moisture and streamflow dynamics.
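
    The stochastic storage idea can be illustrated with a minimal well-mixed reservoir driven by Poisson rainfall pulses (all parameters assumed). For a well-mixed linear reservoir the steady-state mass-weighted age distribution is exponential with mean 1/k, but each simulated realization fluctuates around it, which is precisely the sense in which the age distribution is itself a random function.

        import numpy as np

        rng = np.random.default_rng(11)

        # Minimalist stochastic storage model: Poisson rainfall pulses,
        # exponential (linear-reservoir) losses, well-mixed storage.
        dt, n_steps = 0.1, 20_000         # time step and horizon [days]
        rain_rate, mean_pulse = 0.3, 5.0  # pulses/day, mm per pulse
        k = 0.05                          # loss coefficient [1/day]

        ages = np.array([0.0])   # age of each water parcel in storage [days]
        mass = np.array([10.0])  # mass of each parcel [mm]

        for _ in range(n_steps):
            ages += dt
            mass = mass * np.exp(-k * dt)      # well-mixed proportional losses
            if rng.random() < rain_rate * dt:  # new rainfall parcel of age 0
                ages = np.append(ages, 0.0)
                mass = np.append(mass, rng.exponential(mean_pulse))
            keep = mass > 1e-6                 # drop nearly empty parcels
            ages, mass = ages[keep], mass[keep]

        mean_age = np.average(ages, weights=mass)
        print(f"storage-weighted mean water age: {mean_age:.1f} days (theory: {1/k:.0f})")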

  9. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot take full advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods, for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. The uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables, according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain, and the propagation of the hybrid uncertainties is then carried out in the same domain. This methodology makes direct use of the original site information as far as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  10. Probabilistic combination of static and dynamic gait features for verification

    NASA Astrophysics Data System (ADS)

    Bazin, Alex I.; Nixon, Mark S.

    2005-03-01

    This paper describes a novel probabilistic framework for biometric identification and data fusion. Based on intra- and inter-class variation extracted from training data, posterior probabilities describing the similarity between two feature vectors may be calculated directly from the data using the logistic function and Bayes' rule. Using a large publicly available database, we show that the two imbalanced gait modalities may be fused using this framework. All fusion methods tested provide an improvement over the best single modality, with the weighted-sum rule giving the best performance, showing that highly imbalanced classifiers may be fused in a probabilistic setting, improving not only the performance but also the generalized application capability.
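
    A minimal sketch of the score-level fusion described above; the logistic calibration coefficients and the fusion weight are assumptions, not the paper's fitted values.

        import numpy as np

        def logistic(x):
            return 1.0 / (1.0 + np.exp(-x))

        # Hypothetical match scores from two gait modalities (static body
        # shape vs. dynamic joint motion), mapped to posterior probabilities
        # with per-modality logistic calibrations that would be fitted on
        # training data (coefficients here are assumed).
        def posterior_static(score):
            return logistic(3.0 * score - 1.5)

        def posterior_dynamic(score):
            return logistic(5.0 * score - 2.0)

        # Weighted-sum fusion; the weight would be chosen on validation data.
        def fuse(s_static, s_dynamic, w=0.3):
            return w * posterior_static(s_static) + (1 - w) * posterior_dynamic(s_dynamic)

        print(f"fused P(same subject) = {fuse(0.6, 0.7):.3f}")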

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because modeling the full RPS response deterministically over all dynamic variables is too complex, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for estimating the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.

  12. Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.

    PubMed

    Wilhelm, C J; Mitchell, S H

    2008-10-01

    Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.

  13. Evaluating the Impact of Contaminant Dilution and Biodegradation in Uncertainty Quantification of Human Health Risk

    NASA Astrophysics Data System (ADS)

    Zarlenga, Antonio; de Barros, Felipe; Fiori, Aldo

    2016-04-01

    We present a probabilistic framework for assessing human health risk due to groundwater contamination. Our goal is to quantify how physical hydrogeological and biochemical parameters control the magnitude and uncertainty of human health risk. Our methodology captures the whole risk chain, from aquifer contamination to tap water consumption by the human population. The contaminant concentration, the key parameter for risk estimation, is governed by the interplay between large-scale advection, caused by heterogeneity, and degradation processes strictly related to local-scale dispersion. The core of the hazard identification, and of the methodology, is the reactive transport model: the erratic displacement of the contaminant in groundwater, due to the spatial variability of hydraulic conductivity (K), is characterized by a first-order Lagrangian stochastic model, and different dynamics are considered as possible modes of biodegradation under aerobic and anaerobic conditions. With the goal of quantifying uncertainty, a Beta distribution is assumed as the model for the concentration probability density function (pdf), while different levels of approximation are explored for estimating the one-point concentration moments. The information pertaining to flow and transport is combined with a dose-response assessment, which generally involves estimating physiological parameters of the exposed population. The human health response depends on the exposed individual's metabolism (i.e. inter-individual variability) and is subject to uncertainty; the health parameters are therefore intrinsically stochastic. As a consequence, we obtain an integrated, global probabilistic human health risk framework that allows the propagation of uncertainty from multiple sources. The final result, the health risk pdf, is expressed as a function of a few relevant, physically based parameters, such as the size of the injection area, the Péclet number, the K structure metrics and covariance shape, the reaction parameters pertaining to aerobic and anaerobic degradation, and the dose-response parameters. Even though the final result assumes a relatively simple form, a few numerical quadratures are required to evaluate the trajectory moments of the solute plume. To perform a sensitivity analysis, we apply the methodology to a hypothetical case study: an aquifer that constitutes a water supply for a population, in which a continuous NAPL source feeds a steady plume. The risk analysis is limited to carcinogenic compounds, for which the well-known linear dose-response relation for human risk is assumed. The analysis yields a few interesting findings: the risk distribution depends strongly on the pore-scale dynamics that drive dilution and mixing, and biodegradation may lead to a significant reduction of the risk.
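
    The uncertainty propagation at the end of the chain can be sketched with a few lines of Monte Carlo; the Beta concentration pdf, the exposure parameters and the cancer slope factor below are placeholders, whereas in the paper the concentration moments come from the transport model.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200_000

        # Assumed concentration pdf at the well: Beta-distributed, scaled to
        # a maximum concentration c_max [mg/L].
        c_max = 2.0
        conc = c_max * rng.beta(2.0, 8.0, n)

        # Uncertain exposure and dose-response parameters (all assumed):
        intake = rng.normal(2.0, 0.3, n)             # tap water intake [L/day]
        bw = rng.normal(70.0, 10.0, n)               # body weight [kg]
        slope = rng.lognormal(np.log(0.02), 0.5, n)  # cancer slope factor

        # Linear (carcinogenic) risk model: risk = slope * chronic daily dose.
        risk = slope * conc * intake / bw

        print(f"median risk {np.median(risk):.2e}, "
              f"95th percentile {np.quantile(risk, 0.95):.2e}, "
              f"P(risk > 1e-4) = {np.mean(risk > 1e-4):.3f}")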

  14. A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites

    NASA Astrophysics Data System (ADS)

    Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.

    2010-12-01

    Excavation sites, such as those associated with the construction of subway lines, railways and highway tunnels, are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practice and employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from its reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain: it is used to estimate and minimize risks for a potential construction site of an underground station on the new subway line in the Barcelona metropolitan area.
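
    For readers unfamiliar with fault trees, the toy computation below shows how basic-event probabilities combine through AND/OR gates. The tree structure and numbers are invented; in the proposed framework the basic-event probabilities would come from stochastic differential equations rather than fixed values.

        # A minimal fault-tree sketch (structure and numbers assumed): the
        # top event "flooding of the excavation pit" occurs if the pumping
        # system fails OR (a high water inflow occurs AND the barrier leaks).
        p_pump_fail = 0.02
        p_high_inflow = 0.10
        p_barrier_leak = 0.05

        # AND gate: independent events multiply.
        p_and = p_high_inflow * p_barrier_leak
        # OR gate for independent events: 1 minus the product of survival
        # probabilities.
        p_top = 1.0 - (1.0 - p_pump_fail) * (1.0 - p_and)
        print(f"P(top event) = {p_top:.4f}")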

  15. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    NASA Astrophysics Data System (ADS)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
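
    The essence of the algorithm fits in a short script: a bootstrap particle filter provides an unbiased estimate of the marginal likelihood, which drives a random-walk Metropolis-Hastings chain over the parameter. The sketch below uses a toy linear-Gaussian state-space model and a flat prior; everything about it is illustrative rather than the tutorial's own example.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy state-space model (assumed, for illustration):
        #   x_t = a * x_{t-1} + v_t,  v_t ~ N(0, 0.5^2)
        #   y_t = x_t + e_t,          e_t ~ N(0, 1)
        T, a_true = 100, 0.8
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = a_true * x[t - 1] + 0.5 * rng.normal()
        y = x + rng.normal(size=T)

        def log_marginal_likelihood(a, n_particles=200):
            """Bootstrap particle filter estimate of log p(y | a)."""
            particles = np.zeros(n_particles)
            ll = 0.0
            for t in range(T):
                particles = a * particles + 0.5 * rng.normal(size=n_particles)
                logw = -0.5 * (y[t] - particles) ** 2  # Gaussian, up to a constant
                m = logw.max()
                w = np.exp(logw - m)
                ll += m + np.log(w.mean())
                idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
                particles = particles[idx]  # multinomial resampling
            return ll

        # Particle Metropolis-Hastings over the parameter a: the noisy
        # likelihood estimate is reused until a proposal is accepted.
        a_cur, chain = 0.5, []
        ll_cur = log_marginal_likelihood(a_cur)
        for _ in range(500):
            a_prop = a_cur + 0.05 * rng.normal()  # random-walk proposal, flat prior
            ll_prop = log_marginal_likelihood(a_prop)
            if np.log(rng.random()) < ll_prop - ll_cur:
                a_cur, ll_cur = a_prop, ll_prop
            chain.append(a_cur)

        print(f"posterior mean of a = {np.mean(chain[100:]):.3f} (true value {a_true})")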

  16. Risk assessment for furan contamination through the food chain in Belgian children.

    PubMed

    Scholl, Georges; Huybrechts, Inge; Humblet, Marie-France; Scippo, Marie-Louise; De Pauw, Edwin; Eppe, Gauthier; Saegerman, Claude

    2012-08-01

    Young, old, pregnant and immuno-compromised persons are of great concern to risk assessors, as they represent the sub-populations most at risk. The present paper focuses on risk assessment linked to furan exposure in children. Only the Belgian population was considered, because the individual contamination and consumption data required for accurate risk assessment were available only for Belgian children. Two risk assessment approaches, deterministic and probabilistic, were applied and compared for the estimation of daily intake. A significant difference in the average Estimated Daily Intake (EDI) was found between the deterministic (419 ng kg⁻¹ body weight (bw) day⁻¹) and the probabilistic (583 ng kg⁻¹ bw day⁻¹) approaches, which results from the mathematical treatment of the null consumption and contamination data. The risk was characterised in two ways: (1) the classical approach, comparing the EDI to a reference dose (RfD(chronic-oral)), and (2) the more recent Margin of Exposure (MoE) approach. Both reached similar conclusions: the risk level is not of major concern, but neither is it negligible. In the first approach, only 2.7% or 6.6% (in the deterministic and probabilistic modes, respectively) of the studied population presented an EDI above the RfD(chronic-oral). In the second approach, the percentage of children displaying a MoE above 10,000 and below 100 is 3-0% and 20-0.01% in the deterministic and probabilistic modes, respectively. In addition, children were compared to adults, and significant differences between the contamination patterns were highlighted. While the major contribution was linked to coffee consumption in adults (55%), no single item predominantly contributed to the contamination in children; the most important were soups (19%), dairy products (17%), pasta and rice (11%), and fruit and potatoes (9% each).

  17. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  18. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent the probabilistic earthquake hazard in a given region. The method is based on the simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach yields a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between the hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a catalogue worth 10,000 years of events, consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weights is also assessed. The methodology could substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment: the reduced set represents the contributions of all possible earthquakes while requiring far less computational power. The authors have used this approach for risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
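
    The selection step can be imitated with a much simpler greedy heuristic, shown below on a synthetic catalogue: scenarios are added one at a time so as to minimize the misfit between the full and reduced hazard curves, with the kept scenarios' rates inflated to preserve the total event rate. This is only a stand-in for the paper's mixed-integer linear program, and all catalogue shapes and numbers are invented.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic stand-in for a simulated catalogue: each scenario has
        # an annual rate and a contribution to the exceedance rate at ten
        # ground-motion levels (all shapes and numbers invented).
        n_scen, n_levels, k = 500, 10, 60
        rates = rng.lognormal(-7.0, 1.0, n_scen)
        decay = rng.uniform(0.5, 3.0, n_scen)
        contrib = rates[:, None] * np.exp(-np.outer(decay, np.linspace(0.0, 2.0, n_levels)))
        full_curve = contrib.sum(axis=0)

        def misfit(sel):
            # Kept scenarios are up-weighted so the total rate is preserved.
            w = rates.sum() / rates[sel].sum()
            return np.abs(full_curve - w * contrib[sel].sum(axis=0)).sum()

        # Greedy scenario reduction (a simple heuristic stand-in for the
        # paper's mixed-integer linear program).
        selected = []
        for _ in range(k):
            remaining = [i for i in range(n_scen) if i not in selected]
            selected.append(min(remaining, key=lambda i: misfit(selected + [i])))

        print(f"{k} of {n_scen} scenarios give a relative misfit of "
              f"{misfit(selected) / full_curve.sum():.2%}")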

  19. A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS

    EPA Science Inventory

    We have produced a simple two-dimensional (ground-plan) cellular automata model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...

  20. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, a ternary probabilistic forecast, which assigns probabilities to a set of three outcomes (e.g. low, medium, and high risk), is considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level were obtained for 16 countries in Europe for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarity. Aggregations are restricted to fall within political boundaries to avoid problems related to differing adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality from forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales, and we demonstrate the information gained from using this technique compared with more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model confidently predicts area-specific heat waves or cold snaps, in order to effectively target resources to the areas most at risk in a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
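
    The barycentric colour mapping can be sketched directly; the vertex colours and the saturation rule below are assumptions rather than the paper's exact scheme.

        import numpy as np

        def ternary_colour(p_low, p_med, p_high, p_ref=(1/3, 1/3, 1/3)):
            """Map a ternary probabilistic forecast to an RGB colour in [0, 1]^3.

            The hue is the probability-weighted mixture of three vertex
            colours, and saturation grows with the distance from the
            reference (climatological) forecast.
            """
            p = np.array([p_low, p_med, p_high], dtype=float)
            p = p / p.sum()
            ref = np.array(p_ref, dtype=float)
            vertices = np.array([[0.0, 0.4, 1.0],   # blue   = low risk
                                 [1.0, 0.8, 0.0],   # yellow = medium risk
                                 [1.0, 0.0, 0.0]])  # red    = high risk
            hue = p @ vertices
            # Distance from the reference forecast, normalized by the
            # largest possible distance (reference to a vertex).
            sat = np.linalg.norm(p - ref) / np.linalg.norm(np.array([1.0, 0.0, 0.0]) - ref)
            return tuple(1.0 - sat * (1.0 - hue))  # fades to white at the reference

        print(ternary_colour(0.1, 0.2, 0.7))  # a fairly saturated red tone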

  1. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to the Space Shuttle, particularly to the potential of losing the vehicle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with loss due to Orbiter failure, external tank failure, and landing failure or error.

  2. On the Measurement and Properties of Ambiguity in Probabilistic Expectations

    ERIC Educational Resources Information Center

    Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn

    2015-01-01

    Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…

  3. A probabilistic cellular automata model for the dynamics of a population driven by logistic growth and weak Allee effect

    NASA Astrophysics Data System (ADS)

    Mendonça, J. R. G.

    2018-04-01

    We propose and investigate a one-parameter probabilistic mixture of one-dimensional elementary cellular automata as a model for the dynamics of a single-species unstructured population with nonoverlapping generations, in which individuals have a smaller probability of reproducing and surviving in a crowded neighbourhood but also suffer from isolation and dispersal. Remarkably, the first-order mean-field approximation to the dynamics of the model yields a cubic map containing terms representing both logistic and weak Allee effects. The model has a single absorbing state devoid of individuals but, depending on the reproduction and survival probabilities, can achieve a stable population. We determine the critical probability separating these two phases and find that the phase transition between them belongs to the directed percolation universality class of critical behaviour.
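
    A cubic map of this general form is easy to explore numerically; the map below and its coefficients are assumed for illustration, not the map derived in the paper. Sparse populations decline (an Allee-type effect at low density), and depending on the growth parameter the population either falls into the absorbing (extinct) state or settles at a finite density:

        # Iterating a cubic mean-field map with logistic-type saturation and
        # an Allee-type penalty at low density (form and coefficients assumed):
        #   x_{t+1} = r * x_t^2 * (1 - x_t)
        # The per-capita growth factor r*x*(1-x) is small when x is small,
        # so sparse populations decline.
        def step(x, r):
            return r * x ** 2 * (1.0 - x)

        for r in (3.5, 5.0):
            x = 0.5
            for _ in range(1000):
                x = step(x, r)
            print(f"r = {r}: density after 1000 generations = {x:.4f}")
        # r = 3.5 falls into the absorbing (extinct) phase; r = 5.0 sustains
        # a stable density near 0.72.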

  4. Reliability, Risk and Cost Trade-Offs for Composite Designs

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1996-01-01

    Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed, and the probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters, such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost of failure occurrence are determined computationally. The results of the fuselage panel analysis show that the higher the cost of failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (a design parameter) in the longitudinal direction.

  5. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2015-10-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermal-hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or via python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as constructing limit surfaces, i.e. the boundaries separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need for a modern risk evaluation framework became apparent. RAVEN's principal assignment is to provide the necessary software and algorithms to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not only the determination of the frequency of an event potentially leading to a system failure, but also the closeness (or not) to key safety-related events; hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. the peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and the likely future replacement of the RELAP5-3D code. Most of the capabilities implemented with RELAP-7 as the principal focus are easily deployable to other system codes. For this reason, several side activities are ongoing to couple RAVEN with software such as RELAP5-3D. The aim of this document is to explain the input requirements, focusing on the input structure.
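
    RAVEN's limit-surface idea can be illustrated with a toy two-dimensional input space: sample the inputs, classify each point as failure or success, and extract the boundary between the two regions. The system response and failure threshold below are invented stand-ins for a system-code run.

        import numpy as np

        # Toy system response standing in for a system-code run: failure
        # occurs when a peak temperature exceeds a limit (all assumed).
        def fails(power, flow):
            return 500.0 + 40.0 * power / flow > 650.0

        # Grid sampling of a 2-D input space, as with a Grid sampler.
        power = np.linspace(1.0, 10.0, 50)
        flow = np.linspace(0.5, 5.0, 50)
        P, F = np.meshgrid(power, flow)
        failure = fails(P, F)

        # The limit surface is the boundary separating the failure region
        # from the success region: here, grid cells whose neighbours
        # disagree on the outcome.
        boundary = np.zeros_like(failure)
        boundary[:-1, :] |= failure[:-1, :] != failure[1:, :]
        boundary[:, :-1] |= failure[:, :-1] != failure[:, 1:]

        print(f"{np.count_nonzero(boundary)} grid cells lie on the limit surface")
        print(f"failure fraction over the grid: {failure.mean():.3f}")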

  6. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2016-02-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermal-hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or via python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as constructing limit surfaces, i.e. the boundaries separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need for a modern risk evaluation framework became apparent. RAVEN's principal assignment is to provide the necessary software and algorithms to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not only the determination of the frequency of an event potentially leading to a system failure, but also the closeness (or not) to key safety-related events; hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. the peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and the likely future replacement of the RELAP5-3D code. Most of the capabilities implemented with RELAP-7 as the principal focus are easily deployable to other system codes. For this reason, several side activities are ongoing to couple RAVEN with software such as RELAP5-3D. The aim of this document is to explain the input requirements, focusing on the input structure.

  7. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2017-03-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermal-hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or via python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as constructing limit surfaces, i.e. the boundaries separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need for a modern risk evaluation framework became apparent. RAVEN's principal assignment is to provide the necessary software and algorithms to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not only the determination of the frequency of an event potentially leading to a system failure, but also the closeness (or not) to key safety-related events; hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. the peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and the likely future replacement of the RELAP5-3D code. Most of the capabilities implemented with RELAP-7 as the principal focus are easily deployable to other system codes. For this reason, several side activities are ongoing to couple RAVEN with software such as RELAP5-3D. The aim of this document is to explain the input requirements, focusing on the input structure.

  8. Risk Assessment Guidance for Superfund (RAGS) Volume III: Part A

    EPA Pesticide Factsheets

    EPA's Risk Assessment Guidance for Superfund (RAGS) Volume 3A provides policies and guiding principles on the application of probabilistic risk assessment (PRA) methods to human health and ecological risk assessment in the EPA Superfund Program.

  9. A Dynamic Bayesian Network Model for the Production and Inventory Control

    NASA Astrophysics Data System (ADS)

    Shin, Ji-Sun; Takazaki, Noriyuki; Lee, Tae-Hong; Kim, Jin-Il; Lee, Hee-Hyol

    In general, production quantities and delivered goods change randomly, and the total stock therefore also changes randomly. This paper deals with production and inventory control using a Dynamic Bayesian Network. A Bayesian network is a probabilistic model which represents the qualitative dependences between two or more random variables by its graph structure, and the quantitative relations between individual variables by conditional probabilities. The probabilistic distribution of the total stock is calculated through propagation of the probabilities on the network. Moreover, an adjustment rule for the production quantities that maintains the probabilities of the total stock falling below a lower limit or exceeding a ceiling at specified values is shown.
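
    A minimal discrete version of the propagation described above can be written directly. In this sketch, which assumes illustrative stock levels and distributions rather than anything from the paper, the total stock is a discrete random variable and each time slice of the dynamic Bayesian network convolves it with the distribution of net change (production minus delivery):

        import numpy as np

        # Discrete stock levels 0..20 units; all distributions are illustrative.
        n_states = 21
        stock = np.zeros(n_states)
        stock[10] = 1.0                       # start with 10 units for certain

        net_change = {-2: 0.2, -1: 0.2, 0: 0.2, 1: 0.2, 2: 0.2}  # production - demand

        def propagate(dist, net_change):
            """One time slice of the DBN: P(stock_{t+1}) from P(stock_t)."""
            out = np.zeros_like(dist)
            for s, p_s in enumerate(dist):
                if p_s == 0.0:
                    continue
                for delta, p_d in net_change.items():
                    out[np.clip(s + delta, 0, len(dist) - 1)] += p_s * p_d
            return out

        for t in range(12):                   # twelve periods
            stock = propagate(stock, net_change)

        print("P(stock <= 3):", stock[:4].sum())    # near-stockout risk
        print("P(stock >= 18):", stock[18:].sum())  # overstock risk

    An adjustment rule of the kind the paper describes would shift probability mass in net_change toward higher production whenever the computed stockout or overstock probabilities drift from their target values.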

  10. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.
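
    The hazard-to-loss chain of such a model can be sketched in a few lines. The event rate, the toy attenuation, the damage function, and the exposure below are illustrative assumptions, not the Western Balkans model:

        import numpy as np

        rng = np.random.default_rng(0)
        n_years = 10_000          # simulated catalogue length
        rate = 0.2                # events/year in the source zone (illustrative)

        def damage_ratio(pga):
            """Toy damage function: fraction of value lost vs. peak ground accel. [g]."""
            return np.clip((pga - 0.05) / 0.55, 0.0, 1.0)

        exposure = 1e9            # insured value in the zone (illustrative)
        annual_loss = np.zeros(n_years)
        for year in range(n_years):
            for _ in range(rng.poisson(rate)):
                pga = rng.lognormal(mean=np.log(0.08), sigma=0.8)  # toy attenuation
                annual_loss[year] += exposure * damage_ratio(pga)

        losses = np.sort(annual_loss)[::-1]
        # Annual loss exceeded on average once in 250 years:
        print("250-year loss:", losses[int(n_years / 250)])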

  11. Using Probabilistic Methods in Water Scarcity Assessments: A First Step Towards a Water Scarcity Risk Assessment Framework

    NASA Technical Reports Server (NTRS)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Phillip

    2016-01-01

    Water scarcity, driven by climate change, climate variability, and socioeconomic developments, is recognized as one of the most important global risks, both in terms of likelihood and impact. Whilst a wide range of studies have assessed the role of long-term climate change and socioeconomic trends on global water scarcity, the impact of variability is less well understood. Moreover, the interactions between different forcing mechanisms, and their combined effect on changes in water scarcity conditions, are often neglected. Therefore, we provide a first step towards a framework for global water scarcity risk assessments, applying probabilistic methods to estimate water scarcity risks for different return periods under current and future conditions while using multiple climate and socioeconomic scenarios.
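
    Estimating scarcity levels for different return periods, as proposed above, reduces to a quantile computation on an annual series. A minimal sketch using Weibull plotting positions on a synthetic 60-year deficit index (all values assumed):

        import numpy as np

        rng = np.random.default_rng(1)
        deficit = rng.gumbel(loc=0.6, scale=0.15, size=60)   # 60 years of "data"

        x = np.sort(deficit)[::-1]                # largest first
        rank = np.arange(1, len(x) + 1)
        return_period = (len(x) + 1) / rank       # Weibull plotting position

        for T in (2, 10, 50):
            # deficit level with roughly 1/T annual exceedance probability
            level = np.interp(T, return_period[::-1], x[::-1])
            print(f"{T:>3}-year event: deficit index {level:.2f}")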

  12. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  13. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  14. Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections

    NASA Astrophysics Data System (ADS)

    Wakazuki, Y.

    2015-12-01

    A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for the future-climate regional climate model (RCM) simulations by adding them to objective analysis data. This data handling can be regarded as an advanced form of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of the GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to the perturbation of that mode. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although non-linearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with two-mode perturbation simulations, where the number of RCM simulations for the future climate is five. On the other hand, local-scale rainfall needed four-mode simulations, where the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.

  15. Attribution of UK Winter Floods to Anthropogenic Forcing

    NASA Astrophysics Data System (ADS)

    Schaller, N.; Alison, K.; Sparrow, S. N.; Otto, F. E. L.; Massey, N.; Vautard, R.; Yiou, P.; van Oldenborgh, G. J.; van Haren, R.; Lamb, R.; Huntingford, C.; Crooks, S.; Legg, T.; Weisheimer, A.; Bowery, A.; Miller, J.; Jones, R.; Stott, P.; Allen, M. R.

    2014-12-01

    Many regions of the southern UK experienced severe flooding during the 2013/2014 winter. Simultaneously, large areas in the USA and Canada were struck by prolonged cold weather. At the time, the media and public asked whether the generally rainy conditions over northern Europe and the cold weather over North America were caused by climate change. Providing an answer to this question is not trivial, but recent studies show that probabilistic event attribution is feasible. Using the citizen science project weather@home, we ran over 40,000 perturbed initial-condition simulations of the 2013/2014 winter. These simulations fall into two categories: one set aims at simulating the world with climate change using observed sea surface temperatures, while the second set is run with sea surface temperatures corresponding to a world that might have been without climate change. The relevant modelled variables are then downscaled by a hydrological model to obtain river flows. First results show that anthropogenic climate change led to a small but significant increase in the fractional attributable risk for 30-day peak flows for the river Thames. A single number can summarize the final result of probabilistic attribution studies, indicating, for example, an increase, decrease or no change in the risk of the event occurring. However, communicating this to the public, media and other scientists remains challenging. The assumptions made in the chain of models used need to be explained. In addition, extreme events, like the UK floods of the 2013/2014 winter, are usually caused by a range of factors. While heavy precipitation events can be caused by dynamic and/or thermodynamic processes, floods occur only partly as a response to heavy precipitation. Depending on the catchment, they can be largely due to soil properties and the conditions of the previous months. Probabilistic attribution studies are multidisciplinary and therefore all aspects need to be communicated properly.
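
    The fractional attributable risk (FAR) quoted in such studies is a simple ratio of exceedance probabilities in the two ensembles. A worked sketch with made-up counts (not the weather@home results):

        # Fraction of ensemble members exceeding the 30-day peak-flow threshold
        # (illustrative counts, purely for demonstration).
        exceed_actual = 410      # "world as it is" ensemble
        n_actual = 20_000
        exceed_natural = 350     # "world that might have been" ensemble
        n_natural = 20_000

        p1 = exceed_actual / n_actual
        p0 = exceed_natural / n_natural
        far = 1.0 - p0 / p1
        print(f"P(event | actual)  = {p1:.4f}")
        print(f"P(event | natural) = {p0:.4f}")
        print(f"FAR = {far:.2f}  (risk ratio = {p1 / p0:.2f})")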

  16. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  17. Learning to Estimate Dynamical State with Probabilistic Population Codes.

    PubMed

    Makin, Joseph G; Dichter, Benjamin K; Sabes, Philip N

    2015-11-01

    Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons, "probabilistic population codes." We show that a recurrent neural network-a modified form of an exponential family harmonium (EFH)-that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.

  18. Learning to Estimate Dynamical State with Probabilistic Population Codes

    PubMed Central

    Sabes, Philip N.

    2015-01-01

    Tracking moving objects, including one’s own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons, “probabilistic population codes.” We show that a recurrent neural network—a modified form of an exponential family harmonium (EFH)—that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states. PMID:26540152
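
    The optimal estimator that the network learns to approximate is the Kalman filter itself, which can be stated compactly. This sketch tracks the position and velocity of a constant-velocity object from noisy position observations; all matrices and noise levels are illustrative assumptions:

        import numpy as np

        dt = 0.1
        A = np.array([[1.0, dt], [0.0, 1.0]])   # position-velocity dynamics
        C = np.array([[1.0, 0.0]])              # only position is observed
        Q = 1e-3 * np.eye(2)                    # process noise covariance
        R = np.array([[0.05]])                  # observation noise covariance

        def kalman_step(x, P, y):
            # Predict
            x_pred = A @ x
            P_pred = A @ P @ A.T + Q
            # Update
            S = C @ P_pred @ C.T + R
            K = P_pred @ C.T @ np.linalg.inv(S)
            x_new = x_pred + (K @ (y - C @ x_pred)).ravel()
            P_new = (np.eye(2) - K @ C) @ P_pred
            return x_new, P_new

        rng = np.random.default_rng(2)
        x, P = np.zeros(2), np.eye(2)           # state estimate [pos, vel]
        true_pos, true_vel = 0.0, 1.0
        for t in range(50):
            true_pos += true_vel * dt
            y = np.array([true_pos + rng.normal(0.0, 0.22)])
            x, P = kalman_step(x, P, y)
        print("estimated [pos, vel]:", x)       # velocity inferred from positions alone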

  19. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  20. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
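
    Combining independent per-rib fracture probabilities into a whole-ribcage injury probability, as this framework does, can be sketched with a Poisson-binomial distribution. The per-rib probabilities below are invented, and treating AIS3+ as three or more fractured ribs is an assumption made for illustration:

        import numpy as np

        def poisson_binomial_pmf(probs):
            """P(exactly k fractures) for independent per-rib probabilities."""
            pmf = np.array([1.0])
            for p in probs:
                pmf = np.convolve(pmf, [1.0 - p, p])
            return pmf

        # Per-rib fracture probabilities from a strain vs. ultimate-strain
        # comparison (illustrative values, not the THUMS results).
        rib_probs = [0.02, 0.05, 0.12, 0.20, 0.18, 0.10, 0.06, 0.03,
                     0.02, 0.04, 0.11, 0.19, 0.17, 0.09, 0.05, 0.02]

        pmf = poisson_binomial_pmf(rib_probs)
        # Treat AIS3+ as >= 3 fractured ribs (assumption for illustration).
        print("P(AIS3+):", pmf[3:].sum())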

  1. Probabilistic analysis of the efficiency of the damping devices against nuclear fuel container falling

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2017-07-01

    The paper presents a probabilistic and sensitivity analysis of the efficiency of the damping devices protecting the nuclear power plant cover against the drop of a TK C30 nuclear fuel container. A three-dimensional finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behaviour of the shock damper's basic element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analyses of the damping devices were carried out with the AntHILL and ANSYS software.
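
    The Newmark integration used to solve the dynamic equations can be sketched for a single-degree-of-freedom idealization of the impact. The mass, damping, stiffness, and pulse values below are illustrative assumptions, not the paper's finite element model:

        import numpy as np

        def newmark_sdof(m, c, k, force, dt, beta=0.25, gamma=0.5):
            """Average-acceleration Newmark integration of m*u'' + c*u' + k*u = f(t)."""
            n = len(force)
            u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
            a[0] = (force[0] - c * v[0] - k * u[0]) / m
            k_eff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
            for i in range(n - 1):
                rhs = (force[i + 1]
                       + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                              + (0.5 / beta - 1.0) * a[i])
                       + c * (gamma * u[i] / (beta * dt)
                              + (gamma / beta - 1.0) * v[i]
                              + dt * (gamma / (2.0 * beta) - 1.0) * a[i]))
                u[i + 1] = rhs / k_eff
                v[i + 1] = (gamma * (u[i + 1] - u[i]) / (beta * dt)
                            + (1.0 - gamma / beta) * v[i]
                            + dt * (1.0 - gamma / (2.0 * beta)) * a[i])
                a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                            - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
            return u, v, a

        # Half-sine impact pulse on a 1-tonne mass with the damper idealized
        # as a linear spring-dashpot (all values illustrative).
        dt = 1e-4
        t = np.arange(0.0, 0.05, dt)
        f = np.where(t < 0.01, 5e5 * np.sin(np.pi * t / 0.01), 0.0)
        u, v, a = newmark_sdof(m=1000.0, c=2e4, k=4e7, force=f, dt=dt)
        print("peak displacement [m]:", u.max())

    In a probabilistic run, the mass, stiffness, and damping would be sampled from their distributions and this integration repeated per sample.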

  2. A probabilistic QMRA of Salmonella in direct agricultural reuse of treated municipal wastewater.

    PubMed

    Amha, Yamrot M; Kumaraswamy, Rajkumari; Ahmad, Farrukh

    2015-01-01

    Developing reliable quantitative microbial risk assessment (QMRA) procedures aids in setting recommendations on reuse applications of treated wastewater. In this study, a probabilistic QMRA to determine the risk of Salmonella infections resulting from the consumption of edible crops irrigated with treated wastewater was conducted. Quantitative polymerase chain reaction (qPCR) was used to enumerate Salmonella spp. in post-disinfected samples, where they showed concentrations ranging from 90 to 1,600 cells/100 mL. The results were used to construct probabilistic exposure models for the raw consumption of three vegetables (lettuce, cabbage, and cucumber) irrigated with treated wastewater, and to estimate the disease burden using Monte Carlo analysis. The results showed elevated median disease burden, when compared with acceptable disease burden set by the World Health Organization, which is 10⁻⁶ disability-adjusted life years per person per year. Of the three vegetables considered, lettuce showed the highest risk of infection in all scenarios considered, while cucumber showed the lowest risk. The results of the Salmonella concentration obtained with qPCR were compared with the results of Escherichia coli concentration for samples taken on the same sampling dates.
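
    The probabilistic exposure-and-dose-response chain of such a QMRA can be sketched as follows. The input distributions are invented for illustration, and the beta-Poisson parameters are commonly cited values for non-typhoid Salmonella, treated here as assumptions rather than the paper's fitted values:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000                          # Monte Carlo iterations

        # Assumed input distributions (illustrative, not the paper's):
        conc = rng.uniform(0.9, 16.0, n)     # Salmonella cells/mL (90-1,600 per 100 mL)
        reduction = 10.0 ** -rng.normal(2.0, 0.3, n)   # post-harvest log10 reduction
        grams = rng.lognormal(np.log(100.0), 0.3, n)   # lettuce eaten per serving [g]
        water = 0.108                        # mL irrigation water retained per gram

        dose = conc * water * grams * reduction        # ingested cells per serving

        # Beta-Poisson dose-response: P(inf) = 1 - (1 + dose/beta)^(-alpha)
        alpha, beta = 0.3126, 2884.0
        p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

        print("median P(infection) per serving:", np.median(p_inf))
        print("95th percentile:", np.percentile(p_inf, 95))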

  3. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments

    PubMed Central

    Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.

    2018-01-01

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804

  4. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and the peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  5. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. This transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical in influencing the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.

  6. Data-driven Modeling of Metal-oxide Sensors with Dynamic Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Gosangi, Rakesh; Gutierrez-Osuna, Ricardo

    2011-09-01

    We present a data-driven probabilistic framework to model the transient response of MOX sensors modulated with a sequence of voltage steps. Analytical models of MOX sensors are usually built based on the physico-chemical properties of the sensing materials. Although building these models provides an insight into the sensor behavior, they also require a thorough understanding of the underlying operating principles. Here we propose a data-driven approach to characterize the dynamical relationship between sensor inputs and outputs. Namely, we use dynamic Bayesian networks (DBNs), probabilistic models that represent temporal relations between a set of random variables. We identify a set of control variables that influence the sensor responses, create a graphical representation that captures the causal relations between these variables, and finally train the model with experimental data. We validated the approach on experimental data in terms of predictive accuracy and classification performance. Our results show that DBNs can accurately predict the dynamic response of MOX sensors, as well as capture the discriminatory information present in the sensor transients.

  7. Evaluation of Dynamic Coastal Response to Sea-level Rise Modifies Inundation Likelihood

    NASA Technical Reports Server (NTRS)

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

    Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30x30m resolution predictions for more than 38,000 sq km of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.

  8. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
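
    The core computation, propagating uncertain basic-event probabilities through gate logic by Monte Carlo, can be sketched for a three-event OR gate. The beta distributions below stand in for the hard data and expert judgements and are purely illustrative:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 200_000

        # Basic-event probabilities per year, with epistemic uncertainty
        # expressed as beta distributions rather than point values (illustrative).
        source_fail    = rng.beta(2, 98, n)    # raw-water source unavailable
        treatment_fail = rng.beta(1, 199, n)   # treatment breakdown
        pipe_break     = rng.beta(5, 95, n)    # major distribution failure

        # Quantity failure = source OR treatment OR distribution (OR gate).
        p_or = 1.0 - (1.0 - source_fail) * (1.0 - treatment_fail) * (1.0 - pipe_break)

        print("mean P(no water delivered):", p_or.mean())
        print("90% credible interval:", np.percentile(p_or, [5, 95]))

    Weighting each failure mode by its expected duration and customers affected would turn such probabilities into the Customer Minutes Lost measure the study evaluates.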

  9. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.

  10. Probabilistic empirical prediction of seasonal climate: evaluation and potential applications

    NASA Astrophysics Data System (ADS)

    Dieppois, B.; Eden, J.; van Oldenborgh, G. J.

    2017-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. We also discuss the potential development of stakeholder-driven applications of the K-PREP system, including empirical forecasts for circumboreal fire activity.

  11. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which is focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the tool within easy reach of the geo-scientific community. An already available example of such a connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, by considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and by assessing the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).

  12. Alternate Methods in Refining the SLS Nozzle Plug Loads

    NASA Technical Reports Server (NTRS)

    Burbank, Scott; Allen, Andrew

    2013-01-01

    Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.

  13. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    PubMed

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts first, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters among others are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (i.e., diffusion coefficient and layer thickness), predicts migration results with the related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented in view of three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated material selection exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.

  14. Probabilistic model predicts dynamics of vegetation biomass in a desert ecosystem in NW China

    PubMed Central

    Wang, Xin-ping; Schaffer, Benjamin Eli; Yang, Zhenlei; Rodriguez-Iturbe, Ignacio

    2017-01-01

    The temporal dynamics of vegetation biomass are of key importance for evaluating the sustainability of arid and semiarid ecosystems. In these ecosystems, biomass and soil moisture are coupled stochastic variables externally driven, mainly, by the rainfall dynamics. Based on long-term field observations in northwestern (NW) China, we test a recently developed analytical scheme for the description of the leaf biomass dynamics undergoing seasonal cycles with different rainfall characteristics. The probabilistic characterization of such dynamics agrees remarkably well with the field measurements, providing a tool to forecast the changes to be expected in biomass for arid and semiarid ecosystems under climate change conditions. These changes will depend—for each season—on the forecasted rate of rainy days, mean depth of rain in a rainy day, and duration of the season. For the site in NW China, the current scenario of an increase of 10% in rate of rainy days, 10% in mean rain depth in a rainy day, and no change in the season duration leads to forecasted increases in mean leaf biomass near 25% in both seasons. PMID:28584097

  15. Assuring Life in Composite Systems

    NASA Technical Reports Server (NTRS)

    Chamis, Christos c.

    2008-01-01

    A computational simulation method is presented to assure life in composite systems by using dynamic buckling of smart composite shells as an example. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 9% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivities results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load. The uncertainties in the electric field strength and smart material volume fraction have moderate effects and thereby in the assured life of the shell.

  16. A probabilistic topic model for clinical risk stratification from electronic health records.

    PubMed

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation of the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called the probabilistic risk stratification model (PRSM), based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient's clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), which incorporates minimum prior information into the model in order to improve the risk stratification accuracy and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and showed the effectiveness of our models in risk stratification of CHD in terms of Area Under the receiver operating characteristic Curve (AUC) analysis. Moreover, in comparison with PRSM, WS-PRSM achieves an over 2% performance gain on the experimental dataset, demonstrating that incorporating risk scoring knowledge as prior information can improve performance in risk stratification. Experimental results reveal that our models achieve competitive performance in risk stratification in comparison with existing supervised approaches. In addition, the unsupervised nature of our models makes them highly portable to the risk stratification tasks of various diseases. Moreover, patient sub-profiles and sub-profile-specific risk tiers generated by our models are coherent and informative, and provide significant potential to be explored for further tasks, such as patient cohort analysis. We hypothesize that the proposed framework can readily meet the demand for risk stratification from a large volume of EHRs in an open-ended fashion. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Probabilistic confidence for decisions based on uncertain reliability estimates

    NASA Astrophysics Data System (ADS)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.
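
    One way to operationalize this notion of probabilistic confidence is to place a distribution on the uncertain failure probability itself and compute the probability that it meets the target. A minimal sketch with assumed lognormal estimation uncertainty (all values illustrative):

        import numpy as np

        rng = np.random.default_rng(5)

        # Estimated failure probability with estimation uncertainty expressed
        # as a lognormal: median 1e-4, log-standard deviation 0.8 (assumed).
        pf_samples = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=1_000_000)

        target = 1e-3   # acceptable failure probability
        confidence = np.mean(pf_samples <= target)
        print(f"probabilistic confidence that pf <= {target:g}: {confidence:.3f}")
        # If the confidence is too low, the design (or the design process,
        # e.g. the number of prototype tests) must be revised until the
        # required confidence level is met.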

  18. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and determinist approaches to seismic hazard analysis.

  19. Grid Inertial Response-Based Probabilistic Determination of Energy Storage System Capacity Under High Solar Penetration

    DOE PAGES

    Yue, Meng; Wang, Xiaoyu

    2015-07-01

    It is well-known that responsive battery energy storage systems (BESSs) are an effective means to improve the grid inertial response to various disturbances including the variability of the renewable generation. One of the major issues associated with its implementation is the difficulty in determining the required BESS capacity mainly due to the large amount of inherent uncertainties that cannot be accounted for deterministically. In this study, a probabilistic approach is proposed to properly size the BESS from the perspective of the system inertial response, as an application of probabilistic risk assessment (PRA). The proposed approach enables a risk-informed decision-making process regarding (1) the acceptable level of solar penetration in a given system and (2) the desired BESS capacity (and minimum cost) to achieve an acceptable grid inertial response with a certain confidence level.

  20. Experiences with Probabilistic Analysis Applied to Controlled Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Giesy, Daniel P.

    2004-01-01

    This paper presents a semi-analytic method for computing frequency dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or with multiple highly correlated uncertain parameters. The approach will be shown to not suffer from the same computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency domain performance of an optimal feed-forward disturbance rejection scheme.

  1. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
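
    The eigenvalue-based instability criterion can be sketched with plain Monte Carlo rather than the paper's fast probability integration. The two-degree-of-freedom rotor model below, with an uncertain cross-coupled stiffness, is an illustrative assumption:

        import numpy as np

        rng = np.random.default_rng(6)
        n = 20_000

        # M q'' + C q' + K q = 0 recast in first-order form; the cross-coupled
        # stiffness kxy is the uncertain parameter (illustrative distribution).
        m, c, k = 1.0, 0.8, 100.0
        unstable = 0
        for _ in range(n):
            kxy = rng.normal(8.0, 2.5)        # destabilizing cross-coupling
            K = np.array([[k, kxy], [-kxy, k]])
            C = c * np.eye(2)
            M_inv = np.eye(2) / m
            A = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [-M_inv @ K, -M_inv @ C]])
            # Unstable if any state-matrix eigenvalue has a positive real part.
            if np.max(np.linalg.eigvals(A).real) > 0.0:
                unstable += 1

        print("P(instability):", unstable / n)

    Fast probability integration and adaptive importance sampling, as the paper proposes, aim to reach the same probability with far fewer samples than this brute-force loop.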

  2. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions), assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given severities) and vulnerability (the probability that a limit state performance is reached, given a certain severity). Then, for each landslide, all the exposed goods (structures and infrastructures) within the landslide area and within a buffer (representative of the maximum extension of a landslide given a reactivation) are counted. The risk is the product of the damage probability and the ratio of the exposed goods of each landslide to the whole assets exposed to the same type of landslides. Since the risk is computed numerically and by the same procedure applied to all landslides, it is free from any subjective assessment such as those implied in qualitative methods.
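
    The probability convolution of hazard and fragility described above reduces, for discrete severity classes, to a dot product. All numbers in this sketch are illustrative:

        import numpy as np

        # Hazard: probability mass over landslide severity classes (kinetic
        # energy bins); fragility: P(limit state reached | severity class).
        severity_pmf = np.array([0.55, 0.30, 0.10, 0.05])       # low ... extreme

        fragility = {                                            # per limit state
            "aesthetic":  np.array([0.30, 0.70, 0.95, 1.00]),
            "functional": np.array([0.05, 0.35, 0.80, 0.98]),
            "structural": np.array([0.01, 0.10, 0.45, 0.90]),
        }

        for state, frag in fragility.items():
            p_damage = np.dot(severity_pmf, frag)   # hazard-fragility convolution
            print(f"P({state} limit state exceeded) = {p_damage:.3f}")

    The risk for each landslide then follows by multiplying the damage probability by the exposed-assets ratio, as the abstract describes.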

  3. Dynamic shaping of dopamine signals during probabilistic Pavlovian conditioning.

    PubMed

    Hart, Andrew S; Clark, Jeremy J; Phillips, Paul E M

    2015-01-01

    Cue- and reward-evoked phasic dopamine activity during Pavlovian and operant conditioning paradigms is well correlated with reward-prediction errors from formal reinforcement learning models, which feature teaching signals in the form of discrepancies between actual and expected reward outcomes. Additionally, in learning tasks where conditioned cues probabilistically predict rewards, dopamine neurons show sustained cue-evoked responses that are correlated with the variance of reward and are maximal to cues predicting rewards with a probability of 0.5. Therefore, it has been suggested that sustained dopamine activity after cue presentation encodes the uncertainty of impending reward delivery. In the current study we examined the acquisition and maintenance of these neural correlates using fast-scan cyclic voltammetry in rats implanted with carbon fiber electrodes in the nucleus accumbens core during probabilistic Pavlovian conditioning. The advantage of this technique is that we can sample from the same animal and recording location throughout learning with single trial resolution. We report that dopamine release in the nucleus accumbens core contains correlates of both expected value and variance. A quantitative analysis of these signals throughout learning, and during the ongoing updating process after learning in probabilistic conditions, demonstrates that these correlates are dynamically encoded during these phases. Peak CS-evoked responses are correlated with expected value and predominate during early learning while a variance-correlated sustained CS signal develops during the post-asymptotic updating phase. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (a within-subject factor). After the presentation of each probability level, participants were asked to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing (the rational deliberative versus the affective experiential) and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  5. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential φ. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model fit to the historical database of measurements of protons with energy >30 MeV (Φ30). The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPE and GCR supports mission design and operational planning to manage radiation risks for space exploration.
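
    The occurrence model described above can be illustrated by sampling event times from a nonhomogeneous Poisson process via thinning. The intensity function below is a made-up stand-in for the fitted Φ30 hazard over an 11-year cycle, not the model from the paper.

        import math
        import random

        CYCLE_YEARS = 11.0
        LAM_MAX = 6.0  # upper bound on the intensity (events/year), assumed

        def intensity(t):
            """Assumed SPE rate, peaking mid-cycle (solar maximum)."""
            return LAM_MAX * math.sin(math.pi * t / CYCLE_YEARS) ** 2

        def sample_spe_times():
            t, events = 0.0, []
            while True:
                t += random.expovariate(LAM_MAX)  # candidate from a homogeneous process
                if t > CYCLE_YEARS:
                    return events
                if random.random() < intensity(t) / LAM_MAX:  # accept with prob λ(t)/λ_max
                    events.append(round(t, 2))

        random.seed(1)
        print(sample_spe_times())  # event times (years into the cycle)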

  6. Threshold exceedance risk assessment in complex space-time systems

    NASA Astrophysics Data System (ADS)

    Angulo, José M.; Madrid, Ana E.; Romero, José L.

    2015-04-01

    Environmental and health impact risk assessment studies most often involve the analysis and characterization of complex spatio-temporal dynamics. Recent developments in this context are addressed, among other objectives, to the proper representation of structural heterogeneities, heavy-tailed processes, long-range dependence, intermittency, scaling behavior, etc. Extremal behaviour related to spatial threshold exceedances can be described in terms of geometrical characteristics and distribution patterns of excursion sets, which are the basis for the construction of risk-related quantities, such as in the evolutionary study of 'hotspots' and long-term indicators of the occurrence of extremal episodes. The derivation of flexible techniques, suitable both for application under general conditions and for the interpretation of singularities, is important in practice. Modern risk theory, a developing discipline motivated by the need to establish solid general mathematical-probabilistic foundations for the rigorous definition and characterization of risk measures, has led to the introduction of a variety of classes and families, ranging from some conceptually inspired by specific fields of application to some intended to provide generality and flexibility to risk analysts under parametric specifications. Quantile-based risk measures, such as Value-at-Risk (VaR), Average Value-at-Risk (AVaR), and their generalization to spectral measures, are of particular interest for assessment under very general conditions. In this work, we study the application of quantile-based risk measures in the spatio-temporal context in relation to certain geometrical characteristics of spatial threshold exceedance sets. In particular, we establish a closed-form relationship between VaR, AVaR, and the expected value of threshold exceedance areas and excess volumes. Conditional simulation allows us to derive, by means of empirical global and local spatial cumulative distributions, various statistics of practical interest, and subsequently to construct dynamic risk maps. Further, we study the implementation of static and dynamic spatial deformation under this setup, meaningful, among other aspects, for the incorporation of heterogeneities and/or covariate effects, or the consideration of external factors for risk measurement. We illustrate this approach through environment and health applications. This work is partially supported by grant MTM2012-32666 of the Spanish Ministry of Economy and Competitiveness (co-financed by FEDER).
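
    For readers unfamiliar with the quantile-based measures named above, the following sketch estimates VaR and AVaR empirically from a sample of threshold-exceedance areas; the lognormal sample is an illustrative assumption.

        import numpy as np

        rng = np.random.default_rng(0)
        areas = rng.lognormal(mean=1.0, sigma=0.8, size=10_000)  # excursion-set areas

        def var_avar(x, alpha=0.95):
            var = np.quantile(x, alpha)  # Value-at-Risk: the alpha-quantile
            avar = x[x >= var].mean()    # Average Value-at-Risk: mean beyond that quantile
            return var, avar

        v, av = var_avar(areas)
        print(f"VaR_95 = {v:.2f}, AVaR_95 = {av:.2f}")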

  7. Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds

    EPA Science Inventory

    When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...

  8. How Can You Support RIDM/CRM/RM Through the Use of PRA

    NASA Technical Reports Server (NTRS)

    DoVemto. Tpmu

    2011-01-01

    Probabilistic Risk Assessment (PRA) is one of the key Risk-Informed Decision Making (RIDM) tools. It is a scenario-based methodology aimed at identifying and assessing safety and technical performance risks in complex technological systems.

  9. Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos

    NASA Astrophysics Data System (ADS)

    Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo

    2002-12-01

    We consider nonequilibrium probabilistic dynamics in logistic-like maps x_{t+1} = 1 - a|x_t|^z (z > 1) at their chaos threshold: We first introduce many initial conditions within one among W >> 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 - Σ_{i=1}^{W} p_i^q)/(q - 1) linearly increases with time. We then verify that S_{q_sen}(t) - S_{q_sen}(∞) vanishes like t^{-1/[q_rel(W)-1]}, with q_rel(W) > 1. We finally exhibit a new finite-size scaling, q_rel(∞) - q_rel(W) ~ W^{-|q_sen|}. This establishes quantitatively, for the first time, a long-pursued relation between sensitivity to initial conditions and relaxation, concepts which play central roles in nonextensive statistical mechanics.
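
    The entropy-growth experiment can be reproduced numerically along the following lines: many initial conditions in one of W cells are evolved by the z = 2 logistic-like map at its chaos threshold, and S_q is computed from the cell occupation probabilities. The values a_c ≈ 1.40115519 and q_sen ≈ 0.2445 for z = 2 are standard literature values; the ensemble size, cell count and horizon are arbitrary choices, and roughly linear growth of S_{q_sen} is the expected behavior.

        import numpy as np

        A_C, Q_SEN = 1.40115519, 0.2445  # chaos-threshold parameter and q_sen for z = 2
        W, N = 1000, 100_000             # cells partitioning [-1, 1]; ensemble size

        rng = np.random.default_rng(1)
        x = rng.uniform(-1.0, -1.0 + 2.0 / W, size=N)  # all trajectories start in one cell

        def S_q(p, q):
            p = p[p > 0]
            return (1.0 - np.sum(p ** q)) / (q - 1.0)

        for t in range(20):
            counts, _ = np.histogram(x, bins=W, range=(-1.0, 1.0))
            print(t, round(S_q(counts / N, Q_SEN), 3))  # grows roughly linearly in t
            x = 1.0 - A_C * np.abs(x) ** 2              # logistic-like map, z = 2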

  10. Probabilistic Component Mode Synthesis of Nondeterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1996-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine the probability of a desired response characteristic, such as a natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. We present a method in which the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented to illustrate the theory.

  11. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and c.g. location, and requirements on adapter stiffnesses, while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adapter parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP-based approaches are also examined.

  12. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  13. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  14. Probabilistic eruption forecasting at short and long time scales

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short-term (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; and at long-term (years to decades), it is the basic component for land use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  15. A Comparison of Risk Sensitive Path Planning Methods for Aircraft Emergency Landing

    NASA Technical Reports Server (NTRS)

    Meuleau, Nicolas; Plaunt, Christian; Smith, David E.; Smith, Tristan

    2009-01-01

    Determining the best site to land a damaged aircraft presents some interesting challenges for standard path planning techniques. There are multiple possible locations to consider, the space is 3-dimensional with dynamics, the criterion for a good path is overall risk rather than distance or time, and optimization really matters, since an improved path corresponds to a greater expected survival rate. We have investigated a number of different path planning methods for solving this problem, including cell decomposition, visibility graphs, probabilistic road maps (PRMs), and local search techniques. In their pure form, none of these techniques has proven entirely satisfactory: some are too slow or unpredictable, some produce highly non-optimal paths or do not find certain types of paths, and some do not cope well with the dynamic constraints when controllability is limited. In the end, we are converging towards a hybrid technique that involves seeding a roadmap with a layered visibility graph, using PRM to extend that roadmap, and using local search to further optimize the resulting paths. We describe the techniques we have investigated, report on our experiments with these techniques, and discuss when and why various techniques were unsatisfactory.

  16. Phylodynamics with Migration: A Computational Framework to Quantify Population Structure from Genomic Data.

    PubMed

    Kühnert, Denise; Stadler, Tanja; Vaughan, Timothy G; Drummond, Alexei J

    2016-08-01

    When viruses spread, outbreaks can be spawned in previously unaffected regions. Depending on the time and mode of introduction, each regional outbreak can have its own epidemic dynamics. The migration and phylodynamic processes are often intertwined and need to be taken into account when analyzing temporally and spatially structured virus data. In this article, we present a fully probabilistic approach for the joint reconstruction of phylodynamic history in structured populations (such as geographic structure) based on a multitype birth-death process. This approach can be used to quantify the spread of a pathogen in a structured population. Changes in epidemic dynamics through time within subpopulations are incorporated through piecewise constant changes in transmission parameters. We analyze a global human influenza H3N2 virus data set from a geographically structured host population to demonstrate how seasonal dynamics can be inferred simultaneously with the phylogeny and migration process. Our results suggest that the main migration path among the northern, tropical, and southern regions represented in the sample analyzed here is the one leading from the tropics to the northern region. Furthermore, the time-dependent transmission dynamics between and within two HIV risk groups, heterosexuals and injecting drug users, in the Latvian HIV epidemic are investigated. Our analyses confirm that the Latvian HIV epidemic peaking around 2001 was mainly driven by the injecting drug user risk group. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  17. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the Integrated Probabilistic Assessment of Composite Structures (IPACS) computer code. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, B

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  19. Probabilistic safety analysis for urgent situations following the accidental release of a pollutant in the atmosphere

    NASA Astrophysics Data System (ADS)

    Armand, P.; Brocheton, F.; Poulet, D.; Vendel, F.; Dubourg, V.; Yalamas, T.

    2014-10-01

    This paper is an original contribution to uncertainty quantification in atmospheric transport & dispersion (AT&D) at the local scale (1-10 km). It is proposed to account for the imprecise knowledge of the meteorological and release conditions in the case of an accidental hazardous atmospheric emission. The aim is to produce probabilistic risk maps instead of a deterministic toxic-load map in order to help stakeholders make their decisions. Given the urgency of such situations, the proposed methodology is able to produce such maps in a limited amount of time. It resorts to a Lagrangian particle dispersion model (LPDM) using wind fields interpolated from a pre-established database that collects the results from a computational fluid dynamics (CFD) model. This decouples the CFD simulations from the dispersion analysis, yielding a considerable saving of computational time. In order to make the Monte-Carlo-sampling-based estimation of the probability field even faster, it is also proposed to use a vector Gaussian process surrogate model together with high-performance computing (HPC) resources. The Gaussian process (GP) surrogate modelling technique is coupled with a probabilistic principal component analysis (PCA) to reduce the number of GP predictors to fit, store and predict. The design of experiments (DOE) from which the surrogate model is built is run over a cluster of PCs to make the total production time as short as possible. The use of GP predictors is validated by comparing the results produced by this technique with those obtained by crude Monte Carlo sampling.
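
    A minimal sketch of the surrogate strategy described above: the multi-point output field is compressed with PCA, and one Gaussian-process predictor is fitted per retained component. The synthetic design of experiments below stands in for the CFD/LPDM runs, and scikit-learn defaults replace the paper's actual kernels and probabilistic PCA.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 1, size=(200, 3))          # (wind dir, wind speed, source rate)
        field = np.sin(X @ rng.normal(size=(3, 50)))  # toy 50-point toxic-load field

        pca = PCA(n_components=5).fit(field)
        scores = pca.transform(field)                 # a few components instead of 50 outputs

        gps = [GaussianProcessRegressor().fit(X, scores[:, i]) for i in range(scores.shape[1])]

        def predict_field(x_new):
            z = np.column_stack([gp.predict(x_new) for gp in gps])
            return pca.inverse_transform(z)           # back to the full spatial field

        print(predict_field(np.array([[0.5, 0.5, 0.5]])).shape)  # -> (1, 50)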

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC, by Brookhaven National Laboratory. The DCPRA is a full-scope Level I effort and, although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  1. NSTS Orbiter auxiliary power unit turbine wheel cracking risk assessment

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Mcclung, R. C.; Torng, T. Y.

    1992-01-01

    The present investigation of turbine-wheel cracking problems in the hydrazine-fueled APU turbine wheel of the Space Shuttle Orbiter has indicated the efficacy of systematic probabilistic risk assessment in flight certification and safety resolution. Nevertheless, real crack-initiation and propagation problems do not lend themselves to purely analytical studies. The high-cycle fatigue problem is noted to be generally unsuited to probabilistic modeling, due to its extremely high degree of intrinsic scatter. In the case treated, the cracks appear to trend toward crack arrest in a low-cycle fatigue mode, due to a detuning of the resonant mode.

  2. Summer drought predictability over Europe: empirical versus dynamical forecasts

    NASA Astrophysics Data System (ADS)

    Turco, Marco; Ceglar, Andrej; Prodhomme, Chloé; Soret, Albert; Toreti, Andrea; Doblas-Reyes, Francisco J.

    2017-08-01

    Seasonal climate forecasts could be an important planning tool for farmers, government and insurance companies, leading to better and more timely management of seasonal climate risks. However, seasonal climate forecasts are often under-used, because potential users are not well aware of the capabilities and limitations of these products. This study aims at assessing the merits and caveats of a statistical empirical method, the ensemble streamflow prediction system (ESP, an ensemble based on reordering historical data), and an operational dynamical forecast system, the European Centre for Medium-Range Weather Forecasts System 4 (S4), in predicting summer drought in Europe. Droughts are defined using the Standardized Precipitation Evapotranspiration Index for the month of August integrated over 6 months. Both systems show useful and mostly comparable deterministic skill. We argue that this source of predictability is mostly attributable to the observed initial conditions. S4 shows higher skill only in its ability to probabilistically identify drought occurrence. Thus, currently, both approaches provide useful information, and ESP represents a computationally fast alternative to dynamical prediction applications for drought prediction.

  3. III: Use of biomarkers as Risk Indicators in Environmental Risk Assessment of oil based discharges offshore.

    PubMed

    Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M

    2017-06-01

    Offshore oil and gas activities are required not to cause adverse environmental effects, and risk-based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters used to monitor the development of risk-affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for the management of oil-based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data and assessment methods. The data show several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways in which these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system, when linked with whole-organism responses, are discussed. This can be a useful approach for integrating biomarkers into probabilistic risk assessment related to oil-based discharges, representing a potential supplement to the information that biomarkers already provide about environmental impact and risk related to these kinds of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    PubMed

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic assessments of both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% possibility of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs with the highest contributions to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and the exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
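
    A hedged sketch of this kind of Monte Carlo risk calculation, restricted to the oral-intake pathway, is given below. All distributions and exposure factors are illustrative placeholders (the cancer slope factor shown is the commonly used oral value for BaP), not the study's inputs.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        conc = rng.lognormal(np.log(1000), 0.8, n)  # BaP-equivalent soil concentration (ng/g)
        ing_rate = rng.normal(100, 20, n).clip(1)   # soil ingestion rate (mg/day)
        ef = rng.uniform(180, 350, n)               # exposure frequency (days/year)
        ed, bw, at = 25, 60.0, 70 * 365             # duration (yr), body weight (kg), averaging time (d)
        csf_oral = 7.3                              # oral cancer slope factor for BaP, (mg/kg-day)^-1

        # lifetime average daily dose in mg/kg-day (ng/g -> mg/g, mg/day -> g/day)
        ladd = conc * 1e-6 * ing_rate * 1e-3 * ef * ed / (bw * at)
        tcr = ladd * csf_oral
        print("P(TCR > 1e-6) =", np.mean(tcr > 1e-6))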

  5. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps--risk assessment and hazard reduction (or safety) measures--are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.

  6. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.

  7. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model via probabilistic-based estimates, and how risk is addressed: cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.

  8. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  9. Asteroid Impact Risk: Ground Hazard versus Impactor Size

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan; Wheeler, Lorien; Dotson, Jessie; Aftosmis, Michael; Tarano, Ana

    2017-01-01

    We utilized a probabilistic asteroid impact risk (PAIR) model to stochastically assess the impact risk due to an ensemble population of Near-Earth Objects (NEOs). Concretely, we present the variation of risk with impactor size. Results suggest that large impactors dominate the average risk, even when only considering the subset of undiscovered NEOs.

  10. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

    Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the forecast event becomes reality. In this paper, we present the results of a simple game that explores how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three possible shop owners. Each type of shop faces losses of quite different magnitude should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring which would inundate their shop and lead to those losses. In response, they have to decide if they want to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take action based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.
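
    The rational baseline for this game is the classical cost/loss rule: act whenever the forecast probability p exceeds the cost/loss ratio C/L. The toy simulation below applies that rule for three hypothetical shops, assuming perfectly reliable forecasts; all numbers are invented.

        import random

        # (cost of action C, loss if flooded L) for three hypothetical shops
        shops = {"florist": (50, 500), "bookshop": (200, 2000), "jeweller": (500, 50_000)}

        random.seed(4)
        for name, (c, l) in shops.items():
            total = 0.0
            for _ in range(1000):            # a season of forecast-decision cycles
                p = random.random()          # forecast flood probability
                act = p > c / l              # classical cost/loss decision rule
                flood = random.random() < p  # assume perfectly reliable forecasts
                total += c if act else (l if flood else 0.0)
            print(f"{name:9s} C/L = {c / l:.3f}  mean cost per cycle = {total / 1000:.1f}")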

  11. Dynamic isoperimetry and the geometry of Lagrangian coherent structures

    NASA Astrophysics Data System (ADS)

    Froyland, Gary

    2015-10-01

    The study of transport and mixing processes in dynamical systems is particularly important for the analysis of mathematical models of physical systems. We propose a novel, direct geometric method to identify subsets of phase space that remain strongly coherent over a finite time duration. This new method is based on a dynamic extension of classical (static) isoperimetric problems; the latter are concerned with identifying submanifolds with the smallest boundary size relative to their volume. The present work introduces dynamic isoperimetric problems; the study of sets with small boundary size relative to volume as they are evolved by a general dynamical system. We formulate and prove dynamic versions of the fundamental (static) isoperimetric (in)equalities; a dynamic Federer-Fleming theorem and a dynamic Cheeger inequality. We introduce a new dynamic Laplace operator and describe a computational method to identify coherent sets based on eigenfunctions of the dynamic Laplacian. Our results include formal mathematical statements concerning geometric properties of finite-time coherent sets, whose boundaries can be regarded as Lagrangian coherent structures. The computational advantages of our new approach are a well-separated spectrum for the dynamic Laplacian, and flexibility in appropriate numerical approximation methods. Finally, we demonstrate that the dynamic Laplace operator can be realised as a zero-diffusion limit of a newly advanced probabilistic transfer operator method [9] for finding coherent sets, which is based on small diffusion. Thus, the present approach sits naturally alongside the probabilistic approach [9], and adds a formal geometric interpretation.

  12. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
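
    Once distributions have been fitted, the risk index p(PEC/PNEC > 1) can be estimated by sampling, as in the minimal sketch below; the lognormal parameters are illustrative assumptions, not the Loire/Moselle fits.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200_000
        pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)   # exposure, e.g. dissolved Cu (ug/L)
        pnec = rng.lognormal(mean=np.log(8.0), sigma=0.4, size=n)  # BLM-based no-effect level (ug/L)

        p_exceed = np.mean(pec / pnec > 1.0)  # probabilistic risk index
        print(f"p(PEC/PNEC > 1) = {p_exceed:.4f}")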

  13. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
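
    Expected annual damage figures of the kind quoted above are typically obtained by integrating damage against annual exceedance probability over a set of event severities. The sketch below shows that calculation with invented probability/damage pairs.

        import numpy as np

        aep = np.array([0.5, 0.2, 0.1, 0.02, 0.01])      # annual exceedance probabilities
        damage = np.array([0, 40, 120, 600, 900]) * 1e6  # corresponding damage (GBP)

        order = np.argsort(aep)                          # integrate over ascending probability
        p, d = aep[order], damage[order]
        ead = float(np.sum(np.diff(p) * (d[1:] + d[:-1]) / 2.0))  # trapezoidal rule
        print(f"EAD ~ {ead / 1e6:.0f} million GBP/year")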

  14. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  15. Probabilistic health risk assessment for arsenic intake through drinking groundwater in Taiwan's Pingtung Plain

    NASA Astrophysics Data System (ADS)

    Liang, C. P.; Chen, J. S.

    2017-12-01

    An abundant and inexpensive supply of groundwater is used to meet the drinking, agriculture and aquaculture requirements of the residents of the Pingtung Plain. Long-term groundwater quality monitoring data indicate that the As content of groundwater in the Pingtung Plain exceeds the maximum level of 10 μg/L recommended by the World Health Organization (WHO). The situation is further complicated by the fact that only 46.89% of the population in the Pingtung Plain is served with tap water, far below the national average of 92.93%. Considering that there is considerable variation in the measured concentrations, from below the detection limit (<0.1 μg/L) to a maximum value of 544 μg/L, and in the consumption rate and body weight of individuals, the conventional approach to conducting a human health risk assessment may be insufficient for health risk management. This study presents a probabilistic risk assessment for inorganic As intake through the consumption of drinking groundwater by local residents of the Pingtung Plain. The probabilistic risk assessment is achieved using a Monte Carlo simulation technique based on the hazard quotient (HQ) and target cancer risk (TR) established by the U.S. Environmental Protection Agency. This study demonstrates the importance of individual variability in inorganic As intake through drinking groundwater consumption when evaluating a high-exposure sub-group of the population who drink groundwater with high As content.

  16. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models, and on the other hand via a comparison with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. The significant advantage of the probabilistic flood loss estimation model BT-FLEMO is thus that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
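
    In the spirit of BT-FLEMO, a bagging ensemble of regression trees yields one damage estimate per tree, so the ensemble spread provides a predictive distribution rather than a point estimate. The sketch below uses synthetic features and damage values, not the Mulde survey data or the actual BT-FLEMO predictors.

        import numpy as np
        from sklearn.ensemble import BaggingRegressor
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(6)
        # features: water depth (m), inundation duration (h), building quality score
        X = rng.uniform([0, 1, 0], [3, 72, 1], size=(500, 3))
        y = 20_000 * X[:, 0] + 100 * X[:, 1] + rng.normal(0, 5_000, 500)  # toy damage (EUR)

        model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100).fit(X, y)

        x_new = np.array([[1.5, 24.0, 0.5]])
        per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
        print(f"median {np.median(per_tree):.0f} EUR, 5-95% range "
              f"[{np.percentile(per_tree, 5):.0f}, {np.percentile(per_tree, 95):.0f}] EUR")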

  17. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
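
    The Monte Carlo logic of a PTHA can be caricatured as follows: a synthetic catalogue is drawn from source annual rates, each event receives a modelled coastal height with lognormal aleatory scatter, and exceedance probabilities follow by counting. The source parameters below are invented placeholders, not the Indonesian source model.

        import numpy as np

        rng = np.random.default_rng(8)
        # (annual rate, median coastal height in m, lognormal sigma) per source zone
        sources = [(0.02, 0.8, 0.5), (0.005, 2.5, 0.5), (0.001, 6.0, 0.5)]

        YEARS = 100_000
        max_height = np.zeros(YEARS)  # maximum tsunami height per synthetic year
        for rate, med, sig in sources:
            n_events = rng.poisson(rate * YEARS)
            years = rng.integers(0, YEARS, n_events)
            heights = rng.lognormal(np.log(med), sig, n_events)
            np.maximum.at(max_height, years, heights)

        for thr in (0.5, 3.0):
            print(f"annual P(height > {thr} m) ~ {np.mean(max_height > thr):.4f}")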

  18. Application of reliability-centered-maintenance to BWR ECCS motor operator valve performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.; Choi, Y.A.

    1993-01-01

    This paper describes the application of reliability-centered maintenance (RCM) methods to plant probabilistic risk assessment (PRA) and safety analyses for four boiling water reactor emergency core cooling systems (ECCSs): (1) high-pressure coolant injection (HPCI); (2) reactor core isolation cooling (RCIC); (3) residual heat removal (RHR); and (4) core spray systems. Reliability-centered maintenance is a system function-based technique for improving a preventive maintenance program that is applied on a component basis. Those components that truly affect plant function are identified, and maintenance tasks are focused on preventing their failures. The RCM evaluation establishes the relevant criteria that preserve system function so that an RCM-focused approach can be flexible and dynamic.

  19. A breakthrough in neuroscience needs a "Nebulous Cartesian System" Oscillations, quantum dynamics and chaos in the brain and vegetative system.

    PubMed

    Başar, Erol; Güntekin, Bahar

    2007-04-01

    The Cartesian System is a fundamental conceptual and analytical framework related to and interwoven with the concept and applications of Newtonian Dynamics. In order to analyze quantum processes, physicists moved to a Probabilistic Cartesian System in which the causality principle became a probabilistic one. This means the trajectories of particles (obeying quantum rules) can be described only with the concept of cloudy wave packets. The approach to the brain-body-mind problem requires more than the prerequisites of modern physics and quantum dynamics. In the analysis of the brain-body-mind construct we have to include uncertain causalities, and consequently multiple uncertain causalities. These multiple causalities originate from (1) nonlinear properties of the vegetative system (e.g. irregularities in biochemical transmitters, cardiac output, turbulences in the vascular system, respiratory apnea, nonlinear oscillatory interactions in peristalsis); (2) nonlinear behavior of neuronal electricity (e.g. chaotic behavior measured by EEG); (3) genetic modulations; and (4) in addition to these physiological entities, nonlinear properties of physical processes in the body. The brain shows deterministic chaos with a correlation dimension of approximately D2 = 6, the smooth muscles of approximately D2 = 3. According to these facts we propose a hyper-probabilistic approach, or a hyper-probabilistic Cartesian System, to describe and analyze the processes in the brain-body-mind system. If we add aspects such as our sentiments, emotions and creativity to this construct, better said to this already hyper-probabilistic construct, this "New Cartesian System" is more than hyper-probabilistic: it is a nebulous system; we can predict the future only in a nebulous way. However, despite this chain of reasoning, we can still provide predictions on brain-body-mind incorporations. We tentatively assume that the processes or mechanisms of the brain-body-mind system can be analyzed and predicted similarly to the metaphor of "finding the walking path on a cloudy or foggy day". This is what is meant by "The Nebulous Cartesian System" (NCS). Descartes, at the time of his genius step, did not possess the knowledge of today's physiology and modern physics; we think that the time has come to consider such a New Cartesian System. To deal with this, we propose the utilization of the Heisenberg S-Matrix and a modified version of the Feynman Diagrams which we call "Brain Feynman Diagrams". Another metaphor to consider within the oscillatory approach of the NCS is string theory. We also emphasize that fundamental steps should be undertaken in order to create the brain-body-mind incorporation's own dynamical framework; suggestions or metaphors from physics and mathematics are useful, but the grammar of the brain's intrinsic language must be understood with the help of a new, biologically founded, adaptive-probabilistic Cartesian system. This new Cartesian System will undergo mutations and transcend to the philosophy of Henri Bergson, in parallel to the evolution theory of Charles Darwin, to open gateways for approaching the brain-body-mind problem.

  20. Dynamic Probabilistic Modeling of Environmental Emissions of Engineered Nanomaterials.

    PubMed

    Sun, Tian Yin; Bornhöft, Nikolaus A; Hungerbühler, Konrad; Nowack, Bernd

    2016-05-03

    The need for an environmental risk assessment for engineered nanomaterials (ENM) necessitates the knowledge about their environmental concentrations. Despite significant advances in analytical methods, it is still not possible to measure the concentrations of ENM in natural systems. Material flow and environmental fate models have been used to provide predicted environmental concentrations. However, almost all current models are static and consider neither the rapid development of ENM production nor the fact that many ENM are entering an in-use stock and are released with a lag phase. Here we use dynamic probabilistic material flow modeling to predict the flows of four ENM (nano-TiO2, nano-ZnO, nano-Ag and CNT) to the environment and to quantify their amounts in (temporary) sinks such as the in-use stock and ("final") environmental sinks such as soil and sediment. Caused by the increase in production, the concentrations of all ENM in all compartments are increasing. Nano-TiO2 had far higher concentrations than the other three ENM. Sediment showed in our worst-case scenario concentrations ranging from 6.7 μg/kg (CNT) to about 40 000 μg/kg (nano-TiO2). In most cases the concentrations in waste incineration residues are at the "mg/kg" level. The flows to the environment that we provide will constitute the most accurate and reliable input of masses for environmental fate models which are using process-based descriptions of the fate and behavior of ENM in natural systems and rely on accurate mass input parameters.
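
    The lag between production and release that motivates the dynamic approach can be illustrated with a toy stock-flow model: each year's production enters an in-use stock and is released after a fixed lifetime, accumulating in an environmental sink. The growth rate and lifetime are invented, and the actual model uses probabilistic transfer coefficients and lifetime distributions rather than a single fixed lag.

        production = [100 * 1.2 ** i for i in range(10)]  # t/year, growing ENM production
        LIFETIME = 3                                      # years a product cohort stays in use

        stock, in_use, releases = [], [], []
        for p in production:
            stock.append(p)                               # this year's cohort enters use
            released = stock.pop(0) if len(stock) > LIFETIME else 0.0
            releases.append(released)
            in_use.append(sum(stock))

        sink = [sum(releases[: i + 1]) for i in range(len(releases))]
        print("in-use stock   :", [round(s) for s in in_use])
        print("cumulative sink:", [round(s) for s in sink])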

  1. Are engineered nano iron oxide particles safe? an environmental risk assessment by probabilistic exposure, effects and risk modeling.

    PubMed

    Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd

    2016-12-01

    Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/l. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by the PNEC, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). This modeling effort therefore indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, the hazards of nano-FeOX to organisms in other ecosystems (such as sediment) need to be assessed to determine the overall risk of these particles to the environment.

  2. Assessing the polycyclic aromatic hydrocarbon (PAH) pollution of urban stormwater runoff: a dynamic modeling approach.

    PubMed

    Zheng, Yi; Lin, Zhongrong; Li, Hao; Ge, Yan; Zhang, Wei; Ye, Youbin; Wang, Xuejun

    2014-05-15

    Urban stormwater runoff delivers a significant amount of polycyclic aromatic hydrocarbons (PAHs), mostly of atmospheric origin, to receiving water bodies. The PAH pollution of urban stormwater runoff poses serious risks to aquatic life and human health, but has been overlooked by environmental modeling and management. This study proposes a dynamic modeling approach for assessing the PAH pollution and its associated environmental risk. A variable time-step model was developed to simulate the continuous cycles of pollutant buildup and washoff. To reflect the complex interaction among different environmental media (i.e. atmosphere, dust and stormwater), the dependence of the pollution level on antecedent weather conditions was investigated and embodied in the model. Long-term simulations of the model can be efficiently performed, and probabilistic features of the pollution level and its risk can be easily determined. The applicability of this approach and its value to environmental management were demonstrated by a case study in Beijing, China. The results showed that Beijing's PAH pollution of road runoff is relatively severe, and its associated risk exhibits notable seasonal variation. The current sweeping practice is effective in mitigating the pollution, but the effectiveness is both weather-dependent and compound-dependent. The proposed modeling approach can help identify the critical timing and major pollutants on which monitoring, assessment and control efforts should be focused. The approach is extendable to other urban areas, as well as to other contaminants with fate and transport similar to PAHs. Copyright © 2014 Elsevier B.V. All rights reserved.
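
    The buildup-washoff cycle at the core of such models can be sketched as below: pollutant load builds up toward a maximum during dry days and is partially washed off on wet days as a function of runoff depth. All coefficients are illustrative assumptions, not the calibrated Beijing values.

        import math
        import random

        B_MAX, K_BUILD, K_WASH = 10.0, 0.3, 0.6  # max load (kg/ha) and rate constants

        random.seed(7)
        load = 2.0                                # pollutant mass currently on the surface
        for day in range(10):
            if random.random() < 0.25:            # wet day with probability 0.25
                runoff_mm = random.uniform(2, 20)
                washed = load * (1 - math.exp(-K_WASH * runoff_mm / 10))
                load -= washed
            else:
                washed = 0.0
                load += K_BUILD * (B_MAX - load)  # buildup saturating toward B_MAX
            print(f"day {day:2d}: load = {load:5.2f} kg/ha, washoff = {washed:5.2f} kg/ha")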

  3. Two Decades of WRF/CMAQ simulations over the continental United States: New approaches for performing dynamic model evaluation and determining confidence limits for ozone exceedances

    EPA Science Inventory

    Confidence in the application of models for forecasting and regulatory assessments is furthered by conducting four types of model evaluation: operational, dynamic, diagnostic, and probabilistic. Operational model evaluation alone does not reveal the confidence limits that can be ...

  4. Evaluation of dynamic coastal response to sea-level rise modifies inundation likelihood

    USGS Publications Warehouse

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

    Sea-level rise (SLR) poses a range of threats to natural and built environments [1, 2], making assessments of SLR-induced hazards essential for informed decision making [3]. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 km2 of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.

  5. Quantum-like model of unconscious–conscious dynamics

    PubMed Central

    Khrennikov, Andrei

    2015-01-01

    We present a quantum-like model of sensation–perception dynamics (originating in Helmholtz's theory of unconscious inference) based on the theory of quantum apparatuses and instruments. We illustrate our approach with a model of bistable perception of a particular ambiguous figure, the Schröder stair. This is a concrete model for unconscious and conscious processing of information and their interaction. The starting point of our quantum-like journey was the observation that perception dynamics is essentially contextual, which implies the impossibility of (straightforward) embedding of experimental statistical data in the classical (Kolmogorov, 1933) framework of probability theory. This motivates the application of nonclassical probabilistic schemes, and the quantum formalism provides a variety of well-developed and mathematically elegant probabilistic schemes for handling the results of measurements. The theory of quantum apparatuses and instruments is the most general quantum scheme describing measurements, so it is natural to explore it to model sensation–perception dynamics. In particular, this theory provides the scheme of indirect quantum measurements, which we apply to model unconscious inference leading to the transition from sensations to perceptions. PMID:26283979

  6. Catchment-scale herbicides transport: Theory and application

    NASA Astrophysics Data System (ADS)

    Bertuzzo, E.; Thomet, M.; Botter, G.; Rinaldo, A.

    2013-02-01

    This paper proposes and tests a model which couples the description of hydrologic flow and the transport of herbicides at catchment scales. The model accounts for the age of streamflow components to characterize short- and long-term fluctuations of herbicide flux concentrations in stream waters, whose peaks exceeding a toxic threshold are key to the exposure risk of aquatic ecosystems. The model is based on a travel time formulation of transport embedding a source zone that describes near-surface herbicide dynamics. To this aim we generalize a recently proposed scheme for the analytical derivation of travel time distributions to the case of solutes that can be partially taken up by transpiration and undergo chemical degradation. The framework developed is evaluated by comparing modeled hydrographs and atrazine chemographs with those measured in the Aabach agricultural catchment (Switzerland). The model proves reliable in defining complex transport features shaped by the interplay of long-term processes, related to the persistence of solute components in soils, and short-term dynamics related to storm inter-arrivals. The effects of stochasticity in rainfall patterns and application dates on concentrations and loads in runoff are assessed via Monte Carlo simulations, highlighting the crucial role played by the first rainfall event occurring after herbicide application. A probabilistic framework for critical determinants of exposure risk to aquatic communities is defined. Modeling of herbicide circulation at the catchment scale thus emerges as an essential tool for ecological risk assessment.

  7. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide a risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  8. Distribution and health risk assessment of trace metals in freshwater tilapia from three different aquaculture sites in Jelebu Region (Malaysia).

    PubMed

    Low, Kah Hin; Zain, Sharifuddin Md; Abas, Mhd Radzi; Md Salleh, Kaharudin; Teo, Yin Yin

    2015-06-15

    The trace metal concentrations in edible muscle of red tilapia (Oreochromis spp.) sampled from a former tin mining pool, concrete tank and earthen pond in Jelebu were analysed with microwave assisted digestion-inductively coupled plasma-mass spectrometry. Results were compared with established legal limits and the daily ingestion exposures simulated using the Monte Carlo algorithm for potential health risks. Among the metals investigated, arsenic was found to be the key contaminant, which may have arisen from the use of formulated feeding pellets. Although the risks of toxicity associated with consumption of red tilapia from the sites investigated were found to be within the tolerable range, the preliminary probabilistic estimation of As cancer risk shows that the 95th percentile risk level surpassed the benchmark level of 10⁻⁵. In general, the probabilistic health risks associated with ingestion of red tilapia can be ranked as follows: former tin mining pool > concrete tank > earthen pond. Copyright © 2015 Elsevier Ltd. All rights reserved.
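
    A minimal sketch of the probabilistic intake-and-risk calculation, assuming lognormal/normal exposure inputs and the US EPA oral slope factor for inorganic arsenic; all distribution parameters are placeholders rather than the study's fitted values:

        import numpy as np

        rng = np.random.default_rng(2)
        N = 100_000

        conc = rng.lognormal(np.log(0.15), 0.5, N)         # As in muscle (mg/kg); illustrative
        intake = rng.normal(60, 20, N).clip(min=0) / 1000  # fish ingestion (kg/day); illustrative
        bw = rng.normal(60, 10, N).clip(min=30)            # body weight (kg); illustrative
        SF = 1.5                                           # oral slope factor for inorganic As (per mg/kg-day)

        cdi = conc * intake / bw  # chronic daily intake (mg/kg-day)
        risk = cdi * SF           # incremental lifetime cancer risk
        print("95th-percentile cancer risk:", np.percentile(risk, 95), "(benchmark 1e-5)")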

  9. Advancing the citizen scientist's contributions to documenting and understanding natural hazards: a proof of concept for linking crowdsourced and remotely sensed data on landslide hazards in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Markert, K. N.

    2017-12-01

    Scientists, practitioners, policymakers, and citizen groups share a role in ensuring "that all sectors have access to, understand and can use scientific information for better informed decision-making" (Sendai Framework 2015-2030). When it comes to understanding hazards and exposure, inventories of disaster events are often limited. Thus, there are many opportunities for citizen scientists to engage in improving the collective understanding, and ultimately reduction, of disaster risk. Landslides are very difficult to forecast on spatial and temporal scales meaningful for early warning and evacuation. Heuristic hazard mapping methods are very common in regional hazard zonation and rely on expert knowledge of previous events and local conditions, but they often lack a temporal component. As new data analysis packages become more open and accessible, probabilistic approaches that consider high-resolution spatial and temporal dimensions are becoming more common, but this is only possible when rich inventories of landslide events exist. The work presented offers a proof of concept on incorporating crowdsourced data to improve landslide hazard model performance. Starting with a national inventory of 90 catalogued landslides in El Salvador for a study period of 1998 to 2011, we simulate the addition of over 600 additional crowdsourced landslide events that would have been identified through human interpretation of high-resolution imagery in the Google Earth time-slider feature. There is a noticeable improvement in performance statistics between static heuristic hazard models and probabilistic models that incorporate the events identified by the "crowd." Such a dynamic incorporation of crowdsourced data on hazard events is not so far-fetched. Given the engagement of "local observers" in El Salvador who augment in situ hydro-meteorological measurements, the growing access to Earth observation data for the lay person, and the immense interest in connecting citizen scientists to remote sensing data through hackathons such as the NASA Space Apps Challenges, we envision a much more dynamic, collective understanding of landslide hazards. Here we present a better scenario of what we could have known had data from the crowd been incorporated into probabilistic hazard models on a regular basis.

  10. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  11. An investigation into the probabilistic combination of quasi-static and random accelerations

    NASA Technical Reports Server (NTRS)

    Schock, R. W.; Tuell, L. P.

    1984-01-01

    The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, then arithmetically add them or take their root sum square to obtain combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternate approach would be to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data for selecting combined accelerations at the most popular percentile levels.
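
    A minimal sketch of the percentile-selection idea, assuming the random response is zero-mean Gaussian so the combined acceleration is simply a Gaussian shifted by the quasi-static level; the load values are invented:

        from scipy.stats import norm

        Q = 5.0      # quasi-static acceleration (g), deterministic; illustrative
        SIGMA = 2.0  # 1-sigma acoustically generated random response (g); illustrative

        # Conventional combinations of Q with a 3-sigma random level:
        print("arithmetic sum:", Q + 3 * SIGMA, "g; RSS:", round((Q**2 + (3 * SIGMA) ** 2) ** 0.5, 2), "g")

        # Percentiles of the combined probability density function:
        for p in (0.95, 0.99, 0.9987):
            print(f"{p:.2%} combined acceleration: {Q + norm.ppf(p) * SIGMA:.2f} g")

    Under this simplified assumption, arithmetic addition of the 3-sigma level reproduces the 99.87th percentile, while the root sum square is less conservative; the percentile view makes the implied confidence level explicit.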

  12. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  13. Monitoring and exposure assessment of pesticide residues in cowpea (Vigna unguiculata L. Walp) from five provinces of southern China.

    PubMed

    Huan, Zhibo; Xu, Zhi; Luo, Jinhui; Xie, Defang

    2016-11-01

    Residues of 14 pesticides were determined in 150 cowpea samples collected in five southern Chinese provinces in 2013 and 2014. One or more residues were detected in 70% of samples, 61.3% of samples were illegal mainly because of the detection of unauthorized pesticides, and 14.0% of samples contained more than three pesticides. Deterministic and probabilistic methods were used to assess the chronic and acute risk of pesticides in cowpea to eight subgroups of people. Deterministic assessment showed that the estimated short-term intakes (ESTIs) of carbofuran were 1199.4%-2621.9% of the acute reference dose (ARfD), while the rates were 985.9%-4114.7% using probabilistic assessment. Probabilistic assessment showed that 4.2%-7.8% of subjects (especially children) may suffer unacceptable acute risk from carbofuran-contaminated cowpeas from the five provinces. However, undue concern is not warranted, because all the estimates are based on conservative assumptions. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or to misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may therefore be appropriately considered when making decisions based on uncertain information.
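
    A minimal sketch of scoring one land-use decision with a prospect-theoretic value, using the Tversky-Kahneman value and weighting functions with their commonly cited parameter estimates and a single weighting function for brevity; the flood probability and payoffs are hypothetical:

        ALPHA, BETA, LAM, GAMMA = 0.88, 0.88, 2.25, 0.61  # Tversky-Kahneman (1992) estimates

        def value(x):
            """Value function: concave for gains, convex and steeper for losses."""
            return x**ALPHA if x >= 0 else -LAM * (-x) ** BETA

        def weight(p):
            """Inverse-S probability weighting (overweights small probabilities)."""
            return p**GAMMA / (p**GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

        p_flood = 0.12              # taken from a probabilistic flood map; illustrative
        gain, loss = 100.0, -400.0  # payoff if dry vs. damage if flooded; illustrative units

        prospect = weight(p_flood) * value(loss) + weight(1 - p_flood) * value(gain)
        print("prospect value of developing the plot:", round(prospect, 1))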

  15. Risk taking and adult attention deficit/hyperactivity disorder: A gap between real life behavior and experimental decision making.

    PubMed

    Pollak, Yehuda; Shalit, Reut; Aran, Adi

    2018-01-01

    Adults with attention deficit/hyperactivity disorder (ADHD) are prone to suboptimal decision making and risk taking. The aim of this study was to test performance on a theoretically based probabilistic decision-making task in well-characterized adults with and without ADHD, and to examine the relation between experimental risk taking and a history of real-life risk-taking behavior, defined as cigarette, alcohol, and street drug use. University students with and without ADHD completed a modified version of the Cambridge Gambling Test, in which they had to choose between alternatives varying in level of risk, and reported their history of substance use. Both groups showed similar patterns of risk taking on the experimental decision-making task, suggesting that ADHD is not linked to low sensitivity to risk. Past and present substance use was more prevalent in adults with ADHD. These findings question the validity of experimental probabilistic decision-making tasks as a model for ADHD-related risk-taking behavior. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. A probabilistic drought forecasting framework: A combined dynamical and statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh

    In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework composed of dynamical and statistical modeling components. The novelty of this study is to use data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. The initial condition uncertainty quantified through data assimilation is coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.
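
    A minimal sketch of coupling a data-assimilation ensemble of initial conditions with a dependence model for the seasonal outlook; a Gaussian copula with an assumed correlation stands in for the paper's fitted copula, and every number is illustrative:

        import numpy as np

        rng = np.random.default_rng(3)

        init_ens = rng.normal(-0.8, 0.3, 100)  # DA ensemble of the standardized initial drought index
        RHO = 0.6                              # assumed copula correlation, initial state vs. 3-month index

        z0 = rng.choice(init_ens, 10_000)      # sample initial conditions from the ensemble
        fcst = RHO * z0 + np.sqrt(1 - RHO**2) * rng.standard_normal(10_000)
        print("P(drought index < -0.8 in three months):", (fcst < -0.8).mean())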

  17. Hybrid Packet-Pheromone-Based Probabilistic Routing for Mobile Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Kashkouli Nejad, Keyvan; Shawish, Ahmed; Jiang, Xiaohong; Horiguchi, Susumu

    Ad-Hoc networks are collections of mobile nodes communicating using wireless media without any fixed infrastructure. Minimal configuration and quick deployment make Ad-Hoc networks suitable for emergency situations like natural disasters or military conflicts. Current Ad-Hoc networks can support either high mobility or a high transmission rate, but not both at once, because they employ static approaches in their routing schemes. However, due to the continuous growth of Ad-Hoc networks in size, node mobility and transmission rate, the development of new adaptive and dynamic routing schemes has become crucial. In this paper we propose a new routing scheme that supports high transmission rates and high node mobility simultaneously in large Ad-Hoc networks by combining a newly proposed packet-pheromone-based approach with the Hint-Based Probabilistic Protocol (HBPP), providing congestion avoidance with dynamic path selection in the packet-forwarding process. Because it uses the available feedback information, the proposed algorithm does not introduce any additional overhead. The extensive simulation-based analysis conducted in this paper indicates that the proposed algorithm offers small packet latency and achieves a significantly higher delivery probability than the available Hint-Based Probabilistic Protocol (HBPP).

  18. Assessment of possible airborne impact from nuclear risk sites - Part II: probabilistic analysis of atmospheric transport patterns in Euro-Arctic region

    NASA Astrophysics Data System (ADS)

    Mahura, A. G.; Baklanov, A. A.

    2003-10-01

    The probabilistic analysis of atmospheric transport patterns from the most important nuclear risk sites in the Euro-Arctic region is performed employing the methodology developed within the "Arctic Risk" Project of the NARP Programme (Baklanov and Mahura, 2003). The risk sites are the nuclear power plants in Northwest Russia, Finland, Sweden, Lithuania, the United Kingdom, and Germany, as well as the Novaya Zemlya test site of Russia. The geographical regions of interest are the Northern and Central European countries and Northwest Russia. In this study, the research tools employed are a trajectory model, used to calculate a multiyear dataset of forward trajectories originating over the risk site locations, and a set of statistical methods (including exploratory, cluster, and probability fields analyses) for analysis of the trajectory modelling results. The probabilistic analyses of trajectory modelling results for eleven sites are presented as a set of various indicators of the risk sites' possible impact on geographical regions and countries of interest. A nuclear risk site's possible impact (on a particular geographical region, territory, country, site, etc.) due to atmospheric transport from the site after a hypothetical accidental release of radioactivity can be properly estimated based on a combined interpretation of the indicators (simple characteristics, atmospheric transport pathways, airflow and fast transport probability fields, maximum reaching distance and maximum possible impact zone, typical transport time and precipitation factor fields) for different time periods (annual, seasonal, and monthly) for any selected site (either separately for each site or grouped for several sites) in the Euro-Arctic region. Such estimation could provide useful input information for the decision-making process, risk assessment, and the planning of emergency response systems for sites of nuclear, chemical, and biological danger.

  19. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
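
    A minimal sketch of the PDA recipe: random variables pushed through a physics-based limit state, Monte Carlo estimation of the failure probability, then a crude sensitivity screen. The stress model and all distributions are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(4)
        N = 200_000

        pressure = rng.normal(6.5, 0.4, N)              # chamber pressure (MPa); illustrative
        geometry = rng.normal(1.0, 0.02, N)             # wall-thickness factor; illustrative
        strength = rng.lognormal(np.log(430), 0.05, N)  # material strength (MPa); illustrative

        stress = 55.0 * pressure / geometry  # stand-in physics-based response model
        fail = stress > strength             # limit-state exceedance
        print("P(failure) =", fail.mean())

        # Sensitivity: correlation of each input with the failure indicator.
        for name, x in [("pressure", pressure), ("geometry", geometry), ("strength", strength)]:
            print(name, round(np.corrcoef(x, fail)[0, 1], 3))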

  20. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    NASA Technical Reports Server (NTRS)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  1. Probabilistic Asteroid Impact Risk Assessment for the Hypothetical PDC17 Impact Exercise

    NASA Technical Reports Server (NTRS)

    Wheeler, Lorien; Mathias, Donovan

    2017-01-01

    Performing impact risk assessment for the 2017 Planetary Defense Conference (PDC17) hypothetical impact exercise, to take place at the PDC17 conference, May 15-20, 2017. Impact scenarios and trajectories are developed and provided by NASA's Near Earth Objects Office at JPL (Paul Chodas). These results represent purely hypothetical impact scenarios, and do not reflect any known asteroid threat. Risk assessment was performed using the Probabilistic Asteroid Impact Risk (PAIR) model developed by the Asteroid Threat Assessment Project (ATAP) at NASA Ames Research Center. This presentation includes sample results that may be presented or used in discussions during the various stages of the impact exercise. Some cases represent alternate scenario options that may not be used during the actual impact exercise at the PDC17 conference. Updates to these initial assessments and/or additional scenario assessments may be performed throughout the impact exercise as different scenario options unfold.

  2. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO) wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  3. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China.

    PubMed

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-07

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint-probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.
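
    A minimal sketch of a joint-probability estimate of vegetation-related drought conditioned on a climate scenario, assuming standardized NDVI and precipitation anomalies follow a bivariate normal distribution with an illustrative correlation (the study's fitted dependence model is not reproduced):

        import numpy as np
        from scipy.stats import multivariate_normal, norm

        RHO = 0.55  # assumed NDVI-precipitation correlation
        mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, RHO], [RHO, 1.0]])

        ndvi_thr = norm.ppf(0.2)  # "vegetation drought" = NDVI below its 20th percentile
        prec_thr = norm.ppf(0.2)  # "dry scenario" = precipitation below its 20th percentile

        p_joint = mvn.cdf([ndvi_thr, prec_thr])
        p_cond = p_joint / 0.2    # P(vegetation drought | dry scenario)
        print("conditional likelihood of vegetation drought:", round(p_cond, 3))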

  4. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China

    PubMed Central

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-01-01

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint-probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide. PMID:27713530

  5. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
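
    A minimal sketch contrasting the two treatments on a single OR gate: a point-probability evaluation versus alpha-cut interval arithmetic on triangular fuzzy numbers. The basic-event numbers are hypothetical, and this is generic fuzzy fault-tree arithmetic rather than the FUZZYFTA implementation:

        def top_or(pa, pb):
            """Top-event probability for TOP = A OR B with independent basic events."""
            return 1 - (1 - pa) * (1 - pb)

        print("probabilistic point estimate:", top_or(1e-3, 5e-4))

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
            lo, mode, hi = tri
            return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

        A = (5e-4, 1e-3, 2e-3)  # vague failure probability of basic event A
        B = (2e-4, 5e-4, 1e-3)  # vague failure probability of basic event B
        for alpha in (0.0, 0.5, 1.0):
            (a_lo, a_hi), (b_lo, b_hi) = alpha_cut(A, alpha), alpha_cut(B, alpha)
            # The OR gate is monotone in both inputs, so interval endpoints propagate directly.
            print(f"alpha={alpha}: [{top_or(a_lo, b_lo):.2e}, {top_or(a_hi, b_hi):.2e}]")

    At alpha = 1 the interval collapses to the point estimate; lower alpha levels widen it, expressing vagueness rather than randomness.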

  6. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  7. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. Copyright © 2015 Elsevier B.V. All rights reserved.
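
    A minimal sketch of the PEC/PNEC comparison, assuming lognormal shapes whose parameters are placeholders (the study derives PEC from probabilistic material flow modeling and PNEC from a cumulative probabilistic species sensitivity distribution):

        import numpy as np

        rng = np.random.default_rng(5)
        N = 100_000

        pec = rng.lognormal(np.log(0.12), 0.9, N)   # predicted environmental concentration (ug/L); illustrative
        pnec = rng.lognormal(np.log(25.0), 0.7, N)  # predicted no-effect concentration (ug/L); illustrative

        rq = pec / pnec  # risk-quotient distribution
        print("P(PEC > PNEC):", (rq > 1).mean())
        print("95th-percentile risk quotient:", round(np.percentile(rq, 95), 4))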

  8. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
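
    A minimal sketch of the Poisson-Binomial verification idea: under the hypothesis that the issued probabilities are reliable, the number of observed events follows a Poisson-Binomial distribution, whose exact PMF can be built by convolution. The forecast probabilities and observed count here are invented:

        import numpy as np

        def poisson_binomial_pmf(probs):
            """Exact PMF of the event count, by iterated convolution of Bernoulli PMFs."""
            pmf = np.array([1.0])
            for p in probs:
                pmf = np.convolve(pmf, [1 - p, p])
            return pmf

        probs = [0.9, 0.8, 0.7, 0.95, 0.6, 0.85, 0.5, 0.75, 0.9, 0.65]  # issued forecasts
        observed = 4                                                    # events that actually occurred

        pmf = poisson_binomial_pmf(probs)
        p_value = pmf[: observed + 1].sum()  # lower-tail P(X <= observed) under reliability
        print("lower-tail p-value:", round(p_value, 4))

    A small p-value indicates the forecasts assigned systematically higher probabilities than the outcomes warranted, i.e., an unreliable forecast.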

  9. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces from the low-lying Mekong River delta region in the southwest to the Red River delta region in northern Vietnam are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with growth in exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries, including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  10. Probabilistic Risk Assessment for Bone Fracture - Bone Fracture Risk Module (BFxRM)

    NASA Technical Reports Server (NTRS)

    Licata, Angelo; Myers, Jerry G.; Lewandowski, Beth

    2013-01-01

    This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM). The overview includes an assessment of the strengths and limitations of the BFxRM and proposes a number of discussion questions to the panel regarding future development avenues for this simulation system.

  11. Review of methods for developing probabilistic risk assessments

    Treesearch

    D. A. Weinstein; P.B. Woodbury

    2010-01-01

    We describe methodologies currently in use, or under development, that contain features for estimating fire occurrence risk. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...

  12. Risk assessment for biodiversity conservation planning in Pacific Northwest forests

    Treesearch

    Becky K. Kerns; Alan Ager

    2007-01-01

    Risk assessment can provide a robust strategy for landscape-scale planning challenges associated with species conservation and habitat protection in Pacific Northwest forests. We provide an overview of quantitative and probabilistic ecological risk assessment with focus on the application of approaches and influences from the actuarial, financial, and technical...

  13. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  14. A nonlinear dynamics approach for incorporating wind-speed patterns into wind-power project evaluation.

    PubMed

    Huffaker, Ray; Bittelli, Marco

    2015-01-01

    Wind-energy production may be expanded beyond regions with high-average wind speeds (such as the Midwest U.S.A.) to sites with lower-average speeds (such as the Southeast U.S.A.) by locating favorable regional matches between natural wind-speed and energy-demand patterns. A critical component of wind-power evaluation is to incorporate wind-speed dynamics reflecting documented diurnal and seasonal behavioral patterns. Conventional probabilistic approaches remove patterns from wind-speed data. These patterns must be restored synthetically before they can be matched with energy-demand patterns. How to accurately restore wind-speed patterns is a vexing problem spurring an expanding line of papers. We propose a paradigm shift in wind power evaluation that employs signal-detection and nonlinear-dynamics techniques to empirically diagnose whether synthetic pattern restoration can be avoided altogether. If the complex behavior of observed wind-speed records is due to nonlinear, low-dimensional, and deterministic system dynamics, then nonlinear dynamics techniques can reconstruct wind-speed dynamics from observed wind-speed data without recourse to conventional probabilistic approaches. In the first study of its kind, we test a nonlinear dynamics approach in an application to Sugarland Wind, the first utility-scale wind project proposed in Florida, USA. We find empirical evidence of a low-dimensional and nonlinear wind-speed attractor characterized by strong temporal patterns that match up well with regular daily and seasonal electricity demand patterns.

  15. A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.

    PubMed

    Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C

    2018-05-03

    The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
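
    A minimal sketch of classic versus robust HCp estimation under a log-normal SSD, replacing the mean and standard deviation with the median and scaled MAD; the toxicity endpoints (including a deliberate outlier) are invented, and this is not the authors' exact estimator:

        import numpy as np
        from scipy.stats import norm

        lc50 = np.array([12.0, 35.0, 48.0, 90.0, 150.0, 420.0, 5000.0])  # ug/L; last value an outlier
        x = np.log10(lc50)
        p = 0.05  # estimate HC5, the concentration hazardous to 5% of species

        hc5_classic = 10 ** norm.ppf(p, loc=x.mean(), scale=x.std(ddof=1))

        mad = np.median(np.abs(x - np.median(x))) * 1.4826  # MAD scaled for normal consistency
        hc5_robust = 10 ** norm.ppf(p, loc=np.median(x), scale=mad)
        print(f"HC5 classic: {hc5_classic:.1f} ug/L; robust: {hc5_robust:.1f} ug/L")

    The high outlier inflates the classical standard deviation and drags the classical HC5 downward, while the median/MAD version is barely affected.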

  16. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford Kuofei

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.

  17. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

  18. Progress report on the Worldwide Earthquake Risk Management (WWERM) Program

    USGS Publications Warehouse

    Algermissen, S.T.; Hays, Walter W.; Krumpe, Paul R.

    1992-01-01

    Considerable progress has been made in the Worldwide Earthquake Risk Management (WWERM) Program since its initiation in late 1989 as a cooperative program of the Agency for International Development (AID), Office of U.S. Foreign Disaster Assistance (OFDA), and the U.S. Geological Survey. Probabilistic peak acceleration and peak Modified Mercalli intensity (MMI) maps have been prepared for Chile and for Sulawesi province in Indonesia. Earthquake risk (loss) studies for dwellings in Gorontalo, North Sulawesi, have been completed and risk studies for dwellings in selected areas of central Chile are underway. A special study of the effect of site response on earthquake ground motion estimation in central Chile has also been completed and indicates that site response may modify the ground shaking by as much as plus or minus two units of MMI. A program for the development of national probabilistic ground motion maps for the Philippines is now underway and pilot studies of earthquake ground motion and risk are being planned for Morocco.

  19. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    DTIC Science & Technology

    2017-03-13

    support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system...AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault

  20. An ontology-based nurse call management system (oNCS) with probabilistic priority assessment

    PubMed Central

    2011-01-01

    Background: The current, place-oriented nurse call systems are very static. A patient can only make calls with a button which is fixed to a wall of a room. Moreover, the system does not take into account various factors specific to a situation. In the future, there will be an evolution to a mobile button for each patient so that they can walk around freely and still make calls. The system would become person-oriented and the available context information should be taken into account to assign the correct nurse to a call. The aim of this research is (1) the design of a software platform that supports the transition to mobile and wireless nurse call buttons in hospitals and residential care and (2) the design of a sophisticated nurse call algorithm. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff members and patients into account. Additionally, the priority of a call probabilistically depends on the risk factors assigned to a patient. Methods: The ontology-based Nurse Call System (oNCS) was developed as an extension of a Context-Aware Service Platform. An ontology is used to manage the profile information. Rules implement the novel nurse call algorithm that takes all this information into account. Probabilistic reasoning algorithms are designed to determine the priority of a call based on the risk factors of the patient. Results: The oNCS system is evaluated through a prototype implementation and simulations, based on a detailed dataset obtained from Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls amongst nurses and the assignment of priorities to calls are compared for the oNCS system and the current, place-oriented nurse call system. Additionally, the performance of the system is discussed. Conclusions: The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the oNCS system significantly improves the assignment of nurses to calls. Calls generally have a nurse present faster and the workload distribution amongst the nurses improves. PMID:21294860

  1. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  2. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for dynamic, acoustic, high-pressure and high-rotational-speed loads, etc., using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods, with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components, in conjunction with PSAM (the Probabilistic Structural Analysis Method), to perform probabilistic load evaluation and life prediction are also described to illustrate the effectiveness of the coupled-model approach.

  3. The probabilistic convolution tree: efficient exact Bayesian inference for faster LC-MS/MS protein inference.

    PubMed

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called "causal independence"). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustration example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log(k)²) and the space to O(k log(k)), where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions.
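
    A minimal sketch of the core dynamic-programming step: pairwise convolution of probability mass functions in a balanced tree yields the exact distribution of a sum of independent counting variables. This toy version uses np.convolve on Bernoulli PMFs; the paper's complexity result additionally relies on FFT-based convolution:

        import numpy as np

        def convolution_tree(pmfs):
            """Exact PMF of the sum of independent counting variables via a balanced convolution tree."""
            layer = [np.asarray(p, dtype=float) for p in pmfs]
            while len(layer) > 1:
                nxt = [np.convolve(layer[i], layer[i + 1]) for i in range(0, len(layer) - 1, 2)]
                if len(layer) % 2:  # odd count: carry the last PMF up a level
                    nxt.append(layer[-1])
                layer = nxt
            return layer[0]

        # Four Bernoulli "is this splice form present?" indicators with different probabilities.
        pmf = convolution_tree([[0.7, 0.3], [0.5, 0.5], [0.9, 0.1], [0.4, 0.6]])
        print("P(total count = k) for k = 0..4:", pmf.round(4))  # entries sum to 1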

  4. The Probabilistic Convolution Tree: Efficient Exact Bayesian Inference for Faster LC-MS/MS Protein Inference

    PubMed Central

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called “causal independence”). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustration example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log(k)²) and the space to O(k log(k)), where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions. PMID:24626234

  5. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.

  6. Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse

    EPA Science Inventory

    This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...

  7. Probabilistic assessment of the dynamic interaction between multiple pedestrians and vertical vibrations of footbridges

    NASA Astrophysics Data System (ADS)

    Tubino, Federica

    2018-03-01

    The effect of human-structure interaction in the vertical direction for footbridges is studied based on a probabilistic approach. The bridge is modeled as a continuous dynamic system, while pedestrians are schematized as moving single-degree-of-freedom systems with random dynamic properties. The non-dimensional form of the equations of motion allows us to obtain results that can be applied in a very wide set of cases. An extensive Monte Carlo simulation campaign is performed, varying the main non-dimensional parameters identified, and the mean values and coefficients of variation of the damping ratio and of the non-dimensional natural frequency of the coupled system are reported. The results obtained can be interpreted from two different points of view. If the characterization of pedestrians' equivalent dynamic parameters is assumed to be uncertain, as revealed by a current literature review, then the paper provides a range of possible variations of the coupled system damping ratio and natural frequency as a function of pedestrians' parameters. Assuming that a reliable characterization of pedestrians' dynamic parameters is available (which is not the case at present, but could be in the future), the results presented can be adopted to estimate the damping ratio and natural frequency of the coupled footbridge-pedestrian system for a very wide range of real structures.

  8. Near-Earth Phase Risk Comparison of Human Mars Campaign Architectures

    NASA Technical Reports Server (NTRS)

    Manning, Ted A.; Nejad, Hamed S.; Mattenberger, Chris

    2013-01-01

    A risk analysis of the launch, orbital assembly, and Earth-departure phases of human Mars exploration campaign architectures was completed as an extension of a probabilistic risk assessment (PRA) originally carried out under the NASA Constellation Program Ares V Project. The objective of the updated analysis was to study the sensitivity of loss-of-campaign risk to such architectural factors as the composition of the propellant delivery portion of the launch vehicle fleet (Ares V heavy-lift launch vehicle vs. smaller/cheaper commercial launchers) and the degree of launcher or Mars-bound spacecraft element sparing. Both a static PRA analysis and a dynamic, event-based Monte Carlo simulation were developed and used to evaluate the probability of loss of campaign under different sparing options. Results showed that with no sparing, loss-of-campaign risk is strongly driven by launcher count and on-orbit loiter duration, favoring an all-Ares V launch approach. Further, the reliability of the all-Ares V architecture showed significant improvement with the addition of a single spare launcher/payload. Among architectures utilizing a mix of Ares V and commercial launchers, those that minimized the on-orbit loiter duration of Mars-bound elements were found to exceed the reliability of the no-spare all-Ares V campaign if unlimited commercial vehicle sparing was assumed.
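
    The sparing logic lends itself to a compact event-based Monte Carlo sketch (illustrative only; the launch count, reliability, and spare counts below are made-up placeholders, not the study's values):

        import random

        def p_loss_of_campaign(n_launches, p_launch_ok, n_spares, trials=100_000):
            """Campaign survives if failed launches never exceed available spares."""
            losses = 0
            for _ in range(trials):
                failures = sum(random.random() > p_launch_ok
                               for _ in range(n_launches))
                if failures > n_spares:
                    losses += 1
            return losses / trials

        # Hypothetical campaign: 5 heavy-lift launches at 98% reliability each.
        for spares in (0, 1, 2):
            print(spares, p_loss_of_campaign(5, 0.98, spares))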

  9. Dynamical-statistical seasonal prediction for western North Pacific typhoons based on APCC multi-models

    NASA Astrophysics Data System (ADS)

    Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi

    2017-01-01

    This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and the large-scale key predictors forecasted by the APCC MME for the same season. The cross-validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observations for 1982-2008. The cross-validation from the hybrid model with the individual models participating in the MME indicates that there is no single model which consistently outperforms the other models in predicting typhoon number. Although the forecast skill of the MME is not always the highest compared to that of each individual model, the MME presents rather higher averaged correlations and a smaller variance of correlations. Given the large set of ensemble members from multiple models, a relative operating characteristic score reveals an 82% (above-normal) and 78% (below-normal) improvement for the probabilistic prediction of the number of TY. It implies that there is an 82% (78%) probability that the forecasts can successfully discriminate above-normal (below-normal) years from other years. The forecast skill of the hybrid model for the past 7 years (2002-2008) is higher than that of the forecast from the Tropical Storm Risk consortium. Using the large set of ensemble members from multiple models, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to end-users, in particular the residents of tropical cyclone-prone areas in the Asia-Pacific region.
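
    The skill figure quoted above comes from leave-one-out cross-validation of a regression-type hybrid model. A minimal sketch of that procedure (synthetic predictors and counts, not the APCC data):

        import numpy as np

        def loo_cv_correlation(X, y):
            """Fit a least-squares model with one season held out, predict it,
            and correlate the hindcasts with the observations."""
            n = len(y)
            preds = np.empty(n)
            for i in range(n):
                keep = np.arange(n) != i
                A = np.column_stack([np.ones(keep.sum()), X[keep]])
                coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
                preds[i] = np.concatenate([[1.0], X[i]]) @ coef
            return np.corrcoef(preds, y)[0, 1]

        rng = np.random.default_rng(1)
        X = rng.normal(size=(27, 2))      # 27 seasons, 2 large-scale predictors
        y = 25 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=2, size=27)
        print(f"LOO hindcast correlation: {loo_cv_correlation(X, y):.2f}")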

  10. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  11. Risk-based enteric pathogen reduction targets for non-potable and direct potable use of roof runoff, stormwater, and greywater

    EPA Science Inventory

    This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was use...

  12. Aviation Security, Risk Assessment, and Risk Aversion for Public Decisionmaking

    ERIC Educational Resources Information Center

    Stewart, Mark G.; Mueller, John

    2013-01-01

    This paper estimates risk reductions for each layer of security designed to prevent commercial passenger airliners from being commandeered by terrorists, kept under control for some time, and then crashed into specific targets. Probabilistic methods are used to characterize the uncertainty of rates of deterrence, detection, and disruption, as well…

  13. The Integrated Medical Model - A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles G.; Saile, Lynn; FreiredeCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma

    2010-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.

  14. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211 to 351% over the period 2000-2030 (5th and 95th percentile). Driven mainly by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.
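
    Annual expected damage, the risk metric used here, is the integral of damage over annual exceedance probability. A minimal sketch with hypothetical loss figures (not GLOFRIS output):

        import numpy as np

        def expected_annual_damage(return_periods, damages):
            """Trapezoid-rule integral of damage over exceedance probability."""
            p_exc = 1.0 / np.asarray(return_periods, dtype=float)
            order = np.argsort(p_exc)
            return np.trapz(np.asarray(damages, dtype=float)[order], p_exc[order])

        rp = [2, 5, 10, 25, 50, 100, 250, 500, 1000]         # years
        dmg = [0.0, 0.3, 0.8, 1.6, 2.4, 3.2, 4.5, 5.5, 6.5]  # US$ bn
        print(f"EAD ~ {expected_annual_damage(rp, dmg):.2f} bn US$/yr")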

  15. The ARGO Project: assessing NA-TECH risks on off-shore oil platforms

    NASA Astrophysics Data System (ADS)

    Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo

    2017-04-01

    ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a 2-year project, funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. In order to achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. A probabilistic multi-risk framework has then been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.

  16. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are the reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbopump development, the impact of ET foam reliability on the Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  17. Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach

    NASA Astrophysics Data System (ADS)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and scale-up necessary for achieving climate-resilient development. In addition, it also determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data was analyzed using both quantitative and qualitative data analysis techniques. Quantitative data was analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data was analyzed using qualitative techniques, which involved establishing the categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  18. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal uncertainties as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.

  19. Probabilistic Risk Assessment Process for High-Power Laser Operations in Outdoor Environments

    DTIC Science & Technology

    2016-01-01

    avionics data bus. In the case of a UAS-mounted laser system, the control path will additionally include a radio or satellite communications link. A remote... ...hazard assessment purposes is not widespread within the laser safety community. The aim of this paper is to outline the basis of the probabilistic

  20. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.

  1. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.
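
    The binary-decision difficulty the game explores is usually formalized with the classic cost-loss model: mitigation is worthwhile whenever the forecast probability exceeds the cost/loss ratio. A minimal sketch (numbers are invented):

        def act_on_forecast(p_flood, cost, loss):
            """Cost-loss rule: act iff expected avoided loss exceeds the cost."""
            return p_flood > cost / loss

        # Protection costing 20k against a potential 100k loss: act above p = 0.2.
        for p in (0.1, 0.25, 0.5):
            print(p, act_on_forecast(p, cost=20_000, loss=100_000))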

  2. Regional probabilistic risk assessment of heavy metals in different environmental media and land uses: An urbanization-affected drinking water supply area

    NASA Astrophysics Data System (ADS)

    Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping

    2016-11-01

    In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks of exposing residents to heavy metals in different environmental media and land uses. The mean and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks of cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure risks during outdoor recreation, farming and swimming along the Shunde Waterway are theoretically safe. Arsenic and cadmium were identified as the priority pollutants that contribute the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.
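
    The hazard-quotient machinery behind such percentiles is straightforward to sketch in a Monte Carlo setting (all distributions and the reference dose below are hypothetical, not the Shunde Waterway data):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000  # simulated residents

        # Hypothetical inputs for one metal via vegetable consumption.
        conc = rng.lognormal(mean=np.log(0.8), sigma=0.5, size=n)  # mg/kg
        intake = rng.normal(0.25, 0.05, size=n).clip(min=0)        # kg/day
        bw = rng.normal(60.0, 10.0, size=n).clip(min=30)           # kg
        rfd = 1e-3                                                 # mg/kg-day

        add = conc * intake / bw   # average daily dose, mg/kg-day
        hq = add / rfd             # hazard quotient
        print(f"median HQ = {np.median(hq):.2f}, "
              f"95th percentile HQ = {np.percentile(hq, 95):.2f}")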

  3. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    DOT National Transportation Integrated Search

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  4. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of >0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.

  5. Aggregate exposure approaches for parabens in personal care products: a case assessment for children between 0 and 3 years old

    PubMed Central

    Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke

    2014-01-01

    In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight on how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) for exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following a deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use has been obtained from a small survey on product use of consumers. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population with an exposure above the level corresponding to the assumed “safe” MoE of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including the uncertainty, and allows determining the main drivers of exposure. Furthermore, it allows estimating the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276
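
    The two tiers differ only in whether the inputs are point values or distributions. A minimal sketch of the contrast (the NOAEL, product amounts, concentrations, and use probabilities below are invented placeholders):

        import numpy as np

        rng = np.random.default_rng(7)
        noael = 10.0                        # hypothetical NOAEL, mg/kg bw/day

        # Tier 1: deterministic worst case with default point estimates.
        print("tier-1 MoE:", noael / 0.25)  # 0.25 mg/kg bw/day worst case

        # Tier 2: person-oriented aggregation over several products.
        n = 50_000
        bw = rng.normal(12.0, 3.0, n).clip(min=4)      # child body weight, kg
        exposure = np.zeros(n)
        for grams, frac, p_use in [(1.5, 2e-4, 0.8),   # lotion: g/day, w/w, P(use)
                                   (0.8, 2e-4, 0.5),   # cream
                                   (0.3, 1e-4, 0.3)]:  # sunscreen
            users = rng.random(n) < p_use
            exposure[users] += 1000 * grams * frac / bw[users]  # mg/kg bw/day

        moe = np.divide(noael, exposure,
                        out=np.full(n, np.inf), where=exposure > 0)
        print("share of population with MoE < 100:", np.mean(moe < 100))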

  6. Application of Probabilistic Modeling to Quantify the Reduction Levels of Hepatocellular Carcinoma Risk Attributable to Chronic Aflatoxins Exposure.

    PubMed

    Wambui, Joseph M; Karuri, Edward G; Ojiambo, Julia A; Njage, Patrick M K

    2017-01-01

    Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus in individuals further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose the rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive probability distributions of HCC cases and percentage reductions levels of the risk from secondary data. Contaminated maize and groundnuts contributed to 1,847 ± 514 and 158 ± 52 HCC cases per annum, respectively. The total contribution of both foods to the risk was additive as it resulted in 2,000 ± 518 cases per annum. Consumption and contamination levels contributed significantly to the risk whereby lower age groups were most affected. Nonetheless, pre- and postharvest interventions might reduce the risk by 23.0-83.4% and 4.8-95.1%, respectively. Therefore, chronic exposure to aflatoxins increases the HCC risk in rural Kenya, but a significant reduction of the risk can be achieved by applying specific pre- and postharvest interventions.

  7. Modeling the Evolution of Beliefs Using an Attentional Focus Mechanism

    PubMed Central

    Marković, Dimitrije; Gläscher, Jan; Bossaerts, Peter; O’Doherty, John; Kiebel, Stefan J.

    2015-01-01

    For making decisions in everyday life we often have first to infer the set of environmental features that are relevant for the current task. Here we investigated the computational mechanisms underlying the evolution of beliefs about the relevance of environmental features in a dynamical and noisy environment. For this purpose we designed a probabilistic Wisconsin card sorting task (WCST) with belief solicitation, in which subjects were presented with stimuli composed of multiple visual features. At each moment in time a particular feature was relevant for obtaining reward, and participants had to infer which feature was relevant and report their beliefs accordingly. To test the hypothesis that attentional focus modulates the belief update process, we derived and fitted several probabilistic and non-probabilistic behavioral models, which either incorporate a dynamical model of attentional focus, in the form of a hierarchical winner-take-all neuronal network, or a diffusive model, without attention-like features. We used Bayesian model selection to identify the most likely generative model of subjects’ behavior and found that attention-like features in the behavioral model are essential for explaining subjects’ responses. Furthermore, we demonstrate a method for integrating both connectionist and Bayesian models of decision making within a single framework that allowed us to infer hidden belief processes of human subjects. PMID:26495984

  8. An online tool for Operational Probabilistic Drought Forecasting System (OPDFS): a Statistical-Dynamical Framework

    NASA Astrophysics Data System (ADS)

    Zarekarizi, M.; Moradkhani, H.; Yan, H.

    2017-12-01

    The Operational Probabilistic Drought Forecasting System (OPDFS) is an online tool recently developed at Portland State University for operational agricultural drought forecasting. This is an integrated statistical-dynamical framework issuing probabilistic drought forecasts monthly for lead times of 1, 2, and 3 months. The statistical drought forecasting method utilizes copula functions in order to condition the future soil moisture values on the antecedent states. Due to the stochastic nature of land surface properties, the antecedent soil moisture states are uncertain; therefore, a data assimilation system based on Particle Filtering (PF) is employed to quantify the uncertainties associated with the initial condition of the land state, i.e. soil moisture. The PF assimilates the satellite soil moisture data into the Variable Infiltration Capacity (VIC) land surface model and ultimately updates the simulated soil moisture. The OPDFS builds on NOAA's seasonal drought outlook by offering drought probabilities instead of qualitative ordinal categories and provides the user with the probability maps associated with a particular drought category. A retrospective assessment of the OPDFS showed that forecasting of the 2012 Great Plains and 2014 California droughts was possible at least one month in advance. The OPDFS offers timely assistance to water managers, stakeholders and decision-makers to develop resilience against uncertain upcoming droughts.
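
    The particle-filter step at the core of such a framework is compact: weight each ensemble member by the likelihood of the satellite retrieval, then resample. A minimal sketch (Gaussian observation error assumed; all numbers invented):

        import numpy as np

        def pf_update(particles, obs, obs_std, rng):
            """One assimilation step: likelihood weighting + resampling."""
            w = np.exp(-0.5 * ((particles - obs) / obs_std) ** 2)
            w /= w.sum()
            return particles[rng.choice(len(particles), len(particles), p=w)]

        rng = np.random.default_rng(2)
        prior = rng.normal(0.25, 0.05, 1000)   # simulated soil moisture ensemble
        posterior = pf_update(prior, obs=0.30, obs_std=0.02, rng=rng)
        print(prior.mean(), posterior.mean())  # ensemble pulled toward retrieval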

  9. Dynamic Uncertain Causality Graph for Knowledge Representation and Probabilistic Reasoning: Directed Cyclic Graph and Joint Probability Distribution.

    PubMed

    Zhang, Qin

    2015-07-01

    Probabilistic graphical models (PGMs) such as Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. Dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs, which can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been previously presented, in which only the directed acyclic graph (DAG) was addressed. However, the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BN does not allow DCGs, as otherwise the conditional independence will not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG with or without DCGs is proved to be a joint probability distribution (JPD) over a set of random variables. The incomplete DUCG as a part of a complete DUCG may represent a part of JPD. Examples are provided to illustrate the methodology.

  10. Incorporating psychological influences in probabilistic cost analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the scope and magnitude of the cost-overrun problem, the benefits are likely to be significant.
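
    The MAIMS principle has a one-line Monte Carlo expression: each element bills at least its allocated budget, while overruns pass through. A minimal sketch with invented numbers (the correlations between elements that the paper also models are omitted here):

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000

        def weibull_cost(loc, scale, shape, size):
            """Three-parameter Weibull cost element (location + scaled Weibull)."""
            return loc + scale * rng.weibull(shape, size)

        elements = [weibull_cost(1.0, 0.5, 1.8, n),   # element costs, $M
                    weibull_cost(2.0, 0.8, 1.8, n),
                    weibull_cost(0.5, 0.3, 1.8, n)]
        budgets = [1.3, 2.4, 0.7]                     # allocated baselines, $M

        naive = sum(e.mean() for e in elements)
        maims = sum(np.maximum(e, b).mean() for e, b in zip(elements, budgets))
        print(f"naive expected cost {naive:.2f} $M vs "
              f"MAIMS-adjusted {maims:.2f} $M")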

  11. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct assessment of probabilistic seismic hazard and risk maps is the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach that was developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Due to this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods, and these ground motion models were then used in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in previous seismic hazard assessments and that the present normative seismic hazard map needs careful recalculation.

  12. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because once the design-basis tsunami height is set, there remains a possibility that the tsunami height may exceed it due to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure of system analysis are still being developed. (authors)
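
    Percentile hazard curves from a logic tree can be sketched in a few lines: each branch contributes a weighted exceedance-rate curve, and fractiles are taken across branches at each height (the branch shapes and weights below are invented):

        import numpy as np

        rng = np.random.default_rng(5)
        heights = np.linspace(0.0, 15.0, 151)   # tsunami height grid (m)

        # Hypothetical logic-tree branches: (weight, rate at 0 m, decay scale).
        branch_params = [(0.4, 0.02, 2.0), (0.4, 0.01, 3.0), (0.2, 0.005, 4.0)]
        weights = np.array([w for w, _, _ in branch_params])
        branches = np.array([r0 * np.exp(-heights / h0)
                             for _, r0, h0 in branch_params])

        mean_curve = weights @ branches
        # Weight-respecting fractile curves via resampling of branches.
        sample = branches[rng.choice(len(weights), size=5000, p=weights)]
        p16, p50, p84 = np.percentile(sample, [16, 50, 84], axis=0)
        for name, curve in [("mean", mean_curve), ("p16", p16),
                            ("p50", p50), ("p84", p84)]:
            print(name, "annual exceedance rate at 5 m:",
                  float(np.interp(5.0, heights, curve)))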

  13. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.

  14. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    PubMed

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.

  15. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair

    PubMed Central

    Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon

    2017-01-01

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768
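
    For intuition on the varying-amplitude loading issue, a generic Monte Carlo fatigue-life sketch using Miner's rule and an S-N curve is shown below (a textbook simplification, not the authors' system reliability approach; all numbers are invented):

        import numpy as np

        rng = np.random.default_rng(13)
        n = 20_000

        # S-N curve N = A / S^m with lognormal uncertainty in A, and a
        # variable-amplitude train-load spectrum: (stress range MPa, cycles/yr).
        spectrum = [(40.0, 2.0e5), (60.0, 5.0e4), (80.0, 1.0e4)]
        m = 3.0
        A = rng.lognormal(mean=np.log(2.0e12), sigma=0.3, size=n)

        damage_per_year = sum(cyc * s**m / A for s, cyc in spectrum)
        life = 1.0 / damage_per_year        # years until Miner sum reaches 1
        print(f"median fatigue life {np.median(life):.0f} yr; "
              f"P(life < 75 yr) = {np.mean(life < 75):.3f}")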

  16. Analysis and probabilistic risk assessment of bioaccessible arsenic in polished and husked jasmine rice sold in Bangkok.

    PubMed

    Hensawang, Supanad; Chanpiwat, Penradee

    2018-09-01

    Food is one of the major sources of arsenic (As) exposure in humans. The objectives of this study were to determine the bioaccessible concentration of As in rice grain sold in Bangkok and to evaluate the potential health risks associated with rice consumption. Polished (n = 32) and husked (n = 17) jasmine rice were collected from local markets. In vitro digestion was performed to determine the bioaccessible As concentrations, which were used for probabilistic health risk assessments in different age groups of the population. Approximately 43.0% and 44.4% of the total As in the grain of polished and husked rice, respectively, was in the form of bioaccessible As. Significantly higher bioaccessible As concentrations were found in husked rice than in polished rice (1.5-3.8 times greater). The concentrations of bioaccessible As in polished and husked rice were lower than the Codex standard for As in rice. The average daily dose of As via rice consumption is equivalent to the daily ingestion of 2 L of water containing approximately 3.2-7.2 μg L⁻¹ of As. Approximately 0.2%-13.7% and 10.7%-55.3% of the population may experience non-carcinogenic effects from polished and husked rice consumption, respectively. Approximately 1%-11.6% of children and 74.1%-99.8% of adults were at risk of cancer. The maximum cancer probabilities were 3 children and 6 adults in 10,000 individuals. The probabilistic risk results indicated that children and adults were at risk of both non-carcinogenic and carcinogenic effects from both types of rice consumption. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Clinical predictors of conversion to bipolar disorder in a prospective longitudinal familial high-risk sample: focus on depressive features.

    PubMed

    Frankland, Andrew; Roberts, Gloria; Holmes-Preston, Ellen; Perich, Tania; Levy, Florence; Lenroot, Rhoshel; Hadzi-Pavlovic, Dusan; Breakspear, Michael; Mitchell, Philip B

    2017-11-07

    Identifying clinical features that predict conversion to bipolar disorder (BD) in those at high familial risk (HR) would assist in identifying a more focused population for early intervention. In total 287 participants aged 12-30 (163 HR with a first-degree relative with BD and 124 controls (CONs)) were followed annually for a median of 5 years. We used the baseline presence of DSM-IV depressive, anxiety, behavioural and substance use disorders, as well as a constellation of specific depressive symptoms (as identified by the Probabilistic Approach to Bipolar Depression) to predict the subsequent development of hypo/manic episodes. At baseline, HR participants were significantly more likely to report ⩾4 Probabilistic features (40.4%) when depressed than CONs (6.7%; p < .05). Nineteen HR subjects later developed either threshold (n = 8; 4.9%) or subthreshold (n = 11; 6.7%) hypo/mania. The presence of ⩾4 Probabilistic features was associated with a seven-fold increase in the risk of 'conversion' to threshold BD (hazard ratio = 6.9, p < .05) above and beyond the fourteen-fold increase in risk related to major depressive episodes (MDEs) per se (hazard ratio = 13.9, p < .05). Individual depressive features predicting conversion were psychomotor retardation and ⩾5 MDEs. Behavioural disorders only predicted conversion to subthreshold BD (hazard ratio = 5.23, p < .01), while anxiety and substance disorders did not predict either threshold or subthreshold hypo/mania. This study suggests that specific depressive characteristics substantially increase the risk of young people at familial risk of BD going on to develop future hypo/manic episodes and may identify a more targeted HR population for the development of early intervention programs.

  18. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.

  19. On the probabilistic structure of water age

    NASA Astrophysics Data System (ADS)

    Porporato, Amilcare; Calabrese, Salvatore

    2015-05-01

    The age distribution of water in hydrologic systems has received renewed interest recently, especially in relation to watershed response to rainfall inputs. The purpose of this contribution is first to draw attention to existing theories of age distributions in population dynamics, fluid mechanics and stochastic groundwater, and in particular to the McKendrick-von Foerster equation and its generalizations and solutions. A second and more important goal is to clarify that, when hydrologic fluxes are modeled by means of time-varying stochastic processes, the age distributions must themselves be treated as random functions. Once their probabilistic structure is obtained, it can be used to characterize the variability of age distributions in real systems and thus help quantify the inherent uncertainty in the field determination of water age. We illustrate these concepts with reference to a stochastic storage model, which has been used as a minimalist model of soil moisture and streamflow dynamics.
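
    For reference, the McKendrick-von Foerster equation invoked here takes the standard form (hydrologic reading assumed: n(a,t) is the storage of water of age a at time t, mu the age-specific output rate, and the boundary condition injects the rainfall input J(t)):

        \frac{\partial n(a,t)}{\partial t} + \frac{\partial n(a,t)}{\partial a}
            = -\mu(a,t)\, n(a,t), \qquad n(0,t) = J(t).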

  20. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
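
    Of the classical tools compared, the Kaplan-Meier estimator is the simplest to state: at each event time, survival is multiplied by (1 - d/n), with d events among n subjects still at risk. A minimal sketch on invented follow-up data:

        import numpy as np

        def kaplan_meier(times, events):
            """Kaplan-Meier survival curve; events=False marks censoring."""
            times = np.asarray(times, dtype=float)
            events = np.asarray(events, dtype=bool)
            surv, s = {}, 1.0
            for t in np.unique(times[events]):
                n_at_risk = np.sum(times >= t)
                d = np.sum((times == t) & events)
                s *= 1.0 - d / n_at_risk
                surv[float(t)] = s
            return surv

        t = [6, 7, 10, 15, 19, 25, 25, 32, 40, 42]   # months of follow-up
        e = [True, False, True, True, False, True, True, False, False, True]
        print(kaplan_meier(t, e))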

  1. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  2. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which makes visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  3. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is very critical for understanding component failure mechanisms and in identifying reliability-critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.

  4. Review of probabilistic analysis of dynamic response of systems with random parameters

    NASA Technical Reports Server (NTRS)

    Kozin, F.; Klosner, J. M.

    1989-01-01

    The various methods that have been studied in the past to allow probabilistic analysis of dynamic response for systems with random parameters are reviewed. Dynamic response could be obtained deterministically if the variations about the nominal values were small; however, for space structures which require precise pointing, the variations about the nominal values of the structural details and of the environmental conditions are too large to be considered negligible. These uncertainties are accounted for in terms of probability distributions about their nominal values. The quantities of concern for describing the response of the structure include displacements, velocities, and the distributions of natural frequencies. The exact statistical characterization of the response would yield joint probability distributions for the response variables. Since the random quantities appear as coefficients, determining the exact distributions will be difficult at best, so certain approximations have to be made. A number of available techniques are discussed, including for the nonlinear case. The methods described are: (1) Liouville's equation; (2) perturbation methods; (3) mean square approximate systems; and (4) nonlinear systems approximated by linear systems.
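
    A minimal sketch of the contrast between two of the reviewed approaches, using a single-degree-of-freedom oscillator with random stiffness: the natural-frequency statistics are estimated both by Monte Carlo sampling and by a first-order perturbation about the nominal value. The parameter values are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Oscillator with random stiffness k ~ Normal(k0, sigma_k);
    # natural frequency w = sqrt(k/m).
    m, k0, sigma_k = 1.0, 100.0, 5.0

    # Monte Carlo: sample k, propagate through the exact relation.
    k = rng.normal(k0, sigma_k, 100_000)
    w_mc = np.sqrt(np.clip(k, 1e-9, None) / m)

    # First-order perturbation about the nominal value k0.
    w0 = np.sqrt(k0 / m)
    dw_dk = 0.5 / np.sqrt(k0 * m)       # derivative of sqrt(k/m) at k0
    w_pert_mean = w0                     # mean unchanged at first order
    w_pert_std = abs(dw_dk) * sigma_k

    print(f"Monte Carlo : mean={w_mc.mean():.4f}, std={w_mc.std():.4f}")
    print(f"Perturbation: mean={w_pert_mean:.4f}, std={w_pert_std:.4f}")
    ```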

  5. State Tracking and Fault Diagnosis for Dynamic Systems Using Labeled Uncertainty Graph.

    PubMed

    Zhou, Gan; Feng, Wenquan; Zhao, Qi; Zhao, Hongbo

    2015-11-05

    Cyber-physical systems such as autonomous spacecraft, power plants and automotive systems become more vulnerable to unanticipated failures as their complexity increases. Accurate tracking of system dynamics and fault diagnosis are therefore essential. This paper presents an efficient state estimation method for dynamic systems modeled as concurrent probabilistic automata. First, the Labeled Uncertainty Graph (LUG) method from the planning domain is introduced to describe the state tracking and fault diagnosis processes. Because the system model is probabilistic, the Monte Carlo technique is employed to sample the probability distribution of belief states. In addition, to address the sample impoverishment problem, an innovative look-ahead technique is proposed to recursively generate the most likely belief states without exhaustively checking all possible successor modes. The overall algorithm incorporates two major steps: a roll-forward process that estimates system state and identifies faults, and a roll-backward process that analyzes possible system trajectories once faults have been detected. We demonstrate the effectiveness of this approach by applying it to a real-world domain: the power supply control unit of a spacecraft.
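
    The LUG bookkeeping and look-ahead pruning of the paper are not reproduced here, but the underlying Monte Carlo belief-state idea can be illustrated with a generic bootstrap filter over a small probabilistic automaton; the modes, transition matrix, and observation likelihoods below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Discrete modes, probabilistic transitions, belief held as samples.
    modes = ["nominal", "degraded", "failed"]
    T = np.array([[0.97, 0.02, 0.01],     # row = current mode, col = next
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])
    obs_lik = {"ok":    np.array([0.90, 0.40, 0.05]),  # P(obs | mode)
               "alarm": np.array([0.10, 0.60, 0.95])}

    n = 5000
    particles = np.zeros(n, dtype=int)    # all start in "nominal"
    for y in ["ok", "ok", "alarm", "alarm"]:
        # propagate each sample through the stochastic mode transition
        particles = np.array([rng.choice(3, p=T[s]) for s in particles])
        # reweight by the observation likelihood, then resample
        w = obs_lik[y][particles]
        idx = rng.choice(n, size=n, p=w / w.sum())
        particles = particles[idx]
        belief = np.bincount(particles, minlength=3) / n
        print(y, dict(zip(modes, np.round(belief, 3))))
    ```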

  6. Probabilistic Multi-Person Tracking Using Dynamic Bayes Networks

    NASA Astrophysics Data System (ADS)

    Klinger, T.; Rottensteiner, F.; Heipke, C.

    2015-08-01

    Tracking-by-detection is a widely used practice in recent tracking systems. These usually rely on independent single-frame detections that are handled as observations in a recursive estimation framework. If these observations are imprecise, the generated trajectory is prone to being updated towards a wrong position. In contrast to existing methods, our novel approach uses a Dynamic Bayes Network in which the state vector of a recursive Bayes filter, as well as the location of the tracked object in the image, are modelled as unknowns. These unknowns are estimated in a probabilistic framework, taking into account a dynamic model and a state-of-the-art pedestrian detector and classifier. The classifier is based on the Random Forest algorithm and can be trained incrementally, so that new training samples can be incorporated at runtime. This allows the classifier to adapt to the changing appearance of a target and to unlearn outdated features. The approach is evaluated on a publicly available benchmark. The results confirm that our approach is well suited for tracking pedestrians over long distances while at the same time achieving comparatively good geometric accuracy.

  7. A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.

    PubMed

    Kazemi, Reza; Mosleh, Ali; Dierks, Meghan

    2017-03-01

    In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors, as well as physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AEs. BBNs are networks of probabilities that can capture probabilistic relations between variables and contain historical information about their relationships, and are powerful tools for modeling causes and effects in many domains. The model is intended to support hospital decisions with regard to staffing, length of stay, and investments in safety, which evolve dynamically over time. The methodology has been applied to model two common types of AE: pressure ulcers and vascular-catheter-associated infections; the models have been validated with eight years of clinical data and expert opinion. © 2017 Society for Risk Analysis.

  8. A Nonlinear Dynamics Approach for Incorporating Wind-Speed Patterns into Wind-Power Project Evaluation

    PubMed Central

    Huffaker, Ray; Bittelli, Marco

    2015-01-01

    Wind-energy production may be expanded beyond regions with high-average wind speeds (such as the Midwest U.S.A.) to sites with lower-average speeds (such as the Southeast U.S.A.) by locating favorable regional matches between natural wind-speed and energy-demand patterns. A critical component of wind-power evaluation is to incorporate wind-speed dynamics reflecting documented diurnal and seasonal behavioral patterns. Conventional probabilistic approaches remove patterns from wind-speed data. These patterns must be restored synthetically before they can be matched with energy-demand patterns. How to accurately restore wind-speed patterns is a vexing problem spurring an expanding line of papers. We propose a paradigm shift in wind power evaluation that employs signal-detection and nonlinear-dynamics techniques to empirically diagnose whether synthetic pattern restoration can be avoided altogether. If the complex behavior of observed wind-speed records is due to nonlinear, low-dimensional, and deterministic system dynamics, then nonlinear dynamics techniques can reconstruct wind-speed dynamics from observed wind-speed data without recourse to conventional probabilistic approaches. In the first study of its kind, we test a nonlinear dynamics approach in an application to Sugarland Wind—the first utility-scale wind project proposed in Florida, USA. We find empirical evidence of a low-dimensional and nonlinear wind-speed attractor characterized by strong temporal patterns that match up well with regular daily and seasonal electricity demand patterns. PMID:25617767
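
    A hedged sketch of the standard first step in the nonlinear-dynamics route described above: reconstructing a state-space attractor from a scalar wind-speed record by time-delay embedding. The synthetic record, delay, and embedding dimension are placeholders; in practice these are chosen with tools such as average mutual information and false-nearest-neighbor tests.

    ```python
    import numpy as np

    def delay_embed(x, dim, tau):
        """Return the delay-embedding matrix with rows
        [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    # Synthetic stand-in for an hourly wind-speed record with diurnal
    # structure plus noise (not real data).
    rng = np.random.default_rng(2)
    t = np.arange(5000)
    wind = 6 + 2 * np.sin(2 * np.pi * t / 24) + 0.5 * rng.standard_normal(t.size)

    emb = delay_embed(wind, dim=3, tau=6)
    print(emb.shape)   # (4988, 3): points tracing the reconstructed attractor
    ```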

  9. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented where we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows decomposing the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g., cover of the piles) and the failure of the geological barrier (e.g., the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
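
    The cut-set arithmetic described above can be sketched as follows, with invented basic events standing in for the cover, clay-layer, and well-capture failures; the rare-event approximation is checked against a Monte Carlo estimate of the exact union probability.

    ```python
    import numpy as np

    # Invented basic-event probabilities; events assumed independent.
    basic = {"cover": 1e-2, "clay": 5e-3, "stream_bund": 2e-2,
             "well_capture": 1e-3}
    cut_sets = [("cover", "clay"),          # pathway to the karst aquifer
                ("cover", "stream_bund"),   # pathway to the stream
                ("well_capture",)]          # direct pathway to the wells

    # Rare-event approximation: P(failure) ~ sum over cut sets of the
    # product of their basic-event probabilities.
    p_approx = sum(np.prod([basic[e] for e in cs]) for cs in cut_sets)

    # Monte Carlo check of the exact union probability.
    rng = np.random.default_rng(3)
    n = 1_000_000
    draws = {e: rng.random(n) < p for e, p in basic.items()}
    fail = np.zeros(n, dtype=bool)
    for cs in cut_sets:
        hit = np.ones(n, dtype=bool)
        for e in cs:
            hit &= draws[e]
        fail |= hit
    print(f"rare-event approx: {p_approx:.3e}, Monte Carlo: {fail.mean():.3e}")
    ```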

  10. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  11. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  12. Shuttle Risk Progression by Flight

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri; Kahn, Joe; Thigpen, Eric; Zhu, Tony; Lo, Yohon

    2011-01-01

    Understanding the early mission risk and the progression of risk as a vehicle gains insight through flight is important: (a) to the Shuttle Program, to understand the impact of re-designs and operational changes on risk; and (b) to new programs, to understand reliability growth and first-flight risk. Estimation of Shuttle risk progression by flight: (a) uses the Shuttle Probabilistic Risk Assessment (SPRA) and current knowledge to calculate early vehicle risk; (b) shows the impact of major Shuttle upgrades; and (c) can be used to understand first-flight risk for new programs.

  13. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans, with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway per L tissue per day for lung and liver, were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10%, in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicates that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for the variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources, including the current effort, even though these estimates were developed using very different techniques. Given the variety of approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.

  14. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  15. Risk assessment of CST-7 proposed waste treatment and storage facilities Volume I: Limited-scope probabilistic risk assessment (PRA) of proposed CST-7 waste treatment & storage facilities. Volume II: Preliminary hazards analysis of proposed CST-7 waste storage & treatment facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, K.

    1994-06-01

    In FY 1993, the Los Alamos National Laboratory Waste Management Group [CST-7 (formerly EM-7)] requested the Probabilistic Risk and Hazards Analysis Group [TSA-11 (formerly N-6)] to conduct a study of the hazards associated with several CST-7 facilities. Among these facilities are the Hazardous Waste Treatment Facility (HWTF), the HWTF Drum Storage Building (DSB), and the Mixed Waste Receiving and Storage Facility (MWRSF), which are proposed for construction beginning in 1996. These facilities are needed to upgrade the Laboratory's storage capability for hazardous and mixed wastes and to provide treatment capabilities for wastes in cases where offsite treatment is not available or desirable. These facilities will assist Los Alamos in complying with federal and state regulations.

  16. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.

    PubMed

    Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-02-24

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.

  17. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties, including modeling errors. A significant feature of this paper is the detailed description of the Monte Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes wind field modeling, missile injection, solution of the flight equations, and missile impact analysis, is described with application examples.

  18. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx), a MOOSE-based application, is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state of the art in line with other scientific research efforts.

  19. ECOFRAM Terrestrial Draft Report

    EPA Pesticide Factsheets

    ECOFRAM report describing the concepts for moving from deterministic to probabilistic ecological risk assessments. ECOFRAM included scientific experts from government, academia, contract laboratories, environmental advocacy groups and industry.

  20. Safety Issues with Hydrogen as a Vehicle Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, Lee Charles; Herring, James Stephen

    1999-10-01

    This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.

  2. Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks.

    PubMed

    Saad, E W; Prokhorov, D V; Wunsch, D C

    1998-01-01

    Three networks are compared for low false alarm stock trend predictions. Short-term trends, particularly attractive for neural network analysis, can be used profitably in scenarios such as option trading, but only with significant risk. Therefore, we focus on limiting false alarms, which improves the risk/reward ratio by preventing losses. To predict stock trends, we exploit time delay, recurrent, and probabilistic neural networks (TDNN, RNN, and PNN, respectively), utilizing conjugate gradient and multistream extended Kalman filter training for TDNN and RNN. We also discuss different predictability analysis techniques and perform an analysis of predictability based on a history of daily closing price. Our results indicate that all the networks are feasible, the primary preference being one of convenience.

  3. Enhancing Cost Realism through Risk-Driven Contracting: Designing Incentive Fees Based on Probabilistic Cost Estimates

    DTIC Science & Technology

    2012-04-01

    It appears the pendulum may be...the cost risk for requiring greater innovation. However, this natural flattening trend also leads to a potential drawback of the risk-driven

  4. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritterbusch, Stanley; Golay, Michael; Duran, Felicia

    2003-01-29

    This report summarizes methods proposed for risk-informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment methods as the primary decision-making tool.

  5. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  6. Characterizing Wildfire Regimes and Risk in the USA

    NASA Astrophysics Data System (ADS)

    Malamud, B. D.; Millington, J. D.; Perry, G. L.

    2004-12-01

    Over the last decade, high-profile wildfires have resulted in numerous fatalities and loss of infrastructure. Wildfires also have a significant impact on climate and ecosystems, with recent authors emphasizing the need for regional-level examinations of wildfire-regime dynamics and change, and the factors driving them. With implications for hazard management, climate studies, and ecosystem research, there is therefore significant interest in appropriate analysis of historical wildfire databases. Insightful studies using wildfire database statistics exist, but are often hampered by the low spatial and/or temporal resolution of their datasets. In this paper, we use a high-resolution dataset consisting of 88,855 USFS wildfires over the time period 1970-2000, and consider wildfire occurrence across the conterminous USA as a function of ecoregion (land units classified by climate, vegetation, and topography), ignition source (anthropogenic vs. lightning), and decade (1970-1979, 1980-1989, 1990-1999). We find that for the conterminous USA (a) wildfires exhibit robust frequency-area power-law behavior in 17 different ecoregions, (b) normalized power-law exponents may be used to compare the scaling of wildfire burned areas between regions, (c) power-law exponents change systematically from east to west, (d) wildfires in 75% of the conterminous USA (particularly the east) have higher power-law exponents for anthropogenic vs. lightning ignition sources, and (e) recurrence intervals for wildfires of a given burned area or larger can be assessed for each ecoregion, allowing for the classification of wildfire regimes for probabilistic hazard estimation in the same vein as is now used for earthquakes. By examining wildfire statistics in a spatially and temporally explicit manner, we are able to present resultant wildfire regime summary statistics and conclusions, along with a probabilistic hazard assessment of wildfire risk at the ecoregion division level across the conterminous USA.
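
    The frequency-area analysis described above rests on fitting a power-law exponent to burned areas above a threshold. A minimal sketch using the standard maximum-likelihood (Hill-type) estimator on synthetic data follows; the exponent and threshold are illustrative, not the paper's values.

    ```python
    import numpy as np

    # Synthetic burned areas drawn from f(A) ~ A^(-beta) above a_min,
    # via inverse-CDF sampling.
    rng = np.random.default_rng(4)
    beta_true, a_min = 1.4, 1.0
    u = rng.random(10_000)
    areas = a_min * (1 - u) ** (-1 / (beta_true - 1))

    def powerlaw_mle(x, xmin):
        """Maximum-likelihood estimate of the continuous power-law
        exponent for data above xmin."""
        x = x[x >= xmin]
        return 1 + len(x) / np.sum(np.log(x / xmin))

    print(f"beta_hat = {powerlaw_mle(areas, a_min):.3f} (true {beta_true})")
    ```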

  7. Work-in-Progress Presented at the Army Symposium on Solid Mechanics, 1980 - Designing for Extremes: Environment, Loading, and Structural Behavior Held at Cape Cod, Massachusetts, 29 September-2 October 1980

    DTIC Science & Technology

    1980-09-01

    A probabilistic reliability model for the XM 753 projectile rocket motor-to-bulkhead joint under extreme loading conditions is constructed.

  8. A Survey of Probabilistic Methods for Dynamical Systems with Uncertain Parameters.

    DTIC Science & Technology

    1986-05-01

  9. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. The performance relative to the current allocation process is assessed in the context of whether such a model could support the proposed short-term contract based participatory process. A synthetic forecasting example is also used to explore the relative roles of forecast skill and reservoir storage in this framework.
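
    A minimal sketch of the flavor of such a single-period, multiuser allocation: supply is taken from a conservative quantile of a probabilistic inflow forecast and allocated to maximize total willingness to pay, subject to each user's requested quantity. All numbers are invented, and the paper's model is considerably richer (reliability-differentiated contracts, compensation for nonperformance, iteration with users).

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(9)

    # Probabilistic inflow forecast (ensemble); allocate against a
    # conservative quantile, here the volume available with 90% reliability.
    forecast = rng.lognormal(mean=4.0, sigma=0.3, size=10_000)
    supply = np.percentile(forecast, 10)

    price = np.array([5.0, 3.0, 2.0])     # willingness to pay per unit
    demand = np.array([30.0, 40.0, 50.0]) # requested quantity per user

    # linprog minimizes, so negate prices; bounds cap each user's share.
    res = linprog(c=-price,
                  A_ub=np.ones((1, 3)), b_ub=[supply],
                  bounds=[(0, d) for d in demand])
    print(f"supply={supply:.1f}", "allocations:", np.round(res.x, 1))
    ```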

  10. A screening level probabilistic ecological risk assessment of PAHs in sediments of San Francisco Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Febbo, E.J.; Arnold, W.R.; Biddinger, G.R.

    1995-12-31

    As part of the Regional Monitoring Program administered by the San Francisco Estuary Institute (SFEI), sediment samples were collected at 20 stations in San Francisco Bay and analyzed to determine concentrations of 43 PAHs. These data were obtained from SFEI and used to calculate the potential risk to aquatic organisms using probabilistic modeling and Monte Carlo statistical procedures. Sediment chemistry data were used in conjunction with a sediment equilibrium model, a bioconcentration model, biota-sediment accumulation factors, and critical body burden effects concentrations to assess potential risk to bivalves. Bivalves were the chosen receptors because they lack a well-developed enzymatic system for metabolizing PAHs. Thus, they more readily accumulate PAHs and represent a species at greater risk than other taxa, such as fish and crustaceans. PAHs considered in this study span a broad range of octanol-water partition coefficients. Results indicate that risk of non-polar narcotic effects from PAHs was low in the Northern Bay Area, but higher in the South Bay near the more urbanized sections of the drainage basin.

  11. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence, or failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on the patient risk. We develop first a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist" characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and we compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.

  12. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the in-flight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high-salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component couples the input probability distributions describing urine chemistry, astronaut physiology, and system parameters with the inputs and outputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model is driven by Monte Carlo simulations, repeatedly sampling the probability distributions of the electrolyte concentrations and system parameters that are inputs into the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA-STD-7009 requirements.
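
    A heavily simplified sketch of the Monte Carlo coupling pattern described above. The JESS speciation step and the Kassemi growth model are replaced by placeholder functions, and the regression to symptomatic probability is a stand-in logistic curve; only the sampling structure is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def growth_model(ca_oxalate_rs, urine_volume):
        """Placeholder for the deterministic stone-growth model: a maximum
        stone width (mm) increasing with supersaturation and decreasing
        with urine volume."""
        return 2.0 * ca_oxalate_rs / urine_volume

    def p_symptomatic(width_mm):
        """Placeholder regression mapping stone width to the probability
        of a clinically significant presentation."""
        return 1.0 / (1.0 + np.exp(-(width_mm - 3.0)))

    # Sample the input distributions (invented shapes and parameters).
    n = 100_000
    rs = rng.lognormal(mean=0.5, sigma=0.4, size=n)         # supersaturation
    volume = rng.normal(1.5, 0.3, size=n).clip(0.5, None)   # L/day

    widths = growth_model(rs, volume)
    risk = p_symptomatic(widths)
    print(f"mean P(symptomatic stone) = {risk.mean():.3f}, "
          f"95th pct = {np.percentile(risk, 95):.3f}")
    ```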

  13. To Tip or Not to Tip: The Case of the Congo Basin Rainforest Realm

    NASA Astrophysics Data System (ADS)

    Pietsch, S.; Bednar, J. E.; Fath, B. D.; Winter, P. A.

    2017-12-01

    The future response of the Congo basin rainforest, the second largest tropical carbon reservoir, to climate change is still under debate. Different climate projections exist, indicating both increases and decreases in rainfall as well as different changes in rainfall patterns. Within this study, we assess the full range of climate change possibilities to define the climatic thresholds of Congo basin rainforest stability and the limiting conditions for rainforest persistence. We use field data from 199 research plots in the western Congo basin to calibrate and validate a complex biogeochemistry model (BGC-MAN) and assess model performance against an array of possible future climates. Next, we analyze the reasons for the occurrence of tipping points and their spatial and temporal probability of occurrence, present effects of hysteresis, and derive probabilistic spatial-temporal resilience landscapes for the region. Additionally, we analyze attractors of forest growth dynamics, assess common linear measures for early warning signals of sudden shifts in system dynamics for their robustness in the context of the Congo basin case, and introduce the correlation integral as a nonlinear measure of risk assessment.

  14. Estimating and controlling workplace risk: an approach for occupational hygiene and safety professionals.

    PubMed

    Toffel, Michael W; Birkner, Lawrence R

    2002-07-01

    The protection of people and physical assets is the objective of health and safety professionals and is accomplished through the paradigm of anticipation, recognition, evaluation, and control of risks in the occupational environment. Risk assessment concepts are not only used by health and safety professionals, but also by business and financial planners. Since meeting health and safety objectives requires financial resources provided by business and governmental managers, the hypothesis addressed here is that health and safety risk decisions should be made with probabilistic processes used in financial decision-making and which are familiar and recognizable to business and government planners and managers. This article develops the processes and demonstrates the use of incident probabilities, historic outcome information, and incremental impact analysis to estimate risk of multiple alternatives in the chemical process industry. It also analyzes how the ethical aspects of decision-making can be addressed in formulating health and safety risk management plans. It is concluded that certain, easily understood, and applied probabilistic risk assessment methods used by business and government to assess financial and outcome risk have applicability to improving workplace health and safety in three ways: 1) by linking the business and health and safety risk assessment processes to securing resources, 2) by providing an additional set of tools for health and safety risk assessment, and 3) by requiring the risk assessor to consider multiple risk management alternatives.

  15. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    NASA Astrophysics Data System (ADS)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes a process for calculating the reliability of a complex information security system (CISS), using the example of a technospheric security management model, and for determining the frequency of its maintenance from the system reliability parameter, which allows one to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; where reliability is a risk-forming factor, an acceptable risk level should be maintained through routine analysis of the condition of the CISS and its elements and their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is given. The chart of CISS reliability change is an S-shaped curve which can be divided into three periods: an almost invariable high level of reliability, uniform reliability reduction, and an almost invariable low level of reliability. Setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system meets requirements. Ideally, this period should not be longer than the first period of the graph. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve a voluminous and critical task of information asset risk management.
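
    The article's exact reliability formula is not reproduced here, but the maintenance-frequency idea can be sketched by assuming an S-shaped (logistic-decay) reliability curve and solving for the time at which it first drops to the minimum acceptable level; all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def reliability(t, r0=0.99, t_mid=400.0, k=0.02):
        """Assumed S-shaped reliability: near r0 early, decaying around
        t_mid (days); NOT the article's formula."""
        return r0 / (1.0 + np.exp(k * (t - t_mid)))

    r_min = 0.95   # minimum acceptable reliability, set by experts

    # Maintenance is due when R(t) first drops to r_min.
    t_service = brentq(lambda t: reliability(t) - r_min, 0.0, 2000.0)
    print(f"service the system every {t_service:.0f} days")
    ```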

  16. Beyond the “at risk mental state” concept: transitioning to transdiagnostic psychiatry

    PubMed Central

    McGorry, Patrick D.; Hartmann, Jessica A.; Spooner, Rachael; Nelson, Barnaby

    2018-01-01

    The “at risk mental state” for psychosis approach has been a catalytic, highly productive research paradigm over the last 25 years. In this paper we review that paradigm and summarize its key lessons, which include the valence of this phenotype for future psychosis outcomes, but also for comorbid, persistent or incident non‐psychotic disorders; and the evidence that onset of psychotic disorder can at least be delayed in ultra high risk (UHR) patients, and that some full‐threshold psychotic disorder may emerge from risk states not captured by UHR criteria. The paradigm has also illuminated risk factors and mechanisms involved in psychosis onset. However, findings from this and related paradigms indicate the need to develop new identification and diagnostic strategies. These findings include the high prevalence and impact of mental disorders in young people, the limitations of current diagnostic systems and risk identification approaches, the diffuse and unstable symptom patterns in early stages, and their pluripotent, transdiagnostic trajectories. The approach we have recently adopted has been guided by the clinical staging model and adapts the original “at risk mental state” approach to encompass a broader range of inputs and output target syndromes. This approach is supported by a number of novel modelling and prediction strategies that acknowledge and reflect the dynamic nature of psychopathology, such as dynamical systems theory, network theory, and joint modelling. Importantly, a broader transdiagnostic approach and enhancing specific prediction (profiling or increasing precision) can be achieved concurrently. A holistic strategy can be developed that applies these new prediction approaches, as well as machine learning and iterative probabilistic multimodal models, to a blend of subjective psychological data, physical disturbances (e.g., EEG measures) and biomarkers (e.g., neuroinflammation, neural network abnormalities) acquired through fine‐grained sequential or longitudinal assessments. This strategy could ultimately enhance our understanding and ability to predict the onset, early course and evolution of mental ill health, further opening pathways for preventive interventions. PMID:29856558

  17. MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Tarquini, Simone

    2018-01-01

    A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and a prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest-slope direction and by tunable input settings. MrLavaLoba belongs among the probabilistic codes for the simulation of lava flows, because it is not intended to mimic the actual process of flowing or to provide directly the progression of the flow field with time, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g., pahoehoe or channelized 'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and at Mt. Etna, and the obtained maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.
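
    The following is a sketch in the spirit of the budding rule, not MrLavaLoba's actual algorithm (see the source repository for that): a new parcel's direction is drawn from a von Mises distribution peaked around the local steepest-descent direction, with the concentration parameter standing in for the code's tunable settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def bud_direction(grad_x, grad_y, concentration=4.0):
        """Sample a propagation direction (radians) for a new parcel,
        peaked around the steepest-descent direction; higher concentration
        means stronger slope control."""
        downslope = np.arctan2(-grad_y, -grad_x)   # -gradient points downhill
        return rng.vonmises(downslope, concentration)

    # Terrain sloping down toward +x: the elevation gradient points toward
    # -x, so sampled directions cluster near 0 radians (the +x direction).
    angles = [bud_direction(-1.0, 0.0) for _ in range(5)]
    print([f"{a:+.2f}" for a in angles])
    ```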

  18. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully validate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Since MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
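
    The property that makes RPI-style reuse attractive can be sketched as follows: one set of baseline crack-growth histories (containing no maintenance effects) is generated once, and different inspection plans are then evaluated against the same histories. The growth law, detectable size, and probabilities below are invented, and this illustrates only the reuse idea, not the RPI algorithm itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Baseline crack-growth histories: generated ONCE, with no maintenance.
    n_hist, n_steps = 20_000, 200
    a0 = rng.lognormal(-3.0, 0.3, n_hist)            # initial flaw size
    growth = rng.lognormal(0.02, 0.01, (n_hist, n_steps))
    histories = a0[:, None] * np.cumprod(growth, axis=1)  # size per flight
    a_crit = 0.5                                          # failure threshold

    def prob_failure(inspect_at, pod, a_det=0.1):
        """P(failure) when cracks found at inspections (detected with
        probability pod once larger than a_det) are repaired; the same
        baseline histories are reused for every plan."""
        detected = np.zeros(n_hist, dtype=bool)
        for t in inspect_at:
            detectable = histories[:, t] > a_det
            detected |= detectable & (rng.random(n_hist) < pod)
        failed = histories.max(axis=1) > a_crit
        return np.mean(failed & ~detected)

    for plan in ([], [100], [60, 120, 180]):
        print(plan, f"P(failure) = {prob_failure(plan, pod=0.9):.4f}")
    ```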

  19. Bayesian Monte Carlo and Maximum Likelihood Approach for Uncertainty Estimation and Risk Management: Application to Lake Oxygen Recovery Model

    EPA Science Inventory

    Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...

  20. A systematic risk management approach employed on the CloudSat project

    NASA Technical Reports Server (NTRS)

    Basilio, R. R.; Plourde, K. S.; Lam, T.

    2000-01-01

    The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.

  1. A PROBABILISTIC ANALYSIS TO DETERMINE ECOLOGICAL RISK DRIVERS, 10TH VOLUME ASTM STP 1403

    EPA Science Inventory

    A probabilistic analysis of exposure and effect data was used to identify chemicals most likely responsible for ecological risk. The mean and standard deviation of the natural log-transformed chemical data were used to estimate the probability of exposure for an area of concern a...

  2. Minimizing risks from spilled oil to ecosystem services using influence diagrams: The Deepwater Horizon spill response

    EPA Science Inventory

    Making inferences on risks to ecosystem services (ES) from ecological crises may be improved using decision science tools. Influence diagrams (IDs) are probabilistic networks that explicitly represent the decisions related to a problem and evidence of their influence on desired o...

  3. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest among the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.

  4. Development and application of a probabilistic method for wildfire suppression cost modeling

    Treesearch

    Matthew P. Thompson; Jessica R. Haas; Mark A. Finney; David E. Calkin; Michael S. Hand; Mark J. Browne; Martin Halek; Karen C. Short; Isaac C. Grenfell

    2015-01-01

    Wildfire activity and escalating suppression costs continue to threaten the financial health of federal land management agencies. In order to minimize and effectively manage the cost of financial risk, agencies need the ability to quantify that risk. A fundamental aim of this research effort, therefore, is to develop a process for generating risk-based metrics for...

  5. Probabilistic Assessment of Radiation Risk for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2008-01-01

    For long-duration missions outside of the protection of the Earth's magnetic field, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or during Earth-to-Moon or Earth-to-Mars transit. The large majority (90%) of SPEs have small or no health consequences because the doses are low and the particles do not penetrate to organ depths. However, there is an operational challenge in responding to events of unknown size and duration. We have developed a probabilistic approach to SPE risk assessment in support of mission design and operational planning. Using the historical database of proton measurements during the past 5 solar cycles, the functional form of the hazard function of SPE occurrence per cycle was found for a nonhomogeneous Poisson model. A typical hazard function was defined as a function of time within a non-specific future solar cycle of 4000 days' duration. Distributions of particle fluences for a specified mission period were simulated, ranging from the 5th to the 95th percentile. Organ doses from large SPEs were assessed using NASA's baryon transport model, BRYNTRN. The SPE risk was analyzed with the organ dose distribution for the given particle fluences during a mission period. In addition to the total particle fluences of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the cancer risk associated with energetic particles for large events. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated for various SPE sizes. This probabilistic approach to SPE protection will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks in future work.
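
    A nonhomogeneous Poisson process of the kind described can be simulated by thinning (Lewis-Shedler). A minimal sketch in Python; the hazard shape below is an invented placeholder, not the paper's fitted functional form:

        import math
        import random

        def simulate_spe_times(hazard, t_max, lam_max):
            # Thinning: draw candidate times at the bounding rate lam_max,
            # accept each with probability hazard(t) / lam_max.
            # Requires hazard(t) <= lam_max on [0, t_max].
            t, events = 0.0, []
            while True:
                t += random.expovariate(lam_max)
                if t > t_max:
                    return events
                if random.random() < hazard(t) / lam_max:
                    events.append(t)

        # Hypothetical hazard peaking mid-cycle of a 4000-day solar cycle.
        hazard = lambda t: 0.02 * math.sin(math.pi * t / 4000.0) ** 2
        print(len(simulate_spe_times(hazard, t_max=4000.0, lam_max=0.02)))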

  6. Flood Risk and Asset Management

    DTIC Science & Technology

    2012-09-01

    … use by third parties of results or methods presented in this report. The Company also stresses that various sections of this report rely on data … inundation probability … levee contribution to risk … The methods used in FRE have been applied to establish the National Flood Risk in England and … be noted that when undertaking high-level probabilistic risk assessments in the UK, if a defence's condition is unknown, grade 3 is applied with …

  7. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided, and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314
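
    A hedged sketch of the fault-tree logic such a collision model chains together (Python); every probability below is a made-up placeholder, not a value from the paper:

        def collision_probability(p_encounter, p_detect, p_evade, p_strike):
            # Collision requires an encounter, failed avoidance (detection
            # then evasion), and an actual blade strike during passage.
            p_avoid = p_detect * p_evade
            return p_encounter * (1.0 - p_avoid) * p_strike

        # Large fish near a >= 5 m rotor in low visibility: detection is weak.
        p = collision_probability(p_encounter=0.05, p_detect=0.3,
                                  p_evade=0.9, p_strike=0.4)
        print(f"per-passage collision probability ~ {p:.4f}")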

  8. Comparing probabilistic microbial risk assessments for drinking water against daily rather than annualised infection probability targets.

    PubMed

    Signor, R S; Ashbolt, N J

    2009-12-01

    Some national drinking water guidelines provide guidance on how to define 'safe' drinking water. Regarding microbial water quality, a common position is that the chance of an individual becoming infected by some reference waterborne pathogen (e.g., Cryptosporidium) present in the drinking water should be less than 10^-4 in any year. However, the instantaneous level of risk to a water consumer varies over the course of a year, and waterborne disease outbreaks have been associated with shorter-duration periods of heightened risk. Performing probabilistic microbial risk assessments is becoming commonplace to capture the impacts of temporal variability on overall infection risk levels. A case is presented here for adoption of a shorter-duration reference period (i.e., daily) infection probability target over which to assess, report and benchmark such risks. A daily infection probability benchmark may provide added incentive and guidance for exercising control over short-term adverse risk fluctuation events and their causes. Management planning could involve outlining measures so that the daily target is met under a variety of pre-identified event scenarios. Other benefits of a daily target could include providing a platform for managers to design and assess management initiatives, as well as simplifying the technical components of the risk assessment process.
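
    The annual-to-daily conversion implied here is simple to state. A worked example (Python), assuming independent daily risks:

        annual_target = 1e-4                          # reference annual target
        daily_target = 1 - (1 - annual_target) ** (1 / 365)
        print(f"daily target ~ {daily_target:.2e}")   # ~2.7e-7 per day

        # Consistency check: a year at the daily target recovers ~1e-4.
        print(1 - (1 - daily_target) ** 365)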

  10. Probabilistic Risk Assessment of a Turbine Disk

    NASA Astrophysics Data System (ADS)

    Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted

    Current Federal Aviation Administration (FAA) rotor design certification practice performs risk assessment using a probabilistic framework focused only on the life-limiting defect location of a component. This method generates conservative approximations of the operational risk. The first section of this article covers a discretization method that allows a transition from this relative risk to an absolute risk, in which the component is discretized into regions called zones. General guidelines were established for the zone-refinement process based on the stress-gradient topology in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach, including the crack initiation life and the propagation life, while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life to crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of current approaches while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.
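
    A minimal sketch of the zone-aggregation idea (Python); independence between zones and the per-zone values are illustrative assumptions, not the article's results:

        def disk_risk(zone_risks):
            # Series-system aggregation: the disk fails if any zone fails.
            p_survive = 1.0
            for p in zone_risks:
                p_survive *= (1.0 - p)
            return 1.0 - p_survive

        # Refining zones adds terms and moves the estimate toward convergence.
        print(disk_risk([1e-6, 4e-7, 2e-7]))  # hypothetical per-zone risks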

  11. Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Dinther, Y.; Kuensch, H. R.

    2017-12-01

    Our physical understanding and forecasting ability of earthquakes, and of other solid Earth dynamic processes, is significantly hampered by limited indications of the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation - extensively developed for weather forecasting - to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept, we perform a perfect-model test in a simplified subduction zone setup, where we assimilate noisy synthetic data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This provides distinct added value with respect to using observations or numerical models separately. Although several challenges for applications to a natural setting remain, these first results indicate the large potential of data assimilation techniques for probabilistic seismic hazard assessment and other challenges in dynamic solid Earth systems.
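
    For orientation, the stochastic (perturbed-observations) Ensemble Kalman Filter analysis step used in schemes like this can be sketched generically (Python with NumPy); the state vector, observation operator, and sizes are placeholders rather than the study's seismic-cycle model:

        import numpy as np

        def enkf_update(X, d, H, R, rng=np.random.default_rng()):
            # X: (n_state, n_ens) forecast ensemble; d: (n_obs,) observations;
            # H: (n_obs, n_state) observation operator; R: obs-error covariance.
            n_obs, n_ens = len(d), X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
            P = A @ A.T / (n_ens - 1)                      # sampled covariance
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
            D = d[:, None] + rng.multivariate_normal(      # perturbed obs
                np.zeros(n_obs), R, size=n_ens).T
            return X + K @ (D - H @ X)                     # analysis ensemble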

  12. On problems of analyzing aerodynamic properties of blunted rotary bodies with small random surface distortions under supersonic and hypersonic flows

    NASA Astrophysics Data System (ADS)

    Degtyar, V. G.; Kalashnikov, S. T.; Mokin, Yu. A.

    2017-10-01

    The paper considers problems of analyzing the aerodynamic properties (ADPs) of reentry vehicles (RVs) treated as blunted rotary bodies with small random surface distortions. The interplay of mathematical simulation of surface distortions, selection of tools for predicting the ADPs of shaped bodies, evaluation of different types of ADP variations, and their adaptation for dynamic problems is analyzed. The possibilities of deterministic and probabilistic approaches to the evaluation of ADP variations are considered. The practical value of the probabilistic approach is demonstrated. Examples of extremal deterministic evaluations of ADP variations for a sphere and a sharp cone are given.

  13. Using probabilistic terrorism risk modeling for regulatory benefit-cost analysis: application to the Western hemisphere travel initiative in the land environment.

    PubMed

    Willis, Henry H; LaTourrette, Tom

    2008-04-01

    This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that halves the annualized terrorism loss would double the critical risk reduction (14-26%), and basing the results on a higher risk level that doubles the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimation of the costs of casualties from terrorism events.
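
    The break-even arithmetic behind the critical risk reduction can be made explicit (Python); the annualized regulatory cost below is a hypothetical placeholder chosen only so the output lands in the reported 7-13% range:

        annualized_cost = 0.35e9             # hypothetical WHTI-L cost, $/yr
        for loss in (2.7e9, 5.2e9):          # annualized terrorism loss range
            critical = annualized_cost / loss
            print(f"loss ${loss / 1e9:.1f}B -> critical reduction {critical:.0%}")
        # Halving the assumed loss doubles the critical reduction, and vice versa.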

  14. Cholera and shigellosis in Bangladesh: similarities and differences in population dynamics under climate forcing

    NASA Astrophysics Data System (ADS)

    Pascual, M.; Cash, B.; Reiner, R.; King, A.; Emch, M.; Yunus, M.; Faruque, A. S.

    2012-12-01

    The influence of climate variability on the population dynamics of infectious diseases is considered a large-scale, regional phenomenon and, as such, has previously been addressed for cholera with temporal models that do not incorporate fine-scale spatial structure. In our previous work, evidence for a role of ENSO (El Niño Southern Oscillation) on cholera in Bangladesh was elucidated and shown to influence the regional climate through precipitation. With a probabilistic spatial model for cholera dynamics in the megacity of Dhaka, we found that the action of climate variability (ENSO and flooding) is localized: there is a climate-sensitive urban core that acts to propagate risk to the rest of the city. Here, we consider long-term surveillance data for shigellosis, another diarrheal disease that coexists with cholera in Bangladesh. We compare the patterns of association with climate variables for these two diseases in a rural setting, as well as the spatial structure in their spatio-temporal dynamics in an urban one. Evidence for similar patterns is presented and discussed in the context of the differences in the routes of transmission of the two diseases and the proposed role of an environmental reservoir in cholera. The similarities provide evidence for a more general influence of hydrology and of socio-economic factors underlying human susceptibility and sanitary conditions.

  15. Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunsman, David Marvin; Aldemir, Tunc; Rutt, Benjamin

    2008-05-01

    This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors - analyses which are very resource intensive - more efficient. PRAs of nuclear reactors are being increasingly relied on by the United States Nuclear Regulatory Commission (U.S.N.R.C.) for licensing decisions for current and advanced reactors. Yet, PRAs are produced much as they were 20 years ago. The work here applied a modern systems analysis technique to the accident progression analysis portion of the PRA; the technique was a system-independent multi-task computer driver routine. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident progression phenomenological simulation code - the type on which APET construction and analysis relies, albeit indirectly - and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure that was created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a system software infrastructure that supports execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of the methodological tool, ADAPT was applied to quantify the likelihood of competing accident progression pathways occurring for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code. With minor coding changes to input files, ADAPT can be linked to other such codes.) The results of this demonstration indicate that the approach can significantly reduce the resources required for Level 2 PRAs. From the phenomenological viewpoint, ADAPT can also treat the associated epistemic and aleatory uncertainties. This methodology can also be used for analyses of other complex systems. Any complex system can be analyzed using ADAPT if the workings of that system can be displayed as an event tree, there is a computer code that simulates how those events could progress, and that simulator code has switches to turn on and off system events, phenomena, etc. Using and applying ADAPT to particular problems is not human independent. While the human resources for the creation and analysis of the accident progression are significantly decreased, knowledgeable analysts are still necessary for a given project to apply ADAPT successfully. This research and development effort has met its original goals and then exceeded them.
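
    In the spirit of ADAPT's dynamic event trees, the exploration loop can be sketched generically (Python); the `branch` callback standing in for a MELCOR-style simulator is hypothetical, not ADAPT's actual interface:

        def explore(root, branch, p_min=1e-6):
            # branch(scenario) advances the simulator to the next branching
            # condition and returns [(child_scenario, branch_probability), ...],
            # or [] when the accident sequence has run to completion.
            leaves, stack = [], [(root, 1.0)]
            while stack:
                scenario, prob = stack.pop()
                children = branch(scenario)
                if not children:
                    leaves.append((scenario, prob))
                    continue
                for child, p in children:
                    if prob * p >= p_min:            # prune improbable branches
                        stack.append((child, prob * p))
                    else:
                        leaves.append((child, prob * p))
            return leaves                            # end states with likelihoods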

  16. Prioritization Risk Integration Simulation Model (PRISM) For Environmental Remediation and Waste Management - 12097

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pentz, David L.; Stoll, Ralph H.; Greeves, John T.

    2012-07-01

    PRISM (Prioritization Risk Integration Simulation Model) is a computer model developed to support the Department of Energy's Office of Environmental Management (DOE-EM) in its mission to clean up the environmental legacy from the Nation's nuclear weapons materials production complex. PRISM provides a comprehensive, fully integrated planning tool that can tie together DOE-EM's projects. It is designed to help DOE managers develop sound, risk-informed business practices and defend program decisions. It provides a better ability to understand and manage programmatic risks. The underlying concept for PRISM is that DOE-EM 'owns' a portfolio of environmental legacy obligations (ELOs), and that its mission is to transform the ELOs from their current conditions to acceptable conditions, in the most effective way possible. There are many types of ELOs - contaminated soils and groundwater plumes, disused facilities awaiting D and D, and various types of wastes waiting for processing or disposal. For a given suite of planned activities, PRISM simulates the outcomes as they play out over time, allowing for all key identified uncertainties and risk factors. Each contaminated building, land area and waste stream is tracked from cradle to grave, and all of the linkages affecting different waste streams are captured. The progression of the activities is fully dynamic, reflecting DOE-EM's prioritization approaches, precedence requirements, available funding, and the consequences of risks and uncertainties. The top level of PRISM is the end-user interface that allows rapid evaluation of alternative scenarios and viewing the results in a variety of useful ways. PRISM is a fully probabilistic model, allowing the user to specify uncertainties in input data (such as the magnitude of an existing groundwater plume, or the total cost to complete a planned activity) as well as specific risk events that might occur. PRISM is based on the GoldSim software that is widely used for risk and performance assessment calculations. PRISM can be run in a deterministic mode, which quickly provides an estimate of the most likely results of a given plan. Alternatively, the model can be run probabilistically in a Monte Carlo mode, exploring the risks and uncertainties in the system and producing probability distributions for the different performance measures. The PRISM model demonstrates how EM can evaluate a portfolio of ELOs and transform them from their current conditions to acceptable conditions using different strategic approaches. This scope of work for the PRISM process and the development of a dynamic simulation model is a logical extension of the GoldSim simulation software used by the OCRWM to assess the long-term performance for the Yucca Mountain Project and by NNSA to assess project risk at its sites. Systems integration modeling will promote better understanding of all project risks, technical and nontechnical, and more defensible decision-making for complex projects with significant uncertainties. It can provide effective visual communication and rapid adaptation during interactions with stakeholders (Administration, Congress, State, Local, and NGO). It will also allow rapid assessment of alternative management approaches. (authors)

  17. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, and a minor percentage is represented by all the other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kind of potential tsunamigenic sources which characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e., pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for this town in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  18. Reconciling Streamflow Uncertainty Estimation and River Bed Morphology Dynamics. Insights from a Probabilistic Assessment of Streamflow Uncertainties Using a Reliability Diagram

    NASA Astrophysics Data System (ADS)

    Morlot, T.; Mathevet, T.; Perret, C.; Favre Pugin, A. C.

    2014-12-01

    Streamflow uncertainty estimation has recently received large attention in the literature. A dynamic rating-curve assessment method has been introduced (Morlot et al., 2014). This dynamic method computes a rating curve for each gauging and a continuous streamflow time series, while calculating streamflow uncertainties. Streamflow uncertainty takes into account many sources of uncertainty (water level, rating-curve interpolation and extrapolation, gauging aging, etc.) and produces an estimated distribution of streamflow for each day. In order to characterize streamflow uncertainty, a probabilistic framework has been applied to a large sample of hydrometric stations of the Division Technique Générale (DTG) of Électricité de France (EDF) hydrometric network in France (>250 stations). A reliability diagram (Wilks, 1995) has been constructed for some stations, based on the streamflow distribution estimated for a given day and compared to a real streamflow observation estimated via a gauging. To build a reliability diagram, we computed the probability of an observed streamflow (gauging), given the streamflow distribution. The reliability diagram then allows checking that the distribution of probabilities of non-exceedance of the gaugings follows a uniform law (i.e., quantiles should be equiprobable). Given the shape of the reliability diagram, the probabilistic calibration is characterized (underdispersion, overdispersion, bias) (Thyer et al., 2009). In this paper, we present case studies where reliability diagrams have different statistical properties for different periods. Compared to our knowledge of the river-bed morphology dynamics of these hydrometric stations, we show how the reliability diagram gives invaluable information on river-bed movements, such as continuous digging or backfilling of the hydraulic control due to erosion or sedimentation processes. Hence, careful analysis of reliability diagrams allows reconciling statistics and long-term river-bed morphology processes. This knowledge improves our real-time management of hydrometric stations, given a better characterization of erosion/sedimentation processes and of the stability of each station's hydraulic control.
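
    A hedged sketch of how such a reliability diagram is assembled (Python with NumPy/SciPy); the lognormal predictive distributions stand in for a station's estimated streamflow distributions:

        import numpy as np
        from scipy.stats import lognorm

        def pit_histogram(gauged_flows, predictive_cdfs, n_bins=10):
            # Non-exceedance probability of each gauging under that day's
            # predicted distribution; a calibrated model gives a flat histogram.
            pit = np.array([cdf(q) for q, cdf in zip(gauged_flows, predictive_cdfs)])
            counts, _ = np.histogram(pit, bins=n_bins, range=(0.0, 1.0))
            return counts / counts.sum()         # ~1/n_bins if equiprobable

        # Synthetic, well-calibrated case for illustration:
        dist = lognorm(s=0.3, scale=50.0)
        print(pit_histogram(dist.rvs(size=200), [dist.cdf] * 200))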

  19. Deep Uncertainty Surrounding Coastal Flood Risk Projections: A Case Study for New Orleans

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Keller, Klaus

    2017-10-01

    Future sea-level rise drives severe risks for many coastal communities. Strategies to manage these risks hinge on a sound characterization of the uncertainties. For example, recent studies suggest that large fractions of the Antarctic ice sheet (AIS) may rapidly disintegrate in response to rising global temperatures, leading to potentially several meters of sea-level rise during the next few centuries. It is deeply uncertain, for example, whether such an AIS disintegration will be triggered, how much this would increase sea-level rise, whether extreme storm surges intensify in a warming climate, or which emissions pathway future societies will choose. Here, we assess the impacts of these deep uncertainties on projected flooding probabilities for a levee ring in New Orleans, LA. We use 18 scenarios, presenting probabilistic projections within each one, to sample key deeply uncertain future projections of sea-level rise, radiative forcing pathways, storm surge characterization, and contributions from rapid AIS mass loss. The implications of these deep uncertainties for projected flood risk are thus characterized by a set of 18 probability distribution functions. We use a global sensitivity analysis to assess which mechanisms contribute to uncertainty in projected flood risk over the course of a 50-year design life. In line with previous work, we find that the uncertain storm surge drives the most substantial risk, followed by general AIS dynamics, in our simple model for future flood risk for New Orleans.

  20. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  1. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  2. Probabilistic framework for the estimation of the adult and child toxicokinetic intraspecies uncertainty factors.

    PubMed

    Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S

    2003-12-01

    Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population, which is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and child intraspecies UFs with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
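
    The headline statistic, the ratio of the 95th to the 50th percentile of target-tissue dose, is easy to reproduce in outline (Python); the lognormal dose distribution below is an illustrative stand-in for the output of a probabilistic toxicokinetic simulation:

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-in for the simulated annual-average target-tissue dose:
        tissue_dose = rng.lognormal(mean=0.0, sigma=0.6, size=100_000)

        p50, p95 = np.percentile(tissue_dose, [50, 95])
        uf_h_tk = p95 / p50       # data-derived intraspecies toxicokinetic UF
        print(f"UFH-TK ~ {uf_h_tk:.2f} (default toxicokinetic half of 10x: 3.2)")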

  3. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems, and failure data. This methodology essentially validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate, or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process where the PRA model can be used to determine whether a mitigation technique is effective in reducing risk, resulting in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, helping to deliver a safe and cost-effective product.

  4. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  5. 75 FR 78777 - Advisory Committee On Reactor Safeguards; Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... Committee includes individuals experienced in reactor operations, management; probabilistic risk assessment...: December 10, 2010. Andrew L. Bates, Advisory Committee Management Officer. [FR Doc. 2010-31590 Filed 12-15...

  6. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference approach in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  7. Estimating the risk of Amazonian forest dieback.

    PubMed

    Rammig, Anja; Jupp, Tim; Thonicke, Kirsten; Tietjen, Britta; Heinke, Jens; Ostberg, Sebastian; Lucht, Wolfgang; Cramer, Wolfgang; Cox, Peter

    2010-08-01

    Climate change will very likely affect most forests in Amazonia during the course of the 21st century, but the direction and intensity of the change are uncertain, in part because of differences in rainfall projections. In order to constrain this uncertainty, we estimate the probability for biomass change in Amazonia on the basis of rainfall projections that are weighted by climate model performance for current conditions. We estimate the risk of forest dieback by using weighted rainfall projections from 24 general circulation models (GCMs) to create probability density functions (PDFs) for future forest biomass changes simulated by a dynamic vegetation model (LPJmL). Our probabilistic assessment of biomass change suggests a likely shift towards increasing biomass compared with nonweighted results. Biomass estimates range between a gain of 6.2 and a loss of 2.7 kg carbon m^-2 for the Amazon region, depending on the strength of CO2 fertilization. The uncertainty associated with the long-term effect of CO2 is much larger than that associated with precipitation change. This underlines the importance of reducing uncertainties in the direct effects of CO2 on tropical ecosystems.
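
    The performance-weighting step can be sketched compactly (Python); the per-GCM biomass changes and skill scores below are random placeholders, not the study's values:

        import numpy as np

        rng = np.random.default_rng(3)
        delta_biomass = rng.normal(1.5, 2.0, size=24)  # change per GCM, kg C m^-2
        skill = rng.random(24)                         # model-performance scores
        weights = skill / skill.sum()

        # Performance-weighted PDF of biomass change via weighted resampling:
        samples = rng.choice(delta_biomass, size=100_000, p=weights)
        print(np.percentile(samples, [5, 50, 95]))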

  8. Dynamic Models Applied to Landslides: Study Case Angangueo, MICHOACÁN, MÉXICO.

    NASA Astrophysics Data System (ADS)

    Torres Fernandez, L.; Hernández Madrigal, V. M., , Dr; Capra, L.; Domínguez Mota, F. J., , Dr

    2017-12-01

    Most existing models for landslide zonation are of a static type and do not consider the dynamic behavior of the triggering factor. This results in a limited representation of the actual zonation of slope instability: such models have short-term validity, cannot be applied to the design of early warning systems, etc. In Mexico in particular, these models are static because they do not consider triggering factors such as precipitation. In this work, we present a numerical evaluation of landslide susceptibility based on probabilistic methods. It relies on time series generated from meteorological stations; given the limited information available, an interpolation is made to simulate precipitation across the zone. The resulting information is integrated in PCRaster and, in conjunction with the conditioning factors, makes it possible to generate a dynamic model. This model will be applied to landslide zonation in the municipality of Angangueo, which is characterized by frequent debris and mud flows and by translational and rotational landslides triggered by atypical precipitation, such as that recorded in 2010, which caused economic and human losses. With these models, it would be possible to generate probable scenarios that help Angangueo's population reduce risk and carry out constant resilience activities.

  9. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  10. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g., whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
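
    A toy Monte Carlo propagation of that kind (Python); the influencing factors and their spreads are invented for illustration, not ISS electrical power system values:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 20_000
        insolation = rng.normal(1361.0, 15.0, n)               # W/m^2, environment
        efficiency = rng.normal(0.14, 0.005, n)                # array efficiency
        pointing = np.cos(np.radians(rng.normal(0.0, 2.0, n)))  # tracking error

        power = insolation * efficiency * pointing * 1000.0    # W per 1000 m^2
        print(np.percentile(power, [5, 50, 95]))               # response spread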

  11. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  12. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF - one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of the annual GDP growth) and the other is fixed within each CCRAF variant and represents essential constants within a "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed based on historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As uncertainties for some variables evolve over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
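
    The two kinds of random components distinguished above can be illustrated with a toy Monte Carlo (Python); the distributions and parameters are invented, not CCRAF's fitted equations:

        import numpy as np

        rng = np.random.default_rng(7)
        n_variants, n_years = 5000, 100

        # Fixed within each variant: a "world" constant, e.g. climate sensitivity.
        climate_sensitivity = rng.triangular(1.5, 3.0, 6.0, size=n_variants)
        # Redrawn every year: fluctuation around expected GDP growth.
        gdp_growth = 0.02 + rng.normal(0.0, 0.01, size=(n_variants, n_years))

        gdp_index = np.cumprod(1.0 + gdp_growth, axis=1)[:, -1]
        print(np.percentile(climate_sensitivity, [5, 50, 95]))  # across "worlds"
        print(np.percentile(gdp_index, [5, 50, 95]))            # across variants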

  13. Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions

    NASA Technical Reports Server (NTRS)

    Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.

    2014-01-01

    Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.
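
    At its smallest, the BBN arithmetic such models rest on is a conditional probability table marginalized over its parent node. A two-node sketch (Python); the numbers are placeholders, not elicited values from the study:

        # P(automation problem) and P(mishap | automation problem state):
        p_problem = 0.02
        p_mishap_given = {True: 0.015, False: 0.001}   # conditional prob. table

        p_mishap = (p_mishap_given[True] * p_problem
                    + p_mishap_given[False] * (1 - p_problem))
        print(f"marginal mishap probability: {p_mishap:.5f}")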

  14. Cancer risk from incidental ingestion exposures to PAHs associated with coal-tar-sealed pavement

    USGS Publications Warehouse

    Williams, E. Spencer; Mahler, Barbara J.; Van Metre, Peter C.

    2012-01-01

    Recent (2009-10) studies documented significantly higher concentrations of polycyclic aromatic hydrocarbons (PAHs) in settled house dust in living spaces and soil adjacent to parking lots sealed with coal-tar-based products. To date, no studies have examined the potential human health effects of PAHs from these products in dust and soil. Here we present the results of an analysis of potential cancer risk associated with incidental ingestion exposures to PAHs in settings near coal-tar-sealed pavement. Exposures to benzo[a]pyrene equivalents were characterized across five scenarios. The central tendency estimate of excess cancer risk resulting from lifetime exposures to soil and dust from nondietary ingestion in these settings exceeded 1 × 10^-4, as determined using deterministic and probabilistic methods. Soil was the primary driver of risk, but according to probabilistic calculations, reasonable maximum exposure to affected house dust in the first 6 years of life was sufficient to generate an estimated excess lifetime cancer risk of 6 × 10^-5. Our results indicate that the presence of coal-tar-based pavement sealants is associated with significant increases in estimated excess lifetime cancer risk for nearby residents. Much of this calculated excess risk arises from exposures to PAHs in early childhood (i.e., 0-6 years of age).
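
    For context, the standard deterministic form of an incidental-ingestion risk estimate looks like the following (Python); the exposure-factor values are generic textbook placeholders, not the study's probabilistic inputs:

        # ILCR = C * IR * EF * ED * CSF / (BW * AT)
        C = 40.0          # B[a]P equivalents in soil/dust, mg/kg (hypothetical)
        IR = 100e-6       # ingestion rate, kg/day (100 mg/day, child)
        EF = 350.0        # exposure frequency, days/year
        ED = 6.0          # exposure duration, years (ages 0-6)
        CSF = 7.3         # oral cancer slope factor for B[a]P, (mg/kg-day)^-1
        BW = 15.0         # body weight, kg
        AT = 70.0 * 365   # averaging time, days (lifetime)

        ilcr = C * IR * EF * ED * CSF / (BW * AT)
        print(f"excess lifetime cancer risk ~ {ilcr:.1e}")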

  15. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    NASA Technical Reports Server (NTRS)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of magnitude less computational effort. Both free- and forced-response analyses have been performed, and the results indicate that, while there is considerable room for improvement, the method produces usable and more representative solutions for the design of realistic structures with a substantial savings in computer time.

  16. Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMiT) after wildfires

    Treesearch

    P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner

    2011-01-01

    The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...

  17. Spatially explicit risk assessment of an estuarine fish in Barataria Bay, Louisiana, following the Deepwater Horizon Oil spill: evaluating tradeoffs in model complexity and parsimony

    EPA Science Inventory

    As ecological risk assessments (ERA) move beyond organism-based determinations towards probabilistic population-level assessments, model complexity must be evaluated against the goals of the assessment, the information available to parameterize components with minimal dependence ...

  18. 77 FR 38856 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... discussion on defense-in-depth. Specifically, the SRM stated, Because the statements in Regulatory Guide 1... language to assure that the defense-in-depth philosophy is interpreted and implemented consistently. To the extent that other regulatory guidance refers to defense in depth, the relevant documents should be...

  19. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... revise the discussion on defense-in-depth. Specifically, the SRM stated, Because the statements in... precise language to assure that the defense-in-depth philosophy is interpreted and implemented consistently. To the extent that other regulatory guidance refers to defense in depth, the relevant documents...

  20. Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yau, M.; Motamed, M.; Guarro, S.

    2006-07-01

    Current Probabilistic Risk Assessment (PRA) methodology is well established for analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model.

  1. Impaired risk evaluation in people with Internet gaming disorder: fMRI evidence from a probability discounting task.

    PubMed

    Lin, Xiao; Zhou, Hongli; Dong, Guangheng; Du, Xiaoxia

    2015-01-02

    This study examined how subjects with Internet gaming disorder (IGD) modulate reward and risk at the neural level during a probability-discounting task, using functional magnetic resonance imaging (fMRI). Behavioral and imaging data were collected from 19 IGD subjects (22.2 ± 3.08 years) and 21 healthy controls (HC, 22.8 ± 3.5 years). Behavioral results showed that IGD subjects preferred the probabilistic options to the fixed ones and showed shorter reaction times, compared with HC. The fMRI results revealed that IGD subjects showed decreased activation in the inferior frontal gyrus and the precentral gyrus when choosing the probabilistic options, relative to HC. Correlations were also calculated between behavioral performance and brain activity in relevant brain regions. Both the behavioral performance and the fMRI results indicate that people with IGD show impaired risk evaluation, which might be the reason why IGD subjects continue playing online games despite the widely known risks of negative consequences. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Prediction of the Arctic Oscillation in Boreal Winter by Dynamical Seasonal Forecasting Systems

    NASA Technical Reports Server (NTRS)

    Kang, Daehyun; Lee, Myong-In; Im, Jungho; Kim, Daehyun; Kim, Hye-Mi; Kang, Hyun-Suk; Schubert, Siegfried D.; Arribas, Alberto; MacLachlan, Craig

    2013-01-01

    This study assesses the prediction skill of the boreal winter Arctic Oscillation (AO) in state-of-the-art dynamical ensemble prediction systems (EPSs): the UKMO GloSea4, the NCEP CFSv2, and the NASA GEOS-5. Long-term reforecasts made with the EPSs are used to evaluate representations of the AO and to examine skill scores for deterministic and probabilistic forecasts of the AO index. The reforecasts reproduce the observed changes in the large-scale patterns of Northern Hemisphere surface temperature, upper-level wind, and precipitation according to the AO phase. Results demonstrate that all EPSs have better prediction skill than the persistence prediction for lead times of up to three months, suggesting a great potential for skillful prediction of the AO and the associated climate anomalies on seasonal time scales. It is also found that the deterministic and probabilistic forecast skill of the AO in the recent period (1997-2010) is higher than that in the earlier period (1983-1996).
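
    The two kinds of skill scores mentioned can be computed as sketched below on synthetic data: an anomaly correlation for the deterministic forecast of the AO index and a Brier score for the probabilistic forecast of a positive AO phase. The verification period length, ensemble size and error statistics are arbitrary assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic verification set: 28 winters of an observed AO index and a
        # 15-member ensemble reforecast with assumed error statistics.
        obs = rng.normal(0.0, 1.0, 28)
        ens = obs + rng.normal(0.0, 0.8, size=(15, 28))

        # Deterministic skill: correlation of the ensemble-mean AO index with observations.
        det_skill = np.corrcoef(ens.mean(axis=0), obs)[0, 1]

        # Probabilistic skill: Brier score for the binary event "positive AO phase".
        p_event = (ens > 0.0).mean(axis=0)
        brier = np.mean((p_event - (obs > 0.0)) ** 2)

        print(f"anomaly correlation = {det_skill:.2f}, Brier score = {brier:.3f}")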

  3. A probabilistic, distributed, recursive mechanism for decision-making in the brain

    PubMed Central

    Gurney, Kevin N.

    2018-01-01

    Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077

  4. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response-based rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
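
    The sketch below illustrates step (c) on assumed inputs: correlated component demands (as if summarized from response-history analyses) are compared against lognormal response-based fragilities, so that the correlation between component responses carries through to the joint damage probability. All medians, dispersions and the correlation coefficient are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50_000

        # Assumed response-based fragility: lognormal component capacity (median, log-std).
        theta, beta = 1.5, 0.4                        # capacity in g -- illustrative

        # Correlated demands on two components, jointly lognormal here for simplicity.
        mu  = np.log([0.9, 0.8])
        cov = [[0.3**2, 0.8 * 0.3 * 0.35],
               [0.8 * 0.3 * 0.35, 0.35**2]]
        demand = np.exp(rng.multivariate_normal(mu, cov, size=n))   # g

        capacity = rng.lognormal(np.log(theta), beta, size=(n, 2))
        damaged = demand > capacity                   # sampled damage state per component

        print(f"P(component 1 damaged) = {damaged[:, 0].mean():.4f}")
        print(f"P(both damaged)        = {damaged.all(axis=1).mean():.4f}")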

  5. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil

    PubMed Central

    Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-01-01

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315
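
    The hit-rate comparison reported above follows from an ordinary 2 × 2 contingency table over microregions, with hit rate = hits / (hits + misses). A minimal sketch on synthetic forecasts and observations (the region count and forecast accuracy are arbitrary assumptions):

        import numpy as np

        rng = np.random.default_rng(3)
        n_regions = 560                    # arbitrary number of microregions

        # Synthetic observed high-risk events and categorical forecasts (1 = high risk).
        obs  = rng.integers(0, 2, n_regions)
        fcst = np.where(rng.random(n_regions) < 0.7, obs, 1 - obs)  # right ~70% of the time

        hits        = np.sum((fcst == 1) & (obs == 1))
        misses      = np.sum((fcst == 0) & (obs == 1))
        falarms     = np.sum((fcst == 1) & (obs == 0))
        correct_neg = np.sum((fcst == 0) & (obs == 0))

        print(f"hit rate         = {hits / (hits + misses):.2f}")
        print(f"false-alarm rate = {falarms / (falarms + correct_neg):.2f}")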

  6. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes a spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objective of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, and ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects to formulate the probabilistic risk assessment (PRA). As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform the PRA relies on the application of a component failure database to quantify potential mission success risks. To ensure mission success efficiently, at low cost and on a tight schedule, traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform the PRA for the large LRO system, with more than 14,000 piece parts and over 120 purchased or contractor-built components.

  7. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers to effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging with existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR(Log) when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is a common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; and 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the model output forecasted scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission; the suspense should instead act as a catalyst for overcoming the remaining challenges.

  9. A Targeted Health Risk Assessment Following the Deepwater Horizon Oil Spill: Polycyclic Aromatic Hydrocarbon Exposure in Vietnamese-American Shrimp Consumers

    PubMed Central

    Frickel, Scott; Nguyen, Daniel; Bui, Tap; Echsner, Stephen; Simon, Bridget R.; Howard, Jessi L.; Miller, Kent; Wickliffe, Jeffrey K.

    2014-01-01

    Background: The Deepwater Horizon oil spill of 2010 prompted concern about health risks among seafood consumers exposed to polycyclic aromatic hydrocarbons (PAHs) via consumption of contaminated seafood. Objective: The objective of this study was to conduct population-specific probabilistic health risk assessments based on consumption of locally harvested white shrimp (Litopenaeus setiferus) among Vietnamese Americans in southeast Louisiana. Methods: We conducted a survey of Vietnamese Americans in southeast Louisiana to evaluate shrimp consumption, preparation methods, and body weight among shrimp consumers in the disaster-impacted region. We also collected and chemically analyzed locally harvested white shrimp for 81 individual PAHs. We combined the PAH levels (with accepted reference doses) found in the shrimp with the survey data to conduct Monte Carlo simulations for probabilistic noncancer health risk assessments. We also conducted probabilistic cancer risk assessments using relative potency factors (RPFs) to estimate cancer risks from the intake of PAHs from white shrimp. Results: Monte Carlo simulations were used to generate hazard quotient distributions for noncancer health risks, reported as mean ± SD, for naphthalene (1.8 × 10⁻⁴ ± 3.3 × 10⁻⁴), fluorene (2.4 × 10⁻⁵ ± 3.3 × 10⁻⁵), anthracene (3.9 × 10⁻⁶ ± 5.4 × 10⁻⁶), pyrene (3.2 × 10⁻⁵ ± 4.3 × 10⁻⁵), and fluoranthene (1.8 × 10⁻⁴ ± 3.3 × 10⁻⁴). A cancer risk distribution, based on RPF-adjusted PAH intake, was also generated (2.4 × 10⁻⁷ ± 3.9 × 10⁻⁷). Conclusions: The risk assessment results show no acute health risks or excess cancer risk associated with consumption of shrimp containing the levels of PAHs detected in our study, even among frequent shrimp consumers. Citation: Wilson MJ, Frickel S, Nguyen D, Bui T, Echsner S, Simon BR, Howard JL, Miller K, Wickliffe JK. 2015. A targeted health risk assessment following the Deepwater Horizon Oil Spill: polycyclic aromatic hydrocarbon exposure in Vietnamese-American shrimp consumers. Environ Health Perspect 123:152–159; http://dx.doi.org/10.1289/ehp.1408684 PMID:25333566
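
    A minimal sketch of the noncancer branch of such an assessment is shown below: the hazard quotient is the average daily dose divided by the reference dose, HQ = (C × IR / BW) / RfD, propagated by Monte Carlo. The concentration, intake and body-weight distributions are assumptions for illustration; the reference dose shown is the commonly cited oral value for naphthalene.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000

        # Illustrative (assumed) inputs for a single PAH, e.g. naphthalene.
        C   = rng.lognormal(np.log(5.0), 0.8, n)    # concentration in shrimp, ug/kg
        IR  = rng.lognormal(np.log(40.0), 0.6, n)   # shrimp intake, g/day
        BW  = rng.normal(65.0, 12.0, n)             # body weight, kg
        RfD = 0.02                                  # oral reference dose, mg/kg-day

        ADD = C * IR * 1e-6 / BW                    # average daily dose, mg/kg-day
        HQ  = ADD / RfD                             # hazard quotient distribution

        print(f"HQ mean ± SD: {HQ.mean():.1e} ± {HQ.std():.1e}")
        print(f"P(HQ > 1)   = {(HQ > 1.0).mean():.4f}")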

  10. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1, ..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{w : X(w) < x}. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
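
    In symbols, the identification used above reads as follows (a minimal restatement of the abstract's definitions, with w_i the i-th letter of w):

        \[
          X(w) = \sum_{i \ge 1} w_i \, k^{-i},
          \qquad
          F(x) := \mathrm{Prob}_M\{\, w : X(w) < x \,\}.
        \]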

  11. A probabilistic approach for mine burial prediction

    NASA Astrophysics Data System (ADS)

    Barbu, Costin; Valent, Philip; Richardson, Michael; Abelev, Andrei; Plant, Nathaniel

    2004-09-01

    Predicting the degree of burial of mines in soft sediments is one of the main concerns of Naval Mine CounterMeasures (MCM) operations. This is a difficult problem to solve due to uncertainties and variability of the sediment parameters (i.e., density and shear strength) and of the mine state at contact with the seafloor (i.e., vertical and horizontal velocity, angular rotation rate, and pitch angle at the mudline). A stochastic approach is proposed in this paper to better incorporate the dynamic nature of free-falling cylindrical mines in the modeling of impact burial. The orientation, trajectory and velocity of cylindrical mines, after about 4 meters free-fall in the water column, are very strongly influenced by boundary layer effects causing quite chaotic behavior. The model's convolution of the uncertainty through its nonlinearity is addressed by employing Monte Carlo simulations. Finally a risk analysis based on the probability of encountering an undetectable mine is performed.

  12. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, PNNL Yuri; Subbarao, PNNL Kris

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty, including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.

  13. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site and to limit or eliminate damage during earthquake-induced liquefaction is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, in evaluating liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  14. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary W.

    2014-01-01

    The International Space Station (ISS) end-of-life (EOL) cycle is currently scheduled for 2020, although there are ongoing efforts to extend the ISS life cycle through 2028. The EOL of the ISS will require deorbiting the station. This will be the largest manmade object ever to be de-orbited; safely deorbiting the station will therefore be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered to accomplish this, such as target corridors, orbits, altitude, drag, and maneuvering capabilities. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the de-orbit maneuvering.

  15. Potential Impacts of Accelerated Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. R.; Vail, L. W.

    2016-05-31

    This research project is part of the U.S. Nuclear Regulatory Commission’s (NRC’s) Probabilistic Flood Hazard Assessment (PFHA) Research plan in support of developing a risk-informed licensing framework for flood hazards and design standards at proposed new facilities and significance determination tools for evaluating potential deficiencies related to flood protection at operating facilities. The PFHA plan aims to build upon recent advances in deterministic, probabilistic, and statistical modeling of extreme precipitation events to develop regulatory tools and guidance for NRC staff with regard to PFHA for nuclear facilities. The tools and guidance developed under the PFHA plan will support and enhance NRC’s capacity to perform thorough and efficient reviews of license applications and license amendment requests. They will also support risk-informed significance determination of inspection findings, unusual events, and other oversight activities.

  16. Distributed collaborative response surface method for mechanical dynamic assembly reliability design

    NASA Astrophysics Data System (ADS)

    Bai, Guangchen; Fei, Chengwei

    2013-11-01

    Because of the randomness of the many impact factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to account for this randomness from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, the mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Through comparison of the DCRSM, the traditional response surface method (RSM) and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish the computational task that is impossible for the other methods when the number of simulations exceeds 100,000, but also that its computational precision is basically consistent with the MCM and improved by 0.40–4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is about 188 times that of the MCM and 55 times that of the RSM under 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
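
    The core idea of a response surface method, of which the DCRSM is a distributed collaborative variant, can be sketched in a few lines: fit a quadratic surrogate to a handful of runs of an expensive model, then run the Monte Carlo on the cheap surrogate. The toy model, sample sizes and threshold below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(6)

        # Toy "expensive" model standing in for one distributed sub-model of the
        # assembly relationship (e.g., one contributor to blade-tip clearance).
        def expensive_model(x1, x2):
            return 1.2 + 0.8 * x1 - 0.5 * x2 + 0.3 * x1 * x2 + 0.1 * x1**2

        # 1) Fit a quadratic response surface from a small design of experiments.
        m = 60
        X1, X2 = rng.normal(0, 1, m), rng.normal(0, 1, m)
        A = np.column_stack([np.ones(m), X1, X2, X1**2, X2**2, X1 * X2])
        coef, *_ = np.linalg.lstsq(A, expensive_model(X1, X2), rcond=None)

        # 2) Run the Monte Carlo on the cheap surrogate instead of the expensive model.
        n = 1_000_000
        x1, x2 = rng.normal(0, 1, n), rng.normal(0, 1, n)
        y = np.column_stack([np.ones(n), x1, x2, x1**2, x2**2, x1 * x2]) @ coef
        print(f"P(response > 3.0) = {(y > 3.0).mean():.2e}")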

  17. Dynamic all-red extension at signalized intersection : probabilistic modeling and algorithm.

    DOT National Transportation Integrated Search

    2011-01-01

    Red light running (RLR) has been a major cause of intersection injuries and fatalities in the United States. In 2004 alone, there were 8,619 fatal crashes and 848,000 crashes with people injured, all caused by RLR. Under the U.S. Department of Transpor...

  18. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  19. Methods for combining payload parameter variations with input environment. [calculating design limit loads compatible with probabilistic structural design criteria

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.

    1976-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
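
    A minimal numerical rendering of these definitions, with all load statistics assumed for illustration: the limit load is the maximum load occurring in a mission, and the design limit load is a chosen percentile of its distribution.

        import numpy as np

        rng = np.random.default_rng(5)
        n_missions, n_peaks = 20_000, 200

        # Assumed setup: load peaks within a mission are i.i.d. draws; the mission
        # limit load is their maximum, the object of extreme-value theory.
        peaks = rng.normal(100.0, 10.0, size=(n_missions, n_peaks))   # kN, illustrative
        limit_load = peaks.max(axis=1)   # sampled distribution of the mission maximum

        # Design limit load = a chosen percentile of the random limit load.
        print(f"median limit load:            {np.median(limit_load):.1f} kN")
        print(f"99th-percentile design value: {np.percentile(limit_load, 99):.1f} kN")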

  20. Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as examples of a widely used data-driven classification/modeling strategy.

  1. High-Fidelity Multi-Rotor Unmanned Aircraft System Simulation Development for Trajectory Prediction Under Off-Nominal Flight Dynamics

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Hartman, David C.

    2017-01-01

    The NASA Unmanned Aircraft System (UAS) Traffic Management (UTM) project is conducting research to enable civilian low-altitude airspace and UAS operations. A goal of this project is to develop probabilistic methods to quantify risk during failures and off-nominal flight conditions. An important part of this effort is the reliable prediction of feasible trajectories during off-nominal events such as control failure, atmospheric upsets, or navigation anomalies that can cause large deviations from the intended flight path or extreme vehicle upsets beyond the normal flight envelope. Few examples of high-fidelity modeling and prediction of off-nominal behavior for small UAS (sUAS) vehicles exist, and modeling requirements for accurately predicting flight dynamics for out-of-envelope or failure conditions are essentially undefined. In addition, the broad range of sUAS aircraft configurations already being fielded presents a significant modeling challenge, as these vehicles are often very different from one another and are likely to possess dramatically different flight dynamics and resultant trajectories and may require different modeling approaches to capture off-nominal behavior. NASA has undertaken an extensive research effort to define sUAS flight dynamics modeling requirements and develop preliminary high-fidelity six degree-of-freedom (6-DOF) simulations capable of more closely predicting off-nominal flight dynamics and trajectories. This research has included a literature review of existing sUAS modeling and simulation work as well as development of experimental testing methods to measure and model key components of propulsion, airframe and control characteristics. The ultimate objective of these efforts is to develop tools to support UTM risk analyses and for the real-time prediction of off-nominal trajectories for use in the UTM Risk Assessment Framework (URAF). This paper focuses on modeling and simulation efforts for a generic quad-rotor configuration typical of many commercial vehicles in use today. An overview of relevant off-nominal multi-rotor behaviors will be presented to define modeling goals and to identify the prediction capability lacking in simplified models of multi-rotor performance. A description of recent NASA wind tunnel testing of multi-rotor propulsion and airframe components will be presented illustrating important experimental and data acquisition methods, and a description of preliminary propulsion and airframe models will be presented. Lastly, examples of predicted off-nominal flight dynamics and trajectories from the simulation will be presented.

  2. The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.

    2010-01-01

    HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.

  3. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. The method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately, so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
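
    The separation of variability and uncertainty described above is commonly implemented as a nested (two-dimensional) Monte Carlo, sketched below on invented numbers: the outer loop samples uncertain distribution parameters, the inner loop samples variability given those parameters, and the spread of inner-loop results across the outer loop is the part of the variation attributable to uncertainty.

        import numpy as np

        rng = np.random.default_rng(8)
        n_unc, n_var = 200, 2_000   # outer loop = uncertainty, inner loop = variability

        p_exceed = np.empty(n_unc)
        for i in range(n_unc):
            # Outer loop: sample uncertain (poorly known) distribution parameters.
            mu_exp  = rng.normal(np.log(0.5), 0.3)   # log-median exposure concentration
            mu_crit = rng.normal(np.log(5.0), 0.3)   # log-median critical effect concentration
            # Inner loop: sample environmental/species variability given those parameters.
            ec  = rng.lognormal(mu_exp, 0.5, n_var)
            cec = rng.lognormal(mu_crit, 0.5, n_var)
            p_exceed[i] = np.mean(ec / cec > 1.0)    # fraction with concentration ratio > 1

        # Spread across the outer loop is the variation attributable to uncertainty.
        print(f"P(CR > 1): median {np.median(p_exceed):.3f}, "
              f"90% interval [{np.percentile(p_exceed, 5):.3f}, "
              f"{np.percentile(p_exceed, 95):.3f}]")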

  4. Subsea release of oil from a riser: an ecological risk assessment.

    PubMed

    Nazir, Muddassir; Khan, Faisal; Amyotte, Paul; Sadiq, Rehan

    2008-10-01

    This study illustrates a newly developed methodology, as a part of the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater release of oil and gas. It combines the hydrodynamics of underwater blowout, weathering algorithms, and multimedia fate and transport to measure the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties are accounted for in multimedia input parameters in the analysis. The 95th percentile of the exposure concentration (EC(95%)) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is utilized to characterize EC(95%) and associated uncertainty. The toxicity data of 19 species available in the literature are used to calculate the 5th percentile of the predicted no observed effect concentration (PNEC(5%)) by employing the bootstrapping method. The risk is characterized by transforming the risk quotient (RQ), which is the ratio of EC(95%) to PNEC(5%), into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision-making viewpoints. Two case studies of underwater oil and gas mixture release, and oil release with no gaseous mixture are used to show the systematic implementation of the methodology, elements of ERA, and the probabilistic method in assessing and characterizing the risk.
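
    The bootstrapping step described above can be sketched as resampling each data set with replacement and recomputing the percentile of interest, giving uncertainty distributions for EC(95%) and PNEC(5%) and hence for the risk quotient. All input values below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(10)

        # Illustrative inputs: simulated exposure concentrations (e.g., naphthalene,
        # ug/L) and literature toxicity endpoints for 19 species -- all values assumed.
        exposure = rng.lognormal(np.log(2.0), 0.9, 500)
        tox_19sp = rng.lognormal(np.log(80.0), 1.1, 19)

        def bootstrap_pct(sample, pct, n_boot=5_000):
            """Bootstrap distribution of the pct-th percentile of `sample`."""
            idx = rng.integers(0, len(sample), size=(n_boot, len(sample)))
            return np.percentile(sample[idx], pct, axis=1)

        EC95  = bootstrap_pct(exposure, 95)   # 95th percentile exposure, with uncertainty
        PNEC5 = bootstrap_pct(tox_19sp, 5)    # 5th percentile of species sensitivity
        RQ    = EC95 / PNEC5                  # risk quotient distribution

        print(f"median RQ = {np.median(RQ):.2f}, P(RQ > 1) = {(RQ > 1.0).mean():.3f}")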

  5. Environmental risk assessment of white phosphorus from the use of munitions - a probabilistic approach.

    PubMed

    Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager

    2010-03-15

    White phosphorus (P4) is a highly toxic compound used in various pyrotechnic products. Munitions containing P4 are widely used in military training areas, where the unburned products of P4 contaminate soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P4 in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P4 by ingestion, for example. The current approach to assess the environmental risk from the use of P4 suggests a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P4. Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool, in which the risk of P4 poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P4 was detected in several craters on the range at concentrations up to 5.7 g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.
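
    A toy version of such a Bayesian network, with invented probabilities, is sketched below: the heterogeneous spatial distribution of P4 enters as a low probability of foraging in a contaminated crater, and the poisoning probability follows by enumeration over the parent variable.

        # Tiny discrete Bayesian-network sketch of the abstract's idea: the risk of P4
        # poisoning for a grazing animal depends on whether it forages in a contaminated
        # crater (heterogeneous spatial distribution) and then ingests P4-laden soil.
        # All probabilities are invented for illustration.
        p_crater = 0.02                           # forages in a contaminated crater
        p_ingest = {True: 0.30, False: 0.01}      # ingests P4-laden soil, given location
        p_poison_given_ingest = 0.50              # poisoning, given ingestion

        # Exact inference by enumeration over the parent variable.
        p_poison = sum(
            (p_crater if crater else 1.0 - p_crater)
            * p_ingest[crater] * p_poison_given_ingest
            for crater in (True, False)
        )
        print(f"P(poisoning) = {p_poison:.4f}")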

  6. Sparsity enabled cluster reduced-order models for control

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
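
    The CROM construction itself, clustering snapshots and counting cluster-to-cluster transitions to discretize the Perron-Frobenius operator, can be sketched compactly; the synthetic limit-cycle data, cluster count and forecast horizon below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(9)

        # Synthetic snapshot trajectory on a noisy limit cycle (stand-in for flow data).
        t = np.linspace(0.0, 40.0 * np.pi, 4_000)
        X = np.column_stack([np.cos(t), np.sin(t)]) + 0.1 * rng.normal(size=(4_000, 2))

        # 1) Discretize state space into k clusters (bare-bones k-means).
        k = 6
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(50):
            labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
            centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])

        # 2) Count cluster-to-cluster transitions: a coarse, data-driven discretization
        #    of the Perron-Frobenius operator, as in CROM.
        P = np.zeros((k, k))
        for a, b in zip(labels[:-1], labels[1:]):
            P[a, b] += 1.0
        row = P.sum(axis=1, keepdims=True)
        P = P / np.where(row == 0.0, 1.0, row)

        # Probabilistic forecast: propagate a cluster-probability vector 10 steps ahead.
        p0 = np.eye(k)[labels[0]]
        print(np.round(p0 @ np.linalg.matrix_power(P, 10), 3))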

  7. APPLICATION AND EVALUATION OF AN AGGREGATE PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL FOR QUANTIFYING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS

    EPA Science Inventory

    Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...

  8. 77 FR 58590 - Determining Technical Adequacy of Probabilistic Risk Assessment for Risk-Informed License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... reactors or for activities associated with review of applications for early site permits and combined licenses (COL) for the Office of New Reactors (NRO). DATES: The effective date of this SRP update is... Rulemaking Web Site: Go to http://www.regulations.gov and search for Docket ID NRC-2010-0138. Address...

  9. Validation of a probabilistic post-fire erosion model

    Treesearch

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller

    2016-01-01

    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  10. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state

    PubMed Central

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-01-01

    Background Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning — the “jumping to conclusion” bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). Methods We administered the classical beads task and fMRI among ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks request probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and the process of final decision-making. Results We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate JTC bias on the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Limitations Owing to the cross-sectional design of the study, results are constrained to a better insight into the neurobiology of risk constellations, but not pre-psychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. Conclusion As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk. PMID:25622039

  11. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state.

    PubMed

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-05-01

    Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning - the "jumping to conclusion" bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). We administered the classical beads task and fMRI among ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks request probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and the process of final decision-making. We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate JTC bias on the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Owing to the cross-sectional design of the study, results are constrained to a better insight into the neurobiology of risk constellations, but not pre-psychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk.

  12. Mode identification using stochastic hybrid models with applications to conflict detection and resolution

    NASA Astrophysics Data System (ADS)

    Naseri Kouzehgarani, Asal

    2009-12-01

    Most models of aircraft trajectories are non-linear and stochastic in nature, and their internal parameters are often poorly defined. The ability to model, simulate and analyze realistic air traffic management conflict detection scenarios in a scalable, composable, multi-aircraft fashion is an extremely difficult endeavor. Accurate techniques for aircraft mode detection are critical in order to enable the precise projection of aircraft conflicts, and for the enactment of altitude separation resolution strategies. Conflict detection is an inherently probabilistic endeavor; our ability to detect conflicts in a timely and accurate manner over a fixed time horizon is traded off against the increased human workload created by false alarms, that is, situations that would not develop into an actual conflict, or would resolve naturally in the appropriate time horizon, thereby introducing a measure of probabilistic uncertainty in any decision aid fashioned to assist air traffic controllers. The interaction of the continuous dynamics of the aircraft, used for prediction purposes, with the discrete conflict detection logic gives rise to the hybrid nature of the overall system. The introduction of the probabilistic element, common to decision alerting and aiding devices, places the conflict detection and resolution problem in the domain of probabilistic hybrid phenomena. A hidden Markov model (HMM) has two stochastic components: a finite-state Markov chain and a finite set of output probability distributions; in other words, an unobservable (hidden) stochastic process that can only be observed through another set of stochastic processes that generate the sequence of observations. The problem of self-separation in distributed air traffic management reduces to the ability of aircraft to communicate state information to neighboring aircraft, as well as to model the evolution of aircraft trajectories between communications, in the presence of probabilistically uncertain dynamics as well as partially observable and uncertain data. We introduce the Hybrid Hidden Markov Modeling (HHMM) formalism to enable the prediction of the stochastic aircraft states (and thus, potential conflicts), by combining elements of the probabilistic timed input output automaton and the partially observable Markov decision process frameworks, along with the novel addition of a Markovian scheduler to remove the non-deterministic elements arising from the enabling of several actions simultaneously. Comparisons of aircraft in level, climbing/descending and turning flight are performed, and unknown flight track data are evaluated probabilistically against the tuned model in order to assess the effectiveness of the model in detecting the switch between multiple flight modes for a given aircraft. This also allows for the generation of a probabilistic distribution over the execution traces of the hybrid hidden Markov model, which then enables the prediction of the states of aircraft based on partially observable and uncertain data. Based on the composition properties of the HHMM, we study a decentralized air traffic system where aircraft are moving along streams and can perform cruise, accelerate, climb and turn maneuvers. We develop a common decentralized policy for conflict avoidance with spatially distributed agents (aircraft in the sky) and assure its safety properties via correctness proofs.
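
    The filtering computation at the heart of any HMM-based mode-identification scheme is the forward recursion, sketched below for three generic flight modes observed through a noisy climb-rate measurement. The transition matrix, emission parameters and observation sequence are invented for illustration and are not the HHMM of this work.

        import numpy as np

        # Hidden flight modes: 0 = level, 1 = climb/descend, 2 = turn. Transition and
        # emission parameters below are illustrative assumptions, not identified values.
        A  = np.array([[0.96, 0.02, 0.02],
                       [0.05, 0.90, 0.05],
                       [0.05, 0.05, 0.90]])   # mode transition matrix
        pi = np.array([0.80, 0.10, 0.10])     # initial mode probabilities
        mu = np.array([0.0, 12.0, 0.0])       # mean climb rate per mode, m/s
        sd = np.array([1.0, 3.0, 4.0])        # observation noise per mode, m/s

        def gauss(z, m, s):
            return np.exp(-0.5 * ((z - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

        def forward(obs):
            """Filtered mode beliefs P(mode_t | obs_1..t) via the forward recursion."""
            alpha = pi * gauss(obs[0], mu, sd)
            alpha /= alpha.sum()
            beliefs = [alpha]
            for z in obs[1:]:
                alpha = (alpha @ A) * gauss(z, mu, sd)
                alpha /= alpha.sum()
                beliefs.append(alpha)
            return np.array(beliefs)

        track = np.array([0.3, 0.1, 5.0, 11.0, 12.5, 11.8, 0.4])  # climb rates, m/s
        print(np.round(forward(track), 2))    # the mode switch becomes visible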

  13. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.

  14. Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller.

    PubMed

    Kindermans, Pieter-Jan; Tangermann, Michael; Müller, Klaus-Robert; Schrauwen, Benjamin

    2014-06-01

    Most BCIs have to undergo a calibration session in which data is recorded to train decoders with machine learning. Only recently zero-training methods have become a subject of study. This work proposes a probabilistic framework for BCI applications which exploit event-related potentials (ERPs). For the example of a visual P300 speller we show how the framework harvests the structure suitable to solve the decoding task by (a) transfer learning, (b) unsupervised adaptation, (c) language model and (d) dynamic stopping. A simulation study compares the proposed probabilistic zero framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of the involved components (a)-(d) are investigated. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance—competitive to a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI would require valuable time which is lost for spelling. The time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. It could be of use for various clinical and non-clinical ERP-applications of BCI.

  15. Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller

    NASA Astrophysics Data System (ADS)

    Kindermans, Pieter-Jan; Tangermann, Michael; Müller, Klaus-Robert; Schrauwen, Benjamin

    2014-06-01

    Objective. Most BCIs have to undergo a calibration session in which data is recorded to train decoders with machine learning. Only recently zero-training methods have become a subject of study. This work proposes a probabilistic framework for BCI applications which exploit event-related potentials (ERPs). For the example of a visual P300 speller we show how the framework harvests the structure suitable to solve the decoding task by (a) transfer learning, (b) unsupervised adaptation, (c) language model and (d) dynamic stopping. Approach. A simulation study compares the proposed probabilistic zero framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of the involved components (a)-(d) are investigated. Main results. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance—competitive to a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. Significance. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI would require valuable time which is lost for spelling. The time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. It could be of use for various clinical and non-clinical ERP-applications of BCI.

  16. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  17. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  19. Assessment of uncertainty in discrete fracture network modeling using probabilistic distribution method.

    PubMed

    Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan

    2017-11-01

    There have been widespread concerns about solute transport in fractured media, e.g. for the disposal of high-level radioactive waste in fractured geological rocks. Numerical particle-tracking simulation is increasingly employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider only one particular realization of the fracture distribution based on fracture statistics. This significantly underestimates the uncertainty in evaluating the risk of a radioactive waste repository. To adequately assess this uncertainty in DFN modeling of a potential site for the disposal of high-level radioactive waste, this paper employs the probabilistic distribution method (PDM). The method was applied to evaluate the risk of a nuclear waste repository in Beishan, China, and the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and of multiple realizations were demonstrated. Probabilistic distributions of 20 realizations at different times were also obtained. The results showed that the PDM can describe the ranges of contaminant particle transport. After 5 × 10^6 days, the high-probability contaminated areas were concentrated near the release point, covering approximately 25,400 m².
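
    A minimal sketch of the multi-realization idea under stated assumptions: particle tracking is repeated over many stochastic realizations, and each grid cell's contamination probability is taken as the fraction of realizations in which particles reach it. The random-walk transport surrogate and all grid parameters below are illustrative; a real DFN model would track particles through explicitly generated fracture networks.

    ```python
    import numpy as np

    # Hedged sketch of the probabilistic distribution method (PDM): the
    # per-cell contamination probability is the fraction of realizations
    # in which particles reach that cell. Random-walk transport is an
    # illustrative stand-in for fracture-network particle tracking.
    rng = np.random.default_rng(1)
    GRID, N_REAL, N_PART, N_STEP = 50, 20, 200, 300

    hit_count = np.zeros((GRID, GRID))
    for _ in range(N_REAL):                   # one stochastic realization
        reached = np.zeros((GRID, GRID), bool)
        # per-realization drift stands in for realization-specific geometry
        drift = rng.normal([0.4, 0.0], 0.1)
        pos = np.tile([0.0, GRID / 2], (N_PART, 1))   # release at left edge
        for _ in range(N_STEP):
            pos += drift + rng.normal(0, 0.3, pos.shape)
            ij = np.clip(pos.astype(int), 0, GRID - 1)
            reached[ij[:, 0], ij[:, 1]] = True
        hit_count += reached

    prob_map = hit_count / N_REAL             # contamination probability per cell
    area = np.count_nonzero(prob_map > 0.8)   # "high-possibility" cell count
    print(f"cells contaminated in >80% of realizations: {area}")
    ```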

  20. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is therefore important to incorporate spatial variability and uncertainty into GIS-based decision-analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for delineating potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation to account for uncertainty in the criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study of Gucheng County, central China, which was severely affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors was created for the susceptibility evaluation. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in the criteria weights and identifies which criteria weights are most responsible for the variability of the model outcomes. The proposed approach is therefore an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
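
    A minimal sketch of the probabilistic OWA aggregation under assumed inputs: criteria weights are drawn from a Dirichlet distribution in each Monte Carlo run, an ordered weighted average encodes the analyst's risk attitude through order weights, and the spread of scores across runs indicates how robust each location's susceptibility is to weight uncertainty. The criterion values, weight prior and order-weight vector are illustrative, not taken from the study.

    ```python
    import numpy as np

    # Hedged sketch of Monte Carlo OWA aggregation for flood susceptibility;
    # all inputs are synthetic stand-ins for the six standardized factors.
    rng = np.random.default_rng(7)
    N_CELLS, N_CRIT, N_RUNS = 500, 6, 1000

    criteria = rng.random((N_CELLS, N_CRIT))   # standardized 0-1 factor values
    # order weights emphasizing high factor values (precautionary attitude);
    # an assumed choice, tunable to other risk attitudes
    order_w = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])

    scores = np.empty((N_RUNS, N_CELLS))
    for r in range(N_RUNS):
        w = rng.dirichlet(np.ones(N_CRIT))            # random criteria weights
        weighted = criteria * w                       # apply criteria weights
        ranked = np.sort(weighted, axis=1)[:, ::-1]   # descending per cell
        scores[r] = ranked @ order_w                  # OWA aggregation

    mean_score = scores.mean(axis=0)   # ensemble susceptibility per cell
    spread = scores.std(axis=0)        # weight-driven outcome uncertainty
    top = mean_score.argmax()
    print(f"most susceptible cell: {top}, score spread: {spread[top]:.4f}")
    ```

    Cells whose spread stays small across runs are robust to the weight uncertainty; a large spread flags locations whose classification hinges on contested criteria weights, which is the sensitivity information the ensemble method exposes.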
