Sample records for NASA probabilistic risk

  1. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  2. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  3. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk, and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  4. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps understand failure mechanisms and improve component and system design. PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could
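
    The fault-tree and event-tree quantification described in this abstract can be illustrated with a toy calculation. Everything below is invented for illustration: the gate helpers, the component names, and all probability values are hypothetical, not figures from the paper.

```python
# Hypothetical sketch of fault-tree / event-tree quantification as combined
# in a PRA. All component names and probabilities are illustrative only.

def or_gate(*p):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """Probability that all independent input events occur."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Fault tree: "engine fails" if the turbopump fails OR both redundant
# controllers fail (all basic-event probabilities are made up).
p_turbopump = 1e-3
p_controller = 5e-3
p_engine_fail = or_gate(p_turbopump, and_gate(p_controller, p_controller))

# Event-tree sequence: initiating event, then engine failure, then
# failure of the abort system given the engine failure.
p_initiator = 1e-2          # per-mission frequency of the initiating event
p_abort_fail = 0.1          # abort system fails given engine failure
p_loss_of_crew = p_initiator * p_engine_fail * p_abort_fail

print(f"P(engine failure) = {p_engine_fail:.3e}")
print(f"P(loss of crew)   = {p_loss_of_crew:.3e}")
```

    Under an independence assumption, an OR gate is one minus the product of the complements and an AND gate is a simple product; real PRA tools also handle common-cause failures and dependencies that this sketch ignores.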

  5. Probabilistic Assessment of Cancer Risk for Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2009-01-01

    During future lunar missions, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-Moon transit. NASA's new lunar program anticipates that up to 15% of crew time may be on EVA, with minimal radiation shielding. Because of the operational challenge of responding to events of unknown size and duration, a probabilistic risk assessment approach is essential for mission planning and design. Using the historical database of proton measurements during the past 5 solar cycles, a typical hazard function for SPE occurrence was defined using a non-homogeneous Poisson model as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions ranging from the 5th to 95th percentile of particle fluences for a specified mission period were simulated. Organ doses corresponding to particle fluences at the median and at the 95th percentile for a specified mission period were assessed using NASA's baryon transport model, BRYNTRN. The cancer fatality risks for astronauts as functions of age, gender, and solar cycle activity were then analyzed. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated. Future work will involve using this probabilistic risk assessment approach to SPE forecasting, combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
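
    The non-homogeneous Poisson occurrence model described above can be sketched with a standard thinning (Lewis-Shedler) sampler. The hazard function here is a made-up smooth bump over a 4000-day notional cycle; the paper's actual hazard is fitted to five solar cycles of proton measurements.

```python
import math
import random

CYCLE_DAYS = 4000.0

def hazard(t):
    """Hypothetical SPE rate (events/day) peaking at mid-cycle."""
    return 0.02 * math.exp(-((t - CYCLE_DAYS / 2) / 800.0) ** 2)

def sample_spe_times(rng, lam_max=0.02):
    """Thinning sampler: draw candidates from a homogeneous Poisson process
    at rate lam_max, then accept each with probability hazard(t)/lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)
        if t > CYCLE_DAYS:
            return times
        if rng.random() < hazard(t) / lam_max:
            times.append(t)

rng = random.Random(42)
counts = [len(sample_spe_times(rng)) for _ in range(2000)]
mean_count = sum(counts) / len(counts)
# Analytic expectation: integral of the hazard over the cycle,
# approximately 0.02 * 800 * sqrt(pi) for this Gaussian bump.
expected = 0.02 * 800.0 * math.sqrt(math.pi)
print(f"simulated mean events/cycle = {mean_count:.2f}, analytic = {expected:.2f}")
```

    The thinning step is what makes the process non-homogeneous: events cluster near solar maximum, where the hazard is largest, which is the behavior the occurrence model needs to capture.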

  6. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars, and beyond. The LRO mission includes spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objectives of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to the journey of exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, and ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter Mission project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed in accordance with NASA Procedural Requirements (NPR) 8705.5, Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects. As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success efficiently, at low cost and on a tight schedule, traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform PRA for the large LRO system, with more than 14,000 piece parts and over 120 purchased or contractor

  7. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner that is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques, i.e., event trees and fault trees, in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
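
    The probability-of-failure computation described above can be sketched, in its simplest form, as a Monte Carlo evaluation of a limit-state function. The limit state, the distributions, and the parameter values below are invented for illustration; NESSUS itself also provides much faster methods (such as the advanced mean value and adaptive importance sampling algorithms mentioned in the abstract) for use with expensive finite-element models.

```python
import random

# Minimal Monte Carlo probability-of-failure sketch. The limit state
# g = strength - load is the classic stress/strength example; all
# distribution parameters are illustrative placeholders.

def limit_state(strength, load):
    """g > 0 is safe; g <= 0 is failure."""
    return strength - load

rng = random.Random(0)
N = 200_000
failures = 0
for _ in range(N):
    strength = rng.gauss(500.0, 40.0)   # e.g. material strength (MPa)
    load = rng.gauss(350.0, 30.0)       # e.g. applied stress (MPa)
    if limit_state(strength, load) <= 0:
        failures += 1

pf = failures / N
print(f"estimated probability of failure = {pf:.4f}")
```

    For these normal inputs the answer is known in closed form (strength minus load is normal with mean 150 and standard deviation 50, so the failure probability is about 0.00135), which is a useful sanity check on the sampling estimate.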

  9. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest of the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of the medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  10. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    Various NASA Langley Research Center and other center projects were examined in an attempt to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and the usage of PRA techniques were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the Cold War and the Challenger disaster, it was found that the ingrained anti-PRA culture is hard to overcome. Examples of skepticism against the use of risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques are much broader than just the traditional safety-related areas of application. The time has come to begin to uniformly apply these techniques. A risk-based approach can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.

  11. NASA Space Radiation Risk Project: Overview and Recent Results

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Chappell, Lori J.; George, Kerry A.; Hada, Megumi; Hu, Shaowen; Kidane, Yared H.; Kim, Myung-Hee Y.; Kovyrshina, Tatiana; Norman, Ryan B.; Nounu, Hatem N.

    2015-01-01

    The NASA Space Radiation Risk project is responsible for integrating new experimental and computational results into models to predict risk of cancer and acute radiation syndrome (ARS) for use in mission planning and systems design, as well as current space operations. The project has several parallel efforts focused on improving NASA's radiation risk projection capability in both the near and long term. This presentation will give an overview, with select results from these efforts, including the following topics: verification, validation, and streamlining the transition of models to use in decision making; relative biological effectiveness and dose rate effect estimation using a combination of stochastic track structure simulations, DNA damage model calculations, and experimental data; ARS model improvements; pathway analysis from gene expression data sets; and solar particle event probabilistic exposure calculation including correlated uncertainties for use in design optimization.

  12. Constellation Probabilistic Risk Assessment (PRA): Design Consideration for the Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis

    2010-01-01

    Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the Moon; lunar surface activities; ascension from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.

  13. Probabilistic Asteroid Impact Risk Assessment for the Hypothetical PDC17 Impact Exercise

    NASA Technical Reports Server (NTRS)

    Wheeler, Lorien; Mathias, Donovan

    2017-01-01

    Performing impact risk assessment for the 2017 Planetary Defense Conference (PDC17) hypothetical impact exercise, to take place at the PDC17 conference, May 15-20, 2017. Impact scenarios and trajectories are developed and provided by NASA's Near Earth Objects Office at JPL (Paul Chodas). These results represent purely hypothetical impact scenarios and do not reflect any known asteroid threat. Risk assessment was performed using the Probabilistic Asteroid Impact Risk (PAIR) model developed by the Asteroid Threat Assessment Project (ATAP) at NASA Ames Research Center. This presentation includes sample results that may be presented or used in discussions during the various stages of the impact exercise. Some cases represent alternate scenario options that may not be used during the actual impact exercise at the PDC17 conference. Updates to these initial assessments and/or additional scenario assessments may be performed throughout the impact exercise as different scenario options unfold.

  14. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    NASA Technical Reports Server (NTRS)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.

  15. Risk Management at NASA and Its Applicability to the Oil and Gas Industry

    NASA Technical Reports Server (NTRS)

    Kaplan, David

    2018-01-01

    NASA has a world-class capability for quantitatively assessing the risk of highly complex, isolated engineering structures operated in extremely hostile environments. In particular, the International Space Station (ISS) represents a reasonable risk analog for High Pressure, High Temperature drilling and production operations on deepwater rigs. Through a long-term U.S. Government Interagency Agreement, the Bureau of Safety and Environmental Enforcement (BSEE) has partnered with NASA to modify NASA's Probabilistic Risk Assessment (PRA) capabilities for application to deepwater drilling and production operations. The immediate focus of the activity will be to modify NASA PRA Procedure Guides and Methodology Documents to make them applicable to the Oil & Gas Industry. The next step will be for NASA to produce a PRA for a critical drilling system component, such as a Blowout Preventer (BOP). Subsequent activities will be for NASA and industry partners to jointly develop increasingly complex PRAs that analyze other critical drilling and production system components, including both hardware and human reliability. In the presentation, NASA will provide the objectives, schedule, and current status of its PRA activities for BSEE. Additionally, NASA has a Space Act Agreement with Anadarko Petroleum Corporation to develop a PRA for a generic 20K BOP. NASA will summarize some of the preliminary insights gained to date from that 20K BOP PRA as an example of the distinction between quantitative and qualitative risk assessment.

  16. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of Loss of Mission (LOM), Loss of Vehicle (LOV), and Loss of Crew (LOC) for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA began a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will

  17. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  18. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  19. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
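
    The modularized simulation structure described above (probabilistic inputs flowing through damage modules to produce a casualty distribution rather than a single mean) can be sketched as a toy model. Every distribution and scaling constant below is an invented placeholder, not one of the article's calibrated inputs; the point is only the shape of the computation.

```python
import math
import random

rng = random.Random(7)

def sample_diameter_m():
    # Heavy-tailed size distribution: small impactors are far more common.
    # (Invented form; real models use observed NEO size-frequency data.)
    return 10.0 * ((1.0 - rng.random()) ** -0.5)

def damage_radius_km(diameter_m):
    # Placeholder blast-damage scaling with impactor size.
    return 0.05 * diameter_m

def sample_population_density():
    # Most of Earth is ocean or sparsely populated; occasionally a city.
    return rng.choice([0.0, 0.0, 0.0, 5.0, 50.0, 1500.0])  # people/km^2

def simulate_one_impact():
    d = sample_diameter_m()
    area = math.pi * damage_radius_km(d) ** 2
    return area * sample_population_density()

casualties = sorted(simulate_one_impact() for _ in range(50_000))
mean = sum(casualties) / len(casualties)
p95 = casualties[int(0.95 * len(casualties))]
print(f"mean casualties = {mean:.1f}, 95th percentile = {p95:.1f}")
```

    The output distribution, not its mean, is the deliverable: for low-probability, high-consequence events the tail percentiles carry the actionable information that a point estimate hides.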

  20. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Moon transit. Up to 15% of crew time may be spent on EVA, with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for NASA mission planning. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.

  1. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  2. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses reliability engineering applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbopump development, the impact of External Tank (ET) foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  3. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from the increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempts of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future visiting spacecraft to the ISS, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.

  4. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the Moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I launch vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had a major impact on the design and manufacturing of Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  5. Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.

    PubMed

    Avramenko, M; Bolyatko, V; Kosterev, V

    2005-01-01

    Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.

  6. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines).
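
    The probabilistic assessment described above replaces point-design inputs with probability distributions on component efficiencies and propagates them through the cycle analysis. A minimal Monte Carlo sketch of that idea follows; the surrogate performance function and all distribution parameters are hypothetical stand-ins, since the actual NEPP/WATE models are not reproduced here:

    ```python
    import random

    def tsfc(fan_eff, comp_eff):
        """Toy surrogate for engine performance; stands in for a full NEPP cycle run."""
        return 0.55 / (fan_eff * comp_eff)

    random.seed(0)
    samples = []
    for _ in range(10_000):
        # Component efficiencies drawn from assumed normal distributions
        fan = random.gauss(0.88, 0.01)
        comp = random.gauss(0.90, 0.01)
        samples.append(tsfc(fan, comp))

    samples.sort()
    mean = sum(samples) / len(samples)
    p95 = samples[int(0.95 * len(samples))]
    print(f"mean TSFC {mean:.4f}, 95th percentile {p95:.4f}")
    ```

    The output is a distribution of performance rather than a single number, which is what lets the assessment quantify the risk that a new technology misses its performance target.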

  7. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary W.

    2014-01-01

    The International Space Station (ISS) end-of-life (EOL) cycle is currently scheduled for 2020, although there are ongoing efforts to extend the ISS life cycle through 2028. The EOL for the ISS will require de-orbiting the ISS. This will be the largest manmade object ever to be de-orbited; therefore, safely de-orbiting the station will be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered to accomplish this, such as target corridors, orbits, altitude, drag, maneuvering capabilities, etc. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the de-orbit maneuvering.

  8. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    NASA Technical Reports Server (NTRS)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  9. Probabilistic Description of the Hydrologic Risk in Agriculture

    NASA Astrophysics Data System (ADS)

    Vico, G.; Porporato, A. M.

    2011-12-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climatic variability on agroecosystem productivity and profitability, at the expense of increasing water requirements for irrigation purposes. Optimizing water allocation for crop yield preservation and sustainable development needs to account for hydro-climatic variability, which is by far the main source of uncertainty affecting crop yields and irrigation water requirements. In this contribution, a widely applicable probabilistic framework is proposed to quantitatively define the hydrologic risk of yield reduction for both rainfed and irrigated agriculture. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season. Based on these linkages, long-term and real-time yield reduction risk indices are defined as functions of climate, soil, and crop parameters, as well as irrigation strategy. The former risk index is suitable for long-term irrigation strategy assessment and investment planning, while the latter provides a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season. This probabilistic framework also allows assessment of the impact of limited water availability on crop yield, thus guiding the optimal allocation of water resources for human and environmental needs. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios, facilitating the assessment of the impact of increasingly frequent water shortages on agricultural productivity, profitability, and sustainability.

  10. Probabilistic Risk Assessment for Astronaut Post Flight Bone Fracture

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth; Myers, Jerry; Licata, Angelo

    2015-01-01

    Introduction: Space flight potentially reduces the loading that bone can resist before fracture. This reduction in bone integrity may result from a combination of factors, most commonly reported as a reduction in astronaut bone mineral density (BMD). Although evaluating the condition of bones continues to be a critical aspect of understanding space flight fracture risk, defining the loading regime, whether on Earth, in microgravity, or in reduced gravity on a planetary surface, remains a significant component of estimating the fracture risks to astronauts. This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM) to understanding pre-, post-, and in-mission astronaut bone fracture risk. The overview includes an assessment of the contributing factors utilized in the BFxRM and illustrates how new information, such as the biomechanics of space suit design or a better understanding of post-flight activities, may influence astronaut fracture risk. Opportunities for the bone mineral research community to contribute to future model development are also discussed. Methods: To investigate the conditions in which spaceflight-induced changes to bone play a critical role in post-flight fracture probability, we implement a modified version of the NASA Bone Fracture Risk Model (BFxRM). Modifications included incorporation of variations in physiological characteristics, post-flight recovery rate, and variations in lateral fall conditions within the probabilistic simulation parameter space. The modeled fracture probability estimates for different loading scenarios at preflight and at 0 and 365 days post-flight are compared. Results: For simple lateral side falls, mean post-flight fracture probability is elevated over mean preflight fracture probability due to spaceflight-induced BMD loss and is not fully recovered at 365 days post-flight.
In the case of more energetic falls, such as from elevated heights or with the addition of lateral movement

  11. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  12. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary

    2014-01-01

    Although there are ongoing efforts to extend the ISS life cycle through 2028, the International Space Station (ISS) end-of-life (EOL) cycle is currently scheduled for 2020. The EOL for the ISS will require de-orbiting the ISS. This will be the largest manmade object ever to be de-orbited; therefore, safely de-orbiting the station will be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered to accomplish this, such as target corridors, orbits, altitude, drag, maneuvering capabilities, debris mapping, etc. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the de-orbit maneuvering.

  13. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
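
    The fault-tree combination of component failure probabilities described above can be sketched in a few lines. The components, gate structure, and probability values below are hypothetical illustrations, not taken from the paper:

    ```python
    # Basic event failure probabilities (assumed values for illustration)
    p_filter = 0.02    # HEPA filter failure
    p_fan = 0.01       # ventilation fan failure
    p_sensor = 0.05    # contaminant sensor failure
    p_source = 0.10    # contaminant release occurs

    def or_gate(*ps):
        """P(at least one event occurs), assuming independent basic events."""
        q = 1.0
        for p in ps:
            q *= (1.0 - p)
        return 1.0 - q

    def and_gate(*ps):
        """P(all events occur), assuming independent basic events."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    # Top event: contamination = release AND (ventilation fails OR detection fails)
    p_vent_fail = or_gate(p_filter, p_fan)
    p_top = and_gate(p_source, or_gate(p_vent_fail, p_sensor))
    print(f"P(building contamination) = {p_top:.4f}")
    ```

    A sensitivity study in this setting amounts to perturbing each basic-event probability and observing the change in the top-event probability, which identifies the components that most urgently need attention.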

  14. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    NASA Technical Reports Server (NTRS)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  15. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
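
    The contrast between the traditional RPN and the proposed probabilistic modification can be sketched numerically. All scores and frequencies below are invented for illustration; they are not the paper's data:

    ```python
    # One failure mode of a hypothetical NIR screening procedure.
    # Traditional FMEA: categorical 1-10 scores for occurrence, detection, severity.
    occurrence, detection, severity = 4, 6, 8
    rpn = occurrence * detection * severity  # Risk Priority Number
    print("traditional RPN:", rpn)

    # Probabilistic modification: occurrence and detection become estimated
    # relative frequencies, while severity stays categorical. Their product
    # estimates how often this failure mode occurs AND goes undetected.
    p_occur = 0.02        # assumed relative frequency of the failure mode
    p_undetected = 0.10   # assumed frequency that detection misses it
    freq_undetected = p_occur * p_undetected
    print(f"frequency of undetected failure: {freq_undetected:.4f} "
          f"(severity class {severity})")
    ```

    Unlike the ordinal RPN, the frequency of undetected failures can be summed across a set of failure modes to characterize the full analytical procedure.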

  16. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk is about 0.0001 for the two-rotor engine and also about 0.0001 for the composite built-up structure.

  17. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk is about 0.0001 for the two-rotor engine and also about 0.0001 for the composite built-up structure.

  18. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) into the decision-making process for significant issues. Future PRAs will have a major impact on ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision-making process for determining reliability requirements for future NASA and commercial spacecraft, making crew rescue decisions, and setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs), and robotic operations. This paper describes some applications of the ISS PRA model and how they impacted the final decisions, and discusses future analysis topics such as life extension and the requirements of new commercial vehicles visiting the ISS.

  19. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  20. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
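
    The non-homogeneous Poisson model of SPE occurrence described in the two records above can be simulated by thinning: generate candidate events at a constant maximum rate, then accept each with probability proportional to the time-varying intensity. The intensity function and its parameters below are assumed for illustration only; the papers fit theirs to the historical proton database:

    ```python
    import math
    import random

    def spe_rate(t_years):
        """Assumed SPE intensity (events/year) peaking at the maximum of an
        11-year solar activity cycle; an illustrative stand-in for a fitted model."""
        return 8.0 * math.sin(math.pi * t_years / 11.0) ** 2

    def simulate_events(t_end, rate, rate_max, rng):
        """Nonhomogeneous Poisson process via thinning (Lewis-Shedler)."""
        t, events = 0.0, []
        while True:
            t += rng.expovariate(rate_max)  # candidate inter-arrival time
            if t > t_end:
                return events
            if rng.random() < rate(t) / rate_max:  # accept with prob rate/rate_max
                events.append(t)

    rng = random.Random(42)
    counts = [len(simulate_events(11.0, spe_rate, 8.0, rng)) for _ in range(2000)]
    mean_count = sum(counts) / len(counts)
    print(f"mean SPEs per 11-year cycle: {mean_count:.1f}")
    ```

    Repeating the simulation many times yields the distribution of event counts (and, with a fluence model attached, of mission doses) from which percentiles such as the 5th and 95th can be read off.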

  1. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example: the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA, a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
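
    The multiplicative form Risk = Hazard x Vulnerability x Exposure can be made concrete with a discretized expected-loss calculation. Every number below is hypothetical, chosen only to show the arithmetic, not drawn from the Crescent City or Seaside studies:

    ```python
    # Hypothetical annual occurrence probabilities of three tsunami intensity levels
    hazard = {"minor": 0.010, "moderate": 0.002, "severe": 0.0004}
    # Fraction of exposed value lost at each intensity level (assumed)
    vulnerability = {"minor": 0.05, "moderate": 0.40, "severe": 0.90}
    # Exposed value in dollars (assumed)
    exposure = 250_000_000.0

    annual_expected_loss = sum(hazard[k] * vulnerability[k] * exposure
                               for k in hazard)
    print(f"annual expected loss: ${annual_expected_loss:,.0f}")
    ```

    In a full PTHA only the hazard column would come from the probabilistic tsunami modeling; vulnerability and exposure are supplied by separate analyses, which is why the abstract treats them as independent factors.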

  2. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  3. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation & control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
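    The core difficulty of building a Bayesian Network from sparse data can be illustrated with a minimal sketch: estimating one conditional probability table (CPT) with Laplace smoothing so that combinations never observed still receive non-zero probability. The node names and observations are hypothetical, not taken from the report:

```python
from collections import Counter

# Sketch: a CPT for a two-node network (Stress -> HumanError) learned from
# sparse, hypothetical data. Laplace (add-alpha) smoothing keeps unseen
# outcomes from being assigned probability zero.

observations = [  # (stress_level, error_occurred)
    ("high", True), ("high", False), ("high", True),
    ("low", False), ("low", False),
]

counts = Counter(observations)

def p_error_given(stress: str, alpha: float = 1.0) -> float:
    """P(error | stress) with Laplace smoothing over the two outcomes."""
    n_err = counts[(stress, True)]
    n_tot = counts[(stress, True)] + counts[(stress, False)]
    return (n_err + alpha) / (n_tot + 2 * alpha)

print(p_error_given("high"))  # (2 + 1) / (3 + 2) = 0.6
print(p_error_given("low"))   # (0 + 1) / (2 + 2) = 0.25
```

    With only two "low" observations and no errors among them, the raw estimate would be 0.0; smoothing instead yields 0.25, a behavior that matters when HRA data are this sparse.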

  4. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
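    A discretized version of the risk integral described above, coupling a ground motion hazard curve with a lognormal fragility function, might look like the following sketch. The hazard rates and fragility parameters are hypothetical, for illustration only:

```python
import math

# Discrete "risk integral": annual collapse rate is the sum, over intensity
# bins, of fragility(im) times the annual occurrence rate of im in that bin.
# Hazard-curve values and fragility parameters below are hypothetical.

def fragility(im: float, theta: float = 1.0, beta: float = 0.4) -> float:
    """Lognormal fragility: P(collapse | IM = im), median theta, log-std beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

# Hypothetical hazard curve: annual rate of exceeding each IM level (in g).
im_levels = [0.1, 0.2, 0.4, 0.8, 1.6]
exceed_rate = [1e-1, 3e-2, 7e-3, 1e-3, 1e-4]

annual_collapse_rate = 0.0
for i in range(len(im_levels)):
    # Occurrence rate within a bin = difference of successive exceedance rates.
    next_rate = exceed_rate[i + 1] if i + 1 < len(im_levels) else 0.0
    annual_collapse_rate += fragility(im_levels[i]) * (exceed_rate[i] - next_rate)

print(f"annual collapse rate: {annual_collapse_rate:.2e}")
```

    The post-mainshock adaptation the authors describe amounts to replacing the exceedance rates with time-dependent aftershock rates and shifting the fragility parameters to reflect mainshock damage.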

  5. Continuous Risk Management at NASA

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.; Rosenberg, Linda

    1999-01-01

    NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions. This risk management structure of functions has been taught to projects at all NASA Centers and is being successfully implemented on many projects. This presentation will give project managers the information they need to understand if risk management is to be effectively implemented on their projects at a cost they can afford.

  6. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
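    The bootstrap component of such an assessment can be sketched as follows; the intake dataset here is hypothetical, not taken from the nanosilica study:

```python
import random
import statistics

random.seed(42)

# Sketch of bootstrap uncertainty quantification: resample a small
# (hypothetical) dataset of intake measurements with replacement to get
# a 95% confidence interval on the mean intake.
intake_mg_per_day = [4.1, 5.6, 3.8, 7.2, 4.9, 6.1, 5.0, 3.5, 8.0, 4.4]

boot_means = []
for _ in range(5000):
    resample = random.choices(intake_mg_per_day, k=len(intake_mg_per_day))
    boot_means.append(statistics.mean(resample))

boot_means.sort()
lo, hi = boot_means[int(0.025 * 5000)], boot_means[int(0.975 * 5000)]
print(f"mean = {statistics.mean(intake_mg_per_day):.2f} mg/day, "
      f"95% CI ({lo:.2f}, {hi:.2f})")
```

    The integrated method described in the abstract goes further by nesting such uncertainty loops around a separate variability loop (a two-dimensional Monte Carlo), which is what allows variability and uncertainty to be reported separately.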

  7. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A.; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  8. Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions

    EPA Pesticide Factsheets

    Presents general concepts and principles of Probabilistic Risk Assessment (PRA), describes how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing the uncertainty in decision making.

  9. Achieving a Risk-Informed Decision-Making Environment at NASA: The Emphasis of NASA's Risk Management Policy

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews the evolution of risk management (RM) at NASA. The aim is to promote an RM approach that is heuristic, proactive, and coherent across all of NASA. Risk-Informed Decision Making (RIDM) is a decision-making process that uses a diverse set of performance measures, along with other considerations, within a deliberative process to inform decision making. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The RIDM process, and how it relates to the Continuous Risk Management (CRM) process, is reviewed.

  10. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri, L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for Shuttle design and operation. PRA is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? What are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the Shuttle are reviewed.

  11. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River delta region in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with maximum sustained wind speeds of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with growing exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries, including Vietnam. The model is capable of estimating damage from wind, storm surge, and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.
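    A toy version of the stochastic simulation underlying such a catastrophe model, Poisson landfall frequency combined with a wind vulnerability curve, is sketched below. Every parameter is hypothetical; none are CoreLogic's:

```python
import math
import random

random.seed(1)

# Toy stochastic typhoon loss model: Poisson landfall frequency plus a
# hypothetical wind-damage (vulnerability) curve applied to fixed exposure.

LANDFALLS_PER_YEAR = 2.5   # assumed mean annual landfall rate for the coast
EXPOSURE = 1e9             # assumed insured value at risk (USD)

def poisson(lam: float) -> int:
    """Poisson draw via Knuth's algorithm (the stdlib has no poissonvariate)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def damage_ratio(wind_kt: float) -> float:
    """Hypothetical vulnerability curve: loss fraction vs. sustained wind (kt)."""
    return 0.0 if wind_kt <= 64 else min(1.0, ((wind_kt - 64) / 120.0) ** 2)

def simulate_year() -> float:
    loss = 0.0
    for _ in range(poisson(LANDFALLS_PER_YEAR)):
        wind_kt = 34 + random.weibullvariate(30, 1.5)  # landfall intensity
        loss += damage_ratio(wind_kt) * EXPOSURE
    return loss

losses = [simulate_year() for _ in range(20000)]
aal = sum(losses) / len(losses)
print(f"average annual loss over 20,000 simulated years: {aal:.3e} USD")
```

    A production model replaces the landfall-rate constant and intensity draw with a physically based stochastic track set, and the single damage curve with building-specific vulnerability functions, but the loop structure is the same.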

  12. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks.

  13. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks.

  14. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides, and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.

  15. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  16. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource for making robust and reliable decisions in the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  17. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in the static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, and most probable damage path. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes were also computed. A brief discussion is included on the future direction of probabilistic structural analysis.

  18. A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites

    NASA Astrophysics Data System (ADS)

    Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.

    2010-12-01

    Excavation sites such as those associated with the construction of subway lines, railways and highway tunnels are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on a probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practices and it employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from the reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain. It is used to estimate and minimize risks for a potential construction site of an underground station for the new subway line in the Barcelona metropolitan area.
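    The fault tree component of such a framework can be sketched with simple gate formulas for independent events. The basic events and probabilities below are hypothetical; in the paper the basic-event probabilities come from stochastic groundwater-flow models rather than a reliability database:

```python
# Sketch of quantifying a small fault tree for groundwater-related failure
# at an excavation site. All basic-event probabilities are hypothetical.

def p_or(*probs):
    """OR gate: probability that at least one independent event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    """AND gate: probability that all independent events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical basic events (annual probabilities):
p_high_water_table = 0.20
p_dewatering_fails = 0.05
p_seepage_path     = 0.10
p_support_weak     = 0.02

# Top event: flooding of the excavation OR ground instability.
p_flooding    = p_and(p_high_water_table, p_dewatering_fails)
p_instability = p_and(p_seepage_path, p_support_weak)
p_top         = p_or(p_flooding, p_instability)
print(f"P(top event) = {p_top:.5f}")
```

    The novelty the authors emphasize is precisely that the leaf probabilities feeding such gates are computed from stochastic differential equations for the groundwater system rather than looked up in failure-rate tables.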

  19. Continuous Risk Management: A NASA Program Initiative

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.; Rosenberg, Linda

    1999-01-01

    NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.

  20. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
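    The Monte Carlo load-resistance step recommended above can be sketched in a few lines; the distribution parameters are hypothetical:

```python
import random

random.seed(7)

# Minimal load-resistance sketch: a failure occurs whenever the sampled
# load exceeds the sampled resistance. Parameters are hypothetical.

N = 200_000
failures = 0
for _ in range(N):
    load = random.gauss(100.0, 15.0)        # applied stress (MPa)
    resistance = random.gauss(160.0, 20.0)  # component strength (MPa)
    if load > resistance:
        failures += 1

pf = failures / N
print(f"estimated failure probability: {pf:.4f}")
```

    For these particular normal distributions the exact answer is available as a cross-check: the margin (resistance minus load) is normal with mean 60 and standard deviation sqrt(15^2 + 20^2) = 25, so the true failure probability is Phi(-2.4), roughly 0.008, which the simulation approaches as N grows.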

  1. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production, and its relevance is expected to further increase in the near future, in the face of projected alterations of rainfall patterns and increased demand for food, fiber, and biofuel. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability while maintaining sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and the corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability, are obtained analytically as a function of irrigation strategy, climate, soil, and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting.
Our approach employs relatively few parameters and is thus easily and

  2. Adolescents' Heightened Risk-Seeking in a Probabilistic Gambling Task

    ERIC Educational Resources Information Center

    Burnett, Stephanie; Bault, Nadege; Coricelli, Giorgio; Blakemore, Sarah-Jayne

    2010-01-01

    This study investigated adolescent males' decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 adolescents (aged 9-11), 26 young adolescents (aged 12-15), 20 mid-adolescents (aged 15-18) and 17…

  3. Finite element probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacvarov, D.C.

    1981-01-01

    A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for analysis of three-dimensional spaces where the parameters, such as trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, yielding high accuracy in critical regions, such as the area of the low-probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem at a manageable low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very-low-probability events.

  4. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  5. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability, but several unfortunate failures have occurred. The Apollo 1 fire went unanticipated, a grave mistake. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of the shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for the shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to the deliberate dismantling of traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  6. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
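
    The Monte Carlo step described above can be sketched in a few lines. This is a minimal illustration, not the PDA models themselves: the capacity/demand limit state, the distributions, and all parameter values below are hypothetical stand-ins for the physics-based failure models used in the paper.

```python
import random

def failure_probability(limit_state, sample_inputs, n=100_000, seed=42):
    """Estimate P(failure) by Monte Carlo: a trial fails when limit_state(x) < 0."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample_inputs(rng)) < 0)
    return failures / n

def sample_inputs(rng):
    # Each driving parameter is treated as a random variable with a distribution
    capacity = rng.gauss(10.0, 1.0)   # hypothetical system capacity
    demand = rng.gauss(7.0, 1.5)      # hypothetical load in the failure scenario
    return capacity, demand

def limit_state(x):
    capacity, demand = x
    return capacity - demand          # negative margin means failure

p_fail = failure_probability(limit_state, sample_inputs)
print(f"Estimated failure probability: {p_fail:.4f}")
```

    A sensitivity analysis of the kind the paper describes would repeat this calculation while perturbing the distribution parameters to see how each one moves the estimate.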

  7. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 5: Auxiliary shuttle risk analyses

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.

  8. PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.

    1998-01-01

    PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber-reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform both deterministic and probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and to update or modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship, and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for uncertainty simulation. The options available in the code to simulate probabilistic material properties and to quantify the sensitivity of the primitive random variables are described, and deterministic as well as probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM 4766, June 1997.

  9. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment.

  10. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
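
    Fault-tree quantification of the kind performed on the probabilistic side can be illustrated with a minimal sketch. The gate structure, event names, and probabilities below are invented for illustration and assume independent basic events.

```python
# AND/OR gate quantification for a fault tree with independent basic events
def gate_or(*probs):
    # P(A or B or ...) = 1 - product of (1 - p_i)
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def gate_and(*probs):
    # P(A and B and ...) = product of p_i
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical top event: loss of engine thrust
p_pump = 1e-3                          # feed pump fails
p_valve = 5e-4                         # main valve fails
p_ctrl = gate_and(1e-2, 1e-2)          # both redundant controllers fail
p_top = gate_or(p_pump, p_valve, p_ctrl)
print(f"P(top event) = {p_top:.2e}")
```

    A fuzzy fault-tree analysis would instead propagate membership functions (e.g. triangular fuzzy numbers) through the same gate structure, capturing vagueness rather than randomness.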

  11. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk become evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
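
    The deterministic-versus-probabilistic contrast can be made concrete with a logit (softmax) choice rule, a standard way of turning a deterministic utility comparison into choice probabilities. The gambles, utilities, and sensitivity parameter below are illustrative only, not taken from the studies.

```python
import math

def choice_prob(u_a, u_b, sensitivity=1.0):
    """Logit choice rule: P(choose A) rises smoothly with A's utility advantage."""
    return 1.0 / (1.0 + math.exp(-sensitivity * (u_a - u_b)))

# Gamble A: 50% chance of 100, else 0.  Option B: a sure 45.
eu_a = 0.5 * 100 + 0.5 * 0
eu_b = 45.0
p = choice_prob(eu_a, eu_b, sensitivity=0.1)
print(f"P(choose A) = {p:.3f}")
```

    A deterministic theory predicts A is always chosen (its expected utility is higher); the probabilistic version predicts A only about 62% of the time, consistent with the observation that people make different choices in nearly identical situations.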

  12. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, and provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide a risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  13. NASA Risk Management Handbook. Version 1.0

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Maggio, Gaspare; Stamatelatos, Michael; Youngblood, Robert; Guarro, Sergio; Rutledge, Peter; Sherrard, James; Smith, Curtis; hide

    2011-01-01

    The purpose of this handbook is to provide guidance for implementing the Risk Management (RM) requirements of NASA Procedural Requirements (NPR) document NPR 8000.4A, Agency Risk Management Procedural Requirements [1], with a specific focus on programs and projects, and applying to each level of the NASA organizational hierarchy as requirements flow down. This handbook supports RM application within the NASA systems engineering process, and is a complement to the guidance contained in NASA/SP-2007-6105, NASA Systems Engineering Handbook [2]. Specifically, this handbook provides guidance that is applicable to the common technical processes of Technical Risk Management and Decision Analysis established by NPR 7123.1A, NASA Systems Engineering Processes and Requirements [3]. These processes are part of the "Systems Engineering Engine" (Figure 1) that is used to drive the development of the system and associated work products to satisfy stakeholder expectations in all mission execution domains, including safety, technical, cost, and schedule. Like NPR 7123.1A, NPR 8000.4A is a discipline-oriented NPR that intersects with product-oriented NPRs such as NPR 7120.5D, NASA Space Flight Program and Project Management Requirements [4]; NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Management Requirements [5]; and NPR 7120.8, NASA Research and Technology Program and Project Management Requirements [6]. In much the same way that the NASA Systems Engineering Handbook is intended to provide guidance on the implementation of NPR 7123.1A, this handbook is intended to provide guidance on the implementation of NPR 8000.4A.

    1.2 Scope and Depth

    This handbook provides guidance for conducting RM in the context of NASA program and project life cycles, which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy.
The guidance in this handbook is not meant

  14. Design for Reliability and Safety Approach for the New NASA Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Weldon, Danny M.

    2007-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program, called Constellation, intended for sending crew and cargo to the International Space Station (ISS), to the moon, and beyond. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all phases of the program life cycle. This paper discusses the "Design for Reliability and Safety" approach for the new NASA launch vehicles, the ARES I and ARES V. Specifically, the paper addresses the use of an integrated probabilistic functional analysis to support the design analysis cycle and a probabilistic risk assessment (PRA) to support the preliminary design and beyond.

  15. The NASA Risk Management Program

    NASA Technical Reports Server (NTRS)

    Buchbinder, Benjamin

    1990-01-01

    This paper describes the NASA Risk Management Program established by the Headquarters Office of Safety and Mission Quality (SMQ). Current agency policy is outlined, risk management assistance to the field is described, and examples are given of independent risk assessments conducted by SMQ. The motivation for and structure of the program are placed in the historical context of the pre- and post-Challenger environments.

  16. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
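
    The single-time-line idea can be sketched with a queue of randomly scheduled events. This is a toy illustration, not the IMM itself: the event names and rates below are hypothetical, and occurrence times are assumed exponentially distributed.

```python
import heapq
import random

def simulate_mission(duration, hazard_rates, rng):
    """One dPRA trial: place each hazard's first occurrence on a single time line."""
    queue = []
    for name, rate in hazard_rates.items():
        t = rng.expovariate(rate)        # first occurrence time for this hazard
        if t < duration:
            heapq.heappush(queue, (t, name))
    timeline = []
    while queue:                         # the scheduler pops events in time order
        timeline.append(heapq.heappop(queue))
    return timeline

rng = random.Random(1)
rates = {"minor_medical": 1 / 30.0, "equipment_fault": 1 / 90.0}  # per day, hypothetical
trials = [simulate_mission(180.0, rates, rng) for _ in range(10_000)]
p_any = sum(1 for timeline in trials if timeline) / len(trials)
print(f"P(at least one event in 180 days) = {p_any:.4f}")
```

    The dependency handling that distinguishes dPRA would enter here as rules that modify downstream rates or outcomes based on which events have already occurred on the time line.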

  17. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, phi. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would occupy up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model fitted to the historical database of measurements of protons with energy > 30 MeV, (phi)30. The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPEs and GCR supports mission design and operational planning to manage radiation risks for space exploration.
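
    SPE occurrence under a nonhomogeneous Poisson model can be sampled by the standard thinning (acceptance-rejection) method. The intensity function below is an illustrative stand-in peaking mid-cycle, not the authors' fitted (phi)30 model.

```python
import math
import random

def sample_nhpp(rate_fn, t_max, rate_max, rng):
    """Sample event times of a nonhomogeneous Poisson process by thinning."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)            # candidate from a homogeneous process
        if t > t_max:
            return times
        if rng.random() < rate_fn(t) / rate_max:  # accept with probability rate/rate_max
            times.append(t)

def spe_rate(t):
    # Hypothetical SPE intensity (events/year) over an 11-year solar cycle,
    # peaking near solar maximum; illustrative shape only.
    return 0.5 + 6.0 * math.sin(math.pi * t / 11.0) ** 2

rng = random.Random(7)
counts = [len(sample_nhpp(spe_rate, 11.0, 6.5, rng)) for _ in range(5_000)]
mean_count = sum(counts) / len(counts)
print(f"Mean SPEs per simulated cycle = {mean_count:.1f}")
```

    The mean count converges to the integral of the intensity over the cycle (38.5 events for this toy intensity), while the trial-to-trial spread is what drives the probabilistic dose assessment.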

  18. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop the models include approved and accepted design information regarding equipment, components, systems, and failure data. This methodology essentially validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate, or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, resulting in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, PRA can move beyond validation to participate in design development, helping deliver a safe and cost-effective product.

  19. NASA Risk-Informed Decision Making Handbook

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Stamatelatos, Michael; Maggio, Gaspare; Everett, Christopher; Youngblood, Robert; Rutledge, Peter; Benjamin, Allan; Williams, Rodney; Smith, Curtis; Guarro, Sergio

    2010-01-01

    This handbook provides guidance for conducting risk-informed decision making in the context of NASA risk management (RM), with a focus on the types of direction-setting key decisions that are characteristic of the NASA program and project life cycles, and which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy. The guidance in this handbook is not meant to be prescriptive. Instead, it is meant to be general enough, and contain a sufficient diversity of examples, to enable the reader to adapt the methods as needed to the particular decision problems that he or she faces. The handbook highlights major issues to consider when making decisions in the presence of potentially significant uncertainty, so that the user is better able to recognize and avoid pitfalls that might otherwise be experienced.

  20. Design for Reliability and Safety Approach for the NASA New Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal, M.; Weldon, Danny M.

    2007-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program, called Constellation, intended for sending crew and cargo to the International Space Station (ISS), to the moon, and beyond. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all phases of the program life cycle. This paper discusses the "Design for Reliability and Safety" approach for the new NASA crew launch vehicle, the ARES I. The ARES I is being developed by NASA Marshall Space Flight Center (MSFC) in support of the Constellation program. It consists of three major elements: a solid First Stage (FS), an Upper Stage (US), and a liquid Upper Stage Engine (USE). Stacked on top of the ARES I is the Crew Exploration Vehicle (CEV), which consists of a Launch Abort System (LAS), Crew Module (CM), Service Module (SM), and Spacecraft Adapter (SA). The CEV development is being led by NASA Johnson Space Center (JSC). Designing for high reliability and safety requires a good integrated working environment and a sound technical design approach. The "Design for Reliability and Safety" approach addressed in this paper covers both the environment and the technical process put in place to support the ARES I design. To address the integrated working environment, the ARES I project office has established a risk-based design group called the "Operability Design and Analysis" (OD&A) group, intended to bring the engineering, design, and safety organizations together to optimize the system design for safety, reliability, and cost. On the technical side, the ARES I project has, through the OD

  1. Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie

    2006-01-01

    A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. 
The tool

  2. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million per year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively.
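
    The expected annual damage (EAD) figures reported here are conventionally obtained by integrating damage over annual exceedance probability. The sketch below shows the standard trapezoidal calculation; the return periods and damage values are hypothetical, not the study's data.

```python
def expected_annual_damage(return_periods, damages):
    """EAD = integral of damage over annual exceedance probability (trapezoidal rule)."""
    probs = [1.0 / rp for rp in return_periods]   # exceedance probability of each event
    pairs = sorted(zip(probs, damages))           # ascending probability
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical damage estimates (in £m) for a set of return periods (years)
rps = [2, 5, 10, 30, 100, 1000]
dmg = [0, 20, 60, 150, 400, 900]
ead = expected_annual_damage(rps, dmg)
print(f"EAD = £{ead:.1f}m per year")
```

    Repeating this calculation over an ensemble of probabilistic precipitation projections, as the study does, yields a distribution of EAD rather than a single value.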

  3. Development of Risk Uncertainty Factors from Historical NASA Projects

    NASA Technical Reports Server (NTRS)

    Amer, Tahani R.

    2011-01-01

    NASA is a good investment of federal funds and strives to provide the best value to the nation. NASA has consistently budgeted to unrealistic cost estimates, which is evident in the cost growth of many of its programs. In this investigation, NASA has been using available uncertainty factors from the Aerospace Corporation, the Air Force, and Booz Allen Hamilton to develop project risk postures. NASA has no insight into the development of these factors and, as demonstrated here, this can lead to unrealistic risk estimates in many NASA programs and projects (P/p). The primary contribution of this project is the development of NASA mission uncertainty factors from actual historical NASA projects, to aid cost estimating as well as independent reviews, which provide NASA senior management with the information and analysis needed to make appropriate decisions regarding P/p. In general terms, this research project advances programmatic analysis for NASA projects.

  4. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are subject to flooding every year. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. 
Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to

  5. The NASA Continuous Risk Management Process

    NASA Technical Reports Server (NTRS)

    Pokorny, Frank M.

    2004-01-01

    As an intern this summer in the GRC Risk Management Office, I have become familiar with the NASA Continuous Risk Management Process. In this process, risk is considered in terms of the probability that an undesired event will occur and the impact of the event, should it occur (ref., NASA-NPG: 7120.5). Risk management belongs in every part of every project and should be ongoing from start to finish. Another key point is that a risk is not a problem until it has happened. With that in mind, there is a six-step cycle for continuous risk management that prevents risks from becoming problems. The steps are: identify, analyze, plan, track, control, and communicate & document. Incorporated in the first step are several methods to identify risks such as brainstorming and using lessons learned. Once a risk is identified, a risk statement is made on a risk information sheet consisting of a single condition and one or more consequences. There can also be a context section where the risk is explained in more detail. Additionally there are three main goals of analyzing a risk, which are evaluate, classify, and prioritize. Here is where a value is given to the attributes of a risk (i.e., probability, impact, and timeframe) based on a multi-level classification system (e.g., low, medium, high). It is important to keep in mind that the definitions of these levels are probably different for each project. Furthermore the risks can be combined into groups. Then, the risks are prioritized to see what risk is necessary to mitigate first. After the risks are analyzed, a plan is made to mitigate as many risks as feasible. Each risk should be assigned to someone in the project with knowledge in the area of the risk. Then the possible approaches to choose from are: research, accept, watch, or mitigate. Next, all risks, mitigated or not, are tracked either individually or in groups. As the plan is executed, risks are re-evaluated, and the attribute values are adjusted as necessary. Metrics
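
    The analyze step (evaluate, classify, prioritize) can be sketched as a simple probability-impact scoring. The level definitions and example risks below are invented; as the text notes, each project defines its own levels.

```python
# Project-specific classification levels for probability and impact (hypothetical)
LEVELS = {"low": 1, "medium": 2, "high": 3}

def prioritize(risks):
    """Rank risk statements by probability-times-impact score, highest first."""
    scored = [(LEVELS[prob] * LEVELS[impact], name) for name, prob, impact in risks]
    return [name for _, name in sorted(scored, reverse=True)]

risks = [
    ("thruster valve sticks", "low", "high"),
    ("software fault in guidance", "medium", "high"),
    ("late part delivery", "high", "medium"),
]
print(prioritize(risks))
```

    A real risk information sheet would also carry the condition/consequence statement, timeframe, and the chosen approach (research, accept, watch, or mitigate) alongside the score.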

  6. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method that accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
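    The joint probability curve (JPC) idea can be sketched as integrating a species sensitivity (toxicity) distribution over an exposure distribution to estimate the risk delta. All distribution parameters below are invented for illustration and are not the Penrhyn Estuary values.

```python
import math
import random

random.seed(1)

def lognorm_cdf(x, mu, sigma):
    """CDF of a lognormal with log-space mean mu and std dev sigma."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

# Illustrative log-space parameters for the exposure distribution and the
# NOEC-based species sensitivity distribution (units: ug/L).
exp_mu, exp_sigma = math.log(5.0), 0.8
ssd_mu, ssd_sigma = math.log(50.0), 1.0

# delta: expected fraction of species affected, obtained by integrating the
# sensitivity distribution over the exposure distribution via Monte Carlo.
n = 100_000
delta = sum(lognorm_cdf(random.lognormvariate(exp_mu, exp_sigma),
                        ssd_mu, ssd_sigma)
            for _ in range(n)) / n
```

    Repeating the calculation with EC50-based rather than NOEC-based sensitivity parameters gives the second toxicity scenario described above.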

  7. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
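    The integration of probabilistic exposure estimates with exposure-response relationships can be illustrated as follows. The exposure bins and the logistic exposure-response function are assumptions for illustration, not the EPA models used in the report.

```python
import math

# Illustrative exposure distribution (fraction of person-time in each 1-h
# ozone bin, ppm) and an assumed logistic exposure-response relationship.
exposure_prob = {0.06: 0.70, 0.10: 0.25, 0.14: 0.05}

def response_rate(conc_ppm):
    """Assumed probability of an acute effect at a given 1-h ozone level."""
    return 1.0 / (1.0 + math.exp(-(conc_ppm - 0.12) / 0.02))

# Overall risk: integrate the response over the exposure distribution.
overall_risk = sum(p * response_rate(c) for c, p in exposure_prob.items())
```

    Re-running the sum with exposure distributions simulated under alternative standards is what produces the comparative risk results described above.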

  8. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    The study presents a risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations, and its application. A rigorous probabilistic model for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely ultrasonic inspection with an identified flaw indication and ultrasonic inspection without a flaw indication, are considered in the derivation. To estimate fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic steam turbine rotor application, and a risk analysis under given safety criteria is provided to support maintenance planning.
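    A sketch of the flaw-size PDF derivation for the "inspection with an identified flaw indication" scenario: the posterior over the actual flaw size is proportional to the probability of detection times the sizing-measurement likelihood times the prior. The POD curve, sizing uncertainty, and flat prior below are assumptions for illustration, not the paper's calibrated models.

```python
import math

def pod(a_mm):
    """Assumed model-assisted probability of detection vs. flaw size (mm)."""
    return 1.0 / (1.0 + math.exp(-(math.log(a_mm) - math.log(1.0)) / 0.3))

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Scenario: the inspection returned an indication sized at 2.0 mm, with an
# assumed 0.5 mm sizing uncertainty.  Posterior over the actual flaw size on
# a grid, with a flat prior: post(a) ~ POD(a) * likelihood(measured | a).
grid = [0.05 * i for i in range(1, 201)]          # 0.05 .. 10.0 mm
post = [pod(a) * norm_pdf(2.0, a, 0.5) for a in grid]
z = sum(post)
post = [p / z for p in post]                      # normalized grid PDF
mean_flaw = sum(a * p for a, p in zip(grid, post))
```

    This flaw-size PDF is the quantity that then feeds the fatigue reliability and remaining-useful-life estimates.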

  9. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern, open-source, and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  10. A probabilistic topic model for clinical risk stratification from electronic health records.

    PubMed

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner and, at the cohort level, often offer little insight beyond a flat score-based segmentation of the labeled clinical dataset. To this end, this paper proposes a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, it proposes a novel probabilistic topic modeling framework called the probabilistic risk stratification model (PRSM), based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient's clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), that incorporates minimum prior information into the model in order to improve risk stratification accuracy and to make our models highly portable to risk stratification tasks for various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machines, and the comparison showed the effectiveness of our models in risk stratification of CHD in terms of Area Under the receiver operating characteristic Curve (AUC) analysis. Moreover, in comparison with PRSM, WS-PRSM achieved an over 2% performance gain on the experimental dataset, demonstrating that

  11. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to solar particle events; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  12. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon

    2008-01-01

    NASA's Agency-wide resource for identifying and managing risks associated with changing environmental regulations. Goals of the RRAC PC: 1) proactively detect, analyze, and communicate environmental regulatory risks to NASA Programs and facilities; 2) communicate with regulators and participate in the mitigation of such risks; and 3) provide centralized support on emerging regulations to the NASA HQ Environmental Management Division. When significant regulatory changes are identified, timely communication is essential: communication of changing requirements to the regulatory stakeholders (NASA Programs and facilities), and communication of potential issues to management and, when appropriate, back to the regulating agency.

  13. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated flood magnitudes and frequencies the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the

  14. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. Copyright © 2015 Elsevier B.V. All rights reserved.
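    The risk quantification step, comparing the predicted environmental concentration (PEC) with the PNEC distribution derived from a probabilistic species sensitivity distribution, can be sketched as a Monte Carlo estimate of the probability that a sampled concentration exceeds a sampled no-effect concentration. The lognormal parameters below are illustrative, not the modeled nano-silica values.

```python
import math
import random

random.seed(0)

# Illustrative log-space parameters for the PEC in surface water and for the
# PNEC distribution derived from a probabilistic SSD (units: ug/L).
pec_mu, pec_sigma = math.log(0.12), 1.2
pnec_mu, pnec_sigma = math.log(20.0), 1.0

# Risk: probability that a sampled environmental concentration exceeds a
# sampled no-effect concentration.
n = 100_000
exceed = sum(random.lognormvariate(pec_mu, pec_sigma) >
             random.lognormvariate(pnec_mu, pnec_sigma)
             for _ in range(n)) / n
```

    A small exceedance probability, as here, corresponds to the "no current risk" conclusion drawn in the study.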

  15. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
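    The Weibull-based probability-of-failure calculation that underlies CARES/Life-style reliability analysis of brittle components can be sketched as follows; the characteristic strength and Weibull modulus are assumed material values for illustration, not CARES/Life outputs.

```python
import math

def weibull_pof(stress_mpa, scale_mpa=400.0, modulus=10.0, volume_ratio=1.0):
    """Two-parameter Weibull probability of failure for a brittle component.

    scale_mpa (characteristic strength) and modulus (Weibull m) are assumed
    material values; volume_ratio scales for the stressed volume.
    """
    return 1.0 - math.exp(-volume_ratio * (stress_mpa / scale_mpa) ** modulus)

# With a high Weibull modulus, failure probability rises steeply with stress:
low, high = weibull_pof(200.0), weibull_pof(400.0)
```

    In a probabilistic design loop such as ANSYS PDS, the stress argument itself becomes a random response variable, so the failure probability is tracked as a stochastic quantity, as the abstract describes.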

  16. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in surface or shallow sub-surface repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the surface storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that decomposes the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. the cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.

  17. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
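    In the spirit of the integrated analysis described above, propagating uncertain basic-event probabilities through a small fault tree with Monte Carlo simulation can be sketched as follows; the tree structure and probability ranges are invented for illustration and do not represent the Swedish system studied.

```python
import random

random.seed(2)

def top_event(p_source, p_treatment_line, p_network):
    """Quantity failure: source OR treatment OR network fails (OR gate);
    treatment fails only if both parallel treatment lines fail (AND gate).
    Basic events are assumed independent."""
    p_treatment = p_treatment_line ** 2
    return 1.0 - (1.0 - p_source) * (1.0 - p_treatment) * (1.0 - p_network)

# Expert-judgement uncertainty in basic-event probabilities expressed as
# uniform ranges, propagated with Monte Carlo simulation.
samples = [top_event(random.uniform(0.001, 0.01),
                     random.uniform(0.01, 0.05),
                     random.uniform(0.002, 0.02))
           for _ in range(50_000)]
mean_p_fail = sum(samples) / len(samples)
```

    Multiplying a failure probability like this by affected consumers and outage duration is one way to arrive at a Customer Minutes Lost-type risk measure.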

  18. A Probabilistic Assessment of NASA Ultra-Efficient Engine Technologies for a Large Subsonic Transport

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Jones, Scott M.; Arcara, Philip C., Jr.; Haller, William J.

    2004-01-01

    NASA's Ultra Efficient Engine Technology (UEET) program features advanced aeropropulsion technologies that include highly loaded turbomachinery, an advanced low-NOx combustor, high-temperature materials, intelligent propulsion controls, aspirated seal technology, and an advanced computational fluid dynamics (CFD) design tool to help reduce airplane drag. A probabilistic system assessment is performed to evaluate the impact of these technologies on aircraft fuel burn and NOx reductions. A 300-passenger aircraft, with two 396-kN thrust (85,000-pound) engines is chosen for the study. The results show that a large subsonic aircraft equipped with the UEET technologies has a very high probability of meeting the UEET Program goals for fuel-burn (or equivalent CO2) reduction (15% from the baseline) and LTO (landing and takeoff) NOx reductions (70% relative to the 1996 International Civil Aviation Organization rule). These results are used to provide guidance for developing a robust UEET technology portfolio, and to prioritize the most promising technologies required to achieve UEET program goals for the fuel-burn and NOx reductions.

  19. Probabilistic assessment of roadway departure risk in a curve

    NASA Astrophysics Data System (ADS)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though drastic policy against overspeeding contributes to reducing accidents, other factors obviously exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled by random variables. These parameters are selected through a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Then, structural reliability methods are employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement warning systems alerting the driver in case of dangerous situations.
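    The structural reliability idea, estimating the probability that a limit-state function is violated given random initial conditions, can be sketched with a point-mass curve model. The vehicle model and the distributions below are drastic simplifications for illustration, not the SARI vehicle dynamics model.

```python
import random

random.seed(3)

RADIUS_M = 120.0   # assumed curve radius
G = 9.81

def departs(speed_ms, friction_coeff):
    """Limit state: departure when the required lateral friction v^2/(g*R)
    exceeds the available road friction (flat-road, point-mass model)."""
    return speed_ms ** 2 / (G * RADIUS_M) > friction_coeff

# Random initial conditions at curve entry (assumed distributions): speed
# and available tire-road friction.
n = 100_000
p_departure = sum(departs(random.gauss(22.0, 2.5),
                          random.uniform(0.4, 0.8))
                  for _ in range(n)) / n
```

    Conditioning this estimate on the measured entry speed is what would let a warning system trigger only for dangerous situations.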

  20. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is no longer simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) the lack of knowledge of input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) over various time horizons. The LandSim-simulated concentrations of six heavy metals in groundwater have then been used in the health risk model. The water intake, exposure duration, exposure frequency, bioavailability, and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female populations has been quantified and found to be high (HI>1) for all the considered time horizons, which clearly shows the possibility of adverse health effects on the population residing near the Turbhe landfill.
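    A sketch of the hybrid treatment: fuzzy variables enter as intervals at a chosen alpha-cut, while probabilistic variables are sampled by Monte Carlo, so each draw yields an interval-valued hazard index. All numerical values below (intakes, concentrations, reference dose) are illustrative assumptions, not the Turbhe landfill inputs.

```python
import random

random.seed(4)

# Fuzzy variables at a chosen alpha-cut become intervals (assumed values):
INTAKE_L_PER_DAY = (1.5, 2.5)        # water intake
EF_DAYS_PER_YEAR = (300.0, 365.0)    # exposure frequency

def hq_interval(conc_mg_l, bw_kg, rfd=0.02, ed_years=30.0, at_days=30 * 365.0):
    """Interval-valued hazard quotient HQ = C*IR*EF*ED/(BW*AT*RfD) for one
    metal: concentration and body weight are sampled; intake and exposure
    frequency are alpha-cut intervals."""
    base = conc_mg_l * ed_years / (bw_kg * at_days * rfd)
    return (base * INTAKE_L_PER_DAY[0] * EF_DAYS_PER_YEAR[0],
            base * INTAKE_L_PER_DAY[1] * EF_DAYS_PER_YEAR[1])

# Monte Carlo over the probabilistic inputs; the hazard index is the sum of
# the HQ intervals of two (illustrative) metals per draw.
his = []
for _ in range(20_000):
    bw = random.gauss(70.0, 10.0)
    hq1 = hq_interval(random.lognormvariate(-3.0, 0.5), bw)
    hq2 = hq_interval(random.lognormvariate(-2.5, 0.5), bw)
    his.append((hq1[0] + hq2[0], hq1[1] + hq2[1]))
frac_upper_exceeds_1 = sum(hi > 1.0 for lo, hi in his) / len(his)
```

    Sweeping the alpha-cut from 0 to 1 would trace out the fuzzy envelope of the HI distribution that the paper characterizes.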

  1. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, the body burden concentrations observed fall into the distributions simulated at both sites.

  2. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude); (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events); (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem; and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis; (2) the mathematically consistent treatment of uncertainties; and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its
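    The risk quantification in steps (1)-(4) reduces to combining class frequencies of initiating events with conditional exceedance probabilities; a sketch with invented values, including the lifetime criterion mentioned above:

```python
# Earthquake size classes with annual frequencies of initiating events and
# the conditional probability that the bounding scenario of each class
# exceeds a critical design parameter (all values invented for illustration).
classes = [
    ("M5.0-5.9", 2.0e-2, 0.001),
    ("M6.0-6.9", 3.0e-3, 0.05),
    ("M7.0+",    2.0e-4, 0.40),
]

annual_risk = sum(freq * p_exceed for _, freq, p_exceed in classes)

# Explicit consideration of the structure's lifetime as a risk criterion:
lifetime_years = 40.0
lifetime_risk = 1.0 - (1.0 - annual_risk) ** lifetime_years
```

    Comparing lifetime_risk against a risk goal, rather than the annual rate alone, is the lifetime criterion the method makes explicit.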

  3. Are engineered nano iron oxide particles safe? an environmental risk assessment by probabilistic exposure, effects and risk modeling.

    PubMed

    Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd

    2016-12-01

    Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/l. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by PNEC values, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, a better understanding of the hazards of nano-FeOX to the organisms in other ecosystems (such as sediment) needs to be assessed to determine the overall risk of these particles to the environment.

  4. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  5. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  6. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  7. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon; Duda, Kristen

    2008-01-01

    This viewgraph presentation gives an overview of NASA's risk analysis communication programs associated with changing environmental policies. The topics include: 1) NASA Program Transition; 2) Principal Center for Regulatory Risk Analysis and Communication (RRAC PC); and 3) Regulatory Tracking and Communication Process.

  8. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
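
    The Monte Carlo framework this abstract describes (sample uncertain impact parameters, compute consequences per scenario, aggregate into outcome distributions) can be sketched as follows. Every distribution, scaling constant, and the damage-radius relation here is an illustrative assumption, not the published PAIR model:

```python
import math
import random

def sample_impact_scenario(rng):
    """Draw one hypothetical impact scenario from assumed distributions."""
    # Assumed heavy-tailed size distribution (illustrative, not NASA's fit):
    diameter_m = 20.0 * (1.0 - rng.random()) ** -0.5   # >= 20 m
    density = rng.uniform(1500.0, 3500.0)              # kg/m^3, stony range
    velocity = rng.uniform(12e3, 25e3)                 # m/s entry speed
    mass = density * math.pi / 6.0 * diameter_m ** 3
    energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15  # kinetic energy, Mt TNT
    # Crude damage-radius scaling with energy (placeholder exponent):
    damage_radius_km = 2.0 * energy_mt ** (1.0 / 3.0)
    pop_density = rng.lognormvariate(1.0, 2.0)         # people/km^2, assumed lognormal
    affected = math.pi * damage_radius_km ** 2 * pop_density
    return energy_mt, affected

def run_pair_like_model(n_scenarios=10_000, seed=42):
    """Aggregate scenario outcomes into a distribution, as in the PAIR approach."""
    rng = random.Random(seed)
    outcomes = [sample_impact_scenario(rng) for _ in range(n_scenarios)]
    affected = sorted(a for _, a in outcomes)
    # Report median and 95th percentile of affected population
    return affected[n_scenarios // 2], affected[int(0.95 * n_scenarios)]

median, p95 = run_pair_like_model()
print(f"median affected: {median:.0f}, 95th percentile: {p95:.0f}")
```

    A risk tolerance posture can then be applied to such output distributions, e.g. finding the smallest diameter whose affected-population percentile exceeds a chosen threshold.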

  10. Probabilistic Risk Assessment for Bone Fracture - Bone Fracture Risk Module (BFxRM)

    NASA Technical Reports Server (NTRS)

    Licata, Angelo; Myers, Jerry G.; Lewandowski, Beth

    2013-01-01

This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM). The overview includes an assessment of the strengths and limitations of the BFxRM and proposes a number of discussion questions to the panel regarding future development avenues for this simulation system.

  11. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Duda, Kristen; Scroggins, Sharon

    2008-01-01

    NASA's mission is to pioneer the future in space exploration, scientific discovery, and aeronautics research. To help enable existing and future programs to pursue this mission, NASA has established the Principal Center for Regulatory Risk Analysis and Communication (RRAC PC) to proactively identify, analyze, and communicate environmental regulatory risks to the NASA community. The RRAC PC is chartered to evaluate the risks posed to NASA Programs and facilities by environmentally related drivers. The RRAC PC focuses on emerging environmental regulations, as well as risks related to operational changes that can trigger existing environmental requirements. Changing regulations have the potential to directly affect program activities. For example, regulatory changes can restrict certain activities or operations by mandating changes in how operations may be done or limiting where or how certain operations can take place. Regulatory changes also can directly affect the ability to use certain materials by mandating a production phase-out or restricting usage applications of certain materials. Such changes can result in NASA undertaking material replacement efforts. Even if a regulation does not directly affect NASA operations, U.S. and international regulations can pose program risks indirectly through requirements levied on manufacturers and vendors of components and materials. For example, manufacturers can change their formulations to comply with new regulatory requirements. Such changes can require time-consuming and costly requalification certification for use in human spaceflight programs. The RRAC PC has implemented several strategies for proactively managing regulatory change to minimize potential adverse impacts to NASA Programs and facilities. This presentation highlights the lessons learned through establishing the RRAC PC, the process by which the RRAC PC monitors and distributes information about emerging regulatory requirements, and the cross

  12. NASA's Agency-wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

Duda, Kristen; Scroggins, Sharon

    2008-01-01

NASA's mission is to pioneer the future in space exploration, scientific discovery, and aeronautics research. To help enable existing and future programs to pursue this mission, NASA has established the Principal Center for Regulatory Risk Analysis and Communication (RRAC PC) to proactively identify, analyze, and communicate environmental regulatory risks to the NASA community. The RRAC PC is chartered to evaluate the risks posed to NASA Programs and facilities by environmentally related drivers. The RRAC PC focuses on emerging environmental regulations, as well as risks related to operational changes that can trigger existing environmental requirements. Changing regulations have the potential to directly affect program activities. For example, regulatory changes can restrict certain activities or operations by mandating changes in how operations may be done or limiting where or how certain operations can take place. Regulatory changes also can directly affect the ability to use certain materials by mandating a production phase-out or restricting usage applications of certain materials. Such changes can result in NASA undertaking material replacement efforts. Even if a regulation does not directly affect NASA operations, U.S. and international regulations can pose program risks indirectly through requirements levied on manufacturers and vendors of components and materials. For example, manufacturers can change their formulations to comply with new regulatory requirements. Such changes can require time-consuming and costly requalification certification for use in human spaceflight programs. The RRAC PC has implemented several strategies for proactively managing regulatory change to minimize potential adverse impacts to NASA Programs and facilities. This presentation highlights the lessons learned through establishing the RRAC PC, the process by which the RRAC PC monitors and distributes information about emerging regulatory requirements, and the cross

  13. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  14. A Probabilistic Risk Analysis (PRA) of Human Space Missions for the Advanced Integration Matrix (AIM)

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merrill, Robin L.; Thomas, Gretchen A.

    2003-01-01

The Advanced Integration Matrix (AIM) Project will study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO), through the design and development of a ground-based facility for developing revolutionary integrated systems for joint human-robotic missions. This paper describes a Probabilistic Risk Analysis (PRA) of human space missions that was developed to help define the direction and priorities for AIM. Risk analysis is required for all major NASA programs and has been used for shuttle, station, and Mars lander programs. It is a prescribed part of early planning and is necessary during concept definition, even before mission scenarios and system designs exist. PRA can begin when little failure data are available, and be continually updated and refined as detail becomes available. PRA provides a basis for examining tradeoffs among safety, reliability, performance, and cost. The objective of AIM's PRA is to indicate how risk can be managed and future human space missions enabled by the AIM Project. Many critical events can cause injuries and fatalities to the crew without causing loss of vehicle or mission. Some critical systems are beyond AIM's scope, such as propulsion and guidance. Many failure-causing events can be mitigated by conducting operational tests in AIM, such as testing equipment and evaluating operational procedures, especially in the areas of communications and computers, autonomous operations, life support, thermal design, EVA and rover activities, physiological factors including habitation, medical equipment, and food, and multifunctional tools and repairable systems. AIM is well suited to test and demonstrate the habitat, life support, crew operations, and human interface. Because these account for significant crew, systems performance, and science risks, AIM will help reduce mission risk, and missions beyond LEO are far enough in the future that AIM can have significant impact.

  15. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  16. Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS.

    PubMed

    Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

    2009-12-20

This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from 5 wastewater treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS). The model was implemented into a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, performed each month during one calendar year, showed a significant seasonal variation of risk while revealing that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances, which might be causing sublethal effects in the aquatic species present in the Henares River.

  17. Extravehicular Activity Probabilistic Risk Assessment Overview for Thermal Protection System Repair on the Hubble Space Telescope Servicing Mission

    NASA Technical Reports Server (NTRS)

    Bigler, Mark; Canga, Michael A.; Duncan, Gary

    2010-01-01

    The Shuttle Program initiated an Extravehicular Activity (EVA) Probabilistic Risk Assessment (PRA) to assess the risks associated with performing a Shuttle Thermal Protection System (TPS) repair during the Space Transportation System (STS)-125 Hubble repair mission as part of risk trades between TPS repair and crew rescue.

  18. NASA's human system risk management approach and its applicability to commercial spaceflight.

    PubMed

    Law, Jennifer; Mathers, Charles H; Fondy, Susan R E; Vanderploeg, James M; Kerstman, Eric L

    2013-01-01

    As planning continues for commercial spaceflight, attention is turned to NASA to assess whether its human system risk management approach can be applied to mitigate the risks associated with commercial suborbital and orbital flights. NASA uses a variety of methods to assess the risks to the human system based on their likelihood and consequences. In this article, we review these methods and categorize the risks in the system as "definite," "possible," or "least" concern for commercial spaceflight. As with career astronauts, these risks will be primarily mitigated by screening and environmental control. Despite its focus on long-duration exploration missions, NASA's human system risk management approach can serve as a preliminary knowledge base to help medical planners prepare for commercial spaceflights.

  19. Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires

    Treesearch

    Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes

    2010-01-01

Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...

  20. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information-linguistic, probabilistic, and possibilistic-exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, and fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  1. NASA's Human Research Program at The Glenn Research Center: Progress and Opportunities

    NASA Technical Reports Server (NTRS)

    Nall, Marsha; Griffin, DeVon; Myers, Jerry; Perusek, Gail

    2008-01-01

The NASA Human Research Program is aimed at correcting problems in critical areas that place NASA human spaceflight missions at risk due to shortfalls in astronaut health, safety and performance. The Glenn Research Center (GRC) and partners from Ohio are significant contributors to this effort. This presentation describes several areas of GRC emphasis, the first being NASA's path to creating exercise hardware requirements and protocols that mitigate the effects of long duration spaceflight. Computational simulations will be a second area that is discussed. This includes deterministic models that simulate the effects of spaceflight on the human body, as well as probabilistic models that bound and quantify the probability that adverse medical incidents will happen during an exploration mission. Medical technology development for exploration will be the final area to be discussed.

  2. Implementation of Risk Management in NASA's CEV Project- Ensuring Mission Success

    NASA Astrophysics Data System (ADS)

    Perera, Jeevan; Holsomback, Jerry D.

    2005-12-01

Most project managers know that Risk Management (RM) is essential to good project management. At NASA, standards and procedures to manage risk through a tiered approach have been developed - from the global agency-wide requirements down to a program or project's implementation. The basic methodology for NASA's risk management strategy includes processes to identify, analyze, plan, track, control, communicate and document risks. The identification, characterization, mitigation plan, and mitigation responsibilities associated with specific risks are documented to help communicate, manage, and effectuate appropriate closure. This approach helps to ensure more consistent documentation and assessment and provides a means of archiving lessons learned for future identification or mitigation activities. A new risk database and management tool was developed by NASA in 2002 and has since been used successfully to communicate, document and manage a number of diverse risks for the International Space Station, Space Shuttle, and several other NASA projects and programs, including at the Johnson Space Center. Organizations use this database application to effectively manage and track each risk and gain insight into impacts from other organizations' viewpoints to develop integrated solutions. Schedule, cost, technical and safety issues are tracked in detail through this system. Risks are tagged within the system to ensure proper review, coordination and management at the necessary management level. The database is intended as a day-to-day tool for organizations to manage their risks and elevate those issues that need coordination from above. Each risk is assigned to a managing organization and a specific risk owner who generates mitigation plans as appropriate. In essence, the risk owner is responsible for shepherding the risk through closure. The individual that identifies a new risk does not necessarily get assigned as the risk owner. Whoever is in the best position to effectuate

  3. The NASA Space Radiobiology Risk Assessment Project

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee

The current first phase (2006-2011) has three major goals: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life-table and radiation quality functions; 2) the integration of biophysical models of acute radiation syndromes; and 3) the development of new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and of relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) development of models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point to continue NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a Web-based data resource of NSRL results, and a space radiation Wikipedia are described.

  4. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

randomly sampling the probability distributions of the electrolyte concentrations and system parameters that are inputs into the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information used from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA STD 7009 requirements.

  5. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder

    2011-11-01

As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance the security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  6. Probabilistic health risk assessment for arsenic intake through drinking groundwater in Taiwan's Pingtung Plain

    NASA Astrophysics Data System (ADS)

    Liang, C. P.; Chen, J. S.

    2017-12-01

An abundant and inexpensive supply of groundwater is used to meet the drinking, agriculture, and aquaculture requirements of residents in the Pingtung Plain. Long-term groundwater quality monitoring data indicate that the As content of groundwater in the Pingtung Plain exceeds the maximum level of 10 μg/L recommended by the World Health Organization (WHO). The situation is further complicated by the fact that only 46.89% of the population in the Pingtung Plain has been served with tap water, far below the national average of 92.93%. Considering the considerable variation in measured concentrations, from below the detection limit (<0.1 μg/L) to a maximum of 544 μg/L, as well as in the consumption rate and body weight of the individual, the conventional approach to conducting a human health risk assessment may be insufficient for health risk management. This study presents a probabilistic risk assessment for inorganic As intake through the consumption of drinking groundwater by local residents in the Pingtung Plain. The probabilistic risk assessment is achieved using a Monte Carlo simulation technique based on the hazard quotient (HQ) and target cancer risk (TR) established by the U.S. Environmental Protection Agency. This study demonstrates the importance of individual variability in inorganic As intake through drinking groundwater consumption when evaluating a high-exposure sub-group of the population who drink high-As-content groundwater.
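
    The HQ/TR Monte Carlo calculation described above can be sketched roughly as below. The exposure distributions (concentration, intake rate, body weight) are invented for illustration; the reference dose and slope factor are the standard EPA IRIS screening values for inorganic arsenic, not necessarily the exact parameters used in this study:

```python
import random

# EPA IRIS screening values for inorganic arsenic (treat as illustrative inputs):
RFD = 0.0003        # oral reference dose, mg/kg/day
SLOPE_FACTOR = 1.5  # oral cancer slope factor, (mg/kg/day)^-1

def simulate_hq_tr(n=20_000, seed=1):
    """Monte Carlo distributions of hazard quotient and target cancer risk."""
    rng = random.Random(seed)
    hq, tr = [], []
    for _ in range(n):
        conc = rng.lognormvariate(-4.6, 1.5)     # As conc., mg/L (assumed lognormal)
        intake = rng.triangular(1.0, 3.5, 2.0)   # drinking rate, L/day (assumed)
        bw = max(rng.gauss(60.0, 10.0), 30.0)    # body weight, kg (assumed)
        cdi = conc * intake / bw                 # chronic daily intake, mg/kg/day
        hq.append(cdi / RFD)                     # non-carcinogenic hazard quotient
        tr.append(cdi * SLOPE_FACTOR)            # target cancer risk
    hq.sort()
    tr.sort()
    return hq[int(0.95 * n)], tr[int(0.95 * n)]  # 95th percentiles

hq95, tr95 = simulate_hq_tr()
print(f"95th percentile HQ: {hq95:.2f}, TR: {tr95:.2e}")
```

    Comparing the 95th-percentile HQ against 1 and TR against the usual 1e-6 to 1e-4 range is how such screening results are typically interpreted.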

  7. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  8. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
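
    In its simplest first-order form, fast probability integration reduces to a reliability-index calculation. The sketch below is the textbook case of independent normal resistance and load, not the Wu-Wirsching FPI algorithm itself, which handles general non-normal variables and implicit response functions:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def first_order_failure_prob(mu_r, sig_r, mu_s, sig_s):
    """First-order estimate of P(R < S) for independent normal
    resistance R and load S -- the simplest case FPI-style methods handle."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)  # reliability index
    return beta, norm_cdf(-beta)                     # failure probability

# Hypothetical strength/stress statistics (illustrative units):
beta, pf = first_order_failure_prob(mu_r=500.0, sig_r=50.0, mu_s=300.0, sig_s=40.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

    The appeal, as the abstract notes, is cost: one algebraic evaluation replaces a large Monte Carlo sample, at the price of a first-order approximation near the most probable failure point.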

  9. Space Shuttle Probabilistic Risk Assessment (SPRA) Iteration 3.2

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.

    2010-01-01

The Shuttle is a very reliable vehicle in comparison with other launch systems. Much of the risk posed by Shuttle operations is related to fundamental aspects of the spacecraft design and the environments in which it operates. It is unlikely that significant design improvements can be implemented to address these risks prior to the end of the Shuttle program. The model will continue to be used to identify possible emerging risk drivers and allow management to make risk-informed decisions on future missions. Potential uses of the SPRA in the future include: - Calculate the risk impact of various mission contingencies (e.g. late inspection, crew rescue, etc.). - Assess the risk impact of various trade studies (e.g. flow control valves). - Support risk analysis on mission-specific events, such as in-flight anomalies. - Serve as a guiding star and data source for future NASA programs.

  10. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  11. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  12. A mediation model to explain decision making under conditions of risk among adolescents: the role of fluid intelligence and probabilistic reasoning.

    PubMed

    Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina

    2014-01-01

    This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) were explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability that acted as a mediator. Specifically, fluid intelligence may enhance ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.

  13. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  14. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  15. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  16. Taking the Risk Out of Risk Assessment

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The ability to understand risks and have the right strategies in place when risky events occur is essential in the workplace. More and more organizations are being confronted with concerns over how to measure their risks or what kind of risks they can take when certain events transpire that could have a negative impact. NASA is one organization that faces these challenges on a daily basis, as effective risk management is critical to the success of its missions, especially the Space Shuttle missions. On July 29, 1996, former NASA Administrator Daniel Goldin charged NASA's Office of Safety and Mission Assurance with developing a probabilistic risk assessment (PRA) tool to support decisions on the funding of Space Shuttle upgrades. When issuing the directive, Goldin said, "Since I came to NASA [in 1992], we've spent billions of dollars on Shuttle upgrades without knowing how much they improve safety. I want a tool to help base upgrade decisions on risk." Work on the PRA tool began immediately. The resulting prototype, the Quantitative Risk Assessment System (QRAS) Version 1.0, was jointly developed by NASA's Marshall Space Flight Center, its Office of Safety and Mission Assurance, and researchers at the University of Maryland. QRAS software automatically expands the reliability logic models of systems to evaluate the probability of highly detrimental outcomes occurring in complex systems that are subject to potential accident scenarios. Even in its earliest forms, QRAS was used to begin PRA modeling of the Space Shuttle. In parallel, the development of QRAS continued, with the goal of making it a world-class tool, one especially suited to NASA's unique needs. From the beginning, an important conceptual goal in the development of QRAS was for it to help bridge the gap between the professional risk analyst and the design engineer. In the past, only the professional risk analyst could perform, modify, use, and perhaps even adequately understand PRA. 
NASA wanted

  17. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.

    2006-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation, including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d); for long-term missions (>60 d) such as Mars exploration, exposures to the high energy and charge (HZE) ions that make up the galactic cosmic rays are the major concern. Health risks from radiation exposure include chronic risks such as carcinogenesis, degenerative tissue risks, and central nervous system effects, and acute risks such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of the possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment, including its impact on astronaut dose limits and the application of the ALARA (As Low As Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints, including costs, are competing concerns that need to be addressed through optimization procedures. 
Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period

  18. Managing Space Radiation Risks On Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.

    2005-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation, including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d); for long-term missions (>60 d) such as Mars exploration, exposures to the high energy and charge (HZE) ions that make up the galactic cosmic rays are the major concern. Health risks from radiation exposure include chronic risks such as carcinogenesis, degenerative tissue risks, and central nervous system effects, and acute risks such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of the possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment, including its impact on astronaut dose limits and the application of the ALARA (As Low As Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints, including costs, are competing concerns that need to be addressed through optimization procedures. 
Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period

  19. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.; Ponomarev, A.; Ren, L.; Shavers, M. R.; Wu, H.

    2005-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation, including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d); for long-term missions (>60 d) such as Mars exploration, exposures to the high energy and charge (HZE) ions that make up the galactic cosmic rays are the major concern. Health risks from radiation exposure include chronic risks such as carcinogenesis, degenerative tissue risks, and central nervous system effects, and acute risks such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of the possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment, including its impact on astronaut dose limits and the application of the ALARA (As Low As Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints, including costs, are competing concerns that need to be addressed through optimization procedures. 
Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period

  20. NASA Human System Risk Assessment Process

    NASA Technical Reports Server (NTRS)

    Francisco, D.; Romero, E.

    2016-01-01

    NASA utilizes an evidence-based system to perform risk assessments for the human system for spaceflight missions. The center of this process is the multi-disciplinary Human System Risk Board (HSRB), which is chartered by the Office of the Chief Health and Medical Officer (OCHMO) at NASA Headquarters. The HSRB reviews all human system risks via an established comprehensive risk and configuration management plan based on a project management approach. The HSRB facilitates the integration of human research (terrestrial and spaceflight), medical operations, occupational surveillance, systems engineering, and many other disciplines in a comprehensive review of human system risks. The HSRB considers all factors that influence human risk: pre-mission considerations such as screening criteria, training, age, sex, and physiological condition; in-mission factors such as available countermeasures, mission duration, and location; and post-mission factors such as time to return to baseline (reconditioning), post-mission health screening, and available treatments. All of these factors influence the total risk assessment for each human risk. The HSRB performed a comprehensive review of all potential in-flight medical conditions and events and, over the course of several reviews, consolidated the number of human system risks to 30, where the greatest emphasis is placed for investing program dollars in risk mitigation. The HSRB considers all available evidence from human research, medical operations, and occupational surveillance in assessing the risks for appropriate mitigation and future work. All applicable DRMs (low Earth orbit for 6 and 12 months, deep space for 30 days and 1 year, a lunar mission for 1 year, and a planetary mission for 3 years) are considered as human system risks are modified by the hazards associated with spaceflight, such as microgravity, exposure to radiation, distance from the Earth, isolation, and a closed environment. Each risk has a summary

  1. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic finite element analysis code.

  2. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford Kuofei

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.

  3. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  4. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  5. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however, we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently further improved the performance of the probabilistic model of ADAPT. While these experiments were obtained for an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.
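    The kind of inference such a diagnostic system performs can be sketched with a toy Bayesian network evaluated by exact enumeration. The two-fault network, its probability tables, and all numbers below are hypothetical; they are not taken from the ADAPT testbed or ProDiagnose.

    ```python
    from itertools import product

    # Hypothetical two-fault model: a battery fault B and a sensor fault S are
    # independent a priori; either can explain a low voltage reading R.
    P_B = 0.02   # prior probability of a battery fault
    P_S = 0.01   # prior probability of a sensor fault

    def p_low_reading(b, s):
        """Conditional probability table for P(R = low | B = b, S = s)."""
        if b and s:
            return 0.99
        if b:
            return 0.90
        if s:
            return 0.80
        return 0.05  # false-alarm rate with no fault present

    def posterior_battery_fault(reading_low=True):
        """Exact inference by enumeration: P(B = true | evidence about R)."""
        num = den = 0.0
        for b, s in product([False, True], repeat=2):
            joint = (P_B if b else 1 - P_B) * (P_S if s else 1 - P_S)
            joint *= p_low_reading(b, s) if reading_low else 1 - p_low_reading(b, s)
            den += joint
            if b:
                num += joint
        return num / den

    print(f"P(battery fault | low reading) = {posterior_battery_fault():.3f}")
    ```

    Compiling such a network to an arithmetic circuit, as the paper describes, precomputes this enumeration so that each diagnosis query is a fast circuit evaluation rather than a fresh sum over all fault combinations.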

  6. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether due to the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. Carrying out this framework requires several steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of the physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that need to be studied.
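    The final step of such a framework, propagating uncertainty in hazard occurrence, vulnerability, and asset value into a loss distribution, can be sketched with a simple Monte Carlo simulation. All distributions and figures below are invented for illustration and are not from the Fella river study.

    ```python
    import math
    import random

    def simulate_annual_loss(n=50_000, seed=7):
        """Monte Carlo propagation of uncertainty in hazard occurrence,
        damage fraction (vulnerability), and exposed value into an
        annual loss distribution (illustrative numbers only)."""
        rng = random.Random(seed)
        losses = []
        for _ in range(n):
            if rng.random() >= 0.10:  # assumed 10% annual hazard probability
                losses.append(0.0)
                continue
            damage = min(max(rng.gauss(0.30, 0.10), 0.0), 1.0)   # uncertain damage fraction
            value = rng.lognormvariate(math.log(2.0e6), 0.25)    # uncertain exposed value (EUR)
            losses.append(damage * value)
        losses.sort()
        return {"mean": sum(losses) / n, "p95": losses[int(0.95 * n)]}

    result = simulate_annual_loss()
    print(f"expected annual loss ~ EUR {result['mean']:,.0f}; 95th percentile ~ EUR {result['p95']:,.0f}")
    ```

    The output is a full loss distribution rather than a single deterministic number, which is what allows budgets and acceptable-risk thresholds to be set against percentiles instead of point estimates.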

  7. Review of NASA approach to space radiation risk assessments for Mars exploration.

    PubMed

    Cucinotta, Francis A

    2015-02-01

    Long duration space missions present unique radiation protection challenges due to the complexity of the space radiation environment, which includes high charge and energy particles and other highly ionizing radiation such as neutrons. Based on a recommendation by the National Council on Radiation Protection and Measurements, a 3% lifetime risk of exposure-induced death for cancer has been used as a basis for risk limitation by the National Aeronautics and Space Administration (NASA) for low-Earth orbit missions. NASA has developed a risk-based approach to radiation exposure limits that accounts for individual factors (age, gender, and smoking history) and assesses the uncertainties in risk estimates. New radiation quality factors with associated probability distribution functions to represent the quality factor's uncertainty have been developed based on track structure models and recent radiobiology data for high charge and energy particles. The current radiation dose limits are reviewed for spaceflight and the various qualitative and quantitative uncertainties that impact the risk of exposure-induced death estimates using the NASA Space Cancer Risk (NSCR) model. NSCR estimates of the number of "safe days" in deep space to be within exposure limits and risk estimates for a Mars exploration mission are described.

  8. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and to use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000
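    A Monte Carlo chain of the kind described, log-linear relations with deterministic and random components, can be sketched as follows. The toy model, growth rate, and climate-sensitivity distribution below are assumptions for illustration; they are not CCRAF's actual equations.

    ```python
    import math
    import random

    def simulate_warming_2100(n=20_000, seed=3):
        """Monte Carlo over a toy log-linear emissions/temperature chain:
        an uncertain concentration growth rate and an uncertain climate
        sensitivity propagate into a distribution of warming by 2100."""
        rng = random.Random(seed)
        temps = []
        for _ in range(n):
            growth = rng.gauss(0.004, 0.002)                      # CO2 growth rate per year
            ratio = math.exp(growth * 100.0)                      # concentration ratio by 2100
            sensitivity = rng.lognormvariate(math.log(3.0), 0.3)  # K per CO2 doubling
            temps.append(sensitivity * math.log2(ratio))
        temps.sort()
        return temps[n // 2], temps[int(0.95 * n)]

    median, p95 = simulate_warming_2100()
    print(f"warming by 2100: median {median:.2f} K, 95th percentile {p95:.2f} K")
    ```

    Each sampled trajectory multiplies through the chain of uncertain factors, so the output is a distribution of outcomes from which robust-policy questions can be posed against tail percentiles as well as the median.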

  9. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    The National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probabilities of failure of the basic events employed in the PRA model of the ISS. The paper will show how a Bayesian approach was used with different sources of data, including actual ISS on-orbit failures, to enhance confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement unit (ORU). The ORU level was chosen consistently with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model. In other words, the failure of a spring within a valve was considered a failure of the valve itself.
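    Bayesian updating of this kind, combining an industry prior with observed on-orbit failures, is commonly done with a conjugate Gamma-Poisson model. A minimal sketch follows; the ORU numbers are hypothetical and are not from the ISS PRA.

    ```python
    def update_failure_rate(prior_alpha, prior_beta_hours, failures, exposure_hours):
        """Conjugate Gamma-Poisson update: a Gamma(alpha, beta) prior on a
        constant failure rate, combined with `failures` events observed over
        `exposure_hours`, yields a Gamma(alpha + failures, beta + exposure_hours)
        posterior whose mean is alpha' / beta'."""
        alpha = prior_alpha + failures
        beta = prior_beta_hours + exposure_hours
        return alpha, beta, alpha / beta  # posterior mean rate (failures per hour)

    # Hypothetical ORU: an industry prior of about 1 failure per 100,000 hours
    # (alpha = 1, beta = 100,000 h), updated with 2 on-orbit failures in 50,000 h.
    alpha, beta, mean_rate = update_failure_rate(1.0, 100_000.0, 2, 50_000.0)
    print(f"posterior mean failure rate = {mean_rate:.2e} per hour")
    ```

    As more on-orbit exposure accumulates, the posterior is dominated by the observed data rather than the industry prior, which is exactly the "increasingly accurate" behavior the abstract describes.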

  10. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    PubMed

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with the related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented in view of three case studies: (1) sensitivity analysis, (2) functional barrier efficiency, and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, as is identifying associated migration risks and potential safety concerns in the early stages of packaging development. Furthermore, dedicated material selection exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
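    One way such a tool can treat a functional barrier, sampling an uncertain diffusion coefficient and layer thickness, is sketched below using the classic diffusion lag time t_lag = L^2/(6D). The distributions are illustrative assumptions, not values from the article.

    ```python
    import math
    import random

    def barrier_lag_time_percentiles(n=100_000, seed=42):
        """Monte Carlo on the diffusion lag time t_lag = L^2 / (6 D), sampling
        an uncertain barrier thickness L and diffusion coefficient D
        (both distributions are illustrative assumptions)."""
        rng = random.Random(seed)
        lags = []
        for _ in range(n):
            thickness = max(rng.gauss(20e-6, 2e-6), 1e-6)           # metres
            diffusivity = rng.lognormvariate(math.log(1e-16), 0.5)  # m^2/s
            lags.append(thickness**2 / (6.0 * diffusivity) / 86400.0)  # days
        lags.sort()
        return lags[int(0.05 * n)], lags[n // 2], lags[int(0.95 * n)]

    p5, p50, p95 = barrier_lag_time_percentiles()
    print(f"barrier lag time: 5th pct {p5:.1f} d, median {p50:.1f} d, 95th pct {p95:.1f} d")
    ```

    Reporting the 5th percentile of the lag time, the conservative end of when migrants first break through the barrier, is one way a confidence interval from the sampled inputs can feed a safety evaluation.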

  11. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) was evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel, letting the sensitivity information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed, assessing the panel's structural performance in the presence of uncertainties in the loading, fabrication process variables, and material properties. A deterministic structural analysis at the mean values of the variables was performed first, and the resulting stress and displacement contours are presented. This is followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel's structural performance. Based on uncertainties in material properties, structural geometry, and loading, the results of the displacement and stress analysis are used as input for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as the maximum displacement, the maximum tensile and compressive stresses of the facesheet in the x and y directions, and the maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition that all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.

  12. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.

  13. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities that were affected during the 2002 flood of the River Mulde in Saxony, Germany (Botto et al., submitted). The application of bagging-decision-tree-based loss models provides a probability distribution of the estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto, A.; Kreibich, H.; Merz, B.; Schröter, K. (submitted): Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
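    A minimal sketch of how a bagged tree ensemble yields a loss distribution rather than a point estimate: each bootstrap-trained tree contributes one prediction, and the spread of those predictions quantifies the model uncertainty. The depth-1 "stump" trees and the training data below are invented for illustration and are not the BT-FLEMO model.

    ```python
    import random
    import statistics

    random.seed(1)

    # Toy training data: (water depth [m], relative building loss).
    data = [(0.2, 0.02), (0.5, 0.05), (0.8, 0.10), (1.2, 0.22),
            (1.6, 0.30), (2.0, 0.45), (2.5, 0.55), (3.0, 0.70)]

    def fit_stump(sample):
        """Depth-1 regression tree: split at the median depth, predict mean loss per side."""
        sample = sorted(sample)
        split = sample[len(sample) // 2][0]
        lo = [y for x, y in sample if x < split] or [sample[0][1]]
        hi = [y for x, y in sample if x >= split]
        return lambda x, s=split, l=statistics.mean(lo), h=statistics.mean(hi): \
            l if x < s else h

    # Bagging: each bootstrap tree yields one prediction -> an empirical loss distribution.
    trees = [fit_stump([random.choice(data) for _ in data]) for _ in range(200)]
    preds = sorted(t(1.4) for t in trees)            # predicted loss at 1.4 m water depth
    q05, q50, q95 = (preds[int(q * len(preds))] for q in (0.05, 0.5, 0.95))
    print(f"loss at 1.4 m: median {q50:.2f}, 90% interval [{q05:.2f}, {q95:.2f}]")
    ```

    Real implementations use full multi-variate decision trees over many damage-driving variables; the mechanism for producing the per-municipality distribution is the same.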

  14. Methods and Techniques for Risk Prediction of Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Hoffman, Chad R.; Pugh, Rich; Safie, Fayssal

    1998-01-01

    Since the Space Shuttle accident in 1986, NASA has been working to incorporate probabilistic risk assessment (PRA) in decisions concerning the Space Shuttle and other NASA projects. One major PRA study NASA is currently conducting is the establishment of an overall risk model for the Space Shuttle System. The model is intended to provide a tool to predict Shuttle risk and to perform sensitivity analyses and trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) and its prime contractors, including Pratt and Whitney (P&W), are part of the NASA team conducting the PRA study. MSFC's responsibility involves modeling the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). A major challenge facing the PRA team is modeling the Shuttle upgrades, mainly the P&W High Pressure Fuel Turbopump (HPFTP) and High Pressure Oxidizer Turbopump (HPOTP). The purpose of this paper is to discuss the various methods and techniques used for predicting the risk of the P&W redesigned HPFTP and HPOTP.

  15. NASA Glenn Research Center Overview

    NASA Technical Reports Server (NTRS)

    Sehra, Arun K.

    2002-01-01

    This viewgraph presentation provides information on the NASA Glenn Research Center. The presentation is a broad overview, including the chain of command at the center, its aeronautics facilities, and the factors which shape aerospace product line integration at the center. Special attention is given to the future development of high fidelity probabilistic methods, and NPSS (Numerical Propulsion System Simulation).

  16. Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.P.; Stover, R.L.; Hashimoto, P.S.

    1989-01-01

    Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake-experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed. 5 figs.

  17. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.

  18. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    CONCLUSION: The IMM Database, in conjunction with the IMM, is helping the NASA aerospace program improve health care and reduce risk for astronaut crews. Both the database and the model will continue to expand to meet customer needs through a multi-disciplinary, evidence-based approach to managing data. Future expansion could serve as a platform for a Space Medicine Wiki of medical conditions.

  19. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  20. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
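    Steps (3)-(5) of the six-step procedure can be illustrated with a generic wrapper that draws parameter values from assigned distributions and repeatedly calls a deterministic model. The dose function and all distributions below are hypothetical stand-ins; RESRAD itself is a separate compiled code with its own parameter set.

    ```python
    import random
    import statistics

    random.seed(0)

    def deterministic_dose(thickness_m, dist_coeff, ingestion_rate):
        """Stand-in for one run of a deterministic code; purely illustrative."""
        return 0.3 * ingestion_rate / (1.0 + dist_coeff * thickness_m)

    # Sample the assigned input distributions and collect the output dose.
    doses = []
    for _ in range(5000):
        thickness = random.uniform(0.1, 0.5)          # contaminated-zone thickness [m]
        kd = random.lognormvariate(2.0, 0.8)          # distribution coefficient
        ingestion = random.triangular(50, 200, 100)   # soil ingestion rate [mg/d]
        doses.append(deterministic_dose(thickness, kd, ingestion))

    doses.sort()
    print(f"mean dose {statistics.mean(doses):.3f}, "
          f"95th percentile {doses[int(0.95 * len(doses))]:.3f}")
    ```

    The output percentiles are what a probabilistic module adds over a single deterministic run: a quantified spread of the dose attributable to the uncertain inputs.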

  1. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics-ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments with global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. The variability in surface air temperature of the physics-ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region, while the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility of the present climate can show different future climate changes; we could not find clear relationships between the present climate and future climate change in temperature and precipitation. We developed a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and also to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.

  2. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    PubMed

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.
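    The distinction between apparent mortality and marginal attack rate noted in this abstract can be made concrete with a toy cohort in which mortality sources act sequentially. The counts below are invented; the point is only that the two quantities can differ substantially for causes that act later in the sequence, which is why the choice of estimate matters in a PRA.

    ```python
    # Illustrative comparison of apparent mortality vs. marginal attack rate for a
    # cohort passing through sequential mortality sources (numbers are made up).
    cohort = 1000                       # individuals entering the stage
    deaths = {"dispersal": 300, "parasitoid_A": 200, "parasitoid_B": 100}

    alive = cohort
    apparent, marginal = {}, {}
    for cause, d in deaths.items():     # causes act in this order
        apparent[cause] = d / cohort    # deaths relative to the whole cohort
        marginal[cause] = d / alive     # deaths relative to those still at risk
        alive -= d

    print(apparent["parasitoid_B"], round(marginal["parasitoid_B"], 2))
    ```

    Here the late-acting parasitoid kills 10% of the original cohort but 20% of the individuals actually exposed to it, so the two measures lead to different attack-rate inputs for the risk model.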

  3. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.

  4. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. 
The performance

  5. Framework for probabilistic flood risk assessment in an Alpine region

    NASA Astrophysics Data System (ADS)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, and especially for local authorities and insurance companies, in order to estimate possible flood losses. Therefore, a framework for assessing flood risk has been developed and is introduced in the presented contribution. Flood risk is thereby defined as the combination of the probability of flood events and the potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of flooding. The potential flood damages are determined in the course of a vulnerability assessment, wherein the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework that takes the probability of flood events and their spatio-temporal characteristics into account. Usually, the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework, the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step the

  6. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties--2010. The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables. The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations, and the approaches described by all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, the age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission- and astronaut-specific cancer risk, along with the associated uncertainties in REID. The general approach for estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements. However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as

  7. CARES/Life Used for Probabilistic Characterization of MEMS Pressure Sensor Membranes

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2002-01-01

    Microelectromechanical systems (MEMS) devices are typically made from brittle materials such as silicon using traditional semiconductor manufacturing techniques. They can be etched (or micromachined) from larger structures or can be built up with material deposition processes. Maintaining dimensional control and consistent mechanical properties is considerably more difficult for MEMS because feature size is on the micrometer scale. Therefore, the application of probabilistic design methodology becomes necessary for MEMS. This was demonstrated at the NASA Glenn Research Center and Case Western Reserve University in an investigation that used the NASA-developed CARES/Life brittle material design program to study the probabilistic fracture strength behavior of single-crystal SiC, polycrystalline SiC, and amorphous Si3N4 pressurized 1-mm-square thin-film diaphragms. These materials are of interest because of their superior high-temperature characteristics, which are desirable for harsh environment applications such as turbine engine and rocket propulsion system hot sections.

  8. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. 
The six parameters are space-to-chord ratio (SBYC), stagger angle

  9. Fifth Annual Workshop on the Application of Probabilistic Methods for Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Briscoe, Victoria (Compiler)

    2002-01-01

    These are the proceedings of the 5th Annual FAA/Air Force/NASA/Navy Workshop on Probabilistic Methods for Gas Turbine Engines, hosted by NASA Glenn Research Center and held at the Holiday Inn Cleveland West. The history of this series of workshops stems from the recognition that both military and commercial aircraft engines are inevitably subject to similar design and manufacturing principles. As such, it was eminently logical to combine knowledge bases on how some of these overlapping principles and methodologies are being applied. We have started the process by creating synergy and cooperation among the FAA, Air Force, Navy, and NASA in these workshops. The recent 3-day workshop was specifically designed to benefit the development of probabilistic methods for gas turbine engines by addressing recent technical accomplishments and forging new ideas. We accomplished our goals of minimizing duplication, maximizing the dissemination of information, and improving program planning for all concerned. These proceedings include the final agenda, abstracts, presentations, and panel notes, plus contact information for our presenters and attendees. We hope that they will be a tool to enhance understanding between the developers and users of probabilistic methods. The fifth workshop doubled its attendance and benefited from collaboration among the many diverse groups represented, including government, industry, academia, and our international partners. So, "Start your engines!" and use these proceedings toward creating safer and more reliable gas turbine engines for our commercial and military partners.

  10. WIPCast: Probabilistic Forecasting for Aviation Decision Aid Applications

    DTIC Science & Technology

    2011-06-01

    traders, or families planning an outing – manage weather-related risk. By quantifying risk, probabilistic forecasting enables optimization of actions via ... confidence interval to the user's risk tolerance helps drive highly effective and innovative decision support mechanisms for visually quantifying risk for

  11. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
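    The step of combining local fracture probabilities into whole-ribcage risk can be sketched as follows, assuming independent per-rib probabilities (a simplification) and using illustrative per-rib values, not numbers from the paper.

    ```python
    import random

    random.seed(7)

    # Hypothetical per-rib fracture probabilities, as would be derived from peak
    # strain compared against an age-adjusted ultimate-strain distribution.
    p_fracture = [0.05, 0.12, 0.30, 0.25, 0.10, 0.04, 0.02, 0.01]

    # Probability of at least one fracture, assuming independence between ribs.
    p_none = 1.0
    for p in p_fracture:
        p_none *= (1.0 - p)
    p_any = 1.0 - p_none

    # Monte Carlo over the whole ribcage for severity: P(3+ fractures), a common
    # proxy for AIS3+ rib injury.
    trials = 100_000
    hits = sum(1 for _ in range(trials)
               if sum(random.random() < p for p in p_fracture) >= 3)
    print(f"P(any fracture) = {p_any:.3f}, P(3+ fractures) = {hits / trials:.3f}")
    ```

    A full implementation would evaluate many strain locations per rib and could model correlation between neighboring elements; the aggregation logic stays the same.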

  12. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    range from simple elicitation of data from a subject matter expert to calibrate a probability distribution to more advanced stochastic modelling. This approach can be referred to more as a proficiency in the language of uncertainty rather than modelling per se in the sense that it allows for greater flexibility to adapt a given context. In a real decision making context, one seldom has neither time nor budget resources to investigate all of these variables thoroughly, hence the importance of being able to prioritize the level of effort among them. Under the proposed framework, this can be done in an optimised fashion. The point here consists in applying probabilistic sensitivity analysis together with the fundamentals of the economic value of information; the framework as built is well suited to such considerations, and variables can be ranked according to their contribution to risk understanding. Efforts to deal with second order uncertainties on variables prove to be valuable when dealing with the economic value of sample information.

  13. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314
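A fault-tree chain of the kind this model formalizes can be sketched as an AND-gate product of conditional probabilities, scaled to the population level. The event decomposition and values below are hypothetical placeholders, not the paper's fitted parameters.

```python
def collision_probability(p_encounter, p_failed_avoidance, p_strike_given_pass):
    """AND-gate (series) combination in a fault-tree chain: a collision
    requires the fish to encounter the turbine, fail to avoid the rotor,
    and then be struck by a blade while passing through."""
    return p_encounter * p_failed_avoidance * p_strike_given_pass

def population_risk(p_collision_per_pass, n_passes):
    """Population-level scaling: probability of at least one collision
    over n independent turbine passages."""
    return 1.0 - (1.0 - p_collision_per_pass) ** n_passes

# Illustrative only: a large turbine in low visibility (poor avoidance).
p_pass = collision_probability(0.3, 0.6, 0.4)
season_risk = population_risk(p_pass, 50)
```

Real assessments would condition each branch on fish size, turbine diameter and visibility, as the paper's avoidance models do.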

  14. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  15. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economical damage and life loss, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information related to the evaluation of the hazards, to the value of the exposed elements (e.g., residential and industrial area, population, lifelines, sensitive elements such as schools and hospitals) and to the process-specific vulnerability, and due to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, as for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., depth of water during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty in the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in the three cases around 30% of the expected value. Each of the models
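The Monte Carlo versus FOSM comparison can be illustrated on a toy loss variable L = A x B (say, exposed value times vulnerability). The input distributions are hypothetical; the FOSM moments come from the usual first-order Taylor expansion about the means, assuming independent inputs.

```python
import math
import random

def fosm_product(mu_a, sd_a, mu_b, sd_b):
    """First Order Second Moment approximation for L = A*B with
    independent inputs: mean ~ mu_a*mu_b, variance from first-order
    sensitivities dL/dA ~ mu_b and dL/dB ~ mu_a."""
    mean = mu_a * mu_b
    var = (mu_b * sd_a) ** 2 + (mu_a * sd_b) ** 2
    return mean, math.sqrt(var)

def mc_product(mu_a, sd_a, mu_b, sd_b, n=20000, seed=1):
    """Monte Carlo estimate of the same two moments, sampling A and B
    as independent Gaussians."""
    rng = random.Random(seed)
    samples = [rng.gauss(mu_a, sd_a) * rng.gauss(mu_b, sd_b) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, math.sqrt(var)
```

For mildly nonlinear functions like this, the two agree closely, which mirrors the paper's finding that the three propagation models give similar expected losses.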

  16. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties 2010 (Cucinotta et al., 2011). The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables cited more formally as Cucinotta et al. (2011). The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations. They include, more recently: (1) the "BEIR VII Phase 2" report from the NRC's Committee on Biological Effects of Ionizing Radiation (BEIR) (NRC, 2006); (2) Studies of Radiation and Cancer from the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR, 2006); (3) the 2007 Recommendations of the International Commission on Radiological Protection (ICRP), ICRP Publication 103 (ICRP, 2007); and (4) the Environmental Protection Agency's (EPA's) report EPA Radiogenic Cancer Risk Models and Projections for the U.S. Population (EPA, 2011). The approaches described in the reports from all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission- and astronaut-specific cancer risk. The model also calculates the associated uncertainties in REID.
The general approach for

  17. Development of Risk Assessment Matrix for NASA Engineering and Safety Center

    NASA Technical Reports Server (NTRS)

    Malone, Roy W., Jr.; Moses, Kelly

    2004-01-01

    This paper describes a study whose principal goal was the development of a sufficiently detailed 5 x 5 Risk Matrix Scorecard. The purpose of this scorecard is to outline the criteria by which technical issues can be qualitatively and initially prioritized. The tool using this scorecard has been proposed as one of the information resources the NASA Engineering and Safety Center (NESC) takes into consideration when making decisions with respect to incoming information on safety concerns across the entire NASA agency. The contents of this paper discuss in detail each element of the risk matrix scorecard, definitions for those elements and the rationale behind the development of those definitions. This scorecard development was performed in parallel with the tailoring of the existing Futron Corporation Integrated Risk Management Application (IRMA) software tool. IRMA was tailored to fit NESC needs for evaluating incoming safety concerns and was renamed NESC Assessment Risk Management Application (NAFMA), which is still in the developmental phase.

  18. GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling

    DTIC Science & Technology

    2015-10-18

    GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling. Jeremy Murray Krezan, Samantha Howard, Phan D. Dao, Derek...Surka (AFRL Space Vehicles Directorate; Applied Technology Associates Incorporated). From December 2009 through 2011 the NASA Wide-Field Infrared...of known debris. The NASA-WISE GEO belt debris population adds potentially thousands of previously uncataloged objects. This paper describes

  19. Probabilistic Assessment of Radiation Risk for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2008-01-01

    For long duration missions outside of the protection of the Earth's magnetic field, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon or Earth-to-Mars transit. The large majority (90%) of SPEs have small or no health consequences because the doses are low and the particles do not penetrate to organ depths. However, there is an operational challenge to respond to events of unknown size and duration. We have developed a probabilistic approach to SPE risk assessment in support of mission design and operational planning. Using the historical database of proton measurements during the past 5 solar cycles, the functional form of the hazard function of SPE occurrence per cycle was found for a nonhomogeneous Poisson model. A typical hazard function was defined as a function of time within a non-specific future solar cycle of 4000-day duration. Distributions of particle fluences for a specified mission period were simulated, ranging from the 5th to the 95th percentile. Organ doses from large SPEs were assessed using NASA's Baryon transport model, BRYNTRN. The SPE risk was analyzed with the organ dose distribution for the given particle fluences during a mission period. In addition to the total particle fluences of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the cancer risk associated with energetic particles for large events. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated for various SPE sizes. This probabilistic approach to SPE protection will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks in future work.
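Simulating SPE occurrence times from a hazard function like the one described is commonly done with Lewis-Shedler thinning: draw candidates from a homogeneous Poisson process at an upper-bound rate and accept each with probability hazard(t)/rate. The hazard shape below is a hypothetical stand-in, not the function fitted to the historical proton database.

```python
import math
import random

def simulate_spe_times(hazard, t_max, lam_max, seed=42):
    """Nonhomogeneous Poisson process via Lewis-Shedler thinning.
    `lam_max` must bound hazard(t) on [0, t_max]."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)   # next candidate event time
        if t > t_max:
            return events
        if rng.random() < hazard(t) / lam_max:
            events.append(t)            # accepted: a real SPE occurrence

# Illustrative hazard over a 4000-day cycle: low at the solar minima
# at both ends, peaking mid-cycle (shape and scale are hypothetical).
hazard = lambda t: 0.01 * math.sin(math.pi * t / 4000.0) ** 2
events = simulate_spe_times(hazard, 4000.0, 0.01)
```

Repeating the simulation many times yields the distribution of event counts and fluences for any mission window within the cycle.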

  20. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    PubMed Central

    Kaufman, Leyla V.; Wright, Mark G.

    2017-01-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments. PMID:28686180

  1. Engineering Management Capstone Project EM 697: Compare and Contrast Risk Management Implementation at NASA and the US Army

    NASA Technical Reports Server (NTRS)

    Brothers, Mary Ann; Safie, Fayssal M. (Technical Monitor)

    2002-01-01

    NASA at Marshall Space Flight Center (MSFC) and the U.S. Army at Redstone Arsenal were analyzed to determine whether they were successful in implementing their risk management programs. Risk management implementation surveys were distributed to aid in this analysis. The scope is limited to NASA S&MA (Safety and Mission Assurance) at MSFC, including applicable support contractors, and the US Army Engineering Directorate, including applicable contractors, located at Redstone Arsenal. NASA has moderately higher risk management implementation survey scores than the Army. Accordingly, the implementation of the risk management program at NASA is considered good, while only two of the five survey categories indicated good risk management implementation at the Army.

  2. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Wahl, Thomas; Plant, Nathaniel G.; Long, Joseph W.

    2016-05-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
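The copula-based simulation step can be sketched in two dimensions: draw from a Gaussian copula, then invert illustrative marginals. The exponential marginals, parameter values and correlation below are placeholders; the study fits univariate distributions and elliptical copulas to six observed hydrodynamic parameters.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_storm(rho, n=10000, seed=7):
    """Draw (wave height, surge) pairs from a bivariate Gaussian copula
    with illustrative exponential marginals -- a 2-D stand-in for the
    paper's six-parameter sea-storm simulator."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        u1, u2 = norm_cdf(z1), norm_cdf(z2)   # correlated uniforms
        hs = -2.0 * math.log(1.0 - u1)        # Exp marginal, mean 2 m
        surge = -0.5 * math.log(1.0 - u2)     # Exp marginal, mean 0.5 m
        pairs.append((hs, surge))
    return pairs

pairs = sample_storm(0.8)
```

Sampling many such synthetic storms lets rare joint extremes appear that the short observational record never showed, which is the point of the exercise.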

  3. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    USGS Publications Warehouse

    Plant, Nathaniel G.; Wahl, Thomas; Long, Joseph W.

    2016-01-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.

  4. Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris

    2014-08-01

    We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
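The risk metric described, the probability that the simulated frequency of shortages exceeds a planned Level of Service, can be sketched with a simple Monte Carlo loop. The stationary per-year shortage probability below is a placeholder for the nonstationary weather-generator output used in the study.

```python
import random

def shortage_risk(n_series, years, p_shortage_per_year, planned_freq, seed=5):
    """Monte Carlo estimate of the probability that the simulated shortage
    frequency over a planning horizon exceeds the planned Level of Service
    (expressed as an allowed shortage frequency)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_series):
        shortages = sum(1 for _ in range(years)
                        if rng.random() < p_shortage_per_year)
        if shortages / years > planned_freq:
            exceed += 1
    return exceed / n_series
```

Comparing this metric under different supply- and demand-side options (each shifting `p_shortage_per_year`) is what allows the cost-effectiveness ranking the abstract describes.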

  5. Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan

    2005-01-01

    Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading--such as that found in a turbine engine hot-section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical loading and cyclic loading.
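Brittle-material reliability of this kind rests on Weibull strength statistics. A minimal sketch of a two-parameter Weibull failure probability with a stochastic stress input follows; the parameter values are illustrative, and this is not CARES/Life's actual multiaxial, time-dependent formulation.

```python
import math
import random

def weibull_failure_probability(stress, scale, modulus):
    """Two-parameter Weibull strength model for brittle fracture:
    P_f = 1 - exp(-(sigma/sigma_0)^m)."""
    return 1.0 - math.exp(-((stress / scale) ** modulus))

def mean_failure_probability(mu, sigma, scale, modulus, n=20000, seed=2):
    """Propagate a stochastic stress (Gaussian, truncated at zero) through
    the Weibull model -- a crude stand-in for ANSYS/PDS-style sampling of
    random loads and geometry."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        s = max(0.0, rng.gauss(mu, sigma))
        total += weibull_failure_probability(s, scale, modulus)
    return total / n
```

A high Weibull modulus `m` concentrates strengths near the scale parameter, which is why scatter in applied stress dominates the predicted failure probability for stiff ceramics.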

  6. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.
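Whether a probabilistic forecast improves a decision is often framed with the classic cost-loss model: protect whenever the forecast probability exceeds the cost/loss ratio. This is the textbook framing, not the specific game used in the EGU experiment.

```python
def should_protect(prob_flood, cost_protect, loss_if_flood):
    """Cost-loss decision rule: taking protective action is worthwhile
    whenever the forecast probability exceeds the cost/loss ratio C/L."""
    return prob_flood > cost_protect / loss_if_flood

def expected_expense(prob_flood, cost_protect, loss_if_flood):
    """Expected expense when following the rule: pay the protection cost,
    or accept the expected loss, whichever is smaller."""
    if should_protect(prob_flood, cost_protect, loss_if_flood):
        return cost_protect
    return prob_flood * loss_if_flood
```

A deterministic forecast forces the same action regardless of C/L; the probabilistic forecast lets each user apply their own ratio, which is the theoretical basis for the "better decisions" promise the paper tests empirically.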

  7. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  8. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this
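The random-field idea can be sketched in one dimension: spatially correlated compressibility generated by smoothing white noise, with the design quantity taken as the worst differential settlement (angular distortion) between neighbouring points. All parameters are illustrative, and the settlement model here is a bare proportionality, not the paper's constitutive treatment.

```python
import random

def correlated_field(n, corr_len, mu, sigma, seed=3):
    """1-D random field of waste compressibility: white noise smoothed by
    a moving average of width ~corr_len to impose spatial correlation."""
    rng = random.Random(seed)
    noise = [rng.gauss(0.0, 1.0) for _ in range(n + corr_len)]
    field = []
    for i in range(n):
        window = noise[i:i + corr_len]
        field.append(mu + sigma * sum(window) / len(window) ** 0.5)
    return field

def max_distortion(field, spacing, load_factor):
    """Settlement at each point taken proportional to local compressibility;
    return the worst angular distortion between neighbouring points."""
    settlements = [load_factor * c for c in field]
    return max(abs(settlements[i + 1] - settlements[i]) / spacing
               for i in range(len(settlements) - 1))
```

Repeating this over many field realizations gives the distribution of cover distortions, from which acceptance criteria can be set at a chosen exceedance probability.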

  9. Proceedings from the 2001 NASA Occupational Health Conference: Risk Assessment and Management in 2001

    NASA Technical Reports Server (NTRS)

    Roberson, Sheri (Editor); Kelly, Bruce (Editor); Gettleman, Alan G. (Technical Monitor)

    2001-01-01

    This Conference convened approximately 86 registered participants of invited guest speakers, NASA presenters, and a broad spectrum of the Occupational Health disciplines representing NASA Headquarters and all NASA Field Centers. Two days' Professional Development Courses on Exposure Assessment Strategies and Statistics and on Advanced Cardiac Life Support training and recertification preceded the Conference. With the theme, 'Risk Assessment and Management in 2001,' conferees were first provided updates from the Program Principal Center Office and the Headquarters Office. Plenary sessions elaborated on several topics: biological terrorism, OSHA recordability, Workers' Compensation issues, Federal ergonomic standards, bridging aerospace medicine and occupational health-especially in management of risk in spaceflight, and EAP operations with mission failures. A keynote address dealt with resiliency skills for 21st century workers and two NASA astronaut speakers highlighted a tour of the Johnson Space Center. During discipline specific breakout sessions, current issues in occupational health management and policy, credentialing and privileging, health risk assessment, measurement and standardization, audits, database development, prevention and rehabilitation, international travel and infection control, employee assistance, nursing process, and environmental health were presented.

  10. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  11. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  12. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and deterministic approaches to seismic hazard analysis.

  13. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
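
    The Monte Carlo step in (c) can be sketched in a few lines. The two-component system, the lognormal demand model, and the fragility parameters below are illustrative assumptions, not values from the paper; the point is how a shared load factor carries the correlation between component responses directly into the risk estimate.

```python
import math
import random

def lognormal_fragility(demand, median, beta):
    """P(component reaches the damage state | demand), lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(demand / median) / (beta * math.sqrt(2.0))))

def mc_system_damage_probability(n_trials=50_000, seed=1):
    """Monte Carlo over correlated component demands (illustrative two-component system)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # A shared load factor correlates the two component responses; the
        # Monte Carlo procedure carries that correlation into the risk estimate.
        common = rng.lognormvariate(0.0, 0.3)
        demand1 = common * rng.lognormvariate(math.log(0.8), 0.2)
        demand2 = common * rng.lognormvariate(math.log(0.9), 0.2)
        # Sample each component's damage state from its response-based fragility curve
        dmg1 = rng.random() < lognormal_fragility(demand1, median=1.2, beta=0.4)
        dmg2 = rng.random() < lognormal_fragility(demand2, median=1.5, beta=0.4)
        if dmg1 and dmg2:  # assume system damage requires both components to fail
            failures += 1
    return failures / n_trials
```

    Because the damage state is sampled per trial, any correlation structure among component responses is preserved without further approximation.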

  14. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following the past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of the existing buildings. Many historic buildings in the old urban centers in Eastern Canada such as Old Quebec City are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings with systematic treatment of uncertainties throughout the modelling process is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. 
The procedure is very efficient for

  15. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    NASA Astrophysics Data System (ADS)

    Newman, L.; Hejduk, M.; Frigm, R.; Duncan, M.

    2014-09-01

    On-orbit collisions pose a significant mission risk to satellites operating in the space environment. Recognizing the likelihood and consequence of on-orbit collisions, NASA has taken several proactive measures to mitigate the risk of both a catastrophic loss of mission and the increase in the space debris population. In fall 2004, NASA GSFC established an Agency-wide, institutionalized process and service for identifying and reacting to predicted close approaches. The team responsible for executing this mission is the NASA Robotic Conjunction Assessment Risk Analysis (CARA) team. By fall 2005, this process had resulted in the execution of the first collision avoidance maneuver by a NASA unmanned satellite. In February 2008, NASA adopted a policy, documented in NASA Procedural Requirement 8715.6a, Process for Limiting Orbital Debris, that directed maneuverable satellites to have such an on-orbit collision mitigation process. In 2009, NASA decided to require support for all operational satellites. By January 2014, the CARA team had processed nearly 500,000 close approach messages from the Joint Space Operations Center (JSpOC) and had assisted our mission customers with planning and executing over 75 collision avoidance maneuvers for unmanned satellites in LEO, GEO, and HEO orbital regimes. With the increase in the number of operational missions supported; growth in the orbital debris environment due to events such as the intentional destruction of the Fengyun-1C satellite in 2007 and the collision between Iridium-33 and Cosmos-2251; and improvements to the United States Space Surveillance Network (SSN) and its ability to track, catalog, and screen against small debris objects, the demands on the CARA process have consequently required the CARA Concept of Operations (CONOPS) to evolve to manage those demands. This evolution is centered on the ability to effectively and efficiently manage JSpOC, CARA, and Mission Operations resources, applying operational and analytical

  16. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, how students reason about uncertainty in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher, with a probabilistic thinking skills test and interview guidelines as supporting instruments. Data were analyzed using method triangulation. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels, and that it changed after learning; some 8th-grade students reached the numerical level, the highest of the levels. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.

  17. Probabilistic Risk Analysis of Groundwater Related Problems in Subterranean Excavation Sites

    NASA Astrophysics Data System (ADS)

    Sanchez-Vila, X.; Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Bolster, D.; Fernandez-Garcia, D.; Tartakovsky, D. M.

    2009-12-01

    Construction of subterranean excavations in densely populated areas is inherently hazardous. The number of construction sites (e.g., subway lines, railways and highway tunnels) has increased in recent years. These sites can pose risks to workers at the site as well as cause damage to surrounding buildings. The presence of groundwater makes the excavation even more complicated. We develop a probabilistic risk assessment (PRA) model to estimate the likelihood of occurrence of certain risks during subway station construction. While PRA is widely used in many engineering fields, its applications to underground construction in general, and to underground station construction in particular, are scarce if not nonexistent. This method enables us not only to evaluate the probability of failure, but also to quantify the uncertainty of the different events considered. The risk analysis was carried out using a fault tree analysis that made it possible to study a complex system in a structured and straightforward manner. As an example we consider an underground station for the new subway line in the Barcelona metropolitan area (Línia 9) through the town of Prat de Llobregat in the Llobregat River Delta, which is currently under development. A typical station on the L9 line lies partially between the shallow and the main aquifer. Specifically, it is located in the middle layer which is made up of silts and clays. By presenting this example we aim to illustrate PRA as an effective methodology for estimating and minimising risks and to demonstrate its utility as a potential tool for decision making.
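
    For independent basic events, the fault tree evaluation the abstract describes reduces to simple AND/OR gate algebra. The event names and probabilities below are hypothetical, invented for illustration, not taken from the L9 study.

```python
def or_gate(probs):
    """OR gate: probability that at least one independent basic event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """AND gate: probability that all independent basic events occur."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical top event "water floods the excavation pit": either the wall
# leaks under a high water table, or both dewatering pumps fail together.
p_high_water_table = 0.2
p_wall_leak = 0.03
p_main_pump_fails = 0.01
p_backup_pump_fails = 0.05
p_dewatering_lost = and_gate([p_main_pump_fails, p_backup_pump_fails])
p_top = or_gate([and_gate([p_high_water_table, p_wall_leak]), p_dewatering_lost])
```

    Evaluating gates bottom-up like this is what makes the fault tree a structured way to study a complex system: each intermediate probability is auditable on its own.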

  18. Improvements to the Ionizing Radiation Risk Assessment Program for NASA Astronauts

    NASA Technical Reports Server (NTRS)

    Semones, E. J.; Bahadori, A. A.; Picco, C. E.; Shavers, M. R.; Flores-McLaughlin, J.

    2011-01-01

    To perform dosimetry and risk assessment, NASA collects astronaut ionizing radiation exposure data from space flight, medical imaging and therapy, aviation training activities and prior occupational exposure histories. Career risk of exposure induced death (REID) from radiation is limited to 3 percent at a 95 percent confidence level. The Radiation Health Office at Johnson Space Center (JSC) is implementing a program to integrate the gathering, storage, analysis and reporting of astronaut ionizing radiation dose and risk data and records. This work has several motivations, including more efficient analyses and greater flexibility in testing and adopting new methods for evaluating risks. The foundation for these improvements is a set of software tools called the Astronaut Radiation Exposure Analysis System (AREAS). AREAS is a series of MATLAB(Registered TradeMark)-based dose and risk analysis modules that interface with an enterprise level SQL Server database by means of a secure web service. It communicates with other JSC medical and space weather databases to maintain data integrity and consistency across systems. AREAS is part of a larger NASA Space Medicine effort, the Mission Medical Integration Strategy, with the goal of collecting accurate, high-quality and detailed astronaut health data, and then presenting it to medical support personnel in a secure, timely, and reliable manner. The modular approach to the AREAS design accommodates past, current, and future sources of data from active and passive detectors, space radiation transport algorithms, computational phantoms and cancer risk models. Revisions of the cancer risk model, new radiation detection equipment and improved anthropomorphic computational phantoms can be incorporated. 
Notable hardware updates include the Radiation Environment Monitor (which uses Medipix technology to report real-time, on-board dosimetry measurements), an updated Tissue-Equivalent Proportional Counter, and the Southwest Research Institute

  19. Analysis and probabilistic risk assessment of bioaccessible arsenic in polished and husked jasmine rice sold in Bangkok.

    PubMed

    Hensawang, Supanad; Chanpiwat, Penradee

    2018-09-01

    Food is one of the major sources of arsenic (As) exposure in humans. The objectives of this study were to determine the bioaccessible concentration of As in rice grain sold in Bangkok and to evaluate the potential health risks associated with rice consumption. Polished (n = 32) and husked (n = 17) jasmine rice were collected from local markets. In vitro digestion was performed to determine the bioaccessible As concentrations, which were used for probabilistic health risk assessments in different age groups of the population. Approximately 43.0% and 44.4% of the total As in the grain of polished and husked rice, respectively, was in the form of bioaccessible As. Significantly higher bioaccessible As concentrations were found in husked rice than in polished rice (1.5-3.8 times greater). The concentrations of bioaccessible As in polished and husked rice were lower than the Codex standard for As in rice. The average daily dose of As via rice consumption is equivalent to the daily ingestion of 2 L of water containing approximately 3.2-7.2 μg L^-1 of As. Approximately 0.2%-13.7% and 10.7%-55.3% of the population may experience non-carcinogenic effects from polished and husked rice consumption, respectively. Approximately 1%-11.6% of children and 74.1%-99.8% of adults were at risk of cancer. The maximum cancer probabilities were 3 children and 6 adults in 10,000 individuals. The probabilistic risk results indicated that children and adults were at risk of both non-carcinogenic and carcinogenic effects from both types of rice consumption. Copyright © 2018 Elsevier Ltd. All rights reserved.
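
    The average daily dose underlying such an assessment is typically the standard concentration-times-intake-over-body-weight expression; a minimal sketch, in which the numeric values are hypothetical rather than taken from the study:

```python
def average_daily_dose(conc_ug_per_kg, intake_kg_per_day, body_weight_kg):
    """ADD in ug per kg body weight per day: concentration x intake rate / body weight."""
    return conc_ug_per_kg * intake_kg_per_day / body_weight_kg

# Hypothetical adult: 100 ug/kg bioaccessible As in cooked rice,
# 0.3 kg rice per day, 60 kg body weight
add = average_daily_dose(100.0, 0.3, 60.0)  # 0.5 ug kg-bw^-1 day^-1
```

    In a probabilistic assessment, each argument becomes a distribution (e.g. over the measured bioaccessible concentrations and surveyed intakes) and the dose is sampled rather than computed once.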

  20. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
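
    As a minimal illustration of the kind of computation NESSUS automates, a crude Monte Carlo estimate of failure probability for a resistance-load limit state g = R - S can be written as follows. The normal distributions and their parameters are assumptions for the sketch; NESSUS itself couples such limit states to finite element models and uses faster algorithms (advanced mean value, adaptive importance sampling) instead of brute-force sampling.

```python
import math
import random

def failure_probability(n=200_000, seed=5):
    """Crude Monte Carlo on the limit state g = R - S; failure when resistance < load."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(6.0, 0.8) < rng.gauss(4.0, 0.6)  # R ~ N(6, 0.8), S ~ N(4, 0.6)
        for _ in range(n)
    )
    return failures / n

# For normal R and S the exact answer is Phi(-beta) with
# beta = (6 - 4) / sqrt(0.8**2 + 0.6**2) = 2.0, i.e. about 0.0228
p_f = failure_probability()
```
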

  1. Probabilistic risk analysis of mercury intake via food consumption in Spain.

    PubMed

    Moreno-Ortega, Alicia; Moreno-Rojas, Rafael; Martínez-Álvarez, Jesús Román; González Estecha, Montserrat; Castro González, Numa Pompilio; Amaro López, Manuel Ángel

    2017-09-01

    Recently, public institutions in Spain have issued information to the population about fish consumption and the health risk posed by the mercury the fish may contain. At the same time, several scientific societies have published various works in this direction. All this despite there being, up to now, no probabilistic evaluation of mercury risk from fish and seafood intake in Spain, which is the objective of the present work. For that purpose, we took individual data from a survey of the total diet of 3000 people, whose consumption of the principal fish and seafood species (49) was estimated. We compiled individualized data (2000) on the total mercury content of those species, which were completed and validated with bibliographic statistical data. After estimating the distributions of each fish and seafood species, both of their consumption and of their mercury content, a simulation of the distribution of mercury intake from fish and seafood was made, which showed 2.6% of the Spanish population at risk of exceeding the total mercury recommendations, and between 12.2% and 21.2% at risk of exceeding the methylmercury ones. The main species responsible were tuna, swordfish and hake, and significant differences in fish consumption were identified between sexes and ages; in the risk percentage, what stands out is an increase with age. Copyright © 2017 Elsevier GmbH. All rights reserved.

  2. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
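
    For the log-normal case, the exceedance probability p(PEC/PNEC > 1) has a closed form against which a Monte Carlo estimate can be checked, since the log of a ratio of log-normals is normal. The parameter values in the test are illustrative, not the Loire/Moselle Cu data.

```python
import math
import random

def p_exceedance_mc(mu_pec, s_pec, mu_pnec, s_pnec, n=100_000, seed=7):
    """Monte Carlo estimate of p(PEC/PNEC > 1) for log-normal PEC and PNEC."""
    rng = random.Random(seed)
    hits = sum(
        rng.lognormvariate(mu_pec, s_pec) > rng.lognormvariate(mu_pnec, s_pnec)
        for _ in range(n)
    )
    return hits / n

def p_exceedance_exact(mu_pec, s_pec, mu_pnec, s_pnec):
    """Closed form: ln(PEC/PNEC) ~ N(mu_pec - mu_pnec, sqrt(s_pec^2 + s_pnec^2))."""
    mu_d = mu_pec - mu_pnec
    s_d = math.sqrt(s_pec ** 2 + s_pnec ** 2)
    # P(N(mu_d, s_d) > 0) via the error function
    return 0.5 * (1.0 + math.erf(mu_d / (s_d * math.sqrt(2.0))))
```

    The Monte Carlo version generalizes directly to the Γ-distributed alternative the abstract mentions, where no closed form is available.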

  3. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.

    PubMed

    Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-02-24

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
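
    The hit-rate comparison reported above comes from a standard 2x2 contingency table of forecast versus observed high-risk events. A minimal sketch; the counts in the usage line are hypothetical, chosen only so the hit rate comes out near the 57% reported:

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Hit rate and false alarm rate for categorical (event / no event) forecasts."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate, false_alarm_rate

# Hypothetical counts: 4 high-risk events forecast correctly, 3 missed
hr, far = verification_scores(hits=4, misses=3, false_alarms=10, correct_negatives=83)
```
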

  4. Increased Cancer Mortality Risk for NASA's ISS Astronauts: The Contribution of Diagnostic Radiological Examinations

    NASA Technical Reports Server (NTRS)

    Dodge, C.W.; Picco, C. E.; Gonzalez, S. M.; Johnston, S. L.; Van Baalen, M.; Shavers, M.R.

    2009-01-01

    This viewgraph presentation reviews the radiation exposures and risks associated with long-term spaceflight on the International Space Station. NASA's risk model of cancer mortality is also presented.

  5. Using Probabilistic Methods in Water Scarcity Assessments: A First Step Towards a Water Scarcity Risk Assessment Framework

    NASA Technical Reports Server (NTRS)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Phillip

    2016-01-01

    Water scarcity - driven by climate change, climate variability, and socioeconomic developments - is recognized as one of the most important global risks, both in terms of likelihood and impact. Whilst a wide range of studies have assessed the role of long term climate change and socioeconomic trends on global water scarcity, the impact of variability is less well understood. Moreover, the interactions between different forcing mechanisms, and their combined effect on changes in water scarcity conditions, are often neglected. Therefore, we provide a first step towards a framework for global water scarcity risk assessments, applying probabilistic methods to estimate water scarcity risks for different return periods under current and future conditions while using multiple climate and socioeconomic scenarios.
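
    One common way to attach return periods to scarcity events, consistent with the framework sketched above, is via empirical exceedance probabilities (Weibull plotting positions) on an annual series. A sketch under that assumption; the approach and data layout are illustrative, not the authors' specific method:

```python
def return_period_threshold(annual_values, return_period):
    """Empirical magnitude of the T-year event via Weibull plotting positions."""
    ranked = sorted(annual_values, reverse=True)
    n = len(ranked)
    target_p = 1.0 / return_period  # annual exceedance probability of the T-year event
    # Rank m (1-based) has empirical exceedance probability m / (n + 1)
    best_rank = min(range(1, n + 1), key=lambda m: abs(m / (n + 1) - target_p))
    return ranked[best_rank - 1]
```

    Applied to, say, annual water-deficit maxima, this yields the event magnitude associated with each return period under current conditions; rerunning it on scenario-driven series gives the future-conditions counterpart.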

  6. Case Studies in NASA High-Technology Risk Assessment and Management

    NASA Technical Reports Server (NTRS)

    Lambright, W. Henry

    1998-01-01

    This study discusses the approach of NASA managers in the assessment of risk in three critical decisions: the Apollo 8 decision to orbit the Moon in 1968, the servicing of the Hubble Space Telescope in 1993, and the privatization of the Space Shuttle in the late 1990s.

  7. Probabilistic framework for assessing the arsenic exposure risk from cooked fish consumption.

    PubMed

    Ling, Min-Pei; Wu, Chiu-Hua; Chen, Szu-Chieh; Chen, Wei-Yu; Chio, Chia-Pin; Cheng, Yi-Hsien; Liao, Chung-Min

    2014-12-01

    Geogenic arsenic (As) contamination of groundwater is a major ecological and human health problem in southwestern and northeastern coastal areas of Taiwan. Here, we present a probabilistic framework for assessing the human health risks from consuming raw and cooked fish that were cultured in groundwater As-contaminated ponds in Taiwan by linking a physiologically based pharmacokinetics model and a Weibull dose-response model. Results indicate that As levels in baked, fried, and grilled fish were higher than those of raw fish. Frying resulted in the greatest increase in As concentration, followed by grilling, with baking affecting the As concentration the least. Simulation results show that, following consumption of baked As-contaminated fish, the health risk to humans is <10^-6 excess bladder cancer risk level for lifetime exposure; as the incidence ratios of liver and lung cancers are generally acceptable at risk ranging from 10^-6 to 10^-4, the consumption of baked As-contaminated fish is unlikely to pose a significant risk to human health. However, contaminated fish cooked by frying resulted in significant health risks, showing the highest cumulative incidence ratios of liver cancer. We also show that males have higher cumulative incidence ratio of liver cancer than females. We found that although cooking resulted in an increase for As levels in As-contaminated fish, the risk to human health of consuming baked fish is nevertheless acceptable. We suggest the adoption of baking as a cooking method and warn against frying As-contaminated fish. We conclude that the concentration of contaminants after cooking should be taken into consideration when assessing the risk to human health.
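
    The Weibull dose-response model referred to above has the standard form P(d) = 1 - exp(-b·d^a). A minimal sketch; the parameter values in the usage line are illustrative, not the fitted values from the study:

```python
import math

def weibull_dose_response(dose, b, a):
    """Weibull dose-response: probability of response at lifetime dose d."""
    return 1.0 - math.exp(-b * dose ** a)

# With shape a = 1 the model is approximately linear at low dose: P ~ b * dose
p_low = weibull_dose_response(0.001, b=0.01, a=1.0)
```

    In the framework above, the dose fed into this curve would come from the pharmacokinetics model, sampled per cooking method and consumer group.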

  8. Monte Carlo Simulation of Markov, Semi-Markov, and Generalized Semi- Markov Processes in Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    English, Thomas

    2005-01-01

    A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating time to failure. Using a generalized semi-Markov model, we studied a four element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
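
    A minimal sketch of the sojourn-time dependency described above: a three-state semi-Markov model in which the probability of transitioning to FAILED depends on the sojourn time just drawn in the degraded state. All states, rates, and the dependency rule are invented for illustration, not the four-element model from the study.

```python
import random
import statistics

def time_to_failure(rng):
    """One semi-Markov history: OK -> DEGRADED -> (FAILED, or recover to OK)."""
    state, t = "OK", 0.0
    while state != "FAILED":
        if state == "OK":
            t += rng.expovariate(1.0 / 100.0)      # sojourn in OK, mean 100 h
            state = "DEGRADED"
        else:
            sojourn = rng.expovariate(1.0 / 10.0)  # sojourn in DEGRADED, mean 10 h
            t += sojourn
            # The transition probability depends on the sojourn just drawn,
            # which is the sojourn-time dependency the abstract describes.
            state = "FAILED" if rng.random() < min(1.0, sojourn / 20.0) else "OK"
    return t

def mean_ttf(n=2000, seed=3):
    """Monte Carlo mean time to failure over n simulated histories."""
    rng = random.Random(seed)
    return statistics.mean(time_to_failure(rng) for _ in range(n))
```

    Replacing the static nodal probabilities of an event tree with histories like these is what lets the end states change with time.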

  9. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight Optimization System program) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain some insight as to the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  10. Integrated Risk Management Within NASA Programs/Projects

    NASA Technical Reports Server (NTRS)

    Connley, Warren; Rad, Adrian; Botzum, Stephen

    2004-01-01

    As NASA Project Risk Management activities continue to evolve, the need to successfully integrate risk management processes across the life cycle, between functional disciplines, stakeholders, various management policies, and within cost, schedule and performance requirements/constraints becomes more evident and important. Today's programs and projects are complex undertakings that include a myriad of processes, tools, techniques, management arrangements and other variables all of which must function together in order to achieve mission success. The perception and impact of risk may vary significantly among stakeholders and may influence decisions that may have unintended consequences on the project during a future phase of the life cycle. In these cases, risks may be unintentionally and/or arbitrarily transferred to others without the benefit of a comprehensive systemic risk assessment. Integrating risk across people, processes, and project requirements/constraints serves to enhance decisions, strengthen communication pathways, and reinforce the ability of the project team to identify and manage risks across the broad spectrum of project management responsibilities. The ability to identify risks in all areas of project management increases the likelihood a project will identify significant issues before they become problems and allows projects to make effective and efficient use of shrinking resources. A fully integrated team risk effort, a disciplined and rigorous process, and a clear understanding of project requirements/constraints together provide the opportunity for more effective risk management. Applying an integrated approach to risk management makes it possible to do a better job at balancing safety, cost, schedule, operational performance and other elements of risk. 
This paper will examine how people, processes, and project requirements/constraints can be integrated across the project lifecycle for better risk management and ultimately improve the

  11. Probabilistic risk assessment of exposure to leucomalachite green residues from fish products.

    PubMed

    Chu, Yung-Lin; Chimeddulam, Dalaijamts; Sheen, Lee-Yan; Wu, Kuen-Yuh

    2013-12-01

    To assess the potential risk of human exposure to carcinogenic leucomalachite green (LMG) due to fish consumption, a probabilistic risk assessment was conducted for adolescent, adult and senior adult consumers in Taiwan. The residues of LMG, with a mean concentration of 13.378±20.56 μg kg^-1 in fish (BFDA, 2009), were converted into dose, considering the fish intake reported for three consumer groups by NAHSIT (1993-1996) and the body weight of an average individual of each group. The lifetime average and high 95th percentile dietary intakes of LMG from fish consumption for Taiwanese consumers were estimated at up to 0.0135 and 0.0451 μg kg-bw^-1 day^-1, respectively. A human equivalent dose (HED) of 2.875 mg kg-bw^-1 day^-1, obtained from a lower-bound benchmark dose (BMDL10) in mice by interspecies extrapolation, was linearly extrapolated to an oral cancer slope factor (CSF) of 0.035 (mg kg-bw^-1 day^-1)^-1 for humans. Although the assumptions and methods are different, the results of lifetime cancer risk varying from 3×10^-7 to 1.6×10^-6 were comparable to those of margins of exposure (MOEs) varying from 410,000 to 4,800,000. In conclusion, Taiwanese fish consumers at the 95th percentile LADD of LMG have a greater risk of liver cancer, and risk management action is needed in Taiwan. Copyright © 2013 Elsevier Ltd. All rights reserved.
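
    The linear low-dose extrapolation used above amounts to risk = CSF × LADD, and the MOE is the point of departure divided by exposure. The sketch below reproduces the reported upper-bound risk from the abstract's own CSF and high-consumer LADD; the only added step is the μg-to-mg unit conversion, and the MOE call is illustrative (the study's MOEs were computed from the mouse BMDL10, which the abstract does not give directly).

```python
def lifetime_cancer_risk(ladd_mg_per_kg_day, csf):
    """Linear low-dose model: excess lifetime cancer risk = LADD x cancer slope factor."""
    return ladd_mg_per_kg_day * csf

def margin_of_exposure(pod_mg_per_kg_day, exposure_mg_per_kg_day):
    """MOE = point of departure (e.g. a BMDL10) / estimated human exposure."""
    return pod_mg_per_kg_day / exposure_mg_per_kg_day

# CSF = 0.035 (mg kg-bw^-1 day^-1)^-1; high-consumer LADD = 0.0451 ug kg-bw^-1 day^-1
risk = lifetime_cancer_risk(0.0451e-3, 0.035)  # ~1.6e-6, the upper bound reported above
```
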

  12. Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    2002-01-01

    Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools at the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the formulation phase of a project, but there is not enough guidance on how to apply these tools within the CRM process. Yet the way these safety and risk assessment tools are used makes a significant difference to the effectiveness of the risk management function. Decisions regarding what events to include in the analysis, and to what level of detail the analysis should be carried, make a significant difference in the effectiveness of a risk management program. The choice of risk analysis tool also depends on the phase of a project; e.g., in the initial phase of a project, when not much data is available on hardware, a standard FMEA cannot be applied, and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty in applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.

  13. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
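    The record describes probabilistic sampling of asteroid parameter ranges propagated through physics-based damage models. A toy Python sketch of that sampling step, with invented parameter ranges and a simple kinetic-energy calculation standing in for the ERA team's entry and breakup physics:

    ```python
    import math
    import random

    random.seed(0)

    def impact_energy_mt(diameter_m, density_kg_m3, velocity_km_s):
        """Kinetic energy of a spherical impactor, in megatons of TNT."""
        mass_kg = density_kg_m3 * math.pi / 6.0 * diameter_m ** 3
        energy_j = 0.5 * mass_kg * (velocity_km_s * 1e3) ** 2
        return energy_j / 4.184e15  # 1 Mt TNT = 4.184e15 J

    # Sample the uncertain impactor properties (ranges are illustrative only).
    energies = sorted(
        impact_energy_mt(
            diameter_m=random.uniform(20.0, 100.0),
            density_kg_m3=random.uniform(1500.0, 3500.0),
            velocity_km_s=random.uniform(12.0, 25.0),
        )
        for _ in range(10_000)
    )
    median = energies[len(energies) // 2]
    p95 = energies[int(0.95 * len(energies))]
    print(f"median {median:.1f} Mt, 95th percentile {p95:.1f} Mt")
    ```

    The spread between the median and upper percentiles of the sampled energy is one way such a model exposes which input uncertainties dominate the risk estimate.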

  14. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is exploiting the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
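    The loss-distribution estimate described here can be illustrated with a standard frequency-severity Monte Carlo simulation. A self-contained Python sketch with invented claim-rate and severity parameters (not the Lodi data):

    ```python
    import math
    import random
    import statistics

    random.seed(7)

    def poisson(lam):
        """Knuth's Poisson sampler (adequate for small rates)."""
        threshold = math.exp(-lam)
        k, p = 0, 1.0
        while p > threshold:
            k += 1
            p *= random.random()
        return k - 1

    def simulated_annual_loss(claim_rate, severity_mu, severity_sigma):
        """One simulated year: Poisson claim count, lognormal claim severities."""
        return sum(random.lognormvariate(severity_mu, severity_sigma)
                   for _ in range(poisson(claim_rate)))

    losses = sorted(simulated_annual_loss(25, 9.0, 1.2) for _ in range(5_000))
    expected_loss = statistics.mean(losses)        # the "expected loss" index
    p99 = losses[int(0.99 * len(losses))]          # an "unexpected loss" percentile
    print(f"expected annual loss {expected_loss:,.0f}; 99th percentile {p99:,.0f}")
    ```

    Stratified risk indexes (per department or hospital, as in the paper) would come from fitting the frequency and severity parameters on each stratum, here via the Bayesian hierarchical model the authors describe.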

  15. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Crocker, Andrew M.; Doering, Kimberly B; Meadows, Robert G.; Lariviere, Brian W.; Graham, Jerry B.

    2015-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS, and to enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. To establish a basis for the risk reduction activities, the Dynetics Team developed a booster design that takes advantage of the flight-proven Apollo-Saturn F-1. Using NASA's vehicle assumptions for the SLS Block 2, a two-engine, F-1-based booster design delivers 150 mT (331 klbm) payload to LEO, 20 mT (44 klbm) above NASA's requirements. This enables a low-cost, robust approach to structural design. During the ABEDRR effort, the Dynetics Team has modified proven Apollo-Saturn components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the

  16. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil

    PubMed Central

    Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-01-01

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315
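    Hit rate, as used in this evaluation, comes from a standard 2x2 forecast verification contingency table. A small Python sketch in which the hits/misses split is chosen to match the abstract's 57% hit rate and the false-alarm cells are invented:

    ```python
    def contingency_scores(hits, misses, false_alarms, correct_negatives):
        """Hit rate and false-alarm rate from a 2x2 forecast verification table."""
        hit_rate = hits / (hits + misses)
        false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
        return hit_rate, false_alarm_rate

    # 57 hits out of 100 observed high-risk events matches the abstract's 57%;
    # the false-alarm and correct-negative counts are invented for illustration.
    hr, far = contingency_scores(hits=57, misses=43, false_alarms=20,
                                 correct_negatives=80)
    print(f"hit rate {hr:.0%}, false-alarm rate {far:.0%}")  # hit rate 57%
    ```

    Comparing these scores between the forecast model and a climatological null model is exactly the comparison the authors report.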

  17. Enhancement of the Probabilistic CEramic Matrix Composite ANalyzer (PCEMCAN) Computer Code

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2000-01-01

    This report is the final technical report for Order No. C-78019-J, entitled "Enhancement of the Probabilistic Ceramic Matrix Composite Analyzer (PCEMCAN) Computer Code." The scope of the enhancement is the probabilistic evaluation of the D-Matrix terms in the MAT2 and MAT9 material property cards (available in the CEMCAN code) for MSC/NASTRAN. Technical activities performed during the period of June 1, 1999 through September 3, 1999 are summarized, and the final version of the enhanced PCEMCAN code and revisions to the User's Manual are delivered with this report. The performed activities were discussed with the NASA Project Manager during the performance period. The enhanced capabilities have been demonstrated using sample problems.

  18. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.

  19. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic composites (CMC): internally pressurized tube and uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  20. Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

    USGS Publications Warehouse

    Sojda, Richard S.; Towler, Erin; Roberts, Mike; Rajagopalan, Balaji

    2013-01-01

    Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very “sharp”). Synthetic forecasts show that a modest “sharpening” can strongly impact risk and improve skill. We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.

  1. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity, which leads to uncertainty in the risk assessment. To deal with this uncertainty effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. The method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, both considered to be random. Variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
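    Separating variability from uncertainty, as described here, is commonly done with a nested (two-dimensional) Monte Carlo: an outer loop over uncertain parameters and an inner loop over variable individuals. A Python sketch under invented distributions (not the authors' nano-TiO2 data):

    ```python
    import random
    import statistics

    random.seed(1)

    def fraction_at_risk(log_exposure_mu, log_effect_mu, n_individuals=2000):
        """Inner loop (variability): P(exposure / critical effect > 1) across
        individuals, for one fixed draw of the uncertain parameters."""
        exceed = sum(
            1
            for _ in range(n_individuals)
            if random.lognormvariate(log_exposure_mu, 0.5)
            / random.lognormvariate(log_effect_mu, 0.4) > 1.0
        )
        return exceed / n_individuals

    # Outer loop (uncertainty): the location parameters themselves are uncertain.
    fractions = sorted(
        fraction_at_risk(random.gauss(0.0, 0.2), random.gauss(1.0, 0.2))
        for _ in range(50)
    )
    print(f"median fraction at risk {statistics.median(fractions):.3f}; "
          f"90th percentile {fractions[45]:.3f}")
    ```

    The spread across the outer-loop draws is the part of the variation attributable to uncertainty; the inner-loop exceedance fraction captures variability, which is the decomposition the abstract describes.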

  2. Environmental risk assessment of white phosphorus from the use of munitions - a probabilistic approach.

    PubMed

    Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager

    2010-03-15

    White phosphorus (P(4)) is a highly toxic compound used in various pyrotechnic products. Ammunitions containing P(4) are widely used in military training areas where the unburned products of P(4) contaminate soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P(4) in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P(4) by ingestion, for example. The current approach to assess the environmental risk from the use of P(4) suggests a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P(4). Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool where the risk for P(4) poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P(4) was detected in several craters on the range at concentrations up to 5.7g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.
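    For a single exposure pathway, the Bayesian-network reasoning described here reduces to a chain of conditional probabilities. A deliberately tiny Python sketch in which every probability is invented for illustration:

    ```python
    # Each node's probability is invented; a real Bn would also carry evidence
    # propagation and expert-elicited conditional probability tables.
    p_crater_contaminated = 0.3   # heterogeneous spatial distribution of P(4)
    p_forage_in_crater = 0.1      # chance a grazing animal feeds at a crater
    p_ingest_given_both = 0.2     # ingestion given contamination and foraging
    p_poison_given_ingest = 0.9   # toxic outcome given ingestion

    p_poisoning = (p_crater_contaminated * p_forage_in_crater
                   * p_ingest_given_both * p_poison_given_ingest)
    print(f"P(poisoning) = {p_poisoning:.4f}")  # 0.0054
    ```

    The point of the chain is the one the abstract makes: because P(4) is heterogeneously distributed, the probability that a receptor actually encounters it discounts the overall risk, rather than assuming uniform exposure.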

  3. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, which identifies the risk of disease mortality, helps healthcare providers effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models predicting 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms functions used in previous studies by a 1% improvement in AUC. This study revealed that building prediction models from EHR data can be very challenging for existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. 
    Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR

  4. LISA Technology Development and Risk Reduction at NASA

    NASA Technical Reports Server (NTRS)

    Stebbins, Robin T.

    2010-01-01

    The Laser Interferometer Space Antenna (LISA) is a joint ESA-NASA project to design, build and operate a space-based gravitational wave detector based on a laser interferometer. LISA relies on several technologies that are either new to spaceflight or must perform at levels not previously demonstrated in a spaceflight environment. The ESA-led LISA Pathfinder mission is the main effort to demonstrate LISA technology. NASA also supports complementary ground-based technology development and risk reduction activities. This presentation will report the status of NASA work on micronewton thrusters, the telescope, the optical pointing subsystem and mission formulation. More details on some of these topics will be given in posters. Other talks and posters will describe NASA-supported work on the laser subsystem, the phasemeter, and aspects of the interferometry. Two flight-qualified clusters of four colloid micronewton thrusters, each capable of thrust levels between 5 and 30 microNewton with a resolution less than 0.1 microNewton and a thrust noise less than 0.1 microNewton/√Hz (0.001 to 4 Hz), have been integrated onto the LISA Pathfinder spacecraft. The complementary ground-based development focuses on lifetime demonstration. Laboratory verification of failure models and accelerated life tests are just getting started. LISA needs a 40 cm diameter, afocal telescope for beam expansion/reduction that maintains an optical pathlength stability of approximately 1 pm/√Hz in an extremely stable thermal environment. A mechanical prototype of a silicon carbide primary-secondary structure has been fabricated for stability testing. Two optical assemblies must point at different distant spacecraft with nanoradian accuracy over an approximately 1 degree annual variation in the angle between the distant spacecraft. A candidate piezo-inchworm actuator is being tested in a suitable testbed. In addition to technology development, NASA has carried out several studies in support of the

  5. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks: a multi-hazard framework must be able to account for uncertainty propagation in a systems-level analysis, consider correlations among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. 
Validation of any event or

  6. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Ronald L. Boring

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating HEPs. This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  7. Probabilistic verification of cloud fraction from three different products with CALIPSO

    NASA Astrophysics Data System (ADS)

    Jung, B. J.; Descombes, G.; Snyder, C.

    2017-12-01

    In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.

  8. A screening level probabilistic ecological risk assessment of PAHs in sediments of San Francisco Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Febbo, E.J.; Arnold, W.R.; Biddinger, G.R.

    1995-12-31

    As part of the Regional Monitoring Program administered by the San Francisco Estuary Institute (SFEI), sediment samples were collected at 20 stations in San Francisco Bay and analyzed to determine concentrations of 43 PAHs. These data were obtained from SFEI and used to calculate the potential risk to aquatic organisms using probabilistic modeling and Monte Carlo statistical procedures. Sediment chemistry data were used in conjunction with a sediment equilibrium model, a bioconcentration model, biota-sediment accumulation factors, and critical body burden effects concentrations to assess potential risk to bivalves. Bivalves were the chosen receptors because they lack a well-developed enzymatic system for metabolizing PAHs. Thus, they more readily accumulate PAHs and represent a species at greater risk than other taxa, such as fish and crustaceans. PAHs considered in this study span a broad range of octanol-water partition coefficients. Results indicate that the risk of non-polar narcotic effects from PAHs was low in the Northern Bay Area, but higher in the South Bay near the more urbanized sections of the drainage basin.

  9. Guidelines for developing NASA (National Aeronautics and Space Administration) ADP security risk management plans

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1983-01-01

    This report presents guidance to NASA Computer security officials for developing ADP security risk management plans. The six components of the risk management process are identified and discussed. Guidance is presented on how to manage security risks that have been identified during a risk analysis performed at a data processing facility or during the security evaluation of an application system.

  10. Application of Probabilistic Risk Assessment (PRA) During Conceptual Design for the NASA Orbital Space Plane (OSP)

    NASA Technical Reports Server (NTRS)

    Rogers, James H.; Safie, Fayssal M.; Stott, James E.; Lo, Yunnhon

    2004-01-01

    In order to meet the space transportation needs of a new century, America's National Aeronautics and Space Administration (NASA) has implemented an Integrated Space Transportation Plan to produce safe, economical, and reliable access to space. One near-term objective of this initiative is the design and development of a next-generation vehicle and launch system that will transport crew and cargo to and from the International Space Station (ISS): the Orbital Space Plane (OSP). The OSP system is composed of a manned vehicle launched by an existing Evolved Expendable Launch Vehicle (EELV). The OSP will provide emergency crew rescue from the ISS by 2008, and provide crew and limited cargo transfer to and from the ISS by 2012. A key requirement is for the OSP to be safer and more reliable than the Soyuz and Space Shuttle, which currently provide these capabilities.

  11. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    PubMed

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks for polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g(-1). The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic risk assessments for both carcinogenic and non-carcinogenic risks posed by PAHs in soils with adult farmers as concerned receptors were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risks (TCR) for the agricultural soils have a 45% possibility of exceeding the acceptable threshold value (10(-6)), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs with the highest contribution to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
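    The Monte Carlo risk characterization described here is typically an incremental lifetime cancer risk (ILCR) calculation with distributions placed on the exposure factors. A Python sketch with invented distributions; the CSF of 7.3 (mg kg(-1) day(-1))(-1) is the commonly used BaP oral slope factor, and nothing else is taken from the study:

    ```python
    import random

    random.seed(3)

    def ilcr_oral(conc_mg_kg, ingestion_mg_day, ef_days_yr, ed_yr, bw_kg,
                  csf=7.3, averaging_days=70 * 365):
        """Incremental lifetime cancer risk via incidental soil ingestion."""
        dose_mg_kg_day = (conc_mg_kg * ingestion_mg_day * 1e-6
                          * ef_days_yr * ed_yr / (bw_kg * averaging_days))
        return dose_mg_kg_day * csf

    n = 20_000
    exceedances = sum(
        1
        for _ in range(n)
        if ilcr_oral(
            conc_mg_kg=random.lognormvariate(0.0, 0.8),      # BaP-equivalent conc.
            ingestion_mg_day=random.triangular(50, 200, 100),
            ef_days_yr=random.triangular(180, 365, 350),     # exposure frequency
            ed_yr=random.triangular(10, 40, 24),             # exposure duration
            bw_kg=random.gauss(60, 8),
        ) > 1e-6
    )
    print(f"P(ILCR > 1e-6) is roughly {exceedances / n:.0%}")
    ```

    The reported 45% probability of the TCR exceeding 10(-6) is exactly this kind of exceedance fraction, computed over the study's fitted exposure distributions rather than the illustrative ones above.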

  12. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program/project risk management process. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By thoroughly covering the risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  13. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Crocker, Andrew M.; Greene, William D.

    2017-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS and enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. During the ABEDRR effort, the Dynetics Team has modified flight-proven Apollo-Saturn F-1 engine components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the objectives of this work are to demonstrate combustion stability and measure performance of a 500,000 lbf class Oxidizer-Rich Staged Combustion (ORSC) cycle main injector. A trade study was completed to investigate the feasibility, cost effectiveness, and technical maturity of a domestically-produced engine that could potentially both replace the RD-180 on Atlas V and satisfy NASA SLS payload-to-orbit requirements via an advanced booster application. 
Engine physical dimensions and performance parameters resulting from this study provide the system-level requirements for the ORSC risk reduction test article.

  14. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.

  15. Probabilistic risk assessment of the effect of acidified seawater on development stages of sea urchin (Strongylocentrotus droebachiensis).

    PubMed

    Chen, Wei-Yu; Lin, Hsing-Chieh

    2018-05-01

Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larvae growth, and larvae mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification (EC50(SW)) was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH (EC50(PBC)) were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions for seawater and PBC pH levels in 2100. The level of unsuccessful fertilization had 50 and 90% probability risks of 5.07-24.51% (95% CI) and 0-6.95%, respectively. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.
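The exposure-response logic described in this abstract can be sketched as a small Monte Carlo: draw a year-2100 seawater pH drop from an assumed distribution and push it through a Hill-type dose-response curve. Only the 0.55 pH-unit EC50 comes from the abstract; the slope, the pH-drop distribution, and the curve form are illustrative assumptions.

```python
import math
import random

random.seed(1)

# Illustrative parameters only -- not the paper's fitted model.
EC50 = 0.55    # pH-unit drop causing 50% fertilization failure (from abstract)
SLOPE = 8.0    # assumed Hill slope of the dose-response curve
N = 100_000

def effect(delta_ph):
    """Hill-type dose-response: fraction of fertilizations failing."""
    return 1.0 / (1.0 + (EC50 / max(delta_ph, 1e-9)) ** SLOPE)

# Assumed distribution of the year-2100 seawater pH drop.
risks = sorted(effect(random.gauss(0.33, 0.05)) for _ in range(N))
median = risks[N // 2]
p90 = risks[int(0.9 * N)]
print(f"median effect ~{median:.1%}, 90th percentile ~{p90:.1%}")
```

The spread between the median and the upper percentile is what a probabilistic risk characterization reports instead of a single point estimate.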

  16. A Probabilistic Tool that Aids Logistics Engineers in the Establishment of High Confidence Repair Need-Dates at the NASA Shuttle Logistics Depot

    NASA Technical Reports Server (NTRS)

    Bullington, J. V.; Winkler, J. C.; Linton, D. G.; Khajenoori, S.

    1995-01-01

The NASA Shuttle Logistics Depot (NSLD) is tasked with the responsibility for repair and manufacture of Line Replaceable Unit (LRU) hardware and components to support the Space Shuttle Orbiter. Due to shrinking budgets, cost-effective repair of LRUs becomes a primary objective. To achieve this objective, it is imperative that resources be assigned to those LRUs which have the greatest expectation of being needed as a spare. Forecasting the times at which spares are needed requires consideration of many significant factors including: failure rate, flight rate, spares availability, and desired level of support, among others. This paper summarizes the results of the research and development work that has been accomplished in producing an automated tool that assists in the assignment of effective repair start-times for LRUs at the NSLD. This system, called the Repair Start-time Assessment System (RSAS), uses probabilistic modeling technology to calculate a need date for a repair that considers the current repair pipeline status as well as serviceable spares and projections of future demands. The output from the system is a date for beginning the repair that has significantly greater confidence (in the sense that a desired probability of support is ensured) than times produced using other techniques. Since an important output of RSAS is the longest repair turn-around time that will ensure a desired probability of support, RSAS has the potential for being applied to operations at any repair depot where spares are on-hand and repair start-times are of interest. In addition, RSAS incorporates tenets of Just-in-Time (JIT) techniques in that the latest repair start-time (i.e., the latest time at which repair resources must be committed) may be calculated for every failed unit. This could reduce the spares inventory for certain items without significantly increasing the risk of unsatisfied demand.
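The core RSAS idea, a latest repair start-time that still ensures a desired probability of support, amounts to subtracting a quantile of the repair turn-around-time distribution from the need date. A minimal sketch with a hypothetical lognormal turn-around-time model (RSAS itself derives this from pipeline status and demand projections):

```python
import math
import random
from datetime import date, timedelta

random.seed(0)

# Hypothetical repair turn-around-time (TAT) samples in days.
tat_days = [random.lognormvariate(math.log(30), 0.4) for _ in range(10_000)]

def latest_start(need_date, samples, p_support=0.95):
    """Latest repair start date that still meets the need date with
    probability p_support (need date minus the TAT p-quantile)."""
    q = sorted(samples)[int(p_support * len(samples))]
    return need_date - timedelta(days=math.ceil(q))

start = latest_start(date(2026, 6, 1), tat_days)
print(start)
```

The higher the required probability of support, the deeper into the turn-around-time tail the quantile reaches and the earlier the repair must start.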

  17. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  18. NASA's Space Launch System Advanced Booster Engineering Demonstration and Risk Reduction Efforts

    NASA Technical Reports Server (NTRS)

    Crumbly, Christopher M.; May, Todd; Dumbacher, Daniel

    2012-01-01

The National Aeronautics and Space Administration (NASA) formally initiated the Space Launch System (SLS) development in September 2011, with the approval of the program's acquisition plan, which engages the current workforce and infrastructure to deliver an initial 70 metric ton (t) SLS capability in 2017, while using planned block upgrades to evolve to a full 130 t capability after 2021. A key component of the acquisition plan is a three-phased approach for the first stage boosters. The first phase is to complete the development of the Ares and Space Shuttle heritage 5-segment solid rocket boosters for initial exploration missions in 2017 and 2021. The second phase in the booster acquisition plan is the Advanced Booster Risk Reduction and/or Engineering Demonstration NASA Research Announcement (NRA), which was recently awarded after a full and open competition. The NRA was released to industry on February 9, 2012, and its stated intent was to reduce risks leading to an affordable Advanced Booster and to enable competition. The third and final phase will be a full and open competition for Design, Development, Test, and Evaluation (DDT&E) of the Advanced Boosters. There are no existing boosters that can meet the performance requirements for the 130 t class SLS. The expected thrust class of the Advanced Boosters is potentially double the current 5-segment solid rocket booster capability. These new boosters will enable the flexible path approach to space exploration beyond Earth orbit, opening up vast opportunities including near-Earth asteroids, Lagrange Points, and Mars. This evolved capability offers large volume for science missions and payloads, will be modular and flexible, and will be right-sized for mission requirements. NASA developed the Advanced Booster Engineering Demonstration and/or Risk Reduction NRA to seek industry participation in reducing risks leading to an affordable Advanced Booster that meets the SLS performance requirements.
Demonstrations and

  19. Mind the Gap: Exploring the Underground of the NASA Space Cancer Risk Model

    NASA Technical Reports Server (NTRS)

    Chappell, L. J.; Elgart, S. R.; Milder, C. M.; Shavers, M. R.; Semones, E. J.; Huff, J. L.

    2017-01-01

The REID quantifies the lifetime risk of death from radiation-induced cancer in an exposed astronaut. The NASA Space Cancer Risk (NSCR) 2012 model incorporates elements from physics, biology, epidemiology, and statistics to generate the REID distribution. The current model quantifies the space radiation environment, radiation quality, and dose-rate effects to estimate a NASA-weighted dose. This weighted dose is mapped to the excess risk of radiation-induced cancer mortality from acute exposures to gamma rays and then transferred to an astronaut population. Finally, the REID is determined by integrating this risk over the individual's lifetime. The calculated upper 95% confidence limit of the REID is used to restrict an astronaut's permissible mission duration (PMD) for a proposed mission. As a statistical quantity characterized by broad, subjective uncertainties, REID estimates for space missions result in wide distributions. Currently, the upper 95% confidence level is over 350% larger than the mean REID value, which can severely limit an astronaut's PMD. The model incorporates inputs from multiple scientific disciplines in the risk estimation process. Physics and particle transport models calculate how radiation moves through space, penetrates spacecraft, and makes its way to the human beings onboard. Epidemiological studies of exposures from atomic bombings, medical treatments, and power plants are used to quantify health risks from acute and chronic low linear energy transfer (LET) ionizing radiation. Biological studies in cellular and animal models using radiation at various LETs and energies inform quality metrics for ions present in space radiation. Statistical methodologies unite these elements, controlling for mathematical and scientific uncertainty and variability. Despite current progress, these research platforms contain knowledge gaps contributing to the large uncertainties still present in the model. The NASA Space Radiation Program Element (SRPE
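The gap between the mean REID and its upper 95% confidence level is driven by the spread of the REID distribution. As an illustration only (the NSCR model's actual REID distribution is not assumed here to be lognormal), for a lognormal the ratio of the 95th percentile to the mean depends only on the log-space sigma:

```python
import math

# For a lognormal distribution:
#   mean = exp(mu + sigma**2 / 2),  q95 = exp(mu + 1.645 * sigma)
# so the ratio q95/mean is independent of mu:
#   q95 / mean = exp(1.645 * sigma - sigma**2 / 2)
for sigma in (0.5, 1.0, 1.5):
    ratio = math.exp(1.645 * sigma - sigma**2 / 2)
    print(f"sigma={sigma}: q95/mean = {ratio:.2f}")
```

The ratio peaks near 3.9 at sigma = 1.645, so an upper 95% level more than 350% above the mean already indicates heavier-than-lognormal uncertainty in the assessed risk.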

  20. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  1. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  2. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
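The Monte Carlo PTHA approach described here can be caricatured in a few lines: sample earthquake magnitudes from a Gutenberg-Richter law, map them to coastal tsunami heights with a scattered scaling relation, and convert the event rate into an annual exceedance probability. Every number below is an illustrative assumption, not a value from the Indonesian assessment:

```python
import math
import random

random.seed(42)

RATE = 0.2   # assumed annual rate of tsunamigenic earthquakes (M >= 7)
N = 200_000  # sampled events

def sample_magnitude():
    # Truncated Gutenberg-Richter (b = 1) between M7 and M9.
    u = random.random()
    return 7.0 - math.log10(1 - u * (1 - 10.0 ** -2))

def coastal_height(m):
    # Crude log-linear magnitude-to-height scaling with lognormal scatter.
    return 10.0 ** (0.8 * (m - 7.5)) * random.lognormvariate(0.0, 0.5)

heights = [coastal_height(sample_magnitude()) for _ in range(N)]
frac_gt = sum(h > 0.5 for h in heights) / N
# Poisson occurrence: P(at least one exceedance event in a year).
annual_prob = 1.0 - math.exp(-RATE * frac_gt)
print(f"annual P(coastal height > 0.5 m) ~ {annual_prob:.3f}")
```

A real PTHA replaces each of these toy pieces with logic-tree alternatives and numerical tsunami propagation, but the exceedance-rate bookkeeping is the same.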

  3. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  4. Assessment of possible airborne impact from nuclear risk sites - Part II: probabilistic analysis of atmospheric transport patterns in Euro-Arctic region

    NASA Astrophysics Data System (ADS)

    Mahura, A. G.; Baklanov, A. A.

    2003-10-01

The probabilistic analysis of atmospheric transport patterns from the most important nuclear risk sites in the Euro-Arctic region is performed employing the methodology developed within the "Arctic Risk" Project of the NARP Programme (Baklanov and Mahura, 2003). The risk sites are the nuclear power plants in Northwest Russia, Finland, Sweden, Lithuania, the United Kingdom, and Germany, as well as the Novaya Zemlya test site of Russia. The geographical regions of interest are the Northern and Central European countries and Northwest Russia. In this study, the employed research tools are a trajectory model, used to calculate a multiyear dataset of forward trajectories originating over the risk site locations, and a set of statistical methods (including exploratory, cluster, and probability fields analyses) for analysis of the trajectory modelling results. The probabilistic analyses of trajectory modelling results for eleven sites are presented as a set of various indicators of the risk sites' possible impact on geographical regions and countries of interest. A nuclear risk site's possible impact (on a particular geographical region, territory, country, site, etc.) due to atmospheric transport from the site after a hypothetical accidental release of radioactivity can be properly estimated based on a combined interpretation of the indicators (simple characteristics, atmospheric transport pathways, airflow and fast transport probability fields, maximum reaching distance and maximum possible impact zone, typical transport time and precipitation factor fields) for different time periods (annual, seasonal, and monthly) for any selected site (both separately for each site or grouped for several sites) in the Euro-Arctic region. Such estimates could be useful input for decision-making, risk assessment, and planning of emergency response systems for sites of nuclear, chemical, and biological danger.

  5. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.

  6. Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions

    NASA Technical Reports Server (NTRS)

    Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.

    2014-01-01

    Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.
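The BBN machinery used in these safety risk models reduces, in the smallest case, to marginalizing over a conditional probability table and applying Bayes' rule for diagnostic inference. A two-node sketch with hypothetical CPTs (not the elicited LOCAF/FLAP/RI parameters):

```python
# Minimal two-node Bayesian Belief Network: an automation-problem risk
# factor feeding a mishap node. All probabilities are hypothetical.
p_auto = 0.05                                   # P(automation problem)
p_mishap_given = {True: 0.02, False: 0.0005}    # assumed elicited CPT

# Marginalize: P(mishap) = sum over states of the risk factor.
p_mishap = sum(
    (p_auto if a else 1 - p_auto) * p_mishap_given[a] for a in (True, False)
)
print(f"P(mishap) = {p_mishap:.6f}")

# Diagnostic inference via Bayes' rule: P(automation problem | mishap).
p_auto_given_mishap = p_auto * p_mishap_given[True] / p_mishap
print(f"P(auto problem | mishap) = {p_auto_given_mishap:.3f}")
```

Full models chain many such tables, which is why the knowledge elicitation sessions focus on conditional probabilities the experts can actually judge.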

  7. Cardiovascular Disease Risk in NASA Astronauts Across the Lifespan: Historical Cohort Studies

    NASA Technical Reports Server (NTRS)

    Charvat, Jacqueline M.; Lee, Stuart M. C.; Davenport, Eddie; Barlow, Carolyn E.; Radford, Nina B.; De Fina, Laura F.; Stenger, Michael B.; Van Baalen, Mary

    2017-01-01

    Acute effects of spaceflight on the cardiovascular system have been studied extensively, but the combined chronic effects of spaceflight and aging are not well understood. Preparation for and participation in space flight activities are potentially associated with cardiovascular disease risk factors (e.g., altered dietary and exercise habits, physical and emotional stress, circadian shifts, radiation). Further, astronauts who travel into space multiple times may be at an increased risk across their lifespan. However, comparing the risk of cardiovascular disease in astronauts to other large cohorts is difficult. For example, comparisons between astronauts and large national cohorts, such as the National Health and Nutrition Examination Survey and the National Health Information Survey, are hampered by significant differences in health status between astronauts and the general population, and most of these national studies fail to provide longitudinal data on population health. To address those limitations, NASA's Longitudinal Study of Astronaut Health previously sought to compare the astronauts to a cohort of civil servants employed at the Johnson Space Center. However, differences between the astronauts and civil servants at the beginning of the study, as well as differential follow up, limited the ability to interpret the results. To resolve some of these limitations, two unique cohorts of healthy workers, U.S. Air Force aviators and Cooper Center Longitudinal Study participants, have been identified as potential comparison populations for the astronaut corps. The Air Force cohort was chosen due to similarities in health at selection, screening, and some occupational exposures that Air Force aviators endure, many of which mirror that of the astronaut corps. The Cooper Clinic cohort, a generally healthy prevention cohort, was chosen for the vast array of clinical cardiovascular measures collected in a longitudinal manner complementary to those collected on

  8. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    NASA Technical Reports Server (NTRS)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  9. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice Italy, 8 November 2006. The term "hazard" can lead to some misunderstanding. In English, hazard has the generic meaning "potential source of danger," but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative way, which reads, "the probability of a certain hazardous event in a specific time-space window." However, many volcanologists still use "hazard" and "volcanic hazard" in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled "Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes" (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is "probabilistic volcanic hazard assessment" (PVHA).

  10. Comparing probabilistic microbial risk assessments for drinking water against daily rather than annualised infection probability targets.

    PubMed

    Signor, R S; Ashbolt, N J

    2009-12-01

Some national drinking water guidelines provide guidance on how to define 'safe' drinking water. Regarding microbial water quality, a common position is that the chance of an individual becoming infected by some reference waterborne pathogen (e.g. Cryptosporidium) present in the drinking water should be less than 10^-4 in any year. However, the instantaneous levels of risk to a water consumer vary over the course of a year, and waterborne disease outbreaks have been associated with shorter-duration periods of heightened risk. Performing probabilistic microbial risk assessments is becoming commonplace to capture the impacts of temporal variability on overall infection risk levels. A case is presented here for adoption of a shorter-duration (i.e. daily) reference-period infection probability target over which to assess, report and benchmark such risks. A daily infection probability benchmark may provide added incentive and guidance for exercising control over short-term adverse risk fluctuation events and their causes. Management planning could involve outlining measures so that the daily target is met under a variety of pre-identified event scenarios. Other benefits of a daily target could include providing a platform for managers to design and assess management initiatives, as well as simplifying the technical components of the risk assessment process.
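The relationship between an annualised and a daily infection probability target follows from independence across days: the annual risk is one minus the product of the daily non-infection probabilities. A short sketch showing the uniform daily target implied by the 10^-4 annual benchmark, and how a single heightened-risk day can dominate the annual total (the event-day probabilities are hypothetical):

```python
# Annual risk from independent daily risks p_i: 1 - prod(1 - p_i).
annual_target = 1e-4

# Constant-risk case: the daily target consistent with the annual one.
p_daily = 1 - (1 - annual_target) ** (1 / 365)
print(f"uniform daily target ~ {p_daily:.2e}")  # ~2.7e-7

# One heightened-risk day can consume most of the annual budget even
# when ordinary days sit well below the uniform daily target.
ordinary, event = 1e-7, 5e-5
annual = 1 - (1 - ordinary) ** 364 * (1 - event)
print(f"annual risk with one event day: {annual:.2e}")
```

This asymmetry is the paper's argument for benchmarking the daily risk directly rather than only its annual aggregate.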

  11. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.

  12. Independent Review of U.S. and Russian Probabilistic Risk Assessments for the International Space Station Mini Research Module #2 Micrometeoroid and Orbital Debris Risk

    NASA Technical Reports Server (NTRS)

    Squire, Michael D.

    2011-01-01

    The Mini-Research Module-2 (MRM-2), a Russian module on the International Space Station, does not meet its requirements for micrometeoroid and orbital debris probability of no penetration (PNP). To document this condition, the primary Russian Federal Space Agency ISS contractor, S.P. Korolev Rocket and Space Corporation-Energia (RSC-E), submitted an ISS non-compliance report (NCR) which was presented at the 5R Stage Operations Readiness Review (SORR) in October 2009. In the NCR, RSC-E argued for waiving the PNP requirement based on several factors, one of which was that the risk of catastrophic failure was acceptably low at 1 in 11,100. However, NASA independently performed an assessment of the catastrophic risk resulting in a value of 1 in 1380 and believed that the risk at that level was unacceptable. The NASA Engineering and Safety Center was requested to evaluate the two competing catastrophic risk values and determine which was more accurate. This document contains the outcome of the assessment.

  13. The Experimental Breeder Reactor II seismic probabilistic risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roglans, J; Hill, D J

    1994-02-01

    The Experimental Breeder Reactor II (EBR-II) is a US Department of Energy (DOE) Category A research reactor located at Argonne National Laboratory (ANL)-West in Idaho. EBR-II is a 62.5 MW-thermal Liquid Metal Reactor (LMR) that started operation in 1964 and is currently being used as a testbed in the Integral Fast Reactor (IFR) Program. ANL has completed a Level 1 Probabilistic Risk Assessment (PRA) for EBR-II. The Level 1 PRA for internal events and most external events was completed in June 1991. The seismic PRA for EBR-II has recently been completed. The EBR-II reactor building contains the reactor, the primary system, and the decay heat removal systems. The reactor vessel, which contains the core, and the primary system, consisting of two primary pumps and an intermediate heat exchanger, are immersed in the sodium-filled primary tank, which is suspended by six hangers from a beam support structure. Three systems or functions in EBR-II were identified as the most significant from the standpoint of risk of seismic-induced fuel damage: (1) the reactor shutdown system, (2) the structural integrity of the passive decay heat removal systems, and (3) the integrity of major structures, like the primary tank containing the reactor, that could threaten both the reactivity control and decay heat removal functions. As part of the seismic PRA, efforts were concentrated on studying these three functions or systems. The passive safety response of the EBR-II reactor -- both passive reactivity shutdown and passive decay heat removal, demonstrated in a series of tests in 1986 -- was explicitly accounted for in the seismic PRA, as it had been included in the internal events assessment.

  14. Incorporating psychological influences in probabilistic cost analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an 'ideal' project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, 'Money Allocated Is Money Spent' (the MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies
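
    The MAIMS effect is easy to reproduce in a Monte Carlo sketch (the task data and Weibull parameters below are illustrative inventions, not the paper's): each cost element is drawn from a three-parameter Weibull distribution, and under MAIMS a task never returns its underrun, so the expected total exceeds the 'ideal' statistical sum.

```python
import random

random.seed(42)

# Hypothetical cost elements: (allocated budget, Weibull shape, scale, location).
# Three-parameter Weibull cost: location + a two-parameter Weibull draw.
tasks = [
    (10.0, 2.0, 4.0, 8.0),
    (15.0, 1.5, 6.0, 11.0),
    (8.0, 2.5, 3.0, 6.0),
]

def total_cost(maims: bool) -> float:
    total = 0.0
    for budget, shape, scale, loc in tasks:
        cost = loc + random.weibullvariate(scale, shape)
        # MAIMS: money allocated is money spent -- underruns are lost,
        # while overruns pass through to the project total.
        total += max(cost, budget) if maims else cost
    return total

n = 20_000
ideal = sum(total_cost(False) for _ in range(n)) / n
with_maims = sum(total_cost(True) for _ in range(n)) / n
print(f"ideal mean cost {ideal:.1f}, MAIMS mean cost {with_maims:.1f}")
```

    The MAIMS mean is systematically higher, which is the paper's point: a budget set at the 'ideal' expected cost will, on average, be exceeded.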

  15. Review of NASA's Evidence Reports on Human Health Risks. 2015 Letter Report

    NASA Technical Reports Server (NTRS)

    Scott-Conner, Carol E. H.; Masys, Daniel R.; Liverman, Catharyn T.

    2016-01-01

    NASA has requested a study from the Institute of Medicine (IOM) to provide an independent review of more than 30 evidence reports on human health risks for long duration and exploration spaceflight. The evidence reports, which are publicly available, are categorized into five broad categories: (1) behavioral health and performance; (2) human health countermeasures (with a focus on bone metabolism and orthopedics, nutrition, immunology, and cardiac and pulmonary physiology); (3) radiation; (4) human factors issues; and (5) exploration medical capabilities. The reports are revised on an ongoing basis to incorporate new scientific information. In conducting this study, an IOM ad hoc committee will build on the 2008 IOM report Review of NASA's Human Research Program Evidence Books. That report provided an assessment of the process used for developing the evidence reports and provided an initial review of the evidence reports that had been completed at that time. Each year, NASA staff will identify a set of evidence reports for committee review. Over the course of the study all evidence reports will be reviewed. The committee will hold an annual scientific workshop to receive input on the evidence reports it is reviewing that year and an update on the recent literature. The committee will issue an annual letter report that addresses the following questions relevant to each evidence report: 1. Does the evidence report provide sufficient evidence, as well as sufficient risk context, that the risk is of concern for long-term space missions? 2. Does the evidence report make the case for the research gaps presented? 3. Are there any additional gaps in knowledge or areas of fundamental research that should be considered to enhance the basic understanding of this specific risk? 4. Does the evidence report address relevant interactions among risks? 5. Is input from additional disciplines needed? 6. Is the breadth of the cited literature sufficient? 7. What is the overall

  16. Internal NASA Study: NASA's Protoflight Research Initiative

    NASA Technical Reports Server (NTRS)

    Coan, Mary R.; Hirshorn, Steven R.; Moreland, Robert

    2015-01-01

    The NASA Protoflight Research Initiative is an internal NASA study conducted within the Office of the Chief Engineer to better understand the use of Protoflight within NASA. Extensive literature reviews and interviews with key NASA members with experience in both robotic and human spaceflight missions have resulted in three main conclusions and two observations. The first conclusion is that NASA's Protoflight method is not considered to be "prescriptive." The current policies and guidance allow each Program/Project to tailor the Protoflight approach to better meet its needs, goals and objectives. Second, Risk Management plays a key role in implementation of the Protoflight approach. Any deviations from full qualification will be based on the level of acceptable risk, with guidance found in NPR 8705.4. Finally, over the past decade (2004 - 2014) only 6% of NASA's Protoflight missions and 6% of NASA's full qualification missions experienced a publicly disclosed mission failure. In other words, the data indicate that the Protoflight approach, in and of itself, does not increase the mission risk of in-flight failure. The first observation is that it would be beneficial to document the decision-making process on the implementation and use of Protoflight. The second observation is that if a Project/Program chooses to use the Protoflight approach with relevant heritage, it is extremely important that the Program/Project Manager ensure that the current project's requirements fall within the heritage design, component, instrument and/or subsystem's requirements for both the planned and operational use, and that the documentation of the relevant heritage is comprehensive and sufficient and the decision well documented. To further benefit/inform this study, a recommendation was made to perform a deep dive into 30 missions with accessible data on their testing/verification methodology and decision process to research the differences between Protoflight and Full Qualification

  17. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Abstract Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
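
    The weight calculation the article walks through can be sketched as follows; the m and u probabilities, the field names, and the prior are invented for illustration and are not taken from the article.

```python
import math

# Fellegi-Sunter style weights: for each field, m = P(agree | true match)
# and u = P(agree | non-match). Values below are made up for illustration.
fields = {
    "surname":    (0.95, 0.01),
    "birth_year": (0.98, 0.05),
    "postcode":   (0.90, 0.002),
}

def match_weight(agreement: dict) -> float:
    """Sum of log2 likelihood ratios across the compared fields."""
    w = 0.0
    for name, (m, u) in fields.items():
        if agreement[name]:
            w += math.log2(m / u)
        else:
            w += math.log2((1 - m) / (1 - u))
    return w

def posterior(weight: float, prior: float) -> float:
    """Posterior probability of a match via Bayes' theorem on the odds scale."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * 2.0 ** weight
    return posterior_odds / (1 + posterior_odds)

w = match_weight({"surname": True, "birth_year": True, "postcode": True})
# Even with a low prior (most candidate pairs are non-matches), agreement
# on all three fields yields a high posterior probability of a true match.
print(f"weight {w:.2f}, posterior {posterior(w, prior=1e-4):.3f}")
```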

  18. Probabilistic Risk Assessment for Decision Making During Spacecraft Operations

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2009-01-01

    Decisions made during the operational phase of a space mission often have significant and immediate consequences. Without the explicit consideration of the risks involved and their representation in a solid model, it is very likely that these risks are not considered systematically in trade studies. Wrong decisions during the operational phase of a space mission can lead to immediate system failure, whereas correct decisions can help recover the system even from faulty conditions. A problem of special interest is the determination of the system fault protection strategies upon the occurrence of faults within the system. Decisions regarding the fault protection strategy also rely heavily on a correct understanding of the state of the system and an integrated risk model that represents the various possible scenarios and their respective likelihoods. Probabilistic Risk Assessment (PRA) modeling is applicable to the full lifecycle of a space mission project, from concept development to preliminary design, detailed design, development and operations. The benefits and utilities of the model, however, depend on the phase of the mission for which it is used. This is because of the difference in the key strategic decisions that support each mission phase. The focus of this paper is on describing the particular methods used for PRA modeling during the operational phase of a spacecraft by gleaning insight from recently conducted case studies on two operational Mars orbiters. During operations, the key decisions relate to the commands sent to the spacecraft for any kind of diagnostics, anomaly resolution, trajectory changes, or planning. Often, faults and failures occur in parts of the spacecraft but are contained or mitigated before they can cause serious damage. The failure behavior of the system during operations provides valuable data for updating and adjusting the related PRA models that are built primarily based on historical failure data. The PRA models, in turn

  19. Efficient Probabilistic Diagnostics for Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Chavira, Mark; Cascio, Keith; Poll, Scott; Darwiche, Adnan; Uckun, Serdar

    2008-01-01

    We consider in this work the probabilistic approach to model-based diagnosis when applied to electrical power systems (EPSs). Our probabilistic approach is formally well-founded, as it is based on Bayesian networks and arithmetic circuits. We investigate the diagnostic task known as fault isolation, and pay special attention to meeting two of the main challenges, model development and real-time reasoning, often associated with real-world application of model-based diagnosis technologies. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In essence, we introduce a high-level EPS specification language from which Bayesian networks that can diagnose multiple simultaneous failures are auto-generated, and we illustrate the feasibility of using arithmetic circuits, compiled from Bayesian networks, for real-time diagnosis on real-world EPSs of interest to NASA. The experimental system is a real-world EPS, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. In experiments with the ADAPT Bayesian network, which currently contains 503 discrete nodes and 579 edges, we find high diagnostic accuracy in scenarios where one to three faults, both in components and sensors, were inserted. The time taken to compute the most probable explanation using arithmetic circuits has a small mean of 0.2625 milliseconds and standard deviation of 0.2028 milliseconds. In experiments with data from ADAPT we also show that arithmetic circuit evaluation substantially outperforms joint tree propagation and variable elimination, two alternative algorithms for diagnosis using Bayesian network inference.
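
    To make the "most probable explanation" task concrete, here is a toy two-component network evaluated by brute-force enumeration; the components, probabilities, and sensor model are invented and bear no relation to the ADAPT model, which uses compiled arithmetic circuits rather than enumeration precisely because enumeration does not scale.

```python
from itertools import product

# Invented prior failure probabilities for two components.
P_FAIL = {"battery": 0.01, "relay": 0.02}

def p_sensor_low(battery_ok: bool, relay_ok: bool) -> float:
    """P(sensor reads 'low' | component health); an assumed sensor model."""
    return 0.01 if (battery_ok and relay_ok) else 0.95

# Evidence: the sensor reads "low". Enumerate all health assignments and
# keep the one with the highest joint probability -- the MPE.
best, best_p = None, -1.0
for battery_ok, relay_ok in product([True, False], repeat=2):
    p = (1 - P_FAIL["battery"]) if battery_ok else P_FAIL["battery"]
    p *= (1 - P_FAIL["relay"]) if relay_ok else P_FAIL["relay"]
    p *= p_sensor_low(battery_ok, relay_ok)
    if p > best_p:
        best, best_p = (battery_ok, relay_ok), p

print(best)  # (True, False): the less reliable relay is the blamed component
```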

  20. Experimental and analytical studies for the NASA carbon fiber risk assessment

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Various experimental and analytical studies performed for the NASA carbon fiber risk assessment program are described with emphasis on carbon fiber characteristics, sensitivity of electrical equipment and components to shorting or arcing by carbon fibers, attenuation effect of carbon fibers on aircraft landing aids, impact of carbon fibers on industrial facilities. A simple method of estimating damage from airborne carbon fibers is presented.

  1. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk

  2. Space Launch System NASA Research Announcement Advanced Booster Engineering Demonstration and/or Risk Reduction

    NASA Technical Reports Server (NTRS)

    Crumbly, Christopher M.; Craig, Kellie D.

    2011-01-01

    The intent of the Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) effort is to: (1) reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS; and (2) enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Key concepts: (1) Offerors must propose an Advanced Booster concept that meets SLS Program requirements; (2) Engineering Demonstration and/or Risk Reduction work must relate to the Offeror's Advanced Booster concept; (3) the NASA Research Announcement (NRA) will not be prescriptive in defining Engineering Demonstration and/or Risk Reduction.

  3. Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report

    NASA Technical Reports Server (NTRS)

    Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick

    2009-01-01

    The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decision makers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience, specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques.
The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts

  4. NASA's Orbital Space Plane Risk Reduction Strategy

    NASA Technical Reports Server (NTRS)

    Dumbacher, Dan

    2003-01-01

    This paper documents the transformation of NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle Program under the revised Integrated Space Transportation Plan, announced November 2002. Outlining the technology development approach followed by the original SLI, this paper gives insight into the current risk-reduction strategy that will enable confident development of the Nation's first orbital space plane (OSP). The OSP will perform an astronaut and contingency cargo transportation function, with an early crew rescue capability, thus enabling increased crew size and enhanced science operations aboard the International Space Station. The OSP design chosen for full-scale development will take advantage of the latest innovations American industry has to offer. The OSP Program identifies critical technologies that must be advanced to field a safe, reliable, affordable space transportation system for U.S. access to the Station and low-Earth orbit. OSP flight demonstrators will test crew safety features, validate autonomous operations, and mature thermal protection systems. Additional enabling technologies may be identified during the OSP design process as part of an overall risk-management strategy. The OSP Program uses a comprehensive and evolutionary systems acquisition approach, while applying appropriate lessons learned.

  5. Integration of NASA-Developed Lifing Technology for PM Alloys into DARWIN (registered trademark)

    NASA Technical Reports Server (NTRS)

    McClung, R. Craig; Enright, Michael P.; Liang, Wuwei

    2011-01-01

    In recent years, Southwest Research Institute (SwRI) and NASA Glenn Research Center (GRC) have worked independently on the development of probabilistic life prediction methods for materials used in gas turbine engine rotors. The two organizations have addressed different but complementary technical challenges. This report summarizes a brief investigation into the current status of the relevant technology at SwRI and GRC with a view towards a future integration of methods and models developed by GRC for probabilistic lifing of powder metallurgy (P/M) nickel turbine rotor alloys into the DARWIN (Darwin Corporation) software developed by SwRI.

  6. Segmentation of risk structures for otologic surgery using the Probabilistic Active Shape Model (PASM)

    NASA Astrophysics Data System (ADS)

    Becker, Meike; Kirschner, Matthias; Sakas, Georgios

    2014-03-01

    Our research project investigates a multi-port approach for minimally invasive otologic surgery. For planning such a surgery, an accurate segmentation of the risk structures is crucial. However, the segmentation of these risk structures is a challenging task: the anatomical structures are very small, some have a complex shape, contrast is low, and the structures vary both in shape and appearance. Therefore, prior knowledge is needed, which is why we apply model-based approaches. In the present work, we use the Probabilistic Active Shape Model (PASM), a more flexible and specific variant of the Active Shape Model (ASM), to segment the following risk structures: cochlea, semicircular canals, facial nerve, chorda tympani, ossicles, internal auditory canal, external auditory canal and internal carotid artery. For the evaluation we trained and tested the algorithm on 42 computed tomography data sets using leave-one-out tests. Visual assessment of the results shows, in general, good agreement between manual and algorithmic segmentations. Further, we achieve a good Average Symmetric Surface Distance, while the maximum error is comparatively large due to low contrast at start and end points. Finally, we compare the PASM to the standard ASM and show that the PASM leads to higher accuracy.

  7. NASA's Space Launch System Advanced Booster Engineering Demonstration and/or Risk Reduction Efforts

    NASA Technical Reports Server (NTRS)

    Crumbly, Christopher M.; Dumbacher, Daniel L.; May, Todd A.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) formally initiated the Space Launch System (SLS) development in September 2011, with the approval of the program's acquisition plan, which engages the current workforce and infrastructure to deliver an initial 70 metric ton (t) SLS capability in 2017, while using planned block upgrades to evolve to a full 130 t capability after 2021. A key component of the acquisition plan is a three-phased approach for the first stage boosters. The first phase is to complete the development of the Ares and Space Shuttle heritage 5-segment solid rocket boosters (SRBs) for initial exploration missions in 2017 and 2021. The second phase in the booster acquisition plan is the Advanced Booster Risk Reduction and/or Engineering Demonstration NASA Research Announcement (NRA), which was recently awarded after a full and open competition. The NRA was released to industry on February 9, 2012, with a stated intent to reduce risks leading to an affordable advanced booster and to enable competition. The third and final phase will be a full and open competition for Design, Development, Test, and Evaluation (DDT&E) of the advanced boosters. There are no existing boosters that can meet the performance requirements for the 130 t class SLS. The expected thrust class of the advanced boosters is potentially double the current 5-segment solid rocket booster capability. These new boosters will enable the flexible path approach to space exploration beyond Earth orbit (BEO), opening up vast opportunities including near-Earth asteroids, Lagrange Points, and Mars. This evolved capability offers large volume for science missions and payloads, will be modular and flexible, and will be right-sized for mission requirements. NASA developed the Advanced Booster Engineering Demonstration and/or Risk Reduction NRA to seek industry participation in reducing risks leading to an affordable advanced booster that meets the SLS performance requirements

  8. Probabilistic human health risk assessment of degradation-related chemical mixtures in heterogeneous aquifers: Risk statistics, hot spots, and preferential channels

    NASA Astrophysics Data System (ADS)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2015-06-01

    The increasing presence of toxic chemicals released in the subsurface has led to a rapid growth of social concerns and the need to develop and employ models that can predict the impact of groundwater contamination on human health risk under uncertainty. Monitored natural attenuation is a common remediation action in many contamination cases. However, natural attenuation can lead to the production of daughter species of distinct toxicity that may pose challenges in pollution management strategies. The actual threat that these contaminants pose to human health depends on the interplay between the complex structure of the geological media and the toxicity of each pollutant byproduct. This work addresses human health risk for chemical mixtures resulting from the sequential degradation of a contaminant (such as a chlorinated solvent) under uncertainty through high-resolution three-dimensional numerical simulations. We systematically investigate the interaction between aquifer heterogeneity, flow connectivity, contaminant injection model, and chemical toxicity in the probabilistic characterization of health risk. We illustrate how chemical-specific travel times control the regime of the expected risk and its corresponding uncertainties. Results indicate conditions where preferential flow paths can favor the reduction of the overall risk of the chemical mixture. The overall human risk response to aquifer connectivity is shown to be nontrivial for multispecies transport. This nontriviality is a result of the interaction between aquifer heterogeneity and chemical toxicity. To quantify the joint effect of connectivity and toxicity in health risk, we propose a toxicity-based Damköhler number. Furthermore, we provide a statistical characterization in terms of low-order moments and the probability density function of the individual and total risks.

  9. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
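
    The bagging idea behind such models can be sketched in a few lines; the depth-damage data below are invented, and one-split regression "stumps" stand in for full decision trees (the real model is multi-variate and trained on survey data):

```python
import random
import statistics

random.seed(0)

# Hypothetical training data: (water depth in m, relative building damage).
data = [(0.2, 0.02), (0.5, 0.05), (0.8, 0.10), (1.2, 0.18),
        (1.6, 0.25), (2.0, 0.35), (2.5, 0.45), (3.0, 0.55)]

def fit_stump(sample):
    """One-split regression stump: pick the depth threshold with least error."""
    best = None
    for threshold, _ in sample:
        lo = [dmg for d, dmg in sample if d <= threshold]
        hi = [dmg for d, dmg in sample if d > threshold]
        if not lo or not hi:
            continue
        m_lo, m_hi = statistics.mean(lo), statistics.mean(hi)
        err = sum((dmg - (m_lo if d <= threshold else m_hi)) ** 2
                  for d, dmg in sample)
        if best is None or err < best[0]:
            best = (err, threshold, m_lo, m_hi)
    _, threshold, m_lo, m_hi = best
    return lambda d: m_lo if d <= threshold else m_hi

# Bagging: each bootstrap resample yields one stump; the ensemble of
# predictions forms a distribution rather than a single damage value.
stumps = [fit_stump([random.choice(data) for _ in data]) for _ in range(200)]
preds = [s(1.5) for s in stumps]
print(f"damage at 1.5 m depth: mean {statistics.mean(preds):.2f}, "
      f"range {min(preds):.2f}-{max(preds):.2f}")
```

    The spread of the 200 stump predictions is the built-in uncertainty statement, which is the advantage the abstract highlights over single-valued stage-damage functions.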

  10. A probabilistic asteroid impact risk model: assessment of sub-300 m impacts

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2017-06-01

    A comprehensive asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain input parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions for objects up to 300 m in diameter. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data have little effect on the metrics of interest.
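
    The core Monte Carlo loop of such an assessment can be sketched as below; the uniform input distributions and parameter ranges are placeholders, whereas the published model uses impact-frequency statistics, entry physics, and population exposure rather than energy alone.

```python
import math
import random

random.seed(7)

def impact_energy_mt(diameter_m: float, density: float, velocity_ms: float) -> float:
    """Kinetic energy of a spherical impactor in megatons TNT (1 Mt = 4.184e15 J)."""
    mass = density * math.pi / 6 * diameter_m ** 3
    return 0.5 * mass * velocity_ms ** 2 / 4.184e15

# Sample uncertain impactor properties and aggregate the scenario outcomes.
energies = []
for _ in range(10_000):
    d = random.uniform(50, 300)        # diameter, m (placeholder range)
    rho = random.uniform(1500, 3500)   # bulk density, kg/m^3
    v = random.uniform(12e3, 25e3)     # entry velocity, m/s
    energies.append(impact_energy_mt(d, rho, v))

energies.sort()
print(f"median {energies[len(energies) // 2]:.0f} Mt, "
      f"95th percentile {energies[int(0.95 * len(energies))]:.0f} Mt")
```

    In the full model each sampled scenario would additionally be propagated through atmospheric-entry, ground-damage, and affected-population calculations before aggregation against a risk tolerance posture.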

  11. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios that maintains the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events, comprising some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could substantially speed up full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
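
    The scenario-reduction step can be illustrated with a greedy stand-in for the paper's mixed-integer linear program: iteratively add the synthetic scenario that, after rescaling total rates, minimizes the maximum error against the full hazard curve. The catalogue, thresholds, and all numbers below are synthetic:

```python
import random

def hazard_curve(scenarios, thresholds):
    """Annual exceedance rate at each intensity threshold."""
    return [sum(rate for inten, rate in scenarios if inten >= t)
            for t in thresholds]

def greedy_reduce(scenarios, thresholds, k):
    """Greedy stand-in for the paper's mixed-integer program: pick k
    scenarios whose rescaled hazard curve best matches the full one."""
    target = hazard_curve(scenarios, thresholds)
    total_rate = sum(rate for _, rate in scenarios)
    chosen, best_err = [], None
    for _ in range(k):
        best = None
        for cand in scenarios:
            if cand in chosen:
                continue
            trial = chosen + [cand]
            scale = total_rate / sum(rate for _, rate in trial)
            curve = hazard_curve([(i, r * scale) for i, r in trial], thresholds)
            err = max(abs(c - t) for c, t in zip(curve, target))
            if best is None or err < best[0]:
                best = (err, cand)
        best_err, pick = best
        chosen.append(pick)
    return chosen, best_err

# Synthetic catalogue: intensity ~ exponential, rate ~ uniform (toy numbers).
rng = random.Random(3)
scenarios = [(rng.expovariate(1.0), rng.uniform(1e-4, 1e-2))
             for _ in range(300)]
thresholds = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]
reduced, err = greedy_reduce(scenarios, thresholds, k=15)
print(len(reduced), round(err, 4))
```

    A true mixed-integer formulation optimizes the selection and weights jointly; the greedy loop above only approximates that, but shows the objective: a small subset whose rescaled curve tracks the full hazard curve.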

  12. Using NASA Satellite Observations to Map Wildfire Risk in the United States for Allocation of Fire Management Resources

    NASA Astrophysics Data System (ADS)

    Farahmand, A.; Reager, J. T., II; Behrangi, A.; Stavros, E. N.; Randerson, J. T.

    2017-12-01

    Fires are a key disturbance globally, acting as a catalyst for terrestrial ecosystem change and contributing significantly to both carbon emissions and changes in surface albedo. The socioeconomic impacts of wildfires are also significant, with wildfire activity resulting in billions of dollars of losses every year. Fire size, area burned and frequency are increasing; thus, predicting the likelihood of fire danger, defined by the United States National Interagency Fire Center (NIFC) as the demand for fire management resources as a function of how far fuel flammability (a function of ignitability, consumability and availability) departs from normal, is an important step toward reducing costs associated with wildfires. Numerous studies have aimed to predict the likelihood of fire danger, but few use remote sensing data to map fire danger at scales commensurate with regional management decisions (e.g., deployment of resources nationally throughout the fire season with seasonal and monthly prediction). Here, we use NASA Gravity Recovery And Climate Experiment (GRACE) assimilated surface soil moisture, NASA Atmospheric Infrared Sounder (AIRS) vapor pressure deficit, NASA Moderate Resolution Imaging Spectroradiometer (MODIS) enhanced vegetation index and land-cover products, along with US Forest Service historical fire activity data, to generate probabilistic monthly fire potential maps for the United States. These maps can be useful not only for governmental allocation of fire management resources, but also for improving our understanding of the Earth system and how it is changing, in order to refine predictions of fire extremes.
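
    One common way to turn such predictors into a probabilistic fire-potential map is a logistic combination per grid cell. The sketch below is only illustrative: the weights are assumed, not fitted to the historical fire record the abstract describes:

```python
import math

def fire_potential(soil_moisture, vpd, evi,
                   weights=(-3.0, -4.0, 2.5, 1.5)):
    """Toy logistic fire-potential score from remotely sensed predictors.
    Drier soil and higher vapor pressure deficit raise the score; more
    vegetation (fuel) raises it too. Weights are illustrative only."""
    b0, w_sm, w_vpd, w_evi = weights
    z = b0 + w_sm * soil_moisture + w_vpd * vpd + w_evi * evi
    return 1.0 / (1.0 + math.exp(-z))

# Three hypothetical grid cells: (soil moisture, vapor pressure deficit, EVI)
cells = {"dry, stressed": (0.05, 3.0, 0.4),
         "wet":           (0.40, 0.5, 0.4),
         "moderate":      (0.15, 2.0, 0.6)}
for name, (sm, vpd, evi) in cells.items():
    print(f"{name}: fire potential {fire_potential(sm, vpd, evi):.2f}")
```

    In practice the weights would be estimated by regressing historical fire occurrence on the predictor fields, and the map would be re-evaluated monthly as new observations arrive.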

  13. Application of Probabilistic Modeling to Quantify the Reduction Levels of Hepatocellular Carcinoma Risk Attributable to Chronic Aflatoxins Exposure.

    PubMed

    Wambui, Joseph M; Karuri, Edward G; Ojiambo, Julia A; Njage, Patrick M K

    2017-01-01

    Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus infection further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose the rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive probability distributions of HCC cases and percentage reduction levels of the risk from secondary data. Contaminated maize and groundnuts contributed 1,847 ± 514 and 158 ± 52 HCC cases per annum, respectively. The contributions of the two foods to the risk were additive, resulting in 2,000 ± 518 cases per annum. Consumption and contamination levels contributed significantly to the risk, with lower age groups being the most affected. Nonetheless, pre- and postharvest interventions might reduce the risk by 23.0-83.4% and 4.8-95.1%, respectively. Therefore, chronic exposure to aflatoxins increases the HCC risk in rural Kenya, but a significant reduction of the risk can be achieved by applying specific pre- and postharvest interventions.
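
    A probabilistic model of this kind can be sketched as a Monte Carlo propagation of exposure distributions into case counts; every distribution and the potency constant below are illustrative placeholders, not the paper's inputs:

```python
import random

def annual_hcc_cases(pop, rng, n=20_000, reduction=0.0):
    """Toy Monte Carlo: HCC cases per year from aflatoxin exposure.
    `reduction` is the fractional cut in contamination achieved by a
    (pre- or postharvest) intervention. All distributions are made up."""
    potency = 0.01 / 100_000          # cancers/yr per ng/kg bw/day (toy)
    cases = []
    for _ in range(n):
        intake = rng.lognormvariate(5.0, 0.5)                    # g maize/day
        contam = rng.lognormvariate(2.0, 1.0) * (1.0 - reduction)  # ng/g
        bw = rng.gauss(60.0, 10.0)                               # kg body weight
        exposure = intake * contam / max(bw, 30.0)               # ng/kg bw/day
        cases.append(exposure * potency * pop)
    return cases

mean = lambda xs: sum(xs) / len(xs)
base = annual_hcc_cases(1_000_000, random.Random(7))
mitigated = annual_hcc_cases(1_000_000, random.Random(7), reduction=0.5)
print(f"baseline {mean(base):.0f} cases/yr; "
      f"50% contamination cut -> {mean(mitigated):.0f} cases/yr")
```

    Running the baseline and intervention scenarios on the same random stream isolates the intervention's effect; because the model is linear in contamination, a 50% contamination cut halves the case count exactly.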

  14. Using probabilistic terrorism risk modeling for regulatory benefit-cost analysis: application to the Western hemisphere travel initiative in the land environment.

    PubMed

    Willis, Henry H; LaTourrette, Tom

    2008-04-01

    This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate of the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that halves the annualized terrorism loss would double the critical risk reduction (14-26%), while basing them on a higher risk level that doubles the annualized terrorism loss would halve the critical risk reduction (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the
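
    The critical risk reduction described here is simply the ratio of annualized program cost to annualized terrorism loss. A minimal sketch; the $0.36B annualized cost figure is an assumed, illustrative value, not one taken from the study:

```python
def critical_risk_reduction(annual_cost, annual_loss):
    """Fraction by which a program must reduce terrorism losses for its
    annualized benefit to equal its annualized cost (break-even point)."""
    return annual_cost / annual_loss

COST = 0.36e9   # hypothetical annualized regulatory cost (illustrative)
for loss in (2.7e9, 5.2e9):
    crr = critical_risk_reduction(COST, loss)
    print(f"annual loss ${loss / 1e9:.1f}B -> critical reduction {crr:.1%}")
```

    Because the relation is a simple ratio, halving the assumed loss doubles the required reduction and doubling it halves the requirement, matching the sensitivity pattern reported in the abstract.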

  15. Risk assessments of regional climate change over Europe: generation of probabilistic ensemble and uncertainty assessment for EURO-CORDEX

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Kopp, R. E.

    2017-12-01

    Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are that the CMIP5 model runs which drive the EURO-CORDEX downscaling runs do not cover the full range of uncertainty of future projections, and that climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw ensemble of GCMs and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to the administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) relative to the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g. the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX runs are small in winter. In summer, the temperatures of EURO

  16. Biophysics of NASA radiation quality factors.

    PubMed

    Cucinotta, Francis A

    2015-09-01

    NASA has implemented new radiation quality factors (QFs) for projecting cancer risks from space radiation exposures to astronauts. The NASA QFs are based on particle track structure concepts with parameters derived from available radiobiology data, and NASA introduces distinct QFs for solid cancer and leukaemia risk estimates. The NASA model was reviewed by the US National Research Council and approved for use by NASA for risk assessment for International Space Station missions and trade studies of future exploration missions to Mars and other destinations. A key feature of the NASA QFs is to represent the uncertainty in the QF assessments and evaluate the importance of the QF uncertainty to overall uncertainties in cancer risk projections. In this article, the biophysical basis for the probability distribution functions representing QF uncertainties was reviewed, and approaches needed to reduce uncertainties were discussed.

  17. Probabilistic eruption forecasting at short and long time scales

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider that the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of the inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties, utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short time scales (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; at long time scales (years to decades), it is the basic component of land-use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and of where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  18. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though such events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunamis. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
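
    The Green's function summation described here is linear superposition: the coastal waveform for an arbitrary slip distribution is the slip-weighted sum of precomputed unit-slip subfault waveforms. A minimal sketch with made-up toy waveforms:

```python
def synthesize_tsunami(subfault_waveforms, slips):
    """Slip-weighted sum of precomputed unit-slip subfault waveforms,
    giving the waveform at a coastal point for any slip distribution."""
    n_t = len(subfault_waveforms[0])
    wave = [0.0] * n_t
    for slip, gf in zip(slips, subfault_waveforms):
        for i, amp in enumerate(gf):
            wave[i] += slip * amp
    return wave

# Two toy unit-slip subfault waveforms at one coastal point (metres).
gf1 = [0.0, 0.2, 0.5, 0.3, 0.1]
gf2 = [0.0, 0.1, 0.4, 0.6, 0.2]
wave = synthesize_tsunami([gf1, gf2], slips=[2.0, 1.0])
print(wave)
```

    Because the summation is linear, the expensive wave propagation is done once per subfault; each new earthquake scenario in the probabilistic calculation then costs only a weighted sum.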

  19. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state

    PubMed Central

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-01-01

    Background Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning — the “jumping to conclusions” bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). Methods We administered the classical beads task and an fMRI version of it to ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks require probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and during the process of final decision-making. Results We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate the JTC bias at the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Limitations Owing to the cross-sectional design of the study, the results are constrained to a better insight into the neurobiology of risk constellations, but not pre-psychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. Conclusion As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk. PMID:25622039

  20. Reduced activation in the ventral striatum during probabilistic decision-making in patients in an at-risk mental state.

    PubMed

    Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Fenske, Sabrina; Schirmbeck, Frederike; Englisch, Susanne; Schilling, Claudia; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias

    2015-05-01

    Patients with schizophrenia display metacognitive impairments, such as hasty decision-making during probabilistic reasoning - the "jumping to conclusions" bias (JTC). Our recent fMRI study revealed reduced activations in the right ventral striatum (VS) and the ventral tegmental area (VTA) to be associated with decision-making in patients with schizophrenia. It is unclear whether these functional alterations occur in the at-risk mental state (ARMS). We administered the classical beads task and an fMRI version of it to ARMS patients and healthy controls matched for age, sex, education and premorbid verbal intelligence. None of the ARMS patients was treated with antipsychotics. Both tasks require probabilistic decisions after a variable number of stimuli. We evaluated activation during decision-making under certainty versus uncertainty and during the process of final decision-making. We included 24 ARMS patients and 24 controls in our study. Compared with controls, ARMS patients tended to draw fewer beads and showed significantly more JTC bias in the classical beads task, mirroring findings in patients with schizophrenia. During fMRI, ARMS patients did not demonstrate the JTC bias at the behavioural level, but showed a significant hypoactivation in the right VS during the decision stage. Owing to the cross-sectional design of the study, the results are constrained to a better insight into the neurobiology of risk constellations, but not prepsychotic stages. Nine of the ARMS patients were treated with antidepressants and/or lorazepam. As in patients with schizophrenia, a striatal hypoactivation was found in ARMS patients. Confounding effects of antipsychotic medication can be excluded. Our findings indicate that error prediction signalling and reward anticipation may be linked to striatal dysfunction during prodromal stages and should be examined for their utility in predicting transition risk.

  1. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that, for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing, the rational-deliberative versus the affective-experiential, and the principle of stimulus-response compatibility.

  2. Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking

    NASA Technical Reports Server (NTRS)

    Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward

    2011-01-01

    To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. 
This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk

  3. Overview of Graphical User Interface for ARRBOD (Acute Radiation Risk and BRYNTRN Organ Dose Projection)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2010-01-01

    Solar particle events (SPEs) pose a risk of acute radiation sickness (ARS) to astronauts, because organ doses from large SPEs may reach critical levels during extravehicular activities (EVAs) or in lightly shielded spacecraft. NASA has developed an organ dose projection model using the Baryon transport code (BRYNTRN) with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the Mission Operations Directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs supports mission design and operation planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof of concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, covering the major components of the overall system, subsystem interconnections, and external interfaces.

  4. Probabilistic confidence for decisions based on uncertain reliability estimates

    NASA Astrophysics Data System (ADS)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.
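
    A minimal sketch of the probabilistic-confidence idea, assuming (purely for illustration) a lognormal error model for the uncertain failure probability: the confidence is the fraction of that uncertainty distribution lying at or below a target failure probability. The error-factor parameterization below is an assumption, not the paper's formulation:

```python
import math
import random

def probabilistic_confidence(median_pf, error_factor, target_pf,
                             n=100_000, seed=0):
    """Fraction of the uncertainty distribution of the 'true' failure
    probability lying at or below the target (toy lognormal error model,
    with error_factor = ratio of the 95th to the 50th percentile)."""
    rng = random.Random(seed)
    sigma = math.log(error_factor) / 1.645
    hits = sum(median_pf * math.exp(rng.gauss(0.0, sigma)) <= target_pf
               for _ in range(n))
    return hits / n

conf = probabilistic_confidence(1e-4, error_factor=3.0, target_pf=3e-4)
print(f"confidence that pf <= 3e-4: {conf:.3f}")
```

    A decision rule can then require, say, 95% confidence that the true failure probability is below the acceptance threshold, rather than relying on the point estimate alone.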

  5. The Role of Probabilistic Design Analysis Methods in Safety and Affordability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2016-01-01

    For the last several years, NASA and its contractors have been working together to build space launch systems to commercialize space. Developing commercial affordable and safe launch systems becomes very important and requires a paradigm shift. This paradigm shift enforces the need for an integrated systems engineering environment where cost, safety, reliability, and performance need to be considered to optimize the launch system design. In such an environment, rule based and deterministic engineering design practices alone may not be sufficient to optimize margins and fault tolerance to reduce cost. As a result, introduction of Probabilistic Design Analysis (PDA) methods to support the current deterministic engineering design practices becomes a necessity to reduce cost without compromising reliability and safety. This paper discusses the importance of PDA methods in NASA's new commercial environment, their applications, and the key role they can play in designing reliable, safe, and affordable launch systems. More specifically, this paper discusses: 1) The involvement of NASA in PDA 2) Why PDA is needed 3) A PDA model structure 4) A PDA example application 5) PDA link to safety and affordability.

  6. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Astrophysics Data System (ADS)

    Godines, Cody R.; Manteufel, Randall D.

    2002-12-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA, providing a large number of students with exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of the NESSUS manual has been revised and improved, with additional example problems added to expand the scope of the example application problems. This report documents two research accomplishments: the integration of a new sampling algorithm into NESSUS and the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods.
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation
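
    The MC-versus-LHS comparison described above can be illustrated with a stdlib-only sketch: one uniform draw per equal-probability stratum (Latin hypercube sampling in one dimension) versus plain Monte Carlo, comparing the spread of the mean-response estimate across repeated designs. The response function and sample sizes are illustrative, not the SAE test cases:

```python
import random
import statistics

def mc_sample(n, rng):
    """Plain Monte Carlo: n independent uniform draws."""
    return [rng.random() for _ in range(n)]

def lhs_sample(n, rng):
    """Latin hypercube in 1-D: one draw per equal-probability stratum."""
    pts = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(pts)
    return pts

def estimator_spread(sampler, response, n=50, reps=200, seed=0):
    """Std. dev. of the mean-response estimate across repeated designs."""
    rng = random.Random(seed)
    means = [statistics.fmean(response(u) for u in sampler(n, rng))
             for _ in range(reps)]
    return statistics.stdev(means)

response = lambda u: u ** 2          # toy monotone response
mc = estimator_spread(mc_sample, response)
lhs = estimator_spread(lhs_sample, response)
print(f"MC spread {mc:.5f}; LHS spread {lhs:.5f}")
```

    For a smooth monotone response, stratification removes most between-stratum variance, so the LHS estimate of the mean scatters far less than the MC estimate at the same sample size, which is the behavior the report documents.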

  7. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Technical Reports Server (NTRS)

    Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)

    2002-01-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA, providing a large number of students with exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of the NESSUS manual has been revised and improved, with additional example problems added to expand the scope of the example application problems. This report documents two research accomplishments: the integration of a new sampling algorithm into NESSUS and the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods.
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation
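The LHS-versus-MC comparison described above can be sketched in a few lines of Python. The quadratic response, sample size, and repeat count below are illustrative stand-ins (not the SAE test cases or NESSUS internals), but they reproduce the qualitative finding that LHS estimates the mean with lower error than plain MC:

```python
import random
import statistics

def mc_sample(n, rng):
    # plain Monte Carlo: n independent uniform draws on [0, 1)
    return [rng.random() for _ in range(n)]

def lhs_sample(n, rng):
    # Latin Hypercube in 1D: one draw per equal-probability stratum,
    # then shuffle so the order carries no information
    pts = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(pts)
    return pts

def estimate_mean(response, sampler, n, repeats, rng):
    # repeat the whole estimation to observe estimator scatter
    return [statistics.mean(response(u) for u in sampler(n, rng))
            for _ in range(repeats)]

rng = random.Random(42)
response = lambda u: u ** 2          # toy response, true mean = 1/3
mc = estimate_mean(response, mc_sample, 50, 200, rng)
lhs = estimate_mean(response, lhs_sample, 50, 200, rng)
print(statistics.stdev(mc) > statistics.stdev(lhs))  # True: LHS scatters less
```

Stratifying each input into equal-probability bins forces every sample set to cover the whole distribution, which is why the LHS estimator's variance drops sharply for smooth responses.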

  8. NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.

    2014-01-01

Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.

  9. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic work, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the forecast event becomes reality. In this paper, we present the results of a simple game that explores how decision makers are influenced by the costs of taking a response and the potential losses they face if the forecast flood event occurs. Participants play the role of one of three different shop owners. Each type of shop would suffer losses of quite different magnitude should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring that would inundate their shop and lead to those losses. In response, they have to decide whether to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions: players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. 
All audiences were familiar with the principles of forecasting
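The cost/loss trade-off at the heart of the game can be sketched with a minimal expected-loss rule; the shop types, costs, and losses below are hypothetical numbers chosen only to illustrate how different cost/loss ratios flip the optimal response:

```python
def best_action(p_flood, actions):
    """Pick the action minimizing expected loss.
    actions maps name -> (upfront cost, residual loss if the flood occurs)."""
    def expected_loss(cost, residual_loss):
        return cost + p_flood * residual_loss
    return min(actions, key=lambda name: expected_loss(*actions[name]))

# hypothetical shop: defences cost 2k and halve the loss;
# relocating the inventory costs 10k and avoids the loss entirely
shop = {
    "do nothing": (0, 50_000),
    "raise defences": (2_000, 25_000),
    "relocate inventory": (10_000, 0),
}
for p in (0.02, 0.10, 0.50):
    print(f"p = {p:.2f}: {best_action(p, shop)}")
```

Acting is worthwhile once the forecast probability exceeds the ratio of the action's cost to the loss it avoids, which is why the best response changes as the probability rises.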

  10. Modeling the Risk of Fire/Explosion Due to Oxidizer/Fuel Leaks in the Ares I Interstage

    NASA Technical Reports Server (NTRS)

    Ring, Robert W.; Stott, James E.; Hales, Christy

    2008-01-01

A significant flight hazard associated with liquid propellants, such as those used in the upper stage of NASA's new Ares I launch vehicle, is the possibility of leakage of hazardous fluids resulting in a catastrophic fire/explosion. The enclosed and vented interstage of the Ares I contains numerous oxidizer and fuel supply lines as well as ignition sources. The potential for fire/explosion due to leaks during ascent depends on the relative concentrations of hazardous and inert fluids within the interstage, along with other variables such as pressure, temperature, leak rates, and fluid outgassing rates. This analysis improves on previous NASA Probabilistic Risk Assessment (PRA) estimates of the probability of deflagration, in which many of the variables pertinent to the problem were not explicitly modeled as functions of time. This paper presents the modeling methodology developed to analyze these risks.

  11. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden lead to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008), work involving close collaboration between government agencies and the research and stakeholder communities. Three examples are cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  12. Probabilistic Human Health Risk Assessment of Chemical Mixtures: Hydro-Toxicological Interactions and Controlling Factors

    NASA Astrophysics Data System (ADS)

    Henri, C.; Fernandez-Garcia, D.; de Barros, F.

    2014-12-01

Improper disposal of hazardous wastes in most industrial countries gives rise to severe groundwater contamination problems that can lead to adverse health effects in humans. Risk assessment methods therefore play an important role in protecting the population by (1) quantifying the impact of an aquifer contamination on human health and (2) aiding decision making to better manage our groundwater resources. Many reactive compounds, such as chlorinated solvents or nitrate, undergo attenuation processes under common geochemical conditions, and monitored natural attenuation has consequently become an attractive remediation solution. In some cases, however, intermediate degradation products constitute noxious chemical compounds before reaching a harmless form. The joint effect of advection-dispersion transport and species-dependent reaction kinetics and toxicity then dictates the relative contribution of the degradation byproducts to the total risk, which renders the interpretation of risk a non-trivial task. In this presentation, we quantify, through a probabilistic framework, the human health risk posed by a chemical mixture in a heterogeneous aquifer. The work focuses on a Perchloroethylene contamination problem followed by the first-order production/biodegradation of its daughter species Trichloroethylene, Dichloroethylene, and Vinyl Chloride, the last of which is known to be highly toxic. Uncertainty in the hydraulic conductivity field is considered through a Monte Carlo scheme. A comparative description of human health risk metrics as a function of aquifer heterogeneity and contaminant injection mode is provided by means of a spatial characterization of the lower-order statistical moments and empirical probability density functions of both individual and total risks. 
Interestingly, we show that the human health risk of a chemical mixture is mainly controlled by a modified Damköhler number that expresses the joint effect
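A minimal sketch of the kind of calculation involved, assuming (hypothetically) serial first-order decay PCE → TCE → DCE → VC with uncertain rate constants, and using crude Euler time-stepping in place of the authors' transport model:

```python
import random

def peak_vc(k, c0=1.0, dt=0.02, t_end=20.0):
    """Euler integration of a serial first-order decay chain
    PCE -> TCE -> DCE -> VC -> (harmless); k holds four rate constants.
    Returns the peak Vinyl Chloride concentration (normalized)."""
    c = [c0, 0.0, 0.0, 0.0]
    peak = 0.0
    for _ in range(int(t_end / dt)):
        # each species is produced by its parent and decays at its own rate
        produced = [0.0] + [k[i] * c[i] for i in range(3)]
        c = [c[i] + dt * (produced[i] - k[i] * c[i]) for i in range(4)]
        peak = max(peak, c[3])
    return peak

rng = random.Random(1)
# Monte Carlo over uncertain (hypothetical) rate constants
peaks = sorted(peak_vc([rng.uniform(0.2, 0.8) for _ in range(4)])
               for _ in range(200))
print(f"median peak VC: {peaks[100]:.3f}")
print(f"95th-percentile peak VC: {peaks[190]:.3f}")
```

In the full problem the rate-constant uncertainty is joined by the uncertain conductivity field, but even this toy version shows how the toxic intermediate peaks before decaying away, and how Monte Carlo turns parameter uncertainty into a distribution of risk-relevant outputs.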

  13. The benefits of probabilistic exposure assessment: three case studies involving contaminated air, water, and soil.

    PubMed

    Finley, B; Paustenbach, D

    1994-02-01

    Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed are also discussed.
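The compounding effect described above, where stacking per-variable conservative values outruns the true upper percentile, can be illustrated with a toy Monte Carlo; the ten independent lognormal "exposure variables" below are hypothetical, not the distributions recommended in the paper:

```python
import random

def percentile(xs, q):
    xs = sorted(xs)
    return xs[int(q * (len(xs) - 1))]

def product(xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

rng = random.Random(7)
n_vars, n_draws = 10, 20_000
# ten independent, hypothetical lognormal exposure variables
draws = [[rng.lognormvariate(0.0, 0.5) for _ in range(n_vars)]
         for _ in range(n_draws)]

# "point estimate": every variable fixed at its own 95th percentile
p95_each = [percentile([d[i] for d in draws], 0.95) for i in range(n_vars)]
point_estimate = product(p95_each)

# probabilistic estimate: 95th percentile of the simulated dose distribution
mc_p95 = percentile([product(d) for d in draws], 0.95)
print(point_estimate > mc_p95)  # True: stacking p95 inputs overstates the p95 dose
```

Because the chance of every variable sitting at its own 95th percentile simultaneously is vanishingly small, the stacked point estimate lands far beyond the 95th percentile of the actual dose distribution.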

  14. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in an existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled from the experimental data and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data gives a better perspective on results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering design. The main aim of structural design is to obtain optimal solutions, but in a deterministic optimization problem, even a cost-effective structure becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented in the solution process. Reliable and optimal solutions can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order reliability analysis and second-order reliability analysis, followed by simulation techniques that

  15. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  16. Guidelines for contingency planning NASA (National Aeronautics and Space Administration) ADP security risk reduction decision studies

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1984-01-01

    Guidance is presented to NASA Computer Security Officials for determining the acceptability or unacceptability of ADP security risks based on the technical, operational and economic feasibility of potential safeguards. The risk management process is reviewed as a specialized application of the systems approach to problem solving and information systems analysis and design. Reporting the results of the risk reduction analysis to management is considered. Report formats for the risk reduction study are provided.

  17. Latest NASA Instrument Cost Model (NICM): Version VI

    NASA Technical Reports Server (NTRS)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

The NASA Instrument Cost Model, NICM, is a suite of tools that allows probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost estimation by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper focuses on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis capability which, alongside the results of the parametric cost estimation for the user's instrument, provides a visualization of the instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  18. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

Forecast uncertainty is a twofold issue: it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g., flood protection, hydroelectric power management, and navigation). However, the richness of the information is also a source of challenges for operational use, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. We present the setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called ``How much are you prepared to pay for a forecast?''. The game was played at several workshops in 2015, including this session at the 2015 EGU conference, and a total of 129 worksheets were collected and analysed. The aim of the experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results showed that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers. Balancing avoided costs against the cost (or benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.

  19. Risk management. National Aeronautics and Space Administration (NASA). Interim rule adopted as final with changes.

    PubMed

    2000-11-22

    This is a final rule amending the NASA FAR Supplement (NFS) to emphasize considerations of risk management, including safety, security (including information technology security), health, export control, and damage to the environment, within the acquisition process. This final rule addresses risk management within the context of acquisition planning, selecting sources, choosing contract type, structuring award fee incentives, administering contracts, and conducting contractor surveillance.

  20. BBN-Based Portfolio Risk Assessment for NASA Technology R&D Outcome

    NASA Technical Reports Server (NTRS)

    Geuther, Steven C.; Shih, Ann T.

    2016-01-01

The NASA Aeronautics Research Mission Directorate (ARMD) vision falls into six strategic thrusts that are aimed to support the challenges of the Next Generation Air Transportation System (NextGen). In order to achieve the goals of the ARMD vision, the Airspace Operations and Safety Program (AOSP) is committed to developing and delivering new technologies. To meet the dual challenges of constrained resources and timely technology delivery, program portfolio risk assessment is critical for communication and decision-making. This paper describes how a Bayesian Belief Network (BBN) is applied to assess the probability of a technology meeting its expected outcome. The network takes into account the different risk factors of the technology development and implementation phases. The use of BBNs allows all technologies of projects in a program portfolio to be separately examined and compared. In addition, technology interaction effects are modeled through the application of object-oriented BBNs. The paper discusses the development of simplified project risk BBNs and presents various risk results. The results presented include the probability of project risks not meeting success criteria, the risk drivers under uncertainty via sensitivity analysis, and what-if analysis. Finally, the paper shows how program portfolio risk can be assessed using risk results from BBNs of projects in the portfolio.
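A minimal sketch of the BBN idea, assuming two hypothetical binary risk-factor nodes feeding a technology-outcome node; the priors and conditional probability table are invented for illustration, not taken from the AOSP portfolio:

```python
# priors for two hypothetical binary risk factors
p_dev_issue = 0.3    # P(development risk materializes)
p_impl_issue = 0.2   # P(implementation risk materializes)

# conditional probability table: P(technology meets outcome | dev, impl)
p_success = {
    (False, False): 0.95,
    (False, True):  0.70,
    (True,  False): 0.60,
    (True,  True):  0.20,
}

def p_state(dev, impl):
    # joint probability of the two independent parent nodes
    return ((p_dev_issue if dev else 1 - p_dev_issue) *
            (p_impl_issue if impl else 1 - p_impl_issue))

def marginal_success():
    # sum out both parent nodes
    return sum(p_state(d, i) * p_success[(d, i)]
               for d in (False, True) for i in (False, True))

def posterior_dev_issue_given_failure():
    # Bayes' rule: P(dev issue | outcome not met)
    joint = sum(p_state(True, i) * (1 - p_success[(True, i)])
                for i in (False, True))
    return joint / (1 - marginal_success())

print(f"P(meets outcome)       = {marginal_success():.3f}")
print(f"P(dev issue | failure) = {posterior_dev_issue_given_failure():.3f}")
```

The same two operations, marginalizing over risk factors to score a technology and inverting the network to find the risk drivers, are what the sensitivity and what-if analyses in the paper perform at portfolio scale.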

  1. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  2. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
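A crude Monte Carlo version of the failure-probability estimate can be sketched as below; the load and strength distributions are hypothetical, and the margins are deliberately tightened so that plain sampling sees failures at all (a probability near 10^-11, as reported for the CCM, is far beyond brute-force MC, which is why methods like FORM and conditional sampling matter):

```python
import random

def mc_failure_probability(n, rng, mean_load=1.0, cov_load=0.10,
                           mean_strength=1.3, cov_strength=0.10):
    """Crude Monte Carlo estimate of P(failure index > 1), where the
    failure index is applied load over strength allowable.
    All distribution parameters here are hypothetical."""
    failures = 0
    for _ in range(n):
        load = rng.gauss(mean_load, cov_load * mean_load)
        strength = rng.gauss(mean_strength, cov_strength * mean_strength)
        if load / strength > 1.0:
            failures += 1
    return failures / n

rng = random.Random(3)
print(mc_failure_probability(100_000, rng))  # a few percent with these margins
```

With realistic CCM margins the failure region is so far in the tails that no plausible number of plain samples would hit it, which is the practical argument for the FORM and conditional-sampling estimates used in the study.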

  3. NASA space cancer risk model-2014: Uncertainties due to qualitative differences in biological effects of HZE particles

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis

Uncertainties in estimating health risks from exposures to galactic cosmic rays (GCR), composed of protons and high-energy and charge (HZE) nuclei, are an important limitation to long-duration space travel. HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, leading to large uncertainties in predicting risks to humans. Our NASA Space Cancer Risk Model-2012 (NSCR-2012) for estimating lifetime cancer risks from space radiation included several new features compared to earlier models from the National Council on Radiation Protection and Measurements (NCRP) used at NASA. New features of NSCR-2012 included the introduction of NASA-defined radiation quality factors based on track structure concepts, a Bayesian analysis of the dose and dose-rate reduction effectiveness factor (DDREF) and its uncertainty, and the use of a never-smoker population to represent astronauts. However, NSCR-2012 did not include estimates of the role of qualitative differences between HZE particles and low-LET radiation. In this report we discuss evidence for non-targeted effects increasing cancer risks at space-relevant HZE particle absorbed doses in tissue (<0.2 Gy), and for increased tumor lethality due to the propensity for higher rates of metastatic tumors from high-LET radiation suggested by animal experiments. The NSCR-2014 model considers how these qualitative differences modify the overall probability distribution functions (PDF) for cancer mortality risk estimates from space radiation. Predictions of NSCR-2014 for International Space Station missions and Mars exploration are described and compared to those of the earlier NSCR-2012 model.

  4. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. These data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the problem; second, difficulties in choosing and using appropriate solution strategies; and third, difficulties with the computational process. The results suggest that students are not yet able to apply their knowledge and skills to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  5. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  6. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variability as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.

  7. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments.

    PubMed

    Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A

    2017-11-06

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
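The exceedance calculation described above can be sketched as a toy Monte Carlo; the unit-exposure, handling-amount, and body-weight distributions below are hypothetical placeholders, not the AHED/PHED data used in the study:

```python
import random

def fraction_exceeding(n, rng, level_of_concern=0.8):
    """Monte Carlo fraction of simulated workers whose daily dose
    (mg/kg/day) exceeds a level of concern; every distribution here
    is a hypothetical placeholder."""
    exceed = 0
    for _ in range(n):
        unit_exposure = rng.lognormvariate(-2.0, 1.0)  # mg per lb handled
        amount_handled = rng.uniform(100, 400)         # lb handled per day
        body_weight = max(rng.gauss(78, 12), 40)       # kg, floored
        dose = unit_exposure * amount_handled / body_weight
        if dose > level_of_concern:
            exceed += 1
    return exceed / n

rng = random.Random(11)
print(f"{100 * fraction_exceeding(50_000, rng):.1f}% exceed the level of concern")
```

Reporting the exceeding fraction, rather than a single deterministic dose, is what lets the probabilistic approach flag the upper tail of the worker population that a point estimate can miss.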

  8. Probabilistic health risk assessment for ingestion of seafood farmed in arsenic contaminated groundwater in Taiwan.

    PubMed

    Liang, Ching-Ping; Jang, Cheng-Shin; Chen, Jui-Sheng; Wang, Sheng-Wei; Lee, Jin-Jing; Liu, Chen-Wuing

    2013-08-01

    Seafood farmed in arsenic (As)-contaminated areas is a major exposure pathway for the ingestion of inorganic As by individuals in the southwestern part of Taiwan. This study presents a probabilistic risk assessment using limited data for inorganic As intake through the consumption of the seafood by local residents in these areas. The As content and the consumption rate are both treated as probability distributions, taking into account the variability of the amount in the seafood and individual consumption habits. The Monte Carlo simulation technique is utilized to conduct an assessment of exposure due to the daily intake of inorganic As from As-contaminated seafood. Exposure is evaluated according to the provisional tolerable weekly intake (PTWI) established by the FAO/WHO and the target risk based on the US Environmental Protection Agency guidelines. The assessment results show that inorganic As intake from five types of fish (excluding mullet) and shellfish fall below the PTWI threshold values for the 95th percentiles, but exceed the target cancer risk of 10(-6). The predicted 95th percentile for inorganic As intake and lifetime cancer risks obtained in the study are both markedly higher than those obtained in previous studies in which the consumption rate of seafood considered is a deterministic value. This study demonstrates the importance of the individual variability of seafood consumption when evaluating a high exposure sub-group of the population who eat higher amounts of fish and shellfish than the average Taiwanese.

  9. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. The formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain, and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed in which all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes, thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  10. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and the fabrication process, through composite mechanics, to structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  11. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The study shows that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  12. Regional probabilistic risk assessment of heavy metals in different environmental media and land uses: An urbanization-affected drinking water supply area

    NASA Astrophysics Data System (ADS)

    Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping

    2016-11-01

    In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks of exposing residents to heavy metals in different environmental media and land uses. The mean and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks of cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure risks during outdoor recreation, farming and swimming along the Shunde Waterway are theoretically safe. Arsenic and cadmium were identified as the priority pollutants that contribute the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.
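A hazard quotient is the ratio of an estimated dose to a reference dose, and route-wise HQs are summed into a total HQ, with totals above 1 flagging potential concern, as in the riverbank-plot finding above. A minimal Monte Carlo sketch with purely illustrative dose distributions and an illustrative reference dose (not a vetted regulatory value):

```python
import random

random.seed(42)

RFD = 3.0e-4   # illustrative oral reference dose, mg/kg-day (not a vetted value)

def route_dose():
    # illustrative lognormal daily dose from a single exposure route, mg/kg-day
    return random.lognormvariate(-9.0, 0.8)

def total_hq(n_routes=3):
    # total hazard quotient: per-route dose / RfD, summed over exposure routes
    return sum(route_dose() / RFD for _ in range(n_routes))

hqs = sorted(total_hq() for _ in range(10000))
print(f"95th percentile total HQ: {hqs[int(0.95 * len(hqs))]:.2f}")
print(f"fraction with total HQ > 1: {sum(h > 1 for h in hqs) / len(hqs):.2f}")
```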

  13. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
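The idea behind the Poisson-Binomial test above is that, if the forecasts are reliable, the number of observed events across n forecasts with stated probabilities p_1..p_n follows a Poisson-Binomial distribution. A sketch of an exact test built on that distribution; the dynamic-programming convolution and the two-sided p-value convention are illustrative implementation choices, not the paper's exact procedure:

```python
def poisson_binomial_pmf(ps):
    # exact pmf of the number of successes among independent Bernoulli(p_i)
    # trials, built by dynamic-programming convolution
    pmf = [1.0]
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            nxt[k] += mass * (1.0 - p)
            nxt[k + 1] += mass * p
        pmf = nxt
    return pmf

def reliability_pvalue(forecast_probs, observed_count):
    # exact two-sided p-value: total mass of outcomes no more likely than
    # the observed event count, under the reliability hypothesis
    pmf = poisson_binomial_pmf(forecast_probs)
    p_obs = pmf[observed_count]
    return sum(mass for mass in pmf if mass <= p_obs + 1e-12)

probs = [0.1, 0.2, 0.3, 0.7, 0.9, 0.5, 0.4]
print(f"p-value for 3 observed events: {reliability_pvalue(probs, 3):.3f}")
```

A small p-value would indicate the observed event count is inconsistent with the issued probabilities, i.e. an unreliable forecast.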

  14. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
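The Weibull approach referenced above ties failure probability to applied stress through the flaw population, and proof testing lowers the final failure probability by truncating the weak tail. A minimal two-parameter sketch; the scale parameter sigma0 and Weibull modulus m below are illustrative, not values for any flight part:

```python
import math

def weibull_pof(stress, sigma0, m):
    # two-parameter Weibull failure probability: Pf = 1 - exp(-(s/sigma0)^m)
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def pof_after_proof(stress, proof_stress, sigma0, m):
    # failure probability at `stress` conditional on having survived a proof
    # test: P(fail | survived proof) = 1 - S(stress)/S(proof), stress > proof
    if stress <= proof_stress:
        return 0.0
    surv = math.exp(-((stress / sigma0) ** m))
    surv_proof = math.exp(-((proof_stress / sigma0) ** m))
    return 1.0 - surv / surv_proof

# illustrative numbers only: sigma0 = 300 (stress units), Weibull modulus m = 10
print(f"unconditional Pf at stress 280: {weibull_pof(280, 300, 10):.3f}")
print(f"Pf after proof test at 250:     {pof_after_proof(280, 250, 300, 10):.3f}")
```

The conditional probability is always lower than the unconditional one, which is the quantitative basis for proof testing as a risk-reduction step.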

  15. A Probabilistic System Analysis of Intelligent Propulsion System Technologies

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2007-01-01

NASA's Intelligent Propulsion System Technology (Propulsion 21) project focuses on developing adaptive technologies that will enable commercial gas turbine engines to produce fewer emissions and less noise while increasing reliability. It features adaptive technologies that have included active tip-clearance control for the turbine and compressor, active combustion control, turbine aero-thermal and flow control, and enabling technologies such as sensors which are reliable at high operating temperatures and are minimally intrusive. A probabilistic system analysis is performed to evaluate the impact of these technologies on aircraft CO2 (directly proportional to fuel burn) and LTO (landing and takeoff) NO(x) reductions. A 300-passenger aircraft, with two 396-kN (85,000-pound) thrust engines, is chosen for the study. The results show that NASA's Intelligent Propulsion System technologies have the potential to significantly reduce CO2 and NO(x) emissions. The results are used to support informed decision making on the development of the intelligent propulsion system technology portfolio for CO2 and NO(x) reductions.

  16. NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert

    2011-01-01

System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements, as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking.
First, the handbook takes the position that it is important to not just focus on risk on an individual

  17. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  18. Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.

    PubMed

    Wilhelm, C J; Mitchell, S H

    2008-10-01

    Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
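The two discounting tasks above are commonly modeled with hyperbolic forms: V = A/(1 + kD) for delay discounting and V = A/(1 + h·θ), with odds-against θ = (1 − p)/p, for probability discounting, where steeper discounting (larger k or h) corresponds to greater sensitivity to delay or uncertainty. A sketch of these standard textbook forms (not the specific model fit used in the study):

```python
def delay_discounted_value(amount, delay, k):
    # hyperbolic delay discounting: V = A / (1 + k*D); larger k = steeper
    return amount / (1.0 + k * delay)

def probability_discounted_value(amount, p, h):
    # hyperbolic probability discounting on the odds against: theta = (1-p)/p
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

print(delay_discounted_value(100, 30, 0.1))         # -> 25.0
print(probability_discounted_value(100, 0.5, 2.0))  # theta = 1, so 100/3
```

Under this parameterization, the HAD rats' steeper discounting corresponds to larger fitted k and h than for LAD rats.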

  19. Determination of the Risk of Radiation-Associated Circulatory and Cancer Disease Mortality in a NASA Early Astronaut Cohort

    NASA Technical Reports Server (NTRS)

    Elgart, S. R.; Chappell, L.; Milder, C. M.; Shavers, M. R.; Huff, J. L.; Little, M.; Patel, Z. S.

    2017-01-01

Of the many possible health challenges posed during extended exploratory missions to space, the effects of space radiation on cardiovascular disease and cancer are of particular concern. There are unique challenges to estimating those radiation risks; care and appropriate, rigorous methodology should be applied when considering small cohorts such as the NASA astronaut population. The objective of this work was to determine if there was sufficient evidence for excess risk of cardiovascular disease and cancer in early NASA astronaut cohorts. NASA astronauts in selection groups 1-7 were chosen; this relatively homogeneous cohort consists of 73 white males who, unlike today's astronauts, maintained similar smoking and drinking habits to the general US population, and who have published radiation doses. The participants flew in space on missions Mercury through Shuttle and received space radiation doses between 0 and 74.1 milligrays. Cause of death information was obtained from the Lifetime Surveillance of Astronaut Health (LSAH) program at NASA Johnson Space Center. Mortality was compared with the US male population. Trends of mortality with dose were assessed using a logistic model, fitted by maximum likelihood. Only 32 (43.84 percent) of the 73 early astronauts have died. Standardized mortality ratios (SMRs) for cancer (n=7, SMR=43.4, 95 percent CI 17.8, 84.9), all circulatory disease (n=7, SMR=33.2, 95 percent CI 13.7, 65.0), and ischemic heart disease (IHD) (n=5, SMR=40.1, 95 percent CI 13.2, 89.4) were significantly lower than for the US white male population. For cerebrovascular disease, the upper confidence interval for the SMR included 100, indicating it was not significantly different from the US population (n=2, SMR = 77.0, 95 percent CI 9.4, 268.2). The power of the study is low and remains below 10 percent even when risks 10 times those reported in the literature are assumed.
Due to small sample size, there is currently insufficient statistical power to evaluate space

  20. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce 2 criteria for assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed 2 extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not “anticonservative”, using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
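Well-calibratedness can be checked by binning predicted probabilities and comparing each bin's mean prediction to its observed event rate: for a well-calibrated classifier the two should match. A minimal sketch using equal-width bins; the binning scheme is an illustrative choice, not the paper's specific measure:

```python
import random

def calibration_table(probs, outcomes, n_bins=5):
    # equal-width probability bins: compare mean forecast to observed event rate
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)
        bins[idx].append((p, y))
    table = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            rate = sum(y for _, y in b) / len(b)
            table.append((round(mean_p, 3), round(rate, 3), len(b)))
    return table

# synthetic well-calibrated classifier: outcomes drawn with the stated probabilities
random.seed(3)
ps = [random.random() for _ in range(20000)]
ys = [1 if random.random() < p else 0 for p in ps]
for mean_p, rate, n in calibration_table(ps, ys):
    print(mean_p, rate, n)
```

An "anticonservative" classifier would show bin event rates systematically closer to 0.5 than its mean predictions.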

  1. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
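The core idea, hashing a seeded pseudorandom sample of bytes instead of reading the whole file, can be sketched as below. The offset scheme, sample size, and use of SHA-256 are illustrative choices for the sketch, not the pfff tool's actual algorithm:

```python
import hashlib
import random

def sampled_fingerprint(data: bytes, n_samples=64, seed=1234):
    # hash the input length plus bytes at seeded pseudorandom offsets;
    # equal-length inputs are sampled at identical offsets, so any
    # difference at a sampled position changes the fingerprint
    rng = random.Random(seed)
    n = min(n_samples, len(data))
    offsets = sorted(rng.sample(range(len(data)), n)) if data else []
    h = hashlib.sha256(str(len(data)).encode())
    for off in offsets:
        h.update(data[off:off + 1])
    return h.hexdigest()

print(sampled_fingerprint(b"example data" * 1000)[:16])
```

Because only n_samples bytes are read regardless of file size, the cost is flat in file size, at the price of missing differences confined to unsampled positions, which is why the method relies on the high variation typical of biological data.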

  2. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational efforts and precision was reached already with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation.
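The collocation idea is to evaluate the expensive simulator only at a few points, fit a polynomial response surface through them, and then run the cheap surrogate inside a Monte Carlo loop. A one-parameter sketch with a hypothetical stand-in model (the model, nodes, and input distribution are all illustrative assumptions):

```python
import math
import random

def expensive_model(k):
    # hypothetical stand-in for a costly CO2-injection simulator:
    # some nonlinear response to an uncertain parameter k
    return 10.0 * math.exp(-k)

# collocation: run the full model only at a few nodes
nodes = [0.5, 1.0, 1.5]
vals = [expensive_model(x) for x in nodes]

def surrogate(x):
    # quadratic Lagrange interpolant through the collocation nodes
    total = 0.0
    for i, xi in enumerate(nodes):
        li = 1.0
        for j, xj in enumerate(nodes):
            if i != j:
                li *= (x - xj) / (xi - xj)
        total += vals[i] * li
    return total

# uncertainty propagation is now cheap: Monte Carlo on the surrogate
random.seed(0)
samples = [surrogate(random.gauss(1.0, 0.15)) for _ in range(50000)]
mean = sum(samples) / len(samples)
print(f"surrogate-based mean response: {mean:.3f}")
```

The 50,000 surrogate evaluations replace 50,000 full simulator runs with just three, which is the source of the order-of-magnitude speedups the abstract reports (the factor of 100 there refers to the study's own higher-dimensional setting).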
We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis

  3. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments

    PubMed Central

    Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.

    2018-01-01

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804

  4. Lossed in translation: an off-the-shelf method to recover probabilistic beliefs from loss-averse agents.

    PubMed

    Offerman, Theo; Palley, Asa B

    2016-01-01

Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of 1/2, and (ii) for moderate beliefs agents simply report 1/2. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
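A strictly proper scoring rule makes the expected score of a risk-neutral agent uniquely maximal at a truthful report, which a quick grid search confirms for the quadratic (Brier-type) rule; this illustrates the baseline property the paper starts from, not its loss-aversion-adjusted mechanism:

```python
def quadratic_score(report, outcome):
    # strictly proper quadratic (Brier-type) rule for a binary event (outcome 0/1)
    return 1.0 - (outcome - report) ** 2

def expected_score(report, belief):
    # expected score of an agent who believes the event occurs with prob `belief`
    return belief * quadratic_score(report, 1) + (1 - belief) * quadratic_score(report, 0)

# grid search: the report maximizing expected score equals the belief
best_score, best_report = max((expected_score(r / 100, 0.7), r / 100) for r in range(101))
print(best_report)  # -> 0.7
```

The paper's point is that real agents are not risk neutral: loss aversion distorts this maximization, pulling optimal reports toward 1/2, which motivates their modified mechanism.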

  5. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.
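The kind of probabilistic structural response a code like NESSUS computes can be illustrated with a far simpler stand-in: Monte Carlo on a closed-form cantilever deflection with random load and modulus. All the numbers below are illustrative, and this sketch is not the NESSUS formulation:

```python
import random

def tip_deflection(P, E, L=2.0, I=8.0e-6):
    # closed-form cantilever tip deflection: delta = P*L^3 / (3*E*I), in metres
    return P * L ** 3 / (3.0 * E * I)

# propagate uncertainty in load (N) and Young's modulus (Pa); numbers illustrative
random.seed(7)
deltas = sorted(tip_deflection(random.gauss(1000.0, 100.0), random.gauss(70e9, 5e9))
                for _ in range(20000))
print(f"99th percentile tip deflection: {deltas[int(0.99 * len(deltas))] * 1000:.2f} mm")
```

A probabilistic finite element code plays the same role for components with no closed-form response, producing response distributions and percentiles rather than a single deterministic deflection.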

  6. Mars Sample Return: Do Australians trust NASA?

    NASA Astrophysics Data System (ADS)

    Joyce, S.; Tomkins, C. S.; Weinstein, P.

    2008-09-01

Mars Sample Return (MSR) represents an important scientific goal in space exploration. Any sample return mission will be extremely challenging from a scientific, economic, and technical standpoint. But equally testing will be communicating with a public that may have a very different perception of the mission. A MSR mission will generate international publicity and it is vital that NASA acknowledge the nature and extent of public concern about the mission risks and, perhaps equally importantly, the public's confidence in NASA's ability to prepare for and manage these risks. This study investigated the level of trust in NASA in an Australian population sample, and whether this trust was dependent on demographic variables. Participants completed an online survey that explored their attitudes towards NASA and a MSR mission. The results suggested that people believe NASA will complete the mission successfully but have doubts as to whether NASA will be honest when communicating with the public. The most significant finding to emerge from this study was that confidence in NASA was significantly (p < 0.05) related to the respondent's level of knowledge regarding the risks and benefits of MSR. These results have important implications for risk management and communication.

  7. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  8. Cost-Risk Trade-off of Solar Radiation Management and Mitigation under Probabilistic Information on Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann

    2017-04-01

    In principle solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C-temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), which is a decision analytic framework that makes a trade-off between the expected welfare-loss from climate policy costs and the climate risks from transgressing a climate target. Here, in both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA for the sake of including not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for the following three valuation scenarios: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. For now, the Giorgi regions are treated by equal weight. We find that for regionally differentiated precipitation targets, the usage of SRM will be comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number reduces by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it only saves 70% to 75% of the welfare-loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in precipitation-risk-only and both-risks scenarios. It remains to be shown how the inclusion of further risks or different regional weights would

  9. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case

  10. Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India

    NASA Astrophysics Data System (ADS)

    Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth

    2016-04-01

    Surat, India, is a coastal city on the banks of the river Tapti, located downstream from the Ukai dam. Given Surat's geographic location, its population of five million is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred when intense rainfall in the Tapti catchment caused a dam release of nearly 25,000 m3 s-1 and flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine spatial resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g. 1:250 year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated within the city disaster management plan to support mitigation and adaptation measures for different flooding scenarios.
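
    Summarizing many scenario runs into a probabilistic flood map amounts to a weighted per-cell aggregation of inundation outcomes. A hedged sketch, with invented toy grids and scenario likelihoods in place of the CAESAR-Lisflood outputs:

    ```python
    # Illustrative sketch: combine per-scenario inundation grids into a
    # probabilistic flood map. Grids and weights are invented examples,
    # not the actual model outputs for Surat.
    def flood_probability(scenario_grids, scenario_weights):
        """Each grid is a 2-D list of 0/1 inundation flags; weights are
        scenario likelihoods summing to 1. Returns per-cell flood
        probability."""
        rows, cols = len(scenario_grids[0]), len(scenario_grids[0][0])
        prob = [[0.0] * cols for _ in range(rows)]
        for grid, w in zip(scenario_grids, scenario_weights):
            for i in range(rows):
                for j in range(cols):
                    prob[i][j] += w * grid[i][j]
        return prob

    grids = [
        [[1, 1], [0, 0]],   # e.g. moderate dam release, low tide
        [[1, 1], [1, 0]],   # e.g. large dam release, high tide
    ]
    weights = [0.8, 0.2]
    print(flood_probability(grids, weights))  # [[1.0, 1.0], [0.2, 0.0]]
    ```

    In practice the grids would be the 30 m model outputs and the weights would come from the assumed joint distribution of dam release magnitude, duration, and tide level.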

  11. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  12. Probabilistic Simulation for Combined Cycle Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/±45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  13. Collision Avoidance Short Course: Conjunction Assessment Risk Analysis - NASA Robotic CARA. Part I: Theory

    NASA Technical Reports Server (NTRS)

    Hejduk, M. D.; Frigm, Ryan C.

    2015-01-01

    Satellite conjunction assessment is perhaps the fastest growing area in space situational awareness and protection, with military, civil, and commercial satellite owner-operators embracing more and more sophisticated processes to avoid the avoidable - namely, collisions between high-value space assets and orbital debris. NASA and the Centre National d'Etudes Spatiales (CNES) have collaborated to offer an introductory short course on all the major aspects of the conjunction assessment problem. This half-day course will cover satellite conjunction dynamics and theory, Joint Space Operations Center (JSpOC) conjunction data products, major risk assessment parameters and plots, conjunction remediation decision support, and present and future challenges. This briefing represents the NASA portion of the course.

  14. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  15. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
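
    The iterative sampling method referred to above can be sketched as follows: draw sensitivity and specificity from prior distributions, back-correct the observed 2x2 counts, and accumulate the adjusted odds ratios. The counts and priors below are invented for illustration and are not from the cited study:

    ```python
    import random

    # Minimal sketch of probabilistic bias analysis for exposure
    # misclassification in a 2x2 case-control table (illustrative counts
    # and sensitivity/specificity priors; assumptions, not the paper's).
    def bias_adjusted_ors(a, b, c, d, n_iter=5000, seed=0):
        """a/b = exposed/unexposed cases, c/d = exposed/unexposed controls.
        Per iteration, draw (sensitivity, specificity), back-correct the
        observed counts, and record the adjusted odds ratio."""
        rng = random.Random(seed)
        ors = []
        for _ in range(n_iter):
            se = rng.uniform(0.75, 0.95)   # assumed sensitivity prior
            sp = rng.uniform(0.90, 0.99)   # assumed specificity prior
            # Corrected true-exposed counts: A = (a - (1-sp)*(a+b)) / (se+sp-1)
            A = (a - (1 - sp) * (a + b)) / (se + sp - 1)
            C = (c - (1 - sp) * (c + d)) / (se + sp - 1)
            B, D = (a + b) - A, (c + d) - C
            if min(A, B, C, D) <= 0:       # discard impossible corrections
                continue
            ors.append((A * D) / (B * C))
        return ors

    ors = bias_adjusted_ors(a=45, b=55, c=25, d=75)
    median = sorted(ors)[len(ors) // 2]
    print(round(median, 2))   # adjusted OR, typically above the crude OR of 2.45
    ```

    A Bayesian treatment would instead place the same priors on sensitivity and specificity inside a full likelihood, which is the comparison the abstract examines.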

  16. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
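
    The core of such a probabilistic strategic analysis is a Monte Carlo loop over scenario outcomes under contingency rules. A toy sketch, with invented success probabilities and a single stand-down-and-refly contingency rule (none of these numbers come from the paper):

    ```python
    import random

    # Illustrative sketch: deterministic plan vs Monte Carlo campaign
    # outcomes under a simple contingency rule (all numbers assumed).
    def simulate_campaign(n_missions=10, p_success=0.9, retry_delay=1, seed=None):
        """Each mission takes 1 time unit; a failed mission triggers a
        contingency stand-down and a re-flight. Returns total duration."""
        rng = random.Random(seed)
        t = 0
        for _ in range(n_missions):
            t += 1                        # nominal mission slot
            while rng.random() > p_success:
                t += retry_delay + 1      # contingency: stand down, re-fly
        return t

    deterministic = 10                    # planned duration, no failures
    runs = sorted(simulate_campaign(seed=s) for s in range(2000))
    print("planned:", deterministic)
    print("median :", runs[len(runs) // 2])
    print("95th   :", runs[int(0.95 * len(runs))])
    ```

    Comparing the planned (deterministic) duration against the simulated distribution is exactly the robustness check the abstract describes, just with vehicle reliability, budget, and launch uncertainties folded into the per-mission outcome model.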

  17. NASA's Risk Management System

    NASA Technical Reports Server (NTRS)

    Perera, Jeevan S.

    2011-01-01

    Leadership is key to success. A phased approach to implementation of risk management is necessary. The risk management system will be simple, accessible, and promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks -- not just risk office personnel. Each group is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; elevate only those risks that require coordination or management from above. Risk reporting and communication is an essential element of risk management and will combine both qualitative and quantitative elements. Risk-informed decision making should be introduced to all levels of management. Provide necessary checks and balances to ensure that risks are caught/identified and dealt with in a timely manner. Many supporting tools, processes, and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.

  18. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  19. The NASA Space Radiation Research Program

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2006-01-01

    We present a comprehensive overview of the NASA Space Radiation Research Program. This program combines basic research on the mechanisms of radiobiological action relevant for improving knowledge of the risks of cancer, central nervous system and other possible degenerative tissue effects, and acute radiation syndromes from space radiation. The keystones of the NASA Program are five NASA Specialized Centers of Research (NSCOR) investigating space radiation risks. Other research is carried out through peer-reviewed individual investigations and in collaboration with the US Department of Energy's Low-Dose Research Program. The Space Radiation Research Program has established the Risk Assessment Project to integrate data from the NSCORs and other peer-reviewed research into quantitative projection models, with the goals of steering research into data and scientific breakthroughs that will reduce the uncertainties in current risk projections and developing the scientific knowledge needed for future individual risk assessment approaches and biological countermeasure assessments or design. The NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory was created by the Program to simulate space radiation on the ground in support of the above research programs. New results from NSRL will be described.

  20. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  1. NASA's Risk Management System

    NASA Technical Reports Server (NTRS)

    Perera, Jeevan S.

    2013-01-01

    A phased approach to implementation of risk management is necessary. The risk management system will be simple, accessible, and promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks - not just risk office personnel. Each group/department is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; elevate only those risks that require coordination or management from above. Risk-informed decision making should be introduced to all levels of management. Provide necessary checks and balances to ensure that risks are caught/identified and dealt with in a timely manner. Many supporting tools, processes, and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.

  2. On the Measurement and Properties of Ambiguity in Probabilistic Expectations

    ERIC Educational Resources Information Center

    Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn

    2015-01-01

    Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…

  3. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Frigm, Ryan C.; Duncan, Matthew G.; Hejduk, Matthew D.

    2014-01-01

    Reacting to potential on-orbit collision risk in an operational environment requires timely and accurate communication and exchange of data, information, and analysis to ensure informed decision-making for safety of flight and responsible use of the shared space environment. To accomplish this mission, it is imperative that all stakeholders effectively manage resources: devoting necessary and potentially intensive resource commitment to responding to high-risk conjunction events and preventing unnecessary expenditure of resources on events of low collision risk. After 10 years of operational experience, the NASA Robotic Conjunction Assessment Risk Analysis (CARA) is modifying its Concept of Operations (CONOPS) to ensure this alignment of collision risk and resource management. This evolution manifests itself in the approach to characterizing, reporting, and refining of collision risk. Implementation of this updated CONOPS is expected to have a demonstrated improvement on the efficacy of JSpOC, CARA, and owner/operator resources.

  4. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Crocker, Andy; Greene, William D.

    2017-01-01

    Goals of NASA's Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to: (1) reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS; and (2) enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. The SLS Block 1 vehicle is being designed to carry 70 mT to LEO and uses two five-segment solid rocket boosters (SRBs) similar to the boosters that helped power the space shuttle to orbit. The evolved 130 mT payload class rocket requires an advanced booster with more thrust than any existing U.S. liquid- or solid-fueled boosters.

  5. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Crocker, Andy; Graham, Bart

    2016-01-01

    Dynetics has designed innovative structure assemblies; manufactured them using Friction Stir Welding (FSW) to leverage NASA investments in tools, facilities, and processes; and conducted proof and burst testing, demonstrating the viability of the design/build processes. Dynetics/AR has applied state-of-the-art manufacturing and processing techniques to the heritage F-1, reducing risk for engine development. Dynetics/AR has also made progress on technology demonstrations for the ORSC cycle engine, which offers affordability and performance for both NASA and other launch vehicles. Testing of a full-scale integrated oxidizer-rich test article will evaluate performance and combustion stability characteristics, contributing to technology maturation for ox-rich staged combustion engines.

  6. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  7. NASA Collaborative Approach Mitigates Environmentally-Driven Obsolescence

    NASA Technical Reports Server (NTRS)

    Greene, Brian; Leeney, Bob; Richards, Joni

    2016-01-01

    National Aeronautics and Space Administration (NASA) missions, like Department of Defense (DoD) organizations, require the rigorous testing and qualification of critical materials. Obsolescence supply risks created by environmental requirements can affect the cost, schedule and performance of NASA missions and the resilience of critical infrastructure. The NASA Technology Evaluation for Environmental Risk Mitigation (TEERM) Principal Center helps to identify obsolescence supply risks driven by environmental requirements and works proactively with NASA Centers and Programs, the DoD, the European Space Agency (ESA) and other agencies and partners to identify and evaluate environmentally friendly alternatives. TEERM tracks environmental regulations, identifies the potential loss of material availability and works with NASA programs and Centers to evaluate potential impacts through a risk assessment approach. TEERM collaborative projects identify, demonstrate and evaluate commercially viable alternative technologies and materials. A major focus during the Space Shuttle Program was the need to replace ozone depleting substances that were used in spray foam and cleaning applications. The potential obsolescence of coatings containing hexavalent chromium and the risks associated with lead free solder were also of concern for the Space Shuttle and present ongoing risks to new programs such as the Space Launch System. One current project teams NASA and ESA in the evaluation and testing of individual coatings and coating systems as replacements for hexavalent chromium coatings in aerospace applications. The proactive, collaborative approach used by TEERM helps reduce the cost burden on any one team partner, reduces duplication of effort, and enhances the technical quality and overall applicability of the testing and analysis.

  8. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  9. NASA Space Radiation Protection Strategies: Risk Assessment and Permissible Exposure Limits

    NASA Technical Reports Server (NTRS)

    Huff, J. L.; Patel, Z. S.; Simonsen, L. C.

    2017-01-01

    Permissible exposure limits (PELs) for short-term and career astronaut exposures to space radiation have been set and approved by NASA with the goal of protecting astronauts against health risks associated with ionizing radiation exposure. Short term PELs are intended to prevent clinically significant deterministic health effects, including performance decrements, which could threaten astronaut health and jeopardize mission success. Career PELs are implemented to control late occurring health effects, including a 3% risk of exposure induced death (REID) from cancer, and dose limits are used to prevent cardiovascular and central nervous system diseases. For radiation protection, meeting the cancer PEL is currently the design driver for galactic cosmic ray and solar particle event shielding, mission duration, and crew certification (e.g., 1-year ISS missions). The risk of cancer development is the largest known long-term health consequence following radiation exposure, and current estimates for long-term health risks due to cardiovascular diseases are approximately 30% to 40% of the cancer risk for exposures above an estimated threshold (Deep Space one-year and Mars missions). Large uncertainties currently exist in estimating the health risks of space radiation exposure. Improved understanding through radiobiology and physics research allows increased accuracy in risk estimation and is essential for ensuring astronaut health as well as for controlling mission costs, optimization of mission operations, vehicle design, and countermeasure assessment. We will review the Space Radiation Program Element's research strategies to increase accuracy in risk models and to inform development and validation of the permissible exposure limits.

  10. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high-velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are in terms of probability cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  11. [Theoretical evaluation of the risk of decompression illness during simulated extravehicular activity].

    PubMed

    Nikolaev, V P

    2008-01-01

    Theoretical analysis of the risk of decompression illness (DI) during extravehicular activity following the Russian and NASA decompression protocols (D-R and D-US, respectively) was performed. In contrast to the traditional approach to decompression stress evaluation by the factor of tissue supersaturation with nitrogen, our probabilistic theory of decompression safety provides a completely reasoned evaluation and comparison of the levels of hazard of these decompression protocols. According to this theory, the function of cumulative DI risk is equal to the sum of the functions of cumulative risk of lesion of all body tissues by gas bubbles and their supersaturation by solute gases. Based on modeling of the dynamics of these functions, growth of the cumulative DI risk in the course of D-R and D-US follows essentially similar trajectories within a time frame of up to 330 minutes. However, further extension of D-US, but not D-R, raises the risk of DI drastically.
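
    The additive risk structure described above (total risk built up from per-tissue cumulative risks) can be sketched with a toy hazard model. The constant per-tissue hazard rates and the competing-risk form below are invented for illustration and are not Nikolaev's actual model:

    ```python
    import math

    # Illustrative sketch of summing per-tissue cumulative risks into a
    # total DI risk. Hazard rates and the exponential form are assumptions.
    def cumulative_di_risk(tissue_hazards, minutes):
        """tissue_hazards: assumed constant hazard rates (1/min) per tissue.
        Returns the probability that at least one tissue is affected by
        time t, via R(t) = 1 - exp(-sum_i h_i * t)."""
        total = sum(h * minutes for h in tissue_hazards)
        return 1.0 - math.exp(-total)

    hazards = [1e-5, 2e-5, 5e-6]   # three hypothetical tissue compartments
    for t in (60, 330, 600):
        print(t, round(cumulative_di_risk(hazards, t), 4))
    ```

    The qualitative point carries over: the total risk grows monotonically with exposure time, and protocol differences show up as diverging risk trajectories beyond some duration.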

  12. Effects and risk assessment of linear alkylbenzene sulfonates in agricultural soil. 5. Probabilistic risk assessment of linear alkylbenzene sulfonates in sludge-amended soils.

    PubMed

    Jensen, J; Løkke, H; Holmstrup, M; Krogh, P H; Elsgaard, L

    2001-08-01

    Linear alkylbenzene sulfonates (LAS) can be found in high concentrations in sewage sludge and, hence, may enter the soil compartment as a result of sludge application. Here, LAS may pose a risk for soil-dwelling organisms. In the present probabilistic risk assessment, statistical extrapolation has been used to assess the risk of LAS to soil ecosystems. By use of a log-normal distribution model, the predicted no-effect concentration (PNEC) was estimated for soil fauna, plants, and a combination of these. Due to the heterogeneous endpoints for microorganisms, including functional as well as structural parameters, the use of sensitivity distributions is not considered to be applicable to this group of organisms, and a direct, expert evaluation of toxicity data was used instead. The soil concentration after sludge application was predicted for a number of scenarios and used as the predicted environmental concentration (PEC) in the risk characterization and calculation of risk quotients (RQ = PEC/PNEC). A LAS concentration of 4.6 mg/kg was used as the current best estimate of PNEC in all RQ calculations. Three levels of LAS contamination (530, 2,600, and 16,100 mg/kg), three half-lives (10, 25, and 40 d), and five different sludge loads (2, 4, 6, 8, and 10 t/ha) were included in the risk scenarios. In Denmark, the initial risk ratio would reach 1.5 in a realistic worst-case consideration. For countries not having similar sludge regulations, the estimated risk ratio may initially be considerably higher. However, even in the most extreme scenarios, the level of LAS is expected to be well below the estimated PNEC one year after application. The present risk assessment, therefore, concludes that LAS does not pose a significant risk to fauna, plants, and essential functions of agricultural soils as a result of normal sewage sludge amendment. However, risks have been identified in worst-case scenarios.
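
    The screening arithmetic (RQ = PEC/PNEC with first-order decay in soil) is simple enough to work through. The mixing depth and soil bulk density below are assumed values, not figures from the paper; the sludge concentration, load, half-life, and PNEC are taken from the scenario ranges in the abstract:

    ```python
    # Worked sketch of the risk-quotient screening: RQ = PEC / PNEC,
    # with first-order decay of LAS in soil.
    PNEC = 4.6  # mg LAS/kg soil (best estimate used in the abstract)

    def pec_initial(c_sludge, load_t_ha, mix_depth_m=0.2, bulk_density=1500):
        """Initial soil concentration (mg/kg) after mixing sludge into the
        top soil layer. Mixing depth (m) and bulk density (kg/m3) are
        assumed values."""
        soil_mass = mix_depth_m * 10000 * bulk_density   # kg soil per ha
        return c_sludge * load_t_ha * 1000 / soil_mass   # mg LAS per kg soil

    def pec_after(pec0, half_life_d, days):
        return pec0 * 0.5 ** (days / half_life_d)

    pec0 = pec_initial(c_sludge=2600, load_t_ha=8)       # mid contamination level
    print(round(pec0 / PNEC, 2))                         # initial RQ, ~1.5
    print(pec_after(pec0, half_life_d=25, days=365) / PNEC < 1)  # True after 1 year
    ```

    Under these assumptions the initial RQ lands near the 1.5 "realistic worst-case" ratio quoted in the abstract, and decay drives the one-year RQ far below 1, matching the stated conclusion.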

  13. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  14. Space Station: NASA's software development approach increases safety and cost risks. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The House Committee on Science, Space, and Technology requested a study of software development issues for the space station, asking how well NASA had implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) if independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) if NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) if software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet the objectives, the study proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V-and-V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed in accordance with generally accepted government auditing standards between April 1991 and May 1992.

  15. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design: even once a design-basis tsunami height is set, the actual tsunami height may still exceed it because of uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and performing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84-, and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis; however, the method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
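
The logic-tree fractile curves described above can be sketched as follows. The branch hazard curves and weights below are hypothetical; the only assumption is that each logic-tree branch supplies an annual exceedance probability at each tsunami height, and fractiles are taken across branches, weighted by branch credibility:

```python
def fractile_hazard_curves(branch_curves, weights,
                           fractiles=(0.05, 0.16, 0.50, 0.84, 0.95)):
    """Weighted fractiles, per tsunami height, of the annual exceedance
    probabilities produced by the logic-tree branches."""
    total = sum(weights)
    w = [wi / total for wi in weights]          # normalize branch weights
    n_heights = len(branch_curves[0])
    out = {}
    for f in fractiles:
        vals = []
        for j in range(n_heights):
            col = sorted(zip((c[j] for c in branch_curves), w))
            cum = 0.0
            for value, wi in col:               # first branch reaching the fractile
                cum += wi
                if cum >= f:
                    vals.append(value)
                    break
        out[f] = vals
    out["mean"] = [sum(wi * c[j] for wi, c in zip(w, branch_curves))
                   for j in range(n_heights)]
    return out

# Three hypothetical logic-tree branches: annual exceedance prob. vs. height
branches = [[1e-2, 1e-3, 1e-5, 1e-7],
            [2e-2, 5e-3, 1e-4, 1e-6],
            [5e-3, 5e-4, 5e-6, 5e-8]]
curves = fractile_hazard_curves(branches, weights=[0.5, 0.3, 0.2])
```

The 5th- and 95th-percentile curves bracket the epistemic spread at each height, while the weighted-mean curve is what typically feeds downstream fragility and system analysis.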

  16. “Faster, better, and cheaper” at NASA: Lessons learned in managing and accepting risk

    NASA Astrophysics Data System (ADS)

    Paxton, Larry J.

    2007-11-01

Can Earth observing missions be done "faster, better, and cheaper"? In this paper I explore the management and technical issues that arose from the attempt to do things "faster, better, and cheaper" at NASA. The FBC mantra led to some failures and, more significantly, an increase in the cadence of missions. Mission cadence is a major enabler of innovation and the driver for the training and testing of the next generation of managers, engineers, and scientists. A high mission cadence is required to maintain and develop competence in mission design, management, and execution and, for an exploration-driven organization, to develop and train the next generation of leaders: the time between missions must be short enough that careers span the complete life of more than a few missions. This process reduces risk because the "lessons learned" are current and widely held. Increasing the cadence of missions has the added benefit of reducing the pressure to do everything on one particular mission, thereby reducing mission complexity. Since failures are inevitable in such a complex endeavor, a higher mission cadence has the advantage of providing some resiliency to the scientific program the missions support. Some failures are avoidable (often only in hindsight), but most are due to some combination of interacting factors. This interaction is often only appreciated as a potential failure mode after the fact. There is always pressure to do more with less: the scope of the project may become too ambitious, the management and oversight of the project may be reduced to fit the money allocated, or the project timeline may be lengthened due to external factors (launcher availability, budgetary constraints) without a concomitant increase in total funding. This leads to increased risk. Risks are always deemed acceptable until they change from a "risk" to a "failure mode". Identifying and managing those risks are particularly difficult when the activities are dispersed

  17. Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem

    DOE PAGES

    Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...

    2015-01-01

    In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.

  18. Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.

    In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
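
The nested aleatory–epistemic sampling the abstract describes can be sketched as a double loop; the toy model and distributions below are illustrative stand-ins, not those of the NASA Langley challenge problem:

```python
import random
import statistics

def nested_uq(model, sample_epistemic, sample_aleatory,
              n_outer=200, n_inner=500, threshold=1.0, seed=0):
    """Outer loop: fix one epistemic parameter draw; inner loop: propagate
    aleatory variability under it. Each outer sample yields one exceedance
    probability, so the result is a distribution over probabilities."""
    rng = random.Random(seed)
    p_exceed = []
    for _ in range(n_outer):
        theta = sample_epistemic(rng)            # epistemic draw, held fixed
        hits = sum(model(theta, sample_aleatory(rng)) > threshold
                   for _ in range(n_inner))
        p_exceed.append(hits / n_inner)
    return p_exceed

# Toy model: output = epistemic bias + aleatory noise (both hypothetical)
probs = nested_uq(model=lambda theta, x: theta + x,
                  sample_epistemic=lambda rng: rng.uniform(0.0, 0.5),
                  sample_aleatory=lambda rng: rng.gauss(0.5, 0.3))
mean_p = statistics.mean(probs)
```

The spread of `probs` (a "horsetail" of exceedance probabilities) reflects epistemic uncertainty, while each individual probability reflects aleatory variability alone.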

19. Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach

    NASA Astrophysics Data System (ADS)

    Mwangi, M. W.

    2015-12-01

Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and the scale-up necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys, and semi-structured interviews. Data was analyzed using both quantitative and qualitative techniques. Quantitative data was analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data was analyzed by establishing categories, themes, relationships/patterns, and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty, and climate change resilience were the impacts that resulted from the study.

  20. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  1. Probabilistic risk assessment of Chinese residents' exposure to fluoride in improved drinking water in endemic fluorosis areas.

    PubMed

    Zhang, Li E; Huang, Daizheng; Yang, Jie; Wei, Xiao; Qin, Jian; Ou, Songfeng; Zhang, Zhiyong; Zou, Yunfeng

    2017-03-01

Studies have yet to evaluate the effects of water improvement on fluoride concentrations in drinking water and the corresponding health risks to Chinese residents in endemic fluorosis areas (EFAs) at a national level. This paper summarized available data in the published literature (2008-2016) on water fluoride from the EFAs in China before and after water quality was improved. Based on these data, a health risk assessment of Chinese residents' exposure to fluoride in improved drinking water was performed by means of a probabilistic approach. The uncertainties in the risk estimates were quantified using Monte Carlo simulation and sensitivity analysis. Our results showed that, in general, the average fluoride levels (0.10-2.24 mg/L) in the improved drinking water in the EFAs of China were lower than the pre-intervention levels (0.30-15.24 mg/L). The highest fluoride levels were detected in North and Southwest China. The mean non-carcinogenic risks associated with consumption of the improved drinking water were mostly acceptable for Chinese residents (hazard quotient < 1), but the non-carcinogenic risk for children in most of the EFAs exceeded the safe level of 1 at the 95th percentile, indicating potential non-cancer health effects on this fluoride-exposed population. Sensitivity analyses indicated that the fluoride concentration in drinking water, the ingestion rate of water, and the exposure time in the shower were the most influential variables in the model; efforts should therefore focus on defining their probability distributions for a more accurate risk assessment.
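
A minimal sketch of the Monte Carlo hazard-quotient step, assuming the standard ingestion form HQ = (C × IR) / (BW × RfD); all distributions and the reference dose below are illustrative placeholders, not the paper's fitted values:

```python
import random

def hq_samples(n=50_000, seed=1, rfd=0.06):
    """Monte Carlo hazard quotients, HQ = (C * IR) / (BW * RfD).
    All distributions and the RfD are illustrative placeholders."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        c = rng.lognormvariate(-0.2, 0.5)    # water fluoride, mg/L
        ir = rng.uniform(0.5, 1.5)           # ingestion rate, L/day (child)
        bw = rng.uniform(15.0, 30.0)         # body weight, kg (child)
        samples.append(c * ir / (bw * rfd))
    return sorted(samples)

hq = hq_samples()
median, p95 = hq[len(hq) // 2], hq[int(0.95 * len(hq))]
```

Comparing the median against the 95th percentile is exactly the kind of tail check the abstract reports: a population can be acceptable on average (HQ < 1) while its upper percentile exceeds 1.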

  2. Analysis of the Coupled Influence of Hydraulic Conductivity and Porosity Heterogeneity on Probabilistic Risk Analysis

    NASA Astrophysics Data System (ADS)

    Libera, A.; Henri, C.; de Barros, F.

    2017-12-01

    Heterogeneities in natural porous formations, mainly manifested through the hydraulic conductivity (K) and, to a lesser degree, the porosity (Φ), largely control subsurface flow and solute transport. The influence of the heterogeneous structure of K on flow and solute transport processes has been widely studied, whereas less attention is dedicated to the joint heterogeneity of conductivity and porosity fields. Our study employs computational tools to investigate the joint effect of the spatial variabilities of K and Φ on the transport behavior of a solute plume. We explore multiple scenarios, characterized by different levels of heterogeneity of the geological system, and compare the computational results from the joint K and Φ heterogeneous system with the results originating from the generally adopted constant porosity case. In our work, we assume that the heterogeneous porosity is positively correlated to hydraulic conductivity. We perform numerical Monte Carlo simulations of conservative and reactive contaminant transport in a 3D aquifer. Contaminant mass and plume arrival times at multiple control planes and/or pumping wells operating under different extraction rates are analyzed. We employ different probabilistic metrics to quantify the risk at the monitoring locations, e.g., increased lifetime cancer risk and exceedance of Maximum Contaminant Levels (MCLs), under multiple transport scenarios (i.e., different levels of heterogeneity, conservative or reactive solutes and different contaminant species). Results show that early and late arrival times of the solute mass at the selected sensitive locations (i.e. control planes/pumping wells) as well as risk metrics are strongly influenced by the spatial variability of the Φ field.

  3. Developing a Gap Taxonomy to Address Crew Health Risks in NASA's Human Research Program

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Edwards, J. Michelle

    2009-01-01

    The mission of NASA's Human Research Program (HRP) is to understand and reduce the risk to crew health and performance in exploration missions. The HRP addresses 27 specific risks by identifying and then filling gaps in understanding the risks and in the ability to disposition the risks. The primary bases for identifying gaps have been past experience and requirements definition. This approach has been very effective in identifying some important, relevant gaps, but may be inadequate for identifying gaps outside the past experience base. We are exploring the use of a gap taxonomy as a comprehensive, underlying conceptual framework that allows a more systematic identification of gaps. The taxonomy is based on these stages in medical care: prediction, prevention, detection/diagnosis, treatment, monitoring, rehabilitation, and lifetime surveillance. This gap taxonomy approach identifies new gaps in HRP health risks. Many of the new gaps suggest risk reduction approaches that are more cost effective than present approaches. A major benefit of the gap taxonomy approach is to identify new, economical approaches that reduce the likelihood and/or consequence of a risk.

  4. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

  5. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. Uncertainties in the constituent materials (fiber and matrix) are simulated using probability theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are also included in the design methodology. A multi-factor interaction equation is used to evaluate the load- and environment-dependent degradation of composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size laminates for a target structural reliability. The results show that laminate configurations can be selected to improve structural reliability from three failures in 1,000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.

  6. An Example of Risk Informed Design

    NASA Technical Reports Server (NTRS)

    Banke, Rick; Grant, Warren; Wilson, Paul

    2014-01-01

    NASA Engineering requested a Probabilistic Risk Assessment (PRA) to compare the difference in the risk of Loss of Crew (LOC) and Loss of Mission (LOM) between different designs of a fluid assembly. They were concerned that the configuration favored by the design team was more susceptible to leakage than a second proposed design, but realized that a quantitative analysis to compare the risks between the two designs might strengthen their argument. The analysis showed that while the second design did help improve the probability of LOC, it did not help from a probability of LOM perspective. This drove the analysis team to propose a minor design change that would drive the probability of LOM down considerably. The analysis also demonstrated that there was another major risk driver that was not immediately obvious from a typical engineering study of the design and was therefore unexpected. None of the proposed alternatives were addressing this risk. This type of trade study demonstrates the importance of performing a PRA in order to completely understand a system's design. It allows managers to use risk as another one of the commodities (e.g., mass, cost, schedule, fault tolerance) that can be traded early in the design of a new system.

  7. Probabilistic finite elements for fracture mechanics

    NASA Technical Reports Server (NTRS)

    Besterfield, Glen

    1988-01-01

The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element with the near-crack-tip singular strain embedded in the element is used. Probabilistic quantities, such as the expectation, covariance, and correlation of stress intensity factors, are calculated for random load, random material, and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or reliability.
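
As a sketch of how such probabilistic descriptors arise, here is a first-order (FOSM) propagation of the classical stress intensity factor K_I = Y·σ·√(πa) for independent random load and crack length. This closed-form shortcut is only an illustration of the idea, not the embedded-singularity PFEM formulation itself, and the numbers are hypothetical:

```python
import math

def k1_mean_var(sigma, a, var_sigma, var_a, y=1.12):
    """First-order (FOSM) mean and variance of K_I = y*sigma*sqrt(pi*a)
    for independent random load sigma and crack length a."""
    k = y * sigma * math.sqrt(math.pi * a)
    dk_dsigma = y * math.sqrt(math.pi * a)                    # dK/d(sigma)
    dk_da = y * sigma * math.sqrt(math.pi) / (2.0 * math.sqrt(a))  # dK/da
    var_k = dk_dsigma ** 2 * var_sigma + dk_da ** 2 * var_a
    return k, var_k

# Hypothetical edge crack: 100 MPa load, 2 mm crack, small scatter in both
k_mean, k_var = k1_mean_var(100.0, 0.002, var_sigma=25.0, var_a=1e-8)
```

Comparing K_I's distribution against a fracture-toughness distribution then yields a probability of fracture, which is the quantity the PFEM ultimately delivers.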

  8. Image Analysis via Fuzzy-Reasoning Approach: Prototype Applications at NASA

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steven J.

    2004-01-01

A set of imaging techniques based on a Fuzzy Reasoning (FR) approach was built for NASA at Kennedy Space Center (KSC) to perform complex real-time visual safety prototype tasks, such as detection and tracking of moving Foreign Object Debris (FOD) during Space Shuttle liftoff and visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad. The system has also proven its potential for enhancing X-ray images used to screen hard-covered items, leading to better visualization. The system's capability was also used during the imaging analysis of the Space Shuttle Columbia accident. These FR-based imaging techniques include novel proprietary adaptive image segmentation, image edge extraction, and image enhancement. A Probabilistic Neural Network (PNN) scheme available from the NeuroShell(TM) Classifier and optimized via a Genetic Algorithm (GA) was also used along with this set of novel imaging techniques to add powerful learning and image classification capabilities. Prototype applications built using these techniques have received NASA Space Awards, including a Board Action Award, and are currently being filed for patents by NASA; they are being offered for commercialization through the Research Triangle Institute (RTI), an internationally recognized corporation in scientific research and technology development. Companies from different fields, including security, medical, text digitization, and aerospace, are currently in the process of licensing these technologies from NASA.

  9. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Derivations and Verification of Plans. Volume 1

    NASA Technical Reports Server (NTRS)

Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
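
For reference, a one-sided variables acceptance decision of the kind such plans prescribe typically reduces to the "k-method": measure a sample, then compare the standardized margin to the specification limit against an acceptability constant. The sketch below assumes a known upper specification limit, with the sample size n and constant k supplied by the plan's operating-characteristic targets (the data are hypothetical):

```python
import statistics

def variables_accept(sample, usl, k):
    """One-sided variables sampling plan ("k-method"): accept the lot when
    (USL - xbar) / s >= k, with n and k chosen from the plan's OC targets."""
    xbar, s = statistics.mean(sample), statistics.stdev(sample)
    return (usl - xbar) / s >= k

# Hypothetical measurements against an upper specification limit of 10.0
lot = [1.0, 2.0, 3.0, 2.0, 1.0]
decision = variables_accept(lot, usl=10.0, k=2.0)
```

Because the decision uses the measured mean and standard deviation rather than a pass/fail count, variables plans reach a target consumer's risk with far smaller samples than attributes plans, which is the resource saving the assessment was after.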

  10. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, Jason B.

    2014-01-01

This work details the use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data, and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly-to-seasonal projections enable scientists, educators, project managers, and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models, so downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS): NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts are paired with observations of precipitation and temperature for the target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (canonical correlation analysis, CCA) is used to link the model-projected states, as predictors, to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling; temporal disaggregation of the monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.

  11. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Roberts, J. B.

    2014-12-01

This work details the use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data, and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly-to-seasonal projections enable scientists, educators, project managers, and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models, so downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS): NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts are paired with observations of precipitation and temperature for the target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (canonical correlation analysis, CCA) is used to link the model-projected states, as predictors, to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling; temporal disaggregation of the monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.
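
The tercile bootstrapping step described in the two records above can be sketched as follows; the hindcast values and the forecast tercile probabilities are hypothetical:

```python
import random

def tercile_bootstrap(history, tercile_probs, rng):
    """Pick a tercile category (below, near, above normal) according to the
    forecast probabilities, then resample a historical month from it."""
    ranked = sorted(history)
    n = len(ranked)
    below, above = ranked[: n // 3], ranked[-(n // 3):]
    near = ranked[n // 3 : n - n // 3]
    tercile = rng.choices([below, near, above], weights=tercile_probs)[0]
    return rng.choice(tercile)

rng = random.Random(0)
precip = [40, 55, 61, 70, 78, 85, 90, 102, 130]   # hindcast months, mm
# Forecast strongly favors below-normal conditions: (0.6, 0.3, 0.1)
draws = [tercile_bootstrap(precip, (0.6, 0.3, 0.1), rng) for _ in range(1000)]
```

Each draw is a plausible monthly total consistent with the probabilistic forecast, so a sequence of draws can feed a deterministic hydrologic DSS while honoring the forecast's tercile shifts.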

  12. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  13. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
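
A minimal event-tree Monte Carlo of the kind described might look like the sketch below. All element failure and abort-success probabilities, and the abort-mode windows, are illustrative placeholders only, in the same spirit as the paper's own disclaimer:

```python
import random

# Illustrative per-flight failure probabilities (not NASA figures)
P_FAIL = {"SSME": 0.02, "SRB": 0.01, "ET": 0.005}
ABORT_SUCCESS = {"RTLS": 0.90, "TAL": 0.95, "ATO": 0.99}

def fly(rng):
    """One ascent through a simplified event tree: a propulsion-element
    failure selects an abort mode (by time of failure), whose own
    success is then sampled in turn."""
    for element, p in P_FAIL.items():
        if rng.random() < p:
            t = rng.random()     # fraction of ascent elapsed at failure
            mode = "RTLS" if t < 0.4 else ("TAL" if t < 0.75 else "ATO")
            ok = rng.random() < ABORT_SUCCESS[mode]
            return ("abort-ok" if ok else "loss", mode)
    return ("nominal", None)

rng = random.Random(7)
runs = [fly(rng) for _ in range(100_000)]
p_abort = sum(r[0] != "nominal" for r in runs) / len(runs)
```

Tallying outcomes by mode then gives both the probability that each abort mode is invoked and the probability that it completes successfully, which is the pair of quantities the model was built to estimate.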

  14. NASA Occupant Protection Standards Development

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles

    2012-01-01

Historically, spacecraft landing systems have been tested with human volunteers because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive and racing safety, a new approach was developed with several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; (5) use the validated IARVs to update standards and vehicle requirements.
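
For context, the Brinkley Dynamic Response index is computed from a single-degree-of-freedom spring-damper model of the occupant driven by the seat acceleration. The sketch below uses illustrative damping and frequency values, not certified per-axis coefficients, and a hypothetical input pulse:

```python
G = 9.80665  # m/s^2

def brinkley_dr(accel, dt, zeta=0.224, omega=52.9):
    """Single-degree-of-freedom lumped-mass model behind the Brinkley index:
    x'' + 2*zeta*omega*x' + omega**2 * x = a(t), DR = omega**2 * max|x| / g.
    zeta and omega here are illustrative values for one body axis."""
    x = v = peak = 0.0
    for a in accel:
        # semi-implicit Euler integration of the oscillator
        v += (a - 2.0 * zeta * omega * v - omega ** 2 * x) * dt
        x += v * dt
        peak = max(peak, abs(x))
    return omega ** 2 * peak / G

# 0.5 s constant 5 g pulse sampled at 1 kHz (hypothetical seat acceleration)
dr = brinkley_dr([5.0 * G] * 500, dt=0.001)
```

Because the oscillator is underdamped, a step input overshoots its plateau, so the DR for this 5 g pulse lands above 5; the computed DR is then compared against axis-specific risk thresholds.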

  15. A probabilistic QMRA of Salmonella in direct agricultural reuse of treated municipal wastewater.

    PubMed

    Amha, Yamrot M; Kumaraswamy, Rajkumari; Ahmad, Farrukh

    2015-01-01

Developing reliable quantitative microbial risk assessment (QMRA) procedures aids in setting recommendations on reuse applications of treated wastewater. In this study, a probabilistic QMRA to determine the risk of Salmonella infections resulting from the consumption of edible crops irrigated with treated wastewater was conducted. Quantitative polymerase chain reaction (qPCR) was used to enumerate Salmonella spp. in post-disinfection samples, which showed concentrations ranging from 90 to 1,600 cells/100 mL. The results were used to construct probabilistic exposure models for the raw consumption of three vegetables (lettuce, cabbage, and cucumber) irrigated with treated wastewater, and to estimate the disease burden using Monte Carlo analysis. The results showed an elevated median disease burden when compared with the acceptable disease burden set by the World Health Organization, which is 10⁻⁶ disability-adjusted life years per person per year. Of the three vegetables considered, lettuce showed the highest risk of infection in all scenarios considered, while cucumber showed the lowest risk. The results of the Salmonella concentration obtained with qPCR were compared with the results of Escherichia coli concentration for samples taken on the same sampling dates.
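
The exposure-to-risk step of such a QMRA typically runs ingested doses through a microbial dose-response model. The sketch below uses the standard approximate beta-Poisson form with placeholder parameters, not the fit used in the paper:

```python
def beta_poisson(dose, alpha=0.3126, n50=23600.0):
    """Approximate beta-Poisson dose-response curve,
    P(infection) = 1 - (1 + dose * (2**(1/alpha) - 1) / N50) ** (-alpha).
    alpha and N50 are illustrative placeholders, not the paper's fit."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Risk per serving for a handful of hypothetical ingested doses (cells)
risks = [beta_poisson(d) for d in (1.0, 100.0, 23600.0)]
```

In the full Monte Carlo, the dose itself is a random variable (pathogen concentration × amount of produce consumed × decay/washing factors), and each sampled dose is mapped through this curve before annualizing to a disease burden. By construction, a dose equal to N50 yields a 50% infection probability.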

  16. Determining the Relative Criticality of Diverse Exploration Risks in NASA's Human Research Program

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Edwards, J. Michelle; Anton, Wilma; Robotham, Kwesi

    2009-01-01

The mission of NASA's Human Research Program (HRP) is to understand and reduce the risk to crew health and performance in exploration missions. The HRP addresses 27 specific risks, primarily in the context of Continuous Risk Management. Each risk is evaluated in terms of two missions (a six-month stay on the Moon and a thirty-month round trip to Mars) and three types of consequences (in-mission crew health, post-mission crew health, and in-mission performance). The lack of a common metric between the three consequence scales, such as financial costs or quality-adjusted life years lost, makes it difficult to compare the relative criticality of the risks. We are, therefore, exploring the use of a ternary scale of criticality based on the common metric of influencing an operational decision. The three levels correspond to the level of concern the risk generates for a "go/no-go" decision to launch a mission: 1) no-go; 2) go with significant reservations; 3) go. The criticality of each of the 27 risks is scored for the three types of consequence in both types of mission. The scores are combined to produce an overall criticality rating for each risk. The overall criticality rating can then be used to guide the prioritization of resources to affect the greatest amount of risk reduction.

  17. Probabilistic risk assessment of cotton pyrethroids: I. Distributional analyses of laboratory aquatic toxicity data.

    PubMed

    Solomon, K R; Giddings, J M; Maund, S J

    2001-03-01

    This is the first in a series of five papers that assess the risk of the cotton pyrethroids in aquatic ecosystems in a series of steps ranging from the analysis of effects data through modeling exposures in the landscape. Pyrethroid insecticides used on cotton have the potential to contaminate aquatic systems. The objectives of this study were to develop probabilistic estimates of toxicity distributions, to compare these among the pyrethroids, and to evaluate cypermethrin as a representative pyrethroid for the purposes of a class risk assessment of the pyrethroids. The distribution of cypermethrin acute toxicity data gave 10th centile values of 10 ng/L for all organisms, 6.4 ng/L for arthropods, and 380 ng/L for vertebrates. For bifenthrin, cyfluthrin, lambda-cyhalothrin, and deltamethrin, the 10th centile values for all organisms were 15, 12, 10, and 9 ng/L, respectively, indicating similar or somewhat lower toxicity than cypermethrin. For tralomethrin and fenpropathrin, the 10th centiles were <310 and 240 ng/L, respectively. The distribution of permethrin toxicity to all organisms, arthropods, and vertebrates gave 10th centiles of 180, 76, and 1600 ng/L, respectively, whereas those for fenvalerate were 37, 8, and 150 ng/L. With the exception of tralomethrin, the distributions of acute toxicity values had similar slopes, suggesting that the variation of sensitivity in a range of aquatic nontarget species is similar. The pyrethroids have different recommended field rates of application that are related to their efficacy, and the relationship between field rate and 10th centiles showed a trend. These results support the use of cypermethrin as a reasonable worst-case surrogate for the other pyrethroids for the purposes of risk assessment of pyrethroids as a class.
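
The 10th-centile values reported above are typically derived from a distribution fitted to log-transformed toxicity data. Here is a sketch assuming a log-normal species-sensitivity distribution; the toxicity values are hypothetical, not the paper's dataset:

```python
import math
import statistics

def hc_p(toxicity_ng_l, p=0.10):
    """p-th centile of a log-normal species-sensitivity distribution
    fitted to acute toxicity values."""
    logs = [math.log10(x) for x in toxicity_ng_l]
    mu, sigma = statistics.mean(logs), statistics.stdev(logs)
    z = statistics.NormalDist().inv_cdf(p)   # ~ -1.28 for the 10th centile
    return 10.0 ** (mu + z * sigma)

# Hypothetical acute toxicity values (ng/L) across test species
cypermethrin_like = [3, 6, 9, 15, 40, 120, 380, 900, 2500]
hc10 = hc_p(cypermethrin_like)
```

The centile (often written HC10) estimates a concentration protective of 90% of species; comparing slopes of the fitted distributions across compounds is what supports the paper's claim of similar spreads of species sensitivity.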

  18. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Borgonovo; C. L. Smith

    2012-10-01

    Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed to the base-case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty i) induces notable variability in the point-estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
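    The RAW definition above is easy to illustrate on a toy model. The system structure and failure probabilities below are invented for illustration; they are not the NASA model from the paper.

```python
# Sketch of Risk Achievement Worth for a toy system: two redundant trains
# (A, B) and a common support system C. All numbers are illustrative.

def risk_metric(p_a, p_b, p_c):
    # Top event: support C fails, or C works but both trains fail.
    return p_c + (1 - p_c) * p_a * p_b

P_A = P_B = 1e-2
P_C = 1e-4

base = risk_metric(P_A, P_B, P_C)
raw_a = risk_metric(1.0, P_B, P_C) / base  # A assumed failed (p = 1)
raw_c = risk_metric(P_A, P_B, 1.0) / base  # C assumed failed (p = 1)
# The single-point-of-failure C ranks far above the redundant train A.
```

    Replacing these point estimates with distributions and taking expectations over them is the kind of extension the ERAW measure formalizes.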

  19. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of the masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
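    Applying such a datasheet amounts to scaling the TBE by a maturity-dependent growth allowance. The factor table below is hypothetical; real values would come from the CADRe-derived datasheets.

```python
# Sketch of adjusting a single-point Technical Baseline Estimate (TBE)
# of mass with maturity-dependent growth factors. The factor table is
# hypothetical; CADRe-derived values would replace it.

GROWTH_FACTOR = {        # fractional mass growth allowance by review phase
    "PDR": 0.30,         # early design: large remaining uncertainty
    "CDR": 0.15,
    "pre-launch": 0.02,  # mature estimate: little growth left
}

def adjusted_mass(tbe_kg, phase):
    return tbe_kg * (1.0 + GROWTH_FACTOR[phase])

m_pdr = adjusted_mass(100.0, "PDR")            # large allowance early
m_launch = adjusted_mass(100.0, "pre-launch")  # small allowance late
```

    The shrinking factors from PDR to pre-launch mirror the paper's finding that mass growth uncertainty decreases as estimate maturity increases.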

  20. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  1. Probabilistic simple sticker systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper "DNA computing, sticker systems and universality," Acta Informatica, vol. 35, pp. 401-420, 1998. A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings of the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
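    The probability computation described in the abstract (multiply the probabilities of all axiom occurrences in a derivation) can be sketched directly. The axioms, their probabilities, and the threshold rule below are invented for illustration.

```python
# Sketch of the string-probability rule for probabilistic sticker
# systems: the probability of a generated string is the product of the
# probabilities of all axiom occurrences used in its derivation.
# Axioms, probabilities, and the cutoff rule are illustrative.

axiom_prob = {"ax1": 0.5, "ax2": 0.3, "ax3": 0.2}

def derivation_probability(axioms_used):
    p = 1.0
    for ax in axioms_used:      # repeated occurrences multiply again
        p *= axiom_prob[ax]
    return p

p_string = derivation_probability(["ax1", "ax2", "ax1"])  # 0.5*0.3*0.5

def in_language(axioms_used, threshold=0.05):
    # One possible probabilistic requirement: keep strings whose
    # derivation probability stays above a cutoff.
    return derivation_probability(axioms_used) >= threshold
```

    Selecting only strings above a cutoff is one example of the "probabilistic requirements" that filter the generated language.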

  2. NASA Propulsion Concept Studies and Risk Reduction Activities for Resource Prospector Lander

    NASA Technical Reports Server (NTRS)

    Trinh, Huu P.; Williams, Hunter; Burnside, Chris

    2015-01-01

    The trade study led to the selection of the propulsion concept with the lowest cost and lowest net risk: government-owned, flight-qualified components that meet mission requirements, although the configuration is not optimized. Risk reduction activities provided an opportunity to implement design improvements during development through an early-test approach, to gain knowledge of system operation and identify operating limits, and to gather data to anchor analytical models for future flight designs. The propulsion system cold-flow test series provided valuable data for future designs: the pressure surge from system priming and waterhammer is within component operating limits, and the results enable optimization of the ullage volume to reduce propellant tank mass. RS-34 hot-fire tests successfully demonstrated use of the engines for the RP mission, showing no degradation of performance due to the extended storage life of the hardware, enabling operation of the engine for RP flight mission scenarios outside of the qualification regime, and providing extended data for the thermal and GNC designs. Significant progress has been made on NASA propulsion concept design and risk reduction for the Resource Prospector lander.

  3. Probabilistic Risk Assessment Process for High-Power Laser Operations in Outdoor Environments

    DTIC Science & Technology

    2016-01-01

    avionics data bus. In the case of a UAS-mounted laser system, the control path will additionally include a radio or satellite communications link. A remote... ...hazard assessment purposes is not widespread within the laser safety community. The aim of this paper is to outline the basis of the probabilistic

  4. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
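    The response surface idea described above (replace an expensive FEA run with a cheap fitted surrogate, then sample it) can be sketched as follows. The polynomial coefficients, distributions, allowable stress, and the max-over-components system rule are all illustrative assumptions, not PRODAF internals.

```python
import random

# Sketch of the response-surface method at component level: substitute
# cheap fitted polynomials for FEA runs, then Monte Carlo sample them
# and roll component responses up to a system-level failure estimate.
# Coefficients, distributions, and the allowable are illustrative.

random.seed(1)

# Hypothetical fitted surfaces: stress (MPa) vs. a geometric variable.
def blade_stress(chord_mm):
    return 420.0 + 3.1 * (chord_mm - 50.0) + 0.08 * (chord_mm - 50.0) ** 2

def disk_stress(thickness_mm):
    return 395.0 + 5.0 * (thickness_mm - 20.0)

ALLOWABLE = 430.0  # MPa, assumed stress limit

failures = 0
N = 20_000
for _ in range(N):
    chord = random.gauss(50.0, 2.0)        # mm, input scatter
    thickness = random.gauss(20.0, 0.5)    # mm
    system_stress = max(blade_stress(chord), disk_stress(thickness))
    failures += system_stress > ALLOWABLE
p_fail = failures / N
```

    Because each surrogate evaluation is trivially cheap, tens of thousands of samples are affordable where direct FEA sampling would not be.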

  5. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
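    The combination of statistics and fracture mechanics described above can be illustrated with a Monte Carlo sketch over a Paris-law crack growth model. The material constants, flaw-size and stress distributions, and the percentile reported are all illustrative assumptions, not the program's actual model.

```python
import math
import random

# Sketch of a probabilistic life estimate: Monte Carlo over a Paris-law
# crack-growth model, da/dN = C * (dK)^m with dK = S * sqrt(pi * a).
# All constants and distributions are illustrative.

random.seed(7)

def cycles_to_failure(a0, a_crit, stress, C, m, steps=200):
    """Numerically integrate Paris law from initial to critical crack size."""
    n, a = 0.0, a0
    da = (a_crit - a0) / steps
    for _ in range(steps):
        dk = stress * math.sqrt(math.pi * a)   # stress intensity range
        n += da / (C * dk ** m)                # dN = da / (C * dK^m)
        a += da
    return n

lives = []
for _ in range(2000):
    a0 = random.lognormvariate(math.log(0.001), 0.2)  # initial flaw, m
    s = random.gauss(150.0, 10.0)                     # stress range, MPa
    lives.append(cycles_to_failure(a0, 0.02, s, C=1e-12, m=3.0))

lives.sort()
b01_life = lives[len(lives) // 100]  # ~1st-percentile life estimate
```

    A low-percentile life like this, rather than the mean, is the quantity a return-to-service interval would be set against.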

  6. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  7. A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, typical risk management process, origins of risk matrix, performance measures, performance objectives and Bayes theorem.
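    The Bayes theorem step mentioned above is the standard mechanism for updating a failure-probability estimate with test evidence. A minimal discrete sketch follows; the prior and the observed data are invented for illustration.

```python
# Sketch of a Bayes-theorem update of a discrete prior over a
# component's failure probability, given observed demand data.
# The prior weights and the test outcome are illustrative.

def binom_likelihood(p, failures, trials):
    # Likelihood of `failures` in `trials` Bernoulli demands; the
    # binomial coefficient cancels in normalization and is omitted.
    return p ** failures * (1 - p) ** (trials - failures)

prior = {0.001: 0.5, 0.01: 0.3, 0.05: 0.2}  # P(p) over candidate values

def posterior(prior, failures, trials):
    unnorm = {p: w * binom_likelihood(p, failures, trials)
              for p, w in prior.items()}
    z = sum(unnorm.values())
    return {p: w / z for p, w in unnorm.items()}

post = posterior(prior, failures=1, trials=20)
# One failure in 20 trials shifts weight toward the larger p values.
```

    The same update applied to continuous priors (e.g. lognormal or beta) is the workhorse of Bayesian parameter estimation in PRA.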

  8. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  9. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to the Space Shuttle environment, particularly to the potential of losing the vehicle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with loss due to Orbiter failure, external tank failure, and landing failure or error.
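    A fault tree of the kind mentioned above combines basic-event probabilities through AND/OR gates up to a top event. The gate structure and numbers below are a toy illustration, not the study's Shuttle model.

```python
# Sketch of a fault-tree top-event evaluation used to allocate system
# probability to subsystems. Gate structure and numbers are illustrative.

def or_gate(*probs):
    # P(A or B or ...) for independent events = 1 - prod(1 - p_i)
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def and_gate(*probs):
    # P(A and B and ...) for independent events = prod(p_i)
    out = 1.0
    for p in probs:
        out *= p
    return out

# Toy top event: loss of vehicle = containment failure OR propulsion
# failure OR (primary AND backup landing-system failure), independent.
p_lov = or_gate(1e-4, 5e-4, and_gate(1e-2, 1e-2))
```

    Evaluating the tree bottom-up like this is also what lets a target top-event probability be apportioned back down to subsystem allocations.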

  10. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  11. Importance Of Quality Control in Reducing System Risk, a Lesson Learned From The Shuttle and a Recommendation for Future Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Messer, Bradley P.

    2006-01-01

    This paper presents lessons learned from the Space Shuttle return-to-flight experience and the importance of these lessons in the development of the new NASA Crew Launch Vehicle (CLV). Specifically, the paper discusses the relationship between process control and system risk, and the importance of process control in improving space vehicle flight safety. It uses the External Tank (ET) Thermal Protection System (TPS) experience and lessons learned from the redesign and process enhancement activities performed in preparation for return to flight after the Columbia accident. The paper also discusses, in some detail, the probabilistic engineering physics-based risk assessment performed by the Shuttle program to evaluate the impact of TPS failure on system risk, and the application of the methodology to the CLV.

  12. A probabilistic risk assessment for deployed military personnel after the implementation of the "Leishmaniasis Control Program" at Tallil Air Base, Iraq.

    PubMed

    Schleier, Jerome J; Davis, Ryan S; Barber, Loren M; Macedo, Paula A; Peterson, Robert K D

    2009-05-01

    Leishmaniasis has been of concern to the U.S. military and has re-emerged in importance because of recent deployments to the Middle East. We conducted a retrospective probabilistic risk assessment for military personnel potentially exposed to insecticides during the "Leishmaniasis Control Plan" (LCP) undertaken in 2003 at Tallil Air Base, Iraq. We estimated acute and subchronic risks from resmethrin, malathion, piperonyl butoxide (PBO), and pyrethrins applied using a truck-mounted ultra-low-volume (ULV) sprayer and lambda-cyhalothrin, cyfluthrin, bifenthrin, chlorpyrifos, and cypermethrin used for residual sprays. We used the risk quotient (RQ) method for our risk assessment (estimated environmental exposure/toxic endpoint) and set the RQ level of concern (LOC) at 1.0. Acute RQs for truck-mounted ULV and residual sprays ranged from 0.00007 to 33.3 at the 95th percentile. Acute exposure to lambda-cyhalothrin, bifenthrin, and chlorpyrifos exceeded the RQ LOC. Subchronic RQs for truck-mounted ULV and residual sprays ranged from 0.00008 to 32.8 at the 95th percentile. Subchronic exposures to lambda-cyhalothrin and chlorpyrifos exceeded the LOC. However, estimated exposures to lambda-cyhalothrin, bifenthrin, and chlorpyrifos did not exceed their respective no observed adverse effect levels.
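    The risk quotient screen described above is a simple ratio test. The sketch below implements it; the exposure and endpoint values are illustrative placeholders, not the study's estimates.

```python
# Sketch of the risk-quotient (RQ) screen: RQ = estimated exposure /
# toxicity endpoint, compared with a level of concern (LOC) of 1.0.
# Exposure and endpoint values are illustrative, not the study's data.

LOC = 1.0

def risk_quotient(exposure, endpoint):
    return exposure / endpoint

# Hypothetical 95th-percentile exposures and no-effect endpoints
# (same units in numerator and denominator, e.g. mg/kg/day)
scenarios = {
    "insecticide_A": (0.002, 0.10),   # RQ well below the LOC
    "insecticide_B": (0.500, 0.015),  # RQ exceeds the LOC
}

exceedances = {name: risk_quotient(e, t) > LOC
               for name, (e, t) in scenarios.items()}
```

    An RQ above the LOC flags a scenario for refined assessment; as the abstract notes, an exceedance does not by itself mean an adverse-effect threshold was crossed.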

  13. Acute Radiation Risk and BRYNTRN Organ Dose Projection Graphical User Interface

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Hu, Shaowen; Nounu, Hateni N.; Kim, Myung-Hee

    2011-01-01

    The integration of human space applications risk projection models of organ dose and acute radiation risk has been a key problem. NASA has developed an organ dose projection model using the BRYNTRN and SUM DOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). BRYNTRN is a baryon transport code and SUM DOSE is an output data processing code; the risk projection models of organ doses and ARR take the output from BRYNTRN as input to their calculations. BRYNTRN code operation requires extensive input preparation, and only a graphical user interface (GUI) can handle input and output for BRYNTRN and connect it to the response models easily and correctly. The GUI for the ARR and BRYNTRN Organ Dose (ARRBOD) projection code provides the seamless integration of input and output manipulations required for operation of the ARRBOD modules (BRYNTRN, SLMDOSE, and the ARR probabilistic response model) in assessing the acute risk and the organ doses from significant Solar Particle Events (SPEs). The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the Mission Operations Directorate (MOD), and space biophysics researchers. The assessment of astronauts' radiation risk from SPEs supports mission design and operational planning to manage radiation risks in future space missions. The ARRBOD GUI can identify proper shielding solutions using gender-specific organ dose assessments in order to avoid ARR symptoms and to stay within current NASA short-term dose limits. The quantified evaluation of ARR severities based on any given shielding configuration and a specified EVA or other mission

  14. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  15. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven who had participated in an emergency meeting on March 30, assessing the probability of a major event following the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement rent, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach of calming the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading.
The latter has been criticized as

  16. NASA Technology Evaluation for Environmental Risk Mitigation Remediation Technology Collaboration Development

    NASA Technical Reports Server (NTRS)

    Romeo, James

    2013-01-01

    NASA is committed to finding solutions to agency cleanup problems that are better, cheaper, and more effective than the status quo. Unfortunately, some potential solutions involve innovative technologies for which NASA remediation managers may not have a high level of understanding or confidence. Since 2004, NASA's Stennis Space Center (SSC) in Mississippi has been pumping groundwater contaminated with trichloroethylene (TCE) and other halogenated volatile organic compounds (HVOCs) from its cleanup location designated "Area G" through extraction wells to an aboveground treatment system. Over time, however, the effectiveness of this treatment strategy has diminished, and an alternative approach is needed. In 2012, professionals from NASA's Principal Center for Technology Evaluation for Environmental Risk Mitigation (TEERM) introduced SSC managers to an innovative technology for enhancing the performance of SSC's existing pump-and-treat system. The technology, generally referred to as in situ chemical oxidation (ISCO), involves slowly and continuously injecting a strong but safe chemical oxidant into the groundwater. Treatment is enhanced by a "surfactant-type effect" which causes residual contamination from saturated soil to be released into the dissolved phase, where it can be readily oxidized. Any dissolved-phase contamination that is not oxidized can be collected by the extraction well network and treated aboveground. SSC was not familiar with the technology, so to increase their confidence, TEERM identified a contractor who was willing to demonstrate their product and process at a significantly reduced price. An initial, small-scale demonstration of ISCO began at SSC in March 2012 and was completed in August 2012. This successful demonstration was followed by three larger-scale ISCO demonstrations between August and December 2012. The contractor's innovative Continuous Injection System (CIS) incorporated "green" and sustainable technologies and practices. A slow

  17. On the Accuracy of Probabilistic Buckling Load Prediction

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.

    2001-01-01

    The buckling strength of thin-walled stiffened or unstiffened, metallic or composite shells is of major concern in aeronautical and space applications. The difficulty of predicting the behavior of axially compressed thin-walled cylindrical shells continues to worry design engineers as we enter the third millennium. Thanks to extensive research programs in the late sixties and early seventies and the contributions of many eminent scientists, it is known that buckling strength calculations are affected by uncertainties in the definition of the parameters of the problem, such as the definition of loads, material properties, geometric variables, and edge support conditions, and by the accuracy of the engineering models and analysis tools used in the design phase. The NASA design criteria monographs from the late sixties account for these design uncertainties by the use of a lump-sum safety factor. This so-called empirical knockdown factor gamma usually results in overly conservative designs. Recently, new reliability-based probabilistic design procedures for buckling-critical imperfect shells have been proposed. They essentially consist of a stochastic approach which introduces an improved "scientific knockdown factor" lambda(sub a) that is not as conservative as the traditional empirical one. In order to incorporate probabilistic methods into a High Fidelity Analysis Approach, one must be able to assess the accuracy of the various steps that must be executed to complete a reliability calculation. In the present paper, the effect of the size of the experimental input sample on the predicted value of the scientific knockdown factor lambda(sub a) calculated by the First-Order, Second-Moment Method is investigated.
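    A First-Order, Second-Moment style knockdown calculation can be sketched from sample statistics alone. The sample values, the target reliability, and the mean-minus-z-sigma form below are illustrative assumptions, not the paper's procedure.

```python
import statistics

# Sketch of a First-Order, Second-Moment (FOSM) style "scientific
# knockdown factor": from a sample of normalized buckling loads, take
# the mean minus z standard deviations for a target reliability.
# Sample data and the z value are illustrative.

def fosm_knockdown(normalized_loads, z=3.09):  # z for ~99.9% reliability
    mu = statistics.mean(normalized_loads)
    sd = statistics.stdev(normalized_loads)
    return mu - z * sd

# Hypothetical test sample: measured buckling load / classical prediction
sample = [0.68, 0.72, 0.75, 0.70, 0.74, 0.71, 0.69, 0.73]
lam = fosm_knockdown(sample)
```

    Because the sample standard deviation is itself uncertain for small samples, the resulting factor shifts with sample size, which is exactly the accuracy question the paper investigates.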

  18. Probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress on the random field are given.
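    The mean-and-variance propagation described above can be illustrated in one variable: expand the response about the input mean and carry the moments through the Taylor terms. The response function below is a made-up stand-in for a finite element response.

```python
# Sketch of the perturbation idea behind PFEM in one input variable:
# propagate the mean and variance of an input through a response
# function using its derivatives at the mean. The cubic response
# function is an illustrative stand-in for an FE response.

def moments_second_order(f, df, d2f, mean_x, var_x):
    """Mean (to second order) and variance (to first order) of f(X)."""
    mean_y = f(mean_x) + 0.5 * d2f(mean_x) * var_x
    var_y = df(mean_x) ** 2 * var_x
    return mean_y, var_y

f = lambda x: x ** 3       # hypothetical response
df = lambda x: 3 * x ** 2  # first derivative
d2f = lambda x: 6 * x      # second derivative

mean_y, var_y = moments_second_order(f, df, d2f, mean_x=2.0, var_x=0.01)
# mean_y = 8 + 0.5*12*0.01 = 8.06;  var_y = 12**2 * 0.01 = 1.44
```

    The second-order mean correction (the 0.5*f''*var term) is what distinguishes this from a plain first-order propagation, and it matters whenever the response is noticeably curved over the input scatter.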

  19. Space Debris Modeling at NASA

    NASA Technical Reports Server (NTRS)

    Johnson, Nicholas L.

    2001-01-01

    Since the Second European Conference on Space Debris in 1997, the Orbital Debris Program Office at the NASA Johnson Space Center has undertaken a major effort to update and improve the principal software tools employed to model the space debris environment and to evaluate mission risks. NASA's orbital debris engineering model, ORDEM, represents the current and near-term Earth orbital debris population from the largest spacecraft to the smallest debris in a manner which permits spacecraft engineers and experimenters to estimate the frequency and velocity with which a satellite may be struck by debris of different sizes. Using expanded databases and a new program design, ORDEM2000 provides a more accurate environment definition combined with a much broader array of output products in comparison with its predecessor, ORDEM96. Studies of the potential long-term space debris environment are now conducted with EVOLVE 4.0, which incorporates significant advances in debris characterization and breakup modeling. An adjunct to EVOLVE 4.0, GEO EVOLVE has been created to examine debris issues near the geosynchronous orbital regime. In support of NASA Safety Standard 1740.14, which establishes debris mitigation guidelines for all NASA space programs, a set of evaluation tools called the Debris Assessment Software (DAS) is specifically designed for program offices to determine whether they are in compliance with NASA debris mitigation guidelines. DAS 1.5 has recently been released with improved WINDOWS compatibility and graphics functions. DAS 2.0 will incorporate guideline changes in a forthcoming revision to NASA Safety Standard 1740.14. Whereas DAS contains a simplified model to calculate possible risks associated with satellite reentries, NASA's higher fidelity Object Reentry Survival Analysis Tool (ORSAT) has been upgraded to Version 5.0. With the growing awareness of the potential risks posed by uncontrolled satellite reentries to people and property on Earth, the

  20. Occupant Protection at NASA

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey; Granderson, Brad; Scheuring, Rick

    2010-01-01

    This slide presentation reviews NASA's efforts to arrive at protection of occupants of the Orion spacecraft on landing. An Abbreviated Injury Scale (AIS) has been developed; it is an anatomically based, consensus-derived, global severity scoring system that classifies each injury by body region according to its relative importance on a 6-point ordinal scale. The presentation reviews an Operationally Relevant Injury Scale (ORIS), a classification methodology, and shows charts that detail the results of applying the ORIS to the injury databases. One chart uses the NASCAR injury classification. It discusses providing a context for the level of risk inherent in the Orion landings in terms that people understand and have a sense for: for example, is the risk of injury during an Orion landing roughly the same as, better than, or worse than an aircraft carrier landing, a NASCAR crash, or a helicopter crash? Crash and injury data from NASCAR and the Indy Racing League (IRL) were reviewed, as were injury data from the Air Force, Navy, and Army. Injury risks from past NASA programs and the Soyuz program are also reviewed. The work is an attempt to formulate a recommendation to the Orion Project for an acceptable level of injury risk associated with nominal and off-nominal landing cases. The presentation also discusses data mining and the use of the data to validate the NASA Operationally-Relevant Injury Scale (NORIS) and Military Operationally-Relevant Injury Scale (MORIS), the development of injury risk criteria, the types of data that are required, NASCAR modeling techniques and crash data, and comparison with the Brinkley model. The development of injury risk curves for each biodynamic response parameter is discussed. One of the main outcomes of this work is to establish an accurate anthropomorphic test device (ATD) that can be used to measure human tolerances.

  1. Update on NASA Microelectronics Activities

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Sampson, Michael J.; Casey, Megan; Lauenstein, Jean-Marie

    2017-01-01

    Mission Statement: The NASA Electronic Parts and Packaging (NEPP) Program provides NASA's leadership for developing and maintaining guidance for the screening, qualification, test, and usage of EEE parts by NASA as well as in collaboration with other government agencies and industry. NASA Space Technology Mission Directorate (STMD): "STMD rapidly develops, demonstrates, and infuses revolutionary, high-payoff technologies through transparent, collaborative partnerships, expanding the boundaries of the aerospace enterprise." Mission Statement: The Space Environments Testing Management Office (SETMO) will identify, prioritize, and manage a select suite of Agency key capabilities/assets that are deemed to be essential to the future needs of NASA or the nation, including some capabilities that lack an adequate business base over the budget horizon. The NESC mission is to perform value-added independent testing, analysis, and assessments of NASA's high-risk projects to ensure safety and mission success. NASA Space Environments and Avionics Fellows as well as Radiation and EEE Parts Community of Practice (CoP) leads.

  2. Overview of the NASA Subsonic Rotary Wing Aeronautics Research Program in Rotorcraft Crashworthiness

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Kellas, Sotiris; Fuchs, Yvonne T.

    2009-01-01

    This paper provides an overview of rotorcraft crashworthiness research being conducted at NASA Langley Research Center under sponsorship of the Subsonic Rotary Wing (SRW) Aeronautics Program. The research is focused in two areas: development of an externally deployable energy attenuating concept and improved prediction of rotorcraft crashworthiness. The deployable energy absorber (DEA) is a composite honeycomb structure, with a unique flexible hinge design that allows the honeycomb to be packaged and remain flat until needed for deployment. The capabilities of the DEA have been demonstrated through component crush tests and vertical drop tests of a retrofitted fuselage section onto different surfaces or terrain. The research on improved prediction of rotorcraft crashworthiness is focused in several areas including simulating occupant responses and injury risk assessment, predicting multi-terrain impact, and utilizing probabilistic analysis methods. A final task is to perform a system-integrated simulation of a full-scale helicopter crash test onto a rigid surface. A brief description of each research task is provided along with a summary of recent accomplishments.

  3. Overview of the NASA Subsonic Rotary Wing Aeronautics Research Program in Rotorcraft Crashworthiness

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fuchs, Yvonne T.; Kellas, Sotiris

    2008-01-01

    This paper provides an overview of rotorcraft crashworthiness research being conducted at NASA Langley Research Center under sponsorship of the Subsonic Rotary Wing (SRW) Aeronautics Program. The research is focused in two areas: development of an externally deployable energy attenuating concept and improved prediction of rotorcraft crashworthiness. The deployable energy absorber (DEA) is a composite honeycomb structure, with a unique flexible hinge design that allows the honeycomb to be packaged and remain flat until needed for deployment. The capabilities of the DEA have been demonstrated through component crush tests and vertical drop tests of a retrofitted fuselage section onto different surfaces or terrain. The research on improved prediction of rotorcraft crashworthiness is focused in several areas including simulating occupant responses and injury risk assessment, predicting multi-terrain impact, and utilizing probabilistic analysis methods. A final task is to perform a system-integrated simulation of a full-scale helicopter crash test onto a rigid surface. A brief description of each research task is provided along with a summary of recent accomplishments.

  4. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. 
To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
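    The Monte Carlo scheme the abstract describes can be sketched in a few lines. The toy below is not the IMM: the condition names, incidence rates, evacuation probabilities, and the assumed 10x benefit of treatment are all invented for illustration, standing in for the iMED inputs.

```python
import random

# Toy IMM-style Monte Carlo mission simulation. Conditions, rates, and
# probabilities are illustrative assumptions, not actual iMED inputs.
CONDITIONS = {
    # name: (incidence per mission-day, P(evacuation) if untreated)
    "back_pain":    (0.010, 0.00),
    "kidney_stone": (0.001, 0.30),
}

def simulate_mission(days, treated, rng):
    """Simulate one mission; return True if an evacuation occurred."""
    for _ in range(days):
        for _name, (rate, p_evac_untreated) in CONDITIONS.items():
            if rng.random() < rate:  # a medical event occurs today
                # Treatment resources cut evacuation risk (assumed 10x).
                p_evac = 0.1 * p_evac_untreated if treated else p_evac_untreated
                if rng.random() < p_evac:
                    return True
    return False

def estimate_evac_probability(trials, days, treated, seed=0):
    rng = random.Random(seed)
    evacs = sum(simulate_mission(days, treated, rng) for _ in range(trials))
    return evacs / trials

p_untreated = estimate_evac_probability(5000, days=180, treated=False)
p_treated = estimate_evac_probability(5000, days=180, treated=True)
```

    Comparing the two estimates is the same kind of trade study the abstract mentions, e.g. assessing the impact of eliminating resources from the medical kit.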

  5. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the probabilistic finite element analysis is used to describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
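    The structure of such a simulation (uncertain inputs propagated through a response model to an SCF distribution) can be sketched as below. The closed-form SCF expression is a hypothetical stand-in for the coupled composite-mechanics/finite-element codes, and the nominal values and scatter are assumptions.

```python
import random
import statistics

# Toy probabilistic SCF simulation: sample uncertain inputs, push them
# through a response model, and summarize the resulting SCF distribution.
def scf_model(stiffness_ratio, eccentricity):
    # Hypothetical relation: SCF grows with local stiffness mismatch
    # and with load eccentricity.
    return 3.0 * (1.0 + 0.5 * (stiffness_ratio - 1.0)) * (1.0 + eccentricity)

def simulate_scf(n, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        stiffness = rng.gauss(1.0, 0.05)  # local/far-field stiffness ratio
        ecc = abs(rng.gauss(0.0, 0.02))   # load eccentricity
        samples.append(scf_model(stiffness, ecc))
    return sorted(samples)

scfs = simulate_scf(10000)
mean_scf = statistics.mean(scfs)
p95_scf = scfs[int(0.95 * len(scfs))]  # 95th-percentile SCF
```

    The sorted samples give exactly the probability density and cumulative distribution summaries the abstract compares against experiment.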

  6. Fission Power System Technology for NASA Exploration Missions

    NASA Technical Reports Server (NTRS)

    Mason, Lee; Houts, Michael

    2011-01-01

    Under the NASA Exploration Technology Development Program, and in partnership with the Department of Energy (DOE), NASA is conducting a project to mature Fission Power System (FPS) technology. A primary project goal is to develop viable system options to support future NASA mission needs for nuclear power. The main FPS project objectives are as follows: 1) Develop FPS concepts that meet expected NASA mission power requirements at reasonable cost with added benefits over other options. 2) Establish a hardware-based technical foundation for FPS design concepts and reduce overall development risk. 3) Reduce the cost uncertainties for FPS and establish greater credibility for flight system cost estimates. 4) Generate the key products to allow NASA decision makers to consider FPS as a preferred option for flight development. In order to achieve these goals, the FPS project has two main thrusts: concept definition and risk reduction. Under concept definition, NASA and DOE are performing trade studies, defining requirements, developing analytical tools, and formulating system concepts. A typical FPS consists of the reactor, shield, power conversion, heat rejection, and power management and distribution (PMAD). Studies are performed to identify the desired design parameters for each subsystem that allow the system to meet the requirements with reasonable cost and development risk. Risk reduction provides the means to evaluate technologies in a laboratory test environment. Non-nuclear hardware prototypes are built and tested to verify performance expectations, gain operating experience, and resolve design uncertainties.

  7. Application of probabilistic risk assessment: Evaluating remedial alternatives at the Portland Harbor Superfund Site, Portland, Oregon, USA.

    PubMed

    Ruffle, Betsy; Henderson, James; Murphy-Hagan, Clare; Kirkwood, Gemma; Wolf, Frederick; Edwards, Deborah A

    2018-01-01

    A probabilistic risk assessment (PRA) was performed to evaluate the range of potential baseline and postremedy health risks to fish consumers at the Portland Harbor Superfund Site (the "Site"). The analysis focused on risks of consuming fish resident to the Site containing polychlorinated biphenyls (PCBs), given that this exposure scenario and contaminant are the primary basis for US Environmental Protection Agency's (USEPA's) selected remedy per the January 2017 Record of Decision (ROD). The PRA used probability distributions fit to the same data sets used in the deterministic baseline human health risk assessment (BHHRA) as well as recent sediment and fish tissue data to evaluate the range and likelihood of current baseline cancer risks and noncancer hazards for anglers. Areas of elevated PCBs in sediment were identified on the basis of a geospatial evaluation of the surface sediment data, and the ranges of risks and hazards associated with pre- and postremedy conditions were calculated. The analysis showed that less active remediation (targeted to areas with the highest concentrations) compared to the remedial alternative selected by USEPA in the ROD can achieve USEPA's interim risk management benchmarks (cancer risk of 10^-4 and noncancer hazard index [HI] of 10) immediately postremediation for the vast majority of subsistence anglers that consume smallmouth bass (SMB) fillet tissue. In addition, the same targeted remedy achieves USEPA's long-term benchmarks (10^-5 and HI of 1) for the majority of recreational anglers. Additional sediment remediation would result in negligible additional risk reduction due to the influence of background. The PRA approach applied here provides a simple but adaptive framework for analysis of risks and remedial options focused on variability in exposures. It can be updated and refined with new data to evaluate and reduce uncertainty, improve understanding of the Site and target populations, and foster informed remedial decision
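    The variability-focused calculation the abstract describes can be sketched with the standard USEPA-form chronic daily intake equation and sampled input distributions. All distribution parameters and the slope factor below are illustrative assumptions, not Portland Harbor data.

```python
import random

# Hedged sketch of a PRA-style fish-consumption risk calculation:
# lifetime excess cancer risk = chronic daily intake x slope factor.
def cancer_risk(conc_mg_kg, intake_g_day, slope_factor,
                bw_kg=80.0, ef_days=350, ed_years=30, at_days=70 * 365):
    intake_kg_day = intake_g_day / 1000.0
    cdi = conc_mg_kg * intake_kg_day * ef_days * ed_years / (bw_kg * at_days)
    return cdi * slope_factor

def simulate_risks(n, seed=1):
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        conc = rng.lognormvariate(-3.0, 0.8)   # PCB fillet conc. (mg/kg)
        intake = rng.lognormvariate(2.5, 0.9)  # fish intake (g/day)
        risks.append(cancer_risk(conc, intake, slope_factor=2.0))
    return sorted(risks)

risks = simulate_risks(5000)
median_risk = risks[len(risks) // 2]
p95_risk = risks[int(0.95 * len(risks))]
```

    Reporting the median and an upper percentile, rather than a single deterministic point estimate, is what lets the PRA speak to "the vast majority" versus "the majority" of anglers.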

  8. Asteroid-Generated Tsunami and Impact Risk

    NASA Astrophysics Data System (ADS)

    Boslough, M.; Aftosmis, M.; Berger, M. J.; Ezzedine, S. M.; Gisler, G.; Jennings, B.; LeVeque, R. J.; Mathias, D.; McCoy, C.; Robertson, D.; Titov, V. V.; Wheeler, L.

    2016-12-01

    The justification for planetary defense comes from a cost/benefit analysis, which includes risk assessment. The contribution from ocean impacts and airbursts is difficult to quantify and represents a significant uncertainty in our assessment of the overall risk. Our group is currently working toward improved understanding of impact scenarios that can generate dangerous tsunami. The importance of asteroid-generated tsunami research has increased because a new Science Definition Team, at the behest of NASA's Planetary Defense Coordination Office, is now updating the results of a 2003 study on which our current planetary defense policy is based. Our group was formed to address this question on many fronts, including asteroid entry modeling, tsunami generation and propagation simulations, modeling of coastal run-ups, inundation, and consequences, infrastructure damage estimates, and physics-based probabilistic impact risk assessment. We also organized the Second International Workshop on Asteroid Threat Assessment, focused on asteroid-generated tsunami and associated risk (Aug. 23-24, 2016). We will summarize our progress and present the highlights of our workshop, emphasizing its relevance to earth and planetary science. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.

  9. Probabilistic models of cognition: conceptual foundations.

    PubMed

    Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan

    2006-07-01

    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.

  10. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
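    The framework's core move, representing uncertainty about a model parameter as a distribution and updating it with data, can be made concrete with the simplest conjugate example: a Beta prior over an unknown success rate updated by Bernoulli observations. The observation counts below are illustrative.

```python
from fractions import Fraction

# Beta-Bernoulli conjugate updating: uncertainty about an unknown
# success rate is a Beta(alpha, beta) distribution, and observing
# data simply increments the pseudo-counts.
def beta_update(alpha, beta, successes, failures):
    """Posterior Beta parameters after Bernoulli observations."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    return Fraction(alpha, alpha + beta)

a, b = beta_update(1, 1, successes=7, failures=3)  # uniform Beta(1,1) prior
posterior_mean = beta_mean(a, b)  # (1 + 7) / (2 + 10) = 2/3
```

    Probabilistic programming languages automate exactly this kind of update for models far too complex for closed-form conjugacy.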

  11. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  12. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  13. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
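    A numerical routine that reports its own uncertainty is easy to exhibit: Monte Carlo integration returns both an estimate and a standard error. This is a deliberately simple stand-in for the Bayesian quadrature methods the probabilistic-numerics literature develops, used here only to illustrate the "uncertainty in computations" theme.

```python
import math
import random

# Monte Carlo integration that returns (estimate, standard error):
# the uncertainty comes from computing with a finite sample budget.
def mc_integrate(f, a, b, n, seed=0):
    rng = random.Random(seed)
    values = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    estimate = (b - a) * mean
    std_error = (b - a) * math.sqrt(var / n)
    return estimate, std_error

# Integrate sin(x) over [0, pi]; the true value is 2.
est, err = mc_integrate(math.sin, 0.0, math.pi, n=20000)
```

    A downstream consumer can propagate `err` instead of treating `est` as exact, which is the composability argument the abstract closes with.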

  14. An Overview of NASA's Risk of Cardiovascular Disease from Radiation Exposure

    NASA Technical Reports Server (NTRS)

    Patel, Zarana S.; Huff, Janice L.; Simonsen, Lisa C.

    2015-01-01

    The association between high doses of radiation exposure and cardiovascular damage is well established. Patients who have undergone radiotherapy for primary cancers of the head and neck and mediastinal regions have shown increased risk of heart and vascular damage and long-term development of radiation-induced heart disease [1]. In addition, recent meta-analyses of epidemiological data from atomic bomb survivors and nuclear industry workers have also shown that acute and chronic radiation exposure is strongly correlated with an increased risk of circulatory disease at doses above 0.5 Sv [2]. However, these analyses are confounded for lower doses by lifestyle factors, such as drinking, smoking, and obesity. The types of radiation found in the space environment are significantly more damaging than those found on Earth and include galactic cosmic radiation (GCR), solar particle events (SPEs), and trapped protons and electrons. In addition to the low-LET data, only a few studies have examined the effects of heavy ion radiation on atherosclerosis, and at lower, space-relevant doses, the association between exposure and cardiovascular pathology is more varied and unclear. Understanding the qualitative differences in biological responses produced by GCR compared to Earth-based radiation is a major focus of space radiation research and is imperative for accurate risk assessment for long duration space missions. Other knowledge gaps for the risk of radiation-induced cardiovascular disease include the existence of a dose threshold, low dose rate effects, and potential synergies with other spaceflight stressors. The Space Radiation Program Element within NASA's Human Research Program (HRP) is managing the research and risk mitigation strategies for these knowledge gaps. In this presentation, we will review the evidence and present an overview of the HRP Risk of Cardiovascular Disease and Other Degenerative Tissue Effects from Radiation Exposure.

  15. Grid Inertial Response-Based Probabilistic Determination of Energy Storage System Capacity Under High Solar Penetration

    DOE PAGES

    Yue, Meng; Wang, Xiaoyu

    2015-07-01

    It is well-known that responsive battery energy storage systems (BESSs) are an effective means to improve the grid inertial response to various disturbances, including the variability of renewable generation. One of the major issues associated with implementation is the difficulty of determining the required BESS capacity, mainly due to the large amount of inherent uncertainty that cannot be accounted for deterministically. In this study, a probabilistic approach is proposed to properly size the BESS from the perspective of the system inertial response, as an application of probabilistic risk assessment (PRA). The proposed approach enables a risk-informed decision-making process regarding (1) the acceptable level of solar penetration in a given system and (2) the desired BESS capacity (and minimum cost) to achieve an acceptable grid inertial response with a certain confidence level.
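    The confidence-level sizing idea can be sketched by sampling disturbance scenarios and taking a percentile of the required energy. The disturbance distributions and bounds below are illustrative assumptions, not values from the study.

```python
import random

# Hedged sketch of probabilistic BESS sizing: sample solar-loss
# disturbances, compute the energy the battery must supply to ride
# through each, and pick the capacity covering a target fraction
# of scenarios.
def size_bess(n_scenarios, confidence, seed=0):
    rng = random.Random(seed)
    needs = []
    for _ in range(n_scenarios):
        drop_mw = rng.uniform(5.0, 50.0)    # solar generation lost (MW)
        duration_h = rng.uniform(0.1, 1.0)  # time until re-dispatch (h)
        needs.append(drop_mw * duration_h)  # energy to supply (MWh)
    needs.sort()
    return needs[int(confidence * (len(needs) - 1))]

cap50 = size_bess(10000, confidence=0.50)  # median scenario
cap95 = size_bess(10000, confidence=0.95)  # 95% confidence capacity
```

    The gap between the median and 95th-percentile capacities is precisely the cost of the extra confidence that a risk-informed decision weighs.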

  16. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  17. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as the dose-response curve, the integrated consideration of variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable risk or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
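    The distribution-versus-point-estimate argument can be illustrated directly: with an assumed lognormal exposure distribution, the mean can sit below an occupational exposure limit (OEL) while a meaningful fraction of individual exposures still exceed it. The lognormal parameters and the OEL below are illustrative, not from the article.

```python
import random

# Sketch contrasting a point estimate with a probabilistic exposure
# characterization: mean exposure vs. the exceedance fraction.
def exposure_summary(n, oel, seed=0):
    rng = random.Random(seed)
    exposures = [rng.lognormvariate(-0.5, 0.6) for _ in range(n)]  # mg/m^3
    mean_exposure = sum(exposures) / n
    frac_over = sum(1 for e in exposures if e > oel) / n
    return mean_exposure, frac_over

mean_exposure, p_exceed = exposure_summary(50000, oel=1.0)
```

    With these assumed parameters the mean is below the OEL yet roughly a fifth of exposures exceed it, which is exactly the information a binary acceptable/unacceptable decision discards.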

  18. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  19. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, fiber volume ratio, fiber longitudinal modulus, dynamic load, and loading rate are the dominant uncertainties, in that order.

  20. Probabilistic Risk Assessment of Hydraulic Fracturing in Unconventional Reservoirs by Means of Fault Tree Analysis: An Initial Discussion

    NASA Astrophysics Data System (ADS)

    Rodak, C. M.; McHugh, R.; Wei, X.

    2016-12-01

    The development and combination of horizontal drilling and hydraulic fracturing has unlocked unconventional hydrocarbon reserves around the globe. These advances have triggered a number of concerns regarding aquifer contamination and over-exploitation, leading to scientific studies investigating potential risks posed by directional hydraulic fracturing activities. These studies, balanced with potential economic benefits of energy production, are a crucial source of information for communities considering the development of unconventional reservoirs. However, probabilistic quantification of the overall risk posed by hydraulic fracturing at the system level is rare. Here we present the concept of fault tree analysis to determine the overall probability of groundwater contamination or over-exploitation, broadly referred to as the probability of failure. The potential utility of fault tree analysis for the quantification and communication of risks is approached with a general application. However, the fault tree design is robust and can handle various combinations of regional-specific data pertaining to relevant spatial scales, geological conditions, and industry practices where available. All available data are grouped into quantity and quality-based impacts and sub-divided based on the stage of the hydraulic fracturing process in which the data is relevant as described by the USEPA. Each stage is broken down into the unique basic events required for failure; for example, to quantify the risk of an on-site spill we must consider the likelihood, magnitude, composition, and subsurface transport of the spill. The structure of the fault tree described above can be used to render a highly complex system of variables into a straightforward equation for risk calculation based on Boolean logic. This project shows the utility of fault tree analysis for the visual communication of the potential risks of hydraulic fracturing activities on groundwater resources.
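    The Boolean-logic risk calculation reduces to combining basic-event probabilities through AND/OR gates. The sketch below shows that combination under an independence assumption; the pathway names and probabilities are invented for illustration, not values from the study.

```python
# Minimal fault tree sketch: basic events combined through OR/AND
# gates, assuming independent events.
def p_or(*probs):
    """P(at least one of several independent events)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*probs):
    """P(all of several independent events)."""
    r = 1.0
    for p in probs:
        r *= p
    return r

# Top event: groundwater contamination. It occurs if either pathway
# succeeds, and each pathway needs all of its basic events to occur.
p_spill_path = p_and(0.05, 0.10)  # spill occurs AND transport reaches aquifer
p_well_path = p_and(0.01, 0.20)   # casing fails AND leak migrates
p_top = p_or(p_spill_path, p_well_path)  # overall probability of failure
```

    Regional data, where available, would replace these placeholder basic-event probabilities without changing the gate structure.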

  1. NASA/CARES dual-use ceramic technology spinoff applications

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1994-01-01

    NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires proper characterization of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/Life, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/N4, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.

  2. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture

  3. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  4. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computational savings with minimal loss in accuracy.
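
    The MPP machinery behind this method can be illustrated with a toy example. The sketch below is not NESSUS: it is a minimal Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for a hypothetical linear limit state in standard normal space, the kind of computation a global model performs to locate the MPP and first-order failure probability.

```python
import math

def hlrf_mpp(g, grad_g, n=2, tol=1e-8, iters=100):
    """Find the Most Probable Point (MPP) of the limit state g(u) = 0 in
    standard normal space via the Hasofer-Lind/Rackwitz-Fiessler iteration."""
    u = [0.1] * n  # start near the origin
    for _ in range(iters):
        gv = g(u)
        gr = grad_g(u)
        norm2 = sum(c * c for c in gr)
        # HL-RF update: project onto the linearized limit state surface.
        scale = (sum(c * ui for c, ui in zip(gr, u)) - gv) / norm2
        u_new = [scale * c for c in gr]
        converged = max(abs(a - b) for a, b in zip(u_new, u)) < tol
        u = u_new
        if converged:
            break
    beta = math.sqrt(sum(ui * ui for ui in u))  # reliability index
    pf = 0.5 * math.erfc(beta / math.sqrt(2))   # first-order P(failure)
    return u, beta, pf

# Toy linear limit state g(u) = 3 - u1 - u2 (failure when g < 0).
g = lambda u: 3.0 - u[0] - u[1]
grad = lambda u: [-1.0, -1.0]
mpp, beta, pf = hlrf_mpp(g, grad)
# For a linear g, beta is the exact distance from the origin: 3/sqrt(2).
```

In the global/local scheme, a coarse model would supply `g`; the refined local model re-evaluates the response only near the MPP found here.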

  5. Global/local methods for probabilistic structural analysis

    NASA Astrophysics Data System (ADS)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computational savings with minimal loss in accuracy.

  6. Limiting Future Collision Risk to Spacecraft: An Assessment of NASA's Meteoroid and Orbital Debris Programs

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Over the past 50 years, various NASA communities have contributed significantly to maturing NASA's meteoroid and orbital debris (MMOD) programs to their current state. As a result of these community efforts, and to NASA's credit, NASA's MMOD programs and models are now widely used and respected by the providers and users of both government and commercial satellites, nationally as well as internationally. Satellites have been redesigned to protect critical components from MMOD damage by moving critical components from exterior surfaces to deep inside a satellite's structure. Orbits are monitored and altered to minimize the risk of collision with tracked orbital debris. MMOD shielding added to the International Space Station (ISS) protects critical components and astronauts from potentially catastrophic damage that might result from smaller, untracked debris and meteoroid impacts. The space shuttle, while it orbited Earth, whether docked to the ISS or not, was optimally oriented to protect its fragile thermal protection and thermal radiation systems from MMOD damage. In addition, astronauts inspected its thermal protection system for MMOD damage before the shuttle reentered Earth's atmosphere. Orion, NASA's capsule to carry astronauts to low Earth orbit, includes designs to mitigate the threat of MMOD damage and provide increased safety to the crew. When a handful of reasonable assumptions are used in NASA's MMOD models, scenarios are uncovered which conclude that the current orbital debris environment has already reached a "tipping point": the amount of debris currently in orbit, in terms of both the population of large debris objects and the overall mass of debris, has reached a threshold where it will continually collide with itself, further increasing the population of orbital debris. This increase will lead to corresponding increases in spacecraft failures, which will only create more feedback into the system, increasing the debris population

  7. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g., whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output in order to optimize the power available for experiments.
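
    The kind of uncertainty propagation described above can be sketched with a Monte Carlo loop: sample the random variables, evaluate the response, and summarize the output distribution. The array area, flux, efficiency, and pointing-error figures below are illustrative assumptions, not ISS values.

```python
import math
import random
import statistics

random.seed(42)

def power_output(flux, efficiency, pointing_err_deg):
    """Hypothetical simplified array model: output scales with solar flux,
    conversion efficiency, and the cosine of the sun-pointing error."""
    area_m2 = 1000.0  # assumed array area, not an ISS figure
    return area_m2 * flux * efficiency * math.cos(math.radians(pointing_err_deg))

# Random variables: uncertain environment and component performance.
samples = []
for _ in range(20000):
    flux = random.gauss(1361.0, 5.0)    # W/m^2, solar flux variation
    eff = random.gauss(0.14, 0.005)     # conversion efficiency uncertainty
    err = abs(random.gauss(0.0, 2.0))   # deg, tracking/sensor tolerance
    samples.append(power_output(flux, eff, err) / 1000.0)  # kW

mean_kw = statistics.mean(samples)   # central estimate of available power
sd_kw = statistics.pstdev(samples)   # spread induced by the uncertainties
```

The standard deviation of the output quantifies how much margin must be held back when allocating power to experiments.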

  8. Dream project: Applications of earth observations to disaster risk management

    NASA Astrophysics Data System (ADS)

    Dyke, G.; Gill, S.; Davies, R.; Betorz, F.; Andalsvik, Y.; Cackler, J.; Dos Santos, W.; Dunlop, K.; Ferreira, I.; Kebe, F.; Lamboglia, E.; Matsubara, Y.; Nikolaidis, V.; Ostoja-Starzewski, S.; Sakita, M.; Verstappen, N.

    2011-01-01

    The field of disaster risk management is relatively new and takes a structured approach to managing uncertainty related to the threat of natural and man-made disasters. Disaster risk management consists primarily of risk assessment and the development of strategies to mitigate disaster risk. This paper discusses how growing Earth observation data and information technology capabilities can contribute to disaster risk management, particularly in Belize. The paper presents the results and recommendations of a project conducted by an international and interdisciplinary team of experts at the 2009 session of the International Space University at NASA Ames Research Center (California, USA). The aim is to explore the combination of current, planned and potential space-aided, airborne, and ground-based Earth observation tools, the emergence of powerful new web-based and mobile data management tools, and how this combination can support and improve the emerging field of disaster risk management. The starting point of the project was the World Bank's Comprehensive Approach to Probabilistic Risk Assessment (CAPRA) program, focused in Central America. This program was used as a test bed to analyze current space technologies used in risk management and to develop new strategies and tools to be applied in other regions around the world.

  9. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The age of the underlying reference indicates that nothing fundamental has been done since that time.

  10. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
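
    The effect of migration uncertainty on prediction-interval width can be illustrated with a toy Monte Carlo projection. The growth and migration figures below are invented for illustration and are not the UN model; the point is only that treating migration as a random variable widens the interval relative to a deterministic migration assumption.

```python
import random

random.seed(1)

def project(pop0, years, migration_sd):
    """One stochastic trajectory: growth rate and net migration vary yearly.
    Illustrative numbers only."""
    pop = pop0
    for _ in range(years):
        growth = random.gauss(0.005, 0.002)          # fertility/mortality uncertainty
        migration = random.gauss(0.5, migration_sd)  # net migrants, millions/year
        pop = pop * (1.0 + growth) + migration
    return pop

def interval_width(migration_sd, n=5000):
    """Width of the 95% prediction interval for final population (millions)."""
    finals = sorted(project(330.0, 30, migration_sd) for _ in range(n))
    return finals[int(0.975 * n)] - finals[int(0.025 * n)]

w_det = interval_width(0.0)   # deterministic migration, as in the UN approach
w_prob = interval_width(1.0)  # probabilistic migration
# Adding migration uncertainty widens the prediction interval.
```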

  11. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
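
    The 100- and 500-year amplitudes above correspond to annual exceedance probabilities of roughly 1% and 0.2%. Under a Poisson arrival assumption, these convert to probabilities over any exposure window, as sketched below.

```python
import math

def exceedance_prob(return_period_years, exposure_years):
    """Probability of at least one exceedance during an exposure window,
    assuming events arrive as a Poisson process with rate 1/T."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# Annual exceedance probabilities for the 100- and 500-year tsunamis.
annual_100 = exceedance_prob(100.0, 1.0)  # ~0.00995, quoted as ~1%
annual_500 = exceedance_prob(500.0, 1.0)  # ~0.002, quoted as ~0.2%

# A "100-year" event still has a sizeable chance of occurring at least
# once over a 50-year design life.
life_100 = exceedance_prob(100.0, 50.0)   # ~0.39
```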

  12. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  13. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  14. The NASA Electronic Parts and Packaging (NEPP) Program: An Overview

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Sampson, Michael J.

    2016-01-01

    This presentation provides an overview of the NEPP Program. The NEPP mission is to provide guidance to NASA for the selection and application of microelectronics technologies; improve understanding of the risks related to the use of these technologies in the space environment; and ensure that appropriate research is performed to meet NASA mission assurance needs. NEPP's goals are to provide customers with appropriate and cost-effective risk knowledge to aid in: selection and application of microelectronics technologies; improved understanding of risks related to the use of these technologies in the space environment; appropriate evaluations to meet NASA mission assurance needs; guidelines for test and application of parts technologies in space; and assurance infrastructure and support for technologies in use by NASA space systems.

  15. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  16. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
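
    The multi-source aggregation described above can be sketched as a sum of source-specific exceedance rates. The rates and conditional probabilities below are illustrative stand-ins, not real PTHA outputs.

```python
import math

# Hypothetical sources: (mean annual rate, P(intensity > threshold | event)).
# The conditional probabilities stand in for inundation-modeling results.
sources = [
    (1 / 500.0, 0.60),   # nearby subduction zone: rare but severe at the site
    (1 / 50.0, 0.05),    # distant source: frequent, usually small at the site
    (1 / 2000.0, 0.80),  # large landslide-generated tsunami
]

# Aggregate annual rate of exceeding the threshold: sum over sources.
rate = sum(nu * p for nu, p in sources)

# Poisson probability of at least one exceedance in a 50-year exposure time;
# repeating this over a range of thresholds traces out a hazard curve.
p_50yr = 1.0 - math.exp(-rate * 50.0)
```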

  17. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  18. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2011-01-01

    A methodology to compute probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/±45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
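
    The multifactor-interaction form models each degradation effect as a power of a normalized distance from a final (failure) value. The sketch below shows one plausible reading of that form; the factor structure is an assumption for illustration, and the temperatures, cycle counts, and exponents are invented, uncalibrated numbers.

```python
def mfim_factor(current, reference, final, exponent):
    """One factor of a multifactor-interaction style degradation model:
    property ratio [(A_F - A)/(A_F - A_0)]^a for a single effect A, where
    A_0 is the reference condition and A_F the final (failure) condition."""
    return ((final - current) / (final - reference)) ** exponent

# Illustrative values only: matrix strength degraded by elevated temperature
# and accumulated mechanical cycles acting together (factors multiply).
temp_factor = mfim_factor(current=120.0, reference=21.0, final=216.0,
                          exponent=0.5)   # degrees C
cycle_factor = mfim_factor(current=1e5, reference=1.0, final=1e7,
                           exponent=0.25)  # load cycles
strength_ratio = temp_factor * cycle_factor  # fraction of reference strength
```

In a probabilistic analysis, the exponents and final values would themselves be random variables, which is what drives the sensitivity results quoted above.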

  19. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/±45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  20. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  1. A probabilistic storm surge risk model for the German North Sea and Baltic Sea coast

    NASA Astrophysics Data System (ADS)

    Grabbert, Jan-Henrik; Reiner, Andreas; Deepen, Jan; Rodda, Harvey; Mai, Stephan; Pfeifer, Dietmar

    2010-05-01

    The German North Sea coast is highly exposed to storm surges. Due to its concave, bay-like shape mainly orientated to the North-West, cyclones from Western, North-Western and Northern directions, together with the astronomical tide, cause storm surges that accumulate water in the German Bight. Due to the existence of widespread low-lying areas (below 5 m above mean sea level) behind the defenses, large areas with large economic values are exposed to coastal flooding, including cities like Hamburg and Bremen. The occurrence of extreme storm surges in the past, such as the 1962 event that took about 300 lives and caused widespread flooding, and the 1976 event, raised awareness and led to a redesign of the coastal defenses, which provide a good level of protection for today's conditions. Nevertheless, the risk of flooding remains. Moreover, an amplification of storm surge risk can be expected under the influence of climate change. The Baltic Sea coast is also exposed to storm surges, which are caused by other meteorological patterns. The influence of the astronomical tide is quite low; instead, high water levels are induced by strong winds alone. Since the exceptionally extreme event of 1872, the storm surge hazard has been more or less forgotten. Although such an event is very unlikely to happen, it is not impossible. Storm surge risk is currently (almost) non-insurable in Germany. The potential risk is difficult to quantify as there are almost no historical losses available, and premiums are likewise difficult to assess. Therefore a new storm surge risk model is being developed to provide a basis for a probabilistic quantification of potential losses from coastal inundation. The model is funded by the GDV (German Insurance Association) and is planned to be used within the German insurance sector. Results might be used for a discussion of insurance cover for storm surge. The model consists of a probabilistic, event-driven hazard module and a vulnerability module, as well as an exposure interface and a financial

  2. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists that explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. 
These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  3. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing the projected system state can be simplified in some cases. Common approximations and novel methods are compared for over-constrained and lightly constrained domains within an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed during execution. The improvement is more significant for larger problems and problems with higher resource subscription levels, but diminishes as the system is allowed to accept higher risk levels.
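
    Representing durations as parametric distributions makes some projections closed-form: the completion time of a serial chain of independent, normally distributed activity durations is itself normal, so the probability of meeting a deadline needs no sampling. A minimal sketch with invented activity durations:

```python
import math

def p_deadline_met(durations, deadline):
    """Probability that a serial chain of independent, normally distributed
    activity durations finishes by the deadline. The sum of independent
    normals is normal, so this is a single CDF evaluation."""
    mean = sum(m for m, s in durations)
    sd = math.sqrt(sum(s * s for m, s in durations))
    z = (deadline - mean) / sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# Three activities with uncertain durations: (mean, std-dev) in minutes.
plan = [(30.0, 5.0), (45.0, 8.0), (20.0, 4.0)]
p = p_deadline_met(plan, deadline=110.0)  # total mean 95, sd ~10.25
```

An iterative repair planner could use `1 - p` as the expected violation measure for a deadline constraint, repairing the plan when the risk exceeds an accepted level.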

  4. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties in the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.
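
    A minimal version of such a Monte Carlo conflict probe can be sketched as follows: sample the relative position at closest approach from the trajectory-model error distribution and count separation violations. The 5 nmi figure is the usual en-route horizontal separation standard, but the miss distance and error magnitudes are invented for illustration.

```python
import math
import random

random.seed(7)

def conflict_probability(n=20000):
    """Monte Carlo estimate of conflict probability for an aircraft pair
    whose predicted relative position carries Gaussian along-track and
    cross-track error (illustrative numbers, not an operational design)."""
    sep_min_nm = 5.0       # required horizontal separation
    nominal_miss_nm = 8.0  # predicted miss distance at closest approach
    hits = 0
    for _ in range(n):
        # Relative-position error combines both aircraft's prediction errors.
        dx = random.gauss(nominal_miss_nm, 2.0)
        dy = random.gauss(0.0, 2.0)
        if math.hypot(dx, dy) < sep_min_nm:
            hits += 1
    return hits / n

p_conflict = conflict_probability()
```

Alerting when `p_conflict` exceeds a chosen probability threshold is the "direct approach": the performance measure itself drives the decision, instead of a hand-tuned geometric threshold standing in for it.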

  5. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. Its goal was to advance innovative research and education in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI), along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor, and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab, established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed using the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  6. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; so far, however, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
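
    The core idea of treating epistemic location uncertainty inside a Monte Carlo loss loop can be sketched in a few lines: for each simulation, draw a location for every risk item whose coordinates are unknown, evaluate the hazard there, and accumulate the portfolio loss. The hazard function, portfolio, and damage ratios below are invented; the paper's actual framework and sampling criteria are more elaborate.

```python
import random

def simulate_portfolio_loss(items, hazard_at, n_sims=5000, seed=1):
    """For each simulation, draw a location for every risk item whose
    coordinates are unknown (uniform over its candidate sites, e.g. the
    settlements of a postal zone), evaluate the hazard there, and sum
    the losses. Returns the list of simulated portfolio losses."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_sims):
        total = 0.0
        for item in items:
            loc = item["loc"] if item["loc"] is not None \
                  else rng.choice(item["candidates"])
            total += item["value"] * hazard_at(loc)
        losses.append(total)
    return losses

# Toy hazard: mean damage ratio depends only on the site.
damage = {"A": 0.10, "B": 0.02}
portfolio = [
    {"value": 1e6, "loc": "A", "candidates": None},         # known location
    {"value": 1e6, "loc": None, "candidates": ["A", "B"]},  # unknown location
]
losses = simulate_portfolio_loss(portfolio, lambda site: damage[site])
mean_loss = sum(losses) / len(losses)
```

    The spread of `losses` across simulations is exactly the loss variability attributable to the unknown coordinates, which is what the portfolio studies in the abstract quantify.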

  7. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
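
    One of the bounding ideas such reports review can be shown in a few lines: when only the marginal probabilities are known and nothing is assumed about dependence, the Fréchet-Hoeffding bounds limit conjunctions and disjunctions. This minimal sketch is a standard textbook construction, not code from the report.

```python
def frechet_bounds_and(p_a, p_b):
    """Bounds on P(A and B) when only the marginals are known and
    no dependence assumption is made (Frechet-Hoeffding bounds)."""
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

def frechet_bounds_or(p_a, p_b):
    """Corresponding bounds on P(A or B)."""
    lower = max(p_a, p_b)
    upper = min(1.0, p_a + p_b)
    return lower, upper

lo_and, hi_and = frechet_bounds_and(0.3, 0.2)
lo_or, hi_or = frechet_bounds_or(0.3, 0.2)
independent = 0.3 * 0.2  # the independence assumption picks one point inside the bounds
```

    Assuming independence (the "pervasive myth" case) silently selects a single value from an interval that honest dependence bounding leaves open.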

  8. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. 
To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
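
    The simulation loop the IMM abstract describes can be caricatured in a short Python sketch. This is a deliberately simplified stand-in, not the IMM itself: the condition parameters, supply counts, and evacuation probabilities below are invented, whereas real IMM inputs come from the iMED database.

```python
import random

def simulate_mission(conditions, mission_days, n_sims=10000, seed=2):
    """Each condition has a per-day incidence rate, a time-lost value for
    the treated (best) and untreated (worst) case, and a supply count in
    the medical kit. Occurrences beyond the available supplies go
    untreated. Returns mean mission time lost (days) and P(evacuation)."""
    rng = random.Random(seed)
    time_lost, evacs = [], 0
    for _ in range(n_sims):
        lost, evac = 0.0, False
        for c in conditions:
            n_events = sum(rng.random() < c["rate_per_day"]
                           for _ in range(mission_days))
            for k in range(n_events):
                treated = k < c["supplies"]
                lost += c["lost_treated"] if treated else c["lost_untreated"]
                if not treated and rng.random() < c["p_evac_untreated"]:
                    evac = True
        time_lost.append(lost)
        evacs += evac
    return sum(time_lost) / n_sims, evacs / n_sims

# One hypothetical condition over a 180-day mission with one kit item.
conditions = [
    {"rate_per_day": 0.01, "supplies": 1, "lost_treated": 0.5,
     "lost_untreated": 3.0, "p_evac_untreated": 0.2},
]
mean_lost, p_evac = simulate_mission(conditions, mission_days=180)
```

    Even this toy version reproduces the structure of the IMM outputs: outcomes bounded by the fully treated and untreated extremes, driven by whether resources remain available when an event occurs.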

  9. NASA 2010 Pharmacology Evidence Review

    NASA Technical Reports Server (NTRS)

    Steinberg, Susan

    2011-01-01

    In 2008, the Institute of Medicine reviewed NASA's Human Research Program Evidence in assessing the Pharmacology risk identified in NASA's Human Research Program Requirements Document (PRD). Since this review there was a major reorganization of the Pharmacology discipline within the HRP, as well as a re-evaluation of the Pharmacology evidence. This panel is being asked to review the latest version of the Pharmacology Evidence Report. Specifically, this panel will: (1) Appraise the descriptions of the human health-related risk in the HRP PRD. (2) Assess the relevance and comprehensiveness of the evidence in identifying potential threats to long-term space missions. (3) Assess the associated gaps in knowledge and identify additional areas for research as necessary.

  10. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for the probabilistic assessment of Army systems and will launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable, and reliable product development.

  11. NASA Case Sensitive Review and Audit Approach

    NASA Astrophysics Data System (ADS)

    Lee, Arthur R.; Bacus, Thomas H.; Bowersox, Alexandra M.; Newman, J. Steven

    2005-12-01

    As an Agency involved in high-risk endeavors, NASA continually reassesses its commitment to engineering excellence and compliance with requirements. As a component of NASA's continual process improvement, the Office of Safety and Mission Assurance (OSMA) established the Review and Assessment Division (RAD) [1] to conduct independent audits to verify compliance with Agency requirements that affect safe and reliable operations. In implementing its responsibilities, RAD benchmarked various approaches for conducting audits, focusing on organizations that, like NASA, operate in high-risk environments, where seemingly inconsequential departures from safety, reliability, and quality requirements can have catastrophic impacts on the public, NASA personnel, high-value equipment, and the environment. The approach used by the U.S. Navy Submarine Program [2] was considered the most fruitful framework for the invigorated OSMA audit processes. Additionally, the benchmarking activity revealed that not all audits are conducted using just one approach or even with the same objectives. This led to the concept of discrete, unique "audit cases."

  12. An empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
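
    The core of such an empirical system — fit a regression with the CO2-equivalent concentration as predictor, then turn the residual spread into a probabilistic forecast — can be sketched as follows. The data and threshold are synthetic, and the real system uses multiple predictors and proper hindcast validation; this shows only the single-predictor skeleton.

```python
import math
import statistics

def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, sigma),
    where sigma is the residual standard deviation."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = ybar - b * xbar
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return a, b, statistics.pstdev(resid)

def prob_above(a, b, sigma, x_new, threshold):
    """P(y > threshold | x_new) under a normal error model."""
    z = (threshold - (a + b * x_new)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

# Synthetic hindcast data: seasonal temperature trending with CO2-eq (ppm).
co2 = [340, 350, 360, 370, 380, 390, 400, 410]
temp = [14.1, 14.0, 14.3, 14.2, 14.5, 14.4, 14.7, 14.6]
a, b, sigma = fit_linear(co2, temp)
p_warm = prob_above(a, b, sigma, x_new=420, threshold=14.5)
```

    Because the trend predictor carries most of the signal, the forecast probability of exceeding a fixed historical threshold rises steadily — the "climate change signal as a source of skill" the abstract emphasizes.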

  13. Issues in NASA program and project management

    NASA Technical Reports Server (NTRS)

    Hoban, Francis T. (Editor)

    1990-01-01

    This volume is the third in an ongoing series on aerospace project management at NASA. Articles in this volume cover the attitude of the program manager, program control and performance measurement, risk management, cost plus award fee contracting, lessons learned from the development of the Far Infrared Absolute Spectrometer (FIRAS), small projects management, and age distribution of NASA scientists and engineers. A section on resources for NASA managers rounds out the publication.

  14. Issues in NASA program and project management

    NASA Technical Reports Server (NTRS)

    Hoban, Francis T. (Editor)

    1991-01-01

    This volume is the third in an ongoing series on aerospace project management at NASA. Articles in this volume cover the attitude of the program manager, program control and performance measurement, risk management, cost plus award fee contracting, lessons learned from the development of the Far Infrared Absolute Spectrometer (FIRAS), small projects management, and age distribution of NASA scientists and engineers. A section on resources for NASA managers rounds out the publication.

  15. Probabilistic Geoacoustic Inversion in Complex Environments

    DTIC Science & Technology

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments. Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria, BC. ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  16. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  17. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  18. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  19. Evidence-based risk communication: a systematic review.

    PubMed

    Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A

    2014-08-19

    Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
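
    The numerical formats the review compares (absolute risk reduction, relative risk reduction, number needed to treat) are simple arithmetic on the same two event rates, which is exactly why their framing effects matter. A minimal sketch with hypothetical rates:

```python
def risk_summaries(risk_control, risk_treatment):
    """Absolute risk reduction (ARR), relative risk reduction (RRR),
    and number needed to treat (NNT) for the same pair of event rates."""
    arr = risk_control - risk_treatment
    rrr = arr / risk_control
    nnt = 1.0 / arr
    return arr, rrr, nnt

# One treatment effect framed three ways: baseline risk 4%, treated 3%.
arr, rrr, nnt = risk_summaries(0.04, 0.03)
# A "25% relative reduction" and "1 in 100 patients benefit" describe
# the identical effect, yet the review finds they are understood and
# acted on very differently.
```

    This is why the review can find absolute-risk presentations more accurate and NNT presentations harder to understand even though all three are algebraically equivalent.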

  20. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  1. NASA pyrotechnically actuated systems program

    NASA Technical Reports Server (NTRS)

    Schulze, Norman R.

    1993-01-01

    The Office of Safety and Mission Quality initiated a Pyrotechnically Actuated Systems (PAS) Program in FY-92 to address problems experienced with pyrotechnically actuated systems and devices used both on the ground and in flight. The PAS Program will provide the technical basis for NASA's projects to incorporate new technological developments in operational systems. The program will accomplish that objective by developing/testing current and new hardware designs for flight applications and by providing a pyrotechnic data base. This marks the first applied pyrotechnic technology program funded by NASA to address pyrotechnic issues. The PAS Program has been structured to address the results of a survey of pyrotechnic device and system problems with the goal of alleviating or minimizing their risks. Major program initiatives include the development of a Laser Initiated Ordnance System, a pyrotechnic systems data base, NASA Standard Initiator model, a NASA Standard Linear Separation System and a NASA Standard Gas Generator. The PAS Program sponsors annual aerospace pyrotechnic systems workshops.

  2. bayesPop: Probabilistic Population Projections

    PubMed Central

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  3. Application of Probabilistic Methods to Assess Risk Due to Resonance in the Design of J-2X Rocket Engine Turbine Blades

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; DeHaye, Michael; DeLessio, Steven

    2011-01-01

    The LOX-hydrogen J-2X rocket engine, which is proposed for use as an upper-stage engine for numerous earth-to-orbit and heavy-lift launch vehicle architectures, is presently in the design phase and will move shortly to the initial development test phase. Analysis of the design has revealed numerous potential resonance issues with hardware in the turbomachinery turbine-side flow path. The analysis of the fuel pump turbine blades requires particular care because resonant failure of the blades, which rotate in excess of 30,000 revolutions per minute (RPM), could be catastrophic for the engine and the entire launch vehicle. This paper describes a series of probabilistic analyses performed to assess the risk of failure of the turbine blades due to resonant vibration during past and present test series. Some significant results are that the probability of failure during a single complete engine hot-fire test is low (1%) because of the small likelihood of resonance, but that the probability increases to around 30% for a more focused turbomachinery-only test, because all speeds will be ramped through and there is a greater likelihood of dwelling at more speeds. These risk calculations have been invaluable to program management in deciding whether risk-reduction methods such as dampers are necessary immediately or whether testing can proceed before the risk-reduction hardware is ready.
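
    The kind of calculation described can be sketched as the probability that an uncertain blade natural frequency falls inside the frequency band swept by an engine-order excitation. The frequencies, scatter, engine order, and speed ranges below are illustrative assumptions, not J-2X values; the point is only that a wide speed ramp sweeps a wider band and therefore carries a higher resonance probability than a narrow dwell.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def prob_resonance(f_mean, f_sd, order, speed_lo_rpm, speed_hi_rpm):
    """Probability that the blade natural frequency (modelled as normal)
    falls inside the excitation band swept by an engine order as the
    pump runs from speed_lo to speed_hi (excitation Hz = order * rpm / 60)."""
    band_lo = order * speed_lo_rpm / 60.0
    band_hi = order * speed_hi_rpm / 60.0
    return normal_cdf(band_hi, f_mean, f_sd) - normal_cdf(band_lo, f_mean, f_sd)

# Hot-fire test: narrow dwell band around the rated speed.
p_hotfire = prob_resonance(6400.0, 150.0, 12, 30500, 30700)
# Component test: the full ramp sweeps a much wider band.
p_ramp = prob_resonance(6400.0, 150.0, 12, 28000, 33000)
```

    The asymmetry between the two results mirrors the abstract's finding that a full-speed-range turbomachinery test carries far more resonance risk than a hot-fire test dwelling near one speed.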

  4. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  5. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
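
    The mechanism can be sketched by constructing the agents' transition matrix so that its stationary distribution equals the desired swarm density; each agent then samples its next bin independently from its current row. The Metropolis-Hastings construction below is a generic way to obtain such a matrix, in the spirit of the paper but not its exact synthesis method; the bins, neighbor graph, and target density are invented.

```python
def metropolis_transition(desired, neighbors):
    """Build a row-stochastic transition matrix whose stationary
    distribution is `desired`, using Metropolis-Hastings with proposals
    uniform over each bin's neighbors."""
    n = len(desired)
    M = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in neighbors[i]:
            prop = 1.0 / len(neighbors[i])
            accept = min(1.0, (desired[j] * len(neighbors[i])) /
                              (desired[i] * len(neighbors[j])))
            M[i][j] = prop * accept
        M[i][i] = 1.0 - sum(M[i][j] for j in neighbors[i])
    return M

def propagate(dist, M, steps):
    """Evolve the swarm density: dist_new[j] = sum_i dist[i] * M[i][j]."""
    for _ in range(steps):
        dist = [sum(dist[i] * M[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return dist

desired = [0.1, 0.2, 0.3, 0.4]                    # target density over 4 bins
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # bins on a line
M = metropolis_transition(desired, neighbors)
dist = propagate([1.0, 0.0, 0.0, 0.0], M, 200)    # all agents start in bin 0
```

    Propagating the density (rather than simulating individual agents) shows the convergence; an individual agent simply samples its next bin from the row `M[current]`. The self-repair property follows because any perturbed density converges back to the same stationary distribution.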

  6. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures such as aircraft, rotorcraft, and aviation transports. Both metallic and composite materials are widely used and affected by fatigue damage. Large uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads, and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. The developed model is then extended to include structural geometry effects (notch effect), environmental effects (corroded specimens), and manufacturing effects (shot peening). Due to their inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials; a composite fatigue life prediction model is therefore proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models for metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
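
    The metallic-material part of such a methodology (an equivalent initial flaw size fed into a crack growth law) can be caricatured with a Monte Carlo over a closed-form Paris-law life. The material constants and distributions below are invented for illustration, and the dissertation's actual models and IFORM propagation are far more elaborate.

```python
import math
import random

def cycles_to_failure(a0, ac, C, m, delta_sigma, Y=1.0):
    """Closed-form integration of the Paris law da/dN = C*(dK)^m with
    dK = Y*delta_sigma*sqrt(pi*a), valid for m != 2. Crack sizes in m,
    stress range in MPa, C in (m/cycle)/(MPa*sqrt(m))^m."""
    geom = C * (Y * delta_sigma * math.sqrt(math.pi)) ** m
    expo = 1.0 - m / 2.0
    return (ac ** expo - a0 ** expo) / (geom * expo)

def life_distribution(n_samples=20000, seed=3):
    """Monte Carlo on the equivalent initial flaw size (lognormal) and
    the Paris coefficient C (lognormal); returns sampled lives (cycles)."""
    rng = random.Random(seed)
    lives = []
    for _ in range(n_samples):
        a0 = math.exp(rng.gauss(math.log(1e-4), 0.3))  # initial flaw, m
        C = math.exp(rng.gauss(math.log(1e-11), 0.2))  # Paris coefficient
        lives.append(cycles_to_failure(a0, ac=0.01, C=C, m=3.0,
                                       delta_sigma=100.0))
    return lives

lives = sorted(life_distribution())
b10_life = lives[len(lives) // 10]  # 10th-percentile (B10) life
```

    Lower quantiles of the simulated life distribution, rather than the mean, are what a reliability-based maintenance schedule would be set against.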

  7. NASA Engineering Safety Center NASA Aerospace Flight Battery Systems Working Group 2007 Proactive Task Status

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.

    2007-01-01

    In 2007, the NASA Engineering Safety Center (NESC) chartered the NASA Aerospace Flight Battery Systems Working Group to bring forth and address critical battery-related performance/manufacturing issues for NASA and the aerospace community. A suite of tasks identifying and addressing issues related to Ni-H2 and Li-ion battery chemistries was submitted and selected for implementation. The currently NESC-funded tasks are: (1) Wet Life of Ni-H2 Batteries; (2) Binding Procurement; (3) NASA Lithium-Ion Battery Guidelines, comprising (3a) Li-Ion Performance Assessment and (3b) Li-Ion Guidelines Document, which includes (3b-i) Assessment of Applicability of Pouch Cells for Aerospace Missions, (3b-ii) High Voltage Risk Assessment, and (3b-iii) Safe Charge Rates for Li-Ion Cells; (4) Availability of Source Material for Li-Ion Cells; and (5) NASA Aerospace Battery Workshop. This presentation provides a brief overview of the tasks in the 2007 plan and serves as an introduction to more detailed discussions of each of the specific tasks.

  8. Nasa-wide Standard Administrative Systems

    NASA Technical Reports Server (NTRS)

    Schneck, P.

    1984-01-01

    Factors to be considered in developing agency-wide standard administrative systems for NASA include uniformity of hardware and software; centralization vs. decentralization; risk exposure; and models for software development.

  9. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  10. On a true value of risk

    NASA Astrophysics Data System (ADS)

    Kozine, Igor

    2018-04-01

    The paper suggests looking at probabilistic risk quantities and concepts through the prism of accepting one of two views: whether a true value of risk exists or not. It is argued that discussions until now have been primarily focused on closely related topics that are different from the topic of the current paper. The paper examines the operational consequences of adhering to each of the views and contrasts them. It is demonstrated that the operational differences, in terms of which probabilistic measures can be assessed and how they can be interpreted, are tangible. In particular, this concerns prediction intervals, the use of Bayes' rule, models of complete ignorance, hierarchical models of uncertainty, assignment of probabilities over a possibility space and interpretation of derived probabilistic measures. Behavioural implications of favouring either view are also briefly described.

  11. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
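
    The Poisson material above lends itself to a short simulation. The following is a minimal sketch; the rate, time window, and trial count are arbitrary illustrative choices, not values from the lecture notes:

```python
import random

def poisson_process(rate, t_end, rng):
    """Simulate arrival times of a Poisson process up to t_end.

    Inter-arrival (waiting) times are independent Exponential(rate)
    draws, which is what makes the process memoryless (Markovian).
    """
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # exponential waiting time
        if t > t_end:
            return arrivals
        arrivals.append(t)

rng = random.Random(42)
rate, t_end, trials = 2.0, 10.0, 2000
counts = [len(poisson_process(rate, t_end, rng)) for _ in range(trials)]
mean_count = sum(counts) / trials
# For a Poisson process, E[N(t)] = rate * t = 20 in this setup.
print(round(mean_count, 1))
```

    The simulated mean arrival count should sit close to rate times window length, illustrating the link between exponential waiting times and Poisson counts.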

  12. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

    A national standard for engineering design for tsunami effects has not existed before, and this significant risk is mostly ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure. This standard for tsunami loads and effects will apply to designs as part of tsunami preparedness. The provisions will also be significant as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models. This ensures the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map

  13. Error Discounting in Probabilistic Category Learning

    PubMed Central

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less affected by errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666
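
    The error-discounting idea described above can be sketched with a delta-rule learner whose learning rate decays over training. The update rule, decay factor, and feedback schedule below are illustrative assumptions, not the models actually fitted in the paper:

```python
import random

def learn(p_feedback, trials, discount, rng):
    """Delta-rule tracker whose learning rate is attenuated
    (discounted) as training proceeds."""
    estimate, lr = 0.5, 0.5
    history = []
    for t in range(trials):
        outcome = 1.0 if rng.random() < p_feedback(t) else 0.0
        estimate += lr * (outcome - estimate)  # delta rule
        lr *= discount                         # error discounting
        history.append(estimate)
    return history

rng = random.Random(0)

def shifted(t):
    # Feedback probability shifts from 0.8 to 0.2 halfway through.
    return 0.8 if t < 200 else 0.2

h = learn(shifted, trials=400, discount=0.98, rng=rng)
# A heavily discounted learning rate makes the estimate formed early
# persist after the shift, i.e. learning the new contingency is slowed.
print(round(h[199], 2), round(h[-1], 2))
```

    The estimate tracks the first feedback probability, then responds only sluggishly after the shift, mirroring the slowed post-shift learning the experiments report.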

  14. Organizational Risk and Opportunity Management: Concepts and Processes for NASA's Consideration

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher

    2016-01-01

    The focus of this report is on the development of a framework and overall approach that serves the interests of nonprofit and Government organizations like NASA that focus on developing and/or applying new technology (henceforth referred to as organizations like NASA). These interests tend to place emphasis on performing services and achieving scientific and technical gains more than on achieving financial investment goals, which is the province of commercial enterprises. In addition, the objectives of organizations like NASA extend to institutional development and maintenance, financial health, legal and reputational protection, education and partnerships, and mandated milestone achievements. This report discusses the philosophical underpinnings of Organizational Risk and Opportunity Management (OROM) for organizations like NASA, the integration of OROM with existing management processes, and the nature of the activities that are performed to implement OROM within this context. The proposed framework includes a set of core principles that would be essential to any successful OROM approach, along with some features that are currently under development and are continuing to evolve. The report is intended to foster discussion of OROM at NASA in order to reach a consensus on the optimum approach for the agency.

  15. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    PubMed

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.

  16. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair

    PubMed Central

    Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon

    2017-01-01

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768

  17. Identification of failure type in corroded pipelines: a bayesian probabilistic approach.

    PubMed

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented to reduce increased hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.
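
    The two-step scheme described, classifying a failure as leakage or rupture near a probabilistic boundary, reduces to an application of Bayes' rule. In this sketch the feature (a burst-pressure ratio), the class-conditional Gaussians, and the prior are all invented stand-ins, not the distributions fitted in the paper:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_rupture(burst_ratio, prior_rupture=0.3):
    """Bayes' rule on one feature. The class-conditional Gaussians
    and the prior are hypothetical stand-ins for distributions that
    would be fitted to burst-test data."""
    like_rupture = gauss_pdf(burst_ratio, mu=1.0, sigma=0.15)
    like_leakage = gauss_pdf(burst_ratio, mu=1.6, sigma=0.25)
    num = like_rupture * prior_rupture
    return num / (num + like_leakage * (1.0 - prior_rupture))

# Near the probabilistic boundary the posterior stays uncertain;
# away from it, the model commits to one failure type.
print(round(posterior_rupture(1.0), 3), round(posterior_rupture(1.8), 3))
```

    The posterior moves smoothly between the two classes, which is exactly what makes the boundary region "uncertain" rather than a hard threshold.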

  18. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  19. Adequacy of the default values for skin surface area used for risk assessment and French anthropometric data by a probabilistic approach.

    PubMed

    Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C

    2017-08-01

    The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) is a document dedicated to ensuring the safety of European consumers. This contains useful data for risk assessment such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variations. The default SSA value was derived from a study on the Dutch population, which is known to be one of the tallest nations in the world. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to SCCS, suggest that the default value for SSA used in the quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Structural Life and Reliability Metrics: Benchmarking and Verification of Probabilistic Life Prediction Codes

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Soditus, Sherry; Hendricks, Robert C.; Zaretsky, Erwin V.

    2002-01-01

    Over the past two decades there has been considerable effort by NASA Glenn and others to develop probabilistic codes to predict with reasonable engineering certainty the life and reliability of critical components in rotating machinery and, more specifically, in the rotating sections of airbreathing and rocket engines. These codes have, to a very limited extent, been verified with relatively small bench rig type specimens under uniaxial loading. Because of the small and very narrow database the acceptance of these codes within the aerospace community has been limited. An alternate approach to generating statistically significant data under complex loading and environments simulating aircraft and rocket engine conditions is to obtain, catalog and statistically analyze actual field data. End users of the engines, such as commercial airlines and the military, record and store operational and maintenance information. This presentation describes a cooperative program between the NASA GRC, United Airlines, USAF Wright Laboratory, U.S. Army Research Laboratory and Australian Aeronautical & Maritime Research Laboratory to obtain and analyze these airline data for selected components such as blades, disks and combustors. These airline data will be used to benchmark and compare existing life prediction codes.

  1. From Mice and Men to Earth and Space: Joint NASA-NCI Workshop on Lung Cancer Risk Resulting from Space and Terrestrial Radiation

    PubMed Central

    Shay, Jerry W.; Cucinotta, Francis A.; Sulzman, Frank M.; Coleman, C. Norman; Minna, John D.

    2011-01-01

    On June 27–28, 2011, scientists from the National Cancer Institute (NCI), NASA, and academia met in Bethesda to discuss major lung cancer issues confronting each organization. For NASA, available data suggest that lung cancer is the largest potential cancer risk from space travel for both men and women, and quantitative risk assessment information for mission planning is needed. In space the radiation risk is from high energy and charge (HZE) nuclei (such as Fe) and high energy protons from solar flares and not from gamma radiation. By contrast the NCI is endeavoring to estimate the increased lung cancer risk from the potential wide-spread implementation of computed tomography (CT) screening in individuals at high risk for developing lung cancer based on the National Lung Cancer Screening Trial (NLST). For the latter, exposure will be x-rays from CT scans from the screening (which uses “low dose” CT scans) and also from follow-up scans used to evaluate abnormalities found during initial screening. Topics discussed included the risk of lung cancer arising after HZE particle, proton, and low dose Earth radiation exposure. The workshop examined preclinical models, epidemiology, molecular markers, “omics” technology, radiobiology issues, and lung stem cells (LSC) that relate to the development of lung cancer. PMID:21900398

  2. From mice and men to earth and space: joint NASA-NCI workshop on lung cancer risk resulting from space and terrestrial radiation.

    PubMed

    Shay, Jerry W; Cucinotta, Francis A; Sulzman, Frank M; Coleman, C Norman; Minna, John D

    2011-11-15

    On June 27-28, 2011, scientists from the National Cancer Institute (NCI), NASA, and academia met in Bethesda to discuss major lung cancer issues confronting each organization. For NASA, available data suggest that lung cancer is the largest potential cancer risk from space travel for both men and women and quantitative risk assessment information for mission planning is needed. In space, the radiation risk is from high energy and charge (HZE) nuclei (such as Fe) and high-energy protons from solar flares and not from gamma radiation. In contrast, the NCI is endeavoring to estimate the increased lung cancer risk from the potential widespread implementation of computed tomographic (CT) screening in individuals at high risk for developing lung cancer based on the National Lung Cancer Screening Trial (NLST). For the latter, exposure will be X-rays from CT scans from the screening (which uses "low-dose" CT scans) and also from follow-up scans used to evaluate abnormalities found during initial screening. Topics discussed included the risk of lung cancer arising after HZE particle, proton, and low-dose exposure to Earth's radiation. The workshop examined preclinical models, epidemiology, molecular markers, "omics" technology, radiobiology issues, and lung stem cells that relate to the development of lung cancer. ©2011 AACR

  3. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
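
    Of the sampling-based methods the review lists, Monte Carlo simulation is the most widely used. A minimal sketch of propagating parameter uncertainty into a risk distribution; every distribution, unit, and the dose-response slope here is invented for illustration, not DBP data:

```python
import random

def simulate_risk(n, rng):
    """Propagate parameter uncertainty through a toy risk model:
    risk = concentration * intake * slope / body_weight.
    Every distribution and the slope value are invented."""
    risks = []
    for _ in range(n):
        conc = rng.lognormvariate(0.0, 0.5)     # mg/L in tap water
        intake = rng.normalvariate(2.0, 0.3)    # L/day consumed
        weight = rng.normalvariate(70.0, 10.0)  # kg body weight
        slope = 1e-3                            # risk per mg/kg/day
        risks.append(conc * intake * slope / weight)
    return sorted(risks)

rng = random.Random(1)
risks = simulate_risk(10_000, rng)
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
# The percentile spread summarizes the propagated uncertainty.
print(f"median={median:.2e}  p95={p95:.2e}")
```

    Reporting the risk as a distribution (here via its percentiles) rather than a single point value is the core contrast the review draws between probabilistic and deterministic assessment.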

  4. Probabilistic framework for the estimation of the adult and child toxicokinetic intraspecies uncertainty factors.

    PubMed

    Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S

    2003-12-01

    Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. 
The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of
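
    The headline quantity, the ratio of the 95th to the 50th percentile of the population dose distribution, is simple to compute once the distribution is in hand. In this sketch a lognormal sample is a placeholder for the toxicokinetic-model outputs; the sigma value is an arbitrary assumption:

```python
import random

def percentile(sorted_xs, q):
    """Nearest-rank percentile of a pre-sorted sample."""
    return sorted_xs[min(len(sorted_xs) - 1, int(q * len(sorted_xs)))]

rng = random.Random(7)
# Placeholder: annual-average target-tissue doses drawn lognormal,
# standing in for the toxicokinetic-model population distribution.
doses = sorted(rng.lognormvariate(0.0, 0.6) for _ in range(50_000))
uf_tk = percentile(doses, 0.95) / percentile(doses, 0.50)
# For a lognormal with sigma = 0.6 the true ratio is exp(1.645 * 0.6),
# about 2.7, the same order as the 3.2x default toxicokinetic factor.
print(round(uf_tk, 2))
```

    The same percentile ratio, taken per age group from the model outputs, is what gives the age-varying toxicokinetic UF the article describes.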

  5. An Updated Assessment of NASA Ultra-Efficient Engine Technologies

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Jones, Scott M.

    2005-01-01

    NASA's Ultra Efficient Engine Technology (UEET) project features advanced aeropropulsion technologies that include highly loaded turbomachinery, an advanced low-NOx combustor, high-temperature materials, and advanced fan containment technology. A probabilistic system assessment is performed to evaluate the impact of these technologies on aircraft CO2 (or equivalent fuel burn) and NOx reductions. A 300-passenger aircraft, with two 396-kN thrust (85,000-lb) engines is chosen for the study. The results show that a large subsonic aircraft equipped with the current UEET technology portfolio has very high probabilities of meeting the UEET minimum success criteria for CO2 reduction (-12% from the baseline) and LTO (landing and takeoff) NOx reductions (-65% relative to the 1996 International Civil Aviation Organization rule).

  6. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. 
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which
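
    The key move in the abstract, replacing hard K-means assignments with soft Fuzzy C-means memberships, can be sketched in one dimension. The data, fuzziness value, and deterministic endpoint initialization below are illustrative assumptions, not the EEG pipeline itself:

```python
import random

def fcm(xs, k, m, iters):
    """Minimal 1-D Fuzzy C-means: soft memberships u[i][j] replace the
    hard assignments of K-means; m > 1 controls the fuzziness."""
    # Deterministic initialization: centers spread across the data range.
    centers = [min(xs) + (max(xs) - min(xs)) * j / (k - 1) for j in range(k)]
    u = []
    for _ in range(iters):
        # Membership update: inverse-distance ratios raised to 2/(m-1).
        u = [[1.0 / sum(((abs(x - centers[j]) + 1e-12) /
                         (abs(x - centers[l]) + 1e-12)) ** (2 / (m - 1))
                        for l in range(k))
              for j in range(k)] for x in xs]
        # Center update: membership-weighted means.
        centers = [sum(u[i][j] ** m * xs[i] for i in range(len(xs))) /
                   sum(u[i][j] ** m for i in range(len(xs)))
                   for j in range(k)]
    return centers, u

rng = random.Random(2)
# Two well-separated 1-D features standing in for GFP-peak maps (invented).
xs = [rng.gauss(0.0, 0.3) for _ in range(100)] + \
     [rng.gauss(3.0, 0.3) for _ in range(100)]
centers, u = fcm(xs, k=2, m=2.0, iters=25)
# Unlike hard K-means labels, each u[i] is a probability-like membership.
print([round(c, 1) for c in centers], [round(v, 2) for v in u[0]])
```

    The soft memberships are what carry the labeling uncertainty forward, which is why the authors argue FCM-style clustering suits microstate analysis better than a hard assignment.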

  7. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks

    PubMed Central

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. 
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which

  8. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  9. Probabilistic objective functions for margin-less IMRT planning

    NASA Astrophysics Data System (ADS)

    Bohoslavsky, Román; Witte, Marnix G.; Janssen, Tomas M.; van Herk, Marcel

    2013-06-01

    We present a method to implement probabilistic treatment planning of intensity-modulated radiation therapy using custom software plugins in a commercial treatment planning system. Our method avoids the definition of safety-margins by directly including the effect of geometrical uncertainties during optimization when objective functions are evaluated. Because the shape of the resulting dose distribution implicitly defines the robustness of the plan, the optimizer has much more flexibility than with a margin-based approach. We expect that this added flexibility helps to automatically strike a better balance between target coverage and dose reduction for surrounding healthy tissue, especially for cases where the planning target volume overlaps organs at risk. Prostate cancer treatment planning was chosen to develop our method, including a novel technique to include rotational uncertainties. Based on population statistics, translations and rotations are simulated independently following a marker-based IGRT correction strategy. The effects of random and systematic errors are incorporated by first blurring and then shifting the dose distribution with respect to the clinical target volume. For simplicity and efficiency, dose-shift invariance and a rigid-body approximation are assumed. Three prostate cases were replanned using our probabilistic objective functions. To compare clinical and probabilistic plans, an evaluation tool was used that explicitly incorporates geometric uncertainties using Monte-Carlo methods. The new plans achieved similar or better dose distributions than the original clinical plans in terms of expected target coverage and rectum wall sparing. Plan optimization times were only about a factor of two higher than in the original clinical system. 
In conclusion, we have developed a practical planning tool that enables margin-less probability-based treatment planning with acceptable planning times, achieving the first system that is feasible for clinical

  10. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for the third party to access a transmission facility. From this structure, to efficiently manage the utilization of the transmission network, a new definition about Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC)'s definition, ATC depends on several parameters, i.e. Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper is focused on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. The TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limitation, various cases of transmission transfer capability and its related probabilistic nature can be calculated. By applying the proposed risk-analysis concept, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual ability of the network which may be an alternative choice for system operators to make an appropriate decision in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
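
    The probabilistic TRM idea, sampling the uncertainty sources and reserving the margin that keeps the overload probability below a chosen risk level, can be sketched with Monte Carlo. The error distributions and the risk level are invented numbers, not the paper's models:

```python
import random

def trm(n, risk_level, rng):
    """Smallest margin (MW) such that the probability of the combined
    uncertainty exceeding the margin stays below risk_level."""
    shortfalls = []
    for _ in range(n):
        load_err = rng.normalvariate(0.0, 50.0)   # MW, load forecast error
        limit_err = rng.normalvariate(0.0, 30.0)  # MW, line-limit error
        shortfalls.append(load_err - limit_err)   # net headroom consumed
    shortfalls.sort()
    # TRM = quantile of the combined uncertainty at the (1 - risk) level.
    return max(0.0, shortfalls[int((1.0 - risk_level) * n)])

rng = random.Random(3)
margin = trm(n=20_000, risk_level=0.05, rng=rng)
print(round(margin, 1))
```

    Tightening the accepted risk level directly raises the reserved margin, which is the trade-off the paper frames for system operators in a competitive market.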

  11. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. In contrast to current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and the inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.
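
    The linear quadratic validation step mentioned above rests on the fact that, for an LQ problem, the value function V(x) = xᵀPx with P from the Riccati equation satisfies the Bellman equation exactly; a DHP critic would target its gradient λ(x) = 2Px. The following sketch checks this numerically for an invented 2-state system (the matrices are not from the paper):

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Hypothetical discrete-time 2-state, 1-input system (illustrative only)
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])

# Steady-state Riccati solution: V(x) = x' P x is the optimal cost-to-go
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal feedback gain

def bellman_rhs(x):
    """One-step cost plus cost-to-go under the optimal action u = -Kx."""
    u = -K @ x
    x_next = A @ x + B @ u
    return float(x @ Q @ x + u @ R @ u + x_next @ P @ x_next)

x = np.array([1.0, -0.5])
lhs = float(x @ P @ x)   # value function at x, to be compared with bellman_rhs(x)
```

    The probabilistic DHP method of the paper additionally propagates model and controller uncertainty through this recursion, which the deterministic sketch above omits.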

  12. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
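
    For intuition, the CDF computation can be illustrated with a purely elastic Monte-Carlo version of the cylinder problem using the Lamé solution. This is a deliberate simplification: the paper's analysis is elastic-plastic (which is why the Weibull yield stress matters there), whereas the elastic radial stress below depends only on pressure and geometry. Radii, pressure statistics, and the stress level are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000

# Hypothetical geometry and loading (illustrative values only)
a, b = 1.0, 2.0                   # inner and outer radii
r = 1.5                           # radius at which stress is evaluated
p = rng.normal(100.0, 10.0, N)    # internal pressure, normally distributed

# Elastic Lame solution for radial stress (compressive, hence negative)
sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

# Empirical CDF of the radial stress at a stress level of interest
level = -30.0
cdf_at_level = np.mean(sigma_r <= level)
```

    A program like NESSUS obtains the same kind of CDF for the nonlinear (plastic) response, where no closed-form mapping from inputs to stress exists, using methods such as AMV instead of brute-force sampling.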

  13. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
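
    The missile-simulation chain described above (injection, flight equations, impact location) can be sketched with a planar flight model under quadratic drag. The drag parameter, injection height, and the speed/angle distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def missile_range(v0, angle, cd_am=1e-3, g=9.81, dt=0.01):
    """Integrate planar flight of a tornado-borne missile with quadratic drag.

    cd_am lumps drag coefficient, air density, and area-to-mass ratio.
    Simple explicit Euler integration; returns horizontal travel distance.
    """
    x, y = 0.0, 10.0                                 # injected 10 m above ground
    vx, vy = v0 * np.cos(angle), v0 * np.sin(angle)
    while y > 0.0:
        v = np.hypot(vx, vy)
        ax = -cd_am * v * vx                         # drag opposes velocity
        ay = -g - cd_am * v * vy
        x, y = x + vx * dt, y + vy * dt
        vx, vy = vx + ax * dt, vy + ay * dt
    return x

# Monte-Carlo over injection speed and angle (hypothetical distributions)
speeds = rng.normal(50.0, 10.0, 500).clip(min=1.0)
angles = rng.uniform(0.0, np.pi / 4, 500)
ranges = np.array([missile_range(v, th) for v, th in zip(speeds, angles)])
```

    In a full analysis such sampled trajectories would be combined with a wind-field model and an impact model to produce missile hazard and fragility curves for the structure of interest.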

  14. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  15. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is built on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], which accounts for maintenance in damage-tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment under various uncertainties, such as variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by the probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan is changed, RPI can repeatedly reuse a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. To fully validate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Since such MC simulations are time-consuming, they were run in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
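
    The key efficiency idea, reusing one set of baseline crack growth histories for many inspection plans, can be sketched as follows. The growth law, its parameters, the critical crack size, and the log-logistic POD model are all invented stand-ins, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 20_000, 100                   # histories, flight blocks

# Baseline crack-growth histories from a single Monte-Carlo run
# (hypothetical exponential growth with random rate and initial size)
a0 = rng.lognormal(mean=-4.0, sigma=0.5, size=N)    # initial flaw size, mm
rate = rng.lognormal(mean=-3.0, sigma=0.3, size=N)
t = np.arange(T)
a = a0[:, None] * np.exp(rate[:, None] * t)         # crack size vs. time
a_crit = 10.0                                       # failure threshold, mm

def pod(size, a50=1.0, slope=2.0):
    """Log-logistic probability of detection (a generic NDI model)."""
    return 1.0 / (1.0 + (a50 / np.maximum(size, 1e-9)) ** slope)

def prob_failure(a, inspections):
    """Reuse baseline histories for any inspection plan (the RPI idea)."""
    fails = a[:, -1] >= a_crit
    miss = np.ones(len(a))           # P(crack missed at every inspection)
    for ti in inspections:
        miss *= 1.0 - pod(a[:, ti])
    return np.mean(fails * miss)

pf_none = prob_failure(a, [])        # no inspections
pf_two = prob_failure(a, [40, 70])   # two-inspection plan, same histories
```

    Changing the inspection schedule only changes the POD weighting, so many maintenance plans can be evaluated from one simulation of the growth histories, which is the source of the speedup over rerunning MCS.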

  16. Processing of probabilistic information in weight perception and motor prediction.

    PubMed

    Trampenau, Leif; van Eimeren, Thilo; Kuhtz-Buschbeck, Johann

    2017-02-01

    We studied the effects of probabilistic cues, i.e., of information of limited certainty, in the context of an action task (GL: grip-lift) and of a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task provided the expected heaviness related to each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other one with the WP task. The four different probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information was critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. On the other hand, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented.

  17. Probabilistic Assessment of a CMC Turbine Vane

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Brewer, Dave; Mital, Subodh K.

    2004-01-01

    In order to demonstrate the advanced CMC technology under development within the Ultra Efficient Engine Technology (UEET) program, it has been planned to fabricate, test, and analyze an all-CMC turbine vane made of a SiC/SiC composite material. The objective was to utilize a 5-II satin weave SiC/CVI SiC and MI SiC matrix material, developed in-house under the Enabling Propulsion Materials (EPM) program, to design and fabricate a stator vane that can successfully endure 1000 hours of operation under engine service conditions. The design requirements are for the vane to withstand a maximum of 2400 F within the substrate and a hot-surface temperature of 2700 F with the aid of an in-house developed Environmental/Thermal Barrier Coating (EBC/TBC) system. The vane will be tested in a High Pressure Burner Rig at the NASA Glenn Research Center. This rig is capable of simulating the engine service environment. The present paper focuses on a probabilistic assessment of the vane. The material stress/strain relationship shows a bilinear behavior with a distinct knee corresponding to what is often termed the first matrix cracking strength. This is a critical life-limiting consideration for these materials. The vane is therefore designed such that the maximum stresses are within this limit, so that the structure is never subjected to loads beyond the first matrix cracking strength. Any violation of this design requirement is considered a failure. Probabilistic analysis is performed in order to determine the probability of failure based on this assumption. In the analysis, material properties, strength, and pressures are considered random variables. The variations in properties and strength are based on actual experimental data generated in-house. The mean values for the pressures on the upper surface and the lower surface are known, but their distributions are unknown. 
In the present analysis the pressures are considered normally distributed with a nominal
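
    The failure criterion above (stress exceeding the first matrix cracking strength, with properties, strength, and pressures random) reduces to a stress-strength interference calculation. The sketch below uses entirely invented distributions, since the real analysis draws on in-house experimental data:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 200_000

# Illustrative distributions only -- the real analysis uses measured data
strength = rng.normal(160.0, 12.0, N)        # first matrix cracking, MPa
pressure = rng.normal(0.6, 0.05, N)          # net surface pressure, MPa
stress_factor = rng.normal(200.0, 15.0, N)   # geometry/property factor

# Design rule from the abstract: exceeding the first matrix cracking
# strength counts as failure
stress = stress_factor * pressure
p_fail = np.mean(stress > strength)
```

    In practice the stress would come from a finite element model of the vane rather than the single lumped `stress_factor` used here; the probabilistic machinery is otherwise the same.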

  18. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    NASA Astrophysics Data System (ADS)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of the potential future risk to human receptors from disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site: probability distributions capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and these are propagated through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model that supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. It does not represent any particular site and is meant to be a generic example. 
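
    The release → concentration → dose chain with Monte-Carlo propagation can be sketched compactly. In keeping with the fictitious-site spirit of the abstract, every distribution and coefficient below is invented:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 10_000

# Hypothetical parameter distributions for a fictitious site
inventory = rng.lognormal(np.log(1e3), 0.5, N)    # Bq buried
release_frac = rng.uniform(1e-4, 1e-3, N)         # fraction released per year
dilution = rng.lognormal(np.log(1e6), 1.0, N)     # aquifer dilution, L/yr
intake = 730.0                                    # L/yr drinking water
dose_coeff = 2.8e-7                               # Sv/Bq (illustrative)

# Propagate each realization through release -> concentration -> dose
conc = inventory * release_frac / dilution        # Bq/L at the well
dose_mSv = conc * intake * dose_coeff * 1e3       # mSv/yr

# Compare the output distribution with a performance objective
objective = 0.25                                  # mSv/yr (illustrative)
p_exceed = np.mean(dose_mSv > objective)
summary = np.percentile(dose_mSv, [5, 50, 95])
```

    Reporting the full percentile summary alongside `p_exceed`, rather than a single deterministic dose, is what exposes the uncertainty that the abstract argues a deterministic analysis would hide.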

  19. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  20. A global empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
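
    The regression-based probabilistic forecast and its CRPS-style verification can be sketched on synthetic data. The CO2 trend, ENSO-like index, and coefficients below are invented; only the structure (OLS on a CO2-led predictor set, Gaussian forecast distribution, closed-form CRPS) mirrors the approach described:

```python
import numpy as np
from math import erf, exp, pi, sqrt

rng = np.random.default_rng(6)

# Synthetic "hindcast" setup: seasonal-mean temperature driven by a
# CO2-equivalent trend plus an ENSO-like index plus noise (all invented)
years = np.arange(1961, 2014)
co2 = 340.0 + 1.7 * (years - 1961)            # ppm, primary predictor
enso = rng.normal(0.0, 1.0, len(years))       # secondary predictor
temp = 0.01 * co2 + 0.3 * enso + rng.normal(0.0, 0.3, len(years))

# Multiple linear regression fit (ordinary least squares)
X = np.column_stack([np.ones_like(co2), co2, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
mean_fc = X @ beta
sigma_fc = np.std(temp - mean_fc, ddof=X.shape[1])

def crps_gaussian(mu, sigma, obs):
    """Closed-form CRPS for a Gaussian forecast distribution."""
    z = (obs - mu) / sigma
    pdf = exp(-z * z / 2.0) / sqrt(2.0 * pi)
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / sqrt(pi))

crps = np.mean([crps_gaussian(m, sigma_fc, o) for m, o in zip(mean_fc, temp)])
```

    In the paper's setup the skill score compares this CRPS against a climatological reference forecast; here the in-sample mean CRPS simply illustrates the probabilistic verification step.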