Science.gov

Sample records for probabilistic risk-based management

  1. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine
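
    As a hedged illustration of the probabilistic idea described above (not a reproduction of NEPP or WATE, and with purely illustrative numbers), one can propagate assumed uncertainty distributions for component efficiencies through a simplified Brayton-cycle performance model by Monte Carlo sampling, obtaining a distribution of thermal efficiency instead of a single point-design value:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 20_000

      # Assumed (illustrative) uncertainty in component efficiencies
      eta_c = rng.normal(0.86, 0.01, n)     # compressor isentropic efficiency
      eta_t = rng.normal(0.90, 0.01, n)     # turbine isentropic efficiency

      # Fixed cycle parameters (illustrative): gamma, inlet and turbine-inlet
      # temperatures [K], overall pressure ratio
      gamma, T1, T3, pr = 1.4, 288.0, 1600.0, 25.0
      tau = pr ** ((gamma - 1.0) / gamma)   # ideal cycle temperature ratio

      w_c = T1 * (tau - 1.0) / eta_c        # compressor work per unit cp
      w_t = eta_t * T3 * (1.0 - 1.0 / tau)  # turbine work per unit cp
      q_in = T3 - (T1 + w_c)                # heat added per unit cp
      eta_th = (w_t - w_c) / q_in           # cycle thermal efficiency

      print(f"mean thermal efficiency : {eta_th.mean():.3f}")
      print(f"5th-95th percentile     : {np.percentile(eta_th, [5, 95]).round(3)}")

    The spread of eta_th, rather than its mean alone, is what a decision maker would weigh when judging the risk of a new technology.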

  2. Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris

    2014-08-01

    We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
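
    The risk metric described above reduces to a counting exercise over the synthetic traces. A minimal sketch, with random placeholder traces standing in for the UKCP09-based weather-generator and water-system output:

      import numpy as np

      rng = np.random.default_rng(0)

      # Placeholder ensemble: 1000 synthetic 25-year traces of annual shortage events
      # (in the study these come from a weather generator plus a water resource model)
      n_traces, n_years = 1000, 25
      shortage = rng.random((n_traces, n_years)) < 0.08   # True = shortage year

      planned_los = 1 / 20.0        # planned Level of Service: one shortage per 20 years

      simulated_freq = shortage.mean(axis=1)              # shortage frequency per trace
      p_exceed = (simulated_freq > planned_los).mean()    # risk metric

      print(f"P(shortage frequency exceeds planned Level of Service) = {p_exceed:.2f}")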

  3. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
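
    A hedged sketch of the probabilistic structure (not the published PAPA model or its distributions): sample uncertain dermal-exposure parameters, compute a steady-state Fick's-law flux, and screen parameter importance with rank correlations standing in for the stepwise regression used in the study:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 50_000

      # Illustrative parameter uncertainty distributions
      kp   = rng.lognormal(np.log(1e-3), 0.5, n)   # permeability coefficient [cm/h]
      conc = rng.uniform(0.5, 2.0, n)              # surface concentration [mg/cm^3]
      area = rng.normal(100.0, 10.0, n)            # exposed skin area [cm^2]

      flux = kp * conc * area                      # steady-state uptake rate [mg/h]

      # Simple sensitivity screen: rank correlation of each input with the output
      for name, x in [("kp", kp), ("conc", conc), ("area", area)]:
          rho, _ = stats.spearmanr(x, flux)
          print(f"{name:5s} rank correlation with flux: {rho:+.2f}")

      print(f"median flux {np.median(flux):.2f} mg/h, "
            f"95th percentile {np.percentile(flux, 95):.2f} mg/h")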

  4. Microbial quality of reclaimed water for urban reuses: Probabilistic risk-based investigation and recommendations.

    PubMed

    Chhipi-Shrestha, Gyan; Hewage, Kasun; Sadiq, Rehan

    2017-01-15

    Although Canada has abundant freshwater resources, many cities still experience seasonal water shortage. Supply-side and demand-side management is a core strategy to address this water shortage. Under this strategy, reclaimed water, which the Canadian public is willing to use for non-potable purposes, is an option. However, no universal guidelines exist for reclaimed water use. Despite the federal government's long-term goal to develop guidelines for many water reuse applications, guidelines have only been prescribed for reclaimed water use in toilet and urinal flushing in Canada. At the provincial level, British Columbia (BC) has promulgated guidelines for wide applications of reclaimed water but only at broad class levels. This research has investigated and proposed probabilistic risk-based recommended values for microbial quality of reclaimed water in various non-potable urban reuses. The health risk was estimated by using quantitative microbial risk assessment. Two-dimensional Monte Carlo simulations were used in the analysis to include variability and uncertainty in input data. The proposed recommended values are based on the indicator organism E. coli. The required treatment levels for reuse were also estimated. In addition, the recommended values were successfully applied to three wastewater treatment effluents in the Okanagan Valley, BC, Canada. The health risks associated with other bacterial pathogens (Campylobacter jejuni and Salmonella spp.), viruses (adenovirus, norovirus, and rotavirus), and protozoa (Cryptosporidium parvum and Giardia spp.) were also estimated. The estimated risks indicate the effectiveness of the E. coli-based water quality recommended values. Sensitivity analysis shows the pathogenic E. coli ratio and morbidity are the most sensitive input parameters for all water reuses. The proposed recommended values could be further improved by using national or regional data on water exposures, disease burden per case, and the susceptibility
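
    The two-dimensional Monte Carlo idea can be sketched as nested sampling: an outer loop over uncertainty (e.g. the dose-response parameter) and an inner loop over variability (pathogen concentration and ingested volume). The exponential dose-response model and all numbers below are illustrative placeholders, not the study's values:

      import numpy as np

      rng = np.random.default_rng(2)
      n_unc, n_var = 200, 5_000     # outer (uncertainty) and inner (variability) samples

      annual_risk = np.empty(n_unc)
      for i in range(n_unc):
          # Outer loop: epistemic uncertainty in the dose-response parameter r
          r = rng.lognormal(np.log(0.02), 0.4)
          # Inner loop: variability in concentration [org/L] and ingested volume [L]
          conc = rng.lognormal(np.log(1.0), 1.0, n_var)
          vol  = rng.lognormal(np.log(1e-3), 0.5, n_var)
          p_event = 1.0 - np.exp(-r * conc * vol)               # exponential dose-response
          # 20 exposure events per simulated person and year (illustrative)
          p_person = 1.0 - np.prod(1.0 - p_event.reshape(-1, 20), axis=1)
          annual_risk[i] = p_person.mean()

      print("median annual infection risk  :", np.median(annual_risk))
      print("95th percentile (uncertainty) :", np.percentile(annual_risk, 95))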

  5. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    SciTech Connect

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

    This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The specific region that is within the scope of this study is the Fayetteville Shale Play. This is an unconventional, tight formation, natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play encompasses a region from approximately Fort Smith, AR east to Little Rock, AR approximately 50 miles wide (from North to South). The initial estimates for this field put it almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  6. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  7. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
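
    The decision problem behind the game is commonly formalized as a cost-loss model: protect at cost C whenever the forecast probability p of the damaging event exceeds C/L, and value the forecast by the expected expense it avoids relative to a climatology-only decision. A small sketch with assumed numbers (not the game's actual payoff table):

      import numpy as np

      rng = np.random.default_rng(3)
      C, L = 1.0, 10.0               # protection cost and flood loss (arbitrary units)
      p_clim = 0.15                  # climatological flood frequency

      # Synthetic events with reliable probabilistic forecasts for them
      n = 100_000
      p_fcst = rng.beta(1.5, 8.5, n)             # forecast probabilities (mean ~0.15)
      flood  = rng.random(n) < p_fcst            # outcomes consistent with the forecast

      protect_fcst = p_fcst > C / L              # protect when p > C/L
      protect_clim = p_clim > C / L              # static decision from climatology alone

      expense_fcst    = np.where(protect_fcst, C, flood * L).mean()
      expense_clim    = np.where(protect_clim, C, flood * L).mean()
      expense_perfect = np.where(flood, C, 0.0).mean()

      value = (expense_clim - expense_fcst) / (expense_clim - expense_perfect)
      print(f"mean expense with forecast {expense_fcst:.3f}, without {expense_clim:.3f}")
      print(f"relative economic value of the forecast: {value:.2f}")
      print(f"rational willingness-to-pay per decision: up to {expense_clim - expense_fcst:.3f}")

    Participants' stated willingness-to-pay in the workshops could then be compared against this kind of rational benchmark.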

  8. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
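
    The quoted definition lends itself to a very small sketch: score each piping segment by risk = likelihood x consequence, rank, and note how quickly the cumulative share of total risk saturates (hypothetical segments and numbers):

      # Hypothetical segments: (id, annual failure likelihood, consequence in $)
      segments = [
          ("P-101", 1e-3, 5_000_000),
          ("P-102", 5e-4,   200_000),
          ("P-214", 2e-2,   800_000),
          ("P-307", 1e-4, 9_000_000),
          ("P-415", 3e-3,    50_000),
      ]

      ranked = sorted(((sid, lik * con) for sid, lik, con in segments),
                      key=lambda item: item[1], reverse=True)
      total = sum(risk for _, risk in ranked)

      cum = 0.0
      for sid, risk in ranked:
          cum += risk
          print(f"{sid}: risk {risk:>10,.0f} $/yr, cumulative share {cum / total:6.1%}")

    Even with five made-up segments, the top one or two dominate, which is the effect the paper reports for real facilities.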

  9. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  10. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  11. Uncertainty in environmental risk assessment: implications for risk-based management of river basins.

    PubMed

    Ragas, Ad M J; Huijbregts, Mark A J; Henning-de Jong, Irmgard; Leuven, Rob S E W

    2009-01-01

    Environmental risk assessment is typically uncertain due to different perceptions of the risk problem and limited knowledge about the physical, chemical, and biological processes underlying the risk. The present paper provides a systematic overview of the implications of different types of uncertainty for risk management, with a focus on risk-based management of river basins. Three different types of uncertainty are distinguished: 1) problem definition uncertainty, 2) true uncertainty, and 3) variability. Methods to quantify and describe these types of uncertainty are discussed and illustrated in 4 case studies. The case studies demonstrate that explicit regulation of uncertainty can improve risk management (e.g., by identification of the most effective risk reduction measures, optimization of the use of resources, and improvement of the decision-making process). It is concluded that the involvement of nongovernmental actors as prescribed by the European Union Water Framework Directive (WFD) provides challenging opportunities to address problem definition uncertainty and those forms of true uncertainty that are difficult to quantify. However, the WFD guidelines for derivation and application of environmental quality standards could be improved by the introduction of a probabilistic approach to deal with true uncertainty and a better scientific basis for regulation of variability.

  12. Towards risk-based drought management in the Netherlands: quantifying the welfare effects of water shortage

    NASA Astrophysics Data System (ADS)

    van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein

    2016-04-01

    It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained into integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. Furthermore, for some
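
    The probabilistic combination of drought hazard and impact sketched above can be illustrated with an expected-annual-damage calculation: losses estimated for events of different return periods are integrated over annual exceedance probability (all figures hypothetical):

      import numpy as np

      # Hypothetical drought events: return period [years] and economic loss [million EUR]
      return_period = np.array([2,   5,    10,   25,    50,    100])
      loss          = np.array([0.0, 20.0, 60.0, 150.0, 300.0, 500.0])

      p_exceed = 1.0 / return_period       # annual exceedance probability (decreasing)

      # Expected annual damage: trapezoidal integration of loss over exceedance probability
      dP  = -np.diff(p_exceed)             # positive probability increments
      ead = np.sum(0.5 * (loss[:-1] + loss[1:]) * dP)
      print(f"expected annual damage between the listed events ~ {ead:.1f} million EUR/yr")

    Comparing this number with and without a proposed measure gives the risk reduction that can be weighed against investment cost.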

  13. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  14. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and perform rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can distinguish between uncertainty of contaminant location and actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing, we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  15. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform analysis over long-term horizons - space and time scales are somewhat linked; (ii) profiling of the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with the scale which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time, theme and space-wise) is widened, intrinsic parameters of the project tend to weigh less. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can
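
    A minimal sketch of the kind of Monte Carlo appraisal such a framework performs, comparing a DRM investment against a do-nothing baseline through the distribution of its net benefit (all figures hypothetical):

      import numpy as np

      rng = np.random.default_rng(4)
      n, years, rate = 20_000, 30, 0.05
      discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)

      invest_cost = 80.0            # upfront DRM investment [M$]
      p_disaster  = 0.04            # annual disaster probability
      mitigation  = 0.6             # fraction of loss avoided by the investment
      loss = rng.lognormal(np.log(200.0), 0.5, (n, years))   # loss if a disaster hits [M$]

      events = rng.random((n, years)) < p_disaster
      avoided = (events * loss * mitigation * discount).sum(axis=1)
      net_benefit = avoided - invest_cost

      print(f"mean net benefit   : {net_benefit.mean():7.1f} M$")
      print(f"P(net benefit > 0) : {(net_benefit > 0).mean():.2f}")
      print(f"5th percentile     : {np.percentile(net_benefit, 5):7.1f} M$ (downside)")

    The long horizon and the heavy-tailed loss distribution are what make the net benefit highly volatile, which is the point the abstract makes about profiling gains and losses over long periods.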

  16. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  17. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subjected to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories excluding maintenance-related parameters from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion were conducted for various flight conditions, material properties, inspection scheduling, POD, and repair/replacement strategies. Because MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
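
    A compact Monte Carlo sketch of the kind of baseline simulation RPI reuses (this is plain MCS, not the RPI algorithm itself, and every parameter is hypothetical): crack growth with random initial flaw size and growth coefficient, periodic inspections with a probability-of-detection curve, and repair on detection:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 20_000                              # simulated crack-growth histories
      n_blocks, cycles_per_block = 60, 1_000  # usage blocks
      a_crit = 25.0                           # critical crack length [mm]
      inspect_every = 10                      # inspect every 10 blocks

      def pod(a):
          """Logistic probability of detection versus crack size (illustrative)."""
          return 1.0 / (1.0 + np.exp(-(a - 5.0)))

      a = rng.lognormal(np.log(0.5), 0.4, n)   # initial flaw size [mm]
      C = rng.lognormal(np.log(2e-4), 0.3, n)  # growth coefficient (illustrative units)
      failed = np.zeros(n, dtype=bool)

      for block in range(1, n_blocks + 1):
          grow = C * a ** 1.3 * cycles_per_block        # simplified power-law growth
          a = np.where(failed, a, a + grow)
          failed |= a >= a_crit
          if block % inspect_every == 0:                # inspection and repair
              detected = (~failed) & (rng.random(n) < pod(a))
              a = np.where(detected, 0.5, a)            # repaired cracks reset to 0.5 mm
          if block % 20 == 0:
              print(f"after {block} blocks: P(failure) = {failed.mean():.4f}")

    Re-running such a simulation for every candidate inspection plan is what makes plain MCS expensive; RPI's reuse of baseline histories is the efficiency gain the paper verifies.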

  18. Method for Water Management Considering Long-term Probabilistic Forecasts

    NASA Astrophysics Data System (ADS)

    Hwang, J.; Kang, J.; Suh, A. S.

    2015-12-01

    This research is aimed at predicting the monthly inflow of the Andong-dam basin in South Korea using long-term probabilistic forecasts to apply long-term forecasts to water management. Forecasted Cumulative Distribution Functions (CDFs) of monthly precipitation are plotted by combining the range of monthly precipitation based on proper Probability Density Function (PDF) in past data with probabilistic forecasts in each category. Ensembles of inflow are estimated by entering generated ensembles of precipitation based on the CDFs into the 'abcd' water budget model. The bias and RMSE of the past-data averages relative to observed inflow are compared with those of the forecasted ensembles. In our results, the bias and RMSE of average precipitation in the forecasted ensemble are larger than in past data, whereas the average inflow in the forecasted ensemble is smaller than in past data. This result could be used for reference data to apply long-term forecasts to water management, because of the limit in the number of forecasted data for verification and differences between the Andong-dam basin and the forecasted regions. This research is significant in suggesting a method of applying probabilistic information in climate variables from long-term forecasts to water management in Korea. Original data of a climate model, which produces long-term probabilistic forecasts, should be verified directly as input data of a water budget model in the future, so that a more scientific response in water management against uncertainty of climate change could be reached.
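
    A hedged sketch of the verification step described (synthetic numbers, and unconditioned on the forecast categories): fit a distribution to past monthly precipitation, draw a forecast ensemble from the fitted CDF, and compare bias and RMSE of past data and ensemble against an observation:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)

      past_precip = rng.gamma(shape=4.0, scale=30.0, size=30)   # 30 past Julys [mm], synthetic
      observed = 95.0                                           # observed July precipitation [mm]

      # Fit a gamma distribution to past data and sample an ensemble from it
      shape, loc, scale = stats.gamma.fit(past_precip, floc=0.0)
      ensemble = stats.gamma.rvs(shape, loc=loc, scale=scale, size=1000, random_state=rng)

      for name, values in [("past data", past_precip), ("forecast ensemble", ensemble)]:
          bias = values.mean() - observed
          rmse = np.sqrt(np.mean((values - observed) ** 2))
          print(f"{name:18s}: bias {bias:+7.1f} mm, RMSE {rmse:6.1f} mm")

    In the study the sampled precipitation ensembles are then routed through the 'abcd' water budget model to obtain the corresponding inflow ensembles.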

  19. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  20. A risk-based framework for water resource management under changing water availability, policy options, and irrigation expansion

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2016-08-01

    resemble nonlinear functions of changes in individual drivers. The proposed risk-based framework can be linked to any water resource system assessment scheme to quantify the risk in system performance under changing conditions, with the larger goal of proposing alternative policy options to address future uncertainties and management concerns.

  1. The role of risk-based prioritization in total quality management

    SciTech Connect

    Bennett, C.T.

    1994-10-01

    The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper will argue for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.

  2. Risk-based requirements management framework with applications to assurance cases

    NASA Astrophysics Data System (ADS)

    Feng, D.; Eyster, C.

    The current regulatory approach for assuring device safety primarily focuses on compliance with prescriptive safety regulations and relevant safety standards. This approach, however, does not always lead to a safe system design even though safety regulations and standards have been met. In the medical device industry, several high profile recalls involving infusion pumps have prompted the regulatory agency to reconsider how device safety should be managed, reviewed and approved. An assurance case has been cited as a promising tool to address this growing concern. Assurance cases have been used in safety-critical systems for some time. Most assurance cases, if not all, in the literature today are developed in an ad hoc fashion, independent from risk management and requirement development. An assurance case is a resource-intensive endeavor that requires additional effort and documentation from equipment manufacturers. Without a well-organized requirements infrastructure in place, such “additional effort” can be substantial, to the point where the cost of adoption outweighs the benefit of adoption. In this paper, the authors present a Risk-Based Requirements and Assurance Management (RBRAM) methodology. The RBRAM is an elaborate framework that combines Risk-Based Requirements Management (RBRM) with assurance case methods. Such an integrated framework can help manufacturers leverage existing risk management to present a comprehensive assurance case with minimal additional effort while providing a supplementary means to reexamine the integrity of the system design in terms of the mission objective. Although the example used is from the medical industry, the authors believe that the RBRAM methodology underlines the fundamental principle of risk management, and offers a simple, yet effective framework applicable to the aerospace industry and, perhaps, to any industry.

  3. Probabilistic spill occurrence simulation for chemical spills management.

    PubMed

    Cao, Weihua; Li, James; Joksimovic, Darko; Yuan, Arnold; Banting, Doug

    2013-11-15

    Inland chemical spills pose a great threat to water quality worldwide. A sophisticated probabilistic spill-event model that characterizes temporal and spatial randomness and quantifies statistical uncertainty due to limited spill data is a major component in spill management and associated decision making. This paper presents a MATLAB-based Monte Carlo simulation (MMCS) model for simulating the probabilistic quantifiable occurrences of inland chemical spills by time, magnitude, and location based on North America Industry Classification System codes. The model's aleatory and epistemic uncertainties were quantified through an integrated bootstrap resampling technique. Benzene spills in the St. Clair River area of concern were used as a case to demonstrate the model by simulating spill occurrences, occurrence time, and mass expected for a 10-year period. Uncertainty analysis indicates that simulated spill characteristics can be described by lognormal distributions with positive skewness. The simulated spill time series will enable a quantitative risk analysis for water quality impairments due to the spills. The MMCS model can also help governments to evaluate their priority list of spilled chemicals.
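
    A hedged sketch of the simulation structure described (placeholder data, not the St. Clair River record): spill occurrences as a Poisson process in time, spill masses from a positively skewed lognormal, and a bootstrap over the historical record to carry the epistemic uncertainty in the spill rate:

      import numpy as np

      rng = np.random.default_rng(7)

      spills_per_year = np.array([2, 0, 1, 3, 1, 2, 4, 0, 1, 2])   # placeholder record
      horizon = 10                       # simulate a 10-year period
      n_boot, n_sim = 200, 500

      totals = []
      for _ in range(n_boot):            # epistemic: bootstrap the annual spill rate
          lam = rng.choice(spills_per_year, size=spills_per_year.size, replace=True).mean()
          for _ in range(n_sim):         # aleatory: Poisson counts, lognormal masses
              n_spills = rng.poisson(lam * horizon)
              masses = rng.lognormal(np.log(50.0), 1.2, n_spills)  # kg, positively skewed
              totals.append(masses.sum())

      totals = np.array(totals)
      print(f"expected 10-yr spilled mass : {totals.mean():8.0f} kg")
      print(f"90th percentile             : {np.percentile(totals, 90):8.0f} kg")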

  4. Achievements of risk-based produced water management on the Norwegian continental shelf (2002-2008).

    PubMed

    Smit, Mathijs G D; Frost, Tone K; Johnsen, Ståle

    2011-10-01

    In 1996, the Norwegian government issued a White Paper requiring the Norwegian oil industry to reach the goal of "zero discharge" for the marine environment by 2005. To achieve this goal, the Norwegian oil and gas industry initiated the Zero Discharge Programme for discharges of produced formation water from the hydrocarbon-containing reservoir, in close communication with regulators. The environmental impact factor (EIF), a risk-based management tool, was developed by the industry to quantify and document the environmental risks from produced water discharges. The EIF represents a volume of recipient water containing concentrations of one or more substances to a level exceeding a generic threshold for ecotoxicological effects. In addition, this tool facilitates the identification and selection of cost-effective risk mitigation measures. The EIF tool has been used by all operators on the Norwegian continental shelf since 2002 to report progress toward the goal of "zero discharge," interpreted as "zero harmful discharges," to the regulators. Even though produced water volumes have increased by approximately 30% between 2002 and 2008 on the Norwegian continental shelf, the total environmental risk from produced water discharges expressed by the summed EIF for all installations has been reduced by approximately 55%. The total amount of oil discharged to the sea has been reduced by 18% over the period 2000 to 2006. The experience from the Zero Discharge Programme shows that a risk-based approach is an excellent working tool to reduce discharges of potential harmful substances from offshore oil and gas installations.

  5. Dynamic Resource Management in Clouds: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Gonçalves, Paulo; Roy, Shubhabrata; Begin, Thomas; Loiseau, Patrick

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. Cost of resources varies significantly depending on configuration for using them. Hence efficient management of resources is of prime interest to both Cloud Providers and Cloud Users. In this work we suggest a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand use case to justify our claims, we propose an analytical model inspired from standard models developed for epidemiology spreading, to represent sudden and intense workload variations. We show that the resulting model verifies a Large Deviation Principle that statistically characterizes extreme rare events, such as the ones produced by “buzz/flash crowd effects” that may cause workload overflow in the VoD context. This analysis provides valuable insight on expectable abnormal behaviors of systems. We exploit the information obtained using the Large Deviation Principle for the proposed Video on Demand use-case for defining policies (Service Level Agreements). We believe these policies for elastic resource provisioning and usage may be of some interest to all stakeholders in the emerging context of cloud networking.

  6. Seasonal Water Resources Management and Probabilistic Operations Forecast in the San Juan Basin

    NASA Astrophysics Data System (ADS)

    Daugherty, L.; Zagona, E. A.; Rajagopalan, B.; Grantz, K.; Miller, W. P.; Werner, K.

    2013-12-01

    within the NWS Community Hydrologic Prediction System (CHPS) to produce an ensemble streamflow forecast. The ensemble traces are used to drive the MTOM with the initial conditions of the water resources system and the operating rules, to provide ensembles of water resources management and operation metrics. We applied this integrated approach to forecasting in the San Juan River Basin (SJRB) using a portion of the Colorado River MTOM. The management objectives in the basin include water supply for irrigation, tribal water rights, environmental flows, and flood control. The spring streamflow ensembles were issued at four different lead times on the first of each month from January through April, and are incorporated into the MTOM for the period 2002-2010. Ensembles of operational performance metrics for the SJRB such as Navajo Reservoir releases, end of water year storage, environmental flows and water supply for irrigation were computed and their skills evaluated against variables obtained in a baseline simulation using historical streamflow. Preliminary results indicate that the probabilistic forecasts thus obtained may provide increased skill, especially at long lead times (e.g., forecasts issued on January 1 and February 1). The probabilistic information on water management variables quantifies the risk of system vulnerabilities and thus enables risk-based, efficient planning and operations.

  7. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    SciTech Connect

    Huq, M; Palta, J; Dunscombe, P; Thomadsen, B

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish these goals, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task group 100 (TG100) of AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how these tools can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what
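
    Of the tools listed, FMEA is the most readily sketched: each failure mode of a process step is scored for occurrence (O), severity (S) and lack of detectability (D), and ranked by the risk priority number RPN = O x S x D. The failure modes and scores below are hypothetical:

      # Hypothetical failure modes for one radiotherapy process step, scored 1-10
      failure_modes = [
          ("wrong CT dataset imported",        2, 9, 6),
          ("contour drawn on wrong structure", 4, 7, 4),
          ("incorrect prescription entered",   3, 9, 3),
          ("plan not re-normalised",           5, 5, 5),
      ]

      scored = [(name, o * s * d) for name, o, s, d in failure_modes]   # RPN = O*S*D
      for name, rpn in sorted(scored, key=lambda item: item[1], reverse=True):
          print(f"RPN {rpn:3d}  {name}")

    The highest-RPN modes are the ones a TG100-style quality management program would address first with additional checks or process redesign.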

  8. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals of synthetic origin, or derived from a natural source, that have recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interaction and toxicological impacts on receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment from direct discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management and complementary strategies.

  9. Risk-based Inspection Scheduling Planning for Intelligent Agent in the Autonomous Fault Management

    SciTech Connect

    Hari Nugroho, Djoko; Sudarno

    2010-06-22

    This paper developed an autonomous fault management approach, focusing on inspection scheduling planning, implemented for an advanced small nuclear reactor without on-site refuelling to assure safety without human intervention. The inspection schedule was planned optimally using a risk-based approach that compromises between two important constraints related to the risk of the action plan: failure probability and shortest path. Performance was demonstrated through a computer simulation using the DURESS component locations and failure probabilities. It was concluded that the first priority for inspection was flow sensor FB2, which had the largest comparison value (0.104233) among the components. The remaining components would be visited in the order FB1, FA2, FA1, FB, FA, VB, pump B, VA, pump A, VB2, VB1, VA2, VA1, reservoir 2, reservoir 1, FR2, and FR1. The planned route can be transferred to actuate the robot arm, which acts as the intelligent agent.

  10. Health Risk-Based Assessment and Management of Heavy Metals-Contaminated Soil Sites in Taiwan

    PubMed Central

    Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang

    2010-01-01

    Risk-based assessment is a way to evaluate the potential hazards of contaminated sites and is based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying exposure of the receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses one target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of possible remediation methods to obtain rapid risk reduction is permissible and of increasing interest. This paper discusses previous soil remediation techniques applied to different sites in Taiwan and also clarifies the differences in risk assessment before and after soil remediation obtained by applying different risk assessment models. This paper also includes many case studies on: (1) food safety risk assessment for brown rice growing in an HMs-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied in HMs-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan.

  11. A Risk-Based Approach to Evaluating Wildlife Demographics for Management in a Changing Climate: A Case Study of the Lewis's Woodpecker

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyère, Cindy L.; Newlon, Karen R.

    2012-12-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker ( Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.
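
    The probabilistic evaluation ultimately reduces to counting, across the weather ensemble, how often modeled nest survival falls below a critical level. A toy sketch with an assumed logistic temperature-survival relationship (not the published impact model) and hypothetical numbers:

      import numpy as np

      rng = np.random.default_rng(8)

      # Ensemble of breeding-season mean temperatures: historical variability + warming shift
      n_members, season_days = 1000, 40
      temp = rng.normal(24.0, 2.0, n_members) + 1.5          # deg C, +1.5 C scenario

      # Assumed daily nest survival declining with temperature (illustrative parameters)
      daily_survival = 1.0 / (1.0 + np.exp(0.5 * (temp - 37.0)))
      nest_survival = daily_survival ** season_days          # survival over the nesting period

      threshold = 0.30                                       # critical nest-survival level
      print(f"mean nest survival           : {nest_survival.mean():.2f}")
      print(f"P(nest survival < threshold) : {(nest_survival < threshold).mean():.2f}")

    Running the same calculation for each habitat gives the kind of probabilistic comparison used to prioritize habitats under a stated level of acceptable risk.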

  12. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    USGS Publications Warehouse

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyere, Cindy L.; Newlon, Karen R.

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  13. National Drought Policy: Shifting the Paradigm from Crisis to Risk-based Management

    NASA Astrophysics Data System (ADS)

    Wilhite, D. A.; Sivakumar, M. K.; Stefanski, R.

    2011-12-01

    Drought is a normal part of climate for virtually all of the world's climatic regimes. To better address the risks associated with this hazard and societal vulnerability, there must be a dramatic paradigm shift in our approach to drought management in the coming decade in the light of the increasing frequency of droughts and projections of increased severity and duration of these events in the future for many regions, especially in the developing world. Addressing this challenge will require an improved awareness of drought as a natural hazard, the establishment of integrated drought monitoring and early warning systems, a higher level of preparedness that fully incorporates risk-based management, and the adoption of national drought policies that are directed at increasing the coping capacity and resilience of populations to future drought episodes. The World Meteorological Organization (WMO), in partnership with other United Nations' agencies, the National Drought Mitigation Center at the University of Nebraska, NOAA, the U.S. Department of Agriculture, and other partners, is currently launching a program to organize a High Level Meeting on National Drought Policy (HMNDP) in March 2013 to encourage the development of national drought policies through the development of a compendium of key policy elements. The key objectives of a national drought policy are to: (1) encourage vulnerable economic sectors and population groups to adopt self-reliant measures that promote risk management; (2) promote sustainable use of the agricultural and natural resource base; and (3) facilitate early recovery from drought through actions consistent with national drought policy objectives. The key elements of a drought policy framework are policy and governance, including political will; addressing risk and improving early warnings, including vulnerability analysis, impact assessment, and communication; mitigation and preparedness, including the application of effective and

  14. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    NASA Astrophysics Data System (ADS)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well-known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of the subsurface water resources is a required key component in this particular human-nature interaction system to evaluate the tradeoffs between irrigation for agriculture and the ecosystem's low-flow requirements. This work presents an optimization scheme built upon systems reliability-based design optimization (SRBDO) analysis, which evaluates prescribed probabilistic constraints. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO has been developed in the aerospace industry and extensively applied in structural engineering, but has seen only limited application in hydrology. SRBDO uses probability theory to model uncertainty and to determine the probability of failure by solving a mathematical nonlinear programming problem. Furthermore, the reliability-based design optimization provides a complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both random variables and probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable on the total
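
    The essence of a probabilistic (chance) constraint can be illustrated with plain Monte Carlo standing in for the FORM-based SRBDO machinery: find the largest pumping rate whose probability of violating a minimum instream flow stays below a target, given uncertain baseflow and stream-depletion fraction (all values hypothetical):

      import numpy as np

      rng = np.random.default_rng(9)
      n = 50_000

      baseflow  = rng.lognormal(np.log(2.0), 0.35, n)   # m^3/s, uncertain low-flow baseflow
      depletion = rng.beta(6.0, 3.0, n)                  # fraction of pumping drawn from the stream
      min_flow  = 1.0                                    # m^3/s ecological minimum flow
      p_target  = 0.05                                   # allowed probability of violation

      def violation_prob(q_pump):
          """Probability that streamflow drops below the minimum at pumping rate q_pump."""
          return np.mean(baseflow - depletion * q_pump < min_flow)

      rates = np.linspace(0.0, 2.0, 201)                 # candidate pumping rates [m^3/s]
      feasible = [q for q in rates if violation_prob(q) <= p_target]
      q_max = max(feasible) if feasible else 0.0
      print(f"largest pumping rate with P(flow < minimum) <= {p_target}: {q_max:.2f} m^3/s")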

  15. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  16. How to Quantify Sustainable Development: A Risk-Based Approach to Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has been challenged in terms of quantification. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability in water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators, including reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a risk-based indicator estimation subroutine using the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model using Least Cost and Uniform Treatment approaches in order to identify the optimal level of pollution control costs for a desired reliability value, evaluated for two different DO targets. The risk-based approach developed herein, particularly via the FORM technique, appears to be an appropriately efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, while also requiring technical advances along with changes in current attitudes toward environmental protection.
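
    The reliability-resilience-vulnerability triplet mentioned above has a simple operational form. The sketch below is illustrative only, using a synthetic dissolved-oxygen series and a hypothetical 5 mg/L standard: reliability is the fraction of satisfactory time steps, resilience is the probability of recovering from a failure in the next step, and vulnerability is the mean deficit during failures.

```python
# Minimal sketch of the risk-based sustainability indicators (reliability,
# resilience, vulnerability) computed from a simulated dissolved-oxygen series;
# the synthetic DO data and the 5 mg/L standard are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
do = rng.normal(loc=5.6, scale=0.8, size=365)    # daily DO concentrations [mg/L]
standard = 5.0                                   # water quality standard [mg/L]

ok = do >= standard                              # satisfactory states
reliability = ok.mean()

fail = ~ok
recoveries = np.sum(fail[:-1] & ok[1:])          # failure followed by recovery
resilience = recoveries / max(fail[:-1].sum(), 1)

deficits = standard - do[fail]                   # magnitude of each violation
vulnerability = deficits.mean() if deficits.size else 0.0

print(f"reliability={reliability:.2f}, resilience={resilience:.2f}, "
      f"vulnerability={vulnerability:.2f} mg/L")
```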

  17. How to quantify sustainable development: a risk-based approach to water quality management.

    PubMed

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has been challenged in terms of quantification. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability in water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators, including reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a risk-based indicator estimation subroutine using the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model using Least Cost and Uniform Treatment approaches in order to identify the optimal level of pollution control costs for a desired reliability value, evaluated for two different DO targets. The risk-based approach developed herein, particularly via the FORM technique, appears to be an appropriately efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, while also requiring technical advances along with changes in current attitudes toward environmental protection.

  18. A probabilistic risk management based process for planning and management of technology development

    NASA Astrophysics Data System (ADS)

    Largent, Matthew Clinton

    In the current environment of limited research funding and evolving aerospace needs and requirements, the development of new technology is a critical process. Technologies are designed to meet specific system performance needs, but must be developed in order to reduce uncertainty associated with meeting the needs, as well as uncertainty regarding additional effects that the technology will have on the system. The development project will have risk associated with meeting budget and schedule requirements, and with the completion of the development project plan. Existing methods for technology development fall short of quantifying all areas of risk and uncertainty, and do not provide a method for linking the reduction of performance uncertainty with the management of cost, time, and project risk. This thesis introduces the Technology Development Planning and Management (TDPM) process, a structured process using probabilistic methods and risk management concepts to assist in the planning and management of technology development projects. The TDPM process focuses on planning activities to reduce the areas of performance uncertainty that have the largest effects on system level goals. The cost and schedule uncertainty and project risk associated with the project plan are quantified in order to allow informed management of the project plan and eventual development project. TDPM was implemented for two technology development examples. The first example focused on the implementation of the process for a simple technology development project, showcasing the ability to plan for uncertainty reduction, demonstrate the resulting effects on the system level, and still manage the project cost and schedule risk. The second example was performed by an experienced technology development manager, who implemented TDPM on the hypothetical development of a technology currently being studied. Through the examples, the TDPM process was shown to be a valid and useful tool that advances the

  19. FlySec: a risk-based airport security management system based on security as a service concept

    NASA Astrophysics Data System (ADS)

    Kyriazanos, Dimitris M.; Segou, Olga E.; Zalonis, Andreas; Thomopoulos, Stelios C. A.

    2016-05-01

    Complementing the ACI/IATA efforts, the FLYSEC European H2020 Research and Innovation project (http://www.fly-sec.eu/) aims to develop and demonstrate an innovative, integrated and end-to-end airport security process for passengers, enabling a guided and streamlined procedure from landside to airside and into the boarding gates, and offering an operationally validated innovative concept for end-to-end aviation security. Through a well-structured work plan, the FLYSEC ambition translates into: (i) innovative processes facilitating risk-based screening; (ii) deployment and integration of new technologies and repurposing of existing solutions towards a risk-based security paradigm shift; (iii) improvement of passenger facilitation and customer service, bringing security as a real service in the airport of tomorrow; (iv) achievement of measurable throughput improvement and a whole new level of Quality of Service; and (v) validation of the results through advanced "in-vitro" simulation and "in-vivo" pilots. On the technical side, FLYSEC achieves its ambitious goals by integrating new technologies for video surveillance, intelligent remote image processing and biometrics combined with big data analysis, open-source intelligence and crowdsourcing. Repurposing existing technologies is also among the FLYSEC objectives, such as mobile application technologies for improved passenger experience and positive boarding applications (i.e. services to facilitate boarding and landside/airside wayfinding) as well as RFID for carry-on luggage tracking and quick unattended luggage handling. In this paper, the authors describe the risk-based airport security management system which powers FLYSEC intelligence and serves as the backend on top of which FLYSEC's front-end technologies reside for security services management, behaviour and risk analysis.

  20. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate how the choice of operator and the risk attitudes adopted across the sub-basins affect the satisfaction of the relevant criteria and, ultimately, the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA, to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health.
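
    For readers unfamiliar with the aggregation operators named above, the sketch below contrasts SAW (a neutral-risk weighted sum) with OWA (weights applied to the ordered scores, so a risk-averse attitude emphasizes the worst criteria). The scenario scores and weights are hypothetical and do not come from the study.

```python
# Minimal sketch of the SAW and OWA aggregation operators used to score BMP
# scenarios; the criteria scores and weights below are hypothetical.
import numpy as np

# Rows: scenarios; columns: normalized criteria scores (environmental, economic, social).
scores = np.array([[0.9, 0.4, 0.6],
                   [0.6, 0.7, 0.7],
                   [0.5, 0.9, 0.4]])
criteria_weights = np.array([0.5, 0.3, 0.2])     # importance weights (sum to 1)

def saw(row, w):
    """Simple additive weighting: neutral-risk weighted sum."""
    return float(row @ w)

def owa(row, order_w):
    """Ordered weighted averaging: weights applied to scores sorted descending.
    Putting more weight on the smallest scores expresses a risk-averse attitude."""
    return float(np.sort(row)[::-1] @ order_w)

pessimistic = np.array([0.1, 0.3, 0.6])          # emphasizes the worst criterion
for i, row in enumerate(scores):
    print(f"scenario {i+1}: SAW={saw(row, criteria_weights):.2f}, "
          f"OWA(risk-averse)={owa(row, pessimistic):.2f}")
```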

  1. Risk based bridge data collection and asset management and the role of structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Bush, Simon; Henning, Theunis; McCarten, Peter

    2011-04-01

    Bridges are critical to the operation and functionality of whole road networks. It is therefore essential that specific data are collected on bridge asset condition and performance, as this allows proactive management of the assets and associated risks and more accurate short- and long-term financial planning. This paper proposes and discusses a strategy for collecting data on bridge condition and performance. Recognizing that risk management is the primary driver of asset management, the proposed strategy prioritizes bridges for levels of data collection including core, intermediate and advanced. Individual bridges are seen as parts of wider networks, and bridge risk and criticality assessment emphasizes bridge failure or underperformance risk in the network context. The paper demonstrates how more reliable and detailed data can assist in managing network and bridge risks and provides a rationale for applying higher data collection levels to bridges characterized by higher risk and criticality. As bridge risk and/or criticality increases, planned and proactive integration of structural health monitoring (SHM) data into asset management is outlined. An example of bridge prioritization for data collection, using several bridges taken from a national highway network, is provided using an existing risk and criticality scoring methodology. The paper concludes with a discussion of the role of SHM in data collection for bridge asset management and where SHM can make the largest impact.

  2. A risk-based approach to sanitary sewer pipe asset management.

    PubMed

    Baah, Kelly; Dubey, Brajesh; Harvey, Richard; McBean, Edward

    2015-02-01

    Wastewater collection systems are an important component of proper wastewater management, preventing the environmental and human health impacts that can arise from mismanagement of anthropogenic waste. Due to aging and inadequate asset management practices, the wastewater collection assets of many cities around the globe are in a state of rapid decline and in need of urgent attention. Risk management is a tool which can help prioritize resources to better manage and rehabilitate wastewater collection systems. In this study, a risk matrix and a weighted-sum multi-criteria decision matrix are used to assess the consequence and risk of sewer pipe failure for a mid-sized city, using ArcGIS. The methodology shows that six percent of the uninspected sewer pipe assets in the case study have a high consequence of failure, while four percent of the assets have a high risk of failure and are therefore priorities for inspection. A map incorporating risk of sewer pipe failure and consequence is developed to facilitate future planning, rehabilitation and maintenance programs. The consequence-of-failure assessment also includes a novel failure impact factor which captures the effect of structurally defective stormwater pipes on the failure assessment. The methodology recommended in this study can serve as a basis for future planning and decision making and has the potential to be applied by municipal sewer pipe asset managers globally to effectively manage the sanitary sewer pipe infrastructure within their jurisdictions.
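
    As an illustration of the kind of scoring the study describes (not its actual data, weights, or GIS workflow), the sketch below combines a likelihood-of-failure score with a weighted-sum consequence score into a simple risk band for a few hypothetical pipes.

```python
# Minimal sketch of a risk matrix for sewer pipes: risk = likelihood of failure x
# weighted-sum consequence of failure; all pipe attributes and weights are hypothetical.
import numpy as np

# Each pipe: (likelihood score 1-5, consequence sub-scores 1-5 for burial depth,
# diameter, proximity to critical assets, defective stormwater pipe nearby).
pipes = {
    "P-101": (4, [3, 4, 5, 2]),
    "P-102": (2, [2, 2, 1, 1]),
    "P-103": (5, [4, 5, 4, 5]),
}
consequence_weights = np.array([0.2, 0.3, 0.3, 0.2])   # weighted-sum MCDM weights

def risk_score(likelihood, consequence_subscores):
    consequence = float(np.array(consequence_subscores) @ consequence_weights)
    return likelihood * consequence, consequence

for pipe, (lik, cons) in pipes.items():
    risk, consequence = risk_score(lik, cons)
    band = "high" if risk >= 15 else "medium" if risk >= 8 else "low"
    print(f"{pipe}: consequence={consequence:.1f}, risk={risk:.1f} ({band})")
```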

  3. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is the concomitant variation in water quality, with resulting impacts on the efficiency of mineral separation processes and increased environmental consequences of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system, as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. The approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability.

  4. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six Sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form of this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications.
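
    The sigma-metric referred to above has a standard closed form, sigma = (allowable total error - |bias|) / CV, with all terms in percent. The sketch below evaluates it for a hypothetical assay; the rule-of-thumb messages are illustrative only and are no substitute for the SQC power-function charts used in practice.

```python
# Minimal sketch of the sigma-metric used to predict analytical risk and guide
# SQC selection: sigma = (allowable total error - |bias|) / CV, all in percent.
# The assay figures and the rule-of-thumb thresholds are illustrative.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example assay: hypothetical 6% allowable total error, 1% bias, 1.5% CV.
tea, bias, cv = 6.0, 1.0, 1.5
sigma = sigma_metric(tea, bias, cv)
print(f"sigma = {sigma:.1f}")

# Rough illustrative guidance, not an actual SQC design rule:
if sigma >= 6:
    print("simple single-rule SQC is typically sufficient")
elif sigma >= 4:
    print("moderate SQC effort, e.g. a multirule procedure, is indicated")
else:
    print("maximum SQC effort and method improvement warranted")
```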

  5. A Risk-based Assessment And Management Framework For Multipollutant Air Quality.

    PubMed

    Frey, H Christopher; Hubbell, Bryan

    2009-06-01

    The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution, and the success of these decisions should be measured by how well they achieve this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration-response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management

  6. Assistance to the states with risk based data management. Quarterly technical progress report, April 1--June 30, 1995

    SciTech Connect

    Paque, M.J.

    1995-07-28

    The tasks of this project are to: (1) complete implementation of a Risk Based Data Management System (RBDMS) in the states of Alaska, Mississippi, Montana, and Nebraska; and (2) conduct Area of Review (AOR) workshops in the states of California, Oklahoma, Kansas, and Texas. The RBDMS was designed to be a comprehensive database with the ability to expand into multiple areas, including oil and gas production. The database includes comprehensive well information for both producing and injection wells. It includes automated features for performing functions related to AOR analyses, environmental risk analyses, well evaluation, permit evaluation, compliance monitoring, operator bonding assessments, operational monitoring and tracking, and more. This quarterly report describes the status of the RBDMS development for both stated tasks and proposes further steps in its implementation.

  7. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred severely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the past. These four townships are the main fishpond cultivation districts in Taiwan. Groundwater has become the main water supply because of the shortage of surface water. Overpumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater used to that of pond water, for fish farming within a range of target cancer risk (TR), especially between 10^-4 and 10^-6. Our results reveal that the 50th percentile of groundwater DRs, used as a regulatory standard, can be applied to fish farm groundwater management for a TR of 10^-6. For a TR of 5 × 10^-6, we suggest using the 75th percentile of DR for groundwater management. For a TR of 10^-5, we suggest using the 95th percentile of DR for groundwater management in fish farm areas. For a TR exceeding 5 × 10^-5, we do not suggest establishing groundwater management standards under these risk levels. Based on these results, we suggest that establishing a TR of 10^-5 and using the 95th percentile of DR are best for groundwater management in fish farm areas.
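
    To make the percentile-based DR standard concrete, the sketch below is purely illustrative: a lumped exposure factor stands in for the study's SIS arsenic field, bioaccumulation, and intake pathway. It computes the allowable dilution ratio at each simulated location for a given target cancer risk and then reports percentiles of that distribution.

```python
# Minimal sketch of deriving percentile-based dilution-ratio (DR) values from a
# simulated arsenic field and a target cancer risk (TR). The exposure pathway is
# collapsed into one lumped slope factor k; all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
c_gw = rng.lognormal(np.log(50.0), 0.8, 5_000)   # groundwater As at simulated locations [ug/L]
k = 2.0e-8           # lumped exposure/bioaccumulation factor [risk per (ug/L) of pond As]
tr = 1.0e-6          # target cancer risk

# Pond concentration is taken as c_gw * DR (groundwater relative to pond water),
# so the largest DR that still meets the target risk at each location is:
dr_allowable = tr / (k * c_gw)

for pct in (50, 75, 95):
    print(f"{pct}th percentile of allowable DR: {np.percentile(dr_allowable, pct):.3f}")
```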

  8. Mobile human network management and recommendation by probabilistic social mining.

    PubMed

    Min, Jun-Ki; Cho, Sung-Bae

    2011-06-01

    Recently, the inference and sharing of mobile contexts have been actively investigated as cell phones have become more than communication devices. However, most of this work has focused on utilizing contexts in social network services, while means of mining or managing the human network itself have barely been considered. In this paper, the SmartPhonebook, which mines users' social connections to manage their relationships by reasoning about social and personal contexts, is presented. It works like an artificial assistant which recommends the candidate callees whom the users would probably like to contact in a given situation. Moreover, it visualizes their social contexts, such as closeness and relationships with others, in order to let the users know their social situations. The proposed method infers the social contexts based on contact patterns, while it extracts personal contexts such as the users' emotional states and behaviors from the mobile logs. Here, Bayesian networks are exploited to handle the uncertainties in the mobile environment. The proposed system has been implemented on the MS Windows Mobile 2003 SE platform on a Samsung SPH-M4650 smartphone and has been tested on real-world data. The experimental results showed that the system provides an efficient and informative way for mobile social networking.

  9. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality-by-design concepts into the management of leachables, emphasis is often put on understanding the extractable profile of the materials of construction used in manufacturing disposables, container-closure systems, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine whether quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile of an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
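
    As a toy illustration of the designed-experiment analysis described above (not the study's factors, responses, or ANOVA models), the sketch below estimates main and interaction effects of two hypothetical molding parameters on a single extractable peak from a 2^2 factorial design.

```python
# Minimal sketch of estimating molding-process effects on an extractable peak from
# a two-level designed experiment; factor settings and responses are hypothetical.
import numpy as np

# Full 2^2 factorial in coded units: barrel temperature (A) and hold pressure (B).
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
peak_area = np.array([12.1, 18.4, 11.7, 25.9])    # GC peak area of one extractable

# Effects = average response change when a factor moves from its low to high setting.
effect_A = peak_area[A == +1].mean() - peak_area[A == -1].mean()
effect_B = peak_area[B == +1].mean() - peak_area[B == -1].mean()
effect_AB = peak_area[A * B == +1].mean() - peak_area[A * B == -1].mean()

print(f"effect of temperature: {effect_A:+.2f}")
print(f"effect of pressure:    {effect_B:+.2f}")
print(f"interaction:           {effect_AB:+.2f}")
```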

  10. Application of risk-based assessment and management to riverbank filtration sites in India.

    PubMed

    Bartak, Rico; Page, Declan; Sandhu, Cornelius; Grischek, Thomas; Saini, Bharti; Mehrotra, Indu; Jain, Chakresh K; Ghosh, Narayan C

    2015-03-01

    This is the first reported study of a riverbank filtration (RBF) scheme to be assessed following the Australian Guidelines for Managed Aquifer Recharge. A comprehensive staged approach to assess the risks to human health and the environment from 12 hazards has been undertaken. The highest risks from untreated groundwater and Ganga River water were associated with pathogens, turbidity, iron, manganese, total dissolved solids and total hardness. Recovered water meets the guideline values for inorganic chemicals and salinity but frequently exceeds limits for thermotolerant coliforms. A quantitative microbial risk assessment undertaken on the water recovered from the aquifer indicated that the residual risk of 0.00165 disability-adjusted life years (DALYs) posed by the reference bacterium Escherichia coli O157:H7 was below the national diarrhoeal incidence of 0.027 DALYs and meets the health target in this study of 0.005 DALYs per person per year, which corresponds to the World Health Organization (WHO) regional diarrhoeal incidence in South-East Asia. The monsoon season was a major contributor to the calculated burden of disease, and final DALYs were strongly dependent on the pathogen removal capabilities of RBF and disinfection. Finally, a water safety plan was developed with potential risk management procedures to minimize residual risks related to pathogens.
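
    The DALY figures quoted above follow the usual QMRA chain from concentration to dose to annual infection risk to disease burden. The sketch below reproduces that chain with a Beta-Poisson dose-response; every number (concentrations, dose-response parameters, illness probability, burden per case) is illustrative rather than taken from the study.

```python
# Minimal sketch of the QMRA chain from pathogen concentration to DALYs per person
# per year using a Beta-Poisson dose-response; all parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_mc = 50_000
conc = rng.lognormal(np.log(0.01), 1.0, n_mc)   # organisms per litre in recovered water
volume = 2.0                                    # litres consumed per day
alpha, beta = 0.25, 60.0                        # hypothetical Beta-Poisson parameters

# Each realization is treated as a constant year-round concentration.
dose = conc * volume
p_inf_day = 1.0 - (1.0 + dose / beta) ** (-alpha)
p_inf_year = 1.0 - (1.0 - p_inf_day) ** 365

p_ill_given_inf = 0.3            # illustrative probability of illness given infection
daly_per_case = 0.05             # illustrative burden per illness [DALYs]
susceptible_fraction = 1.0

dalys_pppy = p_inf_year.mean() * p_ill_given_inf * daly_per_case * susceptible_fraction
print(f"estimated burden ~ {dalys_pppy:.4f} DALYs per person per year "
      f"(compare with a 0.005 DALY health target)")
```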

  11. Resolution of Probabilistic Weather Forecasts with Application in Disease Management.

    PubMed

    Hughes, G; McRoberts, N; Burnett, F J

    2017-02-01

    Predictive systems in disease management often incorporate weather data among the disease risk factors, and sometimes this comes in the form of forecast weather data rather than observed weather data. In such cases, it is useful to have an evaluation of the operational weather forecast, in addition to the evaluation of the disease forecasts provided by the predictive system. Typically, weather forecasts and disease forecasts are evaluated using different methodologies. However, the information theoretic quantity expected mutual information provides a basis for evaluating both kinds of forecast. Expected mutual information is an appropriate metric for the average performance of a predictive system over a set of forecasts. Both relative entropy (a divergence, measuring information gain) and specific information (an entropy difference, measuring change in uncertainty) provide a basis for the assessment of individual forecasts.
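
    As a small worked example of the information-theoretic quantities named above (with a hypothetical joint distribution, not data from the study), the sketch below computes the expected mutual information between a binary forecast and the observed event, and the relative entropy (information gain) delivered by a single "yes" forecast.

```python
# Minimal sketch of expected mutual information and relative entropy for a binary
# forecast/event pair; the joint probabilities below are hypothetical.
import numpy as np

# Joint distribution p(forecast, event): rows = forecast {no, yes}, cols = event {no, yes}.
joint = np.array([[0.60, 0.05],
                  [0.10, 0.25]])
p_forecast = joint.sum(axis=1)
p_event = joint.sum(axis=0)

# Expected mutual information I(F;E) = sum p(f,e) log2[ p(f,e) / (p(f) p(e)) ].
mi = sum(joint[f, e] * np.log2(joint[f, e] / (p_forecast[f] * p_event[e]))
         for f in range(2) for e in range(2) if joint[f, e] > 0)

# Relative entropy (information gain) of a single "yes" forecast:
post = joint[1] / p_forecast[1]                 # p(event | forecast = yes)
kl = sum(post[e] * np.log2(post[e] / p_event[e]) for e in range(2) if post[e] > 0)

print(f"expected mutual information: {mi:.3f} bits")
print(f"information gain of a 'yes' forecast: {kl:.3f} bits")
```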

  12. A Risk-Based Approach to Manage Nutrient Contamination From Household Wastewater

    NASA Astrophysics Data System (ADS)

    Gold, A. J.; Sims, J. T.

    2001-05-01

    Nutrients originating from decentralized wastewater treatment systems (DWTS) can pose a risk to human and ecosystem health. Assessing the likelihood and magnitude of this risk is a formidable and complex challenge. However, a properly constructed risk assessment is essential if we are to design and implement practices for DWTS that minimize the impacts of nutrients on our environment. To do this successfully, we must carefully consider: (i) the specific risks posed by nutrients emitted by DWTS and the sensitivity of humans and ecosystems to these risks; (ii) the pathways by which nutrients move from DWTS to the sectors of the environment where the risk will occur (most often ground and surface waters); (iii) the micro- and macro-scale processes that affect the transport and transformations of nutrients once they are emitted from the DWTS and how this in turn affects risk; and (iv) the effects of current or alternative DWTS design and management practices on nutrient transport and subsequent risks to humans and ecosystems. In this paper we examine the risks of nutrients from DWTS to human and ecosystem health at both the micro- and macro-level spatial scales. We focus primarily on the factors that control the movement of N and P from DWTS to ground and surface waters and the research needs related to controlling nonpoint source nutrient pollution from DWTS. At the micro-scale, the exposure pathways include the system and the immediate surroundings, i.e., the subsurface environment near the DWTS. The exposed individual or ecosystem at the micro-scale can be a household well, lake, stream or estuary that borders an individual wastewater treatment system. At the macro-level, our focus is at the aquifer and watershed scale and the risks posed to downstream ecosystems and water users by nonpoint source pollution of these waters by nutrients from DWTS. We analyze what is known about the effectiveness of current designs at mitigating these risks and our ability to predict

  13. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
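
    To make the "MPN as a maximum likelihood estimate" statement concrete, the sketch below recovers an MPN from a hypothetical 5-tube, 3-dilution tube pattern by maximizing the binomial likelihood implied by Poisson-distributed organisms in each tube; the design and counts are illustrative only.

```python
# Minimal sketch of the MPN as the MLE of fecal coliform concentration from a
# serial-dilution tube pattern; the tube design and counts are hypothetical.
import numpy as np

n_tubes = np.array([5, 5, 5])               # tubes per dilution
volumes = np.array([10.0, 1.0, 0.1])        # mL of sample in each tube
positives = np.array([5, 3, 1])             # observed positive (gassing) tubes

def log_likelihood(lam):
    """Log-likelihood of the tube pattern for concentration lam [organisms/mL]."""
    p_pos = 1.0 - np.exp(-lam * volumes)    # P(tube positive) under Poisson seeding
    return np.sum(positives * np.log(p_pos) - (n_tubes - positives) * lam * volumes)

lam_grid = np.logspace(-4, 1, 2000)          # candidate concentrations [per mL]
mle = lam_grid[np.argmax([log_likelihood(l) for l in lam_grid])]
print(f"MPN (MLE) ~ {100.0 * mle:.0f} organisms per 100 mL")
```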

  14. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos,Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; Vesely, William; Youngblood, Robert

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  15. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.

  16. Towards risk-based drought management in the Netherlands: making water supply levels transparent to water users

    NASA Astrophysics Data System (ADS)

    Maat Judith, Ter; Marjolein, Mens; Vuren Saskia, Van; der Vat Marnix, Van

    2016-04-01

    Within the project Improving Predictions and Management of Hydrological Extremes (IMPREX), running from 2016 to 2019, a consortium of the Dutch research institute Deltares and the Dutch water management consultancy HKV will design and build a tool to support quantitative, risk-informed decision-making for freshwater management in the Netherlands, in particular the decision on water supply service levels. The research will be conducted in collaboration with the Dutch Ministry for Infrastructure and Environment, the Freshwater Supply Programme Office, the Dutch governmental organisation responsible for water management (Rijkswaterstaat), the Foundation for Applied Water Research (STOWA, the knowledge centre of the water boards) and a number of water boards. In the session we will present the conceptual framework for a risk-based approach to water shortage management and share thoughts on how the proposed tool can be applied in the Dutch water management context.

  17. Management of the Area 5 Radioactive Waste Management Site using Decision-based, Probabilistic Performance Assessment Modeling

    SciTech Connect

    Carilli, J.; Crowe, B.; Black, P.; Tauxe, J.; Stockton, T.; Catlett, K.; Yucel, V.

    2003-02-27

    Low-level radioactive waste from cleanup activities at the Nevada Test Site and from multiple sites across the U.S. Department of Energy (DOE) complex is disposed at two active Radioactive Waste Management Sites (RWMS) on the Nevada Test Site. These facilities, which are managed by the DOE National Nuclear Security Administration Nevada Site Office, were recently designated as one of two regional disposal centers and yearly volumes of disposed waste now exceed 50,000 m3 (> 2 million ft3). To safely and cost-effectively manage the disposal facilities, the Waste Management Division of Environmental Management has implemented decision-based management practices using flexible and problem-oriented probabilistic performance assessment modeling. Deterministic performance assessments and composite analyses were completed originally for the Area 5 and Area 3 RWMSs located in, respectively, Frenchman Flat and Yucca Flat on the Nevada Test Site. These documents provide the technical bases for issuance of disposal authorization statements for continuing operation of the disposal facilities. Both facilities are now in a maintenance phase that requires testing of conceptual models, reduction of uncertainty, and site monitoring all leading to eventual closure of the facilities and transition to long-term stewardship.

  18. Developing a risk-based trading scheme for cattle in England: farmer perspectives on managing trading risk for bovine tuberculosis.

    PubMed

    Little, R; Wheeler, K; Edge, S

    2017-02-11

    This paper examines farmer attitudes towards the development of a voluntary risk-based trading scheme for cattle in England as a risk mitigation measure for bovine tuberculosis (bTB). The research reported here was commissioned to gather evidence on the type of scheme that would have a good chance of success in improving the information farmers receive about the bTB risk of cattle they buy. Telephone interviews were conducted with a stratified random sample of 203 cattle farmers in England, splitting the interviews equally between respondents in the high-risk and low-risk areas for bTB. Supplementary interviews and focus groups with farmers were also carried out across the risk areas. Results suggest greater enthusiasm for a risk-based trading scheme in low-risk areas compared with high-risk areas and among members of breed societies and cattle health schemes. Third-party certification of herds by private vets or the Animal and Plant Health Agency was regarded as the most credible source, with farmer self-certification being favoured by sellers but regarded as least credible by buyers. Understanding farmers' attitudes towards voluntary risk-based trading is important for gauging likely uptake, understanding preferences for information provision, and assisting in monitoring, evaluating and refining the scheme once established.

  19. Evaluating the impacts of agricultural land management practices on water resources: A probabilistic hydrologic modeling approach.

    PubMed

    Prada, A F; Chu, M L; Guzman, J A; Moriasi, D N

    2017-02-24

    Evaluating the effectiveness of agricultural land management practices in minimizing environmental impacts using models is challenged by the inherent uncertainties present during the model development stage. One issue faced during this stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the applicability and robustness of the model for properly representing future or alternative scenarios. The objective of this study was to develop a framework that facilitates model parameter selection while evaluating uncertainty to assess the impacts of land management practices at the watershed scale. The model framework was applied to the Lake Creek watershed located in southwestern Oklahoma, USA. A two-step probabilistic approach was implemented to parameterize the Agricultural Policy/Environmental eXtender (APEX) model using global uncertainty and sensitivity analysis to estimate the full spectrum of total monthly water yield (WYLD) and total monthly nitrogen loads (N) in the watershed under different land management practices. Twenty-seven models were found to represent the baseline scenario, in which uncertainty of up to 29% in WYLD and up to 400% in N is plausible. Changing the land cover to pasture produced the largest decrease in N, up to 30% for full pasture coverage, while changing to full winter wheat cover can increase N by up to 11%. The methodology developed in this study was able to quantify the full spectrum of system responses, the uncertainty associated with them, and the most important parameters that drive their variability. Results from this study can be used to develop strategic decisions on the risks and tradeoffs associated with different management alternatives that aim to increase productivity while also minimizing their environmental impacts.
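
    The idea of retaining an ensemble of acceptable parameter sets rather than a single optimum can be illustrated with a deliberately simple stand-in for APEX: in the sketch below, a toy monthly water-yield model is sampled over its parameter space, all sets above a skill threshold are kept, and the spread of their predictions summarizes the parameterization uncertainty. The model, synthetic "observations," and threshold are all hypothetical.

```python
# Minimal sketch of ensemble (behavioral) parameter selection under uncertainty;
# a toy water-yield model stands in for APEX and all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(3)
months = 120
rain = rng.gamma(shape=2.0, scale=40.0, size=months)        # monthly rainfall [mm]

def water_yield(rain, runoff_coef, baseflow):
    """Toy model: yield = runoff fraction of rainfall plus a constant baseflow [mm]."""
    return runoff_coef * rain + baseflow

# Synthetic "observed" record generated with known parameters plus noise.
obs = water_yield(rain, 0.35, 10.0) + rng.normal(0.0, 5.0, months)

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Step 1: sample the parameter space; Step 2: keep all sets above a skill threshold.
samples = rng.uniform([0.1, 0.0], [0.6, 30.0], size=(5_000, 2))
behavioral = [p for p in samples if nse(water_yield(rain, *p), obs) >= 0.85]

preds = np.array([water_yield(rain, *p).mean() for p in behavioral])
print(f"{len(behavioral)} acceptable parameter sets; mean monthly WYLD "
      f"{preds.mean():.1f} mm (range {preds.min():.1f}-{preds.max():.1f} mm)")
```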

  20. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems rely substantially on complex software interfacing with hardware in real time; any faults in software or hardware, or in their interaction, could have fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an arithmetic circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent of the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. To demonstrate the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.

  1. Geodata-based probabilistic risk assessment and management of pesticides in Germany: a conceptual framework.

    PubMed

    Schulz, Ralf; Stehle, Sebastian; Elsaesser, David; Matezki, Steffen; Müller, Alexandra; Neumann, Michael; Ohliger, Renja; Wogram, Jörn; Zenker, Katharina

    2009-01-01

    The procedure for the risk assessment of pesticides in Germany is currently being developed further from a deterministic to a geodata-based probabilistic risk assessment (GeoPRA) approach. As the initial step, the exposure assessment for spray drift in permanent crops, such as vineyards, fruit orchards, and hops, is considered. In our concept, geoinformation tools are used to predict distribution functions for exposure concentrations based mainly on spatial information regarding the neighbourhood of crops and surface waters. A total of 23 factors affecting drift into surface waters were assessed, and suggestions for their inclusion in the approach were developed. The main objectives are to base the exposure estimation on a realistic representation of local landscape characteristics and on empirical results for the impact of each feature on drift deposition. A framework for the identification of high-risk sites (active management areas [AMAs]) based on protection goals and ecological considerations was developed in order to implement suitable risk mitigation measures. The inclusion of active mitigation measures at sites with identified and verified risk is considered a central and important part of the overall assessment strategy. The suggested GeoPRA procedure itself comprises the following four steps, including elements of the extensive preliminary work conducted so far: 1) nationwide risk assessment, preferably based only on geodata-based factors; 2) identification of AMAs, including the spatial extent of contamination, the level of contamination, and the tolerable effect levels; 3) refined exposure assessment, using aerial photographs and field surveys; and 4) mitigation measures, with a focus on landscape-level active mitigation measures leading to effective risk reductions. The suggested GeoPRA procedure offers the possibility of actively involving the farming community in the process of pesticide management. Overall, the new procedure will aim at

  2. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  3. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the production process level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.

  4. A risk-based decision tool for the management of organic waste in agriculture and farming activities (FARMERS).

    PubMed

    Río, Miguel; Franco-Uría, Amaya; Abad, Emilio; Roca, Enrique

    2011-01-30

    Currently, specific management guidelines must be implemented to guarantee the safe reuse of organic waste in agriculture. With that aim, this work focused on the development of a decision support tool for the safe and sustainable management of cattle manure as fertiliser in pastureland, to control and limit metal accumulation in soil and to reduce metal biotransfer from soil to other compartments. The system was developed on the basis of a multi-compartment environmental risk assessment model. In contrast to other management tools, a long-term dynamic modelling approach was selected, considering the persistence of metals in the environment. A detailed description of the underlying flow equations, which account for the distribution, human exposure and risk characterisation of metals in the assessed scenario, was presented, along with the model parameterization. The tool was implemented in Visual C++ and is structured around a database, where all required data are stored, the risk assessment model, and a GIS module for the visualization of the scenario characteristics and the results obtained (risk indexes). The decision support system allows choosing among three estimation options, depending on the needs of the user, which provide information to both farmers and policy makers. The first option is useful for evaluating the adequacy of the current management practices of the different farms, and the remaining ones provide information on the measures that can be taken to carry out a fertilising plan without exceeding risk to human health. Among other results, maximum manure application rates, maximum permissible metal content of manure and maximum application times in a particular scenario can be estimated by this system. To illustrate tool application, a real case study with data corresponding to different farms of a milk production cooperative was presented.

  5. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic change from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and both they and the joint estimate of flood risk are affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again the same probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrographs generation by an indirect approach (rainfall-runoff transformation using input rainfall
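
    As a minimal illustration of the bivariate step described above (not the study's fitted marginals or copula family), the sketch below draws correlated flood peak-volume pairs through a Gaussian copula and reads off a typical volume associated with roughly 100-year peaks.

```python
# Minimal sketch of generating correlated flood peak-volume pairs with a Gaussian
# copula; the marginal distributions and the correlation are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 10_000
rho = 0.8                                   # correlation of the underlying Gaussian copula

# Step 1: correlated standard normals -> uniforms (the copula).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

# Step 2: transform the uniforms through the assumed marginals.
peak = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=80.0)       # flood peak [m3/s]
volume = stats.lognorm.ppf(u[:, 1], s=0.4, scale=20.0)          # flood volume [10^6 m3]

# A design peak for a 100-year event, with the typical volume of the largest simulated peaks:
q100 = stats.gumbel_r.ppf(0.99, loc=300.0, scale=80.0)
v_typical = np.median(volume[peak > np.percentile(peak, 99)])
print(f"100-year peak ~ {q100:.0f} m3/s, typical associated volume ~ {v_typical:.1f} x 10^6 m3")
```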

  6. Undiscovered Locatable Mineral Resources in the Bay Resource Management Plan Area, Southwestern Alaska: A Probabilistic Assessment

    USGS Publications Warehouse

    Schmidt, J.M.; Light, T.D.; Drew, L.J.; Wilson, F.H.; Miller, M.L.; Saltus, R.W.

    2007-01-01

    The Bay Resource Management Plan (RMP) area in southwestern Alaska, north and northeast of Bristol Bay contains significant potential for undiscovered locatable mineral resources of base and precious metals, in addition to metallic mineral deposits that are already known. A quantitative probabilistic assessment has identified 24 tracts of land that are permissive for 17 mineral deposit model types likely to be explored for within the next 15 years in this region. Commodities we discuss in this report that have potential to occur in the Bay RMP area are Ag, Au, Cr, Cu, Fe, Hg, Mo, Pb, Sn, W, Zn, and platinum-group elements. Geoscience data for the region are sufficient to make quantitative estimates of the number of undiscovered deposits only for porphyry copper, epithermal vein, copper skarn, iron skarn, hot-spring mercury, placer gold, and placer platinum-deposit models. A description of a group of shallow- to intermediate-level intrusion-related gold deposits is combined with grade and tonnage data from 13 deposits of this type to provide a quantitative estimate of undiscovered deposits of this new type. We estimate that significant resources of Ag, Au, Cu, Fe, Hg, Mo, Pb, and Pt occur in the Bay Resource Management Plan area in these deposit types. At the 10th percentile probability level, the Bay RMP area is estimated to contain 10,067 metric tons silver, 1,485 metric tons gold, 12.66 million metric tons copper, 560 million metric tons iron, 8,100 metric tons mercury, 500,000 metric tons molybdenum, 150 metric tons lead, and 17 metric tons of platinum in undiscovered deposits of the eight quantified deposit types. At the 90th percentile probability level, the Bay RMP area is estimated to contain 89 metric tons silver, 14 metric tons gold, 911,215 metric tons copper, 330,000 metric tons iron, 1 metric ton mercury, 8,600 metric tons molybdenum and 1 metric ton platinum in undiscovered deposits of the eight deposit types. Other commodities, which may occur in the

  7. Groundwater contamination from waste management sites: The interaction between risk-based engineering design and regulatory policy: 1. Methodology

    NASA Astrophysics Data System (ADS)

    Massmann, Joel; Freeze, R. Allan

    1987-02-01

    This paper puts in place a risk-cost-benefit analysis for waste management facilities that explicitly recognizes the adversarial relationship that exists in a regulated market economy between the owner/operator of a waste management facility and the government regulatory agency under whose terms the facility must be licensed. The risk-cost-benefit analysis is set up from the perspective of the owner/operator. It can be used directly by the owner/operator to assess alternative design strategies. It can also be used by the regulatory agency to assess alternative regulatory policy, but only in an indirect manner, by examining the response of an owner/operator to the stimuli of various policies. The objective function is couched in terms of a discounted stream of benefits, costs, and risks over an engineering time horizon. Benefits are in the form of revenues for services provided; costs are those of construction and operation of the facility. Risk is defined as the cost associated with the probability of failure, with failure defined as the occurrence of a groundwater contamination event that violates the licensing requirements established for the facility. Failure requires a breach of the containment structure and contaminant migration through the hydrogeological environment to a compliance surface. The probability of failure can be estimated on the basis of reliability theory for the breach of containment and with a Monte-Carlo finite-element simulation for the advective contaminant transport. In the hydrogeological environment the hydraulic conductivity values are defined stochastically. The probability of failure is reduced by the presence of a monitoring network operated by the owner/operator and located between the source and the regulatory compliance surface. The level of reduction in the probability of failure depends on the probability of detection of the monitoring network, which can be calculated from the stochastic contaminant transport simulations. While
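
    The owner/operator objective described above is, in essence, a discounted stream of benefits minus costs minus risk, with risk taken as the probability of failure times the cost incurred if failure occurs. The following minimal sketch (with purely illustrative numbers and variable names, not taken from the paper) shows that structure:

```python
import numpy as np

def npv_objective(benefits, costs, p_fail, cost_of_failure, discount_rate):
    """Discounted stream of benefits minus costs minus risk.

    Risk in year t is taken here as the probability of failure in year t
    times the cost incurred if failure occurs (an illustrative reading of
    the abstract above, not the paper's exact formulation).
    """
    benefits = np.asarray(benefits, dtype=float)
    costs = np.asarray(costs, dtype=float)
    p_fail = np.asarray(p_fail, dtype=float)
    years = np.arange(len(benefits))
    risk = p_fail * cost_of_failure
    return float(np.sum((benefits - costs - risk) / (1.0 + discount_rate) ** years))

# Illustrative 20-year horizon: constant revenue, an up-front construction cost,
# and a small annual probability of a contamination (failure) event.
horizon = 20
benefits = np.full(horizon, 2.0e6)                    # revenue for services, $/yr
costs = np.r_[10.0e6, np.full(horizon - 1, 0.5e6)]    # construction, then O&M
p_fail = np.full(horizon, 0.01)                       # annual failure probability
print(npv_objective(benefits, costs, p_fail, cost_of_failure=50.0e6, discount_rate=0.08))
```

    Comparing this value across alternative designs (for example, a thicker liner or a denser monitoring network that lowers the failure probability at higher cost) is the kind of trade-off the framework formalizes.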

  8. The Effect of Forest Management Strategy on Carbon Storage and Revenue in Western Washington: A Probabilistic Simulation of Tradeoffs.

    PubMed

    Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J

    2017-01-01

    The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO2 e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation.
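
    The break-even reasoning can be illustrated with a stripped-down Monte Carlo comparison; all distributions and dollar figures below are invented placeholders, not the FVS-based inputs or American Carbon Registry parameters used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000

# Hypothetical per-hectare quantities for a no-harvest (IFM) project versus
# business-as-usual timber management; every number is illustrative only.
carbon_credits = rng.normal(4.0, 1.0, n_sims).clip(min=0)   # Mg CO2e/ha/yr credited
timber_npv = rng.normal(6000.0, 1500.0, n_sims)             # $/ha, timber management
holding_cost = 2500.0                                        # $/ha, PV of land-holding costs
years, discount = 40, 0.05
annuity = (1 - (1 + discount) ** -years) / discount          # PV factor for an annual stream

def prob_carbon_beats_timber(price):
    """Fraction of simulations in which the carbon project out-earns timber."""
    carbon_npv = price * carbon_credits * annuity - holding_cost
    return float(np.mean(carbon_npv > timber_npv))

for price in (5, 10, 15, 20, 25):                            # $/Mg CO2e
    print(price, round(prob_carbon_beats_timber(price), 3))
```

    Scanning the carbon price until the probability crosses 0.5 gives a break-even price analogous to the study's $14.17/Mg CO2e figure, while the spread of simulated outcomes conveys the financial risk the abstract emphasizes.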

  9. Network analysis of swine shipments in Ontario, Canada, to support disease spread modelling and risk-based disease management.

    PubMed

    Dorjee, S; Revie, C W; Poljak, Z; McNab, W B; Sanchez, J

    2013-10-01

    Understanding contact networks is important for modelling and managing the spread and control of communicable diseases in populations. This study characterizes the swine shipment network of a multi-site production system in southwestern Ontario, Canada. Data were extracted from a company's database listing swine shipments among 251 swine farms, including 20 sow, 69 nursery and 162 finishing farms, for the 2-year period of 2006 to 2007. Several network metrics were generated. The number of shipments per week between pairs of farms ranged from 1 to 6. The medians (and ranges) of out-degree were: sow 6 (1-21), nursery 8 (0-25), and finishing 0 (0-4), over the entire 2-year study period. Corresponding estimates for in-degree of nursery and finishing farms were 3 (0-9) and 3 (0-12) respectively. Outgoing and incoming infection chains (OIC and IIC) were also measured. The medians (ranges) of the monthly OIC and IIC were 0 (0-8) and 0 (0-6), respectively, with very similar measures observed for 2-week intervals. Nursery farms exhibited high measures of centrality. This indicates that they pose greater risks of disease spread in the network. Therefore, they should be given a high priority for disease prevention and control measures affecting all age groups alike. The network demonstrated scale-free and small-world topologies as observed in other livestock shipment studies. This heterogeneity in contacts among farm types and network topologies should be incorporated in simulation models to improve their validity. In conclusion, this study provided useful epidemiological information and parameters for the control and modelling of disease spread among swine farms, for the first time from Ontario, Canada.
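
    Degree and centrality measures of the kind reported here are straightforward to compute with networkx; the toy shipment list below is invented, not the Ontario data:

```python
import networkx as nx

# Toy shipment records (origin farm -> destination farm); the study used two
# years of company shipment data among 251 farms, not reproduced here.
shipments = [
    ("sow_1", "nursery_1"), ("sow_1", "nursery_2"), ("sow_2", "nursery_1"),
    ("nursery_1", "finish_1"), ("nursery_1", "finish_2"),
    ("nursery_2", "finish_1"), ("nursery_2", "finish_3"),
]

G = nx.DiGraph()
G.add_edges_from(shipments)

# Out-degree and in-degree per farm (number of distinct trading partners).
print(dict(G.out_degree()))
print(dict(G.in_degree()))

# Betweenness centrality flags farms (here, the nurseries) that sit on many
# shipment paths and therefore pose a greater disease-spread risk.
print(nx.betweenness_centrality(G))
```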

  10. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25 km over Europe, are being fed into hydrological models of varying complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model (GCM) simulations, downscaled at 25 km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with boundary forcings such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its
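
    The event-extraction step might look like the sketch below, which pulls runs of consecutive dry months out of a precipitation series; the synthetic gamma-distributed series and the 20th-percentile threshold are placeholders, not the weather@home output or the MaRIUS definition of drought:

```python
import numpy as np

rng = np.random.default_rng(7)
# Stand-in for one downscaled model run: monthly precipitation (mm), 100 years.
precip = rng.gamma(shape=2.0, scale=30.0, size=1200)
threshold = np.percentile(precip, 20)   # "dry month" = below the 20th percentile

def extract_droughts(series, threshold, min_months=3):
    """Return (start_index, length, total_deficit) for dry spells of at least
    `min_months` consecutive months below `threshold`."""
    events, start = [], None
    for i, value in enumerate(series):
        if value < threshold and start is None:
            start = i
        elif value >= threshold and start is not None:
            if i - start >= min_months:
                events.append((start, i - start, float(np.sum(threshold - series[start:i]))))
            start = None
    if start is not None and len(series) - start >= min_months:
        events.append((start, len(series) - start, float(np.sum(threshold - series[start:]))))
    return events

events = extract_droughts(precip, threshold)
print(len(events), "drought events; longest:", max(e[1] for e in events), "months")
```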

  11. Effectiveness of chemical amendments for stabilisation of lead and antimony in risk-based land management of soils of shooting ranges.

    PubMed

    Sanderson, Peter; Naidu, Ravi; Bolan, Nanthi

    2015-06-01

    This study aims to examine the effectiveness of amendments for risk-based land management of shooting range soils and to explore the effectiveness of amendments applied to sites with differing soil physicochemical parameters. A series of amendments with differing mechanisms for stabilisation were applied to four shooting range soils and aged for 1 year. Chemical stabilisation was monitored by pore water extraction, the toxicity characteristic leaching procedure (TCLP) and the physiologically based extraction test (PBET) over 1 year. The performance of amendments when applied in conditions reflecting field application did not match the performance in the batch studies. Pore water-extractable metals were not greatly affected by amendment addition. TCLP-extractable Pb was reduced significantly by amendments, particularly lime and magnesium oxide. Antimony leaching was reduced by red mud but mobilised by some of the other amendments. Bioaccessible Pb measured by PBET shows that bioaccessible Pb increased with time after an initial decrease due to the presence of metallic fragments in the soil. Amendments were able to reduce bioaccessible Pb by up to 50%. Bioaccessible Sb was not readily reduced by soil amendments. Soil amendments were not equally effective across the four soils.

  12. State Assistance with Risk-Based Data Management: Inventory and needs assessment of 25 state Class II Underground Injection Control programs. Phase 1

    SciTech Connect

    Not Available

    1992-07-01

    As discussed in Section I of the attached report, state agencies must decide where to direct their limited resources in an effort to make optimum use of their available manpower and address those areas that pose the greatest risk to valuable drinking water sources. The Underground Injection Practices Research Foundation (UIPRF) proposed a risk-based data management system (RBDMS) to provide states with the information they need to effectively utilize staff resources, provide dependable documentation to justify program planning, and enhance environmental protection capabilities. The UIPRF structured its approach regarding environmental risk management to include data and information from production, injection, and inactive wells in its RBDMS project. Data from each of these well types are critical to the complete statistical evaluation of environmental risk and selected automated functions. This comprehensive approach allows state Underground Injection Control (UIC) programs to effectively evaluate the risk of contaminating underground sources of drinking water, while alleviating the additional work and associated problems that often arise when separate databases are used. CH2M Hill and Digital Design Group, through a DOE grant to the UIPRF, completed an inventory and needs assessment of 25 state Class II UIC programs. The states selected for participation by the UIPRF were generally chosen based on interest and whether an active Class II injection well program was in place. The inventory and needs assessment provided an effective means of collecting and analyzing the interest, commitment, design requirements, utilization, and potential benefits of implementing a RBDMS in individual state UIC programs. Personal contacts were made with representatives from each state to discuss the applicability of a RBDMS in their respective state.

  13. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    SciTech Connect

    Teixeira, F; Almeida, C de; Huq, M

    2015-06-15

    Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and apply the FMEA technique to evaluate similarities and differences, if any, of the hazards and risks associated with these processes. Methods: A team, consisting of professionals from different disciplines and involved in the SRS treatment, was formed at each center. Each team was responsible for the development of the process map and performance of FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG100-recommended scales were used for the evaluation of hazard and severity for each step of the major process “treatment planning”. Results: The hazard index, given by the Risk Priority Number (RPN), was found to range from 4 to 270 for the various processes, and the severity (S) index was found to range from 1 to 10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety-improvement interventions. The number of steps with RPN ≥ 100 was found to be 6, 59 and 45 for the three centers. The corresponding numbers of steps with S ≥ 7 were 24, 21 and 25, respectively. The ranges of RPN and S values for each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety are different for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely be a model for implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where currently there are 28 radiotherapy centers performing SRS. A complete FMEA for SRS for these three radiotherapy centers is currently under development.
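
    The TG100-style scoring behind these numbers multiplies occurrence (O), severity (S) and lack of detectability (D), each on a 1-10 scale, into the RPN; a minimal sketch with invented failure modes (not those of the three Brazilian centers) is:

```python
# Minimal FMEA scoring sketch following the TG100 convention of 1-10 scales for
# occurrence (O), severity (S) and lack of detectability (D).
failure_modes = [
    # (process step, failure mode, O, S, D) -- illustrative entries only
    ("treatment planning", "wrong CT dataset imported",           2, 9, 6),
    ("treatment planning", "incorrect prescription dose entered", 3, 10, 4),
    ("treatment planning", "collimator size transcribed wrongly", 4, 7, 5),
]

scored = [(step, mode, o * s * d, s) for step, mode, o, s, d in failure_modes]

# Flag steps for safety-improvement interventions, mirroring the thresholds above.
high_rpn = [row for row in scored if row[2] >= 100]
high_severity = [row for row in scored if row[3] >= 7]

for step, mode, rpn, s in sorted(scored, key=lambda r: -r[2]):
    print(f"{mode:35s} RPN={rpn:3d} S={s}")
print(len(high_rpn), "failure modes with RPN >= 100;", len(high_severity), "with S >= 7")
```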

  14. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology has two stages: a fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decisions of the airline to varying degrees.

  15. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decisionmakers to determine the feasibility and return-on-investment of a new aircraft engine.

  16. The future of host cell protein (HCP) identification during process development and manufacturing linked to a risk-based management for their control.

    PubMed

    Bracewell, Daniel G; Francis, Richard; Smales, C Mark

    2015-09-01

    The use of biological systems to synthesize complex therapeutic products has been a remarkable success. However, during product development, great attention must be devoted to defining acceptable levels of impurities that derive from that biological system; at the head of this list are host cell proteins (HCPs). Recent advances in proteomic analytics have shown how diverse this class of impurities is; as such knowledge and capability grow, inevitable questions have arisen about how thorough current approaches to measuring HCPs are. The fundamental issue is how to adequately measure (and in turn monitor and control) such a large number of protein species (potentially thousands of components) to ensure safe and efficacious products. A rather elegant solution is to use an immunoassay (enzyme-linked immunosorbent assay [ELISA]) based on polyclonal antibodies raised to the host cell (biological system) used to synthesize a particular therapeutic product. However, the measurement is entirely dependent on the antibody serum used, which dictates the sensitivity of the assay and the degree of coverage of the HCP spectrum. It provides one summed analog value for HCP amount; a positive if all HCP components can be considered equal, a negative in the more likely event that one associates greater risk with certain components of the HCP proteome. In a thorough risk-based approach, one would wish to be able to account for this. These issues have led to the investigation of orthogonal analytical methods, most prominently mass spectrometry. These techniques can potentially both identify and quantify HCPs. The ability to measure and monitor thousands of proteins proportionally increases the amount of data acquired. Significant benefits exist if the information can be used to determine critical HCPs and thereby create an improved basis for risk management. We describe a nascent approach to risk assessment of HCPs based upon such data, drawing attention to timeliness in relation to biosimilar

  17. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  18. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides for nuclear power plants that have been subjected to a probabilistic risk assessment (PRA) have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  19. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  20. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  1. A generic probabilistic framework for structural health prognostics and uncertainty management

    NASA Astrophysics Data System (ADS)

    Wang, Pingfeng; Youn, Byeng D.; Hu, Chao

    2012-04-01

    Structural health prognostics can be broadly applied to various engineered artifacts in an engineered system. However, techniques and methodologies for health prognostics become application-specific. This study thus aims at formulating a generic framework of structural health prognostics, which is composed of four core elements: (i) a generic health index system with a synthesized health index (SHI), (ii) a generic offline learning scheme using the sparse Bayes learning (SBL) technique, (iii) a generic online prediction scheme using similarity-based interpolation (SBI), and (iv) an uncertainty propagation map for prognostic uncertainty management. The SHI enables the use of heterogeneous sensory signals; the sparseness feature employing only a few neighboring kernel functions enables the real-time prediction of remaining useful lives (RULs) regardless of data size; the SBI predicts the RULs with the background health knowledge obtained under uncertain manufacturing and operation conditions; and the uncertainty propagation map enables the predicted RULs to be loaded with their statistical characteristics. The proposed generic framework of structural health prognostics is thus applicable to different engineered systems, and its effectiveness is demonstrated with two case studies.
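
    A bare-bones version of the similarity-based interpolation idea, with synthetic run-to-failure trajectories standing in for real sensory data (the SHI construction and the SBL step are omitted, and the weighting scheme below is an assumption for illustration), could look like:

```python
import numpy as np

def similarity_based_rul(train_trajs, train_failure_times, test_traj, eps=1e-6):
    """Weight each training unit's remaining life by how closely its early
    health-index trajectory matches the test unit's observed trajectory."""
    t_now = len(test_traj)
    test = np.asarray(test_traj, dtype=float)
    weights, candidates = [], []
    for traj, t_fail in zip(train_trajs, train_failure_times):
        ref = np.asarray(traj[:t_now], dtype=float)
        dist = np.linalg.norm(ref - test)
        weights.append(1.0 / (dist ** 2 + eps))      # closer match -> larger weight
        candidates.append(t_fail - t_now)            # that unit's implied RUL
    return float(np.average(candidates, weights=np.asarray(weights)))

# Synthetic degradation histories (health index per cycle) for three
# run-to-failure training units and one in-service test unit.
rng = np.random.default_rng(1)
train_failure_times = [60, 80, 100]
train_trajs = [1.0 - np.linspace(0, 1, t) ** 1.5 + rng.normal(0, 0.02, t)
               for t in train_failure_times]
test_traj = 1.0 - np.linspace(0, 0.5, 40) ** 1.5 + rng.normal(0, 0.02, 40)
print("predicted RUL (cycles):", similarity_based_rul(train_trajs, train_failure_times, test_traj))
```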

  2. A risk-based focused decision-management approach for justifying characterization of Hanford tank waste. June 1996, Revision 1; April 1997, Revision 2

    SciTech Connect

    Colson, S.D.; Gephart, R.E.; Hunter, V.L.; Janata, J.; Morgan, L.G.

    1997-12-31

    This report describes a disciplined, risk-based decision-making approach for determining characterization needs and resolving safety issues during the storage and remediation of radioactive waste stored in Hanford tanks. The strategy recommended uses interactive problem evaluation and decision analysis methods commonly used in industry to solve problems under conditions of uncertainty (i.e., lack of perfect knowledge). It acknowledges that problem resolution comes through both the application of high-quality science and human decisions based upon preferences and sometimes hard-to-compare choices. It recognizes that to firmly resolve a safety problem, the controlling waste characteristics and chemical phenomena must be measurable or estimated to an acceptable level of confidence tailored to the decision being made.

  3. DEVELOPMENT OF PROTOCOLS AND DECISION SUPPORT TOOLS FOR ASSESSING WATERSHED SYSTEM ASSIMILATIVE CAPACITY (SAC), IN SUPPORT OF RISK-BASED ECOSYSTEM MANAGEMENT/RESTORATION PRACTICES

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. This program is one component of the Office of Research and Development Ecosystem Protection Research Program. As part of this...

  4. Probabilistic Plan Management

    DTIC Science & Technology

    2009-11-17

    Abstract not provided; the retrieved text contains only table-of-contents fragments (Strategies; Experimental Results; Comparison of Strengthening Strategies; Effects of Global Strengthening) and a figure caption noting that the baseline strengthening strategy explores the full search space of orderings of backfill, swapping and pruning steps.

  5. A general risk-based adaptive management scheme incorporating the Bayesian Network Relative Risk Model with the South River, Virginia, as case study.

    PubMed

    Landis, Wayne G; Markiewicz, April J; Ayre, Kim K; Johns, Annie F; Harris, Meagan J; Stinson, Jonah M; Summers, Heather M

    2017-01-01

    Adaptive management has been presented as a method for the remediation, restoration, and protection of ecological systems. Recent reviews have found that the implementation of adaptive management has been unsuccessful in many instances. We present a modification of the model first formulated by Wyant and colleagues that puts ecological risk assessment into a central role in the adaptive management process. This construction has 3 overarching segments. Public engagement and governance determine the goals of society by identifying endpoints and specifying constraints such as costs. The research, engineering, risk assessment, and management section contains the decision loop estimating risk, evaluating options, specifying the monitoring program, and incorporating the data to re-evaluate risk. The 3rd component is the recognition that risk and public engagement can be altered by various externalities such as climate change, economics, technological developments, and population growth. We use the South River, Virginia, USA, study area and our previous research to illustrate each of these components. In our example, we use the Bayesian Network Relative Risk Model to estimate risks, evaluate remediation options, and provide lists of monitoring priorities. The research, engineering, risk assessment, and management loop also provides a structure in which data and the records of what worked and what did not, the learning process, can be stored. The learning process is a central part of adaptive management. We conclude that risk assessment can and should become an integral part of the adaptive management process. Integr Environ Assess Manag 2017;13:115-126. © 2016 SETAC.

  6. On the use of hierarchical probabilistic models for characterizing and managing uncertainty in risk/safety assessment.

    PubMed

    Kodell, Ralph L; Chen, James J

    2007-04-01

    A general probabilistically-based approach is proposed for both cancer and noncancer risk/safety assessments. The familiar framework of the original ADI/RfD formulation is used, substituting in the numerator a benchmark dose derived from a hierarchical pharmacokinetic/pharmacodynamic model and in the denominator a unitary uncertainty factor derived from a hierarchical animal/average human/sensitive human model. The empirical probability distributions of the numerator and denominator can be combined to produce an empirical human-equivalent distribution for an animal-derived benchmark dose in external-exposure units.
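
    The ratio construction can be mimicked with a simple Monte Carlo sketch; the lognormal shapes and parameters below are illustrative stand-ins for the hierarchical model outputs, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Stand-ins for the two empirical distributions described above: a benchmark
# dose (numerator) and a unitary uncertainty factor (denominator).
bmd = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)   # mg/kg-day, illustrative
uf = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)    # animal -> sensitive human, illustrative

# Empirical human-equivalent distribution for the animal-derived benchmark dose.
human_equivalent = bmd / uf

# A low percentile of this distribution can play the role of an RfD-like value.
print("median:", round(float(np.median(human_equivalent)), 3))
print("5th percentile:", round(float(np.percentile(human_equivalent, 5)), 3))
```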

  7. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  8. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, explore the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  9. Overview of the co-ordinated risk-based approach to science and management response and recovery for the 2012 eruptions of Tongariro volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, G. E.; Keys, H. J. R.; Procter, J. N.; Deligne, N. I.

    2014-10-01

    Tongariro volcano, New Zealand, lies wholly within the Tongariro National Park (TNP), one of New Zealand's major tourist destinations. Two small eruptions of the Te Maari vents on the northern flanks of Tongariro on 6 August 2012 and 21 November 2012 each produced a small ash cloud to < 8 km height accompanied by pyroclastic density currents and ballistic projectiles. The most popular day hike in New Zealand, the Tongariro Alpine Crossing (TAC), runs within 2 km of the Te Maari vents. The larger of the two eruptions (6 August 2012) severely impacted the TAC and resulted in its closure, impacting the local economy and potentially influencing national tourism. In this paper, we document the science and risk management response to the eruption, and detail how quantitative risk assessments were applied in a rapidly evolving situation to inform robust decision-making about when the TAC would be re-opened. The volcanologist and risk manager partnership highlights the value of open communication between scientists and stakeholders during a response to, and subsequent recovery from, a volcanic eruption.

  10. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    SciTech Connect

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.

  11. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subjected to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have developed since 1988 a database built by the field teams, which specifically identifies all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, those of the emergency services and the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two databases is not negligible and is a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolation by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration and height of precipitated water). No clear relationship between the number of floods and rainfall characteristics appears here, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, floods observed in the database are compared spatially, using a GIS, with flooding simulated from the sewer network model (using the software Canoe). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flood risk.
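
    The kriging step, interpolating gauge rainfall to flood locations, can be sketched compactly; the gauge coordinates, rainfall depths and variogram parameters below are invented, and in practice the variogram would be fitted to the Lyon gauge data:

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, a=5.0):
    """Spherical semivariogram with range a."""
    h = np.asarray(h, dtype=float)
    gamma = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, gamma, nugget + sill)

def ordinary_kriging(xy_obs, z_obs, xy_new, **vario):
    """Ordinary kriging estimate at xy_new from gauges (xy_obs, z_obs)."""
    n = len(z_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))                 # kriging system with Lagrange multiplier
    A[:n, :n] = spherical(d_obs, **vario)
    A[n, n] = 0.0
    d_new = np.linalg.norm(xy_obs - xy_new, axis=-1)
    b = np.append(spherical(d_new, **vario), 1.0)
    weights = np.linalg.solve(A, b)[:n]
    return float(weights @ z_obs)

# Illustrative gauge network (km coordinates) and one storm's rainfall depths (mm).
gauges = np.array([[0, 0], [4, 1], [1, 5], [6, 6], [3, 3]], dtype=float)
rain = np.array([12.0, 20.0, 8.0, 25.0, 15.0])
print(ordinary_kriging(gauges, rain, np.array([2.0, 2.0]), sill=40.0, a=8.0))
```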

  12. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also tells us, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
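
    The PFTA ingredient reduces, in its simplest form, to propagating basic-event probabilities through AND/OR gates under an independence assumption; the tiny tree and probabilities below are illustrative, not taken from the paper:

```python
# Minimal probabilistic fault tree evaluation assuming independent basic events.
# Illustrative tree: a release requires pump failure AND (valve failure OR operator error).

def p_and(*probs):
    """Probability that all independent events occur."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """Probability that at least one independent event occurs."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

p_pump = 1e-2        # basic-event probabilities per demand (illustrative)
p_valve = 5e-3
p_operator = 1e-3

p_top = p_and(p_pump, p_or(p_valve, p_operator))
print(f"top event probability: {p_top:.2e}")
```

    Re-evaluating the top-event probability after each proposed safety measure (e.g., lowering p_valve with an added interlock) gives the kind of measure-by-measure risk reduction the abstract describes.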

  13. Assessment of environmental risks from toxic and nontoxic stressors; a proposed concept for a risk-based management tool for offshore drilling discharges.

    PubMed

    Smit, Mathijs G D; Jak, Robbert G; Rye, Henrik; Frost, Tone Karin; Singsaas, Ivar; Karman, Chris C

    2008-04-01

    In order to improve the ecological status of aquatic systems, both toxic (e.g., chemical) and nontoxic stressors (e.g., suspended particles) should be evaluated. This paper describes an approach to environmental risk assessment of drilling discharges to the sea. These discharges might lead to concentrations of toxic compounds and suspended clay particles in the water compartment and concentrations of toxic compounds, burial of biota, change in sediment structure, and oxygen depletion in marine sediments. The main challenges were to apply existing protocols for environmental risk assessment to nontoxic stressors and to combine risks arising from exposure to these stressors with risk from chemical exposure. The defined approach is based on species sensitivity distributions (SSDs). In addition, precautionary principles from the EU-Technical Guidance Document were incorporated to assure that the method is acceptable in a regulatory context. For all stressors a protocol was defined to construct an SSD for no observed effect concentrations (or levels; NOEC(L)-SSD) to allow for the calculation of the potentially affected fraction of species from predicted exposures. Depending on the availability of data, a NOEC-SSD for toxicants can either be directly based on available NOECs or constructed from the predicted no effect concentration and the variation in sensitivity among species. For nontoxic stressors a NOEL-SSD can be extrapolated from an SSD based on effect or field data. Potentially affected fractions of species at predicted exposures are combined into an overall risk estimate. The developed approach facilitates environmental management of drilling discharges and can be applied to define risk-mitigating measures for both toxic and nontoxic stress.
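
    A minimal sketch of the SSD-based risk calculation fits a lognormal SSD to no-effect data and combines stressor-specific potentially affected fractions; the NOEC/NOEL values and the independence-based aggregation rule below are illustrative, not the paper's exact protocol:

```python
import numpy as np
from scipy import stats

def paf_from_noecs(noecs, exposure):
    """Potentially affected fraction of species at a given exposure, using a
    lognormal SSD fitted (by moments) to the log10 of the no-effect data."""
    log_noecs = np.log10(noecs)
    mu, sigma = log_noecs.mean(), log_noecs.std(ddof=1)
    return float(stats.norm.cdf(np.log10(exposure), loc=mu, scale=sigma))

# Illustrative NOECs (mg/L) for a toxicant and NOELs (g/L suspended clay)
# for a nontoxic stressor; all values are made up for the example.
noec_toxicant = np.array([0.5, 1.2, 3.0, 0.8, 2.5, 5.0, 1.5])
noel_clay = np.array([0.05, 0.2, 0.1, 0.4, 0.08, 0.3])

paf_tox = paf_from_noecs(noec_toxicant, exposure=0.6)
paf_clay = paf_from_noecs(noel_clay, exposure=0.07)

# One way to aggregate stressor-specific risks, assuming independent modes of action.
overall = 1.0 - (1.0 - paf_tox) * (1.0 - paf_clay)
print(round(paf_tox, 3), round(paf_clay, 3), round(overall, 3))
```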

  14. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and with this added dimension decision making becomes slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that defines whether or not to issue a warning or implement mitigation measures. With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it is motivated only by economic values and is relatively static (a yes/no decision with no further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations and responses were analysed, and possible applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and finally the decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
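
    The cost-loss rule itself is a one-liner; the numbers below are illustrative:

```python
def issue_warning(p_flood, response_cost, preventable_loss):
    """Cost-loss rule: act when the cost-loss ratio C/L does not exceed
    the forecast probability of the event."""
    return (response_cost / preventable_loss) <= p_flood

# Illustrative numbers: a 100k response that would prevent 1M of damage
# becomes worthwhile once the forecast probability reaches 10%.
for p in (0.05, 0.10, 0.30):
    print(p, issue_warning(p, response_cost=100_000, preventable_loss=1_000_000))
```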

  15. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  16. Risk based management of invading plant disease

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Effective control of new and emerging plant disease remains a key challenge. Attempts to eradicate pathogens often involve removal of all plants within a fixed distance of detected infected hosts, targeting asymptomatic infection. Here we develop and test potentially more efficient, epidemiologicall...

  17. Risk Based Security Management at Research Reactors

    SciTech Connect

    Ek, David R.

    2015-09-01

    This presentation provides a background of what led to the international emphasis on nuclear security and describes how nuclear security is effectively implemented so as to preserve the societal benefits of nuclear and radioactive materials.

  18. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology.

  19. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs to the society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage seeking for cost-effective preventive measures, and emphasize the importance of the inclusion of the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills as well as to other fields that need to cope with the challenging combination of low probabilities, high losses and major uncertainties.

  20. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  1. The Evidence for a Risk-Based Approach to Australian Higher Education Regulation and Quality Assurance

    ERIC Educational Resources Information Center

    Edwards, Fleur

    2012-01-01

    This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…

  2. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    12 CFR 932.3 (Banks and Banking, Federal Housing Finance Board, Federal Home Loan Bank Risk Management): a Bank's risk-based capital requirement comprises its credit risk capital requirement, its market risk capital requirement, and its operations risk capital requirement.

  3. Risk Based Inspection Pilot Study of Ignalina Nuclear Power Plant,Unit 2

    SciTech Connect

    Brickstad, Bjorn; Letzter, Adam; Klimasauskas, Arturas; Alzbutas, Robertas; Nedzinskas, Linas; Kopustinskas, Vytis

    2002-07-01

    A project with the acronym IRBIS (Ignalina Risk Based Inspection pilot Study) has been performed with the objective to perform a quantitative risk analysis of a total of 1240 stainless steel welds in Ignalina Nuclear Power Plant, unit 2 (INPP-2). The damage mechanism is IGSCC and the failure probabilities are quantified by using probabilistic fracture mechanics. The conditional core damage probabilities are taken from the plant PSA. (authors)

  4. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application, probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.

  5. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
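
    A toy version of such a linear program, with invented per-hectare benefits, expected-damage rates and a single zoning constraint (nothing here reproduces the Baja Sur formulation), can be written with scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: hectares allocated to [residential, commercial, flood channel].
# Objective coefficients are net annualized benefits per hectare after subtracting
# expected flood damage (probability x damage); all numbers are illustrative.
p_flood = 0.05
net_benefit = np.array([
    40_000 - p_flood * 200_000,   # residential
    60_000 - p_flood * 300_000,   # commercial
    -5_000 + p_flood * 100_000,   # channel: maintenance cost, but damage avoided
])

total_area = 500.0                # hectares available

# Constraints (A_ub x <= b_ub):
#   total allocation cannot exceed the available land;
#   zoning: channel area must be at least 20% of the developed area.
A_ub = [
    [1.0, 1.0, 1.0],
    [0.2, 0.2, -1.0],
]
b_ub = [total_area, 0.0]

res = linprog(-net_benefit, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 3, method="highs")
print("allocation (ha):", np.round(res.x, 1), "net benefit:", round(-res.fun, 0))
```

    Varying p_flood or the zoning coefficient shows how flexible zoning and landscape-based flood channels shift the optimal allocation, which is the trade-off the abstract highlights.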

  6. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  7. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  8. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    ...safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the...

  9. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.
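
    The PREDICT process itself involves formal expert elicitation and many information sources; a minimal, hedged illustration of the underlying idea is a conjugate Beta-Binomial update, in which an expert-elicited reliability estimate acts as the prior and sparse test data update it. All numbers below are invented.

```python
# Minimal sketch of combining an expert-elicited reliability estimate with sparse
# test data via a conjugate Beta-Binomial update. This only illustrates the
# information-integration idea, not the PREDICT methodology itself.
from scipy import stats

# Expert judgment: "reliability is about 0.95, and I'd weight that like 20 trials."
expert_mean, expert_weight = 0.95, 20.0
alpha0 = expert_mean * expert_weight            # prior "successes"
beta0 = (1.0 - expert_mean) * expert_weight     # prior "failures"

# Sparse test data: 8 tests, 8 successes.
successes, tests = 8, 8

alpha_post = alpha0 + successes
beta_post = beta0 + (tests - successes)
posterior = stats.beta(alpha_post, beta_post)

print(f"Posterior mean reliability: {posterior.mean():.3f}")
print(f"90% credible interval: {posterior.ppf(0.05):.3f} - {posterior.ppf(0.95):.3f}")
# As the design changes or more tests arrive, the posterior becomes the new prior,
# which is how an estimate can be 'tracked' over a product's development.
```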

  10. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
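
    As a hedged, minimal illustration of a numerical routine that returns an uncertainty over its own answer, the sketch below performs Monte Carlo integration and reports a standard-error estimate alongside the integral; the paper itself develops much richer machinery (e.g. Bayesian quadrature and probabilistic differential equation solvers).

```python
# Smallest possible example of a numerical routine that returns an uncertainty:
# Monte Carlo integration of f on [0, 1] with a standard-error estimate.
import numpy as np

def mc_integrate(f, n, rng):
    x = rng.random(n)
    fx = f(x)
    estimate = fx.mean()
    std_error = fx.std(ddof=1) / np.sqrt(n)   # uncertainty from the finite sample size
    return estimate, std_error

rng = np.random.default_rng(42)
est, err = mc_integrate(lambda x: np.exp(-x * x), n=10_000, rng=rng)
print(f"integral of exp(-x^2) on [0,1] ~ {est:.4f} +/- {err:.4f} (exact ~ 0.7468)")
```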

  11. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  12. Risk-based inservice testing program modifications at Palo Verde nuclear generating station

    SciTech Connect

    Knauf, S.; Lindenlaub, B.; Linthicum, R.

    1996-12-01

    Arizona Public Service Company (APS) is investigating changes to the Palo Verde Inservice Testing (IST) Program that are intended to result in the reduction of the required test frequency for various valves in the American Society of Mechanical Engineers (ASME) Section XI IST program. The analytical techniques employed to select candidate valves and to demonstrate that these frequency reductions are acceptable are risk based. The results of the Palo Verde probabilistic risk assessment (PRA), updated in June 1994, and the risk significance determination performed as part of the implementation efforts for 10 CFR 50.65 (the maintenance rule) were used to select candidate valves for extended test intervals. Additional component-level evaluations were conducted by an "expert panel." The decision to pursue these changes was facilitated by the ASME Risk-Based Inservice Testing Research Task Force, in which Palo Verde is participating as a pilot plant. The NRC's increasing acceptance of cost-beneficial licensing actions and risk-based submittals also provided incentive to seek these changes. Arizona Public Service is pursuing the risk-based IST program modification in order to reduce the unnecessary regulatory burden of the IST program through qualitative and quantitative analysis consistent with maintaining a high level of plant safety. The objectives of this project at Palo Verde are as follows: (1) Apply risk-based technologies to IST components to determine their risk significance (i.e., high or low). (2) Apply a combination of deterministic and risk-based methods to determine appropriate testing requirements for IST components, including improvement of testing methods and frequency intervals for high-risk-significant components. (3) Apply risk-based technologies to high-risk-significant components identified by the "expert panel" and outside of the IST program to determine whether additional testing requirements are appropriate.

  13. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and the fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  14. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  15. Risk based microbiological criteria for Campylobacter in broiler meat in the European Union.

    PubMed

    Nauta, Maarten J; Sanaa, Moez; Havelaar, Arie H

    2012-09-03

    Quantitative microbiological risk assessment (QMRA) allows evaluating the public health impact of food safety targets to support the control of foodborne pathogens. We estimate the risk reduction of setting microbiological criteria (MCs) for Campylobacter on broiler meat in 25 European countries, applying quantitative data from the 2008 EU baseline survey. We demonstrate that risk based MCs can be derived without explicit consideration of Food Safety Objectives or Performance Objectives. Published QMRA models for the consumer phase and dose response provide a relation between Campylobacter concentration on skin samples and the attending probability of illness for the consumer. Probabilistic modelling is used to evaluate a set of potential MCs. We present the percentage of batches not complying with the potential criteria, in relation to the risk reduction attending totally efficient treatment of these batches. We find different risk estimates and different impacts of MCs in different countries, which offers a practical and flexible tool for risk managers to select the most appropriate MC by weighing the costs (i.e. non-compliant batches) and the benefits (i.e. reduction in public health risk). Our analyses show that the estimated percentage of batches not complying with the MC is better correlated with the risk estimate than surrogate risk measures like the flock prevalence or the arithmetic mean concentration of bacteria on carcasses, and would therefore be a good measure for the risk of Campylobacter on broiler meat in a particular country. Two uncertain parameters in the model are the ratio of within- and between-flock variances in concentrations, and the transition factor of skin sample concentrations to concentrations on the meat. Sensitivity analyses show that these parameters have a considerable effect on our results, but the impact of their uncertainty is small compared to that of the parameters defining the Microbiological Criterion and the concentration
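
    The mechanics of screening a candidate MC can be sketched as follows: simulate between- and within-batch variation of concentrations, apply a hypothetical n-sample criterion, and compare the share of failing batches with a crude risk proxy. The distribution parameters, sampling plan, and risk proxy below are invented and are not the EU baseline-survey values or the published dose-response model.

```python
# Hedged sketch of screening a microbiological criterion (MC) with a QMRA-style
# risk proxy. All distribution parameters and the risk proxy are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_batches = 20_000

# Between-batch and within-batch variation of log10 CFU/g on skin samples.
batch_mean = rng.normal(2.0, 0.8, n_batches)                 # log10 CFU/g
def sample_units(n_units):                                   # n_units sample units per batch
    return rng.normal(batch_mean[:, None], 0.5, (n_batches, n_units))

# Candidate criterion: n = 5 units, c = 0 tolerated exceedances, limit m = 3.0 log10 CFU/g.
units = sample_units(5)
non_compliant = (units > 3.0).sum(axis=1) > 0

# Crude risk proxy: expected cases taken proportional to the batch's arithmetic concentration.
risk = 10.0 ** batch_mean
baseline = risk.sum()
residual = risk[~non_compliant].sum()     # assume non-compliant batches are fully treated

print(f"Batches failing the MC: {non_compliant.mean():.1%}")
print(f"Relative risk reduction if failing batches are treated: {1 - residual / baseline:.1%}")
```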

  16. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  17. A Probabilistic Model for Propagating Ungauged Basin Runoff Prediction Variability and Uncertainty Into Estuarine Water Quality Dynamics and Water Quality-Based Management Decisions

    NASA Astrophysics Data System (ADS)

    Anderson, R.; Gronewold, A.; Alameddine, I.; Reckhow, K.

    2008-12-01

    probabilistic modeling software program Analytica. This approach not only reflects uncertainty in parameter estimates but, by modeling the predicted daily runoff rate as a random variable, propagates that variability into the tidal prism model as well. The tidal prism model has the advantage of having only one hydrodynamic calibration parameter, the tidal exchange ratio (the ratio between the volume of water returning to an estuary on an incoming tide and the volume of water which exited the estuary on the previous outgoing tide). We estimate the tidal exchange ratio by calibrating the tidal prism model to salinity data using a Bayesian Markov chain Monte Carlo (MCMC) procedure and, as with other parameters, encode it as a random variable in the comprehensive model. We compare our results to those of a purely deterministic model, and find that intrinsic sources of variability in ungauged basin runoff predictions, when ignored, lead to pollutant concentration forecasts with unnecessarily large prediction intervals, and to potentially over-conservative management decisions. By demonstrating an innovative approach to capturing and explicitly acknowledging uncertainty in runoff model parameter estimates, our modeling approach serves as an ideal building block for future comprehensive model-based pollutant mitigation planning efforts in ungauged coastal watersheds, including those implemented through the US Environmental Protection Agency total maximum daily load program.
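
    The record's key calibration step, estimating the single tidal exchange ratio from salinity data by Bayesian MCMC, can be sketched with a plain Metropolis sampler. The steady-state mixing relation, priors, and synthetic observations below are invented for illustration and are not the authors' tidal prism model.

```python
# Toy Bayesian MCMC calibration of a single hydrodynamic parameter (the tidal
# exchange ratio b) to salinity observations, in the spirit of the record's
# tidal prism model. Mixing relation, data, and priors are all invented.
import numpy as np

rng = np.random.default_rng(7)

S_OCEAN, V_TIDAL, V_FRESH = 35.0, 5.0e6, 4.0e5   # psu, m^3 per tide, m^3 per tide

def predicted_salinity(b):
    # Steady-state dilution: returning tidal water (fraction b) mixes with freshwater.
    return S_OCEAN * b * V_TIDAL / (b * V_TIDAL + V_FRESH)

# Synthetic 'observations' generated from b = 0.35 with measurement noise.
obs = predicted_salinity(0.35) + rng.normal(0.0, 0.8, size=20)

def log_posterior(b, sigma=0.8):
    if not 0.0 < b < 1.0:                         # uniform(0, 1) prior on b
        return -np.inf
    resid = obs - predicted_salinity(b)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Plain Metropolis sampler.
samples, b = [], 0.5
logp = log_posterior(b)
for _ in range(20_000):
    b_new = b + rng.normal(0.0, 0.05)
    logp_new = log_posterior(b_new)
    if np.log(rng.random()) < logp_new - logp:
        b, logp = b_new, logp_new
    samples.append(b)

post = np.array(samples[5_000:])                  # drop burn-in
print(f"Posterior tidal exchange ratio: {post.mean():.3f} "
      f"(95% CI {np.percentile(post, 2.5):.3f}-{np.percentile(post, 97.5):.3f})")
```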

  18. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  19. Risk-based climate-change impact assessment for the water industry.

    PubMed

    Thorne, O M; Fenner, R A

    2009-01-01

    In response to a rapidly changing and highly variable climate, engineers are being asked to perform climate-change impact assessments on existing water industry systems. There is currently no single method of best practice for engineers to interpret output from global climate models (GCMs) and calculate probabilistic distributions of future climate changes as required for risk-based impact assessments. The simplified climate change impact assessment tool (SCIAT) has been developed to address the specific needs of the water industry and provides a tool to translate climate change projections into 'real world' impacts or for detailed statistical analysis. Through the use of SCIAT, water system operators are provided with knowledge of potential impacts and an associated probability of occurrence, enabling them to make informed, risk-based adaptation and planning decisions. This paper demonstrates the application of SCIAT to the consideration of the impacts of climate change on reservoir water quality under future climate scenarios.

  20. Probabilistic composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
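
    A hedged, minimal version of this kind of simulation samples constituent properties and the fiber volume ratio, propagates them through the longitudinal rule of mixtures, and inspects which inputs correlate most strongly with the predicted ply modulus (the regression idea mentioned in the record). The distributions below are illustrative, not the graphite/epoxy data of the study.

```python
# Minimal Monte Carlo micromechanics sketch: propagate uncertainty in constituent
# properties through the longitudinal rule of mixtures for a unidirectional ply,
# then check which input correlates most strongly with the response.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

Ef = rng.normal(230.0, 12.0, n)                      # fiber modulus, GPa
Em = rng.normal(3.5, 0.3, n)                         # matrix modulus, GPa
Vf = np.clip(rng.normal(0.60, 0.03, n), 0.0, 1.0)    # fiber volume ratio

E11 = Ef * Vf + Em * (1.0 - Vf)                      # longitudinal ply modulus (rule of mixtures)

print(f"E11 mean = {E11.mean():.1f} GPa, std = {E11.std():.1f} GPa")
for name, x in [("Ef", Ef), ("Em", Em), ("Vf", Vf)]:
    print(f"corr({name}, E11) = {np.corrcoef(x, E11)[0, 1]:.2f}")
```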

  1. Probabilistic Causation without Probability.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…

  2. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  3. Risk-based SMA for Cubesats

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  4. Tools for Risk-Based UXO Remediation

    DTIC Science & Technology

    2014-01-01

    we (i) performed a probabilistic risk assessment using polarizabilities and ground truth information from Camp San Luis Obispo, Camp Butner, and...actual depth distribution of the UXO recovered at San Luis Obispo and results of the synthetic seed study, we conclude that all of the UXO, at least...same detection scheme, for burial depths of up to 0.77m. Thus, the detection process applied to ESTCP’s Classification Study at San Luis Obispo, CA

  5. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to purely hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood-prone areas, and assets at risk in the protected areas is rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two-dimensional flood spreading simulation, and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as:
    • Location, timing and probability of failure of defined sections of the flood defence line;
    • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow;
    • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy.
    In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as:
    • Where should I reinforce my flood defence system?
    • What type of action can I take to mend a weak spot in my flood defences?
    • What are the consequences of a breach?
    • Which areas should I evacuate first?
    This presentation outlines the additional required workflows towards risk-based flood
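
    One ingredient FEWS-Risk adds, the probabilistic failure analysis of defence sections, can be sketched by combining an ensemble water-level forecast with a fragility curve for each dike section. The fragility parameters and forecast numbers below are hypothetical and are not taken from FEWS-Risk.

```python
# Hedged sketch of a 'probabilistic failure analysis' ingredient: combine an
# ensemble water-level forecast with a fragility curve (probability of breach
# given peak water level) for one dike section. All numbers are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Ensemble forecast of peak water level at a dike section (m above datum).
forecast_levels = rng.normal(4.6, 0.35, size=51)

# Hypothetical lognormal fragility curve: median capacity 5.0 m, log-std 0.08.
def p_breach_given_level(h, median=5.0, beta=0.08):
    return stats.norm.cdf(np.log(h / median) / beta)

# Total probability of breach = average of conditional probabilities over members.
p_breach = p_breach_given_level(forecast_levels).mean()
print(f"Forecast probability of breach at this section: {p_breach:.1%}")
```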

  6. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication. However, in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum as well as classical communicational steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  7. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design. These include low cycle, high cycle, combined for different cyclic loading conditions - for example, mechanical, thermal, erosion, etc. The traditional approach to evaluate fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly and needs to be repeated for designs in different operating conditions in general. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. Main features in this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. These generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to material point. Additional features include a multifactor interaction model that probabilistically describes material properties evolution, any changes due to various cyclic load and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  8. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk that can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  9. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.
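
    A hedged sketch of what a confidence-level reference environment means: build a Monte Carlo distribution of mission-integrated fluence from a Poisson number of solar-particle events and a lognormal event-fluence distribution, then read off the value not exceeded at the chosen confidence. The event rate and distribution parameters below are invented, not the model's fitted values.

```python
# Hedged sketch of a confidence-level reference environment: Monte Carlo mission
# fluence from a Poisson number of events and a lognormal per-event fluence.
import numpy as np

rng = np.random.default_rng(6)
n_missions = 50_000
event_rate_per_year, mission_years = 6.0, 3.0

n_events = rng.poisson(event_rate_per_year * mission_years, n_missions)

def mission_fluence(k):
    # >10 MeV proton fluence per event (cm^-2), lognormal with a wide spread (invented).
    return rng.lognormal(mean=np.log(3e7), sigma=1.8, size=k).sum()

fluences = np.array([mission_fluence(k) for k in n_events])
print(f"Mission-integrated fluence not exceeded at 95% confidence: "
      f"{np.percentile(fluences, 95):.2e} cm^-2")
```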

  10. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents was analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  11. Risk based tiered approach (RBTASM) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTASM) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTASM in pollution prevention alternative selection required higher-tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTASM for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTASM is considered along with product performance, regulatory, and cost information by risk managers when downselecting alternatives for implementation or further analysis.

  12. Risk based analysis: A rational approach to site cleanup

    SciTech Connect

    Arulanatham, R.; So, E.

    1994-12-31

    Soil and groundwater pollution in urban areas can often pose a threat to human health, water quality, or both. The cleanup of such soil and groundwater can be a very lengthy process and requires significant economic resources. The cleanup levels or requirements set by one agency sometimes do not match those required by another agency, especially for soil pollution. The involvement of several agencies at different times during the reclamation process has often diminished the cost-effectiveness of the reclamation efforts. In an attempt to minimize this kind of problem (which has been experienced by both authors), the staff of the Alameda County Department of Environmental Health and the Regional Water Quality Control Board, San Francisco Bay Region, have jointly developed workable guidelines to assist the responsible parties in deriving target cleanup goals that are protective of both human health (or other ecological receptors) and water quality. The following is a 6-step summary of the methodology to assist the responsible parties in properly managing their pollution problem. These guidelines include: (1) site characterization; (2) initial risk-based screening of contaminants; (3) derivation of health and/or ecological risk-based cleanup goals; (4) derivation of groundwater quality-based cleanup goals; (5) site cleanup goals and site remediation; and (6) risk management decisions.

  13. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
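
    The two ingredients the paper stresses, correlated cost elements and the MAIMS budget rule, can be sketched with a Gaussian copula over Weibull marginals and a roll-up in which underruns are not recovered. The distributions, correlation, and baseline budgets below are hypothetical, and the elicitation step is omitted.

```python
# Hedged sketch of two ingredients from the record: correlated cost elements and
# the MAIMS principle ("money allocated is money spent": underruns are lost while
# overruns are passed through). All distributions, correlations, and budgets are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 200_000
n_elems = 4

# Correlated uniforms via a Gaussian copula with a common correlation of 0.4.
rho = 0.4
cov = np.full((n_elems, n_elems), rho) + (1.0 - rho) * np.eye(n_elems)
z = rng.multivariate_normal(np.zeros(n_elems), cov, size=n)
u = stats.norm.cdf(z)

# Three-parameter Weibull marginals (minimum + scale/shape), costs in $M.
scales = np.array([2.0, 3.5, 1.5, 4.0])
shapes = np.array([1.8, 2.2, 1.5, 2.0])
minima = np.array([1.0, 2.0, 0.5, 3.0])
costs = minima + scales * (-np.log(1.0 - u)) ** (1.0 / shapes)   # inverse Weibull CDF

budgets = minima + scales * 0.9          # hypothetical allocated baseline budgets

naive_total = costs.sum(axis=1)                        # underruns offset overruns
maims_total = np.maximum(costs, budgets).sum(axis=1)   # underruns are not recovered

for name, total in [("naive", naive_total), ("MAIMS", maims_total)]:
    print(f"{name}: P50 = {np.percentile(total, 50):.1f} $M, "
          f"P80 = {np.percentile(total, 80):.1f} $M")
```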

  14. Risk-based regulation: A utility's perspective

    SciTech Connect

    Chapman, J.R.

    1993-01-01

    Yankee Atomic Electric Company (YAEC) has supported the operation of several plants under the premise that regulations and corresponding implementation strategies are intended to be "risk based." During the past 15 yr, these efforts have changed from essentially qualitative to a blend of qualitative and quantitative. Our observation is that implementation of regulatory requirements has often not addressed the risk significance of the underlying intent of regulations on a proportionate basis. It has caused our resource allocation to be skewed, to the point that our cost-competitiveness has eroded, but more importantly we have missed opportunities for increases in safety.

  15. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  16. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows. We show algorithms for time computations in probabilistic workflows. With activity times modeled more precisely, we can improve work cooperation and the analysis of cooperation, including simulation and visualization.
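
    A hedged illustration of such a time computation: Monte Carlo simulation of a small workflow with a probabilistic rework branch and a parallel join, yielding a completion-time distribution rather than a single duration. Task durations and branch probabilities are invented, and this is not the paper's analytical algorithm.

```python
# Minimal Monte Carlo time analysis of a probabilistic workflow: task A, then a
# probabilistic branch (rework taken with probability 0.3), then two tasks run in
# parallel before a final join. All durations and probabilities are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

t_a = rng.triangular(2.0, 3.0, 6.0, n)           # task A duration (hours)
rework = rng.random(n) < 0.3                      # branch: 30% of runs require rework
t_a_total = t_a + rework * rng.triangular(1.0, 2.0, 4.0, n)

t_b = rng.triangular(4.0, 5.0, 9.0, n)           # parallel branch B
t_c = rng.triangular(3.0, 6.0, 10.0, n)          # parallel branch C
t_total = t_a_total + np.maximum(t_b, t_c)        # the join waits for the slower branch

print(f"Mean completion time: {t_total.mean():.1f} h")
print(f"P(complete within 14 h) = {(t_total <= 14.0).mean():.2f}")
```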

  17. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  18. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. With either type of cellular automaton, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, which connect the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
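
    A minimal synchronous probabilistic cellular automaton of the kind described can be simulated directly: each cell updates to 1 with a probability that depends only on the number of ones in its neighborhood, and the long-run behaviour is summarized by a stationary density rather than an all-zeros or all-ones end state. The rule probabilities below are arbitrary.

```python
# Tiny synchronous probabilistic cellular automaton on a ring: each cell becomes 1
# with a probability that depends on the number of ones in its 3-cell neighborhood.
# The rule probabilities are arbitrary; the point is that the long-run behaviour is
# a distribution over configurations rather than a fixed all-0 / all-1 outcome.
import numpy as np

rng = np.random.default_rng(9)
n_cells, n_steps = 200, 5_000

# P(cell -> 1) given 0, 1, 2, or 3 ones among {left, self, right}.
p_one = np.array([0.02, 0.30, 0.70, 0.98])

state = rng.integers(0, 2, n_cells)
density = []
for _ in range(n_steps):
    ones_in_nbhd = np.roll(state, 1) + state + np.roll(state, -1)
    state = (rng.random(n_cells) < p_one[ones_in_nbhd]).astype(int)
    density.append(state.mean())

print(f"Mean density of ones over the last 1000 steps: {np.mean(density[-1000:]):.3f}")
```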

  19. Is the basic conditional probabilistic?

    PubMed

    Goodwin, Geoffrey P

    2014-06-01

    Nine experiments examined whether individuals treat the meaning of basic conditional assertions as deterministic or probabilistic. In Experiments 1-4, participants were presented with either probabilistic or deterministic relations, which they had to describe with a conditional. These experiments consistently showed that people tend only to use the basic if p then q construction to describe deterministic relations between antecedent and consequent, whereas they use a probabilistically qualified construction, if p then probably q, to describe probabilistic relations, suggesting that the default interpretation of the conditional is deterministic. Experiments 5 and 6 showed that when directly asked, individuals typically report that conditional assertions admit no exceptions (i.e., they are seen as deterministic). Experiments 7-9 showed that individuals judge the truth of conditional assertions in accordance with this deterministic interpretation. Together, these results pose a challenge to probabilistic accounts of the meaning of conditionals and support mental models, formal rules, and suppositional accounts.

  20. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  1. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource to make robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  2. Risk-based targeting: A new approach in environmental protection

    SciTech Connect

    Fox, C.A.

    1995-12-31

    Risk-based targeting has recently emerged as an effective tool to help prioritize efforts to identify and manage geographic areas, chemicals, facilities, and agricultural activities that cause the most environmental degradation. This paper focuses on how the Environmental Protection Agency (EPA) has recently used risk-based targeting to identify and screen Federal, industrial, commercial and municipal facilities which contribute to probable human health (fish consumption advisories and contaminated fish tissue) and aquatic life (contaminated sediments) impacts. Preliminary results identified several hundred potential contributors of problem chemicals to probable impacts within the same river reach in 1991-93. Analysis by industry sector showed that the majority of the facilities identified were publicly owned treatment works (POTWs), in addition to industry organic and inorganic chemical manufacturers, petroleum refineries, and electric services, coatings, engravings, and allied services, among others. Both compliant and non-compliant potentially contributing facilities were identified to some extent in all EPA regions. Additional results identifying possible linkages of other pollutant sources to probable impacts, as well as estimation of potential exposure of these contaminants to minority and/or poverty populations are also presented. Out of these analyses, a number of short and long-term strategies are being developed that EPA may use to reduce loadings of problem contaminants to impacted waterbodies.

  3. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    PubMed

    Vilaprinyo, Ester; Forné, Carles; Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.
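
    The selection step, choosing among strategies by incremental cost-effectiveness, can be sketched with a few lines of code that drop dominated strategies and report the ICER along the frontier (simple dominance only; extended dominance and harm-benefit ratios are ignored). The cost and effectiveness numbers below are purely hypothetical, not outputs of the study's model.

```python
# Sketch of selecting screening strategies by incremental cost-effectiveness.
# All (cost, effect) values are hypothetical placeholders.
strategies = {            # name: (cost per woman, quality-adjusted life years gained)
    "quinquennial 50-69": (350.0, 0.020),
    "triennial 50-69":    (520.0, 0.028),
    "biennial 50-69":     (700.0, 0.033),
    "annual 45-74":       (1500.0, 0.036),
}

# Keep only non-dominated strategies (no cheaper strategy with equal or more effect),
# then report the ICER of each step up the frontier.
items = sorted(strategies.items(), key=lambda kv: kv[1][1])   # sort by effectiveness
frontier = []
for name, (cost, effect) in items:
    while frontier and frontier[-1][1][0] >= cost:            # dearer but less effective: dominated
        frontier.pop()
    frontier.append((name, (cost, effect)))

prev_cost, prev_effect = 0.0, 0.0                             # baseline: no screening
for name, (cost, effect) in frontier:
    icer = (cost - prev_cost) / (effect - prev_effect)
    print(f"{name}: ICER = {icer:,.0f} EUR per QALY gained")
    prev_cost, prev_effect = cost, effect
```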

  4. A probabilistic prediction network for hydrological drought identification and environmental flow assessment

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyong; Törnros, Tobias; Menzel, Lucas

    2016-08-01

    A general probabilistic prediction network is proposed for hydrological drought examination and environmental flow assessment. This network consists of three major components. First, we present the joint streamflow drought indicator (JSDI) to describe the hydrological dryness/wetness conditions. The JSDI is established based on a high-dimensional multivariate probabilistic model. In the second part, a drought-based environmental flow assessment method is introduced, which provides dynamic risk-based information about how much flow (the environmental flow target) is required for drought recovery and its likelihood under different hydrological drought initial situations. The final part involves estimating the conditional probability of achieving the required environmental flow under different precipitation scenarios according to the joint dependence structure between streamflow and precipitation. Three watersheds from different countries (Germany, China, and the United States) with varying sizes from small to large were used to examine the usefulness of this network. The results show that the JSDI can provide an assessment of overall hydrological dryness/wetness conditions and performs well in identifying both drought onset and persistence. This network also allows quantitative prediction of targeted environmental flow required for hydrological drought recovery and estimation of the corresponding likelihood. Moreover, the results confirm that the general network can estimate the conditional probability associated with the required flow under different precipitation scenarios. The presented methodology offers a promising tool for water supply planning and management and for drought-based environmental flow assessment. The network has no restrictions that would prevent it from being applied to other basins worldwide.

  5. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    NASA Astrophysics Data System (ADS)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S. closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee (NRC NUREG/CR-6372) concepts, evaluating and integrating the

  6. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  7. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.

  8. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  9. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
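
    A hedged sketch of the stage-wise Monte Carlo idea: sample a random nucleation life and a random Paris-law coefficient, integrate the long-crack growth life in closed form, and sum the stages. The nucleation model and all parameter values below are invented placeholders for the paper's slip-band decohesion and crack-tip-opening-displacement models, and the small-crack stage is folded into nucleation for brevity.

```python
# Hedged Monte Carlo sketch of a stage-wise fatigue-life model: a random crack
# nucleation life plus a long-crack growth life integrated from the Paris law.
# All forms and values are illustrative placeholders, not the paper's models.
import numpy as np

rng = np.random.default_rng(4)
n = 50_000

delta_sigma = 200.0                           # stress range, MPa
Y = 1.12                                      # crack geometry factor
a0 = rng.lognormal(np.log(50e-6), 0.3, n)     # crack size at end of nucleation, m
a_f = 5e-3                                    # final crack size, m

N_nucleation = rng.lognormal(np.log(2e4), 0.5, n)   # cycles to nucleate a crack
C = rng.lognormal(np.log(1e-11), 0.3, n)      # Paris coefficient, m/cycle/(MPa*sqrt(m))**m_exp
m_exp = 3.0                                   # Paris exponent

# Closed-form integral of da/dN = C * (Y * delta_sigma * sqrt(pi * a))**m_exp.
factor = C * (Y * delta_sigma * np.sqrt(np.pi)) ** m_exp
N_growth = (a0 ** (1 - m_exp / 2) - a_f ** (1 - m_exp / 2)) / ((m_exp / 2 - 1) * factor)

N_total = N_nucleation + N_growth
print(f"Median fatigue life: {np.median(N_total):.2e} cycles")
print(f"1st-percentile life: {np.percentile(N_total, 1):.2e} cycles")
```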

  10. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
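
    The mechanics of a probabilistic loss estimate from bagged trees can be sketched with scikit-learn: the per-tree predictions for a given object form an empirical loss distribution instead of a point value. The data below are synthetic and the predictors invented; this is not the BT-FLEMO model or its training data.

```python
# Mechanics-only sketch of probabilistic loss estimation with bagged decision trees:
# predictions of the individual trees form an empirical loss distribution.
import numpy as np
from sklearn.ensemble import BaggingRegressor   # default base estimator is a decision tree

rng = np.random.default_rng(8)
n = 2_000

# Synthetic predictors: water depth (m), building value (k EUR), precaution score.
X = np.column_stack([rng.uniform(0.0, 3.0, n),
                     rng.uniform(100.0, 600.0, n),
                     rng.integers(0, 3, n)])
# Synthetic relative loss with noise (purely illustrative relationship).
y = np.clip(0.15 * X[:, 0] - 0.03 * X[:, 2] + rng.normal(0.0, 0.05, n), 0.0, 1.0)

model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, y)

# One affected unit: per-tree predictions give a loss distribution, not a point value.
x_new = np.array([[1.8, 350.0, 1.0]])
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print(f"Mean relative loss: {per_tree.mean():.2f}, "
      f"5-95% range: {np.percentile(per_tree, 5):.2f}-{np.percentile(per_tree, 95):.2f}")
```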

  11. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  12. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  13. Common Difficulties with Probabilistic Reasoning.

    ERIC Educational Resources Information Center

    Hope, Jack A.; Kelly, Ivan W.

    1983-01-01

    Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)

  14. Regional Variability of Stream Responses to Urbanization: Implications for Risk-Based Assessments

    NASA Astrophysics Data System (ADS)

    Bledsoe, B. P.; Dust, D. W.; Hawley, R. J.

    2007-12-01

    Predictive scientific assessments of the geomorphic consequences of urbanization must be calibrated to the regional hydroclimatological, geologic, and historical context in which streams occur. We present examples of context-specific stream responses to hydromodification, and a general framework for risk-based modeling and scientific assessment of hydrologic-geomorphic-ecologic linkages in urbanizing watersheds. The framework involves: 1) a priori stratification of a region's streams based on flow regime, geomorphic context and susceptibility to changes in water, sediment, and wood regimes, 2) field surveys across a gradient of urban influence, 3) coupling long term hydrologic simulation with geomorphic analysis to quantify key hydrogeomorphic metrics, and 4) using probabilistic modeling to identify regional linkages between hydrogeomorphic descriptors and decision endpoints of primary interest to stakeholders and decision-makers.

  15. A Probabilistic Ontology Development Methodology

    DTIC Science & Technology

    2014-06-01

    …to have a tool guiding the user on the steps necessary to create a probabilistic ontology and link this documentation to its implementation… extension that is beyond the scope of this work and includes methods such as ONIONS, FCA-Merge, and PROMPT. The interested reader may find these…

  16. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or as major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  17. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  18. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive.

    PubMed

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  19. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
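
    The Monte Carlo structure described above can be illustrated with a toy scenario sampler. The distributions, the cube-root damage-radius scaling, and the population-density model below are placeholders, not the PAIR model's actual inputs or consequence tools.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Illustrative parameter distributions (placeholders, not PAIR inputs)
diameter = rng.lognormal(np.log(50.0), 0.6, n)        # m
density  = rng.uniform(1500.0, 3500.0, n)             # kg/m^3
velocity = rng.normal(20_000.0, 4_000.0, n)           # m/s

mass      = density * (np.pi / 6.0) * diameter**3     # kg (spherical body)
energy_mt = 0.5 * mass * velocity**2 / 4.184e15       # impact energy in megatons TNT

# Toy consequence model: cube-root damage-radius scaling and a sampled population density
damage_radius_km = 2.0 * np.cbrt(energy_mt)
pop_density = rng.lognormal(np.log(50.0), 1.5, n)      # people per km^2
affected = np.pi * damage_radius_km**2 * pop_density

print("P(affected population > 10,000):", np.mean(affected > 10_000))
print("median affected population:", round(np.median(affected)))
```

    Aggregating the per-scenario outcomes in this way is what produces the distribution of potential consequences that a risk tolerance posture can then be applied to.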

  20. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  1. Risk-based modeling of early warning systems for pollution accidents.

    PubMed

    Grayman, W M; Males, R M

    2002-01-01

    An early warning system is a mechanism for detecting, characterizing and providing notification of a source water contamination event (spill event) in order to mitigate the impact of contamination. Spill events are highly probabilistic occurrences, and major spills, which can have very significant impacts on raw water sources of drinking water, are relatively rare. A systematic method for designing and operating early warning systems that considers the highly variable, probabilistic nature of many aspects of the system is described. The methodology accounts for the probability of spills, behavior of monitoring equipment, variable hydrology, and the probability of obtaining information about spills independent of a monitoring system. Spill Risk, a risk-based model using Monte Carlo simulation techniques, has been developed and its utility has been demonstrated as part of an AWWA Research Foundation sponsored project. The model has been applied to several hypothetical river situations and to an actual section of the Ohio River. Additionally, the model has been systematically applied to a wide range of conditions in order to develop general guidance on the design of early warning systems.
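
    A highly simplified sketch of the kind of Monte Carlo bookkeeping such a model performs, counting the fraction of simulated spills for which some advance warning is obtained. The spill rate and detection probabilities are invented placeholders, not values from the Spill Risk model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years = 10_000

spill_rate = 0.5    # expected major spills per year upstream of the intake (placeholder)
p_detect   = 0.8    # probability the monitoring network detects a passing spill (placeholder)
p_reported = 0.3    # probability of independent notification, e.g. by the spiller (placeholder)

warned, total = 0, 0
for _ in range(n_years):
    for _ in range(rng.poisson(spill_rate)):
        total += 1
        detected = rng.random() < p_detect     # monitor catches the plume upstream of the intake
        reported = rng.random() < p_reported   # spill becomes known through other channels
        if detected or reported:
            warned += 1

print("fraction of spills with advance warning:", warned / total)
```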

  2. Bounded Error Approximation Algorithms for Risk-Based Intrusion Response

    DTIC Science & Technology

    2015-09-17

    AFRL-AFOSR-VA-TR-2015-0324. Bounded Error Approximation Algorithms for Risk-Based Intrusion Response. K. Subramani, West Virginia University Research… CONTRACT NUMBER FA9550-12-1-0199… DISTRIBUTION A: Distribution approved for public release. Definition 1.7 Given an integer k, an undirected…

  3. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team's work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability-versus-time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability-versus-time graphs that were in line with aerospace industry norms.

  4. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

    The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel for capturing an HDR scene by using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains many more scene details than any of the captured SDR images. Moreover, the proposed scheme also functions as the tone mapping of an HDR image to the SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches.

  5. Risk-Based Decision Support of Water Resource Management Alternatives

    DTIC Science & Technology

    2006-12-01

    22.5 kilometers) in Pennsylvania and Maryland, was created in 1928 with the completion of the Conowingo Dam. [1] The Conowingo system gradually...The Conowingo Dam is one of four hydroelectric projects on the lower Susquehanna River. All are regulated by the Federal Energy Regulatory Commission...components. In the Conowingo Dam system, the dam itself is a structural component; turbines, flood gates, and related equipment are operating components

  6. Risk-Based Data Management System design specifications and implementation plan for the Alaska Oil and Gas Conservation Commission; the Mississippi State Oil and Gas Board; the Montana Board of Oil and Gas Conservation; and the Nebraska Oil and Gas Conservation Commission

    SciTech Connect

    Not Available

    1993-09-01

    The purpose of this document is to present design specifications and an implementation schedule for the development and implementation of Risk Based Data Management Systems (RBDMSs) in the states of Alaska, Mississippi, Montana, and Nebraska. The document presents detailed design information including a description of the system database structure, data dictionary, data entry and inquiry screen layouts, specifications for standard reports that will be produced by the system, functions and capabilities (including environmental risk analyses), and table relationships for each database table within the system. This design information provides a comprehensive blueprint of the system to be developed and presents the necessary detailed information for system development and implementation. A proposed schedule for development and implementation also is presented. The schedule presents timeframes for the development of system modules, training, implementation, and providing assistance to the states with data conversion from existing systems. However, the schedule will vary depending upon the timing of funding allocations from the United States Department of Energy (DOE) for the development and implementation phase of the project. For planning purposes, the schedule assumes that the development and implementation phase will commence November 1, 1993, somewhat later than originally anticipated.

  7. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  8. An expert system for probabilistic description of loads on space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Spencer, B. F., Jr.; Hopkins, D. A.

    1988-01-01

    LDEXPT, an expert system that generates probabilistic characterizations of the loads spectra borne by spacecraft propulsion systems' structural components, is found by recent experience at NASA-Lewis to be useful in the cases of components representative of the Space Shuttle Main Engine's turbopumps and fluid transfer ducting. LDEXPT is composed of a knowledge base management system and a rule base management system. The ANLOAD load-modeling module of LDEXPT encompasses three independent probabilistic analysis techniques.

  9. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  10. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
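
    For a single compression member, the probabilistic buckling check described above can be sketched as a Monte Carlo comparison of the sampled axial force against the Euler buckling load. The material, geometric, and load distributions below are illustrative assumptions only, not the truss data used in the cited study.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Illustrative scatter in the primitive variables of one compression member (placeholders)
E = rng.normal(70e9, 3.5e9, n)                 # Young's modulus [Pa]
I = rng.normal(2.0e-8, 1.0e-9, n)              # area moment of inertia [m^4]
L = rng.normal(1.5, 0.01, n)                   # member length [m]
P = rng.lognormal(np.log(5_000.0), 0.15, n)    # axial compressive force [N]

P_cr = np.pi**2 * E * I / L**2                 # Euler buckling load, pinned-pinned member
print("probability of member buckling:", np.mean(P >= P_cr))
```

    In a progressive-buckling analysis the same check would be repeated after each buckled member is removed and the member forces are redistributed.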

  11. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1994-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables that describe the truss. Initially, the truss is deterministically analyzed for member forces, and members in which the axial force exceeds the Euler buckling load are identified. These members are then discretized with several intermediate nodes, and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and the respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing buckled members until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.

  12. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  13. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... and 225 Federal Deposit Insurance Corporation 12 CFR Part 325 Risk-Based Capital Guidelines: Market... CORPORATION 12 CFR Part 325 RIN 3064-AD70 Risk-Based Capital Guidelines: Market Risk AGENCY: Office of the... proposal to revise their market risk capital rules to modify their scope to better capture positions...

  14. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk. The... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital level. 652.70 Section 652.70 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL...

  15. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years), the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
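
    The hazard-curve arithmetic underlying such a PTHA can be sketched as follows: each source zone contributes its annual event rate times the probability that an event exceeds a height threshold at the site, and a Poisson occurrence assumption converts the combined rate into an annual exceedance probability. The source rates, median heights, and scatter below are invented placeholders, not values from the Indonesian assessment.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical source zones: (annual event rate, median tsunami height at the site [m], log-std)
sources = [(0.02, 0.8, 0.6), (0.005, 2.5, 0.6), (0.001, 6.0, 0.6)]
n_sim = 50_000   # aleatory samples of height per source

for threshold in (0.5, 3.0):
    # Exceedance rate = sum over sources of (event rate) * P(height > threshold | event)
    rate = sum(r * np.mean(rng.lognormal(np.log(med), sig, n_sim) > threshold)
               for r, med, sig in sources)
    annual_prob = 1.0 - np.exp(-rate)          # Poisson occurrence assumption
    print(f"annual probability of height > {threshold} m: {annual_prob:.4f}")
```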

  16. Probabilistic inversion: a preliminary discussion

    NASA Astrophysics Data System (ADS)

    Battista Rossi, Giovanni; Crenna, Francesco

    2015-02-01

    We continue the discussion, started at the previous IMEKO TC1-TC7-TC13 Symposium, on the possibility of interpreting probability as a logic. We show here how a probabilistic logic can be extended to include direct and inverse functions. We also discuss the relationship between this framework and the Bayes-Laplace rule, showing how the latter can be formally interpreted as a probabilistic inversion device. We suggest that these findings open a new perspective in the evaluation of measurement uncertainty.

  17. Probabilistic forecasts based on radar rainfall uncertainty

    NASA Astrophysics Data System (ADS)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required of the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements: The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.

  18. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality, and it requires us to target the credible residual risks. Constraints on monitoring devices are severe. We must transition from archival to real-time, on-board monitoring, and we must provide data to the crew in a way that they can interpret the findings. Dust management and monitoring may be a major concern for exploration-class missions.

  19. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be either of the four modes.

  20. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  1. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  2. On the applicability of probabilistics

    SciTech Connect

    Roth, P.G.

    1996-12-31

    GEAE's traditional lifing approach, based on Low Cycle Fatigue (LCF) curves, is evolving for fracture critical powder metal components by incorporating probabilistic fracture mechanics analysis. Supporting this move is a growing validation database which convincingly demonstrates that probabilistics work given the right inputs. Significant efforts are being made to ensure the right inputs. For example, Heavy Liquid Separation (HLS) analysis has been developed to quantify and control inclusion content (1). Also, an intensive seeded fatigue program providing a model for crack initiation at inclusions is ongoing (2). Despite the optimism and energy, probabilistics are only tools and have limitations. Designing to low failure probabilities helps provide protection, but other strategies are needed to protect against surprises. A low risk design limit derived from a predicted failure distribution can lead to a high risk deployment if there are unaccounted-for deviations from analysis assumptions. Recognized deviations which are statistically quantifiable can be integrated into the probabilistic analysis (an advantage of the approach). When deviations are known to be possible but are not properly describable statistically, it may be more appropriate to maintain the traditional position of conservatively bounding relevant input parameters. Finally, safety factors on analysis results may be called for in cases where there is little experience supporting new design concepts or material applications (where unrecognized deviations might be expected).

  3. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU project Lolight - development of a low cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step the correlation coefficients between the first and the second density field are computed. To this end, the first field is shifted by all shifting vectors which are physically allowed. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the difference in time for both density fields. To eliminate ambiguities in the determination of directions and velocities, the so-called Random Walker of the Monte Carlo process is used. Using this method a grid point is selected at random. Moreover, one vector out of all predefined shifting vectors is suggested - also at random but with a probability that is related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted. Otherwise it is discarded. This process is repeated until the change of cost functions falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. Knowing the direction and speed of thunderstorm
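
    The correlation step of such a tracking scheme can be sketched by shifting the first density field over all physically allowed displacement vectors and scoring each shift against the second field. The sketch below uses synthetic fields and omits the Monte Carlo random-walker refinement described above; field sizes, the shift bound, and the noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def shift_field(field, dx, dy):
    """Shift a 2-D field by (dx, dy) grid cells, padding the exposed edge with zeros."""
    out = np.zeros_like(field)
    h, w = field.shape
    rows_dst = slice(max(dx, 0), min(h + dx, h))
    rows_src = slice(max(-dx, 0), min(h - dx, h))
    cols_dst = slice(max(dy, 0), min(w + dy, w))
    cols_src = slice(max(-dy, 0), min(w - dy, w))
    out[rows_dst, cols_dst] = field[rows_src, cols_src]
    return out

# Two synthetic lightning-density fields: the second is the first displaced by (3, 2) cells plus noise
field1 = rng.random((60, 60))
field2 = shift_field(field1, 3, 2) + 0.05 * rng.random((60, 60))

max_shift = 5   # bound set by the maximum plausible cell speed and the time between fields
best = max(((dx, dy) for dx in range(-max_shift, max_shift + 1)
                     for dy in range(-max_shift, max_shift + 1)),
           key=lambda s: np.corrcoef(shift_field(field1, *s).ravel(), field2.ravel())[0, 1])
print("best correlating shift (cells):", best)   # expected: (3, 2)
```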

  4. Toward a probabilistic definition of seizures.

    PubMed

    Osorio, Ivan; Lyubushin, Alexey; Sornette, Didier

    2011-12-01

    This writing (1) draws attention to the intricacies inherent to the pursuit of a universal seizure definition even when powerful, well-understood signal analysis methods are used to this end; (2) identifies this aim as a multi-objective optimization problem and discusses the advantages and disadvantages of adopting or rejecting a unitary seizure definition; and (3) introduces a probabilistic measure of seizure activity to manage this thorny issue. The challenges posed by the attempt to define seizures unitarily may be partly related to their fractal properties and understood through a simplistic analogy to the so-called "Richardson effect." A revision of the time-honored conceptualization of seizures may be warranted to further advance epileptology. This article is part of a Supplemental Special Issue entitled The Future of Automated Seizure Detection and Prediction.

  5. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS.

    PubMed

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-05-18

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013-2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes.

  6. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  7. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  8. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and provides insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
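
    The decision rule described above (choices based on summed log posterior odds) can be illustrated with a toy cue-combination example. The cue reliabilities and the conditional-independence assumption are placeholders; the sketch does not model the synaptic plasticity itself.

```python
import numpy as np

# Hypothetical cues: probability that outcome A is the correct choice given that the cue is shown
cue_reliability = {"cue1": 0.8, "cue2": 0.6, "cue3": 0.55}

def posterior_A(cues_present, prior_A=0.5):
    """Combine cues by summing log odds (assumes conditionally independent cues)."""
    log_odds = np.log(prior_A / (1.0 - prior_A))
    for cue in cues_present:
        p = cue_reliability[cue]
        log_odds += np.log(p / (1.0 - p))
    return 1.0 / (1.0 + np.exp(-log_odds))

print("P(A | cue1, cue3):", round(posterior_A(["cue1", "cue3"]), 3))
print("P(A | cue2):", round(posterior_A(["cue2"]), 3))
```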

  9. Aggregation of Trust for Iterated Belief Revision in Probabilistic Logics

    NASA Astrophysics Data System (ADS)

    Pardo, Pere

    In this paper it is shown how communication about trust in a multi-agent system may be used to endow agents with belief change capabilities, in a probabilistic logical framework. Belief change operators are obtained in an intuitive, principled way using aggregation operators for trust-values. Under additional conditions, such change operators may be proved to be maxichoice. The present approach constitutes a sound method for autonomous uncertainty management in multi-agent systems.

  10. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
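
    The core probability-of-instability computation can be sketched with plain Monte Carlo on a small second-order system, flagging samples whose state matrix has an eigenvalue with positive real part. The cited work uses fast probability integration and adaptive importance sampling instead of brute-force sampling, and every parameter distribution below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 20_000
unstable = 0

for _ in range(n):
    # Illustrative 2-DOF rotor-like system M x'' + C x' + K x = 0 with random parameters
    m = 1.0
    c = rng.normal(5.0, 0.5)                # direct damping
    k = rng.normal(1.0e4, 5.0e2)            # direct stiffness
    q = rng.normal(450.0, 60.0)             # cross-coupled stiffness (destabilizing)
    M = m * np.eye(2)
    C = c * np.eye(2)
    K = np.array([[k, q], [-q, k]])
    # First-order state matrix A = [[0, I], [-M^-1 K, -M^-1 C]]
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    if np.any(np.linalg.eigvals(A).real > 0.0):
        unstable += 1

print("estimated probability of instability:", unstable / n)
```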

  11. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  12. Applications of Probabilistic Risk Assessment

    SciTech Connect

    Burns, K.J.; Chapman, J.R.; Follen, S.M.; O'Regan, P.J. )

    1991-05-01

    This report provides a summary of potential and actual applications of Probabilistic Risk Assessment (PRA) technology and insights. Individual applications are derived from the experiences of a number of US nuclear utilities. This report identifies numerous applications of PRA techniques beyond those typically associated with PRAs. In addition, believing that the future use of PRA techniques should not be limited to those of the past, areas of plant operations, maintenance, and financial resource allocation are discussed. 9 refs., 3 tabs.

  13. Application impact analysis: a risk-based approach to business continuity and disaster recovery.

    PubMed

    Epstein, Beth; Khan, Dawn Christine

    2014-01-01

    There are many possible disruptions that can occur in business, and Business Continuity is easily overlooked or under-planned; doing it well requires time, understanding and careful planning. Business Continuity Management is far more than producing a document and declaring business continuity success. What is the recipe for businesses to achieve continuity management success? Application Impact Analysis is a method for understanding the unique Business Attributes. This AIA Cycle involves a risk-based approach to understanding the business priority and considering business aspects such as Financial, Operational, Service Structure, Contractual/Legal, and Brand. The output of this analysis provides a construct for viewing data, evaluating impact, and delivering results, for an approved valuation of Recovery Time Objectives (RTOs).

  14. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.

  15. Risk-based maintenance--techniques and applications.

    PubMed

    Arunraj, N S; Maiti, J

    2007-04-11

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more intensively, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of failure of equipment and the consequences of failure. In this paper, the risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.

  16. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as the net present value distributions over various forecast horizons.
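
    The financial-risk component's net-present-value distribution can be illustrated with a toy Monte Carlo cash-flow model. The cost and revenue distributions, horizon, and discount rate below are placeholders rather than CCSI inputs.

```python
import numpy as np

rng = np.random.default_rng(9)
n, years, discount = 50_000, 20, 0.08

# Illustrative distributions (placeholders, not CCSI inputs)
capital    = rng.triangular(400e6, 500e6, 700e6, n)       # construction / retrofit cost [$]
annual_om  = rng.normal(30e6, 5e6, (n, years))            # operating and power-replacement cost [$/yr]
annual_rev = rng.normal(80e6, 15e6, (n, years))           # revenue or avoided-cost benefit [$/yr]

disc = 1.0 / (1.0 + discount) ** np.arange(1, years + 1)
npv = -capital + ((annual_rev - annual_om) * disc).sum(axis=1)

print("P(NPV < 0):", np.mean(npv < 0.0))
print("5th / 50th / 95th percentile NPV [$M]:", np.round(np.percentile(npv, [5, 50, 95]) / 1e6, 1))
```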

  17. Probabilistic Analysis of Ground-Holding Strategies

    NASA Technical Reports Server (NTRS)

    Sheel, Minakshi

    1997-01-01

    The Ground-Holding Policy Problem (GHPP) has become a matter of great interest in recent years because of the high cost incurred by aircraft suffering from delays. Ground-holding keeps a flight on the ground at the departure airport if it is known it will be unable to land at the arrival airport. The GHPP is to determine how many flights should be held on the ground before take-off and for how long, in order to minimize the cost of delays. When the uncertainty associated with airport landing capacity is considered, the GHPP becomes complicated. A decision support system that incorporates this uncertainty, solves the GHPP quickly, and gives good results would be of great help to air traffic management. The purpose of this thesis is to modify and analyze a probabilistic ground-holding algorithm by applying it to two common cases of capacity reduction. A graphical user interface was developed and sensitivity analysis was done on the algorithm, in order to see how it may be implemented in practice. The sensitivity analysis showed the algorithm was very sensitive to the number of probabilistic capacity scenarios used and to the cost ratio of air delay to ground delay. The algorithm was not particularly sensitive to the number of periods that the time horizon was divided into. In terms of cost savings, a ground-holding policy was the most beneficial when demand greatly exceeded airport capacity. When compared to other air traffic flow strategies, the ground-holding algorithm performed the best and was the most consistent under various situations. The algorithm can solve large problems quickly and efficiently on a personal computer.
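
    A single-period caricature of the ground-holding trade-off shows how the optimal number of held flights falls out of minimizing expected cost over probabilistic capacity scenarios. The demand, capacities, scenario probabilities, and air-to-ground cost ratio are hypothetical values for illustration only.

```python
# Hypothetical single-period instance of the ground-holding trade-off
demand = 60                       # flights scheduled to arrive in the period
scenarios = [40, 50, 60]          # possible landing capacities
probs     = [0.3, 0.4, 0.3]       # their probabilities
c_ground, c_air = 1.0, 3.0        # cost per ground-delayed vs air-delayed flight

def expected_cost(held):
    """Hold `held` flights on the ground; any remaining excess over capacity is delayed in the air."""
    cost = c_ground * held
    for cap, p in zip(scenarios, probs):
        air_delayed = max(demand - held - cap, 0)
        cost += p * c_air * air_delayed
    return cost

best = min(range(demand + 1), key=expected_cost)
print("optimal number of flights to ground-hold:", best)
print("expected cost at optimum:", expected_cost(best))
```

    The full multi-period problem repeats this trade-off across the time horizon, which is where the sensitivities to the scenario count and to the air-to-ground cost ratio noted above arise.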

  18. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluation are presented in terms of probability cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  19. Health economics and outcomes methods in risk-based decision-making for blood safety.

    PubMed

    Custer, Brian; Janssen, Mart P

    2015-08-01

    Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost to implement an intervention both to the blood operator but also in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making.
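
    A minimal sketch of a probabilistic sensitivity analysis around a cost-utility ratio for a hypothetical blood-safety intervention. Every input distribution and the willingness-to-pay threshold below are assumptions for illustration, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 50_000

# Hypothetical blood-safety intervention (all values are illustrative placeholders)
added_cost_per_unit = rng.normal(12.0, 2.0, n)                  # added screening cost per donation [$]
units_per_year      = 1_000_000
infections_averted  = rng.poisson(5, n)                         # infections averted per year
qalys_per_infection = rng.lognormal(np.log(2.0), 0.5, n)        # QALYs lost per infection

annual_cost  = added_cost_per_unit * units_per_year
qalys_gained = np.maximum(infections_averted * qalys_per_infection, 1e-9)
icer = annual_cost / qalys_gained                               # cost per QALY gained

threshold = 1_000_000   # assumed willingness-to-pay per QALY
print("median cost per QALY gained [$]:", round(np.median(icer)))
print("probability cost-effective at the threshold:", np.mean(icer < threshold))
```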

  20. Auxiliary feedwater system risk-based inspection guide for the South Texas Project nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1993-12-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. South Texas Project was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by the NRC inspectors in preparation of inspection plans addressing AFW risk important components at the South Texas Project plant.

  1. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant.

  2. Auxiliary feedwater system risk-based inspection guide for the H. B. Robinson nuclear power plant

    SciTech Connect

    Moffitt, N.E.; Lloyd, R.C.; Gore, B.F.; Vo, T.V.; Garner, L.W.

    1993-08-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. H. B. Robinson was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the H. B. Robinson plant.

  3. Auxiliary feedwater system risk-based inspection guide for the Ginna Nuclear Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.; Moffitt, N.E.

    1991-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Ginna was selected as the eighth plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Ginna plant. 23 refs., 1 fig., 1 tab.

  4. Auxiliary feedwater system risk-based inspection guide for the Point Beach nuclear power plant

    SciTech Connect

    Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.; Vehec, T.A.

    1993-02-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Point Beach was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Point Beach plant.

  5. Auxiliary feedwater system risk-based inspection guide for the Byron and Braidwood nuclear power plants

    SciTech Connect

    Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1991-07-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Byron and Braidwood were selected for the fourth study in this program. The product of this effort is a prioritized listing of AFW failures which have occurred at the plants and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Byron/Braidwood plants. 23 refs., 1 fig., 1 tab.

  6. Risk-based inspection priorities for PWR high-pressure injection system components

    SciTech Connect

    Vo, T.V.; Simonen, F.A.; Phan, H.K. )

    1993-01-01

    Under U.S. Nuclear Regulatory Commission sponsorship, Pacific Northwest Laboratory developed a risk-based method that can be used to establish in-service inspection priorities for nuclear power plant components. The overall goal of this effort was to develop technical bases for improvements of inspection plans and to provide recommendations for revisions of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code, Sec. XI. The developed method used results of probabilistic risk assessment in combination with the failure modes and effects analysis (FMEA) technique to establish in-service inspection priorities for systems and components. The Surry nuclear power station, unit 1 (Surry-1) was selected for study. Inspection priorities for several pressure boundary systems at Surry-1 were determined in the early phase of the project. To complete the study, the remaining safety systems, plus balance of plant, have been analyzed; one of these is the high-pressure injection (HPI) system. This paper presents the results of inspection priorities for the HPI system.

  7. Mixed deterministic and probabilistic networks.

    PubMed

    Mateescu, Robert; Dechter, Rina

    2008-11-01

    The paper introduces mixed networks, a new graphical model framework for expressing and reasoning with probabilistic and deterministic information. The motivation to develop mixed networks stems from the desire to fully exploit the deterministic information (constraints) that is often present in graphical models. Several concepts and algorithms specific to belief networks and constraint networks are combined, achieving computational efficiency, semantic coherence and user-interface convenience. We define the semantics and graphical representation of mixed networks, and discuss the two main types of algorithms for processing them: inference-based and search-based. A preliminary experimental evaluation shows the benefits of the new model.

  8. Mixed deterministic and probabilistic networks

    PubMed Central

    Dechter, Rina

    2010-01-01

    The paper introduces mixed networks, a new graphical model framework for expressing and reasoning with probabilistic and deterministic information. The motivation to develop mixed networks stems from the desire to fully exploit the deterministic information (constraints) that is often present in graphical models. Several concepts and algorithms specific to belief networks and constraint networks are combined, achieving computational efficiency, semantic coherence and user-interface convenience. We define the semantics and graphical representation of mixed networks, and discuss the two main types of algorithms for processing them: inference-based and search-based. A preliminary experimental evaluation shows the benefits of the new model. PMID:20981243

  9. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  10. Probabilistic approach to EMP assessment

    SciTech Connect

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  11. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  12. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri, L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in the decision making for the shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? And what are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  13. Development of a risk-based approach to Hanford Site cleanup

    SciTech Connect

    Hesser, W.A.; Daling, P.M.; Baynes, P.A.

    1995-06-01

    In response to a request from Mr. Thomas Grumbly, Assistant Secretary of Energy for Environmental Management, the Hanford Site contractors developed a conceptual set of risk-based cleanup strategies that (1) protect the public, workers, and environment from unacceptable risks; (2) are executable technically; and (3) fit within an expected annual funding profile of 1.05 billion dollars. These strategies were developed because (1) the US Department of Energy and Hanford Site budgets are being reduced, (2) stakeholders are dissatisfied with the perceived rate of cleanup, (3) the US Congress and the US Department of Energy are increasingly focusing on risk and risk-reduction activities, (4) the present strategy is not integrated across the Site and is inconsistent in its treatment of similar hazards, (5) the present cleanup strategy is not cost-effective from a risk-reduction or future land use perspective, and (6) the milestones and activities in the Tri-Party Agreement cannot be achieved with an anticipated funding of 1.05 billion dollars annually. The risk-based strategies described herein were developed through a systems analysis approach that (1) analyzed the cleanup mission; (2) identified cleanup objectives, including risk reduction, land use, and mortgage reduction; (3) analyzed the existing baseline cleanup strategy from a cost and risk perspective; (4) developed alternatives for accomplishing the cleanup mission; (5) compared those alternatives against cleanup objectives; and (6) produced conclusions and recommendations regarding the current strategy and potential risk-based strategies.

  14. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.
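
    One of the simpler handbook-style evaluations, the risk contribution of a single allowed outage time, amounts to back-of-the-envelope arithmetic; the sketch below uses invented core damage frequencies and a generic 72-hour AOT, not values taken from the handbook:

```python
# Hypothetical illustration of an allowed-outage-time (AOT) risk contribution:
# the increase in core damage frequency while a component is out of service,
# integrated over the outage duration.
baseline_cdf    = 5.0e-5     # core damage frequency, all equipment available (/yr)
cdf_with_outage = 2.0e-4     # core damage frequency with the component out of service (/yr)
aot_hours       = 72.0       # allowed outage time in the technical specification

icdp = (cdf_with_outage - baseline_cdf) * aot_hours / 8760.0
print(f"risk contribution of one full AOT: {icdp:.2e} (incremental core damage probability)")
```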

  15. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance; thus, assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, and provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  16. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
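
    The ensemble idea can be sketched with a deliberately crude stand-in for the crop model; the yield function, input distributions, and ensemble size below are invented for illustration, and the uncertainty band is simply read off the spread of member results:

```python
import random

# Toy ensemble sketch: a hypothetical yield response to uncertain seasonal rainfall
# and a management factor. Each ensemble member perturbs the inputs; uncertainty
# bounds on the forecast come from the spread of member outcomes.

random.seed(1)

def yield_model(rain_mm, mgmt):
    """Hypothetical stand-in for a process-based crop model (t/ha)."""
    water_limited = min(rain_mm / 450.0, 1.0)   # crude water limitation factor
    return 8.0 * water_limited * mgmt

members = []
for _ in range(500):
    rain = max(random.gauss(400, 80), 0.0)      # uncertain seasonal rainfall (mm)
    mgmt = random.uniform(0.7, 1.0)             # uncertain management factor
    members.append(yield_model(rain, mgmt))

members.sort()
p10, p50, p90 = (members[int(q * len(members))] for q in (0.10, 0.50, 0.90))
print(f"median forecast {p50:.1f} t/ha, 10-90% band [{p10:.1f}, {p90:.1f}] t/ha")
```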

  17. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  18. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  19. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
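
    For the data-poor, computationally based case, the core of a Monte Carlo hazard calculation can be sketched in a few lines; the event rate, runup distribution, and exceedance levels below are hypothetical stand-ins rather than outputs of a propagation model:

```python
import numpy as np

# Minimal Monte Carlo sketch of a tsunami hazard curve (all parameters hypothetical).
# Tsunamigenic events arrive as a Poisson process; the runup each event produces at
# the site is drawn from a lognormal stand-in for the propagation model. The hazard
# curve is the annual probability that the maximum runup exceeds a given level.

rng = np.random.default_rng(0)
annual_rate = 0.05            # tsunamigenic events per year affecting the site (assumed)
n_years = 100_000             # length of the synthetic catalog

annual_max = np.zeros(n_years)
for y in range(n_years):
    k = rng.poisson(annual_rate)                              # events this year
    if k:
        runups = rng.lognormal(mean=0.5, sigma=0.8, size=k)   # runup heights (m), stand-in model
        annual_max[y] = runups.max()

for level in (1.0, 2.0, 5.0):
    print(f"annual probability of runup >= {level:.0f} m: {(annual_max >= level).mean():.4f}")
```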

  20. 78 FR 76521 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of Governors of the Federal...) is adopting a final rule that revises its market risk capital rule (market risk rule) to address... Cooperation and Development (OECD), which are referenced in the Board's market risk rule; to clarify...

  1. 78 FR 43829 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... CFR Parts 208 and 225 RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of... Governors of the Federal Reserve System (Board) proposes to revise its market risk capital rule (market risk... Organization for Economic Cooperation and Development (OECD), which are referenced in the Board's market...

  2. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk... Oversight. SBA supervises, examines, and regulates, and enforces laws against, SBA Supervised Lenders...

  3. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  4. Is probabilistic evidence a source of knowledge?

    PubMed

    Friedman, Ori; Turri, John

    2015-07-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B). Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by "judgment and decision making" researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case (Experiments 3A and 3B).

  5. Risk-based prioritization methodology for the classification of groundwater pollution sources.

    PubMed

    Pizzol, Lisa; Zabeo, Alex; Critto, Andrea; Giubilato, Elisa; Marcomini, Antonio

    2015-02-15

    Water management is one of the EU environmental priorities and it is one of the most serious challenges that today's major cities are facing. The main European regulation for the protection of water resources is represented by the Water Framework Directive (WFD) and the Groundwater Directive (2006/118/EC) which require the identification, risk-based ranking and management of sources of pollution and the identification of those contamination sources that threaten the achievement of groundwater's good quality status. The aim of this paper is to present a new risk-based prioritization methodology to support the determination of a management strategy for the achievement of the good quality status of groundwater. The proposed methodology encompasses the following steps: 1) hazard analysis, 2) pathway analysis, 3) receptor vulnerability analysis and 4) relative risk estimation. Moreover, by integrating GIS functionalities and Multi Criteria Decision Analysis (MCDA) techniques, it makes it possible to: i) deal with several sources and multiple impacted receptors within the area of concern; ii) identify different receptors' vulnerability levels according to specific groundwater uses; iii) assess the risks posed by all contamination sources in the area; and iv) provide a risk-based ranking of the contamination sources that can threaten the achievement of the groundwater good quality status. The application of the proposed framework to a well-known industrialized area located in the surroundings of Milan (Italy) is illustrated in order to demonstrate the effectiveness of the proposed framework in supporting the identification of intervention priorities. Among the 32 sources analyzed in the case study, three sources received the highest relevance score, due to the medium-high relative risks estimated for chromium (VI) and perchloroethylene. The case study application showed that the developed methodology is flexible and easy to adapt to different contexts, thanks to the possibility to

  6. A Risk-Based Approach to Test and Evaluation

    DTIC Science & Technology

    2012-05-01

    ...how likely is it to occur (probability, frequency), and what will be the outcome (consequences)? The SAPHIRE software tool is also introduced as a way to develop these risk concepts dealing with event trees, fault trees, and desired end states. SAPHIRE (Systems Analysis Programs for Hands-on Integrated Reliability Evaluations) is a probabilistic risk and reliability assessment software tool developed for the U.S. Nuclear Regulatory Commission.

  7. Fuzzy logic and a risk-based graded approach for developing S/RIDs: An introduction

    SciTech Connect

    Wayland, J.R.

    1996-01-01

    A Standards/Requirements Identification Document (S/RID) is the set of expressed performance expectations, or standards, for a facility. Critical to the development of an integrated standards-based management is the identification of a set of necessary and sufficient standards from a selected set of standards/requirements. There is a need for a formal, rigorous selection process for the S/RIDs. This is the first of three reports that develop a fuzzy logic selection process. In this report the fundamentals of fuzzy logic are discussed as they apply to a risk-based graded approach.

  8. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  9. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  10. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  11. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  12. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  13. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing projected system state can be simplified in some cases. Common approximation and novel methods are compared for over-constrained and lightly constrained domains. The system compares a few common approximation methods for an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.

  14. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  15. Probabilistic direct counterfactual quantum communication

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng

    2017-02-01

    It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality should be payed at a price. Firstly, the transmission time is much longer than a classical transmission costs. Secondly, the chained-cycle structure makes them more sensitive to channel noises. Here, we extend the idea of counterfactual communication, and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol could evolve to a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).

  16. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
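
    A stripped-down version of the two damage calculations can be written out directly; the stress rms, cycle count, and S-N parameters below are assumed values, and the code is a sketch of the approach rather than the FATIG program itself:

```python
import math

# Probabilistic fatigue damage sketch: stress amplitudes of a random vibration
# response are Rayleigh distributed with parameter equal to the stress rms, the
# S-N curve is N(S) = C * S**(-b), and Miner's rule sums cycle ratios n_i / N(S_i).

sigma   = 40.0         # stress rms (MPa), assumed
n_total = 1.0e7        # total number of stress cycles, assumed
C, b    = 1.0e15, 4.0  # hypothetical S-N curve parameters: N(S) = C * S**(-b)

def cycles_to_failure(s):
    return C * s ** (-b)

# (a) Traditional method: Miner's rule with Rayleigh-distributed stress cycles up to 3*sigma.
damage_a, n_bins = 0.0, 300
for i in range(n_bins):
    s_lo = 3.0 * sigma * i / n_bins
    s_hi = 3.0 * sigma * (i + 1) / n_bins
    # fraction of cycles in this bin, from the Rayleigh CDF F(s) = 1 - exp(-s^2 / (2 sigma^2))
    frac = math.exp(-s_lo**2 / (2 * sigma**2)) - math.exp(-s_hi**2 / (2 * sigma**2))
    damage_a += n_total * frac / cycles_to_failure(0.5 * (s_lo + s_hi))

# (b) Closed form from the integral version of Miner's rule over all amplitudes:
#     D = (n_total / C) * (sqrt(2) * sigma)**b * Gamma(1 + b/2)
damage_b = n_total / C * (math.sqrt(2) * sigma) ** b * math.gamma(1 + b / 2)

print(f"damage, 3-sigma discretization: {damage_a:.3f}")
print(f"damage, Gamma closed form:      {damage_b:.3f}")
```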

  17. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
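
    The general pattern of wrapping a probabilistic layer around a deterministic code can be sketched as follows; the one-line dose model and the parameter distributions are invented for illustration and are unrelated to the actual RESRAD pathway models:

```python
import numpy as np

# Sketch of adding a probabilistic layer to a deterministic calculation: sample the
# input parameters from assigned distributions, push each sample through the
# deterministic model, and summarize the resulting dose distribution.

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical input distributions (the "develop parameter distributions" step):
soil_conc   = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)    # Bq/g
occupancy   = rng.triangular(left=0.3, mode=0.5, right=0.8, size=n)  # fraction of time on site
dose_factor = rng.normal(loc=2.0e-3, scale=3.0e-4, size=n)           # mSv/yr per Bq/g (illustrative)

def deterministic_dose(conc, occ, dcf):
    """Stand-in for the deterministic pathway calculation."""
    return conc * occ * dcf

dose = deterministic_dose(soil_conc, occupancy, dose_factor)
print(f"mean dose {dose.mean():.3f} mSv/yr, "
      f"95th percentile {np.percentile(dose, 95):.3f} mSv/yr")
```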

  18. A pilot application of risk-based methods to establish in-service inspection priorities for nuclear components at Surry Unit 1 Nuclear Power Station

    SciTech Connect

    Vo, T.; Gore, B.; Simonen, F.; Doctor, S.

    1994-08-01

    As part of the Nondestructive Evaluation Reliability Program sponsored by the US Nuclear Regulatory Commission, the Pacific Northwest Laboratory is developing a method that uses risk-based approaches to establish in-service inspection plans for nuclear power plant components. This method uses probabilistic risk assessment (PRA) results and Failure Modes and Effects Analysis (FMEA) techniques to identify and prioritize the most risk-important systems and components for inspection. The Surry Nuclear Power Station Unit 1 was selected for pilot applications of this method. The specific systems addressed in this report are the reactor pressure vessel and the reactor coolant, low-pressure injection, and auxiliary feedwater systems. The results provide a risk-based ranking of components within these systems and relate the target risk to target failure probability values for individual components. These results will be used to guide the development of improved inspection plans for nuclear power plants. To develop inspection plans, the acceptable level of risk from structural failure for important systems and components will be apportioned as a small fraction (i.e., 5%) of the total PRA-estimated risk for core damage. This process will determine target (acceptable) risk and target failure probability values for individual components. Inspection requirements will be set at levels to assure that acceptable failure probabilities are maintained.
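
    The apportionment step lends itself to a small worked example; the component names, risk shares, and conditional core damage probabilities below are hypothetical, chosen only to show how a target failure probability follows from the 5% allocation:

```python
total_cdf        = 4.0e-5    # PRA-estimated core damage frequency (/yr), hypothetical
structural_share = 0.05      # fraction of total risk apportioned to structural failures
# hypothetical components, their share of the structural allocation, and the
# conditional core damage probability (CCDP) given that the component fails
importance = {"RPV weld": 0.5, "RCS pipe segment": 0.3, "AFW pipe segment": 0.2}
ccdp       = {"RPV weld": 1.0, "RCS pipe segment": 0.1, "AFW pipe segment": 0.01}

for comp, share in importance.items():
    target_risk = total_cdf * structural_share * share
    target_failure_freq = target_risk / ccdp[comp]
    print(f"{comp}: target risk {target_risk:.1e}/yr -> target failure frequency {target_failure_freq:.1e}/yr")
```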

  19. Risk-based decision making for terrorism applications.

    PubMed

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.

  20. Risk-based approach to petroleum hydrocarbon remediation. Research study

    SciTech Connect

    Miller, R.N.; Haas, P.; Faile, M.; Taffinder, S.

    1994-12-31

    The risk-based approach utilizes tools developed under the BTEX, Intrinsic Remediation (natural attenuation), Bioslurper, and Bioventing Initiatives of the Air Force Center for Environmental Excellence Technology Transfer Division (AFCEE/ERT) to construct a risk-based cost-effective approach to the cleanup of petroleum contaminated sites. The AFCEE Remediation Matrix (Enclosure 1) identifies natural attenuation as the first remediation alternative for soil and ground water contaminated with petroleum hydrocarbons. The intrinsic remediation (natural attenuation) alternative requires a scientifically defensible risk assessment based on contaminant sources, pathways, and receptors. For fuel contaminated sites, the first step is to determine contaminants of interest. For the ground water pathway (usually considered most important by regulators), this will normally be the most soluble, mobile, and toxic compounds, namely benzene, toluene, ethylbenzene, and o-, m-, and p-xylene (BTEX).

  1. Coupling risk-based remediation with innovative technology

    SciTech Connect

    Goodheart, G.F.; Teaf, C.M. |; Manning, M.J.

    1998-05-01

    Tiered risk-based cleanup approaches have been effectively used at petroleum sites, pesticide sites and other commercial/industrial facilities. For example, the Illinois Environmental Protection Agency (IEPA) has promulgated guidance for a Tiered Approach to Corrective Action Objectives (TACO) to establish site-specific remediation goals for contaminated soil and groundwater. As in the case of many other state programs, TACO is designed to provide for adequate protection of human health and the environment based on potential risks posed by site conditions. It also incorporates site-related information that may allow more cost-effective remediation. IEPA developed TACO to provide flexibility to site owners/operators when formulating site-specific remediation activities, as well as to hasten property redevelopment to return sites to more productive use. Where appropriate, risk-based cleanup objectives as set by TACO-type programs may be coupled with innovative remediation technologies such as air sparging, bioremediation and soil washing.

  2. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-07

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated.

  3. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  4. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  5. Increasing Effectiveness and Efficiency Through Risk-Based Deployments

    DTIC Science & Technology

    2015-12-01

    ...research into the experiences of other entities with risk-based deployment methodologies. Subject terms: aviation security, Transportation..., imaging technology, BDO (behavior detection officer), CATA (civil aviation threat assessment), CHDS (Center for Homeland Defense and Security), COTS. ..."little to no threat to aviation." The TSA has the opportunity to continue this evolution and address calls from the Government Accountability Office...

  6. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest-ranked human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: risks of radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  7. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision making modes, the cost of decision making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
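
    A miniature version of the game makes the Nash-versus-planner comparison concrete; the flood-probability function, damage potentials, and construction costs below are invented for illustration and are far simpler than the hydraulics and economics used in the study:

```python
import itertools

# Toy two-levee game: each bank's owner picks a levee height; raising your own levee
# lowers your flood probability but transfers some risk to the opposite bank. We
# compare independent (Nash) optimization with the planner's minimum-total-cost design.

heights = [0.5 * i for i in range(13)]     # candidate levee heights, 0 to 6 m
damage  = (120.0, 80.0)                    # flood damage potential per bank ($M)
k_build = 4.0                              # annualized construction cost per metre ($M)

def flood_prob(own, other):
    """Hypothetical annual flood probability: falls with own height, rises with the other's."""
    return min(1.0, 0.2 * (1.0 + 0.3 * other) / (1.0 + own) ** 2)

def cost(i, own, other):
    """Annual expected total cost for bank i: construction plus expected damage."""
    return k_build * own + damage[i] * flood_prob(own, other)

# Non-cooperative (Nash) outcome by best-response iteration.
h = [0.0, 0.0]
for _ in range(100):
    new = [min(heights, key=lambda x: cost(0, x, h[1])),
           min(heights, key=lambda x: cost(1, x, h[0]))]
    if new == h:
        break
    h = new

# Social planner: minimize the combined cost over both banks' choices.
social = min(itertools.product(heights, heights),
             key=lambda pair: cost(0, pair[0], pair[1]) + cost(1, pair[1], pair[0]))

print(f"Nash heights {h}, system cost "
      f"{cost(0, h[0], h[1]) + cost(1, h[1], h[0]):.1f} $M/yr")
print(f"planner heights {list(social)}, system cost "
      f"{cost(0, social[0], social[1]) + cost(1, social[1], social[0]):.1f} $M/yr")
```

    Best-response iteration converges in a few rounds on this grid, and comparing the two print lines shows the efficiency loss from independent optimization, since the planner's total cost can never exceed the equilibrium total.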

  8. Decision making in flood risk based storm sewer network design.

    PubMed

    Sun, S A; Djordjević, S; Khu, S T

    2011-01-01

    It is widely recognised that flood risk needs to be taken into account when designing a storm sewer network. Flood risk is generally a combination of flood consequences and flood probabilities. This paper aims to explore the decision making in flood risk based storm sewer network design. A multi-objective optimization is proposed to find the Pareto front of optimal designs in terms of low construction cost and low flood risk. The decision making process then follows this multi-objective optimization to select a best design from the Pareto front. The traditional way of designing a storm sewer system based on a predefined design storm is used as one of the decision making criteria. Additionally, three commonly used risk based criteria, i.e., the expected flood risk based criterion, the Hurwicz criterion and the stochastic dominance based criterion, are investigated and applied in this paper. Different decisions are made according to different criteria as a result of different concerns represented by the criteria. The proposed procedure is applied to a simple storm sewer network design to demonstrate its effectiveness and the different criteria are compared.
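
    The first stage of the procedure, reducing the candidate designs to a Pareto front before any decision criterion is applied, can be sketched with a handful of hypothetical (cost, risk) pairs:

```python
designs = {   # design name: (construction cost, expected flood risk), hypothetical units
    "A": (10.0, 9.0),
    "B": (12.0, 6.5),
    "C": (15.0, 6.6),
    "D": (18.0, 4.0),
    "E": (25.0, 3.9),
}

def dominated(name):
    """A design is dominated if another design is no worse on both objectives and better on one."""
    c, r = designs[name]
    return any(oc <= c and orisk <= r and (oc, orisk) != (c, r)
               for other, (oc, orisk) in designs.items() if other != name)

pareto = [name for name in designs if not dominated(name)]
print("Pareto-optimal designs:", pareto)   # the decision criteria then choose among these
```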

  9. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
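
    The structure of the optimization can be conveyed with a compact sketch; the synthetic stage distribution, fragility curve, damage potential, and cost coefficients below are all assumed, standing in for the fragility curves the study derives from professional judgment or analysis:

```python
import numpy as np

# Risk-based levee planning sketch: annual expected total cost = annualized construction
# cost + expected annual damage from two failure modes, overtopping (water above the
# crest) and through-seepage (fragility that falls with crown width).

rng = np.random.default_rng(7)
water = rng.gumbel(loc=2.0, scale=0.8, size=50_000)   # synthetic annual peak stages (m)
damage = 200.0                                        # damage if the levee fails ($M)

def seepage_fragility(level, height, width):
    """Hypothetical conditional failure probability from through-seepage."""
    loading = np.clip(level / height, 0.0, 1.0)       # relative loading on the levee
    return loading ** 2 * np.exp(-width / 10.0)       # falls off with crown width

def annual_expected_cost(height, width):
    build = 3.0 * height + 0.8 * width                # annualized construction cost ($M/yr)
    overtopped = water > height
    p_fail = np.where(overtopped, 1.0, seepage_fragility(water, height, width))
    return build + damage * p_fail.mean()

candidates = ((h, w) for h in np.arange(2.0, 8.1, 0.5) for w in np.arange(2.0, 30.1, 2.0))
best_h, best_w = min(candidates, key=lambda hw: annual_expected_cost(*hw))
print(f"optimal height {best_h:.1f} m, crown width {best_w:.1f} m, "
      f"expected total cost {annual_expected_cost(best_h, best_w):.1f} $M/yr")
```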

  10. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills

    NASA Astrophysics Data System (ADS)

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
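
    A compressed sketch of that recipe, on synthetic data and with scikit-learn standing in for whatever tooling the authors used (the indicator set is cut down to four invented columns; the original work used 14 indicators across 37 landfills):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Simplified sketch: K-means on each indicator defines low/medium/high scoring
# boundaries, PCA on the standardized indicators supplies the weights, and the
# weighted score ranks the sites by groundwater-contamination risk.

rng = np.random.default_rng(3)
n_sites = 12
indicators = rng.random((n_sites, 4))   # synthetic stand-ins for landfill and pathway indicators

# 1) Score each indicator 1-3 using class boundaries from one-dimensional K-means.
scores = np.zeros_like(indicators)
for j in range(indicators.shape[1]):
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(indicators[:, [j]])
    order = np.argsort(km.cluster_centers_.ravel())          # relabel clusters low -> high
    rank_of = {cluster: rank + 1 for rank, cluster in enumerate(order)}
    scores[:, j] = [rank_of[label] for label in km.labels_]

# 2) Indicator weights from the first principal component of the standardized indicators.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
pc1 = PCA(n_components=1).fit(z).components_[0]
weights = np.abs(pc1) / np.abs(pc1).sum()

# 3) Weighted risk index and ranking of the sites.
risk_index = scores @ weights
print("sites from highest to lowest risk:", np.argsort(risk_index)[::-1].tolist())
```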

  11. Risk-based analyses in support of California hazardous site remediation

    SciTech Connect

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  12. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.

  13. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  14. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  15. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  16. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
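
    The following Python sketch illustrates, in minimal form, the dynamic-PRA idea described above: initiating events are sampled in time, placed on a single time line (queue), and processed in order, with later outcomes depending on the state reached so far. The event names, rates, and state effects are hypothetical and are not taken from the Integrated Medical Model.

      # Minimal dynamic-PRA sketch: Monte Carlo over a time line of sampled events.
      import heapq
      import random

      def one_mission(duration_days=180.0, seed=None):
          rng = random.Random(seed)
          rates = {"minor_illness": 1 / 60.0, "equipment_fault": 1 / 90.0}  # per day (hypothetical)
          queue = []
          for event, rate in rates.items():          # "planner": put events on the time line
              t = rng.expovariate(rate)
              if t < duration_days:
                  heapq.heappush(queue, (t, event))
          state = {"crew_health": 1.0}
          while queue:                               # "scheduler": walk the time line in order
              t, event = heapq.heappop(queue)
              if event == "minor_illness":
                  # the impact depends on the state reached so far (dynamic dependency)
                  state["crew_health"] -= 0.1 if state["crew_health"] > 0.8 else 0.2
              elif event == "equipment_fault":
                  state["crew_health"] -= 0.05
          return state["crew_health"]

      outcomes = [one_mission(seed=i) for i in range(10000)]   # thousands of model instances
      print(sum(o < 0.8 for o in outcomes) / len(outcomes))    # P(degraded mission outcome)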

  17. Probabilistic regularization in inverse optical imaging.

    PubMed

    De Micheli, E; Viano, G A

    2000-11-01

    The problem of object restoration in the case of spatially incoherent illumination is considered. A regularized solution to the inverse problem is obtained through a probabilistic approach, and a numerical algorithm based on the statistical analysis of the noisy data is presented. Particular emphasis is placed on the question of the positivity constraint, which is incorporated into the probabilistically regularized solution by means of a quadratic programming technique. Numerical examples illustrating the main steps of the algorithm are also given.

  18. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  19. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  20. Parallel and Distributed Systems for Probabilistic Reasoning

    DTIC Science & Technology

    2012-12-01

    [Only fragmentary text is available for this record: table-of-contents entries ("High-Level Abstractions", "Future Work", "Scalable Online Probabilistic Reasoning"), a note that material from the chapter is available from the authors' online repository at http://gonzalezlabs/thesis, and an excerpt introducing belief propagation as a core operation in probabilistic graphical models, with a reference to fixed-lag smoothing for online inference (Russell and Norvig, 1995).]

  1. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    PubMed

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
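
    A hedged sketch of the kind of fully probabilistic risk estimate described above: exposure and carcinogenic potency are both sampled from uncertainty distributions, and the risk is reported as a confidence interval rather than a single value. All distribution parameters below are invented for the example and do not correspond to the three chemicals in the paper.

      # Monte Carlo propagation of exposure and potency uncertainty to a risk interval.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      exposure = rng.lognormal(mean=np.log(0.02), sigma=0.8, size=n)   # mg/kg bw/day (illustrative)
      potency  = rng.lognormal(mean=np.log(0.5),  sigma=1.2, size=n)   # (mg/kg bw/day)^-1 (illustrative)
      risk = exposure * potency                                        # extra lifetime cancer risk

      lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
      print(f"median {med:.2e}, 95% CI [{lo:.2e}, {hi:.2e}]")
      # A single point estimate can fall below the upper confidence limit of the
      # probabilistic estimate, which is the sense in which it is not conservative.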

  2. Application of risk-based methods to inservice testing of check valves

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; McAllister, W.J.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice testing (IST) programs for pumps and valves in nuclear steam supply systems. This paper discusses a pilot application of these methods to the inservice testing of check valves in the emergency core cooling system of Georgia Power's Vogtle nuclear power station. The results of the probabilistic safety assessment (PSA) are used to divide the check valves into risk-significant and less-risk-significant groups. This information is reviewed by a plant expert panel along with the consideration of appropriate deterministic insights to finally categorize the check valves into more safety-significant and less safety-significant component groups. All of the more safety-significant check valves are further evaluated in detail using a failure modes and causes analysis (FMCA) to assist in defining effective IST strategies. A template has been designed to evaluate how effective current and emerging tests for check valves are in detecting failures or in finding significant conditions that are precursors to failure for the likely failure causes. This information is then used to design and evaluate appropriate IST strategies that consider both the test method and frequency. A few of the less safety-significant check valves are also evaluated using this process since differences exist in check valve design, function, and operating conditions. Appropriate test strategies are selected for each check valve that has been evaluated based on safety and cost considerations. Test strategies are inferred from this information for the other check valves based on similar check valve conditions. Sensitivity studies are performed using the PSA model to arrive at an overall IST program that maintains or enhances safety at the lowest achievable cost.

  3. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Thompson, Julie; Leclaire, Rene; Edward, Bryan; Jones, Edward

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  4. DEVELOPMENT OF RISK-BASED AND TECHNOLOGY-INDEPENDENT SAFETY CRITERIA FOR GENERATION IV SYSTEMS

    SciTech Connect

    William E. Kastenberg; Edward Blandford; Lance Kim

    2009-03-31

    This project has developed quantitative safety goals for Generation IV (Gen IV) nuclear energy systems. These safety goals are risk based and technology independent. The foundations for a new approach to risk analysis have been developed, along with a new operational definition of risk. This project has furthered the current state-of-the-art by developing quantitative safety goals for both Gen IV reactors and for the overall Gen IV nuclear fuel cycle. The risk analysis approach developed will quantify performance measures, characterize uncertainty, and address a more comprehensive view of safety as it relates to the overall system. Appropriate safety criteria are necessary to manage risk in a prudent and cost-effective manner. This study is also important for government agencies responsible for managing, reviewing, and approving advanced reactor systems because they are charged with assuring the health and safety of the public.

  5. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.

  6. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  7. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. PECo committed to the U.S. Nuclear Regulatory Commission (NRC) to continue a PRA program prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. The PRA familiarization training is designed to be attended once by all nuclear group personnel so that they understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described that are tailored to the audience.

  8. Probabilistic elastography: estimating lung elasticity.

    PubMed

    Risholm, Petter; Ross, James; Washko, George R; Wells, William M

    2011-01-01

    We formulate registration-based elastography in a probabilistic framework and apply it to study lung elasticity in the presence of emphysematous and fibrotic tissue. The elasticity calculations are based on a Finite Element discretization of a linear elastic biomechanical model. We marginalize over the boundary conditions (deformation) of the biomechanical model to determine the posterior distribution over elasticity parameters. Image similarity is included in the likelihood, an elastic prior is included to constrain the boundary conditions, while a Markov model is used to spatially smooth the inhomogeneous elasticity. We use a Markov Chain Monte Carlo (MCMC) technique to characterize the posterior distribution over elasticity from which we extract the most probable elasticity as well as the uncertainty of this estimate. Even though registration-based lung elastography with inhomogeneous elasticity is challenging due to the problem's highly underdetermined nature and the sparse image information available in lung CT, we show promising preliminary results on estimating lung elasticity contrast in the presence of emphysematous and fibrotic tissue.
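
    The toy Metropolis-Hastings sketch below conveys the posterior-sampling step in miniature: a single (hypothetical) elasticity parameter is inferred from noisy observations, and the posterior mean and spread are reported. The forward model is a one-line stand-in, not the finite-element biomechanical model used in the paper.

      # Toy MCMC (Metropolis-Hastings) posterior over one elasticity parameter.
      import numpy as np

      rng = np.random.default_rng(2)
      true_E, noise = 2.0, 0.05
      def forward(E):                       # stand-in for the biomechanical forward model
          return 1.0 / E                    # observed displacement ~ compliance (assumption)
      obs = forward(true_E) + rng.normal(0, noise, size=20)

      def log_post(E):                      # flat prior on E > 0, Gaussian likelihood
          if E <= 0:
              return -np.inf
          return -0.5 * np.sum((obs - forward(E)) ** 2) / noise ** 2

      samples, E = [], 1.0
      for _ in range(20000):
          prop = E + rng.normal(0, 0.1)     # random-walk proposal
          if np.log(rng.random()) < log_post(prop) - log_post(E):
              E = prop                      # accept
          samples.append(E)
      post = np.array(samples[5000:])       # discard burn-in
      print(f"posterior mean {post.mean():.2f}, std {post.std():.2f}")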

  9. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and" written in cursive style as well as hand-print were extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school, determining students who will develop different forms and/or variations of those letter formations, and estimating the number of different types of letter formations.

  10. Optimal probabilistic dense coding schemes

    NASA Astrophysics Data System (ADS)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

    Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits ( d-level quantum systems with d>2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.

  11. Probabilistic description of traffic flow

    NASA Astrophysics Data System (ADS)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
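
    A minimal simulation sketch of the one-step (birth-death) cluster process described above: a jam of size n grows by attachment and shrinks by detachment, simulated with the Gillespie algorithm. The rate expressions below are simple placeholders for the physically motivated ansatz discussed in the text, chosen only so that the example runs.

      # Gillespie simulation of a one-step car-cluster (jam) growth/shrinkage process.
      import numpy as np

      rng = np.random.default_rng(3)
      def simulate(n0=0, t_end=1000.0, w_plus=0.9, tau=1.0):
          n, t, history = n0, 0.0, []
          while t < t_end:
              attach = w_plus                        # rate of cars joining the jam (placeholder)
              detach = (n / tau) if n > 0 else 0.0   # rate of cars leaving the jam (placeholder)
              total = attach + detach
              t += rng.exponential(1.0 / total)      # time to the next transition
              n += 1 if rng.random() < attach / total else -1
              history.append((t, n))
          return history

      traj = simulate()
      sizes = np.array([n for _, n in traj])
      print("mean cluster size:", sizes.mean())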

  12. Symbolic representation of probabilistic worlds.

    PubMed

    Feldman, Jacob

    2012-04-01

    Symbolic representation of environmental variables is a ubiquitous and often debated component of cognitive science. Yet notwithstanding centuries of philosophical discussion, the efficacy, scope, and validity of such representation has rarely been given direct consideration from a mathematical point of view. This paper introduces a quantitative measure of the effectiveness of symbolic representation, and develops formal constraints under which such representation is in fact warranted. The effectiveness of symbolic representation hinges on the probabilistic structure of the environment that is to be represented. For arbitrary probability distributions (i.e., environments), symbolic representation is generally not warranted. But in modal environments, defined here as those that consist of mixtures of component distributions that are narrow ("spiky") relative to their spreads, symbolic representation can be shown to represent the environment with a relatively negligible loss of information. Modal environments support propositional forms, logical relations, and other familiar features of symbolic representation. Hence the assumption that our environment is, in fact, modal is a key tacit assumption underlying the use of symbols in cognitive science.

  13. Dynamical systems probabilistic risk assessment

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  14. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity. PMID:26180842

  15. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity.
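
    To make the "weighted biobjective 0-1 programme" idea concrete, the sketch below solves a toy instance by exhaustive search: each flight either departs on schedule or is held, trading total delay cost against an operational risk term under a capacity limit. The data, the capacity constraint, and the scalarising weight are invented for the example; the paper's model is far larger and solved with integer-programming techniques.

      # Toy weighted biobjective 0-1 programme: delay cost vs. operational risk.
      from itertools import product

      delay_cost = [30, 45, 20, 60]      # cost of holding each flight (hypothetical)
      risk       = [0.4, 0.1, 0.3, 0.2]  # risk contribution if the flight departs (hypothetical)
      capacity   = 2                     # at most 2 departures in the time window
      w          = 0.7                   # scalarising weight between the two objectives

      best = None
      for x in product([0, 1], repeat=4):            # exhaustive search over 2^4 assignments
          if sum(x) > capacity:
              continue                               # infeasible: capacity exceeded
          f_delay = sum(c for c, xi in zip(delay_cost, x) if xi == 0)
          f_risk  = sum(r * xi for r, xi in zip(risk, x))
          value = w * f_delay + (1 - w) * f_risk     # weighted biobjective value
          if best is None or value < best[0]:
              best = (value, x)
      print(best)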

  16. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
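
    A small sketch of the Poisson-Binomial verification idea mentioned above: under a reliable forecast, the number of events that actually occur, given the probabilities assigned to them, follows a Poisson-Binomial distribution, which supports an exact hypothesis test. The forecast probabilities and the observed count below are illustrative only.

      # Exact Poisson-Binomial pmf via convolution, used for a reliability test.
      import numpy as np

      def poisson_binomial_pmf(probs):
          pmf = np.array([1.0])                      # dynamic programme over events
          for p in probs:
              pmf = np.convolve(pmf, [1.0 - p, p])
          return pmf

      forecast_probs = [0.1, 0.3, 0.8, 0.5, 0.2, 0.6, 0.05, 0.4]   # hypothetical forecasts
      observed_events = 6                                          # hypothetical observation

      pmf = poisson_binomial_pmf(forecast_probs)
      # two-sided p-value: total probability of counts at least as unlikely as observed
      p_value = pmf[pmf <= pmf[observed_events]].sum()
      print(f"P(K = {observed_events}) = {pmf[observed_events]:.3f}, p-value = {p_value:.3f}")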

  17. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently.
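
    The sketch below illustrates the general shape of such a risk-based selection scheme (it is not the paper's statistical model): a model is fitted relating bulk-milk indicators to past audit outcomes, farms are ranked by predicted rejection risk, and the highest-risk fraction is audited first. The data are synthetic and the three indicators are a subset chosen for illustration.

      # Risk-based audit selection: rank farms by predicted probability of rejection.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 2000
      scc, tbc, fpd = rng.normal(size=(3, n))                 # standardised indicators (synthetic)
      logit = -2.0 + 0.8 * scc + 0.9 * tbc + 0.5 * fpd        # synthetic "true" effect
      rejected = rng.random(n) < 1 / (1 + np.exp(-logit))     # synthetic audit outcomes

      X = np.column_stack([scc, tbc, fpd])
      model = LogisticRegression().fit(X, rejected)
      risk = model.predict_proba(X)[:, 1]

      order = np.argsort(-risk)                               # audit highest-risk farms first
      budget = int(0.20 * n)                                  # audit 20% of the population
      captured = rejected[order[:budget]].sum() / rejected.sum()
      print(f"share of rejections captured by auditing 20%: {captured:.0%}")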

  18. Toward a Safety Risk-Based Classification of Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Torres-Pomalas, Wilfredo

    2016-01-01

    There is a trend of growing interest and demand for greater access of unmanned aircraft (UA) to the National Airspace System (NAS) as the ongoing development of UA technology has created the potential for significant economic benefits. However, the lack of a comprehensive and efficient UA regulatory framework has constrained the number and kinds of UA operations that can be performed. This report presents initial results of a study aimed at defining a safety-risk-based UA classification as a plausible basis for a regulatory framework for UA operating in the NAS. Much of the study up to this point has been at a conceptual high level. The report includes a survey of contextual topics, analysis of safety risk considerations, and initial recommendations for a risk-based approach to safe UA operations in the NAS. The next phase of the study will develop and leverage deeper clarity and insight into practical engineering and regulatory considerations for ensuring that UA operations have an acceptable level of safety.

  19. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. Rosetting assay is commonly used to determine the interaction of Fc γ-receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10⁻⁵ μm⁴ for FcγRIII-IgG interaction, 4.66 × 10⁻³ μm⁴ for P-selectin-PSGL-1 interaction, and 0.94 × 10⁻³ μm⁴ for L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
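
    The short sketch below works through the stated relationships: rosette sizes are Poisson distributed, and the mean size is proportional to the apparent two-dimensional affinity and the receptor/ligand site densities, so an observed mean size can be inverted for the affinity. The site densities are hypothetical and the proportionality constant is taken as one for illustration.

      # Poisson rosette-size model: forward evaluation and a simple affinity inversion.
      from math import exp, factorial

      m_r, m_l = 50.0, 100.0        # receptor and ligand site densities (um^-2, hypothetical)
      affinity = 7.19e-5            # apparent effective 2-D affinity (um^4), FcgammaRIII-IgG value

      mean_size = affinity * m_r * m_l                 # expected RBCs per rosette (proportionality ~ 1)
      pmf = [exp(-mean_size) * mean_size**k / factorial(k) for k in range(8)]
      print(f"mean rosette size {mean_size:.3f}; P(size=0..3) = {[round(p, 3) for p in pmf[:4]]}")

      # Inverting the relation: an observed mean rosette size of 2.1 at the same
      # site densities would imply an apparent affinity of
      observed_mean = 2.1
      print(f"estimated affinity {observed_mean / (m_r * m_l):.2e} um^4")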

  20. Risk-based selection of SSCs at Peach Bottom

    SciTech Connect

    Krueger, G.A.; Marie, A.J. )

    1993-01-01

    The purpose of identifying risk-significant systems, structures, and components (SSCs) that are within the scope of the maintenance rule is to bring a higher level of attention to a subset of those SSCs. These risk-significant SSCs will have specific performance criteria established for them, and failure to meet these performance criteria will result in establishing goals to ensure the necessary improvement in performance. The Peach Bottom individual plant examination (IPE) results were used to provide insights for the verification of proposed probabilistic risk assessment (PRA) methods set forth in the Industry Maintenance Guidelines for Implementation of the Maintenance Rule. The objective of reviewing the methods for selection of SSCs that are considered risk significant was to ensure the methods used are logical, reproducible, and can be consistently applied.

  1. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  2. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  3. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  4. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  5. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  6. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  7. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  8. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  9. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  10. bayesPop: Probabilistic Population Projections

    PubMed Central

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  11. bayesPop: Probabilistic Population Projections.

    PubMed

    Ševčíková, Hana; Raftery, Adrian E

    2016-12-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  12. A probabilistic approach to spectral graph matching.

    PubMed

    Egozi, Amir; Keller, Yosi; Guterman, Hugo

    2013-01-01

    Spectral Matching (SM) is a computationally efficient approach to approximate the solution of pairwise matching problems that are NP-hard. In this paper, we present a probabilistic interpretation of spectral matching schemes and derive a novel Probabilistic Matching (PM) scheme that is shown to outperform previous approaches. We show that spectral matching can be interpreted as a Maximum Likelihood (ML) estimate of the assignment probabilities and that the Graduated Assignment (GA) algorithm can be cast as a Maximum a Posteriori (MAP) estimator. Based on this analysis, we derive a ranking scheme for spectral matchings based on their reliability, and propose a novel iterative probabilistic matching algorithm that relaxes some of the implicit assumptions used in prior works. We experimentally show our approaches to outperform previous schemes when applied to exhaustive synthetic tests as well as the analysis of real image sequences.
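
    For readers unfamiliar with the baseline, the sketch below shows plain spectral matching in miniature (not the authors' probabilistic scheme): candidate assignments are scored by the leading eigenvector of a pairwise affinity matrix and then greedily discretised into a one-to-one matching. The affinity matrix here is random, standing in for geometric affinities computed from real feature pairs.

      # Minimal spectral-matching baseline: leading eigenvector + greedy discretisation.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 4                                   # n features in each image (toy size)
      candidates = [(i, j) for i in range(n) for j in range(n)]
      W = rng.random((len(candidates), len(candidates)))
      W = (W + W.T) / 2                       # symmetric pairwise affinity matrix (synthetic)
      np.fill_diagonal(W, 0.0)

      vals, vecs = np.linalg.eigh(W)
      x = np.abs(vecs[:, -1])                 # leading eigenvector = soft assignment scores

      match, used_i, used_j = [], set(), set()
      for idx in np.argsort(-x):              # greedy discretisation under one-to-one constraint
          i, j = candidates[idx]
          if i not in used_i and j not in used_j:
              match.append((i, j))
              used_i.add(i)
              used_j.add(j)
      print(match)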

  13. Probabilistic Cue Combination: Less is More

    PubMed Central

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2012-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the dilution effect, predictions made from the combination of two cues of different strengths are less accurate than those made from the stronger cue alone. Here we show that dilution is an adult problem; 11-month-old infants combine strong and weak predictors normatively. These results extend and add support for the less is more hypothesis: limited cognitive resources can lead children to represent probabilistic information differently from adults, and this difference in representation can have important downstream consequences for prediction. PMID:23432826

  14. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to the inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
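
    As a hedged illustration of the crack-growth-plus-equivalent-initial-flaw-size idea (using plain Monte Carlo rather than the thesis's IFORM machinery), the sketch below samples the initial flaw size and the Paris-law coefficient from distributions and reports percentiles of the life to a critical crack size. All parameter values are illustrative, not taken from the thesis.

      # Monte Carlo fatigue life from Paris-law crack growth with an equivalent initial flaw size.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 20000
      a0 = rng.lognormal(np.log(0.2e-3), 0.3, n)        # equivalent initial flaw size (m)
      C  = rng.lognormal(np.log(7e-12), 0.2, n)         # Paris coefficient, m/cycle with dK in MPa*sqrt(m)
      m  = 3.0                                          # Paris exponent
      dS, Y, a_c = 120.0, 1.0, 5e-3                      # stress range (MPa), geometry factor, critical size (m)

      # Closed-form integration of da/dN = C * (Y*dS*sqrt(pi*a))^m for m != 2
      dK_fac = (Y * dS * np.sqrt(np.pi)) ** m
      N = (a_c ** (1 - m / 2) - a0 ** (1 - m / 2)) / ((1 - m / 2) * C * dK_fac)

      print(f"median life {np.median(N):.2e} cycles, "
            f"1st percentile {np.percentile(N, 1):.2e} cycles")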

  15. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
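
    The toy sketch below conveys the Monte Carlo logic behind such an assessment (it is not the study's model): sample a long synthetic catalogue of earthquake magnitudes, convert each event to a tsunami height at the coast with a placeholder scaling relation plus aleatory scatter, and read the hazard off the empirical exceedance rates. The source rate, magnitude distribution, and scaling law are all invented for illustration.

      # Monte Carlo tsunami hazard curve from a synthetic earthquake catalogue.
      import numpy as np

      rng = np.random.default_rng(9)
      years = 100_000                                   # length of the synthetic catalogue
      rate = 0.05                                       # events/yr above magnitude 7 (hypothetical)
      n_events = rng.poisson(rate * years)
      mags = 7.0 + rng.exponential(0.4, n_events)       # exponential (Gutenberg-Richter-like) tail
      heights = 10 ** (0.8 * (mags - 7.0) - 0.5 + rng.normal(0, 0.3, n_events))  # coastal height (m)

      for h in (0.5, 3.0):
          annual_rate = (heights > h).sum() / years
          print(f"P(height > {h} m in a year) ~ {1 - np.exp(-annual_rate):.3%}")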

  16. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
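
    A minimal sketch of a bagging-decision-tree loss model of this general kind (not the BT-FLEMO code): an ensemble of trees is trained on flood loss data, and at prediction time the spread of the individual tree predictions provides a damage distribution rather than a point estimate. The predictors and training data below are synthetic.

      # Bagging decision trees for flood loss, reporting a predictive distribution.
      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(7)
      n = 500
      water_depth = rng.uniform(0, 3, n)              # m above ground (synthetic)
      building_val = rng.uniform(100, 500, n)         # thousand EUR (synthetic)
      precaution = rng.integers(0, 2, n)              # private precaution taken? (synthetic)
      X = np.column_stack([water_depth, building_val, precaution])
      y = building_val * 0.1 * water_depth * np.where(precaution == 1, 0.7, 1.0) \
          + rng.normal(0, 5, n)                       # synthetic loss (thousand EUR)

      model = BaggingRegressor(DecisionTreeRegressor(max_depth=6),
                               n_estimators=200, random_state=0).fit(X, y)

      new_building = np.array([[1.5, 300.0, 0]])      # depth 1.5 m, 300 kEUR, no precaution
      per_tree = np.array([t.predict(new_building)[0] for t in model.estimators_])
      print(f"median loss {np.median(per_tree):.0f} kEUR, "
            f"5-95% range [{np.percentile(per_tree, 5):.0f}, {np.percentile(per_tree, 95):.0f}]")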

  17. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, which is a well-known Markov chain Monte Carlo (MCMC) method. This multiple hypothesis testing
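
    The toy bootstrap particle filter below illustrates the core difficulty and one remedy discussed above: a hidden degradation parameter jumps abruptly from a nominal to a degraded value, and the proposal occasionally injects particles near the hypothesised degraded state, a simple stand-in for the reliability-database-informed proposals described in the text. All numbers are invented.

      # Bootstrap particle filter tracking an abrupt degradation of a hidden parameter.
      import numpy as np

      rng = np.random.default_rng(8)
      T, n_part = 100, 2000
      true_theta = np.where(np.arange(T) < 60, 1.0, 0.6)     # abrupt degradation at t = 60
      obs = true_theta + rng.normal(0, 0.1, T)               # noisy measurements

      particles = np.full(n_part, 1.0)
      estimates = []
      for t in range(T):
          # proposal: mostly a random walk, occasionally a jump to the hypothesised
          # degraded state (mitigates sample impoverishment after the change)
          jump = rng.random(n_part) < 0.02
          particles = np.where(jump, rng.normal(0.6, 0.05, n_part),
                               particles + rng.normal(0, 0.01, n_part))
          w = np.exp(-0.5 * ((obs[t] - particles) / 0.1) ** 2)  # Gaussian likelihood weights
          w /= w.sum()
          idx = rng.choice(n_part, size=n_part, p=w)            # resample
          particles = particles[idx]
          estimates.append(particles.mean())
      print(f"estimate at t=50: {estimates[50]:.2f}, at t=90: {estimates[90]:.2f}")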

  18. Homeland security R&D roadmapping : risk-based methodological options.

    SciTech Connect

    Brandt, Larry D.

    2008-12-01

    The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

  19. Risk-based decision-making framework for the selection of sediment dredging option.

    PubMed

    Manap, Norpadzlihatun; Voulvoulis, Nikolaos

    2014-10-15

    The aim of this study was to develop a risk-based decision-making framework for the selection of a sediment dredging option. The newly integrated, holistic and staged framework is described using case studies. The first stage feeds historical dredging monitoring data and contamination levels in environmental media into Ecological Risk Assessment phases, which have been adapted for benefits in cost, time and simplicity. The next stage describes how Multi-Criteria Decision Analysis (MCDA) can be used to analyze and prioritize dredging areas based on environmental, socio-economic and managerial criteria. The results from the MCDA are then integrated into the Ecological Risk Assessment to characterize the degree of contamination in the prioritized areas. The last stage uses these findings in a further MCDA to identify the best sediment dredging option, accounting for the economic, environmental and technical aspects of dredging, which is beneficial for the dredging and sediment management industries.

  20. Nuclear insurance risk assessment using risk-based methodology

    SciTech Connect

    Wendland, W.G.

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance.

  1. On the optimal risk based design of highway drainage structures

    NASA Astrophysics Data System (ADS)

    Tung, Y.-K.; Bao, Y.

    1990-12-01

    For a proposed highway bridge or culvert, the total cost to the public during its expected service life includes the capital investment in the structure, regular operation and maintenance costs, and various flood-related costs. The flood-related damage costs include items such as replacement and repair costs of the highway bridge or culvert, floodplain property damage costs, user costs from traffic interruptions and detours, and others. As the design discharge increases, the required capital investment increases, but the corresponding flood-related damage costs decrease. Hydraulic design of a bridge or culvert using a risk-based approach chooses, among the alternatives, the one associated with the least total expected cost. In this paper, the risk-based design procedure is applied to pipe culvert design, and the effect of hydrologic uncertainties, such as sample size and the type of flood distribution model, on the optimal culvert design parameters, including design return period and total expected cost, is examined.
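
    A toy calculation with invented costs (not the paper's data) illustrates the least-total-expected-cost criterion: capital cost grows with the design return period while expected flood-related damage falls, and the optimum minimizes their sum; discounting is ignored here for brevity:

      import numpy as np

      return_periods = np.array([2, 5, 10, 25, 50, 100])     # years
      capital_cost   = np.array([40, 55, 70, 95, 120, 150])  # k$, grows with structure size
      damage_if_exceeded = 300.0                              # k$ per flood exceeding the design discharge
      service_life = 50                                       # years

      annual_exceedance = 1.0 / return_periods
      expected_damage = damage_if_exceeded * annual_exceedance * service_life
      total_expected_cost = capital_cost + expected_damage

      best = np.argmin(total_expected_cost)
      print("optimal design return period:", return_periods[best], "years")
      print("total expected cost:", round(total_expected_cost[best], 1), "k$")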

  2. Why are probabilistic laws governing quantum mechanics and neurobiology?

    NASA Astrophysics Data System (ADS)

    Kröger, Helmut

    2005-08-01

    We address the question: Why are the dynamical laws governing quantum mechanics and neuroscience probabilistic in nature rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.

  3. Towards Probabilistic Modelling in Event-B

    NASA Astrophysics Data System (ADS)

    Tarasyuk, Anton; Troubitsyna, Elena; Laibinis, Linas

    Event-B provides us with a powerful framework for correct-by-construction system development. However, while developing dependable systems we should not only guarantee their functional correctness but also quantitatively assess their dependability attributes. In this paper we investigate how to conduct probabilistic assessment of the reliability of control systems modeled in Event-B. We show how to transform an Event-B model into a Markov model amenable to probabilistic reliability analysis. Our approach enables the integration of reasoning about correctness with quantitative analysis of reliability.

  4. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  5. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2000-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.

  6. The probabilistic approach to human reasoning.

    PubMed

    Oaksford, M; Chater, N

    2001-08-01

    A recent development in the cognitive science of reasoning has been the emergence of a probabilistic approach to the behaviour observed on ostensibly logical tasks. According to this approach the errors and biases documented on these tasks occur because people import their everyday uncertain reasoning strategies into the laboratory. Consequently participants' apparently irrational behaviour is the result of comparing it with an inappropriate logical standard. In this article, we contrast the probabilistic approach with other approaches to explaining rationality, and then show how it has been applied to three main areas of logical reasoning: conditional inference, Wason's selection task and syllogistic reasoning.

  7. Impact of Probabilistic Weather on Flight Routing Decisions

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Sridhar, Banavar; Mulfinger, Daniel

    2006-01-01

    Flight delays in the United States have been found to increase year after year, along with the increase in air traffic. During the four-month period from May through August of 2005, weather-related delays accounted for roughly 70% of all reported delays. Weather prediction in the tactical (within 2 hours) timeframe is at manageable levels; however, the state of forecasting weather for the strategic (2-6 hours) timeframe is still not dependable enough for long-term planning. In the absence of reliable severe weather forecasts, decision-making for flights longer than two hours is challenging. This paper deals with an approach of using probabilistic weather prediction for Traffic Flow Management, and a general method of using this prediction for estimating expected values of flight length and delays in the National Airspace System (NAS). The current state-of-the-art convective weather forecasting is employed to aid decision makers in arriving at decisions for traffic flow and flight planning. The six-agency effort working on the Next Generation Air Transportation System (NGATS) has considered weather-assimilated decision-making as one of its eight principal foci. The Weather Integrated Product Team has considered integrated weather information and improved aviation weather forecasts as two of its main efforts (Ref. 1, 2). Recently, research has focused on the concept of operations for strategic traffic flow management (Ref. 3) and on how weather data can be integrated for improved decision-making for efficient traffic management initiatives (Ref. 4, 5). An overview of the weather data needs and benefits of various participants in the air traffic system, along with available products, can be found in Ref. 6. Previous work related to the use of weather data in identifying and categorizing pilot intrusions into severe weather regions (Ref. 7, 8) has demonstrated a need for better forecasting in the strategic planning timeframes and moving towards a

  8. Probabilistic model for bridge structural evaluation using nondestructive inspection data

    NASA Astrophysics Data System (ADS)

    Carrion, Francisco; Lopez, Jose Alfredo; Balankin, Alexander

    2005-05-01

    A bridge management system developed for the Mexican toll highway network applies a probabilistic-reliability model to estimate load capacity and structural residual life. Basic inputs for the system are the global inspection data (visual inspections and vibration testing) and information on the environmental conditions (weather, traffic, loads, earthquakes), although the model can also take account of additional non-destructive testing or permanent monitoring data. Main outputs are the periodic maintenance, rehabilitation and replacement program, and the updated inspection program. Both programs are custom-made to available funds and scheduled according to a priority assignation criterion. The probabilistic model, tailored to typical bridges, accounts for size, age, material and structure type. Special bridges in size or type may be included, and in these cases finite element deterministic models are also possible. A key feature is that structural qualification is given in terms of the probability of failure, calculated considering fundamental degradation mechanisms and from actual direct observations and measurements, such as crack distribution and size, material properties, bridge dimensions, load deflections, and parameters for corrosion evaluation. Vibration measurements are basically used to infer structural resistance and to monitor long-term degradation.

  9. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for third parties to access a transmission facility. From this structure, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC) definition, ATC depends on several parameters, i.e. Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper is focused on the calculation of TRM, which is one of the security margins reserved for any uncertainty in system conditions. A TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limitation, various cases of transmission transfer capability and their related probabilistic nature can be calculated. By consideration of the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual ability of the network, which may be an alternative choice for system operators to make an appropriate decision in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
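
    One plausible reading of the basic mechanism is sketched below (all numbers are invented; this is not the paper's IEEJ-WEST10 calculation): sample load-forecast and line-limit errors, form the distribution of usable transfer capability, and reserve enough TRM to cover the uncertainty at a chosen confidence level:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      ttc_nominal = 1000.0                     # MW, deterministic Total Transfer Capability
      limit_error = rng.normal(0.0, 30.0, n)   # MW, error in transmission line limits
      load_error  = rng.normal(0.0, 50.0, n)   # MW, load forecast error

      # capability actually available once both uncertainties are realized
      usable = ttc_nominal + limit_error - np.maximum(load_error, 0.0)

      confidence = 0.95
      trm = ttc_nominal - np.quantile(usable, 1.0 - confidence)
      print(f"TRM at {confidence:.0%} confidence: {trm:.1f} MW")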

  10. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, phi. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model fitted to the historical database of measurements of protons with energy > 30 MeV, (phi)30. The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimation, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPEs and GCR supports mission design and operational planning to manage radiation risks for space exploration.
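
    Occurrence times from a nonhomogeneous Poisson process can be sampled by thinning, as sketched below; the intensity function here is a made-up stand-in for the fitted cycle-dependent SPE rate, not the authors' fit:

      import numpy as np

      rng = np.random.default_rng(3)
      cycle_years = 11.0

      def intensity(t):
          # SPEs per year, peaking mid-cycle; illustrative shape only
          return 0.5 + 6.0 * np.exp(-((t - 5.5) / 2.0) ** 2)

      lam_max = 7.0  # upper bound on the intensity over the cycle

      def sample_spe_times():
          times, t = [], 0.0
          while True:
              t += rng.exponential(1.0 / lam_max)        # candidate from a homogeneous process
              if t > cycle_years:
                  return np.array(times)
              if rng.random() < intensity(t) / lam_max:  # accept with ratio lambda(t)/lambda_max
                  times.append(t)

      events = sample_spe_times()
      print(len(events), "simulated SPEs at cycle times (yr):", np.round(events, 2))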

  11. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis, collection, and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  12. Simulating spatial and temporal varying CO2 signals from sources at the seafloor to help designing risk-based monitoring programs

    NASA Astrophysics Data System (ADS)

    Ali, Alfatih; Frøysa, Håvard G.; Avlesen, Helge; Alendal, Guttorm

    2016-01-01

    Risk-based monitoring requires quantification of the probability that the monitoring design will detect potentially adverse events. A component in designing the monitoring program is to predict the varying signal caused by an event, here detection of a gas seep through the seafloor from an unknown location. The Bergen Ocean Model (BOM) is used to simulate the dispersion of CO2 leaking from different locations in the North Sea, focusing on the temporal and spatial variability of the CO2 concentration. It is shown that the statistical footprint depends on the seep location and that this will have to be accounted for in designing a network of sensors with the highest probability of detecting a seep. As a consequence, heterogeneous probabilistic predictions of CO2 footprints should be made available to subsea geological CO2 storage projects in order to meet regulations.

  13. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  14. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.

  15. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
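
    A minimal version of such a Monte Carlo micromechanics simulation, using the simple rule of mixtures and assumed constituent scatter rather than the code used in the study, propagates fiber and matrix property uncertainty to the ply longitudinal modulus:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 20_000
      Ef = rng.normal(230.0, 230.0 * 0.05, n)  # GPa, graphite fiber modulus (5% scatter assumed)
      Em = rng.normal(3.5, 3.5 * 0.05, n)      # GPa, epoxy matrix modulus
      Vf = rng.normal(0.60, 0.02, n)           # fiber volume ratio

      E11 = Vf * Ef + (1.0 - Vf) * Em          # rule of mixtures, longitudinal direction

      print("mean E11  :", round(E11.mean(), 1), "GPa")
      print("std  E11  :", round(E11.std(), 1), "GPa")
      print("5-95% band:", np.round(np.percentile(E11, [5, 95]), 1), "GPa")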

  16. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  17. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.

  18. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  19. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  20. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  1. Emergency feedwater system risk-based inspection guide for the Arkansas Nuclear One Unit 2 Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.

    1992-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Arkansas Nuclear One Unit 2 was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Arkansas Nuclear One Unit 2 plant.

  2. Auxiliary feedwater system risk-based inspection guide for the Beaver Valley, Units 1 and 2 nuclear power plants

    SciTech Connect

    Lloyd, R.C.; Vehec, T.A.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.; Rossbach, L.W.; Sena, P.P. III

    1993-02-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Beaver Valley Units 1 and 2 were selected as two of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at Beaver Valley Units 1 and 2.

  3. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle

  4. A risk-based approach to land-use planning.

    PubMed

    Hauptmanns, Ulrich

    2005-10-17

    The Seveso II-Directive requires that the objectives of preventing major accidents and limiting their consequences be taken into account by the Member States in their land-use policies and/or other relevant policies. This is to be achieved by ensuring adequate distances between industrial establishments and residential areas, areas of public use and areas of particular natural sensitivity or interest. A risk-based framework implemented in a computer program is presented which enables one to calculate adequate distances. The criterion used is a limit on the individual risk of death. The method is a simplified risk analysis which represents the plant, whose characteristics are normally unknown at the stage of land-use planning, by generic frequencies of release for process units and storage tanks. Their number depends on the size of the site to be allotted. The procedure is capable of addressing the siting of new establishments and, with due regard to the simplifications used, modifications to and new developments in the vicinity of existing establishments. Given the numerous assumptions, which have to be made, the framework represents a convention.

  5. Guidelines for risk-based prioritization of DOE activities

    SciTech Connect

    1998-04-01

    This standard describes issues that should be considered when comparing, selecting, or implementing risk-based prioritization (RBP) systems. It also discusses characteristics that should be used in evaluating the quality of an RBP system and its associated results. The purpose of this standard is to provide guidance for selecting or developing an RBP system so that when implemented, it will: (a) improve the quality of the RBP systems employed by DOE and its contractors; (b) improve the consistency and comparability of RBP system results; (c) satisfy DOE requests to perform RBP; (d) help ensure that limited resources are used efficiently and effectively; (e) help ensure that characteristics for evaluating RBP systems are met and properly balanced; (f) promote greater understanding, use, and acceptance of RBP systems; (g) promote greater understanding between DOE and its stakeholders and regulators; (h) improve the quality of resource allocation, planning, and scheduling decisions. This standard is applicable to any and all uses of RBP by DOE elements, including cases in which RBP is requested by DOE or is used to help allocate resources among alternatives that compete for resources. Prioritizing alternatives that compete for limited resources encompasses many policy issues that are inherent to an RBP effort. It is the position of this standard that policy issues should be determined by the decision maker(s) requesting the prioritization. For additional information on policy issues, refer to section 10 on Application Guidance for Policy Issues.

  6. Risk-based maintenance of ethylene oxide production facilities.

    PubMed

    Khan, Faisal I; Haddara, Mahmoud R

    2004-05-20

    This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of the many likely failure scenarios, those which are most probable are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model, as well as the error in the distribution parameters, on the maintenance interval.
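
    The core risk calculation can be sketched generically (assumed numbers, not the ethylene oxide case study): with an exponential failure model, the risk accumulated over an inspection interval is the failure probability times the consequence, and the maintenance interval is the longest one whose risk stays below the acceptable level:

      import numpy as np

      failure_rate = 1.0e-3        # failures per year (assumed)
      consequence  = 5.0e6         # $ per failure event (assumed)
      acceptable_risk = 2.0e3      # $ per interval deemed tolerable (assumed)

      def risk(interval_years):
          # probability of at least one failure within the interval, times consequence
          p_fail = 1.0 - np.exp(-failure_rate * interval_years)
          return p_fail * consequence

      intervals = np.arange(0.1, 10.1, 0.1)
      feasible = intervals[np.array([risk(t) <= acceptable_risk for t in intervals])]
      print("longest acceptable maintenance interval:",
            round(feasible.max(), 1) if feasible.size else None, "years")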

  7. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research will instead extract, from a group of n customers, the historical data for the sum insured si of the i-th customer together with the amount paid yij and the amount aij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (yij+1, aij+1) as dependent on the present-year value (yij, aij) and the sum insured si via a conditional distribution which is derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution for the aggregate claim liabilities is found to have good ability to cover the observed aggregate claim liabilities.

  8. Risk Classification and Risk-based Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as "Class D minus" are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  9. Developing a risk-based air quality health index

    NASA Astrophysics Data System (ADS)

    Wong, Tze Wai; Tam, Wilson Wai San; Yu, Ignatius Tak Sun; Lau, Alexis Kai Hon; Pang, Sik Wing; Wong, Andromeda H. S.

    2013-09-01

    We developed a risk-based, multi-pollutant air quality health index (AQHI) reporting system in Hong Kong, based on the Canadian approach. We performed time series studies to obtain the relative risks of hospital admissions for respiratory and cardiovascular diseases associated with four air pollutants: sulphur dioxide, nitrogen dioxide, ozone, and particulate matter with an aerodynamic diameter less than 10 μm (PM10). We then calculated the sum of excess risks of the hospital admissions associated with these air pollutants. The cut-off points of the summed excess risk, for the issuance of different health warnings, were based on the concentrations of these pollutants recommended as short-term Air Quality Guidelines by the World Health Organization. The excess risks were adjusted downwards for young children and the elderly. Health risk was grouped into five categories and sub-divided into eleven bands, with equal increments in excess risk from band 1 up to band 10 (the 11th band is 'band 10+'). We developed health warning messages for the general public, including at-risk groups: young children, the elderly, and people with pre-existing cardiac or respiratory diseases. The new system addressed two major shortcomings of the current standard-based system; namely, the time lag between a sudden rise in air pollutant concentrations and the issue of a health warning, and the reliance on one dominant pollutant to calculate the index. Hence, the AQHI represents an improvement over Hong Kong's existing air pollution index.
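
    A schematic version of the excess-risk aggregation is shown below; the beta coefficients and band width are placeholders, not the published Hong Kong or Canadian values:

      import math

      # relative-risk coefficients per unit concentration (hypothetical values)
      betas = {"NO2": 0.0004, "SO2": 0.0003, "O3": 0.0005, "PM10": 0.0003}
      concentrations = {"NO2": 80.0, "SO2": 15.0, "O3": 60.0, "PM10": 55.0}  # ug/m3

      # summed percentage excess risk of hospital admissions
      excess_risk = sum(100.0 * (math.exp(b * concentrations[p]) - 1.0)
                        for p, b in betas.items())

      # map the summed excess risk onto equal-width bands 1..10, with 10+ above the top edge
      band_width = 1.0  # % excess risk per band (assumed)
      band = min(int(excess_risk // band_width) + 1, 11)
      print(f"summed excess risk: {excess_risk:.2f}%  ->  band {'10+' if band == 11 else band}")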

  10. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    Various NASA Langley Research Center and other center projects were analyzed in an attempt to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and usage of PRA techniques were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the Cold War and the Challenger disaster, it was found that an ingrained anti-PRA culture is hard to change. Examples of skepticism against the use of risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques is much broader than just the traditional safety-related areas of application. The time has come to begin to apply these techniques uniformly. A risk-based system can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, from pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.

  11. Selecting a risk-based tool to aid in decision making

    SciTech Connect

    Bendure, A.O.

    1995-03-01

    Selecting a risk-based tool to aid in decision making is as much of a challenge as properly using the tool once it has been selected. Failure to consider customer and stakeholder requirements and the technical bases and differences among risk-based decision-making tools will produce confounding and/or politically unacceptable results when the tool is used. Selecting a risk-based decision-making tool must therefore be undertaken with the same, if not greater, rigor than the use of the tool once it is selected. This paper presents a process for selecting a risk-based tool appropriate to a set of prioritization or resource allocation tasks, discusses the results of applying the process to four risk-based decision-making tools, and identifies the "musts" for successful selection and implementation of a risk-based tool to aid in decision making.

  12. How probabilistic risk assessment can mislead terrorism risk analysts.

    PubMed

    Brown, Gerald G; Cox, Louis Anthony Tony

    2011-02-01

    Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems-in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do.

  13. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.

  14. Probabilistic Structural Health Monitoring of the Orbiter Wing Leading Edge

    NASA Technical Reports Server (NTRS)

    Yap, Keng C.; Macias, Jesus; Kaouk, Mohamed; Gafka, Tammy L.; Kerr, Justin H.

    2011-01-01

    A structural health monitoring (SHM) system can contribute to the risk management of a structure operating under hazardous conditions. An example is the Wing Leading Edge Impact Detection System (WLEIDS) that monitors the debris hazards to the Space Shuttle Orbiter's Reinforced Carbon-Carbon (RCC) panels. Since Return-to-Flight (RTF) after the Columbia accident, WLEIDS was developed and subsequently deployed on board the Orbiter to detect ascent and on-orbit debris impacts, so as to support the assessment of wing leading edge structural integrity prior to Orbiter re-entry. As SHM is inherently an inverse problem, the analyses involved, including those performed for WLEIDS, tend to be associated with significant uncertainty. The use of probabilistic approaches to handle the uncertainty has resulted in the successful implementation of many development and application milestones.

  15. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.

  16. A Risk-Based Ecohydrological Approach to Assessing Environmental Flow Regimes.

    PubMed

    Mcgregor, Glenn B; Marshall, Jonathan C; Lobegeiger, Jaye S; Holloway, Dean; Menke, Norbert; Coysh, Julie

    2017-03-27

    For several decades there has been recognition that water resource development alters river flow regimes and impacts ecosystem values. Determining strategies to protect or restore flow regimes to achieve ecological outcomes is a focus of water policy and legislation in many parts of the world. However, consideration of existing environmental flow assessment approaches for application in Queensland identified deficiencies precluding their adoption. Firstly, in managing flows and using ecosystem condition as an indicator of effectiveness, many approaches ignore the fact that river ecosystems are subjected to threatening processes other than flow regime alteration. Secondly, many focus on providing flows for responses without considering how often they are necessary to sustain ecological values in the long term. Finally, few consider requirements at spatial scales relevant to the desired outcomes, with frequent focus on individual places rather than the regions supporting sustainability. Consequently, we developed a risk-based ecohydrological approach that identifies ecosystem values linked to desired ecological outcomes, is sensitive to flow alteration and uses indicators of broader ecosystem requirements. Monitoring and research are undertaken to quantify flow dependencies, and ecological modelling is used to quantify flow-related ecological responses over an historical flow period. The relative risk from different flow management scenarios can be evaluated at relevant spatial scales. This overcomes the deficiencies identified above and provides a robust and useful foundation upon which to build the information needed to support water planning decisions. Application of the risk assessment approach is illustrated here by two case studies.

  17. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  18. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  19. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.

  20. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.
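
    The Monte Carlo hazard-to-loss chain can be illustrated with a toy event-loss simulation; the event rate, magnitude model and vulnerability curve below are invented stand-ins for the region-specific components of the model:

      import numpy as np

      rng = np.random.default_rng(6)
      years = 10_000
      annual_rate = 0.2        # damaging earthquakes per year affecting the portfolio (assumed)
      exposure = 1.0e9         # total insured value (assumed)

      def mean_damage_ratio(magnitude):
          # illustrative vulnerability curve: negligible below M5, severe above M7
          return float(np.clip((magnitude - 5.0) / 2.5, 0.0, 1.0)) ** 2

      annual_losses = np.zeros(years)
      for y in range(years):
          n_events = rng.poisson(annual_rate)
          # truncated Gutenberg-Richter-like magnitudes (b = 1) between M4.5 and M7.5
          u = rng.random(n_events)
          mags = 4.5 - np.log10(1.0 - u * (1.0 - 10.0 ** -(7.5 - 4.5)))
          annual_losses[y] = sum(mean_damage_ratio(m) * exposure for m in mags)

      for rp in (100, 250, 500):
          loss = np.quantile(annual_losses, 1.0 - 1.0 / rp)
          print(f"{rp}-year loss: {loss / 1e6:.1f} M")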

  1. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  2. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.

  3. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate, that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or in deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  4. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
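
    A toy probabilistic cycle calculation conveys the idea (an ideal-gas Brayton surrogate with assumed uncertainties, not the engine model or code used in the study): sample component efficiencies and the pressure ratio, propagate them through the cycle, and examine the spread of thermal efficiency:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 50_000
      gamma, cp = 1.4, 1.005        # air, kJ/(kg K)
      T1, T3 = 288.0, 1500.0        # K, compressor inlet and turbine entry temperatures

      pr       = rng.normal(20.0, 0.5, n)    # overall pressure ratio
      eta_comp = rng.normal(0.86, 0.01, n)   # compressor isentropic efficiency
      eta_turb = rng.normal(0.90, 0.01, n)   # turbine isentropic efficiency

      tau = pr ** ((gamma - 1.0) / gamma)
      T2 = T1 * (1.0 + (tau - 1.0) / eta_comp)          # compressor exit temperature
      w_comp = cp * (T2 - T1)                           # compressor specific work
      w_turb = cp * eta_turb * T3 * (1.0 - 1.0 / tau)   # turbine specific work
      eta_th = (w_turb - w_comp) / (cp * (T3 - T2))     # cycle thermal efficiency

      print("thermal efficiency mean:", round(eta_th.mean(), 3))
      print("1% / 99% values        :", np.round(np.quantile(eta_th, [0.01, 0.99]), 3))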

  5. Probabilistic Assessment of National Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M.; Chamis, C. C.

    1996-01-01

    A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code. Thereby, the capabilities of the NESSUS code to address reliability issues of the NWT are demonstrated. Uncertainties in the geometry, material properties, loads and stiffener location on the NWT are considered in performing the reliability assessment. Probabilistic stress, frequency, buckling, fatigue and proof load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life prediction analysis results show that the life of the NWT is governed by the fatigue of welds. A reliability-based proof test assessment is also performed.

  6. Significance testing as perverse probabilistic reasoning

    PubMed Central

    2011-01-01

    Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference. PMID:21356064

  7. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim, and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  8. Bayesian Probabilistic Projection of International Migration.

    PubMed

    Azose, Jonathan J; Raftery, Adrian E

    2015-10-01

    We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.

  9. Probabilistically teleporting arbitrary two-qubit states

    NASA Astrophysics Data System (ADS)

    Choudhury, Binayak S.; Dhara, Arpan

    2016-12-01

    In this paper we make use of two non-maximally entangled three-qubit channels for probabilistically teleporting arbitrary two particle states from a sender to a receiver. We also calculate the success probability of the teleportation. In the protocol we use two measurements of which one is a POVM and the other is a projective measurement. The POVM provides the protocol with operational advantage.

  10. Probabilistic Anisotropic Failure Criteria for Composite Materials.

    DTIC Science & Technology

    1987-12-01

    Worksheets were based on Microsoft Excel software. The work analytically described the failure criterion and probabilistic failure states of an anisotropic composite in a combined stress state. (The remaining record text consists of table-of-contents fragments: Appendix F, Reliability/Failure Function Worksheet; Appendix G, Percentile Strength Worksheet.)

  11. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    A CRF, unlike an HMM, can represent local features and does not require feature concatenation. For MLNs, we used Alchemy (Alchemy 2011), an open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning; it creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate.

  12. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained, and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory that provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  13. Probabilistic Network Approach to Decision-Making

    NASA Astrophysics Data System (ADS)

    Nicolis, Grégoire; Nicolis, Stamatios C.

    2015-06-01

    A probabilistic approach to decision-making is developed in which the states of the underlying stochastic process, assumed to be of the Markov type, represent the competing options. The principal parameters determining the dominance of a particular option versus the others are identified and the transduction of information associated to the transitions between states is quantified using a set of entropy-like quantities.
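
    Two quantities the record mentions, the long-run dominance of an option and an entropy-like measure on transitions, can be illustrated with a minimal Markov-chain sketch. The transition matrix below is an invented example, not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical transition matrix over three competing options (rows sum to 1)
    P = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.2, 0.7]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # interpreted here as the long-run dominance of each option
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    pi /= pi.sum()
    print("stationary (dominance):", np.round(pi, 3))

    # Shannon entropy of each row: uncertainty about the next transition from a state
    row_entropy = -np.sum(P * np.log2(P), axis=1)
    print("per-state transition entropy (bits):", np.round(row_entropy, 3))
    ```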

  14. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.

  15. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.

  16. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
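
    The study's simulation is not reproduced here, but the general pattern it describes, probabilistic inputs propagated to a distribution of casualties over a fixed horizon rather than a single mean, can be sketched as below. The impactor classes, rates, and consequence distributions are placeholders chosen for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years, trials = 100, 50_000

    # Placeholder impactor classes: (label, annual impact rate, lognormal casualty params)
    classes = [
        ("small",  2e-2, (4.0, 1.5)),   # frequent, usually minor consequences
        ("medium", 1e-3, (9.0, 1.0)),   # Tunguska-like
        ("large",  1e-5, (13.0, 0.8)),  # regionally devastating
    ]

    totals = np.zeros(trials)
    for _, rate, (mu, sigma) in classes:
        n_impacts = rng.poisson(rate * years, trials)        # impacts per simulated century
        for i in np.nonzero(n_impacts)[0]:
            totals[i] += rng.lognormal(mu, sigma, n_impacts[i]).sum()

    print(f"P(any casualties in {years} yr)     = {np.mean(totals > 0):.3f}")
    print(f"P(> 1e5 casualties in {years} yr)   = {np.mean(totals > 1e5):.4f}")
    print(f"mean casualties (point estimate)    = {totals.mean():,.0f}")
    print(f"95th percentile of the distribution = {np.percentile(totals, 95):,.0f}")
    ```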

  17. Probabilistic Graph Layout for Uncertain Network Visualization.

    PubMed

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
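
    The Monte Carlo step described above can be sketched as follows: a probabilistic graph is decomposed into concrete instances and each instance is embedded with a force-directed layout. The edge probabilities are invented, each instance is laid out independently here (whereas the paper combines samples into one embedding), and the splatting and edge-bundling rendering steps are omitted.

    ```python
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(2)

    # Hypothetical probabilistic graph: each edge exists with probability p
    prob_edges = {("a", "b"): 0.9, ("b", "c"): 0.6, ("a", "c"): 0.3,
                  ("c", "d"): 0.8, ("b", "d"): 0.4}
    nodes = ["a", "b", "c", "d"]

    samples = 200
    positions = {v: [] for v in nodes}
    for s in range(samples):
        g = nx.Graph()
        g.add_nodes_from(nodes)
        # Monte Carlo decomposition: sample one concrete instance of the graph
        g.add_edges_from(e for e, p in prob_edges.items() if rng.random() < p)
        # Embed this instance with a force-directed (spring) layout
        pos = nx.spring_layout(g, seed=s)
        for v in nodes:
            positions[v].append(pos[v])

    # Spread of each node's position cloud gives a rough sense of layout uncertainty
    for v in nodes:
        pts = np.array(positions[v])
        print(v, "mean:", np.round(pts.mean(axis=0), 2),
              "std:", np.round(pts.std(axis=0), 2))
    ```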

  18. Probabilistic Modeling of Space Shuttle Debris Impact

    NASA Technical Reports Server (NTRS)

    Huyse, Luc J.; Asce, M.; Waldhart, Chris J.; Riha, David S.; Larsen, Curtis E.; Gomez, Reynaldo J.; Stuart, Phillip C.

    2007-01-01

    On Feb 1, 2003, the Shuttle Columbia was lost during its return to Earth. As a result of the conclusion that debris impact caused the damage to the left wing of the Columbia Space Shuttle Vehicle (SSV) during ascent, the Columbia Accident Investigation Board recommended that an assessment be performed of the debris environment experienced by the SSV during ascent. A flight rationale based on probabilistic assessment is used for the SSV return-to-flight. The assessment entails identifying all potential debris sources, their probable geometric and aerodynamic characteristics, and their potential for impacting and damaging critical Shuttle components. A probabilistic analysis tool, based on the SwRI-developed NESSUS probabilistic analysis software, predicts the probability of impact and damage to the space shuttle wing leading edge and thermal protection system components. Among other parameters, the likelihood of unacceptable damage depends on the time of release (Mach number of the orbiter) and the divot mass as well as the impact velocity and impact angle. Probability of impact and damage, as well as the sensitivities thereof with respect to the distribution assumptions, can be computed and visualized at each point on the orbiter or summarized per wing panel or tile zone.

  19. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree, (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts.

  20. Multiclient Identification System Using Adaptive Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Siana, Linda; Shou, Yu-Wen; Yang, Chien-Ting

    2010-12-01

    This paper aims at integrating the detection and identification of human faces in a practical, real-time face recognition system. The proposed face detection system is based on the cascade Adaboost method to improve precision and robustness under unstable ambient lighting. Our Adaboost method adjusts for environmental lighting conditions by histogram lighting normalization and accurately locates face regions through a region-based clustering process. We also address the problem of multi-scale faces by using 12 different scales of search windows and 5 different orientations for each client, in pursuit of multi-view independent face identification. There are two main methodological parts in our face identification system: PCA (principal component analysis) facial feature extraction and an adaptive probabilistic model (APM). The implemented APM, a weighted combination of simple probabilistic functions, constructs the likelihood functions through the probabilistic constraint in the similarity measures. In addition, the proposed method can add a new client online and update the information of registered clients through the constructed APM. The experimental results show the superior performance of the proposed system for both offline and real-time online testing.

  1. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition that probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  2. A method for probabilistic flash flood forecasting

    NASA Astrophysics Data System (ADS)

    Hardy, Jill; Gourley, Jonathan J.; Kirstetter, Pierre-Emmanuel; Hong, Yang; Kong, Fanyou; Flamig, Zachary L.

    2016-10-01

    Flash flooding is one of the most costly and deadly natural hazards in the United States and across the globe. This study advances the use of high-resolution quantitative precipitation forecasts (QPFs) for flash flood forecasting. The QPFs are derived from a stormscale ensemble prediction system, and used within a distributed hydrological model framework to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Before creating the PFFFs, it is important to characterize QPF uncertainty, particularly in terms of location which is the most problematic for hydrological use of QPFs. The SAL methodology (Wernli et al., 2008), which stands for structure, amplitude, and location, is used for this error quantification, with a focus on location. Finally, the PFFF methodology is proposed that produces probabilistic hydrological forecasts. The main advantages of this method are: (1) identifying specific basin scales that are forecast to be impacted by flash flooding; (2) yielding probabilistic information about the forecast hydrologic response that accounts for the locational uncertainties of the QPFs; (3) improving lead time by using stormscale NWP ensemble forecasts; and (4) not requiring multiple simulations, which are computationally demanding.

  3. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How to report your risk-based capital determination. 652.90 Section 652.90 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.90 How...

  4. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Audit of the risk-based capital stress test. 652.100 Section 652.100 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.100 Audit...

  5. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Risk-based capital credit risk-weight... CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk...)(2) of this section), plus risk-weighted recourse obligations, direct credit substitutes, and...

  6. 12 CFR 652.80 - When you must determine the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false When you must determine the risk-based capital level. 652.80 Section 652.80 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.80 When...

  7. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... losses as support for the credit risk of all AMA estimated by the Bank to represent a credit risk that is...) Recalculation of credit enhancement. For risk-based capital purposes, each Bank shall recalculate the estimated... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement for...

  8. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false When to report the risk-based capital level. 652.85 Section 652.85 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.85 When...

  9. 12 CFR 652.75 - Your responsibility for determining the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Your responsibility for determining the risk-based capital level. 652.75 Section 652.75 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based...

  10. 12 CFR 956.4 - Risk-based capital requirement for investments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... credit risk of all investments that are not rated by an NRSRO, or are rated or have a putative rating... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement for investments... OFF-BALANCE SHEET ITEMS FEDERAL HOME LOAN BANK INVESTMENTS § 956.4 Risk-based capital requirement...

  11. 12 CFR 702.103 - Applicability of risk-based net worth requirement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Applicability of risk-based net worth... AFFECTING CREDIT UNIONS PROMPT CORRECTIVE ACTION Net Worth Classification § 702.103 Applicability of risk-based net worth requirement. For purposes of § 702.102, a credit union is defined as “complex” and...

  12. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  13. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  14. WE-B-BRC-00: Concepts in Risk-Based Assessment.

    PubMed

    Fraass, Benedick

    2016-06-01

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
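
    Where the session discusses FMEA as a prospective risk-assessment tool, the customary scoring step can be illustrated in a few lines: each failure mode receives occurrence, severity, and detectability scores, and their product (a risk priority number, as used in TG-100-style analyses) ranks the modes for mitigation. The failure modes and scores below are hypothetical.

    ```python
    # Hypothetical FMEA table: (failure mode, occurrence O, severity S, lack of detectability D)
    # Scores on a 1-10 scale; RPN = O * S * D ranks which modes to mitigate first.
    failure_modes = [
        ("wrong CT dataset imported",         3, 9, 4),
        ("MLC calibration drift undetected",  4, 7, 6),
        ("plan transferred to wrong patient", 2, 10, 3),
        ("incorrect couch shift applied",     5, 6, 5),
    ]

    ranked = sorted(
        ((name, o * s * d) for name, o, s, d in failure_modes),
        key=lambda item: item[1],
        reverse=True,
    )

    for name, rpn in ranked:
        print(f"RPN {rpn:4d}  {name}")
    ```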

  15. A qualitative phenomenological study: Enhanced, risk-based FAA oversight on part 145 maintenance practices

    NASA Astrophysics Data System (ADS)

    Sheehan, Bryan G.

    The purpose of this qualitative phenomenological study was to examine the phenomenon of enhanced, risk-based Federal Aviation Administration (FAA) oversight of Part 145 repair stations that performed aircraft maintenance for Part 121 air carriers between 2007 and 2014 in Oklahoma. Specifically, this research was utilized to explore what operational changes have occurred in the domestic Part 145 repair station industry such as variations in management or hiring practices, training, recordkeeping and technical data, inventory and aircraft parts supply-chain logistics, equipment, and facilities. After interviewing 12 managers from Part 145 repair stations in Oklahoma, six major theme codes emerged from the data: quality of oversight before 2007, quality of oversight after 2007, advantages of oversight, disadvantages of oversight, status quo of oversight, and process improvement . Of those six major theme codes, 17 subthemes appeared from the data that were used to explain the phenomenon of enhanced oversight in the Part 145 repair station industry. Forty-two percent of the participants indicated a weak FAA oversight system that has hindered the continuous process improvement program in their repair stations. Some of them were financially burdened after hiring additional full-time quality assurance inspectors to specifically manage enhanced FAA oversight. Notwithstanding, the participants of the study indicated that the FAA must apply its surveillance on a more standardized and consistent basis. They want to see this standardization in how FAA inspectors interpret regulations and practice the same quality of oversight for all repair stations, particularly those that are repeat violators and fail to comply with federal aviation regulations. They believed that when the FAA enforces standardization on a consistent basis, repair stations can become more efficient and safer in the performance of their scope of work for the U.S. commercial air transportation industry.

  16. Developing and evaluating distributions for probabilistic human exposure assessments

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.

    2002-08-01

    This report describes research carried out at the Lawrence Berkeley National Laboratory (LBNL) to assist the U. S. Environmental Protection Agency (EPA) in developing a consistent yet flexible approach for evaluating the inputs to probabilistic risk assessments. The U.S. EPA Office of Emergency and Remedial Response (OERR) recently released Volume 3 Part A of Risk Assessment Guidance for Superfund (RAGS), as an update to the existing two-volume set of RAGS. The update provides policy and technical guidance on performing probabilistic risk assessment (PRA). Consequently, EPA risk managers and decision-makers need to review and evaluate the adequacy of PRAs for supporting regulatory decisions. A critical part of evaluating a PRA is the problem of evaluating or judging the adequacy of input distributions PRA. Although the overarching theme of this report is the need to improve the ease and consistency of the regulatory review process, the specific objectives are presented in two parts. The objective of Part 1 is to develop a consistent yet flexible process for evaluating distributions in a PRA by identifying the critical attributes of an exposure factor distribution and discussing how these attributes relate to the task-specific adequacy of the input. This objective is carried out with emphasis on the perspective of a risk manager or decision-maker. The proposed evaluation procedure provides consistency to the review process without a loss of flexibility. As a result, the approach described in Part 1 provides an opportunity to apply a single review framework for all EPA regions and yet provide the regional risk manager with the flexibility to deal with site- and case-specific issues in the PRA process. However, as the number of inputs to a PRA increases, so does the complexity of the process for calculating, communicating and managing risk. As a result, there is increasing effort required of both the risk professionals performing the analysis and the risk manager

  17. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions present a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then used in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for a fully probabilistic hazard evaluation and risk assessment.

  18. An evaluation of the role of risk-based decision-making in a former manufactured gas plant site remediation.

    PubMed

    Vyas, Vikram M; Gochfeld, Michael G; Georgopoulos, Panos G; Lioy, Paul J; Sussman, Nancy R

    2006-02-01

    Environmental remediation decisions are driven by the need to minimize human health and ecological risks posed by environmental releases. The Risk Assessment Guidance for Superfund Sites enunciates the principles of exposure and risk assessment that are to be used for reaching remediation decisions for sites under Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). Experience with remediation management under CERCLA has led to recognition of some crucial infirmities in the processes for managing remediation: cleanup management policies are ad hoc in character, mandates and practices are strongly conservative, and contaminant risk management occurs in an artificially narrow context. The purpose of this case study is to show how a policy of risk-based decision-making was used to avoid customary pitfalls in site remediation. This case study describes the risk-based decision-making process in a remedial action program at a former manufactured gas plant site that successfully achieved timely and effective cleanup. The remediation process operated outside the confines of the CERCLA process under an administrative consent order between the utility and the New Jersey Department of Environmental Protection. A residential use end state was negotiated as part of this agreement. The attendant uncertainties, complications, and unexpected contingencies were overcome by using the likely exposures associated with the desired end state to structure all of the remediation management decisions and by collecting site-specific information from the very outset to obtain a detailed and realistic characterization of human health risks that needed to be mitigated. The lessons from this case study are generalizable to more complicated remediation cases, when supported by correspondingly sophisticated technical approaches.

  19. Probabilistic alternatives to Bayesianism: the case of explanationism

    PubMed Central

    Douven, Igor; Schupbach, Jonah N.

    2015-01-01

    There has been a probabilistic turn in contemporary cognitive science. Far and away, most of the work in this vein is Bayesian, at least in name. Coinciding with this development, philosophers have increasingly promoted Bayesianism as the best normative account of how humans ought to reason. In this paper, we make a push for exploring the probabilistic terrain outside of Bayesianism. Non-Bayesian, but still probabilistic, theories provide plausible competitors both to descriptive and normative Bayesian accounts. We argue for this general idea via recent work on explanationist models of updating, which are fundamentally probabilistic but assign a substantial, non-Bayesian role to explanatory considerations. PMID:25964769

  20. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When CHR instruments are used for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly, practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review the risk or cumulative incidence of psychosis, the person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may influence risk management, especially as regards initiating or withholding treatment.
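
    Several of the measures listed above follow directly from Bayes' theorem; the sketch below computes likelihood ratios and positive and negative predictive values from sensitivity, specificity, and a pre-test probability. The numerical values are illustrative, not figures from the guide.

    ```python
    def predictive_values(sensitivity, specificity, pretest_prob):
        """Post-test probabilities via Bayes' theorem for a binary test and outcome."""
        lr_pos = sensitivity / (1 - specificity)        # likelihood ratio, positive result
        lr_neg = (1 - sensitivity) / specificity        # likelihood ratio, negative result
        pretest_odds = pretest_prob / (1 - pretest_prob)
        post_odds_pos = pretest_odds * lr_pos
        post_odds_neg = pretest_odds * lr_neg
        ppv = post_odds_pos / (1 + post_odds_pos)       # P(outcome | positive test)
        npv = 1 - post_odds_neg / (1 + post_odds_neg)   # P(no outcome | negative test)
        return ppv, npv, lr_pos, lr_neg

    # Illustrative numbers only: sensitivity 0.90, specificity 0.50, pre-test probability 0.20
    ppv, npv, lr_pos, lr_neg = predictive_values(0.90, 0.50, 0.20)
    print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
    ```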

  1. Risk based in vitro performance assessment of extended release abuse deterrent formulations.

    PubMed

    Xu, Xiaoming; Gupta, Abhay; Al-Ghabeish, Manar; Calderon, Silvia N; Khan, Mansoor A

    2016-03-16

    High strength extended release opioid products, which are indispensable tools in the management of pain, are associated with serious risks of unintentional and potentially fatal overdose, as well as of misuse and abuse that might lead to addiction. The issue of drug abuse becomes increasingly prominent when the dosage forms can be readily manipulated to release a high amount of opioid or to extract the drug in certain products or solvents. One approach to deter opioid drug abuse is by providing novel abuse deterrent formulations (ADF), with properties that may be viewed as barriers to abuse of the product. However, unlike regular extended release formulations, assessment of ADF technologies is challenging, in part due to the great variety of formulation designs available to achieve deterrence of abuse by oral, parenteral, nasal and respiratory routes. With limited prior history or literature information, and lack of compendial standards, evaluation and regulatory approval of these novel drug products become increasingly difficult. The present article describes a risk-based standardized in-vitro approach that can be utilized in general evaluation of abuse deterrent features for all ADF products.

  2. Risk based In Vitro Performance Assessment of Extended Release Abuse Deterrent Formulations

    PubMed Central

    Xu, Xiaoming; Gupta, Abhay; Al-Ghabeish, Manar; Calderon, Silvia N.; Khan, Mansoor A.

    2016-01-01

    High strength extended release opioid products, which are indispensable tools in the management of pain, are associated with serious risks of unintentional and potentially fatal overdose, as well as of misuse and abuse that might lead to addiction. The issue of drug abuse becomes increasingly prominent when the dosage forms can be readily manipulated to release a high amount of opioid or to extract the drug in certain products or solvents. One approach to deter opioid drug abuse is by providing novel abuse deterrent formulations (ADF), with properties that may be viewed as barriers to abuse of the product. However, unlike regular extended release formulations, assessment of ADF technologies is challenging, in part due to the great variety of formulation designs available to achieve deterrence of abuse by oral, parenteral, nasal and respiratory routes. With limited prior history or literature information, and lack of compendial standards, evaluation and regulatory approval of these novel drug products become increasingly difficult. The present article describes a risk-based standardized in-vitro approach that can be utilized in general evaluation of abuse deterrent features for all ADF products. PMID:26784976

  3. Risk-Based Decision Process for Accelerated Closure of a Nuclear Weapons Facility

    SciTech Connect

    Butler, L.; Norland, R. L.; DiSalvo, R.; Anderson, M.

    2003-02-25

    Nearly 40 years of nuclear weapons production at the Rocky Flats Environmental Technology Site (RFETS or Site) resulted in contamination of soil and underground systems and structures with hazardous substances, including plutonium, uranium and hazardous waste constituents. The Site was placed on the National Priority List in 1989. There are more than 370 Individual Hazardous Substance Sites (IHSSs) at RFETS. Accelerated cleanup and closure of RFETS is being achieved through implementation and refinement of a regulatory framework that fosters programmatic and technical innovations: (1) extensive use of ''accelerated actions'' to remediate IHSSs, (2) development of a risk-based screening process that triggers and helps define the scope of accelerated actions consistent with the final remedial action objectives for the Site, (3) use of field instrumentation for real time data collection, (4) a data management system that renders near real time field data assessment, and (5) a regulatory agency consultative process to facilitate timely decisions. This paper presents the process and interim results for these aspects of the accelerated closure program applied to Environmental Restoration activities at the Site.

  4. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  5. Probabilistic design analysis using Composite Loads Spectra (CLS) coupled with Probabilistic Structural Analysis Methodologies (PSAM)

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H.

    1989-01-01

    Composite loads spectra (CLS) were applied to generate probabilistic loads for use in the PSAM nonlinear evaluation of stochastic structures under stress (NESSUS) finite element code. The CLS approach allows for quantifying loads as mean values and distributions around a central value rather than maximum or enveloped values typically used in deterministic analysis. NESSUS uses these loads to determine mean and perturbation responses. These results are probabilistically evaluated with the distributional information from CLS using a fast probabilistic integration (FPI) technique to define response distributions. The main example discussed describes a method of obtaining load descriptions and stress response of the second-stage turbine blade of the Space Shuttle Main Engine (SSME) high-pressure fuel turbopump (HPFTP). Additional information is presented on the on-going analysis of the high pressure oxidizer turbopump discharge duct (HPOTP) where probabilistic dynamic loads have been generated and are in the process of being used for dynamic analysis. Example comparisons of load analysis and engine data are furnished for partial verification and/or justification for the methodology.

  6. Heterogeneous Uncertainty Management

    DTIC Science & Technology

    2008-03-08

    This work addresses heterogeneous temporal probabilistic (HTP) agents, the concept of probabilistic versions of XML and RDF, and probabilistic methods to reason about collections of moving objects. HTP agents can build temporal probabilistic reasoning capabilities on top of multiple databases and software.

  7. Probabilistic priority assessment of nurse calls.

    PubMed

    Ongenae, Femke; Myny, Dries; Dhaene, Tom; Defloor, Tom; Van Goubergen, Dirk; Verhoeve, Piet; Decruyenaere, Johan; De Turck, Filip

    2014-05-01

    Current nurse call systems are very static. Call buttons are fixed to the wall, and systems do not account for various factors specific to a situation. We have developed a software platform, the ontology-based Nurse Call System (oNCS), which supports the transition to mobile and wireless nurse call buttons and uses an intelligent algorithm to address nurse calls. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff and patients into account by using an ontology. This article describes a probabilistic extension of the oNCS that supports a more sophisticated nurse call algorithm by dynamically assigning priorities to calls based on the risk factors of the patient and the kind of call. The probabilistic oNCS is evaluated through implementation of a prototype and simulations, based on a detailed dataset obtained from 3 nursing departments of Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls among nurses, and the assignment of priorities to calls are compared for the oNCS and the current nurse call system. Additionally, the performance of the system and the parameters of the priority assignment algorithm are explored. The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the probabilistic oNCS significantly improves the assignment of nurses to calls. Calls generally result in a nurse being present more quickly, the workload distribution among the nurses improves, and the priorities and kinds of calls are taken into account.

  8. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. To develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments of global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in surface air temperature in the physics ensemble experiments with MRI-AGCM was within one standard deviation of the CMIP3 models in the Asia region. The variability of precipitation, on the other hand, was relatively well represented compared with the variation across the CMIP3 models. Models that show similar reproducibility of the present climate show different future climate changes, and we could not find clear relationships between present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan quantifies the structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
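
    The cited weighting scheme (Krishnamurti et al., Science, 1999) fits regression weights to ensemble members against observations over a training period and then applies them to forecasts. The minimal sketch below does this with synthetic data, since the actual MRI-AGCM and CMIP3 fields are not available here.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic "truth" and three biased/noisy ensemble members over a training period
    t = np.arange(240)                              # e.g., 20 years of monthly values
    truth = 15 + 5 * np.sin(2 * np.pi * t / 12)
    members = np.stack([
        truth + rng.normal(1.5, 1.0, t.size),       # warm-biased member
        truth + rng.normal(-2.0, 1.5, t.size),      # cold-biased, noisier member
        truth + rng.normal(0.5, 0.7, t.size),       # relatively good member
    ], axis=1)

    # Superensemble-style weights: least-squares regression of truth on the members
    X = np.column_stack([np.ones(t.size), members])
    coef, *_ = np.linalg.lstsq(X, truth, rcond=None)

    # Apply the trained weights to a new (here, perturbed) set of member forecasts
    forecast_members = members + rng.normal(0, 0.5, members.shape)
    weighted = np.column_stack([np.ones(t.size), forecast_members]) @ coef

    print("weights (intercept first):", np.round(coef, 3))
    print("RMSE, simple mean       :", np.sqrt(np.mean((forecast_members.mean(1) - truth) ** 2)).round(3))
    print("RMSE, weighted ensemble :", np.sqrt(np.mean((weighted - truth) ** 2)).round(3))
    ```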

  9. A Probabilistic Tsunami Hazard Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Gonzalez, Frank; Geist, Eric; Jaffe, Bruce; Kanoglu, Utku; Mofjeld, Harold; Synolakis, Costas; Titov, Vasily; Arcas, Diego

    2010-05-01

    A methodology for probabilistic tsunami hazard assessment (PTHA) will be described for multiple near- and far-field seismic sources. The method integrates tsunami inundation modeling with the approach of probabilistic seismic hazard assessment (PSHA). A database of inundation simulations is developed, with each simulation corresponding to an earthquake source for which the seismic parameters and mean interevent time have been estimated. A Poissonian model is then adopted for estimating the probability that tsunami flooding will exceed a given level during a specified period of time, taking into account multiple sources and multiple causes of uncertainty. Uncertainty in the tidal stage at tsunami arrival is dealt with by developing a parametric expression for the probability density function of the sum of the tides and a tsunami; uncertainty in the slip distribution of the near-field source is dealt with probabilistically by considering multiple sources in which width and slip values vary, subject to the constraint of a constant moment magnitude. The method was applied to Seaside, Oregon, to obtain estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. These results will be presented and discussed, including the primary remaining sources of uncertainty -- those associated with interevent time estimates, the modeling of background sea level, and temporal changes in bathymetry and topography. PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk.
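
    The Poissonian step described above can be written compactly: if lambda is the combined annual rate at which tsunami flooding exceeds a given level (summing contributions from all sources), the probability of at least one exceedance in T years is 1 - exp(-lambda*T), and annual exceedance probabilities of 1% and 0.2% correspond to the 100- and 500-year levels mentioned in the record. The per-source interevent times and conditional exceedance probabilities below are invented for illustration.

    ```python
    import math

    # Hypothetical sources: (mean interevent time in years, P(flooding > level | event))
    sources = [
        (500.0, 0.60),    # near-field subduction source
        (200.0, 0.05),    # far-field source A
        (1000.0, 0.30),   # far-field source B
    ]

    # Combined annual rate of exceeding the chosen flooding level
    lam = sum(p_exceed / mean_interevent for mean_interevent, p_exceed in sources)

    for horizon in (50, 100, 500):
        p = 1.0 - math.exp(-lam * horizon)    # Poisson probability of at least one exceedance
        print(f"P(exceedance in {horizon:3d} yr) = {p:.3f}")

    print(f"annual exceedance probability = {1.0 - math.exp(-lam):.4f}")
    ```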

  10. Risk-Based Disposal Plan for PCB Paint in the TRA Fluorinel Dissolution Process Mockup and Gamma Facilities Canal

    SciTech Connect

    R. A. Montgomery

    2008-05-01

    This Toxic Substances Control Act Risk-Based Polychlorinated Biphenyl Disposal plan was developed for the Test Reactor Area Fluorinel Dissolution Process Mockup and Gamma Facilities Waste System, located in Building TRA-641 at the Reactor Technology Complex, Idaho National Laboratory Site, to address painted surfaces in the empty canal under 40 CFR 761.62(c) for paint, and under 40 CFR 761.61(c) for PCBs that may have penetrated into the concrete. The canal walls and floor will be painted with two coats of contrasting non-PCB paint and labeled as PCB. The canal is covered with open decking; the access grate is locked shut and signed to indicate PCB contamination in the canal. Access to the canal will require facility manager permission. Protective equipment for personnel and equipment entering the canal will be required. Waste from the canal, generated during ultimate Decontamination and Decommissioning, shall be managed and disposed as PCB Bulk Product Waste.

  11. Inferring cellular networks using probabilistic graphical models.

    PubMed

    Friedman, Nir

    2004-02-06

    High-throughput genome-wide molecular assays, which probe cellular networks from different perspectives, have become central to molecular biology. Probabilistic graphical models are useful for extracting meaningful biological insights from the resulting data sets. These models provide a concise representation of complex cellular networks by composing simpler submodels. Procedures based on well-understood principles for inferring such models from data facilitate a model-based methodology for analysis and discovery. This methodology and its capabilities are illustrated by several recent applications to gene expression data.

  12. Probabilistic graphical models for genetic association studies.

    PubMed

    Mourad, Raphaël; Sinoquet, Christine; Leray, Philippe

    2012-01-01

    Probabilistic graphical models have been widely recognized as a powerful formalism in the bioinformatics field, especially in gene expression studies and linkage analysis. Although less well known in association genetics, many successful methods have recently emerged to dissect the genetic architecture of complex diseases. In this review article, we cover the applications of these models to the population association studies' context, such as linkage disequilibrium modeling, fine mapping and candidate gene studies, and genome-scale association studies. Significant breakthroughs of the corresponding methods are highlighted, but emphasis is also given to their current limitations, in particular, to the issue of scalability. Finally, we give promising directions for future research in this field.

  13. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may be not well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical

  14. Probabilistic double guarantee kidnapping detection in SLAM.

    PubMed

    Tian, Yang; Ma, Shugen

    2016-01-01

    To determine whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method was previously proposed. DGKD performs well in relatively small environments; however, our recent work found a limitation of DGKD in large-scale environments. To increase the adaptability of DGKD in a large-scale environment, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probability of features' positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.

  15. Probabilistic Algorithm for Sampler Siting (PASS)

    SciTech Connect

    Lorenzetti, David M.; Sohn, Michael D.

    2007-05-29

    PASS (Probabilistic Approach to Sampler Siting) optimizes the placement of samplers in buildings. The program exhaustively checks every sampler network that can be formed, evaluating each against user-supplied simulations of the possible release scenarios. The program identifies the networks that maximize the probability of detecting a release from among the suite of user-supplied scenarios. The user may specify how many networks to report, in order to provide a number of choices in cases where many networks have very similar behavior.
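
    The exhaustive search the record describes can be sketched in a few lines: given a Boolean matrix indicating whether a candidate sampler location would detect each simulated release scenario, every network of a fixed size is scored by the fraction of scenarios it detects, and the best few are reported. The detection matrix below is a made-up stand-in for the user-supplied simulations.

    ```python
    from itertools import combinations
    import numpy as np

    rng = np.random.default_rng(4)

    n_locations, n_scenarios, network_size, n_report = 8, 50, 3, 5

    # Hypothetical detection matrix: detects[i, j] = True if a sampler at location i
    # would detect release scenario j (in practice this comes from transport simulations)
    detects = rng.random((n_locations, n_scenarios)) < 0.3

    results = []
    for network in combinations(range(n_locations), network_size):
        # A scenario counts as detected if any sampler in the network detects it
        detected = detects[list(network)].any(axis=0)
        results.append((detected.mean(), network))

    results.sort(reverse=True)
    for frac, network in results[:n_report]:
        print(f"network {network}: detects {frac:.0%} of scenarios")
    ```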

  16. a Probabilistic Approach to Robotic Nde Inspection

    NASA Astrophysics Data System (ADS)

    Summan, R.; Dobie, G.; Hensman, J.; Pierce, S. G.; Worden, K.

    2010-02-01

    The application of wireless robotic inspection vehicles equipped with different NDE payloads has been introduced previously, with emphasis placed on inspection applications in hazardous and inaccessible environments. A particular challenge to the practical application of such robotic inspection lies in the localization of the devices. The authors here consider a probabilistic approach to both the positioning and defect problems by using the location of the robot and the NDE measurements (acquired from the onboard transducers) to make inference about defect existence and position. Using a particle filter approach running locally on the robots, the vehicle location is tracked by fusing noisy redundant data sets supplying positional information.
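
    The particle-filter localization step mentioned above can be illustrated with a minimal bootstrap filter that fuses noisy, redundant position fixes. The 1-D setting, motion model, and noise levels are simplifying assumptions for the sketch, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_particles, n_steps = 2000, 40
    speed, motion_noise, meas_noise = 0.5, 0.05, 0.3

    true_pos = 0.0
    particles = rng.normal(0.0, 1.0, n_particles)     # initial position belief
    weights = np.full(n_particles, 1.0 / n_particles)

    for _ in range(n_steps):
        # True vehicle motion and two redundant, noisy position measurements
        true_pos += speed
        z = true_pos + rng.normal(0.0, meas_noise, size=2)

        # Predict: propagate particles through the motion model with noise
        particles += speed + rng.normal(0.0, motion_noise, n_particles)

        # Update: weight particles by the likelihood of both measurements (data fusion)
        for zi in z:
            weights *= np.exp(-0.5 * ((zi - particles) / meas_noise) ** 2)
        weights /= weights.sum()

        # Resample to avoid weight degeneracy
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

    print(f"true position      = {true_pos:.2f}")
    print(f"estimated position = {particles.mean():.2f} +/- {particles.std():.2f}")
    ```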

  17. A periodic probabilistic photonic cluster state generator

    NASA Astrophysics Data System (ADS)

    Fanto, Michael L.; Smith, A. Matthew; Alsing, Paul M.; Tison, Christopher C.; Preble, Stefan F.; Lott, Gordon E.; Osman, Joseph M.; Szep, Attila; Kim, Richard S.

    2014-10-01

    The research detailed in this paper describes a Periodic Cluster State Generator (PCSG) consisting of a monolithic integrated waveguide device that employs four wave mixing, an array of probabilistic photon guns, single mode sequential entanglers and an array of controllable entangling gates between modes to create arbitrary cluster states. Utilizing the PCSG one is able to produce a cluster state with nearest neighbor entanglement in the form of a linear or square lattice. Cluster state resources of this type have been proven to be able to perform universal quantum computation.

  18. Probabilistic Interpolation of Wind Hazard Maps

    NASA Astrophysics Data System (ADS)

    Xu, L.

    2012-12-01

    Wind hazard maps are widely used to compute design loads and to evaluate insurance risks. While building codes often provide these maps for only a few return periods, wind hazard maps for other return periods are often needed for risk assessments. In this study, we evaluate a probabilistic interpolation approach for deriving wind hazard maps for return periods other than those available. The probabilistic interpolation approach assumes that probabilities of wind values in a wind hazard map follow a Gumbel distribution. Although most studies have been performed on data from individual weather stations, it remains to be seen how well the Gumbel distribution-based interpolation performs for wind hazard maps. The Gumbel distribution F(V) = exp{-exp[-α(V - u)]} is assumed for wind speed at a wind map location, where α and u are parameters that vary with location. V_T = u + (1/α) ln T is the wind speed of return period T when T is large. If T_0 and T_1 are two given return periods and T_1 is greater, then V_T = (1 - θ) V_T0 + θ V_T1, where θ = (ln T - ln T_0)/(ln T_1 - ln T_0). Therefore, V_T is a weighted average of V_T0 and V_T1. Here we select the US and Mexican hazard maps to evaluate the probabilistic interpolation method. In ASCE 7-10 wind maps, the basic wind speed has a single value for most inland areas, which is 54, 51, and 47 m/s for 1700-year, 700-year, and 300-year return periods, respectively. We use the 1700-year and 300-year values to obtain the 700-year value using the Gumbel distribution-based interpolation. The computed 700-year value is 50.4 m/s compared to 51 m/s provided in the code, about 1% difference. For coastal regions subjected to hurricane winds, the relative error between the interpolated 700-year values and the original 700-year values is within 2% for most areas except for areas where hurricane zones transition to inland non-hurricane zones; there the relative errors can increase to 4%. The Mexican wind code includes wind maps of three return periods: 10
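
    The interpolation formula above can be checked directly against the numbers quoted in the record: using the ASCE 7-10 inland values of 47 m/s (300-year) and 54 m/s (1700-year), the interpolated 700-year speed comes out near 50.4 m/s.

    ```python
    import math

    def interpolate_return_period(v_t0, v_t1, t0, t1, t):
        """Gumbel-based interpolation: V_T is a log(T)-weighted average of V_T0 and V_T1."""
        theta = (math.log(t) - math.log(t0)) / (math.log(t1) - math.log(t0))
        return (1.0 - theta) * v_t0 + theta * v_t1

    v_700 = interpolate_return_period(v_t0=47.0, v_t1=54.0, t0=300, t1=1700, t=700)
    print(f"interpolated 700-year wind speed = {v_700:.1f} m/s")   # about 50.4 m/s vs 51 m/s in the code
    ```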

  19. Enhancement of the Probabilistic CEramic Matrix Composite ANalyzer (PCEMCAN) Computer Code

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2000-01-01

    This is the final technical report for Order No. C-78019-J, "Enhancement of the Probabilistic Ceramic Matrix Composite Analyzer (PCEMCAN) Computer Code." The scope of the enhancement is the probabilistic evaluation of the D-matrix terms in the MAT2 and MAT9 material property cards (available in the CEMCAN code) for MSC/NASTRAN. Technical activities performed from June 1, 1999 through September 3, 1999 are summarized, and the final version of the enhanced PCEMCAN code and revisions to the User's Manual are delivered with this report. The activities performed were discussed with the NASA Project Manager during the performance period. The enhanced capabilities have been demonstrated using sample problems.

  20. The probabilistic cell: implementation of a probabilistic inference by the biochemical mechanisms of phototransduction.

    PubMed

    Houillon, Audrey; Bessière, Pierre; Droulez, Jacques

    2010-09-01

    When we perceive the external world, our brain has to deal with the incompleteness and uncertainty associated with sensory inputs, memory and prior knowledge. In theoretical neuroscience, probabilistic approaches have recently received growing interest, as they account for the ability to reason with incomplete knowledge and to efficiently describe perceptive and behavioral tasks. How can the probability distributions that need to be estimated in these models be represented and processed in the brain, in particular at the single-cell level? We consider the basic function carried out by photoreceptor cells, which consists of detecting the presence or absence of light. We give a system-level understanding of the process of phototransduction based on a Bayesian formalism: we show that the process of phototransduction is equivalent to a temporal probabilistic inference in a Hidden Markov Model (HMM) for estimating the presence or absence of light. Thus, the biochemical mechanisms of phototransduction underlie the estimation of the current state probability distribution of the presence of light. A classical descriptive model describes the interactions between the different molecular messengers, ions, enzymes and channel proteins occurring within the photoreceptor by a set of nonlinear coupled differential equations. In contrast, the probabilistic HMM model is described by a discrete recurrence equation. It appears that the binary HMM has a general solution in the case of constant input. This allows a detailed analysis of the dynamics of the system. The biochemical system and the HMM behave similarly under steady-state conditions. Consequently, a formal equivalence can be found between the biochemical system and the HMM. Numerical simulations further extend the results to the dynamic case and to noisy input. All in all, we have derived a probabilistic model equivalent to a classical descriptive model of phototransduction, which has the additional advantage of assigning a
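
    A minimal sketch of the binary forward (filtering) recurrence for a two-state HMM of the kind described above is given below; the transition and observation probabilities are placeholders chosen for illustration, not values from the phototransduction model.

      def hmm_forward_step(p_light, stay_prob, obs_likelihoods):
          """One step of the discrete forward (filtering) recurrence for a binary HMM.
          p_light: probability that light was present at the previous step.
          stay_prob: probability that the light state persists between steps.
          obs_likelihoods: (P(obs | light), P(obs | no light)) for the new observation."""
          # Predict: propagate the belief through the two-state transition model
          prior = stay_prob * p_light + (1.0 - stay_prob) * (1.0 - p_light)
          # Update: apply Bayes' rule with the observation likelihoods
          l_on, l_off = obs_likelihoods
          return l_on * prior / (l_on * prior + l_off * (1.0 - prior))

      # Toy usage: repeated photon-like observations drive the belief toward "light on"
      belief = 0.5
      for _ in range(5):
          belief = hmm_forward_step(belief, stay_prob=0.9, obs_likelihoods=(0.8, 0.2))
      print(belief)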

  1. A risk-based approach to robotic mission requirements

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Bourke, Roger D.

    1992-01-01

    A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.

  2. Risk-based assessment of the surety of information systems

    SciTech Connect

    Jansma, R.M.; Fletcher, S.K.; Murphy, M.D.; Lim, J.J.; Wyss, G.D.

    1996-07-01

    When software is used in safety-critical, security-critical, or mission-critical situations, it is imperative to understand and manage the risks involved. A risk assessment methodology and toolset have been developed which are specific to software systems and address a broad range of risks including security, safety, and correct operation. A unique aspect of this methodology is the use of a modeling technique that captures interactions and tradeoffs among risk mitigators. This paper describes the concepts and components of the methodology and presents its application to example systems.

  3. Holistic risk-based environmental decision making: a Native perspective.

    PubMed Central

    Arquette, Mary; Cole, Maxine; Cook, Katsi; LaFrance, Brenda; Peters, Margaret; Ransom, James; Sargent, Elvera; Smoke, Vivian; Stairs, Arlene

    2002-01-01

    Native American Nations have become increasingly concerned about the impacts of toxic substances. Although risk assessment and risk management processes have been used by government agencies to help estimate and manage risks associated with exposure to toxicants, these tools have many inadequacies and as a result have not served Native people well. In addition, resources have not always been adequate to address the concerns of Native Nations, and involvement of Native decision makers on a government-to-government basis in discussions regarding risk has only recently become common. Finally, because the definitions of health used by Native people are strikingly different from those of risk assessors, there is also a need to expand current definitions and incorporate traditional knowledge into decision making. Examples are discussed from the First Environment Restoration Initiative, a project that is working to address toxicant issues facing the Mohawk territory of Akwesasne. This project is developing a community-defined model in which health is protected at the same time that traditional cultural practices, which have long been the key to individual and community health, are maintained and restored. PMID:11929736

  4. Perception of Speech Reflects Optimal Use of Probabilistic Speech Cues

    ERIC Educational Resources Information Center

    Clayards, Meghan; Tanenhaus, Michael K.; Aslin, Richard N.; Jacobs, Robert A.

    2008-01-01

    Listeners are exquisitely sensitive to fine-grained acoustic detail within phonetic categories for sounds and words. Here we show that this sensitivity is optimal given the probabilistic nature of speech cues. We manipulated the probability distribution of one probabilistic cue, voice onset time (VOT), which differentiates word initial labial…

  5. The Role of Language in Building Probabilistic Thinking

    ERIC Educational Resources Information Center

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  6. Understanding Probabilistic Thinking: The Legacy of Efraim Fischbein.

    ERIC Educational Resources Information Center

    Greer, Brian

    2001-01-01

    Honors the contribution of Efraim Fischbein to the study and analysis of probabilistic thinking. Summarizes Fischbein's early work, then focuses on the role of intuition in mathematical and scientific thinking; the development of probabilistic thinking; and the influence of instruction on that development. (Author/MM)

  7. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
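
    The response-surface-plus-Monte-Carlo idea in this record can be illustrated with a generic sketch: a cheap stand-in limit-state function is sampled many times to estimate a failure probability. The capacity and demand distributions below are hypothetical and are not the paper's HEC-RAS or support-vector-machine models.

      import numpy as np

      rng = np.random.default_rng(1)

      def limit_state(capacity, demand):
          """Illustrative limit-state function g = capacity - demand (failure when g < 0).
          A stand-in for a bridge limit state, not the published model."""
          return capacity - demand

      n = 200_000
      # Hypothetical lognormal scour capacity and Gumbel-like hydraulic demand
      capacity = rng.lognormal(mean=1.5, sigma=0.2, size=n)
      demand = rng.gumbel(loc=2.5, scale=0.4, size=n)
      pf = np.mean(limit_state(capacity, demand) < 0.0)
      print(f"Estimated probability of failure: {pf:.4f}")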

  8. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  9. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1987-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  10. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph that is not necessarily as regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations associated with practical work are given.

  11. Damage identification with probabilistic neural networks

    SciTech Connect

    Klenke, S.E.; Paez, T.L.

    1995-12-01

    This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
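
    A minimal sketch of the kernel-density (Parzen) classification idea behind a probabilistic neural network follows; the exemplar features, bandwidth, and class labels are invented for illustration.

      import numpy as np

      def pnn_classify(x, exemplars_by_class, bandwidth=0.5):
          """PNN-style classification: score each class with a Gaussian Parzen kernel
          density estimate built from its exemplars, then normalize the scores
          (equal class priors assumed)."""
          scores = {}
          for label, exemplars in exemplars_by_class.items():
              d2 = np.sum((exemplars - x) ** 2, axis=1)
              scores[label] = np.mean(np.exp(-0.5 * d2 / bandwidth ** 2))
          total = sum(scores.values())
          return {label: s / total for label, s in scores.items()}

      # Toy usage with hypothetical response features measured from a structure
      undamaged = np.array([[0.9, 1.1], [1.0, 1.0], [1.1, 0.95]])
      damaged = np.array([[1.6, 0.4], [1.5, 0.5], [1.7, 0.45]])
      print(pnn_classify(np.array([1.55, 0.48]),
                         {"undamaged": undamaged, "damaged": damaged}))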

  12. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.

  13. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2007-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  14. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  15. Probabilistic adaptation in changing microbial environments

    PubMed Central

    Springer, Michael

    2016-01-01

    Microbes growing in animal host environments face fluctuations that have elements of both randomness and predictability. In the mammalian gut, fluctuations in nutrient levels and other physiological parameters are structured by the host’s behavior, diet, health and microbiota composition. Microbial cells that can anticipate environmental fluctuations by exploiting this structure would likely gain a fitness advantage (by adapting their internal state in advance). We propose that the problem of adaptive growth in structured changing environments, such as the gut, can be viewed as probabilistic inference. We analyze environments that are “meta-changing”: where there are changes in the way the environment fluctuates, governed by a mechanism unobservable to cells. We develop a dynamic Bayesian model of these environments and show that a real-time inference algorithm (particle filtering) for this model can be used as a microbial growth strategy implementable in molecular circuits. The growth strategy suggested by our model outperforms heuristic strategies, and points to a class of algorithms that could support real-time probabilistic inference in natural or synthetic cellular circuits. PMID:27994963

  16. Probabilistic deployment for multiple sensor systems

    NASA Astrophysics Data System (ADS)

    Qian, Ming; Ferrari, Silvia

    2005-05-01

    The performance of many multi-sensor systems can be significantly improved by using a priori environmental information and sensor data to plan the movements of sensor platforms that are later deployed with the purpose of improving the quality of the final detection and classification results. However, existing path planning algorithms and ad-hoc data processing (e.g., fusion) techniques do not allow for the systematic treatment of multiple and heterogeneous sensors and their platforms. This paper presents a method that combines Bayesian network inference with probabilistic roadmap (PRM) planners to utilize the information obtained by different sensors and their level of uncertainty. The uncertainty of prior sensed information is represented by entropy values obtained from the Bayesian network (BN) models of the respective sensor measurement processes. The PRM algorithm is modified to utilize the entropy distribution in optimizing the path of posterior sensor platforms that have the following objectives: (1) improve the quality of the sensed information, i.e., through fusion, (2) minimize the distance traveled by the platforms, and (3) avoid obstacles. This so-called Probabilistic Deployment (PD) method is applied to a demining system comprised of ground-penetrating radars (GPR), electromagnetic (EMI), and infrared sensors (IR) installed on ground platforms, to detect and classify buried mines. Numerical simulations show that PD is more efficient than path planning techniques that do not utilize a priori information, such as complete coverage, random coverage method, or PRM methods that do not utilize Bayesian inference.
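
    Since the method weights candidate paths by the entropy of prior sensed information, a small sketch of that entropy computation is given below; the posterior probabilities over {mine, clutter, clear} at each grid cell are hypothetical, not outputs of the paper's Bayesian network models.

      import numpy as np

      def shannon_entropy(probabilities):
          """Entropy (in bits) of a discrete posterior, e.g. P(mine | prior sensor data)
          at one grid cell; higher entropy marks locations worth revisiting."""
          p = np.asarray(probabilities, dtype=float)
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      # Hypothetical posteriors over {mine, clutter, clear} at two candidate cells
      print(shannon_entropy([0.34, 0.33, 0.33]))  # ~1.58 bits: very uncertain, revisit
      print(shannon_entropy([0.95, 0.03, 0.02]))  # ~0.33 bits: already well resolved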

  17. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  18. A framework for probabilistic atlas-based organ segmentation

    NASA Astrophysics Data System (ADS)

    Dong, Chunhua; Chen, Yen-Wei; Foruzan, Amir Hossein; Han, Xian-Hua; Tateyama, Tomoko; Wu, Xing

    2016-03-01

    Probabilistic atlases based on human anatomical structure have been widely used for organ segmentation. The challenge is how to register the probabilistic atlas to the patient volume. Additionally, the conventional probabilistic atlas has the disadvantage that it may be biased toward a specific patient study because it is built from a single reference. Hence, we propose a template matching framework based on an iterative probabilistic atlas for organ segmentation. Firstly, we find a bounding box for the organ based on human anatomical localization. Then, the probabilistic atlas is used as a template to find the organ in this bounding box by using template matching technology. Comparing our method with conventional and recently developed atlas-based methods, the results show an improvement in segmentation accuracy for multiple organs (p < 0.00001).

  19. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  20. A risk-based decision support framework for selection of appropriate safety measure system for underground coal mines.

    PubMed

    Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar

    2017-03-01

    In the context of the underground coal mining industry, increased economic concerns about implementing additional safety measure systems, along with growing public awareness of the need to ensure a high level of worker safety, have put great pressure on managers to find a solution that is both safe and economically viable. A risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory to model vagueness and subjectivity in the estimates of fuzzy risk ratings for making appropriate decisions. The methodology is based on aggregative fuzzy risk analysis and multi-criteria decision making. The selection decisions are made within the context of understanding the total integrated risk that is likely to be incurred while adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-time case study. The resulting final priority ranking appears fairly consistent.

  1. Risk-based Multiobjective Optimization Model for Bridge Maintenance Planning

    SciTech Connect

    Yang, I-T.; Hsu, Y.-S.

    2010-05-21

    Determining the optimal maintenance plan is essential for successful bridge management. The optimization objectives are defined as minimizing life-cycle cost and maximizing performance indicators. Previous bridge maintenance models assumed that the process of bridge deterioration and the estimate of maintenance cost are deterministic, i.e., known with certainty. This assumption, however, is invalid, especially for estimates over the long time horizon of a bridge's life. In this study, we consider the risks associated with bridge deterioration and maintenance cost in determining the optimal maintenance plan. The decision variables include the strategic choice of essential maintenance (such as silane treatment and cathodic protection) and the intervals between periodic maintenance. An epsilon-constrained Particle Swarm Optimization algorithm is used to approximate the tradeoff between life-cycle cost and performance indicators. During the stochastic search for optimal solutions, Monte Carlo simulation is used to evaluate the impact of risks on the objective values at an accepted level of reliability. The proposed model can help decision makers select a compromise maintenance plan from a group of alternative choices, each of which leads to a different level of performance and life-cycle cost. A numerical example is used to illustrate the proposed model.

  2. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
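
    A compact sketch of the ensemble copula coupling (ECC) reordering step is shown below: calibrated marginal samples are rearranged at each lead time to follow the rank order of the raw ensemble. The array shapes and toy numbers are illustrative only and do not come from the study above.

      import numpy as np

      def ecc_reorder(raw_ensemble, calibrated_samples):
          """Ensemble copula coupling: impose the rank (dependence) structure of the raw
          ensemble onto calibrated marginal samples, lead time by lead time.
          Both inputs have shape (n_members, n_lead_times)."""
          raw = np.asarray(raw_ensemble, dtype=float)
          cal = np.sort(np.asarray(calibrated_samples, dtype=float), axis=0)
          out = np.empty_like(cal)
          for t in range(raw.shape[1]):
              ranks = np.argsort(np.argsort(raw[:, t]))  # rank of each raw member
              out[:, t] = cal[ranks, t]                  # assign calibrated quantiles by rank
          return out

      # Toy usage: 3 members, 4 lead times
      raw = np.array([[2.0, 2.5, 3.0, 2.8],
                      [1.0, 1.2, 1.5, 1.4],
                      [3.0, 2.0, 2.2, 2.5]])
      cal = np.array([[1.1, 1.3, 1.6, 1.5],
                      [2.2, 2.1, 2.4, 2.6],
                      [3.3, 2.7, 3.1, 3.0]])
      print(ecc_reorder(raw, cal))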

  3. Probabilistic Flash Flood Forecasting using Stormscale Ensembles

    NASA Astrophysics Data System (ADS)

    Hardy, J.; Gourley, J. J.; Kain, J. S.; Clark, A.; Novak, D.; Hong, Y.

    2013-12-01

    Flash flooding is one of the most costly and deadly natural hazards in the US and across the globe. The loss of life and property from flash floods could be mitigated with better guidance from hydrological models, but these models have limitations. For example, they are commonly initialized using rainfall estimates derived from weather radars, but the time interval between observations of heavy rainfall and a flash flood can be on the order of minutes, particularly for small basins in urban settings. Increasing the lead time for these events is critical for protecting life and property. Therefore, this study advances the use of quantitative precipitation forecasts (QPFs) from a stormscale NWP ensemble system into a distributed hydrological model setting to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Rainfall error characteristics of the individual members are first diagnosed and quantified in terms of structure, amplitude, and location (SAL; Wernli et al., 2008). Amplitude and structure errors are readily correctable due to their diurnal nature, and the fine scales represented by the CAPS QPF members are consistent with radar-observed rainfall, mainly showing larger errors with afternoon convection. To account for the spatial uncertainty of the QPFs, we use an elliptic smoother, as in Marsh et al. (2012), to produce probabilistic QPFs (PQPFs). The elliptic smoother takes into consideration underdispersion, which is notoriously associated with stormscale ensembles, and thus, is good for targeting the approximate regions that may receive heavy rainfall. However, stormscale details contained in individual members are still needed to yield reasonable flash flood simulations. Therefore, on a case study basis, QPFs from individual members are then run through the hydrological model with their predicted structure and corrected amplitudes, but the locations of individual rainfall elements are perturbed within the PQPF elliptical regions using Monte

  4. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods is constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of
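
    The percentile summary mentioned above is straightforward to compute from Monte Carlo output. The sketch below assumes a hypothetical lognormal sample of committed dose purely to show the mechanics; it is not the paper's biokinetic model.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical posterior samples of committed effective dose (mSv) from a
      # Bayesian/Monte Carlo intake assessment; lognormal purely for illustration
      dose_samples = rng.lognormal(mean=np.log(1.2), sigma=0.5, size=100_000)

      for p in [2.5, 5, 50, 95, 97.5]:
          print(f"{p:>5}th percentile: {np.percentile(dose_samples, p):.2f} mSv")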

  5. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
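
    A minimal sketch of the surrogate-accelerated Bayesian sampling idea is given below: a cheap polynomial stands in for the finite element strain model inside a random-walk Metropolis sampler. The surrogate, prior support, and measurement value are assumptions made for the example, not the paper's models or algorithms.

      import numpy as np

      rng = np.random.default_rng(3)

      def surrogate_strain(crack_length):
          """Cheap stand-in for a finite-element strain prediction at one sensor
          (a hypothetical surrogate; in practice a surrogate is trained offline)."""
          return 1.0 + 0.8 * crack_length + 0.3 * crack_length ** 2

      def log_likelihood(crack_length, measured, sigma=0.05):
          return -0.5 * ((surrogate_strain(crack_length) - measured) / sigma) ** 2

      def metropolis(measured, n_samples=20_000, step=0.05):
          """Random-walk Metropolis sampling of the crack-length posterior, using the
          surrogate in place of a full finite-element evaluation at every step."""
          x = 0.5
          samples = []
          for _ in range(n_samples):
              prop = x + rng.normal(0.0, step)
              if 0.0 <= prop <= 2.0:  # uniform prior support on the damage parameter
                  if np.log(rng.uniform()) < log_likelihood(prop, measured) - log_likelihood(x, measured):
                      x = prop
              samples.append(x)
          return np.array(samples[5_000:])  # discard burn-in

      post = metropolis(measured=1.9)
      print(post.mean(), np.percentile(post, [2.5, 97.5]))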

  6. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently further significantly improved the performance of the probabilistic model of ADAPT. While these experiments are obtained for an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.

  7. A Risk-based, Practice-centered Approach to Project Management for HPCMP CREATE

    DTIC Science & Technology

    2015-10-05

    ...performance of military aircraft, ships, ground vehicles, and radio-frequency antennas. When CREATE started (~2007), there was no commonly recognized set of... These predictions are designed to reduce the risks, costs, and time to develop and deploy these

  8. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributed to a reactive process or due to clonal disorder. This distinction is important as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, complications of thrombocytosis with a special focus on thrombotic risk as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  9. Probabilistic brain atlas encoding using Bayesian inference.

    PubMed

    Van Leemput, Koen

    2006-01-01

    This paper addresses the problem of creating probabilistic brain atlases from manually labeled training data. We propose a general mesh-based atlas representation, and compare different atlas models by evaluating their posterior probabilities and the posterior probabilities of their parameters. Using such a Bayesian framework, we show that the widely used "average" brain atlases constitute relatively poor priors, partly because they tend to overfit the training data, and partly because they do not allow corresponding anatomical features to be aligned across datasets. We also demonstrate that much more powerful representations can be built using content-adaptive meshes that incorporate non-rigid deformation field models. We believe extracting optimal prior probability distributions from training data is crucial in light of the central role priors play in many automated brain MRI analysis techniques.

  10. Learning classification with auxiliary probabilistic information.

    PubMed

    Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos

    2011-01-01

    Finding ways of incorporating auxiliary information or auxiliary data into the learning process has been a topic of active data mining and machine learning research in recent years. In this work we study and develop a new framework for the classification learning problem in which, in addition to class labels, the learner is provided with auxiliary (probabilistic) information that reflects how strongly the expert feels about the class label. This approach can be extremely useful for many practical classification tasks that rely on subjective label assessment and where the cost of acquiring additional auxiliary information is negligible when compared to the cost of the example analysis and labelling. We develop classification algorithms capable of using the auxiliary information to make the learning process more efficient in terms of sample complexity. We demonstrate the benefit of the approach on a number of synthetic and real-world data sets by comparing it to learning with class labels only.

  11. Social inequalities in probabilistic labor markets

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Chen, He

    2015-03-01

    We discuss social inequalities in labor markets for university graduates in Japan by using the Gini and k indices. Feature vectors that specify the abilities of candidates (students) are built into the probabilistic labor market model. Here we systematically examine what kinds of selection processes (strategies) by companies, applied to the weighted feature vector of each candidate, could induce what types of inequality in the number of informal acceptances, leading to a large mismatch between students and companies. This work was financially supported by Grant-in-Aid for Scientific Research (C) of the Japan Society for the Promotion of Science (JSPS) No. 2533027803 and Grant-in-Aid for Scientific Research on Innovative Area No. 2512001313.
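
    A short sketch of the Gini index computation used here as an inequality measure is given below; the acceptance counts are invented for illustration.

      import numpy as np

      def gini(values):
          """Gini index of a non-negative distribution (0 = perfect equality, 1 = maximal
          inequality), computed from the sorted-value (Lorenz curve) formulation."""
          x = np.sort(np.asarray(values, dtype=float))
          n = x.size
          ranks = np.arange(1, n + 1)
          return float((2.0 * np.sum(ranks * x)) / (n * np.sum(x)) - (n + 1.0) / n)

      # Hypothetical counts of informal job acceptances received by ten candidates
      print(gini([0, 0, 0, 1, 1, 2, 2, 3, 5, 10]))  # fairly unequal market (~0.6)
      print(gini([2, 2, 2, 2, 2, 2, 2, 2, 2, 2]))   # perfectly equal market -> 0.0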

  12. Probabilistic Resilience in Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Panerati, Jacopo; Beltrame, Giovanni; Schwind, Nicolas; Zeltner, Stefan; Inoue, Katsumi

    2016-05-01

    Originally defined in the context of ecological systems and environmental sciences, resilience has grown to be a property of major interest for the design and analysis of many other complex systems: resilient networks and robotic systems offer the desirable capability of absorbing disruption and transforming in response to external shocks, while still providing the services they were designed for. Starting from an existing formalization of resilience for constraint-based systems, we develop a probabilistic framework based on hidden Markov models. In doing so, we introduce two important new features: stochastic evolution and partial observability. Using our framework, we formalize a methodology for the evaluation of probabilities associated with generic properties, we describe an efficient algorithm for the computation of its essential inference step, and we show that its complexity is comparable to other state-of-the-art inference algorithms.

  13. Probabilistic view clustering in object recognition

    NASA Astrophysics Data System (ADS)

    Camps, Octavia I.; Christoffel, Douglas W.; Pathak, Anjali

    1992-11-01

    To recognize objects and to determine their poses in a scene we need to find correspondences between the features extracted from the image and those of the object models. Models are commonly represented by describing a few characteristic views of the object representing groups of views with similar properties. Most feature-based matching schemes assume that all the features that are potentially visible in a view will appear with equal probability, and the resulting matching algorithms have to allow for 'errors' without really understanding what they mean. PREMIO is an object recognition system that uses CAD models of 3D objects and knowledge of surface reflectance properties, light sources, sensor characteristics, and feature detector algorithms to estimate the probability of the features being detectable and correctly matched. The purpose of this paper is to describe the predictions generated by PREMIO, how they are combined into a single probabilistic model, and illustrative examples showing its use in object recognition.

  14. Performing Probabilistic Risk Assessment Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and control in the phase space; (2) performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  15. Anarchy with hierarchy: A probabilistic appraisal

    NASA Astrophysics Data System (ADS)

    Babu, K. S.; Khanov, Alexander; Saad, Shaikh

    2017-03-01

    The masses of the charged fermions and the mixing angles among quarks are observed to be strongly hierarchical, while analogous parameters in the neutrino sector appear to be structureless or anarchical. We develop a class of unified models based on SU(5) symmetry that explains these differing features probabilistically. With the aid of three input parameters that are hierarchical, and with the assumption that all the Yukawa couplings are uncorrelated random variables described by Gaussian distributions, we show by Monte Carlo simulations that the observed features of the entire fermion spectrum can be nicely reproduced. We extend our analysis to an SU(5)-based flavor U(1) model making use of the Froggatt-Nielsen mechanism, where the order one Yukawa couplings are modeled as random variables, which also shows good agreement with observations.

  16. Augmenting Probabilistic Risk Assesment with Malevolent Initiators

    SciTech Connect

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  17. PROBABILISTIC CATALOGS FOR CROWDED STELLAR FIELDS

    SciTech Connect

    Brewer, Brendon J.; Foreman-Mackey, Daniel; Hogg, David W.

    2013-07-01

    We present and implement a probabilistic (Bayesian) method for producing catalogs from images of stellar fields. The method is capable of inferring the number of sources N in the image and can also handle the challenges introduced by noise, overlapping sources, and an unknown point-spread function. The luminosity function of the stars can also be inferred, even when the precise luminosity of each star is uncertain, via the use of a hierarchical Bayesian model. The computational feasibility of the method is demonstrated on two simulated images with different numbers of stars. We find that our method successfully recovers the input parameter values along with principled uncertainties even when the field is crowded. We also compare our results with those obtained from the SExtractor software. While the two approaches largely agree about the fluxes of the bright stars, the Bayesian approach provides more accurate inferences about the faint stars and the number of stars, particularly in the crowded case.

  18. A probabilistic analysis of silicon cost

    NASA Technical Reports Server (NTRS)

    Reiter, L. J.

    1983-01-01

    Silicon materials costs represent both a cost driver and an area where improvement can be made in the manufacture of photovoltaic modules. The cost from three processes for the production of low-cost silicon being developed under the U.S. Department of Energy's (DOE) National Photovoltaic Program is analyzed. The approach is based on probabilistic inputs and makes use of two models developed at the Jet Propulsion Laboratory: SIMRAND (SIMulation of Research ANd Development) and IPEG (Improved Price Estimating Guidelines). The approach, assumptions, and limitations are detailed along with a verification of the cost analysis methodology. Results, presented in the form of cumulative probability distributions for silicon cost, indicate that there is a 55% chance of reaching the DOE target of $16/kg for silicon material. This is a technically achievable cost based on expert forecasts of the results of ongoing research and development and does not imply any market price for a given year.

  19. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

    The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this paper, we present an automated method to segment the blood vessels in fundus retinal images. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for early detection of retinal diseases, treatment evaluation or clinical study. This study combines bias correction and adaptive histogram equalisation to enhance the appearance of the blood vessels. Then the blood vessels are extracted using probabilistic modelling that is optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images from the STARE and DRIVE datasets. The experimental results are compared with some recently published methods of retinal blood vessel segmentation. The experimental results show that our method achieved the best overall performance and that it is comparable to the performance of human experts.
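
    A minimal sketch of the expectation-maximisation step for a two-component intensity mixture, in the spirit of the probabilistic vessel/background model described above, follows; the synthetic intensities and the initialisation are illustrative, not the paper's implementation.

      import numpy as np

      rng = np.random.default_rng(4)

      def em_two_gaussians(x, n_iter=50):
          """Tiny EM fit of a two-component 1-D Gaussian mixture (a generic sketch)."""
          mu = np.array([x.min(), x.max()], dtype=float)
          sigma = np.array([x.std(), x.std()]) + 1e-6
          pi = np.array([0.5, 0.5])
          for _ in range(n_iter):
              # E-step: posterior responsibility of each component for every pixel
              dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
              resp = dens / dens.sum(axis=1, keepdims=True)
              # M-step: re-estimate weights, means, and standard deviations
              nk = resp.sum(axis=0)
              pi = nk / x.size
              mu = (resp * x[:, None]).sum(axis=0) / nk
              sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
          return pi, mu, sigma

      # Toy usage: synthetic "background" and "vessel" intensity samples
      intensities = np.concatenate([rng.normal(0.3, 0.05, 5000), rng.normal(0.7, 0.08, 800)])
      print(em_two_gaussians(intensities))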

  20. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term ``hazard'' can lead to some misunderstanding. In English, hazard has the generic meaning ``potential source of danger,'' but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative sense, namely ``the probability of a certain hazardous event in a specific time-space window.'' However, many volcanologists still use ``hazard'' and ``volcanic hazard'' in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled ``Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes'' (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is ``probabilistic volcanic hazard assessment'' (PVHA).

  1. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  2. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results for two structures made with advanced ceramic composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material properties and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) prediction of the failure load and the buckling load, (2) coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstration that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  3. Efficient Probabilistic Diagnostics for Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Chavira, Mark; Cascio, Keith; Poll, Scott; Darwiche, Adnan; Uckun, Serdar

    2008-01-01

    We consider in this work the probabilistic approach to model-based diagnosis when applied to electrical power systems (EPSs). Our probabilistic approach is formally well-founded, as it is based on Bayesian networks and arithmetic circuits. We investigate the diagnostic task known as fault isolation, and pay special attention to meeting two of the main challenges, model development and real-time reasoning, often associated with real-world application of model-based diagnosis technologies. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In essence, we introduce a high-level EPS specification language from which Bayesian networks that can diagnose multiple simultaneous failures are auto-generated, and we illustrate the feasibility of using arithmetic circuits, compiled from Bayesian networks, for real-time diagnosis on real-world EPSs of interest to NASA. The experimental system is a real-world EPS, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. In experiments with the ADAPT Bayesian network, which currently contains 503 discrete nodes and 579 edges, we find high diagnostic accuracy in scenarios where one to three faults, both in components and sensors, were inserted. The time taken to compute the most probable explanation using arithmetic circuits has a small mean of 0.2625 milliseconds and standard deviation of 0.2028 milliseconds. In experiments with data from ADAPT we also show that arithmetic circuit evaluation substantially outperforms joint tree propagation and variable elimination, two alternative algorithms for diagnosis using Bayesian network inference.
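    As a rough illustration of the fault-isolation step, the following sketch enumerates a toy Bayesian network with two components and two sensors in plain Python and reports the posterior and the most probable explanation. The component names, priors, and sensor likelihoods are invented for illustration only; this is not the ADAPT model, and exact enumeration stands in here for the arithmetic-circuit evaluation used in the paper.

      # Minimal sketch (not the ADAPT model): exact enumeration over a toy Bayesian
      # network with two components and two sensors, illustrating probabilistic
      # fault isolation via the most probable explanation (MPE).
      from itertools import product

      prior_fault = {"battery": 0.01, "inverter": 0.02}        # hypothetical fault priors

      def sensor_low_prob(healthy):
          # P(sensor reads "low" | health of the component it monitors)
          return 0.05 if healthy else 0.95

      observed = {"voltage_sensor": "low", "current_sensor": "nominal"}   # evidence

      def joint(battery_ok, inverter_ok):
          p = (1 - prior_fault["battery"]) if battery_ok else prior_fault["battery"]
          p *= (1 - prior_fault["inverter"]) if inverter_ok else prior_fault["inverter"]
          pv = sensor_low_prob(battery_ok)      # voltage sensor watches the battery
          pc = sensor_low_prob(inverter_ok)     # current sensor watches the inverter
          p *= pv if observed["voltage_sensor"] == "low" else (1 - pv)
          p *= pc if observed["current_sensor"] == "low" else (1 - pc)
          return p

      states = list(product([True, False], repeat=2))
      probs = {s: joint(*s) for s in states}
      z = sum(probs.values())
      print("posterior over (battery_ok, inverter_ok):",
            {s: round(p / z, 4) for s, p in probs.items()})
      print("most probable explanation:", max(probs, key=probs.get))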

  4. The Probabilistic Admissible Region with Additional Constraints

    NASA Astrophysics Data System (ADS)

    Roscoe, C.; Hussein, I.; Wilkins, M.; Schumacher, P.

    The admissible region, in the space surveillance field, is defined as the set of physically acceptable orbits (e.g., orbits with negative energies) consistent with one or more observations of a space object. Given additional constraints on orbital semimajor axis, eccentricity, etc., the admissible region can be constrained, resulting in the constrained admissible region (CAR). Based on known statistics of the measurement process, one can replace hard constraints with a probabilistic representation of the admissible region. This results in the probabilistic admissible region (PAR), which can be used for orbit initiation in Bayesian tracking and prioritization of tracks in a multiple hypothesis tracking framework. The PAR concept was introduced by the authors at the 2014 AMOS conference. In that paper, a Monte Carlo approach was used to show how to construct the PAR in the range/range-rate space based on known statistics of the measurement, semimajor axis, and eccentricity. An expectation-maximization algorithm was proposed to convert the particle cloud into a Gaussian Mixture Model (GMM) representation of the PAR. This GMM can be used to initialize a Bayesian filter. The PAR was found to be significantly non-uniform, invalidating an assumption frequently made in CAR-based filtering approaches. Using the GMM or particle cloud representations of the PAR, orbits can be prioritized for propagation in a multiple hypothesis tracking (MHT) framework. In this paper, the authors focus on expanding the PAR methodology to allow additional constraints, such as a constraint on perigee altitude, to be modeled in the PAR. This requires re-expressing the joint probability density function for the attributable vector as well as the (constrained) orbital parameters and range and range-rate. The final PAR is derived by accounting for any interdependencies between the parameters. Noting that the concepts presented are general and can be applied to any measurement scenario, the idea
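    The conversion of a particle cloud into a Gaussian Mixture Model can be sketched with an off-the-shelf expectation-maximization fit; the snippet below uses scikit-learn's GaussianMixture on a synthetic range/range-rate cloud. The cloud, the number of components, and the orbit regime are placeholders, not the authors' PAR construction.

      # Sketch only: fit a GMM by expectation-maximization to a synthetic "particle
      # cloud" in range / range-rate space. The cloud stands in for PAR samples drawn
      # from measurement, semimajor-axis, and eccentricity statistics.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      r = rng.normal(38000.0, 500.0, 5000)                       # range [km], hypothetical
      rdot = rng.normal(0.0, 0.05, 5000) + 1e-5 * (r - 38000.0)  # range-rate [km/s], loosely correlated
      cloud = np.column_stack([r, rdot])

      gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(cloud)
      print("mixture weights:", np.round(gmm.weights_, 3))
      # each (weight, mean, covariance) triple can seed one hypothesis in a Bayesian filter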

  5. A generic risk-based surveying method for invading plant pathogens.

    PubMed

    Parnell, S; Gottwald, T R; Riley, T; van den Bosch, F

    2014-06-01

    Invasive plant pathogens are increasing with international trade and travel, with damaging environmental and economic consequences. Recent examples include tree diseases such as sudden oak death in the Western United States and ash dieback in Europe. To control an invading pathogen it is crucial that newly infected sites are quickly detected so that measures can be implemented to control the epidemic. However, since sampling resources are often limited, not all locations can be inspected and locations must be prioritized for surveying. Existing approaches to achieve this are often species specific and rely on detailed data collection and parameterization, which is difficult, especially when new arrivals are unanticipated. Consequently regulatory sampling responses are often ad hoc and developed without due consideration of epidemiology, leading to the suboptimal deployment of expensive sampling resources. We introduce a flexible risk-based sampling method that is pathogen generic and enables available information to be utilized to develop epidemiologically informed sampling programs for virtually any biologically relevant plant pathogen. By targeting risk we aim to inform sampling schemes that identify high-impact locations that can be subsequently treated in order to reduce inoculum in the landscape. This "damage limitation" is often the initial management objective following the first discovery of a new invader. Risk at each location is determined by the product of the basic reproductive number (R0), as a measure of local epidemic size, and the probability of infection. We illustrate how the risk estimates can be used to prioritize a survey by weighting a random sample so that the highest-risk locations have the highest probability of selection. We demonstrate and test the method using a high-quality spatially and temporally resolved data set on Huanglongbing disease (HLB) in Florida, USA. We show that even when available epidemiological information is relatively
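    The weighting idea can be sketched in a few lines: compute risk as the product of R0 and the infection probability at each location, then draw a without-replacement sample with selection probability proportional to risk. The distributions and sample size below are made up for illustration; they are not the HLB parameterization used in the paper.

      # Illustrative sketch of risk-weighted survey selection: risk = R0 * P(infection),
      # then a without-replacement draw with probability proportional to risk.
      import numpy as np

      rng = np.random.default_rng(42)
      n_locations = 200
      r0 = rng.lognormal(mean=0.5, sigma=0.4, size=n_locations)   # proxy for local epidemic size
      p_infect = rng.beta(2, 20, size=n_locations)                # probability the location is infected
      risk = r0 * p_infect

      survey_size = 20
      chosen = rng.choice(n_locations, size=survey_size, replace=False, p=risk / risk.sum())
      print("locations selected for inspection:", sorted(chosen.tolist()))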

  6. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
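    A heavily simplified version of the probability-of-failure calculation for a thinning mechanism can be sketched as a Monte Carlo over a remaining-thickness limit state, with risk taken as PoF times a monetized consequence and compared against a risk target to pick an inspection date. The limit state, distributions, and thresholds below are illustrative only and are not the DNV/API models behind the tool.

      # Simplified sketch (not the DNV/API RBI models): PoF of a tank floor from a
      # thinning limit state, remaining = t0 - rate * time, by Monte Carlo; risk =
      # PoF * CoF is compared to a target to choose when to inspect.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      t0 = rng.normal(6.0, 0.2, n)                    # initial plate thickness [mm]
      rate = rng.lognormal(np.log(0.12), 0.5, n)      # corrosion rate [mm/yr]
      t_min = 3.0                                     # minimum allowable thickness [mm]
      cof = 2.0e6                                     # consequence of failure [$], illustrative
      risk_target = 1.0e4                             # acceptable risk [$], illustrative

      def pof(years):
          return np.mean(t0 - rate * years < t_min)

      for years in range(1, 31):
          if pof(years) * cof > risk_target:
              print(f"risk target exceeded at year {years} (PoF = {pof(years):.4f}); inspect before then")
              break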

  7. EXAMPLE OF A RISK BASED DISPOSAL APPROVAL SOLIDIFICATION OF HANFORD SITE TRANSURANIC (TRU) WASTE

    SciTech Connect

    PRIGNANO AL

    2007-11-14

    The Hanford Site requested, and the U.S. Environmental Protection Agency (EPA) Region 10 approved, a Toxic Substances Control Act of 1976 (TSCA) risk-based disposal approval (RBDA) for solidifying approximately four cubic meters of waste from a specific area of the K East Basin: the North Loadout Pit (NLOP). The NLOP waste is a highly radioactive sludge that contained polychlorinated biphenyls (PCBs) regulated under TSCA. The prescribed disposal method for liquid PCB waste under TSCA regulations is either thermal treatment or decontamination. Due to the radioactive nature of the waste, however, neither thermal treatment nor decontamination was a viable option. As a result, the proposed treatment consisted of solidifying the material to comply with waste acceptance criteria at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico, or possibly the Environmental Restoration Disposal Facility at the Hanford Site, depending on the resulting transuranic (TRU) content of the stabilized waste. The RBDA evaluated environmental risks associated with potential airborne PCBs. In addition, the RBDA made use of waste management controls already in place at the treatment unit. The treatment unit, the T Plant Complex, is a Resource Conservation and Recovery Act of 1976 (RCRA)-permitted facility used for storing and treating radioactive waste. The EPA found that the proposed activities did not pose an unreasonable risk to human health or the environment. Treatment took place from October 26, 2005, to June 9, 2006, and 332 208-liter (55-gallon) containers of solidified waste were produced. All treated drums assayed to date are TRU and will be disposed of at WIPP.

  8. Development of a new adaptive ordinal approach to continuous-variable probabilistic optimization.

    SciTech Connect

    Romero, Vicente José; Chen, Chun-Hung (George Mason University, Fairfax, VA)

    2006-11-01

    A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks "Is that alternative better or worse than this one?" rather than "How much better or worse is that alternative than this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. However, in this report we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and it uniquely has mechanisms for quantifying and controlling the likelihood of error in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so it could be useful as a reference standard against which the efficiency and robustness of other methods can be compared, analogous to the role that Monte Carlo simulation plays in uncertainty propagation.

  9. Development of a Risk-Based Performance Assessment Method for Long-Term Cover Systems--Application to the Monticello Mill Tailings Repository

    SciTech Connect

    HO, CLIFFORD K.; ARNOLD, BILL W.; COCHRAN, JOHN R.; WEBB, STEPHEN W.; TAIRA, RANDAL Y.

    2001-10-01

    A probabilistic, risk-based performance-assessment methodology is being developed to assist designers, regulators, and involved stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report presents an example of the risk-based performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon flux at the surface, groundwater concentrations, and dose. Results of this study can be used to identify engineering and environmental parameters (e.g., liner properties, long-term precipitation, distribution coefficients) that require additional data to reduce uncertainty in the calculations and improve confidence in the model predictions. These results can also be used to evaluate alternative engineering designs and to identify parameters most important to long-term performance.
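    The overall stochastic structure can be sketched as: sample the uncertain inputs, push each realization through the integrated model, and read percentiles off the resulting distribution of the performance metric. The toy percolation-to-dose chain and parameter values below are placeholders for the actual Monticello component models.

      # Structural sketch of a total-system probabilistic performance assessment:
      # sample uncertain inputs, run a (here, highly simplified) cover -> source ->
      # transport -> dose chain per realization, and summarize the output distribution.
      import numpy as np

      rng = np.random.default_rng(7)
      n_real = 10_000
      precip = rng.lognormal(np.log(380.0), 0.25, n_real)   # precipitation [mm/yr]
      cover_frac = rng.beta(2, 50, n_real)                  # fraction of precipitation percolating
      kd = rng.lognormal(np.log(1.0), 0.7, n_real)          # sorption coefficient proxy

      percolation = precip * cover_frac                     # flux reaching the tailings [mm/yr]
      dose = 0.05 * percolation / (1.0 + kd)                # toy dose proxy [mrem/yr]

      for q in (0.5, 0.95):
          print(f"{int(q * 100)}th percentile dose proxy: {np.quantile(dose, q):.2f} mrem/yr")
      # sorting `dose` and pairing it with ranks / n_real gives the CDF used to judge compliance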

  10. Risk-Based Ranking Experiences for Cold War Legacy Facilities in the United States

    SciTech Connect

    Droppo, James G.

    2003-05-01

    Over the past two decades, a number of government agencies in the United States have faced increasing public scrutiny for their efforts to address the wide range of potential environmental issues related to Cold War legacies. Risk-based ranking was selected as a means of defining the relative importance of issues. Ambitious facility-wide risk-based ranking applications were undertaken. However, although facility-wide risk-based ranking efforts can build invaluable understanding of the potential issues related to Cold War legacies, conducting such efforts is difficult because of the potentially enormous scope and the potentially strong institutional barriers. The U.S. experience is that such efforts are worth undertaking to start building a knowledge base and infrastructure that are based on a thorough understanding of risk. In both the East and the West, the legacy of the Cold War includes a wide range of potential environmental issues associated with large industrial complexes of weapon production facilities. The responsible agencies or ministries are required to make decisions that could benefit greatly from information on the relative importance of these potential issues. Facility-wide risk-based ranking of potential health and environmental issues is one means to help these decision makers. The initial U.S. risk-based ranking applications described in this chapter were “ground-breaking” in that they defined new methodologies and approaches to meet the challenges. Many of these approaches fit the designation of a population-centred risk assessment. These U.S. activities parallel efforts that are just beginning for similar facilities in the countries of the former Soviet Union. As described below, conducting a facility-wide risk-based ranking has special challenges and potential pitfalls. Little guidance exists to conduct major risk-based rankings. For those considering undertaking such efforts, the material contained in this chapter should be useful

  11. A review of NRC staff uses of probabilistic risk assessment

    SciTech Connect

    Not Available

    1994-03-01

    The NRC staff uses probabilistic risk assessment (PRA) and risk management as important elements of its licensing and regulatory processes. In October 1991, the NRC's Executive Director for Operations established the PRA Working Group to address concerns identified by the Advisory Committee on Reactor Safeguards with respect to unevenness and inconsistency in the staff's current uses of PRA. After surveying current staff uses of PRA and identifying needed improvements, the Working Group defined a set of basic principles for staff PRA use and identified three areas for improvement: guidance development, training enhancements, and PRA methods development. For each area of improvement, the Working Group took certain actions and recommended additional work. The Working Group recommended integrating its work with other recent PRA-related activities the staff completed and improving staff interactions with PRA users in the nuclear industry. The Working Group took two key actions by developing general guidance for two uses of PRA within the NRC (that is, screening or prioritizing reactor safety issues and analyzing such issues in detail) and developing guidance on basic terms and methods important to the staff's uses of PRA.

  12. Advanced neutron source reactor probabilistic flow blockage assessment

    SciTech Connect

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights to the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  13. BOAST 98-MC: A Probabilistic Simulation Module for BOAST 98

    SciTech Connect

    Aiysha Sultana; Anne Oudinot; Reynaldo Gonzalez; Scott Reeves

    2006-09-08

    This work was performed by Advanced Resources International (ARI) on behalf of the United States Department of Energy (DOE) in order to develop a user-friendly, PC-based interface that couples DOE's BOAST 98 software with the Monte Carlo simulation technique. The objectives of the work were to improve reservoir management and maximize oil recoveries by understanding and quantifying reservoir uncertainty, as well as to improve the capabilities of DOE's BOAST 98 software by incorporating a probabilistic module in the simulator. In this module, probability distributions can be assigned to unknown input parameters such as permeability, porosity, etc. Options have also been added to the input file to allow the relative permeability curves as well as the well spacing to be varied. Hundreds of simulations can then be run automatically to explore the many combinations of uncertain reservoir parameters across their spectrum of uncertainty. Output data such as oil rate and water rate can then be plotted. When historical data are available, they can be uploaded and a least-squares error function computed between the simulation data and the historical data. The set of input parameters leading to the best match is thus determined. Sensitivity charts (Tornado plots) that rank the uncertain parameters according to the impact they have on the outputs can also be generated.
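    The Monte Carlo wrapper concept can be sketched with the simulator replaced by a stub: sample the uncertain reservoir parameters, generate a rate history per run, score it with a least-squares misfit against the observed history, and keep the best-matching parameter set. The stub relation, distributions, and synthetic history below are invented; this is not the actual BOAST 98 interface.

      # Sketch of the Monte Carlo history-matching loop; run_simulator() is a stand-in
      # for a BOAST 98 run, and all parameter distributions are made up.
      import numpy as np

      rng = np.random.default_rng(3)
      t = np.arange(1, 61)                                   # months

      def run_simulator(perm_md, poro):                      # stub for a reservoir simulation
          return 1000.0 * (perm_md / 100.0) ** 0.5 * (poro / 0.2) * np.exp(-0.02 * t)

      history = run_simulator(120.0, 0.22) * rng.normal(1.0, 0.03, t.size)   # synthetic "observed" oil rates

      best = None
      for _ in range(500):
          perm = rng.lognormal(np.log(100.0), 0.4)           # permeability [md]
          poro = rng.uniform(0.15, 0.30)                     # porosity [-]
          sse = np.sum((run_simulator(perm, poro) - history) ** 2)
          if best is None or sse < best[0]:
              best = (sse, perm, poro)

      print(f"best match: perm = {best[1]:.1f} md, porosity = {best[2]:.3f}, SSE = {best[0]:.0f}")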

  14. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.

  15. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    In the present probabilistic methodology for the strength of aerospace propulsion system structural components subjected to such environmentally-induced primitive variables as loading stresses, high temperature, chemical corrosion, and radiation, time is encompassed as an interacting element, allowing the projection of creep and fatigue effects. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables which may be calibrated by an appropriately curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.

  16. Upward Flammability Testing: A Probabilistic Measurement

    NASA Technical Reports Server (NTRS)

    Davis, Samuel E.; Engel, Carl D.; Richardson, Erin R.

    2003-01-01

    Examination of NASA-STD-6001 Test 1 data suggests that burn length outcome for a given environment has a large statistical variation from run to run. Large data sets show that burn length data form cumulative probability distribution curves, which describe a material's characteristic to burn in a specific environment, suggesting that the current practice of testing three samples at specific conditions is inadequate. Sufficient testing can establish material characteristic probability curves that provide the probability that a material will sustain a burn length of at least 15.24 cm (6.0 in.) or will sustain burning until all material is consumed. A simple pass/fail criterion may not be possible or practical. Future application of flammability data for some material classes may require the engineer to assess risk based on the probability of an occurrence and the probable outcome with different materials as characterized with cumulative burn length distributions for specific use conditions.
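    The cumulative-probability idea reduces to an empirical distribution over repeated burn-length measurements; the sketch below builds one from a synthetic sample and reads off the probability of a burn of at least 15.24 cm. The burn-length data are invented for illustration.

      # Sketch: empirical cumulative distribution of burn length for one material and
      # environment, and the probability of reaching at least 15.24 cm (6.0 in.).
      import numpy as np

      rng = np.random.default_rng(11)
      burn_cm = rng.gamma(shape=3.0, scale=4.0, size=50)    # hypothetical burn lengths [cm]

      x = np.sort(burn_cm)
      cdf = np.arange(1, x.size + 1) / x.size               # empirical CDF at each sorted length
      p_exceed = np.mean(burn_cm >= 15.24)
      print(f"P(burn length >= 15.24 cm) ~ {p_exceed:.2f} from {x.size} samples")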

  17. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data that necessitates a systematic, integrated approach for data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization, and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models, and Bayesian analyses are used, supplemented on occasion by subject matter experts' input. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1-5, where the highest level is one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  18. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122

  19. Predicting rib fracture risk with whole-body finite element models: development and preliminary evaluation of a probabilistic analytical framework.

    PubMed

    Forman, Jason L; Kent, Richard W; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction.
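    The combination step described in both records can be sketched as follows: each rib element gets a fracture probability from an ultimate-strain distribution evaluated at the element's peak strain, and Monte Carlo over the elements then yields a distribution of the number of fractured ribs. The strain values and the lognormal parameters below are illustrative, not the paper's age-adjusted data.

      # Sketch of the probability-combination step only: per-rib fracture probability
      # = P(ultimate strain < peak strain), then Monte Carlo over ribs for the count.
      import numpy as np
      from scipy.stats import lognorm

      rng = np.random.default_rng(9)
      peak_strain = rng.uniform(0.005, 0.03, size=24)       # hypothetical peak strain per rib
      ult = lognorm(s=0.4, scale=0.018)                      # hypothetical ultimate-strain model
      p_frac = ult.cdf(peak_strain)                          # per-rib fracture probability

      n_mc = 100_000
      n_fractured = (rng.random((n_mc, p_frac.size)) < p_frac).sum(axis=1)
      print("P(3 or more rib fractures) =", np.mean(n_fractured >= 3))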

  20. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  1. Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions

    EPA Pesticide Factsheets

    Describes general concepts and principles of Probabilistic Risk Assessment (PRA), explains how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing the uncertainty in decision making.

  2. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models: parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (e.g., high explosive or rocket propellant) is then presented using a directed graph model.
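    One of the approaches named above, a precomputed response curve, can be sketched by tabulating an expensive deterministic model once and interpolating inside the probabilistic loop. The toy peak-temperature relation and the heat-load distribution below are invented for illustration.

      # Sketch of the response-curve approach: tabulate the deterministic model once,
      # then evaluate the cheap interpolant for every probabilistic sample.
      import numpy as np

      def deterministic_model(heat_w):                   # stand-in for an expensive physics code
          return 300.0 + 0.8 * heat_w ** 0.9             # peak temperature [K]

      grid = np.linspace(0.0, 500.0, 21)                 # build the response curve once
      curve = np.array([deterministic_model(h) for h in grid])

      rng = np.random.default_rng(5)
      heat_samples = rng.lognormal(np.log(150.0), 0.4, 100_000)   # uncertain heat load [W]
      temps = np.interp(heat_samples, grid, curve)                # surrogate evaluation
      print("P(peak temperature > 450 K) =", np.mean(temps > 450.0))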

  3. On the assessment of reliability in probabilistic hydrometeorological event forecasting

    NASA Astrophysics Data System (ADS)

    DeChant, Caleb M.; Moradkhani, Hamid

    2015-06-01

    Probabilistic forecasts are commonly used to communicate uncertainty in the occurrence of hydrometeorological events. Although probabilistic forecasting is common, conventional methods for assessing the reliability of these forecasts are approximate. Among the most common methods for assessing reliability, the decomposed Brier Score and Reliability Diagram treat an observed string of events as samples from multiple Binomial distributions, but this is an approximation of the forecast reliability, leading to unnecessary loss of information. This article suggests testing the hypothesis of reliability via the Poisson-Binomial distribution, which is a generalized solution to the Binomial distribution, providing a more accurate model of the probabilistic event forecast verification setting. Further, a two-stage approach to reliability assessment is suggested to identify errors in the forecast related to both bias and overly/insufficiently sharp forecasts. Such a methodology is shown to more effectively distinguish between reliable and unreliable forecasts, leading to more robust probabilistic forecast verification.
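    The counting step behind the suggested test can be sketched directly: under reliable forecasts, the number of observed events follows a Poisson-Binomial distribution with the issued probabilities, whose exact PMF is obtained by adding one Bernoulli trial at a time. The forecast probabilities and observed count below are invented, and the sketch covers only this counting step, not the article's full two-stage procedure.

      # Sketch: exact Poisson-Binomial PMF for the number of events under the issued
      # forecast probabilities, used to judge whether the observed count is plausible.
      import numpy as np

      def poisson_binomial_pmf(p):
          pmf = np.array([1.0])
          for pi in p:
              pmf = np.convolve(pmf, [1.0 - pi, pi])     # add one Bernoulli trial
          return pmf

      forecast_p = np.array([0.1, 0.3, 0.7, 0.9, 0.5, 0.2, 0.8])   # issued event probabilities
      observed_events = 5                                          # events that occurred

      cdf = np.cumsum(poisson_binomial_pmf(forecast_p))
      p_at_most = cdf[observed_events]
      p_at_least = 1.0 - (cdf[observed_events - 1] if observed_events > 0 else 0.0)
      print(f"P(K <= {observed_events}) = {p_at_most:.3f}, P(K >= {observed_events}) = {p_at_least:.3f}")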

  4. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  5. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state that ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
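    The mechanism can be sketched on a one-dimensional ring of bins: a Metropolis-style transition matrix is constructed so that its stationary distribution equals the desired swarm density, and each agent then steps independently by sampling its own row. This is a generic Metropolis construction chosen for the sketch, not necessarily the synthesis method of the paper, and the target density is arbitrary.

      # Sketch: decentralized probabilistic guidance on a ring of bins via a
      # Metropolis-type transition matrix with the desired density as its
      # stationary distribution; every agent samples transitions independently.
      import numpy as np

      target = np.array([0.05, 0.10, 0.20, 0.30, 0.20, 0.10, 0.05])   # desired bin densities
      n_bins = target.size

      P = np.zeros((n_bins, n_bins))
      for i in range(n_bins):
          for j in ((i - 1) % n_bins, (i + 1) % n_bins):              # propose a neighboring bin
              P[i, j] = 0.5 * min(1.0, target[j] / target[i])         # Metropolis acceptance
          P[i, i] = 1.0 - P[i].sum()                                  # otherwise stay put

      rng = np.random.default_rng(2)
      agents = rng.integers(0, n_bins, size=1000)                     # initial bins
      for _ in range(100):
          agents = np.array([rng.choice(n_bins, p=P[b]) for b in agents])

      print("achieved density:", np.round(np.bincount(agents, minlength=n_bins) / agents.size, 3))
      print("target density:  ", target)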

  6. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  7. Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry

    2014-01-01

    Command File Errors (CFE) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing the CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, which is composed of the hardware, software, and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root cause and sensitivities of the various components of the mission system and develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.

  8. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.

  9. Probabilistic Language Framework for Stochastic Discrete Event Systems

    DTIC Science & Technology

    1996-01-01

    Kumar, S. I. Marcus, TR 96-18, Probabilistic Language Formalism for Stochastic Discrete Event Systems. Vijay K. Garg, Department of Electrical and... Probability Theory and Its Applications, Vol. 1. Wiley, New York, NY, 2nd edition, 1966. [6] V. K. Garg. An algebraic approach to modeling probabilistic discrete event systems. In Proceedings of 1992 IEEE Conference on Decision and Control, pages 2348-2353, Tucson, AZ, December 1992. [7] V. K. Garg

  10. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  11. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the setting up of the first seismographic station in Tirana, and became more effective after the Albanian Seismological Network began operating in 1976. There is rich evidence that over two thousand years Albania has been hit by many disastrous earthquakes; the highest estimated magnitude is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now, making it all the more indispensable to produce accurate seismic hazard maps for preventing damage from probable future earthquakes. Some efforts have already been made in seismic hazard assessment (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint project between the Seismological Institute of Tirana, Albania, and the Department of Geophysics of Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, giving for each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps were obtained for 100, 475, 2375 and 4746-year return periods, for rock soil conditions. Analyzing the map of PGA values for a return period of 475 years, five zones with different ranges of PGA values can be distinguished: 1) the zone with PGA (0.20 - 0.24 g), 1.8 percent of Albanian territory; 2) the zone with PGA (0.16 - 0.20 g), 22.6 percent of Albanian territory; 3) the
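    The core hazard integral can be sketched for a single seismotectonic zone by Monte Carlo: draw magnitudes from a truncated Gutenberg-Richter distribution, draw source-to-site distances, evaluate a ground-motion model with lognormal scatter, and scale the exceedance fraction by the zone's annual activity rate. The activity rate, distance range, and attenuation coefficients below are invented placeholders, not the Ambraseys, Sabetta-Pugliese, or Margaris relations used in the study.

      # Minimal PSHA sketch for one zone: annual rate of PGA exceedance from a
      # truncated Gutenberg-Richter magnitude model, a uniform distance, and a toy
      # lognormal ground-motion relation (PGA in g).
      import numpy as np

      rng = np.random.default_rng(8)
      annual_rate = 0.5                 # events with M >= 4.5 per year in the zone (illustrative)
      b = 1.0                           # Gutenberg-Richter b-value
      m_min, m_max = 4.5, 7.2

      n = 200_000
      beta = b * np.log(10.0)
      u = rng.random(n)                 # inverse-CDF sampling of the truncated G-R distribution
      m = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta
      r = rng.uniform(10.0, 80.0, n)    # source-to-site distance [km], illustrative

      ln_pga = -4.0 + 1.0 * m - 1.2 * np.log(r) + rng.normal(0.0, 0.6, n)   # toy GMPE
      pga = np.exp(ln_pga)

      for x in (0.16, 0.20, 0.24):
          rate_exceed = annual_rate * np.mean(pga > x)
          print(f"PGA > {x:.2f} g: return period ~ {1.0 / rate_exceed:,.0f} yr")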

  12. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in composite typical laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  13. A risk-based approach to cost-benefit analysis of software safety activities

    SciTech Connect

    Fortier, S.C.; Michael, J.B.

    1993-01-01

    Assumptions about the economics of making a system safe are usually not explicitly stated in industrial and software models of safety-critical systems. These assumptions span a wide spectrum of economic tradeoffs with respect to resources expended to make a system safe. The missing component in these models that is necessary for capturing the effect of economic tradeoffs is risk. A qualitative risk-based software safety model is proposed that combines features of industrial and software systems safety models. The risk-based model provides decision makers with a basis for performing cost-benefit analyses of software safety-related activities.

  14. A risk-based approach to cost-benefit analysis of software safety activities

    SciTech Connect

    Fortier, S.C.; Michael, J.B.

    1993-05-01

    Assumptions about the economics of making a system safe are usually not explicitly stated in industrial and software models of safety-critical systems. These assumptions span a wide spectrum of economic tradeoffs with respect to resources expended to make a system safe. The missing component in these models that is necessary for capturing the effect of economic tradeoffs is risk. A qualitative risk-based software safety model is proposed that combines features of industrial and software systems safety models. The risk-based model provides decision makers with a basis for performing cost-benefit analyses of software safety-related activities.

  15. Probabilistic health risk assessment for ingestion of seafood farmed in arsenic contaminated groundwater in Taiwan.

    PubMed

    Liang, Ching-Ping; Jang, Cheng-Shin; Chen, Jui-Sheng; Wang, Sheng-Wei; Lee, Jin-Jing; Liu, Chen-Wuing

    2013-08-01

    Seafood farmed in arsenic (As)-contaminated areas is a major exposure pathway for the ingestion of inorganic As by individuals in the southwestern part of Taiwan. This study presents a probabilistic risk assessment using limited data for inorganic As intake through the consumption of this seafood by local residents in these areas. The As content and the consumption rate are both treated as probability distributions, taking into account the variability of the amount in the seafood and of individual consumption habits. The Monte Carlo simulation technique is utilized to conduct an assessment of exposure due to the daily intake of inorganic As from As-contaminated seafood. Exposure is evaluated according to the provisional tolerable weekly intake (PTWI) established by the FAO/WHO and the target risk based on the US Environmental Protection Agency guidelines. The assessment results show that inorganic As intakes from five types of fish (excluding mullet) and shellfish fall below the PTWI threshold values for the 95th percentiles, but exceed the target cancer risk of 10^-6. The predicted 95th percentiles for inorganic As intake and lifetime cancer risk obtained in this study are both markedly higher than those obtained in previous studies, in which the consumption rate of seafood considered was a deterministic value. This study demonstrates the importance of individual variability in seafood consumption when evaluating a high-exposure sub-group of the population who eat larger amounts of fish and shellfish than the average Taiwanese.
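    The exposure calculation can be sketched by sampling the As concentration, the consumption rate, and body weight from distributions, forming the weekly intake per kilogram of body weight, and comparing it with a tolerable-intake threshold. All distribution parameters and the threshold below are illustrative placeholders, not the survey data or PTWI value used in the study.

      # Sketch of the probabilistic intake calculation: weekly inorganic As intake per
      # kg body weight from sampled concentration, consumption rate, and body weight.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000
      conc = rng.lognormal(np.log(0.05), 0.6, n)           # inorganic As in seafood [mg/kg]
      consumption = rng.lognormal(np.log(40.0), 0.8, n)    # seafood consumption [g/day]
      body_weight = rng.normal(65.0, 10.0, n)              # [kg]

      weekly_intake = conc * consumption / 1000.0 * 7.0 / body_weight   # [mg/kg-bw/week]
      ptwi = 0.015                                         # illustrative threshold [mg/kg-bw/week]
      print("95th percentile weekly intake:", round(float(np.quantile(weekly_intake, 0.95)), 4), "mg/kg-bw")
      print("P(exceed threshold) =", round(float(np.mean(weekly_intake > ptwi)), 3))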

  16. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.

  17. Probabilistic participation in public goods games.

    PubMed

    Sasaki, Tatsuya; Okada, Isamu; Unemi, Tatsuo

    2007-10-22

    Voluntary participation in public goods games (PGGs) has turned out to be a simple but effective mechanism for promoting cooperation under full anonymity. Voluntary participation allows individuals to adopt a risk-aversion strategy, termed loner. A loner refuses to participate in unpromising public enterprises and instead relies on a small but fixed pay-off. This system leads to a cyclic dominance of three pure strategies, cooperators, defectors and loners, but at the same time, there remain two considerable restrictions: the addition of loners cannot stabilize the dynamics and the time average pay-off for each strategy remains equal to the pay-off of loners. Here, we introduce probabilistic participation in PGGs from the standpoint of diversification of risk, namely simple mixed strategies with loners, and prove the existence of a dynamical regime in which the restrictions no longer hold. Considering two kinds of mixed strategies associated with participants (cooperators or defectors) and non-participants (loners), we can recover all basic evolutionary dynamics of the two strategies: dominance; coexistence; bistability; and neutrality, as special cases depending on pairs of probabilities. Of special interest is that the expected pay-off of each mixed strategy exceeds the pay-off of loners at some interior equilibrium in the coexistence region.

  18. Learning probabilistic phenotypes from heterogeneous EHR data.

    PubMed

    Pivovarov, Rimma; Perotte, Adler J; Grave, Edouard; Angiolillo, John; Wiggins, Chris H; Elhadad, Noémie

    2015-12-01

    We present the Unsupervised Phenome Model (UPhenome), a probabilistic graphical model for large-scale discovery of computational models of disease, or phenotypes. We tackle this challenge through the joint modeling of a large set of diseases and a large set of clinical observations. The observations are drawn directly from heterogeneous patient record data (notes, laboratory tests, medications, and diagnosis codes), and the diseases are modeled in an unsupervised fashion. We apply UPhenome to two qualitatively different mixtures of patients and diseases: records of extremely sick patients in the intensive care unit with constant monitoring, and records of outpatients regularly followed by care providers over multiple years. We demonstrate that the UPhenome model can learn from these different care settings, without any additional adaptation. Our experiments show that (i) the learned phenotypes combine the heterogeneous data types more coherently than baseline LDA-based phenotypes; (ii) they each represent single diseases rather than a mix of diseases more often than the baseline ones; and (iii) when applied to unseen patient records, they are correlated with the patients' ground-truth disorders. Code for training, inference, and quantitative evaluation is made available to the research community.

  19. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin which was established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods in defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
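    The flavor of such a reliability calculation can be sketched by treating fatigue life as lognormal, with scatter contributed by both fatigue strength and spectrum loading, and evaluating the probability that life falls short of the required service life. The median life, coefficients of variation, and required life below are invented, not the round-robin data, and a simple lognormal life model stands in for the participants' S-N and cumulative-damage analyses.

      # Sketch: probability of failure when fatigue life is lognormal with scatter from
      # two independent sources (strength and spectrum loading), compared with the
      # six-nines target of 1e-6.
      import numpy as np
      from scipy.stats import norm

      median_life_hr = 50_000.0        # median fatigue life [hours], illustrative
      cov_strength = 0.35              # scatter from fatigue strength
      cov_loading = 0.25               # scatter from spectrum loading
      required_life_hr = 5_000.0       # required service life [hours]

      sigma_ln = np.sqrt(np.log(1 + cov_strength**2) + np.log(1 + cov_loading**2))
      z = (np.log(required_life_hr) - np.log(median_life_hr)) / sigma_ln
      print(f"P(life < {required_life_hr:.0f} hr) = {norm.cdf(z):.2e}  (six-nines target: 1e-6)")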

  20. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  1. Probabilistic Tomography: A Pragmatic Bayesian Approach

    NASA Astrophysics Data System (ADS)

    Trampert, J.

    2014-12-01

    'The future lies in uncertainty' (Spiegelhalter, Science, 345, 264, 2014); nothing could be more true for the Earth Sciences. We are able to produce ever more sophisticated models, but they can only inform us about the Earth in a meaningful way if we can assign uncertainties to the models. Bayesian inference is a natural choice for this task as it handles uncertainty in a natural way by explicitly modeling assumptions. Another desirable property is that Bayes' theorem contains Occam's razor implicitly. I will present our efforts over the last 10 years to infer Earth properties using an approach we called probabilistic tomography. The word pragmatic has several meanings in this context. In more classical Bayesian inference problems, we usually prescribe subjective or informative priors. I will illustrate this by showing examples which employ the neighborhood algorithm (Sambridge, 1999) or a Metropolis rule (Mosegaard and Tarantola, 1995). Recently we started to use neural networks to parametrize the posterior. In our implementation, we do not sample the posterior directly, but make predictions on some properties of the posterior. The interpretation of the uncertainty is therefore slightly different, but the method informs us on the information gain with respect to the prior. I will show examples of source and structural inversions using so-called mixture density networks.

  2. Probabilistic sequence alignment of stratigraphic records

    NASA Astrophysics Data System (ADS)

    Lin, Luan; Khider, Deborah; Lisiecki, Lorraine E.; Lawrence, Charles E.

    2014-10-01

    The assessment of age uncertainty in stratigraphically aligned records is a pressing need in paleoceanographic research. The alignment of ocean sediment cores is used to develop mutually consistent age models for climate proxies and is often based on the δ18O of calcite from benthic foraminifera, which records a global ice volume and deep water temperature signal. To date, δ18O alignment has been performed by manual, qualitative comparison or by deterministic algorithms. Here we present a hidden Markov model (HMM) probabilistic algorithm to find 95% confidence bands for δ18O alignment. This model considers the probability of every possible alignment based on its fit to the δ18O data and transition probabilities for sedimentation rate changes obtained from radiocarbon-based estimates for 37 cores. Uncertainty is assessed using a stochastic back trace recursion to sample alignments in exact proportion to their probability. We applied the algorithm to align 35 late Pleistocene records to a global benthic δ18O stack and found that the mean width of 95% confidence intervals varies between 3 and 23 kyr depending on the resolution and noisiness of the record's δ18O signal. Confidence bands within individual cores also vary greatly, ranging from ~0 to >40 kyr. These alignment uncertainty estimates will allow researchers to examine the robustness of their conclusions, including the statistical evaluation of lead-lag relationships between events observed in different cores.

  3. A probabilistic model for binaural sound localization.

    PubMed

    Willert, Volker; Eggert, Julian; Adamy, Jürgen; Stahl, Raphael; Körner, Edgar

    2006-10-01

    This paper proposes a biologically inspired and technically implemented sound localization system to robustly estimate the position of a sound source in the frontal azimuthal half-plane. For localization, binaural cues are extracted using cochleagrams generated by a cochlear model that serve as input to the system. The basic idea of the model is to separately measure interaural time differences and interaural level differences for a number of frequencies and process these measurements as a whole. This leads to two-dimensional frequency versus time-delay representations of binaural cues, so-called activity maps. A probabilistic evaluation is presented to estimate the position of a sound source over time based on these activity maps. Learned reference maps for different azimuthal positions are integrated into the computation to gain time-dependent discrete conditional probabilities. At every timestep these probabilities are combined over frequencies and binaural cues to estimate the sound source position. In addition, they are propagated over time to improve position estimation. This leads to a system that is able to localize audible signals, for example human speech signals, even in reverberating environments.
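
    The following toy sketch illustrates the general recipe of combining conditional probabilities across cues and frequencies and propagating a belief over time for a discrete set of azimuths; the Gaussian cue likelihoods and the diffusion step are assumptions standing in for the learned reference maps of the paper.

```python
import numpy as np

azimuths = np.linspace(-90, 90, 37)                      # candidate source positions (degrees)
belief = np.full(len(azimuths), 1 / len(azimuths))        # uniform initial belief
rng = np.random.default_rng(2)

def cue_likelihoods(true_az, n_cues=8, noise=10.0):
    """Toy stand-in for per-frequency ITD/ILD evidence: each cue is a noisy
    observation of the true azimuth, turned into a likelihood over candidates."""
    obs = true_az + noise * rng.standard_normal(n_cues)
    return np.exp(-0.5 * ((azimuths[None, :] - obs[:, None]) / noise) ** 2)

for t in range(20):                                       # time steps
    like = cue_likelihoods(true_az=30.0).prod(axis=0)     # combine cues (assumed independent)
    belief = 0.9 * belief + 0.1 / len(azimuths)           # propagate over time (slight diffusion)
    belief = belief * like                                # Bayes update
    belief /= belief.sum()

print("estimated azimuth:", azimuths[np.argmax(belief)])
```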

  4. Astrobiological Complexity with Probabilistic Cellular Automata

    NASA Astrophysics Data System (ADS)

    Vukotić, Branislav; Ćirković, Milan M.

    2012-08-01

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, yet it has been quantitatively modeled only rarely, and then in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of the large and ambiguous space of input parameters. We perform a simple clustering analysis of typical astrobiological histories with a "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for continuation of practical SETI searches.
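
    As a rough illustration of the approach (not the authors' model), the sketch below runs a one-dimensional probabilistic cellular automaton whose per-site transition probabilities play the role of the input probability matrix; the states, neighbourhood rule, and probability values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# States of a toy habitable-zone cell: 0 = no life, 1 = simple life, 2 = technological.
N_STEPS, SIZE = 200, 50
grid = np.zeros(SIZE, dtype=int)      # 1-D ring of sites

# Assumed transition probabilities, applied in order (a stand-in for the PCA input matrix).
P_RESET       = 0.005   # any state -> 0 (e.g. a sterilizing event)
P_ABIOGENESIS = 0.01    # 0 -> 1 spontaneously
P_SPREAD      = 0.05    # 0 -> 1 when a technological neighbour is present
P_EVOLVE      = 0.02    # 1 -> 2

for _ in range(N_STEPS):
    new = grid.copy()
    for i in range(SIZE):
        left, right = grid[(i - 1) % SIZE], grid[(i + 1) % SIZE]
        if rng.uniform() < P_RESET:
            new[i] = 0
        elif grid[i] == 0:
            p = P_SPREAD if (left == 2 or right == 2) else P_ABIOGENESIS
            if rng.uniform() < p:
                new[i] = 1
        elif grid[i] == 1 and rng.uniform() < P_EVOLVE:
            new[i] = 2
    grid = new

print("fraction of sites with life:", np.mean(grid > 0))
```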

  5. Astrobiological complexity with probabilistic cellular automata.

    PubMed

    Vukotić, Branislav; Ćirković, Milan M

    2012-08-01

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, yet it has been quantitatively modeled only rarely, and then in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of the large and ambiguous space of input parameters. We perform a simple clustering analysis of typical astrobiological histories with a "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for continuation of practical SETI searches.

  6. Probabilistic Seismic Hazard Assessment for Iraq

    SciTech Connect

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq; Shakir, Ammar M.; Mahdi, Hanan; Numan, Nazar M.S.; Al-Shukri, Haydar; Chlaib, Hussein K.; Ameen, Taher H.; Abd, Najah A.

    2016-05-06

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997. An update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date, b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design, and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
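
    For reference, the two probability levels mentioned above map to return periods through the usual Poisson-occurrence relation; the short sketch below shows the conversion (the relation is standard, the code itself is only illustrative).

```python
import math

def return_period(prob_exceedance, design_life_years):
    """Return period T (years) such that P(at least one exceedance in t years) = p,
    assuming Poissonian occurrence: p = 1 - exp(-t / T)."""
    return -design_life_years / math.log(1.0 - prob_exceedance)

print(return_period(0.10, 50))   # ~475 years (the 10%-in-50-years GSHAP level)
print(return_period(0.02, 50))   # ~2475 years (the 2%-in-50-years level the code requires)
```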

  7. Probabilistic Gompertz model of irreversible growth.

    PubMed

    Bardos, D C

    2005-05-01

    Characterizing organism growth within populations requires the application of well-studied individual size-at-age models, such as the deterministic Gompertz model, to populations of individuals whose characteristics, corresponding to model parameters, may be highly variable. A natural approach is to assign probability distributions to one or more model parameters. In some contexts, size-at-age data may be absent due to difficulties in ageing individuals, but size-increment data may instead be available (e.g., from tag-recapture experiments). A preliminary transformation to a size-increment model is then required. Gompertz models developed along the above lines have recently been applied to strongly heterogeneous abalone tag-recapture data. Although useful in modelling the early growth stages, these models yield size-increment distributions that allow negative growth, which is inappropriate in the case of mollusc shells and other accumulated biological structures (e.g., vertebrae) where growth is irreversible. Here we develop probabilistic Gompertz models where this difficulty is resolved by conditioning parameter distributions on size, allowing application to irreversible growth data. In the case of abalone growth, introduction of a growth-limiting biological length scale is then shown to yield realistic length-increment distributions.
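
    A minimal sketch of the deterministic Gompertz model and its size-increment form, which underlies the probabilistic versions discussed above; the parameter values are illustrative, and the published models additionally place size-conditioned probability distributions on these parameters.

```python
import numpy as np

def gompertz_length(t, L_inf=140.0, k=0.4, t0=1.0):
    """Deterministic Gompertz size-at-age: L(t) = L_inf * exp(-exp(-k * (t - t0)))."""
    return L_inf * np.exp(-np.exp(-k * (t - t0)))

def gompertz_increment(L, dt, L_inf=140.0, k=0.4):
    """Size-increment form for tag-recapture data: growth over dt given current size L.
    Follows from expressing L(t + dt) through L(t); non-negative whenever L <= L_inf."""
    return L_inf * (L / L_inf) ** np.exp(-k * dt) - L

print(np.round(gompertz_length(np.arange(0, 10)), 1))   # sizes at ages 0..9 (toy units)
print(round(gompertz_increment(60.0, dt=1.0), 1))       # one-year increment from size 60
```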

  8. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    SciTech Connect

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-07-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.
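
    The sketch below illustrates the generic importance-sampling idea of reweighting per-galaxy "interim" samples under a population-level prior, which is the spirit of separating small-area modeling from the global inference; the Gaussian interim prior, the population parameters, and the use of prior samples in place of per-image posterior samples are simplifying assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Interim samples of one galaxy's ellipticity, drawn here under a broad interim prior N(0, 0.3^2)
# (in a real pipeline these would come from per-image posterior sampling).
interim_prior_sd = 0.3
samples = rng.normal(0.0, interim_prior_sd, size=5000)

def reweight(samples, pop_mean, pop_sd):
    """Importance weights: new hierarchical prior / interim prior, evaluated at each sample."""
    log_w = (-0.5 * ((samples - pop_mean) / pop_sd) ** 2 - np.log(pop_sd)) \
            - (-0.5 * (samples / interim_prior_sd) ** 2 - np.log(interim_prior_sd))
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

w = reweight(samples, pop_mean=0.05, pop_sd=0.1)
print("reweighted mean ellipticity:", np.sum(w * samples))
```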

  9. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations, and they are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology essentially validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, resulting in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, yielding a safer and more cost-effective product.

  10. Probabilistic Analysis of Localized DNA Hybridization Circuits.

    PubMed

    Dalchau, Neil; Chandran, Harish; Gopalkrishnan, Nikhil; Phillips, Andrew; Reif, John

    2015-08-21

    Molecular devices made of nucleic acids can perform complex information processing tasks at the nanoscale, with potential applications in biofabrication and smart therapeutics. However, limitations in the speed and scalability of such devices in a well-mixed setting can significantly affect their performance. In this article, we propose designs for localized circuits involving DNA molecules that are arranged on addressable substrates and interact via hybridization reactions. We propose designs for localized elementary logic circuits, which we compose to produce more complex devices, including a circuit for computing the square root of a four bit number. We develop an efficient method for probabilistic model checking of localized circuits, which we implement within the Visual DSD design tool. We use this method to prove the correctness of our circuits with respect to their functional specifications and to analyze their performance over a broad range of local rate parameters. Specifically, we analyze the extent to which our localized designs can overcome the limitations of well-mixed circuits, with respect to speed and scalability. To provide an estimate of local rate parameters, we propose a biophysical model of localized hybridization. Finally, we use our analysis to identify constraints in the rate parameters that enable localized circuits to retain their advantages in the presence of unintended interferences between strands.
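
    The paper's analysis relies on probabilistic model checking within Visual DSD; as a loose illustration of how localized rate parameters change circuit behaviour, the sketch below runs a simple Gillespie-style stochastic simulation of a single hybridization reaction at two assumed effective binding rates (the reaction, strand counts, and rates are invented, not taken from the study).

```python
import numpy as np

rng = np.random.default_rng(5)

def gillespie_hybridization(n_a, n_b, k_bind, t_max=100.0):
    """Stochastic simulation of the single reaction A + B -> AB with propensity k_bind * A * B.
    Returns the time at which all strands have hybridized (capped at t_max)."""
    t = 0.0
    while n_a > 0 and n_b > 0:
        rate = k_bind * n_a * n_b
        t += rng.exponential(1.0 / rate)
        if t > t_max:
            return t_max
        n_a -= 1
        n_b -= 1
    return t

# Assumed effective rates: localization on a substrate raises the effective binding rate.
well_mixed = [gillespie_hybridization(10, 10, k_bind=0.01) for _ in range(200)]
localized  = [gillespie_hybridization(10, 10, k_bind=0.10) for _ in range(200)]
print("mean completion time, well-mixed:", np.mean(well_mixed))
print("mean completion time, localized :", np.mean(localized))
```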

  11. Probabilistic seismic hazard estimation of Manipur, India

    NASA Astrophysics Data System (ADS)

    Pallav, Kumar; Raghukanth, S. T. G.; Darunkumar Singh, Konjengbam

    2012-10-01

    This paper deals with the estimation of spectral acceleration for Manipur based on probabilistic seismic hazard analysis (PSHA). The 500 km region surrounding Manipur is divided into seven tectonic zones, and major faults located in these zones are used to estimate seismic hazard. The earthquake recurrence relations for the seven zones have been estimated from past seismicity data. Ground motion prediction equations proposed by Boore and Atkinson (2008 Earthq. Spectra 24 99-138) for shallow active regions and Atkinson and Boore (2003 Bull. Seismol. Soc. Am. 93 1703-29) for the Indo-Burma subduction zone are used for estimating ground motion. The uniform hazard response spectra for all nine constituent districts of Manipur (Senapati, Tamenglong, Churachandpur, Chandel, Imphal east, Imphal west, Ukhrul, Thoubal and Bishnupur) at 100-, 500- and 2500-year return periods have been computed from PSHA. A contour map of peak ground acceleration over Manipur is also presented for 100-, 500- and 2500-year return periods, with values ranging over 0.075-0.225, 0.18-0.63 and 0.3-1.15 g, respectively, across the state. These results may be of use to planners and engineers for site selection and the design of earthquake-resistant structures and, further, may help the state administration in seismic hazard mitigation.
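
    A toy sketch of the classical PSHA recipe behind such results: combine a Gutenberg-Richter recurrence model with a ground-motion prediction equation and its lognormal scatter to build a hazard curve, then convert annual rates to probabilities of exceedance. The single source, recurrence parameters, and GMPE coefficients below are invented, not those used for Manipur.

```python
import numpy as np
from scipy.stats import norm

# One toy seismic source: Gutenberg-Richter recurrence and a simple GMPE (all values assumed).
rate_m_min = 0.5                          # annual rate of events with M >= 4.5
b = 1.0
mags = np.linspace(4.5, 8.0, 36)
pmf = 10 ** (-b * mags)
pmf /= pmf.sum()                          # truncated G-R probability mass per magnitude bin
dist_km = 60.0                            # assumed source-to-site distance
sigma_ln = 0.6                            # assumed lognormal GMPE scatter

def median_pga(m, r):                     # toy ground-motion prediction equation (g)
    return np.exp(-3.5 + 0.8 * m - 1.1 * np.log(r + 10.0))

pga_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.4])
annual_rate = np.array([
    rate_m_min * np.sum(pmf * (1 - norm.cdf(np.log(a), np.log(median_pga(mags, dist_km)), sigma_ln)))
    for a in pga_levels
])
prob_50yr = 1 - np.exp(-annual_rate * 50)   # Poisson probability of exceedance in 50 years
for a, p in zip(pga_levels, prob_50yr):
    print(f"PGA > {a:.2f} g: P(exceedance in 50 yr) = {p:.3f}")
```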

  12. Probabilistic risk analysis of groundwater remediation strategies

    NASA Astrophysics Data System (ADS)

    Bolster, D.; Barahona, M.; Dentz, M.; Fernandez-Garcia, D.; Sanchez-Vila, X.; Trinchero, P.; Valhondo, C.; Tartakovsky, D. M.

    2009-06-01

    Heterogeneity of subsurface environments and insufficient site characterization are some of the reasons why decisions about groundwater exploitation and remediation have to be made under uncertainty. A typical decision maker chooses between several alternative remediation strategies by balancing their respective costs with the probability of their success or failure. We conduct a probabilistic risk assessment (PRA) to determine the likelihood of the success of a permeable reactive barrier, one of the leading approaches to groundwater remediation. While PRA is used extensively in many engineering fields, its applications in hydrogeology are scarce. This is because rigorous PRA requires one to quantify structural and parametric uncertainties inherent in predictions of subsurface flow and transport. We demonstrate how PRA can facilitate a comprehensive uncertainty quantification for complex subsurface phenomena by identifying key transport processes contributing to a barrier's failure, each of which is amenable to uncertainty analysis. Probability of failure of a remediation strategy is computed by combining independent and conditional probabilities of failure of each process. Individual probabilities can be evaluated either analytically or numerically or, barring both, can be inferred from expert opinion.
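
    As a bare-bones illustration of the last sentence, the snippet below combines hypothetical conditional failure probabilities of individual barrier processes into an overall probability of failure; the process names and numbers are placeholders, not values from the study.

```python
# Assumed (hypothetical) failure probabilities for the processes contributing to barrier failure.
p_bypass   = 0.10   # P(contaminant bypasses the reactive barrier)
p_exhaust  = 0.15   # P(reactive material is exhausted early | no bypass)
p_kinetics = 0.05   # P(reaction is too slow | no bypass, material not exhausted)

# The strategy fails if any of the conditionally chained processes fails.
p_failure = 1 - (1 - p_bypass) * (1 - p_exhaust) * (1 - p_kinetics)
print(f"probability the barrier fails: {p_failure:.3f}")
```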

  13. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-04-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence model that uses an adaptation of the kernel-based method, which has not previously been applied to this region. The results obtained from the three models have been combined in a logic tree structure in order to investigate the impact of different weights of the models. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed in our hazard results. On the other hand, our study demonstrates that both the Gujarat and the Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% exceedance in 50 years observed in Koyna is 0.4 g and in the Kutch region of Gujarat up to 0.3 g. With respect to spectral acceleration at 1 Hz, estimated ground motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared with Gujarat and do not accept them uncritically.
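
    A minimal sketch of the logic-tree combination step, using invented hazard curves and branch weights for the three recurrence-model branches named above; the study's actual curves and weights are not reproduced here.

```python
import numpy as np

# Toy hazard curves (annual exceedance rates at a set of PGA levels) from three recurrence
# models; the weights below are illustrative, not those used in the study.
pga = np.array([0.05, 0.1, 0.2, 0.3])
hazard = {
    "zonation": np.array([2.0e-2, 8.0e-3, 2.0e-3, 6.0e-4]),
    "fault":    np.array([1.5e-2, 6.0e-3, 1.8e-3, 7.0e-4]),
    "grid":     np.array([2.5e-2, 9.0e-3, 2.5e-3, 8.0e-4]),
}
weights = {"zonation": 0.4, "fault": 0.3, "grid": 0.3}

combined = sum(weights[k] * hazard[k] for k in hazard)     # weighted mean over branches
for a, r in zip(pga, combined):
    print(f"PGA {a:.2f} g: combined annual exceedance rate {r:.2e}")
```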

  14. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-01-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence model that uses an adaptation of the kernel-based method, which has not previously been applied to this region. The results obtained from the three models have been combined in a logic tree structure in order to investigate the impact of different weights of the models. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed in our hazard results. On the other hand, our study demonstrates that both the Gujarat and the Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% exceedance in 50 years observed in Koyna is 0.4 g and in the Kutch region of Gujarat up to 0.3 g. With respect to spectral acceleration at 1 Hz, estimated ground motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared with Gujarat and do not accept them uncritically.

  15. A probabilistic approach to controlling crevice chemistry

    SciTech Connect

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-12-31

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved.
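
    As a hedged illustration of the closing probabilistic argument, the sketch below runs a Monte Carlo estimate of the probability that the crevice pH lies in a desirable band when the target molar ratio is achieved; the linear pH model, its scatter, and the pH band are assumptions for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Monte Carlo sketch: given an uncertain relation between the bulk molar ratio indicator and
# crevice pH, estimate the probability that the crevice pH falls in a desirable band.
target_molar_ratio = 1.0
n = 100_000
ratio = rng.normal(target_molar_ratio, 0.1, n)                        # control hits the target imperfectly
crevice_ph = 7.0 + 2.0 * np.log10(ratio) + rng.normal(0.0, 0.3, n)    # assumed uncertain crevice response

in_band = (crevice_ph > 6.0) & (crevice_ph < 8.5)                     # assumed desirable pH range
print("P(crevice pH in desirable range):", in_band.mean())
```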

  16. Probabilistic Seismic Hazards Update for LLNL

    SciTech Connect

    Menchawi, O.; Fernandez, A.

    2016-03-30

    Fugro Consultants, Inc. (FCL) completed the Probabilistic Seismic Hazard Analysis (PSHA) for Building 332 at the Lawrence Livermore National Laboratory (LLNL), near Livermore, CA. The study performed for the LLNL site includes a comprehensive review of recent information relevant to the LLNL regional tectonic setting and regional seismic sources in the vicinity of the site and development of seismic wave transmission characteristics. The Seismic Source Characterization (SSC), documented in Project Report No. 2259-PR-02 (FCL, 2015b), and the Ground Motion Characterization (GMC), documented in Project Report No. 2259-PR-06 (FCL, 2015a), were developed in accordance with ANS/ANSI 2.29-2008 Level 2 PSHA guidelines. The ANS/ANSI 2.29-2008 Level 2 PSHA framework is documented in Project Report No. 2259-PR-05 (FCL, 2016a). The Hazard Input Document (HID) for input into the PSHA developed from the SSC and GMC is presented in Project Report No. 2259-PR-04 (FCL, 2016b). The site characterization used as input for development of the idealized site profiles, including epistemic uncertainty and aleatory variability, is presented in Project Report No. 2259-PR-03 (FCL, 2015c). The PSHA results are documented in Project Report No. 2259-PR-07 (FCL, 2016c).

  17. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).

  18. The probabilistic distribution of metal whisker lengths

    SciTech Connect

    Niraula, D.; Karpov, V. G.

    2015-11-28

    Significant reliability concerns in multiple industries are related to metal whiskers, which are random high-aspect-ratio filaments growing on metal surfaces and causing shorts in electronic packages. We derive a closed-form expression for the probabilistic distribution of metal whisker lengths. Our consideration is based on the electrostatic theory of metal whiskers, according to which whisker growth is interrupted when its tip enters a random local "dead region" of a weak electric field. Here, we use the approximation neglecting the possibility of thermally activated escapes from the "dead regions," which is later justified. We predict a one-parameter distribution with a peak at a length that depends on the metal surface charge density and surface tension. In the intermediate range it fits the log-normal distribution used in experimental studies well, although it decays more rapidly for very long whiskers. In addition, our theory quantitatively explains how the typical whisker concentration is much lower than that of surface grains. Finally, it predicts the stop-and-go phenomenon in the growth of some whiskers.

  19. 76 FR 41602 - Fair Credit Reporting Risk-Based Pricing Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... ``information overload'' for consumers, and detract from the primary purpose of the credit score information... respective risk-based pricing rules to require disclosure of credit scores and information relating to credit... rules are effective August 15, 2011. FOR FURTHER INFORMATION CONTACT: Board: Krista P. Ayoub,...

  20. 76 FR 39885 - Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... SECURITY Coast Guard Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs) AGENCY... Drilling Units (MODUs). This policy letter announces changes to the Coast Guard's system used to prioritize... drilling unit (MODU) must undergo a Coast Guard Certificate of Compliance (COC) examination in order...