Science.gov

Sample records for probabilistic risk-based management

  1. Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Roberts, Mike; Rajagopalan, Balaji; Sojda, Richard S.

    2013-08-01

Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five-step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low-flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very "sharp"). Synthetic forecasts show that a modest "sharpening" can strongly impact risk and improve skill. We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.
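Steps 2 through 5 of the framework can be sketched numerically. The sketch below is illustrative only: a plain least-squares fit stands in for the paper's GLM, and all data, thresholds, and forecast parameters are invented, not taken from the Montana study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training data: winter precipitation index vs. summer low flow (cfs)
precip = rng.uniform(5.0, 25.0, size=30)
low_flow = 20.0 + 3.0 * precip + rng.normal(0.0, 5.0, size=30)

# Step 2: fit a simple linear stand-in for the paper's GLM
X = np.column_stack([np.ones_like(precip), precip])
beta, *_ = np.linalg.lstsq(X, low_flow, rcond=None)
resid_sd = np.std(low_flow - X @ beta)

# Step 3: a season-ahead probabilistic precipitation forecast, as an ensemble
precip_ensemble = rng.normal(loc=12.0, scale=3.0, size=1000)

# Step 4: propagate the forecast ensemble through the model to a flow ensemble
flow_ensemble = (beta[0] + beta[1] * precip_ensemble
                 + rng.normal(0.0, resid_sd, size=1000))

# Step 5: risk = probability the low flow drops below an environmental threshold
threshold = 50.0  # cfs, assumed risk tolerance from step 1
risk = np.mean(flow_ensemble < threshold)
print(f"P(low flow < {threshold} cfs) = {risk:.2f}")
```

The ensemble delivers the "full risk profile" the abstract mentions: any quantile or exceedance probability can be read off `flow_ensemble` rather than a single point forecast.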

  2. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines).
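Stripped of NEPP and WATE, the probabilistic assessment idea reduces to propagating input uncertainty distributions through a performance model instead of running one point design. The toy cycle relation and every number below are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Assumed uncertainty distributions for component efficiencies (illustrative)
eta_comp = rng.normal(0.88, 0.01, n)   # compressor efficiency
eta_turb = rng.normal(0.90, 0.01, n)   # turbine efficiency

# Toy cycle relation: net specific work from fixed ideal component works (kJ/kg)
w_turb_ideal = 600.0
w_comp_ideal = 350.0
w_net = eta_turb * w_turb_ideal - w_comp_ideal / eta_comp

# Probabilistic assessment: report the distribution, not a single point design
mean = w_net.mean()
p05 = np.percentile(w_net, 5)   # value exceeded with 95% confidence
print(f"mean net work {mean:.1f} kJ/kg, 5th percentile {p05:.1f} kJ/kg")
```

A decision maker can then weigh the 5th-percentile performance against program cost, which is exactly the kind of quantified risk a deterministic point design with safety factors cannot provide.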

  3. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
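The Monte Carlo and sensitivity steps can be sketched for the three parallel penetration routes. All parameter distributions below are hypothetical, and a simple correlation coefficient stands in for the paper's stepwise linear regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Uncertain route permeability parameters (lognormal, hypothetical medians/spreads)
k_sc    = rng.lognormal(np.log(1e-3), 0.5, n)  # stratum corneum diffusion
k_sweat = rng.lognormal(np.log(1e-4), 1.0, n)  # sweat-duct aqueous route
k_hair  = rng.lognormal(np.log(5e-4), 0.8, n)  # hair-follicle oil route

# Total mass flux is the sum over the three parallel penetration routes
flux = k_sc + k_sweat + k_hair

# Sensitivity: correlation of each log-parameter with total flux
# (a lighter stand-in for the paper's stepwise linear regression)
sens = {}
for name, k in [("stratum corneum", k_sc), ("sweat ducts", k_sweat),
                ("hair follicles", k_hair)]:
    sens[name] = np.corrcoef(np.log(k), flux)[0, 1]
    print(f"{name:15s} r = {sens[name]:+.2f}")
```

Ranking the correlations identifies which uncertain parameters dominate the simulated flux, and hence where data collection would most improve a risk-based exposure assessment.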

  4. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    SciTech Connect

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The specific region within the scope of this study is the Fayetteville Shale Play. This is an unconventional, tight-formation natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play encompasses a region extending from approximately Fort Smith, AR, east to Little Rock, AR, and is approximately 50 miles wide (north to south). The initial estimates for this field put it almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of a massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of a proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  5. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

Over the last decades, probabilistic hydro-meteorological forecasts have been used increasingly to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  6. A risk-based decision-making game relevant to water management. Try it yourself!

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; van Andel, Schalk Jan; Wood, Andy; Ramos, Maria-Helena

    2013-04-01

Monthly or seasonal streamflow forecasts are essential to improve water planning (e.g., water allocation) and anticipate severe events like droughts. Additionally, multipurpose water reservoirs usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, to mitigate droughts, and for flood and ecological control, among others. Given the need to take uncertainties into account at long lead times to allow for optimal risk-based decisions, the use of probabilistic forecasts in this context is inevitable. In this presentation, we will engage participants in a risk-based decision-making game, where each participant acts as a water manager. A sequence of probabilistic inflow forecasts will be presented and used to make a reservoir release decision at a monthly time-step, subject to a few constraints -- e.g., an end-of-year target pool elevation, a maximum release and a minimum downstream flow. After each decision, the actual inflow will be presented and the consequences of the decisions made will be discussed together with the participants of the session. The game will allow participants to experience firsthand the challenges of probabilistic, quantitative decision-making.
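The decision problem at the heart of such a game can be sketched as choosing the release that minimizes expected penalty over the inflow ensemble. Storage, constraint, and penalty values below are invented for illustration, not taken from the session.

```python
import numpy as np

rng = np.random.default_rng(1)

# Probabilistic monthly inflow forecast as an ensemble (million m^3); illustrative
inflow = rng.gamma(shape=4.0, scale=25.0, size=1000)

storage, capacity = 300.0, 500.0   # current and maximum pool volume (million m^3)
min_release = 40.0                 # minimum downstream flow constraint
target = 350.0                     # end-of-month target pool volume

def expected_cost(release):
    """Expected penalty of a release decision, averaged over the inflow ensemble."""
    end_storage = storage + inflow - release
    spill = np.maximum(end_storage - capacity, 0.0)            # flood/spill penalty
    miss = np.abs(np.minimum(end_storage, capacity) - target)  # target deviation
    return np.mean(5.0 * spill + 1.0 * miss)

candidates = np.arange(min_release, 200.0, 5.0)
best = min(candidates, key=expected_cost)
print(f"risk-based release decision: {best:.0f} million m^3")
```

Players who ignore the forecast spread and plan on the ensemble mean alone tend to incur larger spill penalties in wet months, which is exactly the lesson the game is designed to convey.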

  7. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
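The "risk = likelihood x consequence" ranking, and the observation that a few segments carry most of the risk, can be sketched with synthetic data. The distributions below are hypothetical, not from any API study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_segments = 200

# Hypothetical per-segment likelihood of failure (per year) and consequence ($)
likelihood = rng.lognormal(np.log(1e-4), 1.5, n_segments)
consequence = rng.lognormal(np.log(5e5), 1.0, n_segments)

# Risk is defined as the product of the likelihood and the consequence
risk = likelihood * consequence

# Rank segments: what share of total risk sits in the top 10% of segments?
order = np.argsort(risk)[::-1]
top10 = int(0.1 * n_segments)
share = risk[order[:top10]].sum() / risk.sum()
print(f"top 10% of segments carry {100 * share:.0f}% of total risk")
```

With heavy-tailed inputs the top decile typically dominates total risk, which is why focusing inspection budgets on the highest-ranked segments yields the greatest risk reduction per dollar.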

  8. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
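A theoretical ceiling for willingness-to-pay can be computed with the standard cost-loss model (this is the textbook framework, not the workshop game itself): protect whenever the forecast probability exceeds the cost-loss ratio C/L. All numbers below are illustrative, and the forecast is assumed perfectly reliable.

```python
import numpy as np

rng = np.random.default_rng(11)
n_events = 10_000

# Synthetic flood occurrences and a perfectly reliable probabilistic forecast
p_true = rng.beta(1.0, 9.0, n_events)        # event probability each period
occurred = rng.random(n_events) < p_true      # flood occurs or not
forecast = p_true                             # reliability assumption

C, L = 1.0, 10.0   # cost of protecting vs. loss if unprotected and flooded

# Optimal use of a probabilistic forecast: protect when p > C/L
protect = forecast > C / L
expense_forecast = np.where(protect, C, occurred * L).mean()

# Baselines without a forecast: never protect, or always protect
expense_never = (occurred * L).mean()
expense_always = C
print(f"with forecast {expense_forecast:.2f}, never {expense_never:.2f}, "
      f"always {expense_always:.2f}")
```

The difference between the cheaper no-forecast baseline and `expense_forecast` is the maximum a rational user should pay for the forecast; the game explores how far real participants' willingness-to-pay departs from that benchmark.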

  9. Promoting justified risk-based decisions in contaminated land management.

    PubMed

    Reinikainen, Jussi; Sorvari, Jaana

    2016-09-01

    Decision making and regulatory policies on contaminated land management (CLM) are commonly governed by risk assessment. Risk assessment, thus, has to comply with legislation, but also provide valid information in terms of actual risks to correctly focus the potentially required measures and allocate the available resources. Hence, reliable risk assessment is a prerequisite for justified and sustainable risk management. This paper gives an introduction to the Finnish risk-based regulatory framework, outlines the challenges within the policies and the practice and provides an overview of the new guidance document to promote risk-based and sustainable CLM. We argue that the current risk assessment approaches in the policy frameworks are not necessarily efficient enough in supporting justified risk-based decisions. One of the main reasons for this is the excessive emphasis put on conservative risk assessments and on generic guideline values without contributing to their appropriate application. This paper presents how some of the challenges in risk-based decision making have been tackled in the Finnish regulatory framework on contaminated land. We believe that our study will also stimulate interest with regard to policy frameworks in other countries. PMID:26767620

  10. Science, science policy, and risk-based management

    SciTech Connect

    Midgley, L.P.

    1997-09-01

    Recent national awareness of the economic infeasibility of remediating hazardous waste sites to background levels has sparked increased interest in the role of science policy in the environmental risk assessment and risk management process. As individual states develop guidelines for addressing environmental risks at hazardous waste sites, the role of science policy decisions and uncertainty must be carefully evaluated to achieve long-term environmental goals and solutions that are economically feasible and optimally beneficial to all stakeholders. Amendment to Oregon Revised Statute 465.315 establishes policy and Utah Cleanup Action and Risk-Based Closure Standards sets requirements for risk-based cleanup and closure at sites where remediation or removal of hazardous constituents to background levels will not be achieved. This paper discusses the difficulties in effectively integrating potential current and future impacts on human health and the environment, technical feasibility, economic considerations, and political realities into environmental policy and standards, using these references as models. This paper considers the role of both objective and subjective criteria in the risk-based closure and management processes and makes suggestions for improving the system by which these sites may be reclaimed.

  11. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  12. Application of probabilistic safety assessment models to risk-based inspection of piping

    SciTech Connect

    Chapman, J.

    1996-12-01

From the beginning, one of the most useful applications of Probabilistic Safety Assessment (PSA) has been evaluating the risk importance of changes to plant design, operations, or other plant conditions. Risk importance measures the impact of a change on the risk. Risk is defined as a combination of the likelihood of failure and the consequence of the failure. The consequence can be safety system unavailability, core melt frequency, early release, or various other consequence measures. The goal in this PSA application is to evaluate the risk importance of an in-service inspection (ISI) process, as applied to plant piping systems. Two approaches can be taken in this evaluation: the Current PSA Approach or the Blended Approach. Both are discussed here.
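Two risk importance measures commonly used in PSA, Risk Achievement Worth (RAW) and Fussell-Vesely (FV), can be illustrated on a toy fault tree. The system logic and failure probabilities below are invented, not from the paper.

```python
# Toy system logic: failure occurs if pump A fails AND (valve B OR valve C fails)
def risk(pA, pB, pC):
    return pA * (pB + pC - pB * pC)

base = risk(1e-3, 1e-2, 1e-2)

# Risk Achievement Worth: risk with component B assumed failed, over baseline
raw_B = risk(1e-3, 1.0, 1e-2) / base

# Fussell-Vesely: fraction of baseline risk involving failure of B
fv_B = (base - risk(1e-3, 0.0, 1e-2)) / base

print(f"RAW(B) = {raw_B:.1f}, FV(B) = {fv_B:.2f}")
```

Applied to piping, a high RAW flags segments whose degradation would sharply raise plant risk, and so identifies where in-service inspection effort buys the most risk reduction.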

  13. Risk-based principles for defining and managing water security.

    PubMed

    Hall, Jim; Borgomeo, Edoardo

    2013-11-13

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  14. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  15. Towards risk-based drought management in the Netherlands: quantifying the welfare effects of water shortage

    NASA Astrophysics Data System (ADS)

van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein

    2016-04-01

It is widely acknowledged that drought management should move from a crisis-based to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are, for example, crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach and the economic valuation of water shortage for various water users were explored in a study for the Dutch Government. First, a historical analysis of the effects of droughts on revenues and prices in agriculture, as well as on shipping and nature, was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with given return periods. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained in integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data are needed, which are not always readily available. Furthermore, for some
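The probabilistic combination of hazard and impact over a range of return periods amounts to an expected-annual-damage calculation. The sketch below uses the standard trapezoidal integration over exceedance probabilities; the damages and return periods are illustrative, not Dutch study results.

```python
import numpy as np

# Drought damages estimated for events of given return periods (years); illustrative
return_periods = np.array([2, 10, 50, 200])
damages = np.array([5.0, 40.0, 150.0, 400.0])  # welfare loss, million EUR

# Exceedance probability of each event, ordered from rare to frequent
prob = 1.0 / return_periods
p = prob[::-1]        # ascending probabilities
d = damages[::-1]     # matching damages

# Expected annual damage: trapezoidal integral of damage over exceedance probability
ead = float(np.sum(np.diff(p) * (d[:-1] + d[1:]) / 2.0))
print(f"expected annual damage = {ead:.1f} million EUR/yr")
```

An investment that lowers the damage curve can then be valued by the reduction in expected annual damage it buys, which is the economic metric the abstract says is missing from supply-demand studies.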

  16. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and perform rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate between uncertainty of contaminant location and actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  17. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  18. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  19. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  20. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform analysis over long-term horizons - space and time scales are somewhat linked; (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with the scale, which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time, theme and space-wise) is widened, intrinsic parameters of the project tend to carry less weight. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can
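A minimal Monte Carlo appraisal in this spirit treats a DRM investment like any other: simulate volatile annual losses over a long horizon, discount the avoided losses, and report a distribution of net present value rather than a single figure. All parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n, years, rate = 20_000, 30, 0.05

# Annual disaster loss without the DRM investment: heavy-tailed and volatile
base_loss = rng.lognormal(np.log(10.0), 1.2, (n, years))   # million USD/yr

mitigation_cost = 60.0      # assumed up-front investment (million USD)
reduction = 0.4             # assumed fraction of annual losses avoided

discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)
avoided = (reduction * base_loss * discount).sum(axis=1)

npv = avoided - mitigation_cost   # distribution of net present value
print(f"mean NPV {npv.mean():.0f} M$, P(NPV < 0) = {(npv < 0).mean():.2f}")
```

Reporting both the mean NPV and the probability of a negative outcome frames DRM as an "investment" with quantified risk, which is the comparison with other sectors the abstract argues for.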

  1. The effects of climate model similarity on probabilistic climate projections and the implications for local, risk-based adaptation planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, Scott; McCrary, Rachel; Mearns, Linda O.; Brown, Casey

    2015-06-01

    Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate-related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.
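The variance underestimation described here follows from the standard formula for the variance of a mean of correlated variables: with pairwise correlation rho, Var(mean) = sigma^2 (1 + (n-1) rho) / n instead of sigma^2 / n. The values below are illustrative, not CMIP5 estimates.

```python
# Variance of a multi-model mean when models share errors (pairwise correlation
# rho) versus the naive assumption that all n models are independent.
sigma2, n, rho = 1.0, 20, 0.4   # illustrative values

var_independent = sigma2 / n
var_correlated = sigma2 * (1 + (n - 1) * rho) / n

print(f"naive: {var_independent:.3f}, with intermodel correlation: {var_correlated:.3f}")
```

Even modest correlation inflates the true variance severalfold over the independence assumption, which is why pdfs built from a genealogically related ensemble understate the likelihood of large climate changes.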

  2. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly reuse a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Because these MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
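The key efficiency idea, generating the crack-growth histories once and reusing them across maintenance plans, can be sketched as follows. The growth model, POD, and thresholds are toy assumptions, not the paper's RPI algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(9)
n, n_cycles = 20_000, 100

# One-time baseline MCS: random crack growth histories (toy multiplicative model)
a0 = rng.lognormal(np.log(0.5), 0.3, n)                 # initial flaw size (mm)
growth = rng.lognormal(np.log(1.02), 0.005, (n, n_cycles))
history = a0[:, None] * np.cumprod(growth, axis=1)       # crack size vs. cycle
a_crit = 5.0                                             # critical crack size (mm)

def prob_failure(inspection_cycle, pod=0.9):
    """Failure probability for one maintenance plan, reusing baseline histories.
    Cracks above 1 mm at inspection are detected with probability POD and repaired."""
    detectable = history[:, inspection_cycle] > 1.0
    detected = detectable & (rng.random(n) < pod)
    fails = (history[:, -1] > a_crit) & ~detected   # repaired parts do not fail
    return fails.mean()

# Different inspection plans evaluated WITHOUT rerunning the crack-growth MCS
for cyc in (25, 50, 75):
    print(f"inspect at cycle {cyc}: P(fail) = {prob_failure(cyc):.4f}")
```

Because the expensive part (the histories) is computed once, comparing many inspection schedules costs only cheap post-processing, which mirrors the advantage RPI claims over rerunning a full MCS per plan.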

  3. A risk-based probabilistic framework to estimate the endpoint of remediation: Concentration rebound by rate-limited mass transfer

    NASA Astrophysics Data System (ADS)

    Barros, F. P. J.; Fernàndez-Garcia, D.; Bolster, D.; Sanchez-Vila, X.

    2013-04-01

    Aquifer remediation is a challenging problem with environmental, social, and economic implications. As a general rule, pumping proceeds until the concentration of the target substance in the pumped water falls below a prespecified value. In this paper we estimate a priori the potential failure of the endpoint of remediation due to a rebound of concentrations driven by back diffusion. In many cases it has been observed that once pumping ceases, a rebound in the concentration at the well takes place. For this reason, administrative approaches are rather conservative, and pumping is forced to last much longer than initially expected. While a number of physical and chemical processes might account for the presence of a rebound, we focus here on diffusion from low-mobility into high-mobility zones. Specifically, we look at the concentration rebound when pumping is discontinued, accounting for multiple mass transfer processes occurring at different time scales as well as parametric uncertainty. We aim to develop a risk-based optimal operation methodology capable of estimating the endpoint of remediation based on the aquifer parameters characterizing the heterogeneous medium, the pumping rate, and the initial size of the polluted area.
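
    The rebound mechanism described above can be illustrated with a minimal two-domain (mobile/immobile) mass-transfer sketch: pumping flushes the mobile zone while the immobile zone retains mass, and once pumping stops, back diffusion drives the well concentration back up. All names and parameter values are illustrative assumptions; the paper's model handles multiple transfer rates and parametric uncertainty:

```python
# Two-box mobile/immobile model with first-order mass transfer (explicit Euler).

dt = 0.01           # time step (d)
alpha = 0.05        # mobile-immobile exchange rate (1/d), assumed
flush = 1.0         # pumping dilution rate of the mobile zone (1/d), assumed
beta = 5.0          # immobile/mobile capacity ratio (immobile zone larger)

cm, cim = 1.0, 1.0  # initial (normalized) concentrations
t_stop = 50.0       # pumping stops here (d)
series = []
for step in range(int(150.0 / dt)):
    t = step * dt
    pumping = flush * cm if t < t_stop else 0.0
    exch = alpha * (cim - cm)          # back diffusion when cim > cm
    cm += dt * (exch - pumping)
    cim += dt * (-exch / beta)         # immobile zone releases mass slowly
    series.append((t, cm))

cm_at_stop = min(series, key=lambda p: abs(p[0] - t_stop))[1]
cm_final = series[-1][1]
print(cm_at_stop, cm_final)
```

    The mobile-zone concentration climbs well above its value at the moment pumping stops; that rebound is what makes a concentration measured at shutdown an unreliable endpoint of remediation.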

  4. Method for Water Management Considering Long-term Probabilistic Forecasts

    NASA Astrophysics Data System (ADS)

    Hwang, J.; Kang, J.; Suh, A. S.

    2015-12-01

    This research aims to predict the monthly inflow of the Andong-dam basin in South Korea using long-term probabilistic forecasts, in order to apply such forecasts to water management. Forecasted cumulative distribution functions (CDFs) of monthly precipitation are constructed by combining the range of monthly precipitation, based on a suitable probability density function (PDF) fitted to past data, with the probabilistic forecasts in each category. Inflow ensembles are estimated by feeding precipitation ensembles generated from these CDFs into the 'abcd' water budget model. The bias and RMSE of the historical averages relative to observed inflow are compared with those of the forecasted ensembles. In our results, the bias and RMSE of average precipitation are larger for the forecasted ensemble than for past data, whereas the average inflow of the forecasted ensemble is smaller than that of past data. Because of the limited number of forecasts available for verification and the differences between the Andong-dam basin and the forecasted regions, this result should be treated as reference material for applying long-term forecasts to water management. The significance of this research is that it suggests a method for applying probabilistic information on climate variables from long-term forecasts to water management in Korea. In future work, the original output of the climate model producing the long-term probabilistic forecasts should be verified directly as input to a water budget model, enabling a more scientific water-management response to the uncertainty of climate change.
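
    The workflow described above (precipitation ensembles generated from forecast distributions, routed through the 'abcd' water-budget model) can be sketched as follows. The gamma-distribution choice, the seasonal cycles, and all parameter values are illustrative assumptions; only the recursion follows the commonly published form of the abcd model (Thomas, 1981):

```python
import numpy as np

rng = np.random.default_rng(42)

def abcd_model(P, PET, a=0.98, b=350.0, c=0.6, d=0.1, S0=100.0, G0=50.0):
    """One common statement of the abcd monthly water-balance model;
    returns monthly streamflow (same units as P)."""
    S, G, Q = S0, G0, []
    for p, pet in zip(P, PET):
        W = p + S                                   # available water
        Y = (W + b) / (2 * a) - np.sqrt(((W + b) / (2 * a)) ** 2 - W * b / a)
        S = Y * np.exp(-pet / b)                    # end-of-month soil storage
        G = (G + c * (W - Y)) / (1 + d)             # groundwater storage
        Q.append((1 - c) * (W - Y) + d * G)         # direct runoff + baseflow
    return np.array(Q)

# Forecast-conditioned precipitation ensemble: 500 traces x 12 months, drawn
# from gamma distributions whose means follow an assumed seasonal cycle.
months = np.arange(12)
mean_precip = 60 + 40 * np.sin(2 * np.pi * months / 12)   # mm/month
PET = 50 + 30 * np.sin(2 * np.pi * (months - 3) / 12)     # mm/month
ensemble_P = rng.gamma(shape=4.0, scale=mean_precip / 4.0, size=(500, 12))

# Route every precipitation trace through the water-budget model.
ensemble_Q = np.array([abcd_model(P, PET) for P in ensemble_P])
print(ensemble_Q.mean(axis=0))       # ensemble-mean monthly inflow (mm)
```

    Quantiles of `ensemble_Q` then give the probabilistic inflow statement that the abstract compares against historical averages.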

  5. A risk-based framework for water resource management under changing water availability, policy options, and irrigation expansion

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2016-08-01

    resemble nonlinear functions of changes in individual drivers. The proposed risk-based framework can be linked to any water resource system assessment scheme to quantify the risk in system performance under changing conditions, with the larger goal of proposing alternative policy options to address future uncertainties and management concerns.

  6. Probabilistic structural risk assessment for fatigue management using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Shiao, Michael; Wu, Y.-T. J.; Ghoshal, Anindya; Ayers, James; Le, Dy

    2012-04-01

    The primary goal of Army Prognostics & Diagnostics is to develop real-time state-awareness technologies for primary structural components. In fatigue-critical structural maintenance, the probabilistic structural risk assessment (PSRA) methodology for fatigue life management using conventional nondestructive inspection (NDI) was developed under the assumption of independent inspection outcomes. When using emerging structural health monitoring (SHM) systems with in situ sensors, however, the independence assumption no longer holds, and the existing PSRA methodology must be modified. The major issues currently under investigation are how to properly model the correlated inspection outcomes produced by the same sensors on the same components, and how to quantify their effect within the SHM-based PSRA framework. This paper describes a new SHM-based PSRA framework with proper modeling of the correlations among multiple inspection outcomes of the same structural component. The framework and the associated probabilistic algorithms are based on the principles of fatigue damage progression, NDI reliability assessment, and structural reliability methods. The core of the framework is an innovative, computationally efficient probabilistic method, RPI (Recursive Probability Integration), for damage tolerance and risk-based maintenance planning. RPI can incorporate a wide range of uncertainties, including material properties, repair quality, crack-growth-related parameters, loads, and probability of detection. The RPI algorithm for SHM application is derived in detail. The effects of correlation strength and inspection frequency on the overall probability of missing all detections are also studied and discussed.
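
    The central point above, that correlated inspection outcomes inflate the probability of missing every detection relative to the independence assumption, can be checked with a small Gaussian-copula sketch (the equicorrelation structure and POD values are illustrative assumptions, not the paper's RPI formulation):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)
pods = np.array([0.6, 0.7, 0.8])            # POD at three inspection times
z = np.array([NormalDist().inv_cdf(p) for p in pods])

def prob_miss_all(rho, n=200_000):
    """Monte Carlo estimate of P(no inspection detects the crack) when the
    inspection outcomes share an equicorrelated Gaussian latent variable."""
    k = len(z)
    cov = rho * np.ones((k, k)) + (1 - rho) * np.eye(k)
    u = rng.multivariate_normal(np.zeros(k), cov, size=n)
    detected = u < z                         # detection i iff u_i < z_i
    return np.mean(~detected.any(axis=1))

p_indep = np.prod(1 - pods)                  # independence: 0.4*0.3*0.2
p_rho0 = prob_miss_all(0.0)                  # should match p_indep
p_rho9 = prob_miss_all(0.9)                  # strong sensor correlation
print(p_indep, p_rho0, p_rho9)
```

    With rho = 0 the Monte Carlo estimate reproduces the independence product, while with rho = 0.9 the probability of missing all three inspections is several times larger, which is why the independence-based PSRA must be modified for SHM.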

  7. The role of risk-based prioritization in total quality management

    SciTech Connect

    Bennett, C.T.

    1994-10-01

    The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems-theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems-analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of the decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper argues for the requirement to institutionalize an evolutionary, decision-theoretic approach to the policy analysis of the Department of Energy's Program Budget.

  8. Developing risk-based screening guidelines for dioxin management at a Melbourne sewage treatment plant.

    PubMed

    Gorman, J; Mival, K; Wright, J; Howell, M

    2003-01-01

    Dioxin is a generic term used to refer to the congeners of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs). The principal source of dioxin production is generally thought to be unintended by-products of waste incineration, but dioxins are also formed naturally by volcanic activity and forest fires (WHO, 1998). Estimates of dioxin emissions in Australia suggest that approximately 75% of the total PCDD and PCDF emissions derive from prescribed burning and wild bushfires. Currently, no screening guidelines for dioxins within soils are available in Australia. This paper presents the general approach and results of a human health risk-based assessment performed by URS Australia in 2001 to develop site-specific reference criteria for remediation of a former sewage plant in Melbourne. Risk-based soil remediation concentrations for dioxins at the sewage treatment plant site were developed using tolerable daily intake values of 4, 2 and 1 pg/kg/day. The potentially significant exposure pathways and processes for exposure to dioxins were identified, and risk-based soil concentrations were derived in accordance with the general method framework presented in the National Environmental Protection Measure (Assessment of Site Contamination). The derived dioxin reference criteria were used to develop an effective risk management program focused on those conditions that present the greatest contribution to the overall risk to human health. PMID:12862210

  9. Waste management project's alternatives: A risk-based multi-criteria assessment (RBMCA) approach

    SciTech Connect

    Karmperis, Athanasios C.; Sotirchos, Anastasios; Aravossis, Konstantinos; Tatsiopoulos, Ilias P.

    2012-01-15

    Highlights: • We examine the evaluation of a waste management project's alternatives. • We present a novel risk-based multi-criteria assessment (RBMCA) approach. • In the RBMCA, the evaluation criteria are based on the quantitative risk analysis of the project's alternatives. • The correlation between the criteria weight values and the decision makers' risk preferences is examined. • The preference for multi-criteria over one-criterion evaluation is discussed. - Abstract: This paper examines the evaluation of a waste management project's alternatives through quantitative risk analysis. Cost-benefit analysis is a widely used method in which investments are mainly assessed through the calculation of evaluation indicators, namely benefit/cost (B/C) ratios, as well as the quantification of their financial, technical, environmental and social risks. Herein, a novel approach in the form of risk-based multi-criteria assessment (RBMCA) is introduced, which can be used by decision makers to select the optimum alternative of a waste management project. Specifically, decision makers use multiple criteria based on the cumulative probability distribution functions of the alternatives' B/C ratios. The RBMCA system is used for the evaluation of a waste incineration project's alternatives, where the correlation between the criteria weight values and the decision makers' risk preferences is analyzed and useful conclusions are discussed.

  10. Achievements of risk-based produced water management on the Norwegian continental shelf (2002-2008).

    PubMed

    Smit, Mathijs G D; Frost, Tone K; Johnsen, Ståle

    2011-10-01

    In 1996, the Norwegian government issued a White Paper requiring the Norwegian oil industry to reach the goal of "zero discharge" for the marine environment by 2005. To achieve this goal, the Norwegian oil and gas industry initiated the Zero Discharge Programme for discharges of produced formation water from the hydrocarbon-containing reservoir, in close communication with regulators. The environmental impact factor (EIF), a risk-based management tool, was developed by the industry to quantify and document the environmental risks from produced water discharges. The EIF represents a volume of recipient water containing concentrations of one or more substances at a level exceeding a generic threshold for ecotoxicological effects. In addition, this tool facilitates the identification and selection of cost-effective risk mitigation measures. The EIF tool has been used by all operators on the Norwegian continental shelf since 2002 to report progress toward the goal of "zero discharge," interpreted as "zero harmful discharges," to the regulators. Even though produced water volumes increased by approximately 30% between 2002 and 2008 on the Norwegian continental shelf, the total environmental risk from produced water discharges, expressed by the summed EIF for all installations, has been reduced by approximately 55%. The total amount of oil discharged to the sea was reduced by 18% over the period 2000 to 2006. The experience from the Zero Discharge Programme shows that a risk-based approach is an excellent working tool to reduce discharges of potentially harmful substances from offshore oil and gas installations. PMID:21594986
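
    A simplified reading of the EIF computation described above: the EIF is proportional to the volume of recipient water in which summed risk ratios (predicted concentration over a no-effect threshold) exceed 1. The plume shape, grid, substances, and threshold values below are illustrative assumptions; the operational EIF relies on detailed dilution modelling and ecotoxicological threshold data:

```python
import numpy as np

cell_volume = 10.0 * 10.0 * 2.0          # m^3 per grid cell (assumed grid)
x, y, zl = np.meshgrid(np.arange(0, 2000, 10),      # downstream (m)
                       np.arange(-500, 500, 10),    # crosswise (m)
                       np.arange(0, 40, 2),         # depth (m)
                       indexing="ij")

def concentration(c0):
    """Crude Gaussian-style dilution of a produced-water substance (assumed)."""
    sx, sy, sz = 600.0, 120.0, 8.0
    return c0 * np.exp(-x / sx) * np.exp(-(y / sy) ** 2) * np.exp(-(zl / sz) ** 2)

pec = {"naphthalene": concentration(40.0), "phenol": concentration(25.0)}  # ug/L
pnec = {"naphthalene": 2.0, "phenol": 7.7}                                 # ug/L

risk_ratio = sum(pec[s] / pnec[s] for s in pec)        # summed PEC/PNEC
eif_volume = np.count_nonzero(risk_ratio > 1.0) * cell_volume   # m^3
print(eif_volume)
```

    Mitigation measures can then be compared by recomputing this exceedance volume per discharge option, which is the selection role the abstract describes for the EIF.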

  11. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    SciTech Connect

    Huq, M; Palta, J; Dunscombe, P; Thomadsen, B

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process, and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and the establishment of a quality management program that best avoids the faults and risks identified in the overall process. These tools can be readily adapted to radiation therapy practice because of their simplicity and effectiveness, providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. 
Learn what
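
    The FMEA step named in the session description can be sketched as ranking failure modes by risk priority number, RPN = occurrence × severity × (lack of) detectability, which is the TG100-style prioritization. The failure modes and scores below are illustrative assumptions, not TG100's published tables:

```python
# Minimal FMEA sketch: each failure mode gets occurrence (O), severity (S),
# and lack-of-detectability (D) scores on a 1-10 scale; modes are ranked by
# RPN = O * S * D so that QM effort targets the highest-risk steps first.

failure_modes = [
    # (process step, failure mode, O, S, D)
    ("treatment planning", "wrong CT dataset imported",     2, 9, 4),
    ("treatment planning", "dose constraint omitted",       5, 6, 6),
    ("plan transfer",      "beam parameter corrupted",      3, 8, 3),
    ("treatment delivery", "patient setup error",           6, 7, 5),
    ("imaging",            "laser misalignment undetected", 4, 5, 7),
]

ranked = sorted(failure_modes, key=lambda m: m[2] * m[3] * m[4], reverse=True)
for step, mode, o, s, d in ranked:
    print(f"RPN={o * s * d:4d}  {step:20s} {mode}")
```

    The highest-RPN modes are then the natural targets for the fault tree analysis and QM program design mentioned above.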

  12. Risk-based Inspection Scheduling Planning for Intelligent Agent in the Autonomous Fault Management

    SciTech Connect

    Hari Nugroho, Djoko; Sudarno

    2010-06-22

    This paper developed an autonomous fault-management capability focused on inspection scheduling planning, applied to an advanced small nuclear reactor without on-site refuelling, to assure safety without human intervention. The inspection schedule was planned optimally using a risk-based approach that compromises between two important constraints on the planned actions: failure probability and shortest path. Performance was demonstrated by computer simulation using the DURESS component locations and failure probabilities. It was concluded that the first component to be inspected should be flow sensor FB2, which had the largest comparative value, 0.104233, among the components. The route then visits, in sequence, FB1, FA2, FA1, FB, FA, VB, pump B, VA, pump A, VB2, VB1, VA2, VA1, reservoir 2, reservoir 1, FR2, and FR1. The planned route can be transferred to actuate a robot arm acting as the intelligent agent.
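
    The risk-based prioritization described above, a compromise between failure probability and path length, can be sketched with a simple ratio-type score. The component coordinates, failure probabilities, and the score formula are illustrative assumptions; the paper's actual scoring (which yields the value 0.104233 for FB2) is not reproduced here:

```python
import math

# Rank components for inspection: higher failure probability raises priority,
# longer travel distance from the agent lowers it.

agent = (0.0, 0.0)
components = {                      # name: (x, y, failure probability)
    "FB2":   (2.0, 1.0, 0.20),
    "FB1":   (2.5, 1.5, 0.18),
    "FA2":   (3.0, 1.0, 0.15),
    "pumpB": (5.0, 4.0, 0.10),
}

def priority(name):
    x, y, pf = components[name]
    dist = math.hypot(x - agent[0], y - agent[1])
    return pf / (1.0 + dist)        # assumed compromise: risk per unit path

route = sorted(components, key=priority, reverse=True)
print(route)                        # inspect highest-priority component first
```

    In a full planner the scores would be recomputed from the agent's new position after each visit, yielding a route like the FB2-first sequence reported in the abstract.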

  13. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals of synthetic or natural origin that have recently been discovered in the environment and for which environmental or public health risks are yet to be established, owing to the limited information available on their interactions with, and toxicological impacts on, receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example direct discharges to the aquatic environment such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment, and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and on their toxicological impact. This lack of information can be attributed to limited financial resources and to the lack of analytical techniques for detecting their effects on ecosystems and human health, alone or in mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management together with complementary strategies. PMID:27062002

  14. National Drought Policy: Shifting the Paradigm from Crisis to Risk-based Management

    NASA Astrophysics Data System (ADS)

    Wilhite, D. A.; Sivakumar, M. K.; Stefanski, R.

    2011-12-01

    Drought is a normal part of climate in virtually all of the world's climatic regimes. To better address the risks associated with this hazard and societal vulnerability to it, there must be a dramatic paradigm shift in our approach to drought management in the coming decade, in light of the increasing frequency of droughts and projections of increased severity and duration of these events in many regions, especially in the developing world. Addressing this challenge will require an improved awareness of drought as a natural hazard, the establishment of integrated drought monitoring and early warning systems, a higher level of preparedness that fully incorporates risk-based management, and the adoption of national drought policies directed at increasing the coping capacity and resilience of populations to future drought episodes. The World Meteorological Organization (WMO), in partnership with other United Nations agencies, the National Drought Mitigation Center at the University of Nebraska, NOAA, the U.S. Department of Agriculture, and other partners, is currently launching a program to organize a High Level Meeting on National Drought Policy (HMNDP) in March 2013 to encourage the development of national drought policies through a compendium of key policy elements. The key objectives of a national drought policy are to: (1) encourage vulnerable economic sectors and population groups to adopt self-reliant measures that promote risk management; (2) promote sustainable use of the agricultural and natural resource base; and (3) facilitate early recovery from drought through actions consistent with national drought policy objectives. The key elements of a drought policy framework are policy and governance, including political will; addressing risk and improving early warnings, including vulnerability analysis, impact assessment, and communication; mitigation and preparedness, including the application of effective and

  15. Toward a holistic and risk-based management of European river basins.

    PubMed

    Brack, Werner; Apitz, Sabine E; Borchardt, Dietrich; Brils, Jos; Cardoso, Ana Cristina; Foekema, Edwin M; van Gils, Jos; Jansen, Stefan; Harris, Bob; Hein, Michaela; Heise, Susanne; Hellsten, Seppo; de Maagd, P Gert-Jan; Müller, Dietmar; Panov, Vadim E; Posthuma, Leo; Quevauviller, Philippe; Verdonschot, Piet F M; von der Ohe, Peter C

    2009-01-01

    The European Union Water Framework Directive (WFD) requires a good chemical and ecological status of European surface waters by 2015. Integrated, risk-based management of river basins is presumed to be an appropriate approach to achieve that goal. The approach of focusing on distinct hazardous substances in surface waters together with investment in best available technology for treatment of industrial and domestic effluents was successful in significantly reducing excessive contamination of several European river basins. The use of the concept of chemical status in the WFD is based on this experience and focuses on chemicals for which there is a general agreement that they should be phased out. However, the chemical status, based primarily on a list of 33 priority substances and 8 priority hazardous substances, considers only a small portion of possible toxicants and does not address all causes of ecotoxicological stress in general. Recommendations for further development of this concept are 1) to focus on river basin-specific toxicants, 2) to regularly update priority lists with a focus on emerging toxicants, 3) to consider state-of-the-art mixture toxicity concepts and bioavailability to link chemical and ecological status, and 4) to add a short list of priority effects and to develop environmental quality standards for these effects. The ecological status reflected by ecological quality ratios is a leading principle of the WFD. While on the European scale the improvement of hydromorphological conditions and control of eutrophication are crucial to achieve a good ecological status, on a local and regional scale managers have to deal with multiple pressures. On this scale, toxic pollution may play an important role. Strategic research is necessary 1) to identify dominant pressures, 2) to predict multistressor effects, 3) to develop stressor- and type-specific metrics of pressures, and 4) to better understand the ecology of recovery. The concept of reference

  16. Seasonal Water Resources Management and Probabilistic Operations Forecast in the San Juan Basin

    NASA Astrophysics Data System (ADS)

    Daugherty, L.; Zagona, E. A.; Rajagopalan, B.; Grantz, K.; Miller, W. P.; Werner, K.

    2013-12-01

    within the NWS Community Hydrologic Prediction System (CHPS) to produce an ensemble streamflow forecast. The ensemble traces are used to drive the MTOM, together with the initial conditions of the water resources system and the operating rules, to provide ensembles of water resources management and operation metrics. We applied this integrated forecasting approach to the San Juan River Basin (SJRB) using a portion of the Colorado River MTOM. Management objectives in the basin include water supply for irrigation, tribal water rights, environmental flows, and flood control. Spring streamflow ensembles were issued at four lead times, on the first of each month from January to April, and incorporated into the MTOM for the period 2002-2010. Ensembles of operational performance metrics for the SJRB, such as Navajo Reservoir releases, end-of-water-year storage, environmental flows, and water supply for irrigation, were computed, and their skill was evaluated against variables obtained in a baseline simulation using historical streamflow. Preliminary results indicate that the probabilistic forecasts thus obtained may show increased skill, especially at long lead times (e.g., on January 1 and February 1). The probabilistic information on water management variables quantifies the risk of system vulnerabilities and thus enables risk-based, efficient planning and operations.

  17. Dynamic Resource Management in Clouds: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Gonçalves, Paulo; Roy, Shubhabrata; Begin, Thomas; Loiseau, Patrick

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. The cost of resources varies significantly depending on how they are configured and used; hence efficient management of resources is of prime interest to both Cloud Providers and Cloud Users. In this work we suggest a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand (VoD) use case to justify our claims, we propose an analytical model, inspired by standard epidemic-spreading models, to represent sudden and intense workload variations. We show that the resulting model satisfies a Large Deviation Principle that statistically characterizes extreme rare events, such as those produced by "buzz/flash crowd effects" that may cause workload overflow in the VoD context. This analysis provides valuable insight into expectable abnormal behaviors of systems. We exploit the information obtained from the Large Deviation Principle in the proposed VoD use case to define policies (Service Level Agreements). We believe these policies for elastic resource provisioning and usage may be of interest to all stakeholders in the emerging context of cloud networking.

  18. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    USGS Publications Warehouse

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyere, Cindy L.; Newlon, Karen R.

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  19. A Risk-Based Approach to Evaluating Wildlife Demographics for Management in a Changing Climate: A Case Study of the Lewis's Woodpecker

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyère, Cindy L.; Newlon, Karen R.

    2012-12-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker ( Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  20. The future of monitoring in clinical research - a holistic approach: linking risk-based monitoring with quality management principles.

    PubMed

    Ansmann, Eva B; Hecht, Arthur; Henn, Doris K; Leptien, Sabine; Stelzer, Hans Günther

    2013-01-01

    For several years, risk-based monitoring has been the new "magic bullet" for improvement in clinical research. Many authors in clinical research, ranging from industry and academia to authorities, are keen to demonstrate better monitoring efficiency by reducing monitoring visits, monitoring time on site, monitoring costs and so on, always invoking risk-based monitoring principles. Mostly forgotten is the fact that the use of risk-based monitoring is only adequate if all mandatory prerequisites at the site, for the monitor, and for the sponsor are fulfilled. Based on the relevant chapter in ICH GCP (International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use - Good Clinical Practice), this publication takes a holistic approach by identifying and describing the requirements for future monitoring and for the use of risk-based monitoring. As the authors are operational managers as well as QA (Quality Assurance) experts, both perspectives are represented to arrive at efficient and high-quality ways of future monitoring according to ICH GCP. PMID:23382708

  21. How to Quantify Sustainable Development: A Risk-Based Approach to Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has posed a challenge in terms of quantification. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability of water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators: reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a subroutine estimating the risk-based indicators via the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model via Least Cost and Uniform Treatment approaches in order to locate the optimal point of pollutant control costs given a desired reliability value for DO at two different targets. The risk-based approach developed herein, particularly via the FORM technique, appears to be an efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, while simultaneously requiring technical advances along with changes in current attitudes toward environmental protection.
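The reliability, resilience, and vulnerability indicators named in this abstract have widely used operational definitions. The sketch below estimates them by Monte Carlo simulation for a synthetic dissolved-oxygen series; the threshold and distribution parameters are illustrative and are not taken from the Karoun study.

```python
import random

def risk_indicators(series, threshold):
    """Reliability, resilience, and vulnerability of a series judged
    against a minimum acceptable level (e.g. a dissolved-oxygen standard)."""
    failures = [x < threshold for x in series]
    n_fail = sum(failures)
    # reliability: fraction of time the system is in a satisfactory state
    reliability = 1.0 - n_fail / len(series)
    # resilience: chance that a failure step is followed by recovery
    recoveries = sum(1 for i in range(len(series) - 1)
                     if failures[i] and not failures[i + 1])
    resilience = recoveries / n_fail if n_fail else 1.0
    # vulnerability: average shortfall below the threshold during failures
    deficits = [threshold - x for x, f in zip(series, failures) if f]
    vulnerability = sum(deficits) / len(deficits) if deficits else 0.0
    return reliability, resilience, vulnerability

# Monte Carlo sketch on a synthetic DO series (mg/L); numbers are illustrative
random.seed(1)
do_series = [random.gauss(6.0, 1.0) for _ in range(1000)]
rel, res, vul = risk_indicators(do_series, threshold=5.0)
```

In a FORM-style analysis the failure probability would instead be obtained from a reliability index; the Monte Carlo estimate above serves the same role at higher computational cost.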

  2. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  3. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios that carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores calculated using the criteria derived from the two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on these combined goodness measures. Our final results illustrate how the type of operator and the risk attitudes adopted in satisfying the relevant criteria across the sub-basins ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health. PMID:26734840
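The SAW and OWA aggregation operators named above can be stated compactly: SAW attaches weights to criteria, while OWA attaches weights to rank positions of the sorted scores, which is how optimistic or pessimistic risk attitudes are encoded. A minimal sketch with hypothetical BMP criterion scores and weights:

```python
def saw(scores, weights):
    """Simple additive weighting: weights attach to criteria."""
    return sum(s * w for s, w in zip(scores, weights))

def owa(scores, order_weights):
    """Ordered weighted averaging: weights attach to rank positions of the
    descending-sorted scores, so the weight vector encodes risk attitude."""
    return sum(s * w for s, w in zip(sorted(scores, reverse=True), order_weights))

# Hypothetical BMP scores on three criteria (environmental, economic, social)
scores = [0.8, 0.4, 0.6]
neutral     = saw(scores, [1/3, 1/3, 1/3])   # risk-neutral aggregation
optimistic  = owa(scores, [0.6, 0.3, 0.1])   # weight mass on best outcomes
pessimistic = owa(scores, [0.1, 0.3, 0.6])   # weight mass on worst outcomes
```

The IOWA operator generalizes OWA by ordering the scores according to an auxiliary inducing variable rather than by the scores themselves.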

  4. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    NASA Astrophysics Data System (ADS)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of the subsurface water resources is a required key component in this particular human-nature interaction system to evaluate the tradeoffs between irrigation for agriculture and ecosystem low-flow requirements. This work presents an optimization scheme built upon systems reliability-based design optimization (SRBDO) analysis, which accounts for prescribed probabilistic constraint evaluation. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO was developed in the aerospace industry and has been extensively applied in the field of structural engineering, but has seen only limited application in the field of hydrology. SRBDO uses probability theory to model uncertainty and to determine the probability of failure by solving a nonlinear mathematical programming problem. Furthermore, reliability-based design optimization provides complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both random variables and probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable to the total
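SRBDO itself evaluates probabilistic constraints with FORM-style reliability analysis; as a simpler illustration of the probabilistic-constraint idea, the sketch below uses Monte Carlo simulation to find the largest pumping rate whose probability of violating an environmental low-flow requirement stays below a tolerance. The linear stream-depletion response and all numbers are hypothetical.

```python
import random

def failure_prob(pump_rate, n=5000, seed=0):
    """Monte Carlo estimate of P(baseflow < environmental minimum) under an
    illustrative linear stream-aquifer response with uncertain inputs."""
    rng = random.Random(seed)
    min_flow = 1.0                          # m^3/s low-flow requirement
    failures = 0
    for _ in range(n):
        natural_flow = rng.gauss(2.0, 0.4)  # uncertain natural baseflow
        depletion = rng.uniform(0.5, 0.9)   # uncertain stream depletion factor
        if natural_flow - depletion * pump_rate < min_flow:
            failures += 1
    return failures / n

def max_reliable_rate(alpha=0.05):
    """Largest pumping rate whose failure probability stays below alpha,
    by bisection (failure probability is monotone in the rate)."""
    lo, hi = 0.0, 3.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if failure_prob(mid) <= alpha:
            lo = mid
        else:
            hi = mid
    return lo
```

A full SRBDO treatment would additionally return importance measures for each random variable; here the fixed seed keeps the chance-constraint evaluation deterministic for the bisection.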

  5. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as the increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system, as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying implications for production and hence economic viability. PMID:21797262

  6. Risk based bridge data collection and asset management and the role of structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Bush, Simon; Henning, Theunis; McCarten, Peter

    2011-04-01

    Bridges are critical to the operation and functionality of whole road networks. It is therefore essential that specific data is collected regarding bridge asset condition and performance, as this allows proactive management of the assets and associated risks and more accurate short- and long-term financial planning. This paper proposes and discusses a strategy for collection of data on bridge condition and performance. Recognizing that risk management is the primary driver of asset management, the proposed strategy prioritizes bridges for levels of data collection: core, intermediate, and advanced. Individual bridges are seen as parts of wider networks, and bridge risk and criticality assessment emphasizes bridge failure or underperformance risk in the network context. The paper demonstrates how more reliable and detailed data can assist in managing network and bridge risks, and provides a rationale for applying higher data collection levels to bridges characterized by higher risk and criticality. As bridge risk and/or criticality increases, planned and proactive integration of structural health monitoring (SHM) data into asset management is outlined. An example of bridge prioritization for data collection, using several bridges taken from a national highway network, is provided using an existing risk and criticality scoring methodology. The paper concludes with a discussion of the role of SHM in data collection for bridge asset management and where SHM can make the largest impacts.

  7. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred widely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the past. These four townships are the principal fishpond cultivation districts in Taiwan. Groundwater has become the main water supply because of the limited availability of surface water. Overpumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater utilization to that of pond water, for fish farming within the range of target cancer risk (TR), especially on the order of 10(-4)~10(-6). Our study results reveal that the 50th percentile of groundwater DRs can serve as a regulation standard for fish farm groundwater management for a TR of 10(-6). For a TR of 5 × 10(-6), we suggest using the 75th percentile of DR for groundwater management. For a TR of 10(-5), we suggest using the 95th percentile of DR for groundwater management in fish farm areas. For a TR exceeding 5 × 10(-5), we do not suggest establishing groundwater management standards under these risk standards. Based on the research results, we suggest that establishing a TR of 10(-5) and using the 95th percentile of DR is best for groundwater management in fish farm areas. PMID:24869949
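The percentile-based regulation standards described above can be read as a lookup from target cancer risk to a percentile of the simulated DR distribution. A sketch under that reading; the tier boundaries follow the abstract, while the interpolation scheme and the handling of TR values between tiers are assumptions:

```python
def percentile(values, q):
    """q-th percentile (0-100) with linear interpolation between order statistics."""
    s = sorted(values)
    pos = (len(s) - 1) * q / 100.0
    lo = int(pos)
    if lo + 1 >= len(s):
        return s[-1]
    frac = pos - lo
    return s[lo] * (1.0 - frac) + s[lo + 1] * frac

def dr_standard(dr_samples, target_risk):
    """Map a target cancer risk (TR) to a percentile of the simulated
    dilution-ratio distribution, mirroring the tiers in the abstract."""
    if target_risk <= 1e-6:
        q = 50
    elif target_risk <= 5e-6:
        q = 75
    elif target_risk <= 1e-5:
        q = 95
    else:
        return None          # no standard suggested at this risk level
    return percentile(dr_samples, q)
```

In the study the `dr_samples` would come from SIS realizations of the arsenic field; any list of simulated DR values works here.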

  8. Assistance to the states with risk based data management. Quarterly technical progress report, April 1--June 30, 1995

    SciTech Connect

    Paque, M.J.

    1995-07-28

    The tasks of this project are to: (1) complete implementation of a Risk Based Data Management System (RBDMS) in the states of Alaska, Mississippi, Montana, and Nebraska; and (2) conduct Area of Review (AOR) workshops in the states of California, Oklahoma, Kansas, and Texas. The RBDMS was designed to be a comprehensive database with the ability to expand into multiple areas, including oil and gas production. The database includes comprehensive well information for both producing and injection wells. It includes automated features for performing functions related to AOR analyses, environmental risk analyses, well evaluation, permit evaluation, compliance monitoring, operator bonding assessments, operational monitoring and tracking, and more. This quarterly report describes the status of development of the RBDMS project for both stated tasks and proposes further steps in its implementation.

  9. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle. PMID:25294001

  10. A risk-based approach to managing active pharmaceutical ingredients in manufacturing effluent.

    PubMed

    Caldwell, Daniel J; Mertens, Birgit; Kappler, Kelly; Senac, Thomas; Journel, Romain; Wilson, Peter; Meyerhoff, Roger D; Parke, Neil J; Mastrocco, Frank; Mattson, Bengt; Murray-Smith, Richard; Dolan, David G; Straub, Jürg Oliver; Wiedemann, Michael; Hartmann, Andreas; Finan, Douglas S

    2016-04-01

    The present study describes guidance intended to assist pharmaceutical manufacturers in assessing, mitigating, and managing the potential environmental impacts of active pharmaceutical ingredients (APIs) in wastewater from manufacturing operations, including those from external suppliers. The tools are not a substitute for compliance with local regulatory requirements but rather are intended to help manufacturers achieve the general standard of "no discharge of APIs in toxic amounts." The approaches detailed in the present study identify practices for assessing potential environmental risks from APIs in manufacturing effluent and outline measures that can be used to reduce the risk, including selective application of available treatment technologies. These measures either are commonly employed within the industry or have been implemented to a more limited extent based on local circumstances. Much of the material is based on company experience and case studies discussed at an industry workshop held on this topic. PMID:26183919

  11. Development of applicating probabilistic long-term forecasts into water management

    NASA Astrophysics Data System (ADS)

    Hwang, Jin; Ryoo, Kyongsik; Suh, Aesook

    2016-04-01

    This research presents the development of an approach for applying probabilistic long-term forecasts to water management. Forecast Cumulative Distribution Functions (CDFs) of monthly precipitation are plotted by combining the range of monthly precipitation, based on a proper Probability Density Function (PDF) fitted to past data, with the probabilistic forecasts in each category. Ensembles of inflow are estimated by entering ensembles of precipitation generated from the CDFs into the 'abcd' water budget model. The bias and RMSE between averages of past data and observed inflow are compared with those of the forecasted ensembles. In our results, the bias and RMSE of average precipitation in the forecasted ensemble are larger than for past data, whereas the average inflow in the forecasted ensemble is smaller than for past data. This result should be used as reference data when applying long-term forecasts to water management, because of the limited number of forecast data available for verification and the differences between the Andong-dam basin and the forecasted regions. This research is significant in suggesting a method of applying probabilistic information on climate variables from long-term forecasts to water management in Korea. In the future, the original output of the climate model that produces the long-term probabilistic forecasts should be verified directly as input data to a water budget model, so that a more scientific response in water management to the uncertainty of climate change can be achieved.
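Generating a precipitation ensemble from a forecast CDF, as described above, amounts to inverse-CDF sampling. A minimal sketch, assuming the CDF is supplied as piecewise-linear points starting at probability 0; the precipitation values in the usage line are invented:

```python
import random

def sample_from_cdf(xs, cdf, n, seed=0):
    """Draw an ensemble of size n from a forecast CDF given as piecewise-
    linear points (xs[i], cdf[i]); assumes cdf[0] == 0 and cdf[-1] == 1."""
    rng = random.Random(seed)
    ensemble = []
    for _ in range(n):
        u = rng.random()
        for i in range(1, len(cdf)):
            if u <= cdf[i]:
                # invert the linear segment between points i-1 and i
                t = (u - cdf[i - 1]) / (cdf[i] - cdf[i - 1])
                ensemble.append(xs[i - 1] + t * (xs[i] - xs[i - 1]))
                break
        else:
            ensemble.append(xs[-1])
    return ensemble

# Illustrative forecast CDF for monthly precipitation (mm); values invented
precip = sample_from_cdf([0.0, 50.0, 120.0, 300.0], [0.0, 0.3, 0.8, 1.0], 500)
```

Each sampled precipitation value would then be routed through the 'abcd' water budget model to obtain the corresponding inflow ensemble member.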

  12. Communicating uncertainty: managing the inherent probabilistic character of hazard estimates

    NASA Astrophysics Data System (ADS)

    Albarello, Dario

    2013-04-01

    Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena that largely exceed the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios for future occurrences remain possible (plausible), and the contribution of scientific knowledge (based on available knowledge about the underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other terms, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, however, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition of science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., to impose expensive forms of risk-reducing activities): in these cases the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decider (be it the single citizen or the legitimate expression of a larger community) to choose among several possibilities (however characterized by different levels of likelihood). In many cases this can be uncomfortable, and strong is the temptation to delegate to the scientific counterpart the responsibility for these decisions. This "transfer" from the genuine political field to an improper scientific context is also facilitated by the lack of a diffuse culture of "probability" outside the scientific community (and in many cases inside it also). This is partially the effect of the generalized adoption (by media and scientific

  13. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
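The three components listed above can be shown in miniature: factors over subsets of variables multiply to give the joint distribution, and conditioning on an observation plus renormalizing yields the posterior. A two-variable discrete example; the rain/runoff model and its numbers are invented for illustration:

```python
# Two binary variables, Rain -> Runoff; factor tables are invented.
p_rain = {0: 0.7, 1: 0.3}                    # prior factor over Rain
p_runoff_given_rain = {                      # conditional factor, keyed (rain, runoff)
    (0, 0): 0.9, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.8,
}

def posterior_rain(observed_runoff):
    """P(Rain | Runoff = observed): multiply the factors, condition on the
    observation, and renormalize -- the graph's joint is the factor product."""
    unnorm = {r: p_rain[r] * p_runoff_given_rain[(r, observed_runoff)]
              for r in (0, 1)}
    z = sum(unnorm.values())
    return {r: v / z for r, v in unnorm.items()}
```

In larger graphs the same computation is carried out locally by message-passing algorithms that exploit the factorization rather than by full enumeration.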

  14. A Risk-Based Approach to Manage Nutrient Contamination From Household Wastewater

    NASA Astrophysics Data System (ADS)

    Gold, A. J.; Sims, J. T.

    2001-05-01

    Nutrients originating from decentralized wastewater treatment systems (DWTS) can pose a risk to human and ecosystem health. Assessing the likelihood and magnitude of this risk is a formidable and complex challenge. However, a properly constructed risk assessment is essential if we are to design and implement practices for DWTS that minimize the impacts of nutrients on our environment. To do this successfully, we must carefully consider: (i) the specific risks posed by nutrients emitted by DWTS and the sensitivity of humans and ecosystems to these risks; (ii) the pathways by which nutrients move from DWTS to the sectors of the environment where the risk will occur (most often ground and surface waters); (iii) the micro and macro-scale processes that affect the transport and transformations of nutrients once they are emitted from the DWTS and how this in turn affects risk; and (iv) the effects of current or alternative DWTS design and management practices on nutrient transport and subsequent risks to humans and ecosystems. In this paper we examine the risks of nutrients from DWTS to human and ecosystem health at both the micro and the macro?level spatial scales. We focus primarily on the factors that control the movement of N and P from DWTS to ground and surface waters and the research needs related to controlling nonpoint source nutrient pollution from DWTS. At the micro?scale the exposure pathways include the system and the immediate surroundings, i.e., the subsurface environment near the DWTS. The exposed individual or ecosystem at the micro-scale can be a household well, lake, stream or estuary that borders an individual wastewater treatment system. At the macro?level our focus is at the aquifer and watershed scale and the risks posed to downstream ecosystems and water users by nonpoint source pollution of these waters by nutrients from DWTS. 
We analyze what is known about the effectiveness of current designs at mitigating these risks and our ability to predict

  15. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.

  16. A two-stage inexact joint-probabilistic programming method for air quality management under uncertainty.

    PubMed

    Lv, Y; Huang, G H; Li, Y P; Yang, Z F; Sun, W

    2011-03-01

    A two-stage inexact joint-probabilistic programming (TIJP) method is developed for planning a regional air quality management system with multiple pollutants and multiple sources. The TIJP method incorporates the techniques of two-stage stochastic programming, joint-probabilistic constraint programming and interval mathematical programming, where uncertainties expressed as probability distributions and interval values can be addressed. Moreover, it can not only examine the risk of violating joint-probability constraints, but also account for economic penalties as corrective measures against any infeasibility. The developed TIJP method is applied to a case study of a regional air pollution control problem, where the air quality index (AQI) is introduced for evaluation of the integrated air quality management system associated with multiple pollutants. The joint-probability exists in the environmental constraints for AQI, such that individual probabilistic constraints for each pollutant can be efficiently incorporated within the TIJP model. The results indicate that useful solutions for air quality management practices have been generated; they can help decision makers to identify desired pollution abatement strategies with minimized system cost and maximized environmental efficiency. PMID:21067860
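The distinction between individual and joint probabilistic constraints is central to the TIJP method: satisfying each pollutant's constraint with high probability individually does not guarantee the same probability of satisfying them jointly. A Monte Carlo sketch with invented concentration distributions and limits:

```python
import random

def constraint_probabilities(n=20000, seed=42):
    """Monte Carlo comparison of individual vs joint chance constraints
    for two pollutants; distributions and limits are invented."""
    rng = random.Random(seed)
    limit_so2, limit_nox = 1.2, 1.1          # ambient limits (arbitrary units)
    ok_so2 = ok_nox = ok_joint = 0
    for _ in range(n):
        so2 = rng.gauss(1.0, 0.15)           # uncertain resulting concentration
        nox = rng.gauss(0.9, 0.20)
        a, b = so2 <= limit_so2, nox <= limit_nox
        ok_so2 += a
        ok_nox += b
        ok_joint += a and b
    return ok_so2 / n, ok_nox / n, ok_joint / n

p_so2, p_nox, p_joint = constraint_probabilities()
```

The joint satisfaction probability is never larger than the smallest individual one, which is why the TIJP model enforces the joint constraint directly rather than each pollutant's constraint in isolation.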

  17. Towards risk-based drought management in the Netherlands: making water supply levels transparent to water users

    NASA Astrophysics Data System (ADS)

    Ter Maat, Judith; Mens, Marjolein; Van Vuren, Saskia; Van der Vat, Marnix

    2016-04-01

    Within the project Improving Predictions and Management of Hydrological Extremes (IMPREX), running from 2016 to 2019, a consortium of the Dutch research institute Deltares and the Dutch water management consultant HKV will design and build a tool to support quantitative risk-informed decision-making for fresh water management in the Netherlands, in particular the decision on water supply service levels. The research will be conducted in collaboration with the Dutch Ministry for Infrastructure and Environment, the Freshwater Supply Programme Office, the Dutch governmental organisation responsible for water management (Rijkswaterstaat), the Foundation for Applied Water Research (STOWA, knowledge centre of the water boards), and a number of water boards. In the session we will present the conceptual framework for a risk-based approach to water shortage management and share thoughts on how the proposed tool can be applied in the Dutch water management context.

  18. Mobile human network management and recommendation by probabilistic social mining.

    PubMed

    Min, Jun-Ki; Cho, Sung-Bae

    2011-06-01

    Recently, the inferring or sharing of mobile contexts has been actively investigated as cell phones have become more than communication devices. However, most work has focused on utilizing these contexts in social network services, while the means of mining or managing the human network itself have barely been considered. In this paper, the SmartPhonebook, which mines users' social connections to manage their relationships by reasoning about social and personal contexts, is presented. It works like an artificial assistant which recommends the candidate callees whom the users would probably like to contact in a certain situation. Moreover, it visualizes their social contexts, such as closeness and relationships with others, in order to let the users know their social situations. The proposed method infers the social contexts based on contact patterns, while it extracts the personal contexts, such as the users' emotional states and behaviors, from the mobile logs. Here, Bayesian networks are exploited to handle the uncertainties in the mobile environment. The proposed system has been implemented on the MS Windows Mobile 2003 SE Platform on a Samsung SPH-M4650 smartphone and has been tested on real-world data. The experimental results showed that the system provides an efficient and informative way for mobile social networking. PMID:21172755

  19. Use of probabilistic risk assessment (PRA) in expert systems to advise nuclear plant operators and managers

    SciTech Connect

    Uhrig, R.E.

    1988-01-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors, and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts, or on knowledge that has been codified in publications, books, or regulations, to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. 5 refs., 1 fig., 2 tabs.

  20. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos,Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; Vesely, William; Youngblood, Robert

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods for performing risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was
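Fault tree analysis, mentioned above as an early PRA method, combines basic-event probabilities through AND/OR gates. A minimal sketch assuming independent basic events with invented failure probabilities:

```python
def and_gate(*probs):
    """Top-event probability for an AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Top-event probability for an OR gate over independent basic events:
    1 minus the probability that no input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Toy fault tree: system fails if (pump A AND pump B fail) OR control fails
p_top = or_gate(and_gate(1e-2, 1e-2), 1e-4)
```

Real PRA tools additionally handle dependent events, common-cause failures, and minimal cut sets; the gate algebra above is only the independent-event core.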

  1. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming unit (CFU) counts are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
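The MPN as described here, the MLE of concentration given serial-dilution tube counts under a Poisson model (a tube is positive iff it receives at least one organism), can be computed directly. A sketch; the dilution design in the usage line is illustrative:

```python
import math

def mpn_log_likelihood(conc, dilutions):
    """Log-likelihood of concentration conc (organisms/mL) given results
    (volume_mL, n_tubes, n_positive) for each dilution; a tube is positive
    iff it received at least one organism (Poisson assumption)."""
    ll = 0.0
    for vol, n, pos in dilutions:
        p_pos = 1.0 - math.exp(-conc * vol)
        if pos:
            if p_pos <= 0.0:
                return -math.inf
            ll += pos * math.log(p_pos)
        ll -= (n - pos) * conc * vol   # log of exp(-conc*vol) per sterile tube
    return ll

def mpn(dilutions, lo=1e-4, hi=1e4):
    """MPN = maximum likelihood estimate of conc, by golden-section search
    on a log10 scale (the MPN likelihood is unimodal)."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = math.log10(lo), math.log10(hi)
    for _ in range(200):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if mpn_log_likelihood(10.0 ** c, dilutions) < mpn_log_likelihood(10.0 ** d, dilutions):
            a = c
        else:
            b = d
    return 10.0 ** ((a + b) / 2.0)

# Illustrative three-dilution design: 10, 1, and 0.1 mL portions, 5 tubes each
estimate = mpn([(10.0, 5, 5), (1.0, 5, 3), (0.1, 5, 0)])
```

Published MPN tables tabulate exactly this MLE (with rounding) for standard tube designs, which is why a table lookup and a direct optimization agree.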

  2. Probabilistic scenario-based water resource planning and management: A case study in the Yellow River Basin, China

    NASA Astrophysics Data System (ADS)

    Dong, C.; Schoups, G.; van de Giesen, N.

    2012-04-01

    Water resource planning and management is subject to large uncertainties with respect to the impact of climate change and socio-economic development on water systems. In order to deal with these uncertainties, probabilistic climate and socio-economic scenarios were developed based on the Principle of Maximum Entropy, as defined within information theory, and used as inputs to hydrological models to construct probabilistic water scenarios via Monte Carlo simulation. Probabilistic scenarios provide more explicit information than equally likely scenarios for decision-making in water resource management. A case study was developed for the Yellow River Basin, China, where future water availability and water demand are affected by both climate change and socio-economic development. Climate scenarios of future precipitation and temperature were developed based on the results of multiple global climate models, and socio-economic scenarios were downscaled from existing large-scale scenarios. Probability distributions were assigned to these scenarios to explicitly represent a full set of future possibilities. Probabilistic climate scenarios were used as input to a rainfall-runoff model to simulate future river discharge, and socio-economic scenarios were used to calculate water demand. A full set of possible future water supply-demand scenarios and their associated probability distributions were generated. This set can feed further analysis of the future water balance, which can serve as a basis for planning and managing water resources in the Yellow River Basin. Key words: Probabilistic scenarios, climate change, socio-economic development, water management
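The maximum-entropy step can be illustrated for discrete scenarios: subject to a prescribed mean, the MaxEnt distribution takes the exponential form p_i ∝ exp(λx_i), with λ found by bisection. The scenario values and target mean below are invented for illustration:

```python
import math

def maxent_probs(values, target_mean):
    """Maximum-entropy probabilities over discrete scenario values subject
    to a mean constraint: p_i proportional to exp(lam * x_i).
    Values are rescaled to [0, 1] internally for numerical stability;
    assumes at least two distinct values and an achievable target."""
    vmin, vmax = min(values), max(values)
    xs = [(v - vmin) / (vmax - vmin) for v in values]
    t = (target_mean - vmin) / (vmax - vmin)
    lo, hi = -60.0, 60.0
    for _ in range(200):               # the mean increases with lam: bisect
        lam = 0.5 * (lo + hi)
        w = [math.exp(lam * x) for x in xs]
        m = sum(x * wi for x, wi in zip(xs, w)) / sum(w)
        lo, hi = (lam, hi) if m < t else (lo, lam)
    w = [math.exp(0.5 * (lo + hi) * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]
```

For instance, `maxent_probs([400, 500, 600], 520)` (hypothetical precipitation scenarios in mm) tilts probability toward the wetter scenario; the resulting weights can then drive Monte Carlo sampling through a rainfall-runoff model.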

  3. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
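The probabilistic treatment of IM tolerance can be caricatured with a toy Monte Carlo: given uncertainty in each aircraft's speed tracking, estimate how often the realized spacing deviates from the assigned interval by more than the tolerance. Every parameter value below is invented, and the persistent-speed-error model is a deliberate simplification:

```python
import random

def prob_tolerance_violation(tolerance_s=10.0, n=40000, seed=2,
                             speed_err_sd=1.0, nominal_speed=120.0,
                             horizon_s=600.0):
    """Monte Carlo sketch: the IM and target aircraft each hold a
    persistent, normally distributed ground-speed error (m/s) over the
    horizon; count how often the accumulated spacing error, converted to
    seconds at the nominal speed, exceeds the IM tolerance."""
    rng = random.Random(seed)
    violations = 0
    for _ in range(n):
        err_im = rng.gauss(0.0, speed_err_sd)
        err_target = rng.gauss(0.0, speed_err_sd)
        gap_err_s = (err_im - err_target) * horizon_s / nominal_speed
        if abs(gap_err_s) > tolerance_s:
            violations += 1
    return violations / n
```

A formal-methods treatment would replace this sampling estimate with a provable bound, but the quantity being bounded is the same.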

  4. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to the NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.
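A first-order quantity that AOT evaluations of this kind revolve around is the incremental core damage probability of a downtime event: the conditional-minus-baseline core damage frequency multiplied by the downtime. A hedged sketch (function and variable names are ours, not the handbook's):

```python
def aot_single_event_risk(base_cdf_per_yr, conditional_cdf_per_yr,
                          outage_hours):
    """First-order incremental core damage probability of one outage:
    (R_conditional - R_baseline) * d, with the downtime d in years."""
    return (conditional_cdf_per_yr - base_cdf_per_yr) * outage_hours / 8760.0

def aot_yearly_risk(base_cdf_per_yr, conditional_cdf_per_yr,
                    outage_hours, events_per_year):
    """Expected yearly contribution of a recurring maintenance outage."""
    return events_per_year * aot_single_event_risk(
        base_cdf_per_yr, conditional_cdf_per_yr, outage_hours)
```

For example, with a baseline of 1e-5/yr, a conditional frequency of 5e-5/yr while the component is out, and an 87.6-hour outage, the single-event increment is 4e-7.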

  5. Spatial probabilistic multi-criteria decision making for assessment of flood management alternatives

    NASA Astrophysics Data System (ADS)

    Ahmadisharaf, Ebrahim; Kalyanapu, Alfred J.; Chung, Eun-Sung

    2016-02-01

    Flood management alternatives are often evaluated on the basis of flood parameters such as depth and velocity. As these parameters are uncertain, so is the evaluation of the alternatives. It is thus important to incorporate the uncertainty of flood parameters into decision making frameworks. This research develops a spatial probabilistic multi-criteria decision making (SPMCDM) framework to demonstrate the impact of design rainfall uncertainty on the evaluation of flood management alternatives. The framework employs a probabilistic rainfall-runoff transformation model, a two-dimensional flood model, and a spatial MCDM technique. Thereby, the uncertainty of decision making can be determined alongside the best alternative. A probability-based map is produced to show the discrete probability distribution function (PDF) of selecting each competing alternative; the best alternative at each grid cell is the mode of this PDF. The framework is demonstrated on the Swannanoa River watershed in North Carolina, USA, and its results are compared to those of the deterministic approach. While the deterministic framework fails to provide the uncertainty of selecting an alternative, the SPMCDM framework showed that, overall, selection of flood management alternatives in the watershed is "moderately uncertain". Moreover, three comparison metrics, the F fit measure, the κ statistic, and the Spearman rank correlation coefficient (ρ), are computed to compare the results of the two approaches. An F fit measure of 62.6%, a κ statistic of 15.4-45.0%, and a spatial mean ρ value of 0.48 imply a significant difference in decision making when design rainfall uncertainty is incorporated through the presented SPMCDM framework. The SPMCDM framework can help decision makers understand the uncertainty in selection of flood management alternatives.
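The per-cell selection step of such a framework reduces to tallying, across Monte Carlo realizations, which alternative wins in each grid cell; the mode gives the recommended alternative and its frequency the confidence. A minimal sketch (the nested-list data layout is an assumption):

```python
from collections import Counter

def cell_pdf_mode(maps):
    """maps: list of 2D grids, one per Monte Carlo realization, each cell
    holding the alternative selected in that realization.
    Returns two grids: the per-cell modal alternative and the probability
    (relative frequency) of that mode."""
    n = len(maps)
    rows, cols = len(maps[0]), len(maps[0][0])
    best = [[None] * cols for _ in range(rows)]
    conf = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            counts = Counter(m[i][j] for m in maps)
            alt, c = counts.most_common(1)[0]
            best[i][j] = alt
            conf[i][j] = c / n
    return best, conf
```

Cells where the modal probability is close to 1/(number of alternatives) are exactly the "moderately uncertain" regions the abstract refers to.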

  6. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. PMID:23777564
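A drastically simplified kernel of this kind of QMRA chains a contamination distribution, growth during storage, and an exponential dose-response. Every parameter value below is illustrative, not taken from the article:

```python
import math, random

def listeriosis_risk_per_serving(n=50000, seed=7, mean_log10=-1.0,
                                 sd_log10=0.8, growth_log10=2.0,
                                 serving_g=25.0, r=1.06e-12):
    """Monte Carlo sketch: initial concentration (log10 CFU/g) ~ Normal,
    a fixed log10 growth during storage, and an exponential dose-response
    P(ill | dose) = 1 - exp(-r * dose). The dose-response slope r and all
    other parameters are placeholders for illustration only."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        log10_conc = rng.gauss(mean_log10, sd_log10) + growth_log10
        dose = (10.0 ** log10_conc) * serving_g      # CFU per serving
        total += -math.expm1(-r * dose)              # P(illness | dose)
    return total / n
```

What-if scenarios correspond to re-running this with altered parameters, e.g. a smaller `growth_log10` for better cold-chain control.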

  7. Risk-Based Information to Support the Evaluation of Management Options for Cesium and Strontium Capsules at the Hanford Site

    SciTech Connect

    MacDonell, M.; Peterson, J.; Picel, M.; Douglas Hildebrand, R.

    2008-07-01

    Evaluations are under way to support U.S. Department of Energy decisions on how to manage cesium and strontium capsules currently in storage at the Hanford site. Health-based exposure limits for drinking water, oral toxicity data, and environmental fate information were combined in an initial evaluation to frame performance targets for managing chemicals and radionuclides that could leach from the capsules and migrate to groundwater over time. More than 50 relevant benchmarks were identified for 15 of the 17 contaminants in the study set. Of those multiple benchmarks, EPA limits for drinking water served as the main basis for the leachate performance targets. For the remaining two contaminants, stable cesium and zirconium, preliminary indicators were derived from a limited review of toxicity data. Thus, preliminary candidate concentrations were identified for the full study set to support the ongoing evaluation of capsule management options. In summary: In an earlier scoping study, three radionuclides and eight chemicals were identified as contaminants of interest for leachate from cesium and strontium capsules stored at the Hanford site. To frame management options for these capsules, it is assumed that contaminants will leach to groundwater and serve as a drinking water source in the long-term future. Before developing performance targets for the initial set of contaminants, a combined fate and toxicity evaluation was conducted to determine if any others should be added to account for decay or fate products and chemical toxicity. From this review, the list was expanded to produce a final study set of 17 contaminants. Established exposure limits and toxicity data were then reviewed and integrated to develop candidate health-based concentrations to frame performance targets for assessing options for long-term capsule management. This review of more than a dozen different benchmarks and toxicity sources translated to hundreds of individual data checks to support the

  8. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real time; any faults in software or hardware, or in their interaction, could have fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an arithmetic circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. To demonstrate the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for the GN&C systems of a small satellite and an F-16 fighter jet have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
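The Bayesian-network reasoning at the heart of such a monitor can be illustrated with a tiny two-cause network solved by enumeration (a real ISWHM compiles a much larger network into an arithmetic circuit; the structure and numbers here are invented):

```python
from itertools import product

def posterior_software_fault(priors, cpt_alarm, alarm=True):
    """Tiny health-monitoring Bayesian network, solved by enumeration:
    independent causes F (software fault) and H (hardware fault) with
    priors (pF, pH); a sensor alarm with P(alarm | F, H) = cpt_alarm[(f, h)].
    Returns P(F = 1 | alarm observation)."""
    pF, pH = priors
    num = den = 0.0
    for f, h in product([0, 1], repeat=2):
        p_joint = (pF if f else 1 - pF) * (pH if h else 1 - pH)
        p_alarm = cpt_alarm[(f, h)]
        likelihood = p_alarm if alarm else 1 - p_alarm
        den += p_joint * likelihood
        if f:
            num += p_joint * likelihood
    return num / den
```

Compiling the same computation into an arithmetic circuit precomputes this sum-of-products once, so each on-line query is a constant-time evaluation rather than an enumeration.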

  9. Inexact joint-probabilistic stochastic programming for water resources management under uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Y. P.; Huang, G. H.

    2010-11-01

    In this study, an inexact two-stage integer program with joint-probabilistic constraint (ITIP-JPC) is developed for supporting water resources management under uncertainty. This method can tackle uncertainties expressed as joint probabilities and interval values, and can reflect the reliability of satisfying (or the risk of violating) system constraints under uncertain events and/or parameters. Moreover, it can be used for analysing various policy scenarios that are associated with different levels of economic consequences when the pre-regulated targets are violated. The developed ITIP-JPC is applied to a case study of water resources allocation within a multi-stream, multi-reservoir and multi-user context, where joint probabilities exist in both water availabilities and storage capacities. The results indicate that reasonable solutions have been generated for both binary and continuous variables. They can help generate desired policies for water allocation and flood diversion with a maximized economic benefit and a minimized system-disruption risk.
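A stripped-down version of a probabilistic constraint of the kind the abstract relies on: find the largest firm allocation whose probability of being met by uncertain supply is at least α, using discrete scenarios. This only illustrates the constraint type, not the two-stage interval program (ITIP-JPC) itself:

```python
def max_firm_allocation(supply_scenarios, probs, alpha):
    """Largest allocation a such that P(supply >= a) >= alpha, given
    discrete supply scenarios with probabilities summing to 1."""
    cum = 0.0
    # accumulate probability from the wettest scenario downward
    for supply, p in sorted(zip(supply_scenarios, probs), reverse=True):
        cum += p
        if cum >= alpha - 1e-12:
            return supply
    return 0.0
```

Raising the reliability level α trades allocation (and hence economic benefit) against the risk of violating the supply constraint, which is the policy trade-off the abstract's scenarios explore.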

  10. Groundwater contamination from waste management sites: The interaction between risk-based engineering design and regulatory policy: 1. Methodology

    NASA Astrophysics Data System (ADS)

    Massmann, Joel; Freeze, R. Allan

    1987-02-01

    This paper puts in place a risk-cost-benefit analysis for waste management facilities that explicitly recognizes the adversarial relationship that exists in a regulated market economy between the owner/operator of a waste management facility and the government regulatory agency under whose terms the facility must be licensed. The risk-cost-benefit analysis is set up from the perspective of the owner/operator. It can be used directly by the owner/operator to assess alternative design strategies. It can also be used by the regulatory agency to assess alternative regulatory policy, but only in an indirect manner, by examining the response of an owner/operator to the stimuli of various policies. The objective function is couched in terms of a discounted stream of benefits, costs, and risks over an engineering time horizon. Benefits are in the form of revenues for services provided; costs are those of construction and operation of the facility. Risk is defined as the cost associated with the probability of failure, with failure defined as the occurrence of a groundwater contamination event that violates the licensing requirements established for the facility. Failure requires a breach of the containment structure and contaminant migration through the hydrogeological environment to a compliance surface. The probability of failure can be estimated on the basis of reliability theory for the breach of containment and with a Monte-Carlo finite-element simulation for the advective contaminant transport. In the hydrogeological environment the hydraulic conductivity values are defined stochastically. The probability of failure is reduced by the presence of a monitoring network operated by the owner/operator and located between the source and the regulatory compliance surface. The level of reduction in the probability of failure depends on the probability of detection of the monitoring network, which can be calculated from the stochastic contaminant transport simulations. 
While
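The objective function described above, a discounted stream of benefits minus costs minus risk, with risk defined as failure probability times failure cost, can be written down directly; the numbers in the usage note are invented:

```python
def rcb_objective(benefits, costs, p_fail, cost_fail, rate):
    """Discounted risk-cost-benefit objective from the owner/operator's
    perspective: sum over years t of (B_t - C_t - Pf_t * Cf) / (1+i)^t,
    discounting from the end of year 1."""
    return sum((b - c - pf * cost_fail) / (1.0 + rate) ** (t + 1)
               for t, (b, c, pf) in enumerate(zip(benefits, costs, p_fail)))
```

For example, two years of revenue 100, cost 20, annual failure probability 0.01 and failure cost 1000 give 140 undiscounted; a monitoring network that lowers `p_fail` raises the objective, which is exactly the design lever the paper analyzes.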

  11. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and arise from interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to affect these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25 km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model (GCM) simulations, downscaled at 25 km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of spatio-temporally consistent weather sequences, constrained by boundary forcings such as ocean state, greenhouse gases, and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its
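Extracting drought events from long model runs amounts to scanning for sustained below-threshold sequences. A minimal run-length sketch (the threshold semantics and minimum duration are assumptions, not MaRIUS's actual event definition):

```python
def extract_drought_events(series, threshold, min_len=3):
    """Return (start, end, severity) for each run where the series stays
    below threshold for at least min_len steps; severity is the cumulative
    deficit relative to the threshold."""
    events, start = [], None
    for i, v in enumerate(list(series) + [float('inf')]):  # sentinel flush
        if v < threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                deficit = sum(threshold - series[j] for j in range(start, i))
                events.append((start, i - 1, deficit))
            start = None
    return events
```

Applied to tens of thousands of model years, the resulting event catalogue supports empirical return-period estimates for droughts far rarer than anything in the observed record.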

  12. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. 
The performance
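A conceptual sketch of serving reliability-differentiated requests from a probabilistic forecast: each user states a quantity, a reliability, and a price; supply at a user's reliability level is read off the forecast ensemble, and higher-paying users are served first. This greedy scheme is far simpler than the paper's single-period optimization model, and all data structures are assumptions:

```python
def allocate_contracts(forecast_ensemble, requests):
    """forecast_ensemble: equally likely streamflow ensemble members.
    requests: list of (user, quantity, reliability, price_per_unit).
    Returns {user: granted quantity}."""
    members = sorted(forecast_ensemble)
    n = len(members)

    def supply_at(reliability):
        # supply exceeded with probability >= reliability
        k = min(n - 1, int((1.0 - reliability) * n))
        return members[k]

    allocations, committed = {}, 0.0
    for user, qty, rel, _price in sorted(requests, key=lambda r: -r[3]):
        available = max(0.0, supply_at(rel) - committed)
        grant = min(qty, available)
        allocations[user] = grant
        committed += grant
    return allocations
```

A sharper forecast narrows the ensemble, raising the supply available at high reliability levels, which is why forecast sharpness translates directly into contractable water.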

  13. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. 
We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following
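The separation of capture probabilities into statistical and time-related frequencies can be sketched as two per-cell statistics over an ensemble of transient capture-zone indicators (the three-level nested-list layout is an assumption):

```python
def capture_statistics(indicator):
    """indicator[r][t][c]: 1 if cell c lies in the well's capture zone in
    realization r at time step t, else 0. Returns, per cell:
    - the statistical probability of ever being captured (across realizations)
    - the expected fraction of time spent inside the capture zone."""
    R, T, C = len(indicator), len(indicator[0]), len(indicator[0][0])
    p_ever, t_frac = [0.0] * C, [0.0] * C
    for c in range(C):
        ever = sum(any(indicator[r][t][c] for t in range(T)) for r in range(R))
        inside = sum(indicator[r][t][c] for r in range(R) for t in range(T))
        p_ever[c] = ever / R
        t_frac[c] = inside / (R * T)
    return p_ever, t_frac
```

A cell with high `p_ever` but low `t_frac` is captured only intermittently, which is precisely the extra information transient analysis adds over a steady-state delineation.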

  14. Risk-based and deterministic regulation

    SciTech Connect

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events, which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  15. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J.

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of the cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination: the trade-off between population risk and individual risk, the trade-off between residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions faced by the decision-maker when weighing a number of remedial alternatives. The paper then explores three potential approaches for ranking alternatives: a structured explicit decision analysis, a heuristic approach based on the importance order of the criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to
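A structured explicit ranking of remedial alternatives can be sketched as a normalized weighted sum over cost and risk criteria, with lower values preferred. The alternatives, scores, and weights in the usage note are invented; this stands in for only the simplest of the three ranking approaches the paper discusses:

```python
def rank_alternatives(scores, weights):
    """scores: {alternative: {criterion: value}}, lower values preferred
    (e.g. remediation cost, residual health risk). Min-max normalize each
    criterion, then rank by weighted sum, best (lowest) first."""
    crits = list(weights)
    lo = {c: min(s[c] for s in scores.values()) for c in crits}
    hi = {c: max(s[c] for s in scores.values()) for c in crits}

    def norm(c, v):
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])

    totals = {a: sum(weights[c] * norm(c, s[c]) for c in crits)
              for a, s in scores.items()}
    return sorted(totals, key=totals.get)
```

A probabilistic risk assessment feeds this by supplying a distribution for the risk criterion rather than a point value, e.g. ranking on a chosen percentile.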

  16. Use Of Probabilistic Risk Assessment (PRA) In Expert Systems To Advise Nuclear Plant Operators And Managers

    NASA Astrophysics Data System (ADS)

    Uhrig, Robert E.

    1988-03-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors, and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts or knowledge that has been codified in publications, books, or regulations to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. Nuclear power plants have many redundant systems and can continue to operate when one or more of these systems is disabled or removed from service for maintenance or testing. PRAs provide a means of evaluating the risk to the public associated with the operation of nuclear power plants with components or systems out of service. While the choice of the "source term" and methodology in a PRA may influence the absolute probability and consequences of a core melt, the ratio of two PRA calculations for two configurations of the same plant, carried out on a consistent basis, can readily identify the increase in risk associated with going from one configuration to the other. PRISIM, a personal computer program to calculate the ratio of core melt probabilities described above (based on previously performed PRAs), has been developed under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC). When one or several components are removed from service, PRISIM calculates the ratio of the core melt probabilities. The inference engine of the expert system then uses this ratio and a constant-risk criterion, along with information from its knowledge base (which includes information from the PRA), to advise plant personnel as to what action, if any, should be taken.
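The configuration risk ratio described above can be sketched with a minimal cut set model under the rare-event approximation; setting a component's unavailability to 1 represents its removal from service. The cut sets and numbers in the test are invented, and this is not PRISIM's actual algorithm:

```python
import math

def core_melt_frequency(cut_sets, unavailability):
    """Rare-event approximation: frequency ~ sum over minimal cut sets of
    the product of component unavailabilities in the set."""
    return sum(math.prod(unavailability[c] for c in cs) for cs in cut_sets)

def risk_ratio(cut_sets, unavailability, out_of_service):
    """Ratio of core melt frequency with the listed components out of
    service (unavailability set to 1) to the baseline frequency."""
    degraded = dict(unavailability)
    for c in out_of_service:
        degraded[c] = 1.0
    return (core_melt_frequency(cut_sets, degraded)
            / core_melt_frequency(cut_sets, unavailability))
```

A constant-risk criterion then caps how long the plant may remain in a degraded configuration: the higher the ratio, the shorter the allowable exposure time.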

  17. Effectiveness of chemical amendments for stabilisation of lead and antimony in risk-based land management of soils of shooting ranges.

    PubMed

    Sanderson, Peter; Naidu, Ravi; Bolan, Nanthi

    2015-06-01

    This study aims to examine the effectiveness of amendments for risk-based land management of shooting range soils and to explore the effectiveness of amendments applied to sites with differing soil physicochemical parameters. A series of amendments with differing stabilisation mechanisms were applied to four shooting range soils and aged for 1 year. Chemical stabilisation was monitored by pore water extraction, the toxicity characteristic leaching procedure (TCLP) and the physiologically based extraction test (PBET) over 1 year. The performance of amendments applied in conditions reflecting field application did not match their performance in the batch studies. Pore water-extractable metals were not greatly affected by amendment addition. TCLP-extractable Pb was reduced significantly by amendments, particularly lime and magnesium oxide. Antimony leaching was reduced by red mud but mobilised by some of the other amendments. Bioaccessible Pb measured by PBET increased with time after an initial decrease, due to the presence of metallic fragments in the soil. Amendments were able to reduce bioaccessible Pb by up to 50%. Bioaccessible Sb was not readily reduced by soil amendments. Soil amendments were not equally effective across the four soils. PMID:23807560

  18. State Assistance with Risk-Based Data Management: Inventory and needs assessment of 25 state Class II Underground Injection Control programs. Phase 1

    SciTech Connect

    Not Available

    1992-07-01

    As discussed in Section I of the attached report, state agencies must decide where to direct their limited resources in an effort to make optimum use of their available manpower and address those areas that pose the greatest risk to valuable drinking water sources. The Underground Injection Practices Research Foundation (UIPRF) proposed a risk-based data management system (RBDMS) to provide states with the information they need to effectively utilize staff resources, provide dependable documentation to justify program planning, and enhance environmental protection capabilities. The UIPRF structured its approach to environmental risk management to include data and information from production, injection, and inactive wells in its RBDMS project. Data from each of these well types are critical to the complete statistical evaluation of environmental risk and selected automated functions. This comprehensive approach allows state Underground Injection Control (UIC) programs to effectively evaluate the risk of contaminating underground sources of drinking water, while alleviating the additional work and associated problems that often arise when separate databases are used. CH2M Hill and Digital Design Group, through a DOE grant to the UIPRF, completed an inventory and needs assessment of 25 state Class II UIC programs. The states selected for participation by the UIPRF were generally chosen based on interest and whether an active Class II injection well program was in place. The inventory and needs assessment provided an effective means of collecting and analyzing the interest, commitment, design requirements, utilization, and potential benefits of implementing a RBDMS in individual state UIC programs. Personal contacts were made with representatives from each state to discuss the applicability of a RBDMS in their respective state.

  19. The probabilistic seismic loss model as a tool for portfolio management: the case of Maghreb.

    NASA Astrophysics Data System (ADS)

    Pousse, Guillaume; Lorenzo, Francisco; Stejskal, Vladimir

    2010-05-01

    Although the property insurance market in Maghreb countries does not systematically purchase earthquake cover, Impact Forecasting is developing a new loss model for the calculation of probabilistic seismic risk. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A set of damage functions is then used to convert the modelled ground motion severity into monetary losses. We aim to highlight risk assessment challenges, especially in countries where reliable data are difficult to obtain. The loss model estimates the risk and supports discussion of further risk transfer strategies.
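    The hazard-to-loss chain described in this abstract (Monte Carlo simulation of the hazard, plus damage functions converting ground motion severity into monetary losses) can be sketched as follows. The event set, damage function, and all parameters below are illustrative assumptions, not values from the Impact Forecasting model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stochastic event set: (annual rate, median PGA in g) per source
events = [(0.02, 0.30), (0.05, 0.18), (0.10, 0.10)]

def damage_ratio(pga):
    """Toy damage function: fraction of the insured value lost at a given PGA."""
    return min(1.0, max(0.0, (pga - 0.05) / 0.45))

def simulate_annual_losses(n_years, exposure=1e6):
    """Monte Carlo hazard: Poisson event counts, lognormal ground-motion scatter."""
    losses = np.zeros(n_years)
    for rate, median_pga in events:
        counts = rng.poisson(rate, size=n_years)
        for year in np.nonzero(counts)[0]:
            for _ in range(counts[year]):
                pga = median_pga * rng.lognormal(0.0, 0.5)
                losses[year] += damage_ratio(pga) * exposure
    return losses

losses = simulate_annual_losses(10_000)
aal = losses.mean()                     # average annual loss
loss_100yr = np.quantile(losses, 0.99)  # ~1-in-100-year annual loss
```

    From the simulated annual losses one can read off portfolio metrics such as the average annual loss and exceedance-probability points, which is what makes such a model useful for portfolio management and risk transfer discussions.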

  20. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks, and to propose, evaluate and select measures for flood risk mitigation. Both components of risk can be mapped individually, and each, like the joint estimate of flood risk, is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, assigning to the flood the same return period as the original rainfall in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
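    A minimal sketch of the bivariate idea (sampling correlated flood peaks and volumes through a copula) is shown below. It uses a Gaussian copula with assumed Gumbel and gamma marginals; the study itself may use different copula families and fitted distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, rho = 5000, 0.8   # sample size and assumed peak-volume dependence

# Gaussian copula: correlated normals -> uniforms that preserve the dependence
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

# Assumed marginals (illustrative): Gumbel peaks, gamma volumes
peaks = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=90.0)   # m^3/s
volumes = stats.gamma.ppf(u[:, 1], a=3.0, scale=8.0)         # hm^3

# Joint exceedance of both 100-year marginal quantiles: far more frequent than
# the independence assumption (0.01 * 0.01) would suggest
p100 = stats.gumbel_r.ppf(0.99, loc=300.0, scale=90.0)
v100 = stats.gamma.ppf(0.99, a=3.0, scale=8.0)
joint_exceed = float(np.mean((peaks > p100) & (volumes > v100)))
```

    Sampled peak-volume pairs of this kind can then drive synthetic design hydrographs for the 2D hydraulic model, instead of assigning a single return period to both variables.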

  1. Risk-based configuration control: Application of PSA in improving technical specifications and operational safety

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.

    1992-11-01

    Risk-based configuration control is the management of component configurations using a risk perspective to control risk and assure safety. A configuration, as used here, is a set of component operability statuses that define the state of a nuclear power plant. If the component configurations that have high risk implications do not occur, then the risk from the operation of nuclear power plants would be minimal. The control of component configurations, i.e., the management of component statuses, to minimize the risk from components being unavailable, becomes difficult, because the status of a standby safety system component is often not apparent unless it is tested. Controlling plant configuration from a risk-perspective can provide more direct risk control and also more operational flexibility by allowing looser controls in areas unimportant to risk. Risk-based configuration control approaches can be used to replace parts of nuclear power plant Technical Specifications. With the advances in probabilistic safety assessment (PSA) technology, such approaches to improve Technical Specifications and operational safety are feasible. In this paper, we present an analysis of configuration risks, and a framework for risk-based configuration control to achieve the desired control of risk-significant configurations during plant operation.

  2. Risk-based configuration control: Application of PSA in improving technical specifications and operational safety

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.

    1992-01-01

    Risk-based configuration control is the management of component configurations using a risk perspective to control risk and assure safety. A configuration, as used here, is a set of component operability statuses that define the state of a nuclear power plant. If the component configurations that have high risk implications do not occur, then the risk from the operation of nuclear power plants would be minimal. The control of component configurations, i.e., the management of component statuses, to minimize the risk from components being unavailable, becomes difficult, because the status of a standby safety system component is often not apparent unless it is tested. Controlling plant configuration from a risk-perspective can provide more direct risk control and also more operational flexibility by allowing looser controls in areas unimportant to risk. Risk-based configuration control approaches can be used to replace parts of nuclear power plant Technical Specifications. With the advances in probabilistic safety assessment (PSA) technology, such approaches to improve Technical Specifications and operational safety are feasible. In this paper, we present an analysis of configuration risks, and a framework for risk-based configuration control to achieve the desired control of risk-significant configurations during plant operation.

  3. The future of host cell protein (HCP) identification during process development and manufacturing linked to a risk-based management for their control.

    PubMed

    Bracewell, Daniel G; Francis, Richard; Smales, C Mark

    2015-09-01

    The use of biological systems to synthesize complex therapeutic products has been a remarkable success. However, during product development, great attention must be devoted to defining acceptable levels of impurities that derive from that biological system; heading this list are host cell proteins (HCPs). Recent advances in proteomic analytics have shown how diverse this class of impurities is; as such knowledge and capability grow, inevitable questions have arisen about how thorough current approaches to measuring HCPs are. The fundamental issue is how to adequately measure (and in turn monitor and control) such a large number of protein species (potentially thousands of components) to ensure safe and efficacious products. A rather elegant solution is to use an immunoassay (enzyme-linked immunosorbent assay [ELISA]) based on polyclonal antibodies raised against the host cell (biological system) used to synthesize a particular therapeutic product. However, the measurement is entirely dependent on the antibody serum used, which dictates the sensitivity of the assay and the degree of coverage of the HCP spectrum. It provides one summed analog value for HCP amount; a positive if all HCP components can be considered equal, a negative in the more likely event one associates greater risk with certain components of the HCP proteome. In a thorough risk-based approach, one would wish to be able to account for this. These issues have led to the investigation of orthogonal analytical methods, most prominently mass spectrometry. These techniques can potentially both identify and quantify HCPs. The ability to measure and monitor thousands of proteins proportionally increases the amount of data acquired. Significant benefits exist if the information can be used to determine critical HCPs and thereby create an improved basis for risk management. We describe a nascent approach to risk assessment of HCPs based upon such data, drawing attention to timeliness in relation to biosimilar

  4. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decisionmakers to determine the feasibility and return-on-investment of a new aircraft engine.
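    In contrast to the point-design practice criticized in this abstract, a probabilistic assessment propagates design-variable uncertainty through to the figure of merit. Below is a minimal sketch with a notional engine metric and assumed distributions; it illustrates the idea only and is not the NASA assessment model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Assumed uncertain design variables for a notional engine (illustrative only):
tsfc = rng.normal(0.55, 0.02, n)       # thrust-specific fuel consumption
weight = rng.normal(5000.0, 150.0, n)  # engine weight, lb

# Notional normalized figure of merit (lower is better); the weights are invented
metric = 0.7 * (tsfc / 0.55) + 0.3 * (weight / 5000.0)

# Unlike a single point design, the ensemble yields a quantified probability
# of meeting a performance target, i.e. a measure of technology risk
p_meets_target = float(np.mean(metric <= 1.02))
```

    The resulting probability of meeting the target is exactly the kind of quantified reliability statement that deterministic, safety-factor-based assessments cannot provide to decision makers.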

  5. Coastal cliff recession: the use of probabilistic prediction methods

    NASA Astrophysics Data System (ADS)

    Lee, E. M.; Hall, J. W.; Meadowcroft, I. C.

    2001-10-01

    A range of probabilistic methods is introduced for predicting coastal cliff recession, which provide a means of demonstrating the potential variability in such predictions. They form the basis for risk-based land-use planning, cliff management and engineering decision-making. Examples of probabilistic models are presented for a number of different cliff settings: the simulation of recession on eroding cliffs; the use of historical records and statistical experiments to model the behaviour of cliffs affected by rare, episodic landslide events; the adaptation of an event tree approach to assess the probability of failure of protected cliffs, taking into account the residual life of the existing defences; and the evaluation of the probability of landslide reactivation in areas of pre-existing landslide systems. These methods are based on a geomorphological assessment of the episodic nature of the recession process, together with historical records.
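    The second model type listed (cliffs affected by rare, episodic landslide events) can be illustrated with a simple stochastic sketch; the Poisson event rate and the exponential retreat-per-event distribution below are assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_recession(years=50, event_rate=0.2, mean_retreat=3.0, n_sims=10_000):
    """Cliff-top retreat from episodic landslides: Poisson number of events over
    the planning horizon, exponentially distributed retreat per event (assumed)."""
    n_events = rng.poisson(event_rate * years, size=n_sims)
    return np.array([rng.exponential(mean_retreat, k).sum() for k in n_events])

retreat = simulate_recession()
# A probabilistic prediction, e.g. a 90th-percentile retreat for a setback line,
# conveys the variability that a single deterministic estimate hides.
setback = float(np.quantile(retreat, 0.90))
```

    Quantiles of the simulated retreat distribution are what feed risk-based land-use planning, for example when fixing a development setback line behind a receding cliff top.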

  6. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: the Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling can affect the operational profit and fleet management decisions of the airlines to varying degrees.
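    A toy version of the second stage (a dynamic programming model choosing fleet size against stochastic demand) might look like the following. The demand distribution, revenues and costs are invented; real fleet models carry much richer state (aircraft age, lease terms, multiple aircraft types):

```python
# Toy fleet problem (all numbers invented): choose the fleet size each period;
# demand per period is stochastic with a known discrete distribution.
demand_dist = {1: 0.3, 2: 0.5, 3: 0.2}  # demand level -> probability
profit_per_served = 10.0                # revenue per unit of served demand
cost_per_aircraft = 6.0                 # ownership/lease cost per aircraft-period

def expected_profit(fleet):
    """Expected one-period profit for a given fleet size."""
    return sum(p * (profit_per_served * min(fleet, d) - cost_per_aircraft * fleet)
               for d, p in demand_dist.items())

def solve(periods=3, max_fleet=4):
    """Backward-induction DP; with i.i.d. demand and no carryover state the
    optimal decision is the same each period, which keeps the sketch short."""
    policy, value = [], 0.0
    for _ in range(periods):
        best = max(range(max_fleet + 1), key=expected_profit)
        policy.append(best)
        value += expected_profit(best)
    return policy, value

policy, value = solve()  # -> ([2, 2, 2], 15.0) with these numbers
```

    Holding two aircraft maximizes expected profit here: a third aircraft is idle too often to pay for itself, which is the trade-off a probabilistic DP makes explicit.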

  7. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides, for nuclear power plants which have been subjected to a probabilistic risk assessment (PRA), have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  8. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  9. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  10. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  11. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  12. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  13. A risk-based focused decision-management approach for justifying characterization of Hanford tank waste. June 1996, Revision 1; April 1997, Revision 2

    SciTech Connect

    Colson, S.D.; Gephart, R.E.; Hunter, V.L.; Janata, J.; Morgan, L.G.

    1997-12-31

    This report describes a disciplined, risk-based decision-making approach for determining characterization needs and resolving safety issues during the storage and remediation of radioactive waste stored in Hanford tanks. The strategy recommended uses interactive problem evaluation and decision analysis methods commonly used in industry to solve problems under conditions of uncertainty (i.e., lack of perfect knowledge). It acknowledges that problem resolution comes through both the application of high-quality science and human decisions based upon preferences and sometimes hard-to-compare choices. It recognizes that to firmly resolve a safety problem, the controlling waste characteristics and chemical phenomena must be measurable or estimated to an acceptable level of confidence tailored to the decision being made.

  14. Improving nutrient management practices in agriculture: The role of risk-based beliefs in understanding farmers' attitudes toward taking additional action

    NASA Astrophysics Data System (ADS)

    Wilson, Robyn S.; Howard, Gregory; Burnett, Elizabeth A.

    2014-08-01

    A recent increase in the amount of dissolved reactive phosphorus (DRP) entering the western Lake Erie basin is likely due to increased spring storm events in combination with issues related to fertilizer application and timing. These factors in combination with warmer lake temperatures have amplified the spread of toxic algal blooms. We assessed the attitudes of farmers in northwest Ohio toward taking at least one additional action to reduce nutrient loss on their farm. Specifically, we (1) identified to what extent farm and farmer characteristics (e.g., age, gross farm sales) as well as risk-based beliefs (e.g., efficacy, risk perception) influenced attitudes, and (2) assessed how these characteristics and beliefs differ in their predictive ability based on unobservable latent classes of farmers. Risk perception, or a belief that negative impacts to profit and water quality from nutrient loss were likely, was the most consistent predictor of farmer attitudes. Response efficacy, or a belief that taking action on one's farm made a difference, was found to significantly influence attitudes, although this belief was particularly salient for the minority class of farmers who were older and more motivated by profit. Communication efforts should focus on the negative impacts of nutrient loss to both the farm (i.e., profit) and the natural environment (i.e., water quality) to raise individual perceived risk among the majority, while the minority need higher perceived efficacy or more specific information about the economic effectiveness of particular recommended practices.

  15. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  16. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, this session explores the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  17. A generic probabilistic framework for structural health prognostics and uncertainty management

    NASA Astrophysics Data System (ADS)

    Wang, Pingfeng; Youn, Byeng D.; Hu, Chao

    2012-04-01

    Structural health prognostics can be broadly applied to various engineered artifacts in an engineered system. However, techniques and methodologies for health prognostics become application-specific. This study thus aims at formulating a generic framework of structural health prognostics, which is composed of four core elements: (i) a generic health index system with synthesized health index (SHI), (ii) a generic offline learning scheme using the sparse Bayes learning (SBL) technique, (iii) a generic online prediction scheme using the similarity-based interpolation (SBI), and (iv) an uncertainty propagation map for the prognostic uncertainty management. The SHI enables the use of heterogeneous sensory signals; the sparseness feature employing only a few neighboring kernel functions enables the real-time prediction of remaining useful lives (RULs) regardless of data size; the SBI predicts the RULs with the background health knowledge obtained under uncertain manufacturing and operation conditions; and the uncertainty propagation map enables the predicted RULs to be loaded with their statistical characteristics. The proposed generic framework of structural health prognostics is thus applicable to different engineered systems and its effectiveness is demonstrated with two case studies.
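    The similarity-based interpolation (SBI) element can be illustrated with a kernel-weighted sketch: the query unit's degradation signal is compared to reference units whose remaining useful lives are known. The data, distance measure and Gaussian kernel here are illustrative choices, not the authors' exact formulation:

```python
import numpy as np

def sbi_rul(query, references, reference_ruls, bandwidth=0.2):
    """Similarity-based interpolation sketch: predict remaining useful life (RUL)
    as a Gaussian-kernel-weighted average of reference units' RULs, weighted by
    how closely each reference degradation trajectory matches the query."""
    dists = np.array([np.linalg.norm(query - r) for r in references])
    weights = np.exp(-(dists / bandwidth) ** 2)
    return float(np.sum(weights * reference_ruls) / np.sum(weights))

# Illustrative degradation signals (a health index over three inspections)
refs = np.array([[0.10, 0.20, 0.35], [0.10, 0.25, 0.50], [0.10, 0.30, 0.70]])
ruls = np.array([120.0, 80.0, 40.0])   # known RULs of the reference units
pred = sbi_rul(np.array([0.10, 0.24, 0.48]), refs, ruls)
```

    Because the query trajectory tracks the second reference unit most closely, the prediction lands near that unit's RUL; the kernel weights are also a natural entry point for attaching uncertainty to the estimate.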

  18. Converting probabilistic tree species range shift projections into meaningful classes for management.

    PubMed

    Hanewinkel, Marc; Cullmann, Dominik A; Michiels, Hans-Gerd; Kändler, Gerald

    2014-02-15

    The paper deals with the management problem of how to decide on tree species suitability under changing environmental conditions. It presents an algorithm that classifies the output of a range shift model for major tree species in Europe into multiple classes that can be linked to qualities characterizing the ecological niche of the species. The classes: i) Core distribution area, ii) Extended distribution area, iii) Occasional occurrence area, and iv) No occurrence area are first theoretically developed and then statistically described. The classes are interpreted from an ecological point of view using criteria like population structure, competitive strength, site spectrum and vulnerability to biotic hazards. The functioning of the algorithm is demonstrated using the example of a generalized linear model that was fitted to a pan-European dataset of presence/absence of major tree species with downscaled climate data from a General Circulation Model (GCM). Applications of the algorithm to tree species suitability classification on a European and regional level are shown. The thresholds that are used by the algorithm are precision-based and include Cohen's Kappa. A validation of the algorithm using an independent dataset of the German National Forest Inventory shows good accordance of the statistically derived classes with ecological traits for Norway spruce, while the differentiation, especially between core and extended distribution, is less accurate for European beech, which is in the centre of its natural range in this area. We hypothesize that for species in the core of their range, regional factors like forest history superimpose climatic factors. Problems of uncertainty arising from potentially applying a multitude of modelling approaches and/or climate realizations within the range shift model are discussed, and a way to deal with the uncertainty by revealing the decision maker's underlying attitude towards risk is proposed. PMID:24486469
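    Two ingredients of such an algorithm, an agreement statistic (Cohen's Kappa is mentioned) and probability thresholds that cut the modelled occurrence probability into the four classes, can be sketched as follows. The threshold values are placeholders, not the precision-based thresholds derived in the paper:

```python
import numpy as np

def cohens_kappa(obs, pred):
    """Cohen's kappa for binary presence/absence vectors."""
    obs, pred = np.asarray(obs), np.asarray(pred)
    po = np.mean(obs == pred)                      # observed agreement
    pe = (np.mean(obs) * np.mean(pred)
          + np.mean(1 - obs) * np.mean(1 - pred))  # chance agreement
    return (po - pe) / (1 - pe)

def classify(prob, t_core=0.7, t_ext=0.4, t_occ=0.1):
    """Cut a modelled occurrence probability into the four suitability classes;
    the thresholds here are placeholders, not the published precision-based ones."""
    if prob >= t_core:
        return "core"
    if prob >= t_ext:
        return "extended"
    if prob >= t_occ:
        return "occasional"
    return "no occurrence"

obs = [1, 1, 0, 0, 1, 0, 1, 0]    # observed presence/absence (toy data)
pred = [1, 1, 0, 1, 1, 0, 0, 0]   # model prediction at some threshold
kappa = cohens_kappa(obs, pred)
```

    In practice the thresholds would be tuned so that statistics like kappa are maximized against inventory data, which is what makes the resulting classes defensible for management.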

  19. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss whether we indeed make better decisions on the basis of probabilistic forecasts.

  20. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  1. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  2. Overview of the co-ordinated risk-based approach to science and management response and recovery for the 2012 eruptions of Tongariro volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, G. E.; Keys, H. J. R.; Procter, J. N.; Deligne, N. I.

    2014-10-01

    Tongariro volcano, New Zealand, lies wholly within the Tongariro National Park (TNP), one of New Zealand's major tourist destinations. Two small eruptions of the Te Maari vents on the northern flanks of Tongariro on 6 August 2012 and 21 November 2012 each produced a small ash cloud to < 8 km height accompanied by pyroclastic density currents and ballistic projectiles. The most popular day hike in New Zealand, the Tongariro Alpine Crossing (TAC), runs within 2 km of the Te Maari vents. The larger of the two eruptions (6 August 2012) severely impacted the TAC and resulted in its closure, impacting the local economy and potentially national tourism. In this paper, we document the science and risk management response to the eruption, and detail how quantitative risk assessments were applied in a rapidly evolving situation to inform robust decision-making about when the TAC would be re-opened. The volcanologist and risk manager partnership highlights the value of open communication between scientists and stakeholders during a response to, and subsequent recovery from, a volcanic eruption.

  3. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    SciTech Connect

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.

  4. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subjected to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have developed since 1988 a database built by the field teams, which specifically identifies all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, from the emergency services and the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two bases is not negligible and is a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration and height of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, floods observed in the database are compared spatially, using a GIS, with flooding from sewer network modeling (using the software Canoe). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flooding risk.
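    The rainfall interpolation step (ordinary kriging over the rain-gauge network) can be sketched in a few lines. The exponential variogram and its parameters below are illustrative, not fitted to the Lyon network, and the gauge layout is a toy example:

```python
import numpy as np

def ordinary_kriging(xy, values, targets, sill=1.0, vrange=5.0, nugget=0.01):
    """Minimal ordinary kriging with an exponential semivariogram."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / vrange))

    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Ordinary kriging system: semivariances plus a Lagrange-multiplier row/column
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    np.fill_diagonal(A[:n, :n], 0.0)  # gamma(0) = 0 on the diagonal
    A[n, n] = 0.0
    est = []
    for t in targets:
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(xy - t, axis=1))
        w = np.linalg.solve(A, b)[:n]  # weights sum to 1 by construction
        est.append(float(w @ values))
    return np.array(est)

gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rain_mm = np.array([12.0, 20.0, 16.0, 24.0])  # one episode's totals (toy data)
est = ordinary_kriging(gauges, rain_mm, np.array([[5.0, 5.0]]))
```

    In a real application the variogram would be fitted to the episode's gauge data before interpolating the rainfall field onto the grid used for comparison with the flood observations.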

  5. Risk-based system refinement

    SciTech Connect

    Winter, V.L.; Berg, R.S.; Dalton, L.J.

    1998-06-01

    When designing a high consequence system, considerable care should be taken to ensure that the system can not easily be placed into a high consequence failure state. A formal system design process should include a model that explicitly shows the complete state space of the system (including failure states) as well as those events (e.g., abnormal environmental conditions, component failures, etc.) that can cause a system to enter a failure state. In this paper the authors present such a model and formally develop a notion of risk-based refinement with respect to the model.

  6. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  7. Risk Based Security Management at Research Reactors

    SciTech Connect

    Ek, David R.

    2015-09-01

    This presentation provides a background of what led to the international emphasis on nuclear security and describes how nuclear security is effectively implemented so as to preserve the societal benefits of nuclear and radioactive materials.

  8. The Evidence for a Risk-Based Approach to Australian Higher Education Regulation and Quality Assurance

    ERIC Educational Resources Information Center

    Edwards, Fleur

    2012-01-01

    This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…

  9. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  10. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.
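The target-probability step described above reduces, in its simplest form, to dividing an allocated risk target by the consequence term (the conditional core damage probability given rupture). A minimal sketch with hypothetical numbers, not the Task Force's actual procedure:

```python
def target_rupture_probability(risk_target, conditional_core_damage_prob):
    """Rupture probability that keeps a component's risk contribution
    (rupture probability x conditional core damage probability given
    rupture) at or below the allocated risk target."""
    return risk_target / conditional_core_damage_prob

# Hypothetical: allocated risk target of 1e-7 per year; rupture of this
# component leads to core damage with conditional probability 1e-3.
p_target = target_rupture_probability(1e-7, 1e-3)
print(p_target)
```

Inspection intervals would then be chosen (via SRRA calculations) so that the component's rupture probability stays below `p_target`.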

  11. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
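The Green's function summation the abstract describes is linear superposition: each subfault's precomputed unit-slip waveform is scaled by its slip and the results are summed. A minimal sketch (function name and the tiny waveforms are illustrative):

```python
def synthesize_waveform(subfault_waveforms, slips):
    """Superpose precomputed unit-slip subfault waveforms, each weighted
    by its slip, to get the tsunami waveform for a slip distribution."""
    n_samples = len(subfault_waveforms[0])
    total = [0.0] * n_samples
    for waveform, slip in zip(subfault_waveforms, slips):
        for i, amplitude in enumerate(waveform):
            total[i] += slip * amplitude
    return total

# Two subfaults, unit-slip waveforms sampled at one coastal point:
unit_waveforms = [[0.0, 0.25, 0.5, 0.25],
                  [0.0, 0.0, 0.5, 1.0]]
print(synthesize_waveform(unit_waveforms, [2.0, 1.5]))
```

Because the expensive wave propagation is done once per subfault, any number of slip scenarios can then be evaluated at negligible cost, which is what makes the probabilistic calculation tractable.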

  12. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering. PMID:25500464
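The trade-off at the heart of the paper's formulation, development benefit against expected flood damage, can be illustrated with a toy enumeration (the paper itself uses a linear program; the allocations, benefits, and flood probability below are hypothetical):

```python
def net_benefit(allocation, flood_probability):
    """Development benefit minus expected (probability-weighted) flood damage."""
    return allocation["benefit"] - flood_probability * allocation["damage_if_flooded"]

allocations = [
    {"name": "dense development",  "benefit": 100.0, "damage_if_flooded": 400.0},
    {"name": "mixed with channel", "benefit": 70.0,  "damage_if_flooded": 100.0},
    {"name": "green buffer",       "benefit": 40.0,  "damage_if_flooded": 20.0},
]

p_flood = 0.2  # hypothetical annual flood probability
best = max(allocations, key=lambda a: net_benefit(a, p_flood))
print(best["name"])  # mixed with channel
```

With these numbers the highest gross benefit does not win: the expected damage term pushes the optimum toward the mixed allocation, which mirrors the paper's finding that flexible zoning can improve economic returns.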

  13. Risk Based Inspection Pilot Study of Ignalina Nuclear Power Plant,Unit 2

    SciTech Connect

    Brickstad, Bjorn; Letzter, Adam; Klimasauskas, Arturas; Alzbutas, Robertas; Nedzinskas, Linas; Kopustinskas, Vytis

    2002-07-01

    A project with the acronym IRBIS (Ignalina Risk Based Inspection pilot Study) has been performed with the objective to perform a quantitative risk analysis of a total of 1240 stainless steel welds in Ignalina Nuclear Power Plant, unit 2 (INPP-2). The damage mechanism is IGSCC and the failure probabilities are quantified by using probabilistic fracture mechanics. The conditional core damage probabilities are taken from the plant PSA. (authors)

  14. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it is motivated by economic values only and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with these situations and responses were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the final decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. 
The more lead-time there is in
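The cost-loss rule stated in this abstract reduces to a one-line comparison: act when the cost-loss ratio does not exceed the forecast probability. A minimal sketch (function name and figures are illustrative):

```python
def should_act(cost, loss, flood_probability):
    """Cost-loss rule: issue a warning / take mitigation measures when
    the ratio of response cost to avoidable loss is <= the forecast
    probability of the flood event."""
    if loss <= 0:
        raise ValueError("avoidable loss must be positive")
    return cost / loss <= flood_probability

# Mitigation costs 20 (units) and would avoid 100 of damage -> ratio 0.2,
# so act whenever the forecast flood probability is at least 0.2:
print(should_act(20, 100, 0.35))  # True
print(should_act(20, 100, 0.10))  # False
```

The staged approach the authors propose would apply this comparison repeatedly to smaller, incremental actions as lead time shrinks and the forecast probability sharpens, rather than once to a single all-or-nothing decision.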

  15. Risk based inspection for atmospheric storage tank

    NASA Astrophysics Data System (ADS)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack that occurs on a metallic material as a result of its reaction with the environment. It causes atmospheric storage tank leakage, material loss, environmental pollution, and equipment failure, affects the age of process equipment, and finally leads to financial damage. Corrosion risk measurement becomes a vital part of asset management at a plant operating any aging asset. This paper provides six case studies dealing with high-speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries is given prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, is discussed. The corrosion risk analysis outcome was used to formulate a Risk Based Inspection (RBI) method that should be part of the atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on 'High Risk' and 'Medium Risk' items and less on the 'Low Risk' shell. Risk categories of the evaluated equipment are illustrated through the case study analysis outcomes.
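Semi-quantitative RBI of this kind typically combines a likelihood category and a consequence category through a risk matrix. A minimal sketch; the 3x3 matrix below is illustrative, not the actual API 581 matrix:

```python
# Illustrative likelihood x consequence risk matrix (not from API 581).
RISK_MATRIX = {
    ("low", "low"): "Low Risk",     ("low", "medium"): "Low Risk",      ("low", "high"): "Medium Risk",
    ("medium", "low"): "Low Risk",  ("medium", "medium"): "Medium Risk", ("medium", "high"): "High Risk",
    ("high", "low"): "Medium Risk", ("high", "medium"): "High Risk",    ("high", "high"): "High Risk",
}

def risk_category(likelihood, consequence):
    """Map a (likelihood, consequence) pair to a risk category."""
    return RISK_MATRIX[(likelihood, consequence)]

print(risk_category("high", "medium"))  # High Risk
```

Inspection resources are then allocated by category, most to 'High Risk' items, least to 'Low Risk' ones, which is the prioritization the paper describes.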

  16. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
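The risk-based planning idea, choosing mitigation actions so expected success is maximized within budget, can be caricatured with a greedy benefit-cost selection. This is a hedged sketch of the general concept only; DDP's actual optimization, and its action data, are more elaborate:

```python
def plan(actions, budget):
    """Greedily pick mitigation actions by risk-reduction-per-cost ratio
    until the budget is exhausted. Illustrative stand-in for a real
    risk-based planning optimizer."""
    chosen, spent = [], 0.0
    for name, cost, risk_reduction in sorted(
            actions, key=lambda a: a[2] / a[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical actions: (name, cost, expected risk reduction)
actions = [("review", 10.0, 30.0), ("test suite", 40.0, 60.0), ("redesign", 80.0, 70.0)]
print(plan(actions, 60.0))  # ['review', 'test suite']
```

Greedy selection is not optimal in general (this is a knapsack-type problem), but it shows how costs and risk-mitigation benefits enter the same trade-off the abstract describes.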

  17. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology. PMID:26922500
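The α-cut discretization the MFFP approach builds on has a simple closed form for triangular fuzzy numbers: at level α, the fuzzy number (a, m, b) cuts to the interval [a + α(m − a), b − α(b − m)]. A minimal sketch with illustrative numbers:

```python
def alpha_cut(a, m, b, alpha):
    """alpha-cut interval of a triangular fuzzy number with support [a, b]
    and peak m, at membership level alpha in [0, 1]."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# A fuzzy water-availability parameter "about 15, between 10 and 22":
print(alpha_cut(10.0, 15.0, 22.0, 0.5))  # (12.5, 18.5)
print(alpha_cut(10.0, 15.0, 22.0, 1.0))  # (15.0, 15.0)
```

Higher α expresses a more confident (narrower) reading of the fuzzy parameter; evaluating the optimization at several α levels is what lets MFFP reflect different decision-maker attitudes.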

  18. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs to the society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage seeking cost-effective preventive measures, and emphasize the importance of including the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills as well as to other fields that need to cope with the challenging combination of low probabilities, high losses and major uncertainties. PMID:25983196
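Stripped of the Bayesian network machinery, the CBA logic is: a preventive measure is efficient when its expected benefit (spill probability times the avoided losses, including waste treatment and environmental values) exceeds its cost. A hedged sketch with entirely hypothetical figures:

```python
def expected_net_benefit(p_spill, avoided_losses, measure_cost):
    """Expected benefit of a preventive measure (probability-weighted sum
    of avoided loss components) minus the cost of the measure."""
    return p_spill * sum(avoided_losses.values()) - measure_cost

# Hypothetical annual loss components (monetary units) a measure would avoid:
avoided = {"combating_at_sea": 40.0, "shoreline_cleanup": 90.0,
           "waste_treatment": 60.0, "environmental_values": 110.0}
print(expected_net_benefit(p_spill=0.04, avoided_losses=avoided, measure_cost=9.0))
```

Note how omitting the waste treatment and environmental components, the costs the paper stresses, would flip this example's conclusion, since the remaining avoided losses alone would not justify the measure.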

  19. Potential advantages associated with implementing a risk-based inspection program by a nuclear facility

    NASA Astrophysics Data System (ADS)

    McNeill, Alexander, III; Balkey, Kenneth R.

    1995-05-01

    The current inservice inspection activities at a U.S. nuclear facility are based upon the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI. The Code selects examination locations based upon sampling criteria that include component geometry, stress, and usage, among others. This can result in a significant number of required examinations. As a result of regulatory action, each nuclear facility has conducted probabilistic risk assessments (PRA) or individual plant examinations (IPE), producing plant-specific risk-based information. Several initiatives have been introduced to apply this new plant risk information. Among these initiatives is risk-based inservice inspection. A code case has been introduced for piping inspections based upon this new risk-based technology. This effort, brought forward to the ASME Section XI Code committee, has been initiated and championed by the ASME Research Task Force on Risk-Based Inspection Guidelines -- LWR Nuclear Power Plant Application. Preliminary assessments associated with the code case have revealed that a risk-based inservice inspection program offers potential advantages with regard to the number of exams, risk, personnel exposure, and cost.

  20. Risk-based inservice testing program modifications at Palo Verde nuclear generating station

    SciTech Connect

    Knauf, S.; Lindenlaub, B.; Linthicum, R.

    1996-12-01

    Arizona Public Service Company (APS) is investigating changes to the Palo Verde Inservice Testing (IST) Program that are intended to result in the reduction of the required test frequency for various valves in the American Society of Mechanical Engineers (ASME) Section XI IST program. The analytical techniques employed to select candidate valves and to demonstrate that these frequency reductions are acceptable are risk based. The results of the Palo Verde probabilistic risk assessment (PRA), updated in June 1994, and the risk significant determination performed as part of the implementation efforts for 10 CFR 50.65 (the maintenance rule) were used to select candidate valves for extended test intervals. Additional component level evaluations were conducted by an 'expert panel.' The decision to pursue these changes was facilitated by the ASME Risk-Based Inservice Testing Research Task Force, in which Palo Verde is participating as a pilot plant. The NRC's increasing acceptance of cost beneficial licensing actions and risk-based submittals also provided incentive to seek these changes. Arizona Public Service is pursuing the risk-based IST program modification in order to reduce the unnecessary regulatory burden of the IST program through qualitative and quantitative analysis consistent with maintaining a high level of plant safety. The objectives of this project at Palo Verde are as follows: (1) Apply risk-based technologies to IST components to determine their risk significance (i.e., high or low). (2) Apply a combination of deterministic and risk-based methods to determine appropriate testing requirements for IST components including improvement of testing methods and frequency intervals for high-risk significant components. (3) Apply risk-based technologies to high-risk significant components identified by the 'expert panel' and outside of the IST program to determine whether additional testing requirements are appropriate.

  1. A probabilistic and multi-objective conceptual design methodology for the evaluation of thermal management systems on air-breathing hypersonic vehicles

    NASA Astrophysics Data System (ADS)

    Ordaz, Irian

    This thesis addresses the challenges associated with thermal management systems (TMS) evaluation and selection in the conceptual design of hypersonic, air-breathing vehicles with sustained cruise. The proposed methodology identifies analysis tools and techniques which allow the proper investigation of the design space for various thermal management technologies. The design space exploration environment and alternative multi-objective decision making technique defined as Pareto-based Joint Probability Decision Making (PJPDM) is based on the approximation of 3-D Pareto frontiers and probabilistic technology effectiveness maps. These are generated through the evaluation of a Pareto Fitness function and Monte Carlo analysis. In contrast to Joint Probability Decision Making (JPDM), the proposed PJPDM technique does not require preemptive knowledge of weighting factors for competing objectives or goal constraints which can introduce bias into the final solution. Preemptive bias in a complex problem can degrade the overall capabilities of the final design. The implementation of PJPDM in this thesis eliminates the need for the numerical optimizer which is required with JPDM in order to improve upon a solution. In addition, a physics-based formulation is presented for the quantification of TMS safety effectiveness corresponding to debris impact/damage and how it can be applied towards risk mitigation. Lastly, a formulation loosely based on non-preemptive Goal Programming with equal weighted deviations is provided for the resolution of the inverse design space. This key step helps link vehicle capabilities to TMS technology subsystems in a top-down design approach. The methodology provides the designer more knowledge up front to help make proper engineering decisions and assumptions in the conceptual design phase regarding which technologies show greatest promise, and how to guide future technology research.
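The Pareto-frontier approximation underlying PJPDM rests on the standard dominance test: a design is on the frontier if no other design is at least as good in every objective and strictly better in one. A minimal sketch (minimization assumed; the three-objective design points are illustrative):

```python
def dominates(a, b):
    """True if design a Pareto-dominates design b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical (mass, heat load, risk) evaluations of three TMS candidates:
designs = [(2.0, 5.0, 1.0), (3.0, 5.0, 2.0), (1.0, 7.0, 4.0)]
frontier = [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]
print(frontier)
```

Keeping the whole non-dominated set, rather than collapsing objectives through preemptive weighting factors, is precisely how the approach avoids the bias the thesis attributes to JPDM.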

  2. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, covering risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenge presented in using a risk-based decision-making process is to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgments in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  3. Probabilistic comparison of alternative characterization technologies at the Fernald Uranium-in-Soils Integrated Demonstration Project

    SciTech Connect

    Rautman, C.A.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.; Kaplan, P.G.

    1993-12-31

    The performance of four alternative characterization technologies proposed for use in characterization of surficial uranium contamination in soil at the Incinerator and Drum Baling Areas at the Fernald Environmental Management Project in southwestern Ohio has been evaluated using a probabilistic, risk-based decision-analysis methodology. The basis of comparison is to minimize a computed total cost for environmental cleanup. This total-cost-based approach provides a framework for evaluating the trade-offs among remedial investigation, the remedial design, and the risk of regulatory penalties. The approach explicitly recognizes the value of information provided by remedial investigation; additional measurements are only valuable to the extent that the information they provide reduces total cost.
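The total-cost comparison described above, where additional measurements are valuable only if the information they provide reduces total cost, can be sketched as follows (the cost figures and the flat penalty model are hypothetical, not from the Fernald study):

```python
def total_cost(investigation, remediation, p_noncompliance, penalty):
    """Total expected cleanup cost: characterization + remediation +
    expected regulatory penalty."""
    return investigation + remediation + p_noncompliance * penalty

# Without extra sampling: cheaper up front, but poorly targeted cleanup
# leaves a higher chance of missing contamination and incurring penalties.
baseline = total_cost(0.0, 120.0, 0.30, 500.0)
# With sampling: pay for characterization, remediate less, miss less.
with_sampling = total_cost(25.0, 100.0, 0.05, 500.0)
print(with_sampling < baseline)  # True
```

The measurement campaign is worthwhile here exactly because its cost (25) is smaller than the reduction it buys in remediation and expected penalty, which is the value-of-information argument the abstract makes.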

  4. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
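Risk-based test prioritization of the kind described here amounts to ranking classes by a risk score combining a complexity-derived failure likelihood with failure impact. A hedged sketch; the complexity-to-likelihood mapping and the class data are illustrative, not the methodology of the paper:

```python
def risk_score(cyclomatic_complexity, impact):
    """Crude risk proxy: failure likelihood grows with complexity
    (capped at 1.0), scaled by the impact of a failure."""
    likelihood = min(cyclomatic_complexity / 50.0, 1.0)
    return likelihood * impact

# Hypothetical classes: (name, cyclomatic complexity, failure impact)
classes = [
    ("PaymentProcessor", 42, 10.0),
    ("ReportFormatter", 12, 2.0),
    ("AuthManager", 25, 9.0),
]
ranked = sorted(classes, key=lambda c: risk_score(c[1], c[2]), reverse=True)
print([name for name, _, _ in ranked])
```

Testing effort then flows to the top of the ranking first, so the most failure-prone, highest-impact classes are exercised earliest.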

  5. Challenges in using probabilistic climate change information for impact assessments: an example from the water sector.

    PubMed

    New, Mark; Lopez, Ana; Dessai, Suraje; Wilby, Rob

    2007-08-15

    Climate change impacts and adaptation assessments have traditionally adopted a scenario-based approach, which precludes an assessment of the relative risks of particular adaptation options. Probabilistic impact assessments, especially if based on a thorough analysis of the uncertainty in an impact forecast system, enable adoption of a risk-based assessment framework. However, probabilistic impacts information is conditional and will change over time. We explore the implications of a probabilistic end-to-end risk-based framework for climate impacts assessment, using the example of water resources in the Thames River, UK. We show that a probabilistic approach provides more informative results that enable the potential risk of impacts to be quantified, but that details of the risks are dependent on the approach used in the analysis. PMID:17569650

  6. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. 
Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  7. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
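The probability-of-failure computation at the heart of such tools can be sketched with a minimal Monte Carlo example (an illustration only, not NESSUS): a limit state g = R − S with normally distributed strength R and load S, where failure means g < 0, cross-checked against the closed-form answer via the reliability index.

```python
import math
import random

def prob_failure_mc(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0
                for _ in range(n))
    return fails / n

def prob_failure_exact(mu_r, sd_r, mu_s, sd_s):
    """Analytic check: g is normal with mean mu_r - mu_s, var sd_r^2 + sd_s^2."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)  # reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)

pf_mc = prob_failure_mc(50.0, 5.0, 30.0, 4.0)
pf_exact = prob_failure_exact(50.0, 5.0, 30.0, 4.0)
```

Methods such as the advanced mean value and adaptive importance sampling algorithms mentioned in the record exist precisely because brute-force sampling like this becomes expensive when the failure probability is very small.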

  9. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  10. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  11. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  12. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  13. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  14. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  15. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  16. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  17. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  18. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  19. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to pure hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood-prone areas and assets at risk in the protected areas is rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two-dimensional flood spreading simulation and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow; • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damage to critical infrastructure or the economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? This presentation outlines the additional required workflows towards risk-based flood
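Probabilistic failure analysis of defence sections typically rests on fragility curves: failure probability as a function of load (here, water level). A minimal sketch (illustrative points, not FEWS-Risk code) interpolates a section's failure probability at a forecast peak level.

```python
# Hypothetical fragility curve for one defence section:
# (water level in m, probability of failure at that level).
fragility = [(2.0, 0.0), (3.0, 0.02), (3.5, 0.15), (4.0, 0.60), (4.5, 1.0)]

def p_failure(level_m):
    """Linearly interpolate the failure probability at a given water level."""
    if level_m <= fragility[0][0]:
        return 0.0
    if level_m >= fragility[-1][0]:
        return 1.0
    for (x0, p0), (x1, p1) in zip(fragility, fragility[1:]):
        if x0 <= level_m <= x1:
            return p0 + (p1 - p0) * (level_m - x0) / (x1 - x0)

p = p_failure(3.75)  # failure probability at a forecast peak of 3.75 m
```

Combining such curves with a forecast water-level ensemble yields the "location, timing and probability of failure" information listed in the abstract.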

  20. Risk based tiered approach (RBTASM) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTASM) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTASM in pollution prevention alternative selection required higher-tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTASM for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTASM is considered along with product performance, regulatory, and cost information by risk managers when downselecting alternatives for implementation or further analysis. PMID:9433667

  1. 12 CFR 1750.13 - Risk-based capital level computation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... section to the Enterprise. (2) Management and Operations Risk. To provide for management and operations... Section 1750.13 Banks and Banking OFFICE OF FEDERAL HOUSING ENTERPRISE OVERSIGHT, DEPARTMENT OF HOUSING... Enterprise at least quarterly by applying the risk-based capital test described in appendix A to this...

  2. Application of risk-based methods to inservice inspection of piping systems

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; Oswald, E.; West, R.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice inspection (ISI) programs for piping systems in nuclear power plants. This paper discusses a pilot application of these methods to the inservice inspection of piping systems of Northeast Utilities Millstone Unit 3 nuclear power station. This demonstration study, which has been sponsored by the Westinghouse Owners Group (WOG), applies probabilistic safety assessment (PSA) models that have already been developed to meet regulatory requirements for an individual plant examination (IPE). The approach calculates the relative importance for each component within the systems of interest. This risk-importance is based on the frequency of core damage resulting from the structural failure of the component. The process inductively determines the effects that such failures have on the desired operational characteristics of the system being analyzed. Structural reliability/risk assessment (SRRA) models based on probabilistic structural mechanics methods are used to estimate failure probabilities for important components. Locations within a system with varying failure probabilities can be defined to focus ISI resources. This paper will discuss the above process and results to show that application of risk-based methods in the development of ISI programs can potentially result in significant savings while maintaining a high level of safety.
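The risk-importance ranking described above can be sketched in a few lines; the segment names and numbers below are hypothetical, not from the Millstone study — they only illustrate ranking components by their core-damage frequency contribution.

```python
# Hypothetical pipe segments: (annual structural failure frequency,
# conditional core-damage probability given that segment's failure).
segments = {
    "RHR-12": (2e-4, 1e-3),
    "SI-07":  (5e-5, 2e-2),
    "CVC-03": (1e-3, 1e-5),
}

# Core-damage frequency contribution per segment (per year):
# failure frequency x conditional core-damage probability.
cdf_contrib = {name: f * p for name, (f, p) in segments.items()}

# Rank segments so ISI resources can be focused on the highest-risk locations.
ranked = sorted(cdf_contrib, key=cdf_contrib.get, reverse=True)
```

Note that the ranking need not track raw failure probability: here the segment with the highest failure frequency (CVC-03) ranks last because its failures matter least to core damage.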

  3. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  4. Expert system development for probabilistic load simulation

    NASA Technical Reports Server (NTRS)

    Ho, H.; Newell, J. F.

    1991-01-01

    A knowledge-based system, LDEXPT, using the intelligent data base paradigm was developed for the Composite Load Spectra (CLS) project to simulate the probabilistic loads of a space propulsion system. The knowledge-base approach provides a systematic framework for organizing the load information and facilitates the coupling of numerical processing and symbolic (information) processing. It provides an incremental development environment for building generic probabilistic load models and bookkeeping the associated load information. A large volume of load data is stored in the data base and can be retrieved and updated by a built-in data base management system. The data base system standardizes the data storage and retrieval procedures. It helps maintain data integrity and avoid data redundancy. The intelligent data base paradigm provides ways to build expert system rules for shallow and deep reasoning and thus provides expert knowledge to help users obtain the required probabilistic load spectra.

  5. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
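As a minimal illustration of a numerical routine that reports an uncertainty alongside its answer, plain Monte Carlo quadrature returns a standard error with its estimate. This is a toy example in the spirit of the paper, not an implementation of the Bayesian methods it surveys.

```python
import math
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b], returning
    (estimate, standard_error): the routine quantifies its own precision."""
    rng = random.Random(seed)
    vals = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    estimate = (b - a) * mean
    std_err = (b - a) * math.sqrt(var / n)
    return estimate, std_err

est, se = mc_integrate(math.sin, 0.0, math.pi)  # exact integral is 2
```

A downstream computation can propagate `se` rather than treating `est` as exact, which is the basic practice the paper argues should extend to all numerical methods.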

  6. Performance- and risk-based regulation

    SciTech Connect

    Sauter, G.D.

    1994-12-31

    Risk-based regulation (RBR) and performance-based regulation (PBR) are two relatively new concepts for the regulation of nuclear reactor power plants by the U.S. Nuclear Regulatory Commission (NRC). Although RBR and PBR are often considered to be somewhat equivalent, they, in fact, address two fundamentally different regulatory questions. To fruitfully discuss these two concepts, it is important to recognize what each entails. This paper identifies those two fundamental questions and discusses how they are addressed by RBR and PBR.

  7. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
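The weight-and-posterior calculation described above can be sketched in the Fellegi-Sunter style; the m/u probabilities and the prior below are illustrative values, not figures from the article.

```python
import math

# Illustrative per-field parameters: m = P(field agrees | true match),
# u = P(field agrees | non-match).
fields = {
    "surname":       {"m": 0.95, "u": 0.01},
    "date_of_birth": {"m": 0.98, "u": 0.003},
    "postcode":      {"m": 0.90, "u": 0.05},
}

def match_weight(agreements):
    """Sum of log2 likelihood ratios over fields (agreement or disagreement)."""
    w = 0.0
    for name, p in fields.items():
        if agreements[name]:
            w += math.log2(p["m"] / p["u"])
        else:
            w += math.log2((1.0 - p["m"]) / (1.0 - p["u"]))
    return w

def posterior(weight, prior):
    """Convert a match weight into P(match | data) via Bayes' theorem,
    working on the odds scale."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * 2.0 ** weight
    return post_odds / (1.0 + post_odds)

w = match_weight({"surname": True, "date_of_birth": True, "postcode": False})
p = posterior(w, prior=1e-4)  # prior: fraction of candidate pairs that truly match
```

Even a large positive weight can leave the posterior well below certainty when the prior is tiny, which is why record pairs are usually reviewed against thresholds rather than accepted outright.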

  8. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  9. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as metropolitan cities. This paper deals with a microcell prediction model of propagation loss that uses probabilistic techniques. The RSL (Received Signal Level) is the factor by which the performance of a microcell can be evaluated, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to obtain the parameters of the distribution. This probabilistic solution gives a better measure of the performance factors. In addition, it enables the probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range, and so on. In particular, the probabilistic optimization techniques can themselves be applied to real-world problems such as computer networking, human resources, and manufacturing processes.
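A minimal sketch of the CLT-based reasoning (illustrative link-budget numbers, not the paper's model): the sum of several independent blockage losses is treated as approximately normal, which gives a closed-form probability that the RSL clears a receiver threshold.

```python
import math

# Hypothetical link budget (all dB values are illustrative).
tx_power_dbm = 30.0
path_loss_db = 110.0

# Independent per-obstruction blockage losses; by the CLT their sum
# is approximately normal with summed means and summed variances.
blockage_means = [2.0, 1.5, 3.0, 2.5]  # mean loss per obstruction (dB)
blockage_vars = [1.0, 0.5, 2.0, 1.5]   # variance per obstruction (dB^2)

mu_rsl = tx_power_dbm - path_loss_db - sum(blockage_means)
sd_rsl = math.sqrt(sum(blockage_vars))

def p_coverage(threshold_dbm):
    """P(RSL >= threshold) under the normal approximation."""
    z = (threshold_dbm - mu_rsl) / sd_rsl
    return 0.5 * math.erfc(z / math.sqrt(2.0))

cov = p_coverage(-95.0)  # probability the RSL meets a -95 dBm threshold
```

The same distributional parameters could then feed SPC-style monitoring or the cell-count and cell-range optimizations the abstract mentions.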

  10. Risk-based monitored natural attenuation--a case study.

    PubMed

    Khan, F I; Husain, T

    2001-08-17

    The term "monitored natural attenuation" (MNA) refers to a reliance on natural attenuation (NA) processes for remediation through the careful monitoring of the behavior of a contaminant source in time and space domains. In recent years, policymakers have been shifting to a risk-based approach where site characteristics are measured against the potential risk to human health and the environment, and site management strategies are prioritized to be commensurate with that risk. Risk-based corrective action (RBCA), a concept developed by the American Society for Testing and Materials (ASTM), was the first indication of how this approach could be used in the development of remediation strategies. This paper, which links ASTM's RBCA approach with MNA, develops a systematic working methodology for a risk-based site evaluation and remediation through NA. The methodology is comprised of seven steps, with the first five steps intended to evaluate site characteristics and the feasibility of NA. If NA is effective, then the last two steps will guide the development of a long-term monitoring plan and approval for a site closure. This methodology is used to evaluate a site contaminated with oil from a pipeline spill. The case study concluded that the site has the requisite characteristics for NA, but it would take more than 80 years for attenuation of xylene and ethylbenzene, as these chemicals appear in the pure phase. If fast remediation is sought, then efforts should be made to remove the contaminant from the soil. Initially, the site posed a serious risk to both on-site and off-site receptors, but it becomes acceptable after 20 years, as the plume is diluted and drifts from its source of origin. PMID:11489527
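Multi-decade attenuation timescales like the 80-year figure above can be reproduced with the first-order decay model commonly assumed in NA screening; the concentrations and rate constant below are hypothetical, chosen only to show the order of magnitude.

```python
import math

def years_to_attenuate(c0, c_target, k_per_year):
    """First-order decay C(t) = C0 * exp(-k t), solved for t:
    time for the concentration to fall from c0 to c_target."""
    return math.log(c0 / c_target) / k_per_year

# Hypothetical site values: a slow rate constant (per year) yields
# a remediation timescale on the order of decades.
t = years_to_attenuate(c0=12.0, c_target=0.5, k_per_year=0.04)
```

The sensitivity is logarithmic in the concentration ratio but inverse in k, so a modest change in the attenuation rate constant shifts the timescale far more than a change in the cleanup target.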

  11. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and use risk as a tradeable resource to make robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  12. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  13. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite-element codes, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  14. Probabilistic Approaches: Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process through composite mechanics, and structural component. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90 percent of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide spread scatter at 90 percent cyclic-stress to static-strength ratios.

  15. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  16. Probabilistic composite analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.

    1991-01-01

    Formal procedures are described which are used to computationally simulate the probabilistic behavior of composite structures. The computational simulation starts with the uncertainties associated with all aspects of a composite structure (constituents, fabrication, assembling, etc.) and encompasses all aspects of composite behavior (micromechanics, macromechanics, combined stress failure, laminate theory, structural response, and tailoring/optimization). Typical cases are included to illustrate the formal procedure for computational simulation. The collective results of the sample cases demonstrate that uncertainties in composite behavior and structural response can be probabilistically quantified.

  17. Risk-based targeting: A new approach in environmental protection

    SciTech Connect

    Fox, C.A.

    1995-12-31

    Risk-based targeting has recently emerged as an effective tool to help prioritize efforts to identify and manage geographic areas, chemicals, facilities, and agricultural activities that cause the most environmental degradation. This paper focuses on how the Environmental Protection Agency (EPA) has recently used risk-based targeting to identify and screen Federal, industrial, commercial and municipal facilities which contribute to probable human health (fish consumption advisories and contaminated fish tissue) and aquatic life (contaminated sediments) impacts. Preliminary results identified several hundred potential contributors of problem chemicals to probable impacts within the same river reach in 1991--93. Analysis by industry sector showed that the majority of the facilities identified were publicly owned treatment works (POTWs), in addition to industry organic and inorganic chemical manufacturers, petroleum refineries, and electric services, coatings, engravings, and allied services, among others. Both compliant and non-compliant potentially contributing facilities were identified to some extent in all EPA regions. Additional results identifying possible linkages of other pollutant sources to probable impacts, as well as estimation of potential exposure of these contaminants to minority and/or poverty populations are also presented. Out of these analyses, a number of short and long-term strategies are being developed that EPA may use to reduce loadings of problem contaminants to impacted waterbodies.

  18. Study of operational risk-based configuration control

    SciTech Connect

    Vesely, W E; Samanta, P K; Kim, I S

    1991-08-01

    This report studies aspects of a risk-based configuration control system to detect and control plant configurations from a risk perspective. Configuration control, as the term is used here, is the management of component configurations to achieve specific objectives. One important objective is to control risk and safety. Another is to operate efficiently and make effective use of available resources. PSA-based evaluations are performed to study the contributions of configurations to core-melt frequency and core-melt probability for two plants. Some equipment configurations can cause large increases in core-melt frequency, and a number of such configurations are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small, and the core-melt probability contributions are also generally small. The insights from this evaluation are used to develop the framework for an effective risk-based configuration control system. The focal points of such a system and the requirements for tools development for implementing the system are defined. The requirements of risk models needed for the system, and the uses of plant-specific data, are also discussed. 18 refs., 25 figs., 10 tabs.
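A common quantitative core of such a system is the incremental core-melt probability accrued while a configuration persists, compared against an acceptance threshold; the frequencies, duration, and threshold below are illustrative, not values from the study.

```python
# Risk-based configuration control sketch (illustrative numbers).
BASELINE_CMF = 5e-5  # baseline core-melt frequency, per year
config_cmf = 4e-4    # elevated frequency with certain equipment out of service
outage_days = 7.0    # how long the configuration is expected to persist

# Incremental core-melt probability accrued over the outage:
# (elevated frequency - baseline) x exposure time in years.
delta_p = (config_cmf - BASELINE_CMF) * outage_days / 365.0

ACCEPT = 1e-6  # example acceptance threshold for one configuration
allowed = delta_p <= ACCEPT
```

With these numbers the configuration exceeds the threshold, so the system would flag it for compensatory measures or a shorter allowed outage time, exactly the kind of control technical specifications may not currently provide.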

  19. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  20. Probabilistic, Multidimensional Unfolding Analysis

    ERIC Educational Resources Information Center

    Zinnes, Joseph L.; Griggs, Richard A.

    1974-01-01

    Probabilistic assumptions are added to single and multidimensional versions of the Coombs unfolding model for preferential choice (Coombs, 1950) and practical ways of obtaining maximum likelihood estimates of the scale parameters and goodness-of-fit tests of the model are presented. A Monte Carlo experiment is discussed. (Author/RC)

  1. Concepts for risk-based surveillance in the field of veterinary medicine and veterinary public health: Review of current approaches

    PubMed Central

    Stärk, Katharina DC; Regula, Gertraud; Hernandez, Jorge; Knopf, Lea; Fuchs, Klemens; Morris, Roger S; Davies, Peter

    2006-01-01

    Background Emerging animal and zoonotic diseases and increasing international trade have resulted in an increased demand for veterinary surveillance systems. However, human and financial resources available to support government veterinary services are becoming more and more limited in many countries world-wide. Intuitively, issues that present higher risks merit higher priority for surveillance resources as investments will yield higher benefit-cost ratios. The rapid rate of acceptance of this core concept of risk-based surveillance has outpaced the development of its theoretical and practical bases. Discussion The principal objectives of risk-based veterinary surveillance are to identify surveillance needs to protect the health of livestock and consumers, to set priorities, and to allocate resources effectively and efficiently. An important goal is to achieve a higher benefit-cost ratio with existing or reduced resources. We propose to define risk-based surveillance systems as those that apply risk assessment methods in different steps of traditional surveillance design for early detection and management of diseases or hazards. In risk-based designs, public health, economic and trade consequences of diseases play an important role in selection of diseases or hazards. Furthermore, certain strata of the population of interest have a higher probability of being sampled for detection of diseases or hazards. Evaluation of risk-based surveillance systems shall prove that the efficacy of risk-based systems is equal to or higher than that of traditional systems; however, the efficiency (benefit-cost ratio) shall be higher in risk-based surveillance systems. Summary Risk-based surveillance considerations are useful to support both strategic and operational decision making. This article highlights applications of risk-based surveillance systems in the veterinary field including food safety.
Examples are provided for risk-based hazard selection, risk-based selection of sampling strata as

  2. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication, whereas in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum and classical communication steps; (4) the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  3. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  4. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  5. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  6. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design, including low-cycle, high-cycle, and combined fatigue under different cyclic loading conditions (for example, mechanical, thermal, and erosion loading). The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design; however, it is time consuming, costly, and in general must be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. Main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes material property evolution and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  7. Probabilistic simple splicing systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2014-06-01

    A splicing system, one of the early theoretical models for DNA computing, was introduced by Head in 1987. Splicing systems are based on the splicing operation which, informally, cuts two strings of DNA molecules at specific recognition sites and attaches the prefix of the first string to the suffix of the second string, and the prefix of the second string to the suffix of the first string, thus yielding new strings. For a specific type of splicing systems, namely the simple splicing systems, the recognition sites are the same for both strings of DNA molecules. It is known that splicing systems with finite sets of axioms and splicing rules only generate regular languages. Hence, different types of restrictions have been considered for splicing systems in order to increase their computational power. Recently, probabilistic splicing systems have been introduced where the probabilities are initially associated with the axioms, and the probabilities of the generated strings are computed from the probabilities of the initial strings. In this paper, some properties of probabilistic simple splicing systems are investigated. We prove that probabilistic simple splicing systems can also increase the computational power of the splicing languages generated.

  8. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an ''ideal'' project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world ''Money Allocated Is Money Spent'' (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as {at}Risk and Crystal Ball{reg_sign}. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
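The MAIMS principle described in this abstract lends itself to a short Monte Carlo sketch. The task budgets, lognormal cost distributions, and spread parameter below are illustrative stand-ins; the paper itself uses expert-elicited three-parameter Weibull distributions and a full correlation matrix:

```python
import math
import random

def project_cost(budgets, spread=0.2, trials=20000, seed=3):
    """Monte Carlo total project cost under the MAIMS principle:
    each task spends at least its allocated budget (underruns are
    absorbed), while overruns pass through to the total. Task costs
    are drawn from illustrative lognormal distributions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        total = 0.0
        for b in budgets:
            actual = rng.lognormvariate(math.log(b), spread)
            total += max(b, actual)  # MAIMS: money allocated is money spent
        totals.append(total)
    return totals

totals = project_cost([100.0, 250.0, 400.0])
mean_total = sum(totals) / len(totals)  # exceeds the 750 baseline budget
```

Because underruns are never credited back, the expected total always exceeds the sum of the allocated budgets, which is exactly why an "ideal" analysis that ignores MAIMS underestimates project cost.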

  9. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  10. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
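The kind of risk change computed from a surveillance test interval can be illustrated with the standard PRA approximation for time-averaged standby unavailability, q = lam*tau/2. This is not the SOCRATES program itself; the failure rate and test intervals below are invented for illustration:

```python
def mean_unavailability(lam, tau, aot_contrib=0.0):
    """Time-averaged standby unavailability of a periodically tested
    component: failure rate lam (per hour), test interval tau (hours).
    q = lam * tau / 2 is the standard PRA approximation; aot_contrib
    adds any allowed-outage-time (maintenance downtime) contribution."""
    return lam * tau / 2.0 + aot_contrib

# illustrative: extending a monthly test to a quarterly test scales q by tau2/tau1
q_base = mean_unavailability(1e-5, 720.0)    # tested every 720 h
q_ext = mean_unavailability(1e-5, 2190.0)    # tested every 2190 h
risk_ratio = q_ext / q_base                  # feeds a system-level risk model
```

A tool like the one described would propagate such component-level unavailability changes through the plant risk model to decide whether the extended interval is acceptable.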

  11. Risk-Based Data Management System design specifications and implementation plan for the Alaska Oil and Gas Conservation Commission; the Mississippi State Oil and Gas Board; the Montana Board of Oil and Gas Conservation; and the Nebraska Oil and Gas Conservation Commission

    SciTech Connect

    Not Available

    1993-09-01

    The purpose of this document is to present design specifications and an implementation schedule for the development and implementation of Risk Based Data Management Systems (RBDMSs) in the states of Alaska, Mississippi, Montana, and Nebraska. The document presents detailed design information including a description of the system database structure, data dictionary, data entry and inquiry screen layouts, specifications for standard reports that will be produced by the system, functions and capabilities (including environmental risk analyses), and table relationships for each database table within the system. This design information provides a comprehensive blueprint of the system to be developed and presents the necessary detailed information for system development and implementation. A proposed schedule for development and implementation also is presented. The schedule presents timeframes for the development of system modules, training, implementation, and providing assistance to the states with data conversion from existing systems. However, the schedule will vary depending upon the timing of funding allocations from the United States Department of Energy (DOE) for the development and implementation phase of the project. For planning purposes, the schedule assumes that initiation of the development and implementation phase will commence November 1, 1993, somewhat later than originally anticipated.

  12. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  13. Probabilistic analysis of mechanical systems

    SciTech Connect

    Priddy, T.G.; Paez, T.L.; Veers, P.S.

    1993-09-01

    This paper proposes a framework for the comprehensive analysis of complex problems in probabilistic structural mechanics. Tools that can be used to accurately estimate the probabilistic behavior of mechanical systems are discussed, and some of the techniques proposed in the paper are developed and used in the solution of a problem in nonlinear structural dynamics.

  14. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows. We show algorithms for time computations in probabilistic workflows. With activity times more precisely modeled, we can achieve improvements in work cooperation and in analyses of cooperation, including simulation and visualization.

  15. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge remains the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, which connect the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
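As a minimal illustration of a synchronous probabilistic automaton of the kind discussed, the sketch below updates every cell of a 1-D binary lattice at once using a hypothetical majority rule perturbed by random flips; the lattice size, flip probability, and rule are arbitrary choices, not the paper's model:

```python
import random

def step(config, p_flip=0.05):
    """One synchronous update of a 1-D binary lattice with periodic
    boundaries: each cell takes the majority of its 3-cell neighborhood,
    then flips with probability p_flip (the probabilistic part of the rule)."""
    n = len(config)
    nxt = []
    for i in range(n):
        ones = config[(i - 1) % n] + config[i] + config[(i + 1) % n]
        cell = 1 if ones >= 2 else 0      # deterministic majority rule
        if random.random() < p_flip:      # random perturbation
            cell = 1 - cell
        nxt.append(cell)
    return nxt

random.seed(0)
config = [random.randint(0, 1) for _ in range(50)]
for _ in range(200):
    config = step(config)
density = sum(config) / len(config)  # long-run statistic of interest
```

With p_flip = 0, iteration settles into an absorbing configuration; with p_flip > 0, the chain keeps moving and statistics such as the density estimate its stationary distribution, mirroring the deterministic/probabilistic dichotomy in the abstract.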

  16. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationship between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  17. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    NASA Astrophysics Data System (ADS)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S., closed basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) requires flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. 
The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the

  18. Probabilistic Finite Element: Variational Theory

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.

    1985-01-01

    The goal of this research is to provide techniques which are cost-effective and enable the engineer to evaluate the effect of uncertainties in complex finite element models. Embedding the probabilistic aspects in a variational formulation is a natural approach. In addition, a variational approach to probabilistic finite elements enables it to be incorporated within standard finite element methodologies. Therefore, once the procedures are developed, they can easily be adapted to existing general purpose programs. Furthermore, the variational basis for these methods enables them to be adapted to a wide variety of structural elements and to provide a consistent basis for incorporating probabilistic features in many aspects of the structural problem. Tasks completed include the theoretical development of probabilistic variational equations for structural dynamics, the development of efficient numerical algorithms for probabilistic sensitivity displacement and stress analysis, and integration of methodologies into a pilot computer code.

  19. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality. Air quality monitoring requires us to target the credible residual risks. Constraints on monitoring devices are severe. Must transition from archival to real-time, on-board monitoring. Must provide data to crew in a way that they can interpret findings. Dust management and monitoring may be a major concern for exploration class missions.

  20. Risk-based inspection of pressurizer surge lines

    NASA Astrophysics Data System (ADS)

    Shah, Nitin J.; Dwivedy, Keshab K.

    1996-11-01

    The Reactor Coolant System (RCS) piping of a pressurized water reactor (PWR) plant is probably the best in terms of resistance to known degradation mechanisms of passive components. However, a failure in the RCS piping is extremely important in terms of safety and economic significance. Therefore, an effective management tool is needed to mitigate the potential effects of degradation due to aging or other effects such that plant reliability and availability are not affected. Currently, the RCS piping of all US PWR plants is being subjected to inservice inspection (ISI) based upon certain deterministic criteria set by the ASME code and the NRC regulatory guide. Even though the history of large RCS piping has not shown any degradation, the ISI continues at many locations at great expense to the plant owners, whereas there can be only a few locations of relatively high vulnerability. A risk-based ISI can provide an alternative and cost-effective solution in this situation. The pressurizer surge line is a unique segment in the RCS which is subjected to significant transient loadings due to stratification and striping during the normal heatup and cooldown processes. Therefore, the surge line is considered for illustration. Examples of structural reliability studies of pressurizer surge lines in four PWR units are presented in this paper to demonstrate possible reduction of ISI and significant cost saving without reduction of plant safety or reliability.

  1. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen out index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that sample features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed for local conditions to reduce severe warnings to a lesser degree. This study takes Taihu Basin as an example. After applying and verifying the forewarning model against actual and simulated data for water pollution risk from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory with a flexible method, reasonable in result with a simple structure, and it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk. PMID:24194413
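The Bayes update and maximum-probability rule at the core of such a forewarning model can be sketched with a discrete example; the warning levels, prior weights, and observation likelihoods below are invented for illustration, not the Taihu Basin values:

```python
# invented warning levels, prior weights, and observation likelihoods
priors = {"light": 0.5, "moderate": 0.3, "severe": 0.2}
likelihood = {"light": 0.1, "moderate": 0.3, "severe": 0.9}  # P(index | level)

# Bayes theorem: posterior = prior * likelihood / evidence
evidence = sum(priors[lv] * likelihood[lv] for lv in priors)
posterior = {lv: priors[lv] * likelihood[lv] / evidence for lv in priors}

# maximum-probability rule picks the forewarning level
level = max(posterior, key=posterior.get)
```

Here a high pollution index, much more likely under the "severe" hypothesis, overturns a prior that favored "light", which is the essential mechanism of the forewarning step.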

  2. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
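A stripped-down version of the Monte Carlo combination described above can be sketched by summing a random nucleation life and a closed-form Paris-law growth life; all constants, distributions, and crack sizes below are hypothetical placeholders, and the nucleation and small-crack models of the paper are collapsed into a single lognormal term:

```python
import math
import random

def paris_life(a0, ac, C, m, dk_coeff):
    """Cycles to grow a crack from a0 to ac under the Paris law
    da/dN = C * (dK)^m, with dK = dk_coeff * sqrt(a); the closed
    form below is valid for m != 2."""
    e = 1.0 - m / 2.0
    return (ac**e - a0**e) / (C * dk_coeff**m * e)

random.seed(1)
lives = []
for _ in range(1000):
    # hypothetical microstructural scatter in nucleation life and Paris constant
    n_nuc = random.lognormvariate(math.log(1e4), 0.5)
    C = random.lognormvariate(math.log(1e-11), 0.3)
    n_growth = paris_life(a0=25e-6, ac=5e-3, C=C, m=3.0, dk_coeff=20.0)
    lives.append(n_nuc + n_growth)

lives.sort()
b10 = lives[len(lives) // 10]  # 10th-percentile ("B10") total life
```

The sorted sample approximates the total-life distribution, from which reliability quantiles such as the B10 life can be read off, the kind of statistical output the abstract describes.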

  3. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  4. Probabilistic fracture finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-01-01

    The Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handle problems with uncertainties. As the PFEM provides a powerful computational tool to determine first and second moment of random parameters, the second moment reliability method can be easily combined with PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed from experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
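The second-moment analysis underlying PFEM can be illustrated with a generic first-order second-moment (FOSM) propagation: linearize the response at the mean inputs and combine the sensitivities with the input standard deviations. The cantilever-deflection example and all its numbers are invented for illustration, not from the PFEM work:

```python
def fosm(g, mu, sigma):
    """First-order second-moment (FOSM) estimate of the mean and standard
    deviation of g(X) for independent inputs with means mu and standard
    deviations sigma: linearize g at the mean via finite differences."""
    mean = g(mu)
    var = 0.0
    for i in range(len(mu)):
        h = 1e-6 * max(abs(mu[i]), 1.0)  # relative finite-difference step
        xp = list(mu)
        xp[i] += h
        var += (((g(xp) - mean) / h) * sigma[i]) ** 2
    return mean, var ** 0.5

# illustrative response: cantilever tip deflection delta = P L^3 / (3 E I)
# with uncertain load P and modulus E (all numbers invented)
def deflection(x):
    P, E = x
    L, I = 2.0, 8e-6  # length (m) and second moment of area (m^4)
    return P * L**3 / (3.0 * E * I)

mean_d, std_d = fosm(deflection, [1000.0, 200e9], [100.0, 10e9])
```

The resulting mean and standard deviation of the response are exactly the first and second moments that a second-moment reliability method would feed into a reliability index.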

  5. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
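
    As a toy illustration of the Monte Carlo procedure described above, the sketch below propagates assumed scatter in constituent moduli and fiber volume ratio through the rule of mixtures to the longitudinal ply modulus. All distributions and property values are hypothetical, not taken from the report.

```python
import random
import statistics

def sample_ply_modulus(rng):
    """Draw one longitudinal ply modulus E1 (GPa) via the rule of mixtures,
    with illustrative scatter on constituent moduli and fiber volume ratio."""
    Ef = rng.gauss(230.0, 11.5)   # fiber modulus, GPa (hypothetical graphite)
    Em = rng.gauss(3.4, 0.17)     # matrix modulus, GPa (hypothetical epoxy)
    Vf = min(max(rng.gauss(0.60, 0.03), 0.0), 1.0)  # fiber volume ratio
    return Vf * Ef + (1.0 - Vf) * Em

rng = random.Random(42)
samples = [sample_ply_modulus(rng) for _ in range(20000)]
mean_E1 = statistics.fmean(samples)
cov = statistics.stdev(samples) / mean_E1
print(f"E1 mean = {mean_E1:.1f} GPa, coefficient of variation = {cov:.3f}")
```

    The simulated coefficient of variation shows how micro-level randomness in the constituents induces scatter in the ply property, which is the effect the paper quantifies with regression over many such variables.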

  6. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  7. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al., submitted). The application of bagging-decision-tree-based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
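
    A minimal sketch of the bootstrap-resampling idea underlying bagging, applied to a hypothetical set of per-building losses: each resample yields one estimate of the municipal total loss, so the output is a loss distribution (and hence quantiles) rather than a single deterministic figure. The loss values are invented for illustration, not data from the study.

```python
import random
import statistics

# Hypothetical per-building flood losses (kEUR) observed in one municipality.
losses = [12, 5, 40, 8, 22, 3, 17, 55, 9, 14, 7, 31, 2, 19, 11]

def bootstrap_total_loss(data, n_boot, rng):
    """Bagging-style resampling: each bootstrap replicate yields one estimate
    of the municipal total loss, giving a distribution rather than a point."""
    totals = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        totals.append(sum(resample))
    return sorted(totals)

rng = random.Random(7)
totals = bootstrap_total_loss(losses, 2000, rng)
q05 = totals[int(0.05 * len(totals))]
q95 = totals[int(0.95 * len(totals))]
print(f"median total ~{statistics.median(totals):.0f} kEUR, "
      f"90% interval [{q05}, {q95}] kEUR")
```

    The width of the interval is exactly the kind of prediction uncertainty that the abstract argues a probabilistic loss model makes explicit.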

  8. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  9. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  10. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.
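
    The second-moment flavor of such methods can be illustrated with a first-order mean/variance propagation for a linear response with correlated random inputs; the sensitivity coefficients, moments, and correlation below are illustrative values, not taken from the paper.

```python
import math

# First-order second-moment sketch for a linear response R = sum_i a_i * X_i,
# with correlated random inputs (all numbers illustrative).
a = [2.0, -1.5]                    # sensitivity coefficients dR/dX_i
mu = [10.0, 4.0]                   # input means
sigma = [1.0, 0.5]                 # input standard deviations
rho = [[1.0, 0.3], [0.3, 1.0]]     # correlation matrix

# mean of R, and variance including the correlation cross-terms
mu_R = sum(ai * mi for ai, mi in zip(a, mu))
var_R = sum(a[i] * a[j] * rho[i][j] * sigma[i] * sigma[j]
            for i in range(2) for j in range(2))
print(f"response mean = {mu_R}, std = {math.sqrt(var_R):.3f}")
```

    Note how the positive correlation between the inputs, combined with sensitivities of opposite sign, reduces the response variance relative to the uncorrelated case.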

  11. Do probabilistic forecasts lead to better decisions? Try it yourself!

    NASA Astrophysics Data System (ADS)

    van Andel, S. J.; Pappenberger, F.; Ramos, M. H.; Thielen, J.

    2012-04-01

    The last decade has seen much research into producing probabilistic hydro-meteorological forecasts and increasing their reliability, following the promise that, armed with information about uncertainty, people would make better risk-based decisions. In recent years, therefore, research and operational developments have also started to pay attention to ways of communicating probabilistic forecasts to decision makers. Communicating uncertain forecasts includes preparing tools and products for visualization, but also understanding how forecasters perceive and use uncertain information in real time for decision-making. The question of proper communication has by no means been conclusively answered, nor has the question of improved decision-making. In this presentation we will engage in a small but exciting live experiment in which several cases of flood forecasts and a multiple choice of actions will be presented to participants, who will act as decision makers. Answers will be collected and analyzed directly. Results will be presented and discussed together with the participants of the session to see if we do indeed make better decisions on the basis of probabilistic forecasts.

  12. Application of risk-based methods to optimize inspection planning for regulatory activities at nuclear power plants

    SciTech Connect

    Wong, S.M.; Higgins, J.C.; Martinez-Guridi, G.

    1995-07-01

    As part of regulatory oversight requirements, the U.S. Nuclear Regulatory Commission (USNRC) staff conducts inspection activities to assess operational safety performance in nuclear power plants. Currently, guidance in these inspections is provided by procedures in the NRC Inspection Manual and the issuance of Temporary Instructions defining the objectives and scope of the inspection effort. In several studies sponsored by the USNRC over the last few years, Brookhaven National Laboratory (BNL) has developed and applied methodologies for providing risk-based inspection guidance for the safety assessments of nuclear power plant systems. One recent methodology integrates insights from existing Probabilistic Risk Assessment (PRA) studies and Individual Plant Evaluations (IPE) with information from operating experience reviews for consideration in inspection planning for either multi-disciplinary team inspections or individual inspections. In recent studies at BNL, a risk-based methodology was developed to optimize inspection planning for regulatory activities at nuclear power plants. This methodology integrates risk-based insights from the plant configuration risk profile and risk information found in existing PRA/IPE studies.

  13. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with the closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. To this end, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm, called the PI-SVM, for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), in which the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. 
Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary

  14. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  15. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques used to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk, and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  16. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politics about potential consequences, and are the basis for appropriate risk management strategies. Here, (i) results based on an annual or probabilistic view of risk have to be distinguished from (ii) scenario-based analyses. The first are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss modelling approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. In addition to this analysis, a portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts in this portfolio were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. 
The estimated mean insurance values were

  17. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  18. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for the risk-based evaluation of proposed modifications to the Shuttle system. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  19. Risky business: the risk-based, risk-sharing capitated HMO.

    PubMed

    Kazahaya, G I

    1986-08-01

    Hospitals are encountering a new type of HMO--the risk-based, risk-sharing capitated HMO. This new HMO arrangement redefines the roles of the hospital, the physicians, and the HMO plan involved. Instead of placing the HMO at risk, the hospital and physicians are now financially responsible for services covered under the HMO plan. The capitated HMO is reduced to a third-party payer, serving as a broker between subscribers and providers. In this first of two articles on capitated HMOs, the risk-based, risk-sharing capitated HMO and its relationship to hospitals and physicians are defined. The second article will take this definition and apply it to managing, monitoring, and reporting on these types of programs from an accounting perspective. PMID:10277301

  20. Orbitofrontal or accumbens dopamine depletion does not affect risk-based decision making in rats.

    PubMed

    Mai, Bettina; Hauber, Wolfgang

    2015-09-01

    Considerable evidence has implicated dopamine (DA) signals in target regions of midbrain DA neurons such as the medial prefrontal cortex or the core region of the nucleus accumbens in controlling risk-based decision-making. However, to date little is known about the contribution of DA in the orbitofrontal cortex (OFC) and the medial shell region of the nucleus accumbens (AcbS) to risk-based decision-making. Here we examined in rats the effects of 6-hydroxydopamine-induced DA depletions of the OFC and AcbS on risky choice using an instrumental two-lever choice task that requires the assessment of fixed within-session reward probabilities that were shifted across subsequent sessions, i.e., rats had to choose between two levers, a small/certain lever that delivered one certain food reward (one pellet at p = 1) and a large/risky lever that delivered a larger uncertain food reward with decreasing probabilities across subsequent sessions (four pellets at p = 0.75, 0.5, 0.25, 0.125, 0.0625). Results show that systemic administration of amphetamine or cocaine increased the preference for the large/risky lever. Results further demonstrate that, like sham controls, rats with OFC or AcbS DA depletion were sensitive to changes in probabilities for obtaining the large/risky reward across sessions and displayed probabilistic discounting. These findings point to the view that the basal capacity to evaluate the magnitude and likelihood of rewards associated with alternative courses of action as well as long-term changes of reward probabilities does not rely on DA input to the AcbS or OFC. PMID:25860659
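
    For orientation, the expected pellet yield of the large/risky lever in this task crosses that of the small/certain lever at p = 0.25, which is why decreasing the probability across sessions probes probabilistic discounting. The short sketch below simply tabulates the expected values given in the abstract.

```python
# Expected value of the large/risky lever (4 pellets at probability p) versus
# the small/certain lever (1 pellet at p = 1), for the session probabilities
# listed in the abstract.
certain_value = 1.0
risky_pellets = 4.0
for p in (0.75, 0.5, 0.25, 0.125, 0.0625):
    ev = risky_pellets * p
    if ev > certain_value:
        verdict = "risky lever maximizes pellets"
    elif ev == certain_value:
        verdict = "levers are equivalent in expectation"
    else:
        verdict = "certain lever maximizes pellets"
    print(f"p = {p:<7} EV(risky) = {ev:<5} -> {verdict}")
```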

  1. A risk-based approach to scheduling audits.

    PubMed

    Rönninger, Stephan; Holmes, Malcolm

    2009-01-01

    The manufacture and supply of pharmaceutical products can be a very complex operation. Companies may purchase a wide variety of materials, from active pharmaceutical ingredients to packaging materials, from "in company" suppliers or from third parties. They may also purchase or contract a number of services such as analysis, data management, and audit, among others. It is very important that these materials and services are of the requisite quality in order that patient safety and company reputation are adequately protected. Such quality requirements are ongoing throughout the product life cycle. In recent years, assurance of quality has been derived via audit of the supplier or service provider, using periodic audits, for example, annually or at least once every 5 years. In the past, companies may have used an audit only for what they considered to be "key" materials or services and used testing on receipt, for example, as their quality assurance measure for "less important" supplies. Such approaches changed as a result of pressure from both internal sources and regulators toward time-driven audits for all suppliers and service providers. Companies recognised that they would ultimately be responsible for the quality of the supplied product or service, and audit, although providing only a "snapshot in time", seemed a convenient way of demonstrating that they were meeting their obligations. Problems, however, still occur with the supplied product or service, and will usually be more frequent from certain suppliers. Additionally, some third-party suppliers will no longer accept routine audits from individual companies, as the overall audit load can exceed one external audit per working day. Consequently, a different model is needed for assessing supplier quality. This paper presents a risk-based approach to creating an audit plan and for scheduling the frequency and depth of such audits. 
The approach is based on the principles and process of the Quality Risk Management

  2. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents work completed by Kennedy Space Center's Independent Assessment team on three assessments for the Ground Systems Development and Operations (GSDO) Program, to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability-versus-time graphs from the first two assessments, there was a soft knee in the figure-of-merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability-versus-time graphs that were in line with aerospace industry norms.
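
    A minimal sketch of the survivability-versus-time idea: if the time until conditions become non-survivable is treated as a random variable, then survivability for a given egress time is the probability that egress completes first. The lognormal hazard-time distribution below is purely illustrative and is not the assessment's actual model.

```python
import math
import random

def survivability(egress_minutes, n, rng):
    """P(safe location reached before conditions become non-survivable),
    using a hypothetical lognormal time-to-lethality (median 10 minutes)."""
    survived = 0
    for _ in range(n):
        time_to_lethal = rng.lognormvariate(math.log(10.0), 0.5)  # minutes
        if egress_minutes < time_to_lethal:
            survived += 1
    return survived / n

rng = random.Random(1)
for t in (4, 8, 12, 16):
    p = survivability(t, 20000, rng)
    print(f"egress in {t:>2} min -> survival probability ~ {p:.2f}")
```

    Plotting such probabilities against egress time produces exactly the kind of curve whose "soft knee" the assessment used to set the eight-minute design requirement.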

  3. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

    The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel in capturing an HDR scene using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains many more scene details than any of the captured SDR images. Moreover, the proposed scheme also functions as a tone mapping of an HDR image to an SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches. PMID:21609883

  4. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as net present value distributions over various forecast horizons.
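
    The financial-risk component can be caricatured with a small Monte Carlo over uncertain capital and margin figures, producing an NPV distribution rather than a point estimate. All numbers and distributions below are illustrative stand-ins, not CCSI values.

```python
import random
import statistics

def npv_sample(rng, horizon_years=20, discount=0.08):
    """One probabilistic NPV draw (M$) for a hypothetical capture retrofit."""
    capex = rng.gauss(500.0, 50.0)          # construction/retrofit expense
    annual_margin = rng.gauss(55.0, 15.0)   # revenue minus operating cost
    npv = -capex
    for year in range(1, horizon_years + 1):
        npv += annual_margin / (1.0 + discount) ** year
    return npv

rng = random.Random(0)
npvs = [npv_sample(rng) for _ in range(10000)]
p_loss = sum(v < 0 for v in npvs) / len(npvs)
print(f"mean NPV = {statistics.fmean(npvs):.0f} M$, P(NPV < 0) = {p_loss:.2f}")
```

    The decision-relevant output is the whole distribution: a positive mean NPV can coexist with a substantial probability of loss, which a deterministic estimate would hide.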

  5. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
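
    The probabilistic buckling step can be sketched by sampling the primitive variables entering the Euler buckling load P_cr = pi^2 E I / L^2 for a pinned member and comparing the draws against the axial force from a deterministic analysis; every figure below is hypothetical, chosen only to show the mechanics.

```python
import math
import random

def euler_buckling_load(rng):
    """One draw of the Euler buckling load P_cr = pi^2 * E * I / L^2 for a
    pinned truss member, with illustrative scatter on E, I and L."""
    E = rng.gauss(200e9, 10e9)      # Young's modulus, Pa
    I = rng.gauss(8.0e-7, 4.0e-8)   # second moment of area, m^4
    L = rng.gauss(2.0, 0.01)        # member length, m
    return math.pi ** 2 * E * I / L ** 2

rng = random.Random(3)
loads = sorted(euler_buckling_load(rng) for _ in range(20000))
applied = 3.7e5  # axial member force from the deterministic analysis, N
p_buckle = sum(p < applied for p in loads) / len(loads)
print(f"P(buckling | applied {applied:.0f} N) ~ {p_buckle:.3f}")
```

    Repeating this comparison after removing each buckled member is the essence of the progressive-buckling procedure the abstract describes.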

  6. Rule Learning with Probabilistic Smoothing

    NASA Astrophysics Data System (ADS)

    Costa, Gianni; Guarascio, Massimo; Manco, Giuseppe; Ortale, Riccardo; Ritacco, Ettore

    A hierarchical classification framework is proposed for discriminating rare classes in imprecise domains characterized by rarity (of both classes and cases), noise, and low class separability. The devised framework couples the rules of a rule-based classifier with as many local probabilistic generative models. These are trained over the coverage of the corresponding rules to better capture those globally rare cases/classes that become less rare within that coverage. Two novel schemes for tightly integrating rule-based and probabilistic classification are introduced, which classify unlabeled cases by considering multiple classifier rules as well as their local probabilistic counterparts. An intensive evaluation shows that the proposed framework is competitive with, and often superior in accuracy to, established competitors, while outperforming them in dealing with rare classes.

  7. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  8. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
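    The annual exceedance probabilities quoted above reduce, in a Monte Carlo PTHA, to summing event rates above a wave-height threshold. A minimal sketch under a Poissonian event assumption, with an invented toy catalogue rather than the study's source model:

```python
import math

# Hypothetical synthetic catalogue: (annual rate, modeled coastal wave
# height in metres) per event. Values are illustrative only.
catalogue = [
    (1e-2, 0.3),
    (5e-3, 0.8),
    (1e-3, 2.1),
    (2e-4, 4.5),
]

def annual_exceedance_probability(events, threshold_m):
    """Annual probability of at least one tsunami exceeding threshold_m,
    assuming events occur as independent Poisson processes."""
    total_rate = sum(rate for rate, height in events if height > threshold_m)
    return 1.0 - math.exp(-total_rate)

p_05 = annual_exceedance_probability(catalogue, 0.5)  # > 0.5 m at the coast
p_30 = annual_exceedance_probability(catalogue, 3.0)  # > 3.0 m
```

    In the full assessment the rates themselves carry epistemic uncertainty (via logic-tree branches), so this calculation would be repeated per branch and weighted.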

  9. Application of the risk-based strategy to the Hanford tank waste organic-nitrate safety issue

    SciTech Connect

    Hunter, V.L.; Colson, S.D.; Ferryman, T.; Gephart, R.E.; Heasler, P.; Scheele, R.D.

    1997-12-01

    This report describes the results from application of the Risk-Based Decision Management Approach for Justifying Characterization of Hanford Tank Waste to the organic-nitrate safety issue in Hanford single-shell tanks (SSTs). Existing chemical and physical models were used, taking advantage of the most current (mid-1997) sampling and analysis data. The purpose of this study is to make specific recommendations for planning characterization to help ensure the safety of each SST as it relates to the organic-nitrate safety issue. An additional objective is to demonstrate the viability of the Risk-Based Strategy for addressing Hanford tank waste safety issues.

  10. Probabilistic coding of quantum states

    SciTech Connect

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-07-15

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding.

  11. Probabilistic framework for network partition

    NASA Astrophysics Data System (ADS)

    Li, Tiejun; Liu, Jian; E, Weinan

    2009-08-01

    Given a large and complex network, we would like to find the partition of this network into a small number of clusters. This question has been addressed in many different ways. In a previous paper, we proposed a deterministic framework for an optimal partition of a network as well as the associated algorithms. In this paper, we extend this framework to a probabilistic setting, in which each node has a certain probability of belonging to a certain cluster. Two classes of numerical algorithms for such a probabilistic network partition are presented and tested. Application to three representative examples is discussed.
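    The probabilistic setting described above, in which each node carries a probability of belonging to each cluster, can be illustrated with a simple seeded propagation sketch. This is an illustration of soft membership only, not the authors' algorithm; the graph and seeds are invented.

```python
# Toy graph: nodes 0..5, two natural clusters bridged at node 3.
adj = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
seeds = {0: 0, 5: 1}   # node 0 pinned to cluster 0, node 5 to cluster 1
n_clusters = 2

# Each node holds a probability distribution over clusters.
probs = {v: [0.5, 0.5] for v in adj}
for v, c in seeds.items():
    probs[v] = [1.0 if k == c else 0.0 for k in range(n_clusters)]

# Iteratively replace each free node's distribution with the normalized
# average of its neighbours' distributions (seeds stay fixed).
for _ in range(50):
    new = {}
    for v, nbrs in adj.items():
        if v in seeds:
            new[v] = probs[v]
            continue
        mix = [sum(probs[u][k] for u in nbrs) for k in range(n_clusters)]
        total = sum(mix)
        new[v] = [m / total for m in mix]
    probs = new
```

    Nodes near a seed end up with high membership in that seed's cluster, while the bridge node retains genuinely mixed membership, which is the point of the probabilistic extension over a hard partition.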

  12. Role of Context in Risk-Based Reasoning

    ERIC Educational Resources Information Center

    Pratt, Dave; Ainley, Janet; Kent, Phillip; Levinson, Ralph; Yogui, Cristina; Kapadia, Ramesh

    2011-01-01

    In this article we report the influence of contextual factors on mathematics and science teachers' reasoning in risk-based decision-making. We examine previous research that presents judgments of risk as being subjectively influenced by contextual factors and other research that explores the role of context in mathematical problem-solving. Our own…

  13. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  14. How Should Risk-Based Regulation Reflect Current Public Opinion?

    PubMed

    Pollock, Christopher John

    2016-08-01

    Risk-based regulation of novel agricultural products with public choice manifest via traceability and labelling is a more effective approach than the use of regulatory processes to reflect public concerns, which may not always be supported by evidence. PMID:27266813

  15. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.
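    One quantity such risk-based AOT evaluations typically rest on is the incremental core-damage probability (ICDP) accrued while a component is out of service. A minimal sketch with illustrative numbers, not plant data:

```python
# Illustrative values only: baseline core damage frequency and the
# conditional CDF with one safety component unavailable, per year.
BASE_CDF = 5e-5
DOWN_CDF = 4e-4

def icdp(down_hours, base=BASE_CDF, down=DOWN_CDF):
    """Incremental core-damage probability for one outage:
    (conditional CDF - baseline CDF) times the downtime in years."""
    years = down_hours / 8760.0
    return (down - base) * years

risk_72h = icdp(72.0)   # ICDP accrued over a 72-hour allowed outage time
```

    In practice the ICDP for a proposed AOT would be compared against a small acceptance criterion, which is how "unnecessarily restrictive" requirements can be identified quantitatively.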

  16. Development of a risk-based approach to Hanford Site cleanup

    SciTech Connect

    Hesser, W.A.; Daling, P.M.; Baynes, P.A.

    1995-06-01

    In response to a request from Mr. Thomas Grumbly, Assistant Secretary of Energy for Environmental Management, the Hanford Site contractors developed a conceptual set of risk-based cleanup strategies that (1) protect the public, workers, and environment from unacceptable risks; (2) are executable technically; and (3) fit within an expected annual funding profile of 1.05 billion dollars. These strategies were developed because (1) the US Department of Energy and Hanford Site budgets are being reduced, (2) stakeholders are dissatisfied with the perceived rate of cleanup, (3) the US Congress and the US Department of Energy are increasingly focusing on risk and risk-reduction activities, (4) the present strategy is not integrated across the Site and is inconsistent in its treatment of similar hazards, (5) the present cleanup strategy is not cost-effective from a risk-reduction or future land use perspective, and (6) the milestones and activities in the Tri-Party Agreement cannot be achieved with an anticipated funding of 1.05 billion dollars annually. The risk-based strategies described herein were developed through a systems analysis approach that (1) analyzed the cleanup mission; (2) identified cleanup objectives, including risk reduction, land use, and mortgage reduction; (3) analyzed the existing baseline cleanup strategy from a cost and risk perspective; (4) developed alternatives for accomplishing the cleanup mission; (5) compared those alternatives against cleanup objectives; and (6) produced conclusions and recommendations regarding the current strategy and potential risk-based strategies.

  17. Auxiliary feedwater system risk-based inspection guide for the South Texas Project nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1993-12-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. South Texas Project was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by the NRC inspectors in preparation of inspection plans addressing AFW risk-important components at the South Texas Project plant.

  18. Auxiliary feedwater system risk-based inspection guide for the H. B. Robinson nuclear power plant

    SciTech Connect

    Moffitt, N.E.; Lloyd, R.C.; Gore, B.F.; Vo, T.V.; Garner, L.W.

    1993-08-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. H. B. Robinson was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the H. B. Robinson plant.

  19. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant.

  20. Auxiliary feedwater system risk-based inspection guide for the J. M. Farley Nuclear Power Plant

    SciTech Connect

    Vo, T.V.; Pugh, R.; Gore, B.F.; Harrison, D.G. )

    1990-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. J. M. Farley was selected as the second plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the J. M. Farley plant. 23 refs., 1 fig., 1 tab.

  1. Auxiliary feedwater system risk-based inspection guide for the Ginna Nuclear Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.; Moffitt, N.E. )

    1991-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Ginna was selected as the eighth plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Ginna plant. 23 refs., 1 fig., 1 tab.

  2. Auxiliary feedwater system risk-based inspection guide for the Byron and Braidwood nuclear power plants

    SciTech Connect

    Moffitt, N.E.; Gore, B.F.; Vo, T.V. )

    1991-07-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Byron and Braidwood were selected for the fourth study in this program. The product of this effort is a prioritized listing of AFW failures which have occurred at the plants and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Byron/Braidwood plants. 23 refs., 1 fig., 1 tab.

  3. Risk-based inspection priorities for PWR high-pressure injection system components

    SciTech Connect

    Vo, T.V.; Simonen, F.A.; Phan, H.K. )

    1993-01-01

    Under U.S. Nuclear Regulatory Commission sponsorship, Pacific Northwest Laboratory developed a risk-based method that can be used to establish in-service inspection priorities for nuclear power plant components. The overall goal of this effort was to develop technical bases for improvements of inspection plans and to provide recommendations for revisions of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code, Sec. XI. The developed method used results of probabilistic risk assessment in combination with the failure modes and effects analysis (FMEA) technique to establish in-service inspection priorities for systems and components. The Surry nuclear power station, unit 1 (Surry-1) was selected for study. Inspection priorities for several pressure boundary systems at Surry-1 were determined in the early phase of the project. To complete the study, the remaining safety systems, plus balance of plant, have been analyzed; one of these is the high-pressure injection (HPI) system. This paper presents the results of inspection priorities for the HPI system.

  4. Auxiliary feedwater system risk-based inspection guide for the Point Beach nuclear power plant

    SciTech Connect

    Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.; Vehec, T.A.

    1993-02-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Point Beach was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Point Beach plant.

  5. Auxiliary feedwater system risk-based inspection guide for the North Anna nuclear power plants

    SciTech Connect

    Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V. )

    1992-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. North Anna was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by the NRC inspectors in preparation of inspection plans addressing AFW risk-important components at the North Anna plant.

  6. Probabilistic forecasts based on radar rainfall uncertainty

    NASA Astrophysics Data System (ADS)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and, ultimately, sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required of the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.

  7. Probabilistic Route Selection Algorithm for IP Traceback

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Bin; Jung, Jae-Il

    DoS (Denial of Service) and DDoS (Distributed DoS) attacks are a major threat and among the most difficult attacks to counter. Moreover, it is very difficult to find the real origin of an attack because DoS/DDoS attackers use spoofed IP addresses. To solve this problem, we propose a probabilistic route selection traceback algorithm, namely PRST, to trace the attacker's real origin. This algorithm uses two types of packets: an agent packet and a reply agent packet. The agent packet is used to find the attacker's real origin, and the reply agent packet is used to notify the victim that the agent packet has reached the edge router of the attacker. After an attack occurs, the victim generates the agent packet and sends it to the victim's edge router. The attacker's edge router, upon receiving the agent packet, generates the reply agent packet and sends it to the victim. Both packets are forwarded by routers according to a probabilistic packet forwarding table (PPFT). The PRST algorithm runs on the distributed routers, and the PPFT is stored and managed by the routers. We validate the PRST algorithm using a mathematical approach based on the Poisson distribution.
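    The PPFT-driven forwarding step can be sketched as follows. This is a hypothetical illustration of probabilistic next-hop selection from a PPFT-like table; the topology, router names, and weights are invented, not taken from the paper.

```python
import random

# Hypothetical PPFT: for each router, the probability of forwarding the
# agent packet to each upstream neighbour toward the attack source.
ppft = {
    "victim_edge": {"r1": 0.7, "r2": 0.3},
    "r1": {"attacker_edge": 0.9, "r3": 0.1},
    "r2": {"r3": 1.0},
    "r3": {"attacker_edge": 1.0},
}

def trace(start, target, rng, max_hops=10):
    """Forward an agent packet hop by hop according to the PPFT,
    returning the route taken (ending at target if it is reached)."""
    route, node = [start], start
    for _ in range(max_hops):
        if node == target or node not in ppft:
            break
        nbrs, weights = zip(*ppft[node].items())
        node = rng.choices(nbrs, weights=weights)[0]
        route.append(node)
    return route

rng = random.Random(0)
route = trace("victim_edge", "attacker_edge", rng)
```

    Repeating the trace many times concentrates traversals on the high-probability entries, which is the intuition behind weighting the PPFT toward the likely attack path.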

  8. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  9. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  10. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  11. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  12. 12 CFR 1022.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Content, form, and timing of risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing...

  13. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  14. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  15. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  16. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  17. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber-matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be any of the four modes.
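    The Monte Carlo propagation of primitive-variable scatter to response scatter can be illustrated with a toy model. This is not the IPACS code; the response function, distributions, and parameter values are invented for the sketch.

```python
import random
import statistics

# Toy propagation: scatter in two primitive variables -- ply thickness
# and fiber volume ratio -- is pushed through a toy buckling-load
# response, and the resulting scatter is summarized.
rng = random.Random(42)

def sample_response():
    thickness = rng.gauss(1.0, 0.05)   # normalised ply thickness
    fvr = rng.gauss(0.60, 0.02)        # fiber volume ratio
    # Toy response: load grows with stiffness ~ fvr and thickness**3.
    return fvr * thickness ** 3

samples = [sample_response() for _ in range(20000)]
mean = statistics.fmean(samples)
cov = statistics.stdev(samples) / mean   # coefficient of variation
```

    A sensitivity analysis in this spirit would perturb one primitive variable at a time and compare the resulting contributions to `cov`, which is how dominant uncertainty sources (like laminate thickness above) are identified.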

  18. Probabilistic Techniques for Phrase Extraction.

    ERIC Educational Resources Information Center

    Feng, Fangfang; Croft, W. Bruce

    2001-01-01

    This study proposes a probabilistic model for automatically extracting English noun phrases for indexing or information retrieval. The technique is based on a Markov model, whose initial parameters are estimated by a phrase lookup program with a phrase dictionary, then optimized by a set of maximum entropy parameters. (Author/LRW)

  19. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  20. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  1. Designing Probabilistic Tasks for Kindergartners

    ERIC Educational Resources Information Center

    Skoumpourdi, Chrysanthi; Kafoussi, Sonia; Tatsis, Konstantinos

    2009-01-01

    Recent research suggests that children could be engaged in probability tasks at an early age and task characteristics seem to play an important role in the way children perceive an activity. To this direction in the present article we investigate the role of some basic characteristics of probabilistic tasks in their design and implementation. In…

  2. Comprehensive Risk-Based Diagnostically Driven Treatment Planning: Developing Sequentially Generated Treatment.

    PubMed

    Kois, Dean E; Kois, John C

    2015-07-01

    The clinical example presented in this article demonstrates a risk-based, diagnostically driven treatment planning approach by focusing on four key categories: periodontal, biomechanical, functional, and dentofacial. In addition, our unique approach allowed the comprehensive clinical management of a patient with complex restorative needs. A full-mouth rehabilitation was completed sequentially without sacrificing the amount of dentistry necessary to restore health, comfort, function, and esthetics. The result exceeded the patient's expectations and was made financially possible by extending treatment over numerous years. PMID:26140967

  3. Fuzzy logic and a risk-based graded approach for developing S/RIDs: An introduction

    SciTech Connect

    Wayland, J.R.

    1996-01-01

    A Standards/Requirements Identification Document (S/RID) is the set of expressed performance expectations, or standards, for a facility. Critical to the development of integrated standards-based management is the identification of a set of necessary and sufficient standards from a selected set of standards/requirements. There is a need for a formal, rigorous selection process for the S/RIDs. This is the first of three reports that develop a fuzzy logic selection process. In this report, the fundamentals of fuzzy logic are discussed as they apply to a risk-based graded approach.

  4. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU project Lolight - development of a low-cost, novel, and accurate lightning mapping and thunderstorm (supercell) tracking system. The project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step, the correlation coefficients between the first and the second density field are computed: the first field is shifted by all shifting vectors that are physically allowed, where the maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the time difference between the two density fields. To eliminate ambiguities in the determination of directions and velocities, the so-called random walker of the Monte Carlo process is used. With this method, a grid point is selected at random, and one vector out of all predefined shifting vectors is proposed, also at random but with a probability related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted; otherwise, the proposal is discarded. This process is repeated until the change in the cost function falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted; for this purpose, intensity changes between the investigated areas of both density fields are taken into account. Knowing the direction and speed of thunderstorm…
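    The accept-if-cost-decreases step can be sketched on synthetic data. A single global shift, the grid size, and the shift range are simplifying assumptions for illustration, not the Lolight implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
field1 = rng.random((20, 20))                     # density at time t
field2 = np.roll(field1, (2, 1), axis=(0, 1))     # density at t + dt

# All physically allowed shifting vectors (bounded by max cell speed).
SHIFTS = [(dy, dx) for dy in range(-3, 4) for dx in range(-3, 4)]

def correlation(shift):
    """Correlation between the shifted first field and the second field."""
    shifted = np.roll(field1, shift, axis=(0, 1))
    return float(np.corrcoef(shifted.ravel(), field2.ravel())[0, 1])

def cost(shift):
    return 1.0 - correlation(shift)               # perfect match -> cost 0

# Proposal probabilities are tied to the correlation coefficient.
weights = np.array([max(correlation(s), 1e-6) for s in SHIFTS])
weights /= weights.sum()

current = SHIFTS[0]
for _ in range(300):
    proposal = SHIFTS[rng.choice(len(SHIFTS), p=weights)]
    if cost(proposal) < cost(current):            # accept only improvements
        current = proposal
```

    Because the true displacement has the highest correlation, it is proposed often and, once accepted, is never replaced.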

  5. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
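    For reference, the deterministic Hu-Washizu functional for linear elasticity (standard textbook form, not reproduced from the paper) treats displacement, strain, and stress as independent fields:

```latex
\Pi_{HW}(\mathbf{u},\boldsymbol{\varepsilon},\boldsymbol{\sigma})
  = \int_{\Omega} \Big[ \tfrac{1}{2}\,\boldsymbol{\varepsilon}:\mathbf{C}:\boldsymbol{\varepsilon}
  - \boldsymbol{\sigma}:\big(\boldsymbol{\varepsilon}-\nabla^{s}\mathbf{u}\big)
  - \mathbf{b}\cdot\mathbf{u} \Big]\, d\Omega
  - \int_{\Gamma_t} \bar{\mathbf{t}}\cdot\mathbf{u}\, d\Gamma
```

    Stationarity with respect to the stress, strain, and displacement fields recovers compatibility, the constitutive law, and equilibrium with the traction boundary condition, which is why the PHWVP can assign probabilistic distributions to each of these ingredients independently.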

  6. Probabilistic Fatigue And Flaw-Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Newlin, Laura; Ebbeler, Donald; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    Probabilistic Failure Assessment for Fatigue and Flaw Propagation (PFAFAT II) is a software package that applies probabilistic failure-assessment (PFA) methodology to model the flaw-propagation and low-cycle-fatigue failure modes of structural components. It comprises one program for probabilistic crack-growth analysis and two programs for probabilistic low-cycle-fatigue analysis; these programs perform probabilistic fatigue and crack-propagation analysis by means of Monte Carlo simulation. PFAFAT II is an extension of, rather than a replacement for, the PFAFAT software (NPO-18965). Written in FORTRAN 77.
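    A minimal sketch of Monte Carlo crack-growth analysis in the spirit of PFAFAT II: sample uncertain inputs, propagate each through a Paris-law growth model, and count failures. The constants, distributions, and geometry factor below are illustrative assumptions, not values from the actual software.

```python
import numpy as np

rng = np.random.default_rng(42)
N_SAMPLES = 10_000
CYCLES = 50_000            # service life to assess, cycles
DELTA_SIGMA = 100.0        # stress range, MPa (assumed)
A_CRIT = 0.02              # critical crack length, m (assumed)
Y = 1.12                   # geometry factor, assumed edge crack
M = 3.0                    # Paris exponent (assumed)

# Random initial flaw size and Paris coefficient for each sample.
a0 = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=N_SAMPLES)  # m
C = rng.lognormal(mean=np.log(1e-10), sigma=0.5, size=N_SAMPLES)

# Closed-form integration of the Paris law da/dN = C*(Y*dsigma*sqrt(pi*a))^M
# for M = 3 gives the number of cycles to grow from a0 to A_CRIT.
factor = (Y * DELTA_SIGMA * np.sqrt(np.pi)) ** M
n_fail = 2.0 * (a0 ** -0.5 - A_CRIT ** -0.5) / (C * factor)

# Failure probability: fraction of samples failing within the service life.
p_fail = float(np.mean(n_fail < CYCLES))
```

    The same pattern extends to cycle-by-cycle integration when the growth law has no closed form.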

  7. Risk-based decision making for terrorism applications.

    PubMed

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application. PMID:19187486
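    The multiattribute utility step can be sketched as an additive weighted score over portfolios. The attributes, weights, and single-attribute utilities below are invented for illustration, not values from the ARDA model.

```python
# Swing weights over attributes; must sum to 1 (illustrative values).
WEIGHTS = {
    "casualties_averted": 0.5,
    "asset_value_protected": 0.3,
    "mission_continuity": 0.2,
}

def portfolio_utility(scores):
    """Additive multiattribute utility: weighted sum of 0-1 attribute
    utilities (assumes mutual preferential independence)."""
    return sum(WEIGHTS[attr] * u for attr, u in scores.items())

# Two hypothetical mitigation portfolios scored on each attribute.
portfolios = {
    "barriers+patrols": {"casualties_averted": 0.8,
                         "asset_value_protected": 0.6,
                         "mission_continuity": 0.4},
    "sensors_only": {"casualties_averted": 0.5,
                     "asset_value_protected": 0.7,
                     "mission_continuity": 0.9},
}
best = max(portfolios, key=lambda p: portfolio_utility(portfolios[p]))
```

    Ranking thousands of scenario/portfolio combinations then reduces to evaluating this score for each and sorting.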

  8. Coupling risk-based remediation with innovative technology

    SciTech Connect

    Goodheart, G.F.; Teaf, C.M.; Manning, M.J.

    1998-05-01

    Tiered risk-based cleanup approaches have been effectively used at petroleum sites, pesticide sites, and other commercial/industrial facilities. For example, the Illinois Environmental Protection Agency (IEPA) has promulgated guidance for a Tiered Approach to Corrective Action Objectives (TACO) to establish site-specific remediation goals for contaminated soil and groundwater. As with many other state programs, TACO is designed to provide adequate protection of human health and the environment based on the potential risks posed by site conditions. It also incorporates site-related information that may allow more cost-effective remediation. IEPA developed TACO to give site owners/operators flexibility when formulating site-specific remediation activities, as well as to hasten property redevelopment and return sites to more productive use. Where appropriate, risk-based cleanup objectives set by TACO-type programs may be coupled with innovative remediation technologies such as air sparging, bioremediation, and soil washing.

  9. Risk based ISI application to a boiling water reactor

    SciTech Connect

    Smith, A.; Dimitrijevic, V.B.; O'Regan, P.J.

    1996-12-01

    The ASME Section XI Working Group on Implementation of Risk-Based Examination produced a code case to define risk-based selection rules that could be used for in-service inspection (ISI) of Class 1, 2, and 3 piping. To provide guidelines for practical implementation of the code case, EPRI sponsored work to develop evaluation procedures and criteria. As part of an EPRI-sponsored pilot study, these procedures have been applied to a BWR plant, and piping within the scope of the existing Section XI program has been analyzed. The results of this effort indicate that implementing risk-based ISI (RBISI) programs can significantly reduce the cost and radiation exposure associated with in-service inspections. The revised program was compared to the previous program, and a net gain in safety benefit was demonstrated.

  10. Protecting the Smart Grid: A Risk Based Approach

    SciTech Connect

    Clements, Samuel L.; Kirkham, Harold; Elizondo, Marcelo A.; Lu, Shuai

    2011-10-10

    This paper describes a risk-based approach to security that has been used for years in protecting physical assets, and shows how it could be modified to help secure the digital aspects of the smart grid and control systems in general. One suggested vulnerability of the smart grid is that mass load fluctuations could be created by rapidly switching large numbers of smart meters off and on; we investigate the plausibility of such an attack.