Science.gov

Sample records for probabilistic risk-based management

  1. Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Roberts, Mike; Rajagopalan, Balaji; Sojda, Richard S.

    2013-08-01

    Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five-step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very "sharp"). Synthetic forecasts show that a modest "sharpening" can strongly impact risk and improve skill. We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.
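
    A minimal numerical sketch of steps (3)-(5) of the framework, assuming a hypothetical fitted log-link GLM and an invented forecast distribution (the paper's actual predictors, coefficients, and thresholds are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # (3) Climate forecast ensemble: season-ahead precipitation totals (mm),
    #     drawn here from a normal distribution standing in for a real forecast.
    precip_ensemble = rng.normal(loc=250.0, scale=60.0, size=5000)

    # (4) Streamflow ensemble via a hypothetical fitted GLM with log link:
    #     log(Q_low) = b0 + b1 * precip. Coefficients are illustrative only.
    b0, b1 = 1.2, 0.004
    flow_ensemble = np.exp(b0 + b1 * precip_ensemble)

    # (5) Risk relative to a stakeholder-defined tolerance (step 1): the
    #     probability that seasonal low flow drops below an environmental
    #     flow threshold (here in cubic meters per second).
    threshold = 6.0
    print(f"P(low flow < {threshold} m^3/s) = {np.mean(flow_ensemble < threshold):.1%}")
    ```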

  2. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines).
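
    The cycle and weight codes (NEPP, WATE) cannot be reproduced here, but the gist of the probabilistic assessment can be sketched with a toy surrogate: assign distributions to uncertain component efficiencies, propagate them by Monte Carlo, and report percentiles instead of a single point design. The surrogate relation and all numbers below are assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Uncertain design variables (illustrative distributions).
    eta_comp = rng.normal(0.86, 0.01, n)   # compressor adiabatic efficiency
    eta_turb = rng.normal(0.90, 0.01, n)   # turbine adiabatic efficiency

    # Toy surrogate for specific fuel consumption: degrades as component
    # efficiencies fall (a stand-in for the full thermodynamic cycle code).
    sfc = 0.55 * (0.86 / eta_comp) ** 0.8 * (0.90 / eta_turb) ** 1.2

    # Risk-oriented summary instead of a single point design.
    for q in (5, 50, 95):
        print(f"SFC {q}th percentile: {np.percentile(sfc, q):.4f}")
    print(f"P(SFC exceeds 0.560): {np.mean(sfc > 0.560):.1%}")
    ```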

  3. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
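
    A compact sketch of the PAPA idea under invented parameter distributions: sample uncertain route permeabilities, sum the three parallel penetration routes into a total flux, and rank parameter importance with standardized regression coefficients (standing in for the paper's stepwise linear regression):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20_000

    # Route permeabilities (cm/h), lognormal to reflect parameter uncertainty:
    # (1) intercellular stratum corneum, (2) sweat ducts, (3) hair follicles.
    k_sc = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)
    k_sweat = rng.lognormal(mean=np.log(2e-4), sigma=0.7, size=n)
    k_hair = rng.lognormal(mean=np.log(5e-4), sigma=0.6, size=n)

    conc = 10.0  # applied concentration, mg/cm^3 (held fixed here)
    flux = (k_sc + k_sweat + k_hair) * conc  # parallel routes, mg/cm^2/h

    # Sensitivity: standardized regression coefficients of log-flux on
    # log-parameters (a simple stand-in for stepwise regression).
    X = np.column_stack([np.log(k_sc), np.log(k_sweat), np.log(k_hair)])
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (np.log(flux) - np.log(flux).mean()) / np.log(flux).std()
    coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    for name, c in zip(["stratum corneum", "sweat ducts", "hair follicles"], coef):
        print(f"{name}: standardized coefficient = {c:.2f}")
    ```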

  4. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    SciTech Connect

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

    This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The specific region within the scope of this study is the Fayetteville Shale Play, an unconventional, tight-formation natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play encompasses a region extending from approximately Fort Smith, AR, east to Little Rock, AR, and approximately 50 miles wide from north to south. Initial estimates for this field put it almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of a massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of a proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  5. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  6. A risk-based decision-making game relevant to water management. Try it yourself!

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; van Andel, Schalk Jan; Wood, Andy; Ramos, Maria-Helena

    2013-04-01

    Monthly or seasonal streamflow forecasts are essential to improve water planning (e.g., water allocation) and to anticipate severe events like droughts. Additionally, multipurpose water reservoirs usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, to mitigate droughts, and for flood and ecological control, among others. Given the need to take into account uncertainties at long lead times to allow for optimal risk-based decisions, the use of probabilistic forecasts in this context is inevitable. In this presentation, we will run a risk-based decision-making game in which each participant acts as a water manager. A sequence of probabilistic inflow forecasts will be presented and used to make a reservoir release decision at a monthly time step, subject to a few constraints -- e.g., an end-of-year target pool elevation, a maximum release, and a minimum downstream flow. After each decision, the actual inflow will be revealed and the consequences of the decisions made will be discussed with the participants of the session. The game will allow participants to experience firsthand the challenges of probabilistic, quantitative decision-making.
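
    A minimal sketch of the decision each participant faces, with invented volumes, costs, and constraints: choose a release that balances an end-of-period storage target, downstream demand, and a minimum-flow constraint against an uncertain inflow ensemble:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    storage, target = 500.0, 520.0          # current and target pool volume
    max_release, min_downstream = 120.0, 30.0
    demand = 70.0                           # downstream water demand
    inflows = rng.gamma(shape=4.0, scale=20.0, size=1000)  # forecast ensemble

    def expected_penalty(release):
        """Expected cost of one release choice across the inflow ensemble."""
        end_storage = storage + inflows - release
        shortfall = np.maximum(target - end_storage, 0.0)  # missed pool target
        unmet = max(demand - release, 0.0)                 # unserved demand
        env = 50.0 if release < min_downstream else 0.0    # min-flow violation
        return shortfall.mean() + 2.0 * unmet + env

    candidates = np.linspace(0.0, max_release, 121)
    best = min(candidates, key=expected_penalty)
    print(f"release {best:.0f} units -> expected penalty {expected_penalty(best):.1f}")
    ```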

  7. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
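
    The paper's central quantity is easy to state in code. A toy ranking under invented failure frequencies and consequences shows how risk typically concentrates in a few segments:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_segments = 200
    likelihood = rng.lognormal(np.log(1e-4), 1.0, n_segments)   # failures/yr
    consequence = rng.lognormal(np.log(1e5), 1.2, n_segments)   # $ per failure

    risk = likelihood * consequence                 # expected $ loss per year
    order = np.argsort(risk)[::-1]                  # rank segments by risk
    top10 = risk[order[: n_segments // 10]].sum() / risk.sum()
    print(f"share of total risk carried by the top 10% of segments: {top10:.0%}")
    ```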

  8. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.

  9. Promoting justified risk-based decisions in contaminated land management.

    PubMed

    Reinikainen, Jussi; Sorvari, Jaana

    2016-09-01

    Decision making and regulatory policies on contaminated land management (CLM) are commonly governed by risk assessment. Risk assessment, thus, has to comply with legislation, but also provide valid information in terms of actual risks to correctly focus the potentially required measures and allocate the available resources. Hence, reliable risk assessment is a prerequisite for justified and sustainable risk management. This paper gives an introduction to the Finnish risk-based regulatory framework, outlines the challenges within the policies and the practice and provides an overview of the new guidance document to promote risk-based and sustainable CLM. We argue that the current risk assessment approaches in the policy frameworks are not necessarily efficient enough in supporting justified risk-based decisions. One of the main reasons for this is the excessive emphasis put on conservative risk assessments and on generic guideline values without contributing to their appropriate application. This paper presents how some of the challenges in risk-based decision making have been tackled in the Finnish regulatory framework on contaminated land. We believe that our study will also stimulate interest with regard to policy frameworks in other countries. PMID:26767620

  10. Science, science policy, and risk-based management

    SciTech Connect

    Midgley, L.P.

    1997-09-01

    Recent national awareness of the economic infeasibility of remediating hazardous waste sites to background levels has sparked increased interest in the role of science policy in the environmental risk assessment and risk management process. As individual states develop guidelines for addressing environmental risks at hazardous waste sites, the role of science policy decisions and uncertainty must be carefully evaluated to achieve long-term environmental goals and solutions that are economically feasible and optimally beneficial to all stakeholders. An amendment to Oregon Revised Statute 465.315 establishes policy, and the Utah Cleanup Action and Risk-Based Closure Standards set requirements, for risk-based cleanup and closure at sites where remediation or removal of hazardous constituents to background levels will not be achieved. This paper discusses the difficulties in effectively integrating potential current and future impacts on human health and the environment, technical feasibility, economic considerations, and political realities into environmental policy and standards, using these references as models. It considers the role of both objective and subjective criteria in the risk-based closure and management processes and makes suggestions for improving the system by which these sites may be reclaimed.

  11. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  12. Application of probabilistic safety assessment models to risk-based inspection of piping

    SciTech Connect

    Chapman, J.

    1996-12-01

    From the beginning, one of the most useful applications of Probabilistic Safety Assessment (PSA) has been its use in evaluating the risk importance of changes to plant design, operations, or other plant conditions. Risk importance measures the impact of a change on the risk, where risk is defined as a combination of the likelihood of failure and the consequence of the failure. The consequence can be safety system unavailability, core melt frequency, early release, or various other consequence measures. The goal in this PSA application is to evaluate the risk importance of an in-service inspection (ISI) process, as applied to plant piping systems. Two approaches can be taken in this evaluation, the current PSA approach or the blended approach; both are discussed here.
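
    A hypothetical two-cut-set example of the risk importance idea (not taken from the paper): with risk approximated as the sum of minimal cut sets, importance measures such as Risk Achievement Worth (RAW) and Fussell-Vesely (FV) follow from re-evaluating the risk with a component failed or perfect:

    ```python
    # Mini-PSA: two redundant pumps (A, B) plus a bypass valve V, giving
    # minimal cut sets {A, B} and {V}. All probabilities are invented.
    pA, pB, pV = 1e-2, 1e-2, 1e-4

    def top_event(pa, pb, pv):
        # Rare-event approximation: sum over the two minimal cut sets.
        return pa * pb + pv

    base = top_event(pA, pB, pV)
    raw_A = top_event(1.0, pB, pV) / base           # Risk Achievement Worth
    fv_A = (base - top_event(0.0, pB, pV)) / base   # Fussell-Vesely importance
    print(f"baseline risk = {base:.2e}")
    print(f"pump A: RAW = {raw_A:.1f}, FV = {fv_A:.2f}")
    ```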

  13. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  14. Risk-based principles for defining and managing water security.

    PubMed

    Hall, Jim; Borgomeo, Edoardo

    2013-11-13

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  15. Towards risk-based drought management in the Netherlands: quantifying the welfare effects of water shortage

    NASA Astrophysics Data System (ADS)

    van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein

    2016-04-01

    It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach and the economic valuation of water shortage for various water users were explored in a study for the Dutch Government. First, a historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained in integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. Furthermore, for some
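
    The probabilistic combination of drought hazard and impact can be illustrated with a standard expected-annual-damage integration over return periods; the damage figures below are invented:

    ```python
    import numpy as np

    # Drought events of increasing severity with their return periods and
    # estimated welfare losses (illustrative numbers only).
    return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])  # years
    damages = np.array([0.0, 5.0, 20.0, 60.0, 120.0, 200.0])        # M EUR

    exceedance = 1.0 / return_periods     # annual exceedance probability
    # Integrate damage over exceedance probability (x must be increasing).
    ead = np.trapz(damages[::-1], exceedance[::-1])
    print(f"expected annual damage ~ {ead:.1f} M EUR/yr")
    ```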

  16. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and perform rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate between uncertainty of contaminant location and actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing, we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  17. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  18. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  19. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  20. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and to allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform analysis over long-term horizons - space and time scales are somewhat linked; (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with scale, which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time-, theme- and space-wise) is widened, intrinsic parameters of the project tend to carry less weight. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can
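
    A toy version of such a framework, with all figures assumed: Monte Carlo the net present value of a DRM investment where disasters arrive as a Poisson process, mitigation scales losses down, and the output is a risk profile rather than a single number:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n, years, rate = 20_000, 30, 0.05
    cost, mitigation = 40.0, 0.5      # upfront cost; fraction of loss avoided

    discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)
    npv = np.empty(n)
    for i in range(n):
        events = rng.poisson(lam=0.2, size=years)          # disasters per year
        severities = rng.lognormal(np.log(20.0), 1.0, size=years)
        losses = events * severities                       # annual loss, M USD
        npv[i] = np.sum(mitigation * losses * discount) - cost

    print(f"P(NPV > 0) = {np.mean(npv > 0):.0%}")
    print(f"5th/95th percentile NPV: {np.percentile(npv, 5):.0f} / "
          f"{np.percentile(npv, 95):.0f} M USD")
    ```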

  1. The effects of climate model similarity on probabilistic climate projections and the implications for local, risk-based adaptation planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, Scott; McCrary, Rachel; Mearns, Linda O.; Brown, Casey

    2015-06-01

    Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate-related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.
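
    The core statistical point can be checked with the standard variance formula for an equicorrelated ensemble mean: treating n models as independent when they in fact share genealogy (pairwise correlation rho) understates the uncertainty. The numbers below are illustrative, not the paper's estimates:

    ```python
    import numpy as np

    # Var(mean) = sigma^2 * (1 + (n - 1) * rho) / n for n equicorrelated
    # ensemble members with common variance sigma^2.
    n_models, sigma, rho = 20, 1.0, 0.4
    sd_indep = np.sqrt(sigma**2 / n_models)
    sd_corr = np.sqrt(sigma**2 * (1 + (n_models - 1) * rho) / n_models)
    print(f"st. dev. of multimodel mean, assuming independence: {sd_indep:.2f}")
    print(f"st. dev. of multimodel mean, with rho = {rho}:        {sd_corr:.2f}")
    ```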

  2. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, POD, and repair/replacement strategies. Since MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
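
    The reuse property that makes RPI efficient can be sketched as follows, with invented crack-growth and POD models: baseline histories are simulated once, and each maintenance plan is then evaluated by reweighting the same histories with the probability of missing every inspection:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, years, a_crit = 50_000, 25, 25.0        # histories, horizon, mm

    # Baseline (maintenance-free) crack histories: exponential growth with
    # random initial flaw size and growth rate (illustrative model).
    a0 = rng.lognormal(np.log(0.5), 0.4, n)
    g = rng.lognormal(np.log(0.12), 0.25, n)
    t = np.arange(1, years + 1)
    cracks = a0[:, None] * np.exp(g[:, None] * t[None, :])   # shape (n, years)

    def pod(a):
        """Probability of detecting a crack of size a (illustrative logistic)."""
        return 1.0 / (1.0 + np.exp(-(a - 5.0)))

    def prob_failure(inspection_years):
        """P(crack exceeds a_crit with no prior detection), reusing histories."""
        p_miss_all = np.ones(n)
        for yr in inspection_years:
            p_miss_all *= 1.0 - pod(cracks[:, yr - 1])
        failed = cracks[:, -1] > a_crit
        return np.mean(failed * p_miss_all)

    # Many maintenance plans, one Monte Carlo run of crack histories.
    for plan in ([], [12], [8, 16], [5, 10, 15, 20]):
        print(f"inspections at {plan or 'none'}: "
              f"P(failure) = {prob_failure(plan):.4f}")
    ```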

  3. A risk-based probabilistic framework to estimate the endpoint of remediation: Concentration rebound by rate-limited mass transfer

    NASA Astrophysics Data System (ADS)

    Barros, F. P. J.; Fernàndez-Garcia, D.; Bolster, D.; Sanchez-Vila, X.

    2013-04-01

    Aquifer remediation is a challenging problem with environmental, social, and economic implications. As a general rule, pumping proceeds until the concentration of the target substance within the pumped water lies below a prespecified value. In this paper we estimate the a priori potential failure of the endpoint of remediation due to a rebound of concentrations driven by back diffusion. In many cases, it has been observed that once pumping ceases, a rebound in the concentration at the well takes place. For this reason, administrative approaches are rather conservative, and pumping is forced to last much longer than initially expected. While a number of physical and chemical processes might account for the presence of rebounding, we focus here on diffusion from zones of low water mobility into zones of high mobility. In this work we look specifically at the concentration rebound when pumping is discontinued while accounting for multiple mass transfer processes occurring at different time scales and parametric uncertainty. We aim to develop a risk-based optimal operation methodology that is capable of estimating the endpoint of remediation based on aquifer parameters characterizing the heterogeneous medium as well as pumping rate and initial size of the polluted area.
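
    A minimal two-domain (mobile/immobile) sketch of the rebound mechanism, with illustrative rate constants: pumping flushes the mobile zone, and after shutdown rate-limited mass transfer drives concentrations back up:

    ```python
    # Forward-Euler integration of a two-box mass-transfer model.
    alpha = 0.01      # mobile-immobile mass-transfer rate, 1/day
    flush = 0.05      # pumping-driven removal from the mobile zone, 1/day
    cm, cim = 1.0, 1.0
    dt, t_stop, t_end = 1.0, 300, 1000

    history = []
    for step in range(int(t_end / dt)):
        t = step * dt
        pumping = flush if t < t_stop else 0.0
        exchange = alpha * (cim - cm)           # immobile -> mobile transfer
        cm += dt * (exchange - pumping * cm)
        cim += dt * (-exchange)
        history.append((t, cm))

    c_at_stop = min(c for t, c in history if t <= t_stop)
    c_final = history[-1][1]
    print(f"mobile conc. at shutdown ~ {c_at_stop:.3f}, "
          f"after rebound ~ {c_final:.3f}")
    ```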

  4. Method for Water Management Considering Long-term Probabilistic Forecasts

    NASA Astrophysics Data System (ADS)

    Hwang, J.; Kang, J.; Suh, A. S.

    2015-12-01

    This research is aimed at predicting the monthly inflow of the Andong-dam basin in South Korea using long-term probabilistic forecasts, in order to apply long-term forecasts to water management. Forecast Cumulative Distribution Functions (CDFs) of monthly precipitation are plotted by combining the range of monthly precipitation, based on a Probability Density Function (PDF) fitted to past data, with the probabilistic forecasts in each category. Ensembles of inflow are estimated by entering generated ensembles of precipitation based on the CDFs into the 'abcd' water budget model. The bias and RMSE between past-data averages and observed inflow are compared with those of the forecast ensembles. In our results, the bias and RMSE of average precipitation in the forecast ensemble are larger than in past data, whereas the average inflow in the forecast ensemble is smaller than in past data. This result could serve as reference data for applying long-term forecasts to water management, given the limited number of forecast data available for verification and the differences between the Andong-dam basin and the forecast regions. This research is significant in suggesting a method of applying probabilistic information on climate variables from long-term forecasts to water management in Korea. Original data of a climate model that produces long-term probabilistic forecasts should be verified directly as input data of a water budget model in the future, so that a more scientific response in water management against the uncertainty of climate change can be reached.

  5. Probabilistic structural risk assessment for fatigue management using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Shiao, Michael; Wu, Y.-T. J.; Ghoshal, Anindya; Ayers, James; Le, Dy

    2012-04-01

    The primary goal of Army Prognostics & Diagnostics is to develop real-time state awareness technologies for primary structural components. In fatigue-critical structural maintenance, the probabilistic structural risk assessment (PSRA) methodology for fatigue life management using conventional nondestructive investigation (NDI) has been developed based on the assumption of independent inspection outcomes. When using the emerging structural health monitoring (SHM) systems with in situ sensors, however, the independence assumption no longer holds, and the existing PSRA methodology must be modified. The major issues currently under investigation are how to properly address the correlated inspection outcomes from the same sensors on the same components and how to quantify their effect in the SHM-based PSRA framework. This paper describes a new SHM-based PSRA framework with a proper modeling of correlations among multiple inspection outcomes of the same structural component. The framework and the associated probabilistic algorithms are based on the principles of fatigue damage progression, NDI reliability assessment and structural reliability methods. The core of this framework is an innovative, computationally efficient, probabilistic method RPI (Recursive Probability Integration) for damage tolerance and risk-based maintenance planning. RPI can incorporate a wide range of uncertainties including material properties, repair quality, crack growth related parameters, loads, and probability of detection. The RPI algorithm for SHM application is derived in detail. The effects of correlation strength and inspection frequency on the overall probability of missing all detections are also studied and discussed.
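
    The effect of correlated inspection outcomes on the probability of missing all detections can be sketched with an equicorrelated Gaussian (copula-style) latent model; the POD, correlation strength, and inspection count below are invented:

    ```python
    import numpy as np
    from statistics import NormalDist

    rng = np.random.default_rng(9)
    n, n_insp, pod, rho = 200_000, 4, 0.6, 0.7

    # Equicorrelated latent normals built from a shared (sensor) factor.
    z_common = rng.standard_normal((n, 1))
    z_indiv = rng.standard_normal((n, n_insp))
    z = np.sqrt(rho) * z_common + np.sqrt(1.0 - rho) * z_indiv

    # A detection occurs when the latent variable falls below the POD
    # quantile, so each single inspection detects with probability pod.
    detect = z < NormalDist().inv_cdf(pod)
    p_miss_corr = np.mean(~detect.any(axis=1))

    print(f"independent outcomes:   P(miss all) = {(1 - pod) ** n_insp:.4f}")
    print(f"correlated (rho={rho}): P(miss all) = {p_miss_corr:.4f}")
    ```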

  6. A risk-based framework for water resource management under changing water availability, policy options, and irrigation expansion

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2016-08-01

    resemble nonlinear functions of changes in individual drivers. The proposed risk-based framework can be linked to any water resource system assessment scheme to quantify the risk in system performance under changing conditions, with the larger goal of proposing alternative policy options to address future uncertainties and management concerns.

  7. The role of risk-based prioritization in total quality management

    SciTech Connect

    Bennett, C.T.

    1994-10-01

    The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper argues for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.

  8. Developing risk-based screening guidelines for dioxin management at a Melbourne sewage treatment plant.

    PubMed

    Gorman, J; Mival, K; Wright, J; Howell, M

    2003-01-01

    Dioxin is a generic term used to refer to the congeners of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs). The principal source of dioxin production is generally thought to be from unintended by-products of waste incineration, but dioxins are also naturally formed from volcanic activity and forest fires (WHO, 1998). Estimates of dioxin emissions in Australia suggest that approximately 75% of the total PCDD and PCDF emissions derive from prescribed burning and wild bushfires. Currently, no screening guidelines for dioxins within soils are available in Australia. This paper presents the general approach and results of a human health risk-based assessment performed by URS Australia in 2001 to develop site specific reference criteria for remediation of a former sewage plant in Melbourne. Risk-based soil remediation concentrations for dioxins at the sewage treatment plant site were developed using tolerable daily intake values of 4, 2 and 1 pg/kg/day. The potentially significant exposure pathways and processes for exposure to dioxins were identified and risk-based soil concentrations derived in accordance with the general method framework presented in the National Environmental Protection Measure (Assessment of Site Contamination). The derived dioxin reference criteria were used to develop an effective risk management program focussed on those conditions that present the greatest contribution to overall risk to human health. PMID:12862210

  9. Waste management project's alternatives: A risk-based multi-criteria assessment (RBMCA) approach

    SciTech Connect

    Karmperis, Athanasios C.; Sotirchos, Anastasios; Aravossis, Konstantinos; Tatsiopoulos, Ilias P.

    2012-01-15

    Highlights: • We examine the evaluation of a waste management project's alternatives. • We present a novel risk-based multi-criteria assessment (RBMCA) approach. • In the RBMCA, the evaluation criteria are based on the quantitative risk analysis of the project's alternatives. • Correlation between the criteria weight values and the decision makers' risk preferences is examined. • Preference for the multi-criteria over the one-criterion evaluation process is discussed. - Abstract: This paper examines the evaluation of a waste management project's alternatives through a quantitative risk analysis. Cost benefit analysis is a widely used method, in which the investments are mainly assessed through the calculation of their evaluation indicators, namely benefit/cost (B/C) ratios, as well as the quantification of their financial, technical, environmental and social risks. Herein, a novel approach in the form of risk-based multi-criteria assessment (RBMCA) is introduced, which can be used by decision makers in order to select the optimum alternative of a waste management project. Specifically, decision makers use multiple criteria, which are based on the cumulative probability distribution functions of the alternatives' B/C ratios. The RBMCA system is used for the evaluation of a waste incineration project's alternatives, where the correlation between the criteria weight values and the decision makers' risk preferences is analyzed and useful conclusions are discussed.
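
    A minimal sketch of the RBMCA scoring step, with invented B/C distributions and weights: criteria are read off the simulated B/C distribution of each alternative and combined with weights expressing the decision makers' risk preference:

    ```python
    import numpy as np

    rng = np.random.default_rng(13)
    n = 50_000
    alternatives = {
        "incineration A": rng.lognormal(np.log(1.15), 0.20, n),  # B/C samples
        "incineration B": rng.lognormal(np.log(1.30), 0.45, n),
    }

    weights = {"p_viable": 0.6, "median": 0.4}   # risk-preference weights
    for name, bc in alternatives.items():
        criteria = {"p_viable": np.mean(bc > 1.0),       # P(B/C > 1)
                    "median": np.median(bc) / 2.0}       # scaled central value
        score = sum(weights[k] * v for k, v in criteria.items())
        print(f"{name}: P(B/C>1) = {criteria['p_viable']:.2f}, "
              f"median B/C = {np.median(bc):.2f}, weighted score = {score:.2f}")
    ```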

  10. Achievements of risk-based produced water management on the Norwegian continental shelf (2002-2008).

    PubMed

    Smit, Mathijs G D; Frost, Tone K; Johnsen, Ståle

    2011-10-01

    In 1996, the Norwegian government issued a White Paper requiring the Norwegian oil industry to reach the goal of "zero discharge" for the marine environment by 2005. To achieve this goal, the Norwegian oil and gas industry initiated the Zero Discharge Programme for discharges of produced formation water from the hydrocarbon-containing reservoir, in close communication with regulators. The environmental impact factor (EIF), a risk-based management tool, was developed by the industry to quantify and document the environmental risks from produced water discharges. The EIF represents a volume of recipient water containing concentrations of one or more substances to a level exceeding a generic threshold for ecotoxicological effects. In addition, this tool facilitates the identification and selection of cost-effective risk mitigation measures. The EIF tool has been used by all operators on the Norwegian continental shelf since 2002 to report progress toward the goal of "zero discharge," interpreted as "zero harmful discharges," to the regulators. Even though produced water volumes have increased by approximately 30% between 2002 and 2008 on the Norwegian continental shelf, the total environmental risk from produced water discharges expressed by the summed EIF for all installations has been reduced by approximately 55%. The total amount of oil discharged to the sea has been reduced by 18% over the period 2000 to 2006. The experience from the Zero Discharge Programme shows that a risk-based approach is an excellent working tool to reduce discharges of potential harmful substances from offshore oil and gas installations. PMID:21594986
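
    The EIF concept (a water volume in which predicted exposure exceeds a no-effect threshold) can be sketched with a toy dilution field; the plume model, PNEC value, and grid below are assumptions, not the industry tool:

    ```python
    import numpy as np

    pnec = 0.1                    # ug/L, generic no-effect threshold
    cell_volume = 1.0e3           # m^3 of recipient water per grid cell

    # Hypothetical plume: concentration decays with distance from discharge.
    x = np.arange(1, 501)         # cell index along the plume axis
    concentration = 50.0 / x      # ug/L, simple 1/x dilution

    exceeding = concentration > pnec
    eif_volume = exceeding.sum() * cell_volume
    print(f"water volume above PNEC: {eif_volume:.0f} m^3 "
          f"({exceeding.sum()} of {x.size} cells)")
    ```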

  11. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    SciTech Connect

    Huq, M; Palta, J; Dunscombe, P; Thomadsen, B

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish these goals, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are caused more often by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform a FMEA analysis for a given process. Learn what
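
    As a small illustration of the FMEA step in such a program, failure modes can be ranked by risk priority number, RPN = O x S x D (occurrence, severity, lack of detectability, each scored 1-10); the modes and scores below are invented examples, not TG100 values:

    ```python
    failure_modes = [
        # (process step / failure mode,        O,  S,  D)
        ("wrong CT dataset imported",          3,  9,  4),
        ("contour transferred to wrong plan",  2,  8,  6),
        ("dose miscalibration at machine QA",  1, 10,  2),
        ("treatment site mix-up at console",   2,  9,  3),
    ]

    # Rank by RPN so the QM effort targets the highest-risk steps first.
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for mode, o, s, d in ranked:
        print(f"RPN {o * s * d:3d}  <- {mode}  (O={o}, S={s}, D={d})")
    ```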

  12. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals of synthetic origin, or deriving from a natural source, that have recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interactions with, and toxicological impacts on, receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example when released to the aquatic environment through direct discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment, and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health, on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management and complementary strategies. PMID:27062002

  13. Risk-based Inspection Scheduling Planning for Intelligent Agent in the Autonomous Fault Management

    SciTech Connect

    Hari Nugroho, Djoko; Sudarno

    2010-06-22

    This paper develops an autonomous fault-management approach focused on inspection schedule planning, implemented for an advanced small nuclear reactor without on-site refuelling, to assure safety without human intervention. The inspection schedule is planned optimally with a risk-based approach that balances two important constraints related to the risk of the action plan: failure probability and shortest path. Performance was demonstrated by computer simulation using the locations and failure probabilities of the DURESS components. It could be concluded that the first priority for inspection was flow sensor FB2, which had the largest comparison value (0.104233) among the components. The route would then visit, sequentially, FB1, FA2, FA1, FB, FA, VB, pump B, VA, pump A, VB2, VB1, VA2, VA1, reservoir 2, reservoir 1, FR2, and FR1. The planned movement route can be transferred to drive the robot arm acting as the intelligent agent.

  14. National Drought Policy: Shifting the Paradigm from Crisis to Risk-based Management

    NASA Astrophysics Data System (ADS)

    Wilhite, D. A.; Sivakumar, M. K.; Stefanski, R.

    2011-12-01

    Drought is a normal part of climate for virtually all of the world's climatic regimes. To better address the risks associated with this hazard and societal vulnerability, there must be a dramatic paradigm shift in our approach to drought management in the coming decade in the light of the increasing frequency of droughts and projections of increased severity and duration of these events in the future for many regions, especially in the developing world. Addressing this challenge will require an improved awareness of drought as a natural hazard, the establishment of integrated drought monitoring and early warning systems, a higher level of preparedness that fully incorporates risk-based management, and the adoption of national drought policies that are directed at increasing the coping capacity and resilience of populations to future drought episodes. The World Meteorological Organization (WMO), in partnership with other United Nations agencies, the National Drought Mitigation Center at the University of Nebraska, NOAA, the U.S. Department of Agriculture, and other partners, is currently launching a program to organize a High Level Meeting on National Drought Policy (HMNDP) in March 2013 to encourage the development of national drought policies through the development of a compendium of key policy elements. The key objectives of a national drought policy are to: (1) encourage vulnerable economic sectors and population groups to adopt self-reliant measures that promote risk management; (2) promote sustainable use of the agricultural and natural resource base; and (3) facilitate early recovery from drought through actions consistent with national drought policy objectives. The key elements of a drought policy framework are policy and governance, including political will; addressing risk and improving early warnings, including vulnerability analysis, impact assessment, and communication; mitigation and preparedness, including the application of effective and

  15. Toward a holistic and risk-based management of European river basins.

    PubMed

    Brack, Werner; Apitz, Sabine E; Borchardt, Dietrich; Brils, Jos; Cardoso, Ana Cristina; Foekema, Edwin M; van Gils, Jos; Jansen, Stefan; Harris, Bob; Hein, Michaela; Heise, Susanne; Hellsten, Seppo; de Maagd, P Gert-Jan; Müller, Dietmar; Panov, Vadim E; Posthuma, Leo; Quevauviller, Philippe; Verdonschot, Piet F M; von der Ohe, Peter C

    2009-01-01

    The European Union Water Framework Directive (WFD) requires a good chemical and ecological status of European surface waters by 2015. Integrated, risk-based management of river basins is presumed to be an appropriate approach to achieve that goal. The approach of focusing on distinct hazardous substances in surface waters together with investment in best available technology for treatment of industrial and domestic effluents was successful in significantly reducing excessive contamination of several European river basins. The use of the concept of chemical status in the WFD is based on this experience and focuses on chemicals for which there is a general agreement that they should be phased out. However, the chemical status, based primarily on a list of 33 priority substances and 8 priority hazardous substances, considers only a small portion of possible toxicants and does not address all causes of ecotoxicological stress in general. Recommendations for further development of this concept are 1) to focus on river basin-specific toxicants, 2) to regularly update priority lists with a focus on emerging toxicants, 3) to consider state-of-the-art mixture toxicity concepts and bioavailability to link chemical and ecological status, and 4) to add a short list of priority effects and to develop environmental quality standards for these effects. The ecological status reflected by ecological quality ratios is a leading principle of the WFD. While on the European scale the improvement of hydromorphological conditions and control of eutrophication are crucial to achieve a good ecological status, on a local and regional scale managers have to deal with multiple pressures. On this scale, toxic pollution may play an important role. Strategic research is necessary 1) to identify dominant pressures, 2) to predict multistressor effects, 3) to develop stressor- and type-specific metrics of pressures, and 4) to better understand the ecology of recovery. The concept of reference

  16. Seasonal Water Resources Management and Probabilistic Operations Forecast in the San Juan Basin

    NASA Astrophysics Data System (ADS)

    Daugherty, L.; Zagona, E. A.; Rajagopalan, B.; Grantz, K.; Miller, W. P.; Werner, K.

    2013-12-01

    within the NWS Community Hydrologic Prediction System (CHPS) to produce an ensemble streamflow forecast. The ensemble traces are used to drive the MTOM, with the initial conditions of the water resources system and the operating rules, to provide ensembles of water resources management and operation metrics. We applied this integrated approach to forecasting in the San Juan River Basin (SJRB) using a portion of the Colorado River MTOM. The management objectives in the basin include water supply for irrigation, tribal water rights, environmental flows, and flood control. The spring streamflow ensembles were issued at four different lead times on the first of each month from January - April, and were incorporated into the MTOM for the period 2002-2010. Ensembles of operational performance metrics for the SJRB, such as Navajo Reservoir releases, end-of-water-year storage, environmental flows, and water supply for irrigation, were computed and their skill evaluated against variables obtained in a baseline simulation using historical streamflow. Preliminary results indicate that the probabilistic forecasts thus obtained may show increased skill, especially at long lead times (e.g., on Jan 1st and Feb 1st). The probabilistic information for water management variables quantifies the risk of system vulnerabilities and thus enables risk-based, efficient planning and operations.

  17. Dynamic Resource Management in Clouds: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Gonçalves, Paulo; Roy, Shubhabrata; Begin, Thomas; Loiseau, Patrick

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. The cost of resources varies significantly depending on the configuration in which they are used, so efficient management of resources is of prime interest to both Cloud Providers and Cloud Users. In this work we propose a probabilistic resource provisioning approach that can be exploited as the input to a dynamic resource management scheme. Using a Video on Demand (VoD) use case to justify our claims, we propose an analytical model, inspired by standard models developed for epidemic spreading, to represent sudden and intense workload variations. We show that the resulting model satisfies a Large Deviation Principle that statistically characterizes extreme rare events, such as those produced by “buzz/flash crowd effects” that may cause workload overflow in the VoD context. This analysis provides valuable insight into the abnormal behaviors that systems can be expected to exhibit. We exploit the information obtained via the Large Deviation Principle for the proposed VoD use case to define policies (Service Level Agreements). We believe these policies for elastic resource provisioning and usage may be of interest to all stakeholders in the emerging context of cloud networking.
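
    The record does not give the model's equations; the sketch below assumes a simple stochastic SIR-style process in which "infected" users are concurrent viewers, which is one plausible reading of an epidemic-inspired workload model. All rates and sizes are illustrative:

        import numpy as np

        def buzz_workload(n_users=10000, beta=3e-4, gamma=0.05, steps=500, seed=1):
            """Toy epidemic-style workload: 'infected' users are active viewers.

            beta  : word-of-mouth contact rate (drives flash-crowd growth)
            gamma : rate at which active viewers lose interest
            Returns the number of concurrent requests at each time step.
            """
            rng = np.random.default_rng(seed)
            susceptible, infected = n_users - 10, 10
            load = []
            for _ in range(steps):
                new = rng.binomial(susceptible, 1 - np.exp(-beta * infected))
                recovered = rng.binomial(infected, gamma)
                susceptible -= new
                infected += new - recovered
                load.append(infected)
            return np.array(load)

        load = buzz_workload()
        print("peak concurrent demand:", load.max())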

  18. A Risk-Based Approach to Evaluating Wildlife Demographics for Management in a Changing Climate: A Case Study of the Lewis's Woodpecker

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyère, Cindy L.; Newlon, Karen R.

    2012-12-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  19. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    USGS Publications Warehouse

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyere, Cindy L.; Newlon, Karen R.

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  20. The future of monitoring in clinical research - a holistic approach: linking risk-based monitoring with quality management principles.

    PubMed

    Ansmann, Eva B; Hecht, Arthur; Henn, Doris K; Leptien, Sabine; Stelzer, Hans Günther

    2013-01-01

    For several years, risk-based monitoring has been the new "magic bullet" for improvement in clinical research. Many authors in clinical research, ranging from industry and academia to authorities, are keen to demonstrate better monitoring efficiency by reducing monitoring visits, monitoring time on site, monitoring costs, and so on, always arguing from risk-based monitoring principles. Mostly forgotten is the fact that the use of risk-based monitoring is only adequate if all mandatory prerequisites at the site, for the monitor, and for the sponsor are fulfilled. Based on the relevant chapter in ICH GCP (International Conference on Harmonisation of technical requirements for registration of pharmaceuticals for human use - Good Clinical Practice), this publication takes a holistic approach by identifying and describing the requirements for future monitoring and the use of risk-based monitoring. As the authors are operational managers as well as QA (Quality Assurance) experts, both aspects are represented, with the aim of arriving at efficient, high-quality approaches to future monitoring according to ICH GCP. PMID:23382708

  1. How to Quantify Sustainable Development: A Risk-Based Approach to Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, sustainable development has proved difficult to quantify. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability in water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators: reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a subroutine estimating the risk-based indicators via the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model via Least Cost and Uniform Treatment approaches in order to find the optimal level of pollutant control costs given a desired reliability value, addressing DO at two different targets. The risk-based approach developed herein, particularly via the FORM technique, appears to be an efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant improvements in sustainability values are possible by dedicating funds to treatment and strict pollution controls, together with technical advances and changes in current attitudes toward environmental protection.
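
    As a companion to the FORM/MCS machinery mentioned above, here is a minimal Monte Carlo sketch of the three risk-based indicators using the common Hashimoto-style definitions; it is a plain sampling stand-in rather than FORM, and the simulated DO series and the 5 mg/L standard are illustrative assumptions:

        import numpy as np

        def risk_indicators(series, threshold):
            """Hashimoto-style indicators for a simulated DO time series.

            reliability  : fraction of periods meeting the DO standard
            resilience   : P(next period satisfactory | current period failed)
            vulnerability: mean DO deficit during failure periods
            """
            fail = series < threshold
            reliability = 1.0 - fail.mean()
            recoveries = fail[:-1] & ~fail[1:]
            resilience = recoveries.sum() / max(fail[:-1].sum(), 1)
            vulnerability = (threshold - series[fail]).mean() if fail.any() else 0.0
            return reliability, resilience, vulnerability

        # Hypothetical Monte Carlo: DO responses sampled around a simulated mean
        rng = np.random.default_rng(42)
        do_series = rng.normal(loc=6.0, scale=1.2, size=10000)  # mg/L, illustrative
        print(risk_indicators(do_series, threshold=5.0))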

  2. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  3. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate how the type of operator and the risk attitudes used to satisfy the relevant criteria across the sub-basins ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA to evaluate various BMP scenarios and determine the best solution for both the stakeholders and overall stream health. PMID:26734840
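
    A minimal illustration of the difference between the SAW and OWA operators named above; the criterion scores and weight vectors are made-up numbers, not values from the study:

        import numpy as np

        def saw(scores, weights):
            """Simple additive weighting: weights attach to named criteria."""
            return scores @ weights

        def owa(scores, weights):
            """Ordered weighted averaging: weights attach to rank positions.
            Scores are sorted in descending order first, so the weight
            vector encodes the decision-maker's risk attitude."""
            return np.sort(scores)[::-1] @ weights

        criteria_scores = np.array([0.8, 0.4, 0.6])  # environmental, economic, social
        w = np.array([0.5, 0.3, 0.2])
        print("SAW :", saw(criteria_scores, w))
        # "Pessimistic" OWA putting most weight on the worst-performing criterion:
        print("OWA :", owa(criteria_scores, np.array([0.2, 0.3, 0.5])))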

  4. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    NASA Astrophysics Data System (ADS)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of subsurface water resources is a key component of this human-nature interaction system, needed to evaluate the tradeoffs between irrigation for agriculture and the low-flow requirements of ecosystems. This work presents an optimization scheme built upon systems reliability-based design optimization (SRBDO) analysis, which accounts for prescribed probabilistic constraint evaluation. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO has been developed in the aerospace industry and extensively applied in the field of structural engineering, but has seen only limited application in the field of hydrology. SRBDO uses probability theory to model uncertainty and to determine the probability of failure by solving a mathematical nonlinear programming problem. Furthermore, reliability-based design optimization provides complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both random variables and probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable on the total
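
    The record does not reproduce the SRBDO formulation itself; the sketch below is a plain Monte Carlo stand-in that evaluates the kind of probabilistic low-flow constraint at issue, with all distributions and numbers invented for illustration:

        import numpy as np

        def failure_probability(pump_rate, n=100000, seed=7):
            """Toy probabilistic constraint: baseflow depletion must leave at
            least q_min in the stream. Natural streamflow and the fraction of
            pumping captured from the stream are random; all values are
            illustrative only."""
            rng = np.random.default_rng(seed)
            q_min = 0.5                                  # m3/s, ecological low-flow need
            natural_flow = rng.lognormal(0.0, 0.4, n)    # m3/s
            depletion = rng.beta(4, 2, n)                # fraction taken from stream
            residual = natural_flow - depletion * pump_rate
            return np.mean(residual < q_min)

        # Screen pumping rates against a 5% allowable failure probability:
        for q in np.arange(0.1, 1.0, 0.1):
            print(f"pumping {q:.1f} m3/s -> P(failure) = {failure_probability(q):.3f}")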

  5. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as the increased environmental consequences of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites to manage water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability. PMID:21797262

  6. Risk based bridge data collection and asset management and the role of structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Bush, Simon; Henning, Theunis; McCarten, Peter

    2011-04-01

    Bridges are critical to the operation and functionality of whole road networks. It is therefore essential that specific data be collected regarding bridge asset condition and performance, as this allows proactive management of the assets and associated risks, and more accurate short- and long-term financial planning. This paper proposes and discusses a strategy for collection of data on bridge condition and performance. Recognizing that risk management is the primary driver of asset management, the proposed strategy prioritizes bridges for levels of data collection, including core, intermediate, and advanced. Individual bridges are seen as parts of wider networks, and bridge risk and criticality assessment emphasizes the risk of bridge failure or underperformance in the network context. The paper demonstrates how more reliable and detailed data can assist in managing network and bridge risks, and provides a rationale for applying higher data collection levels to bridges characterized by higher risk and criticality. As bridge risk and/or criticality increases, planned and proactive integration of structural health monitoring (SHM) data into asset management is outlined. An example of bridge prioritization for data collection, using several bridges taken from a national highway network and an existing risk and criticality scoring methodology, is provided. The paper concludes with a discussion of the role of SHM in data collection for bridge asset management and where SHM can make the largest impact.

  7. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred severely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the past. These four townships are the main fishpond cultivation districts in Taiwan. Groundwater became the main water supply because of limited surface water. Overpumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater used to that of pond water for fish farming, over the range of target cancer risk (TR), especially magnitudes of 10^-4 to 10^-6. Our results reveal that the 50th percentile of groundwater DRs can serve as a regulation standard for fish farm groundwater management at a TR of 10^-6. For a TR of 5 × 10^-6, we suggest using the 75th percentile of DR for groundwater management. For a TR of 10^-5, we suggest using the 95th percentile of DR for groundwater management in fish farm areas. For TRs exceeding 5 × 10^-5, we do not suggest establishing groundwater management standards under those risk levels. Based on these results, we suggest that setting the TR at 10^-5 and using the 95th percentile of DR is best for groundwater management in fish farm areas. PMID:24869949

  8. Assistance to the states with risk based data management. Quarterly technical progress report, April 1--June 30, 1995

    SciTech Connect

    Paque, M.J.

    1995-07-28

    The tasks of this project are to: (1) complete implementation of a Risk Based Data Management System (RBDMS) in the states of Alaska, Mississippi, Montana, and Nebraska; and (2) conduct Area of Review (AOR) workshops in the states of California, Oklahoma, Kansas, and Texas. The RBDMS was designed to be a comprehensive database with the ability to expand into multiple areas, including oil and gas production. The database includes comprehensive well information for both producing and injection wells. It includes automated features for performing functions related to AOR analyses, environmental risk analyses, well evaluation, permit evaluation, compliance monitoring, operator bonding assessments, operational monitoring and tracking, and more. This quarterly report describes the status of RBDMS development under both stated tasks and proposes further steps in its implementation.

  9. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle. PMID:25294001
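
    A sketch of the designed-experiment analysis the abstract describes: a small two-factor molding DOE analyzed with an ANOVA model to flag significant process parameters and predict extractable levels. The factor names, data, and effect sizes are fabricated for illustration:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical 2x2 molding DOE: melt temperature and hold pressure;
        # response = level of one extractable GC peak (arbitrary units).
        rng = np.random.default_rng(3)
        df = pd.DataFrame({
            "temp": np.repeat(["low", "high"], 8),
            "pressure": np.tile(np.repeat(["low", "high"], 4), 2),
        })
        effect = (df["temp"] == "high") * 1.5 + (df["pressure"] == "high") * 0.4
        df["peak"] = 5.0 + effect + rng.normal(0, 0.3, len(df))

        model = smf.ols("peak ~ C(temp) * C(pressure)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))   # which factors matter
        # Predicted extractable level at candidate "optimal" settings:
        print(model.predict(pd.DataFrame({"temp": ["low"], "pressure": ["low"]})))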

  10. Development of an approach for applying probabilistic long-term forecasts in water management

    NASA Astrophysics Data System (ADS)

    Hwang, Jin; Ryoo, Kyongsik; Suh, Aesook

    2016-04-01

    This research develops an approach for applying probabilistic long-term forecasts to water management. Forecast Cumulative Distribution Functions (CDFs) of monthly precipitation are constructed by combining the range of monthly precipitation, described by a suitable Probability Density Function (PDF) fitted to past data, with the probabilistic forecasts for each category. Ensembles of inflow are estimated by feeding precipitation ensembles generated from the CDFs into the 'abcd' water budget model. The bias and RMSE of averages in past data relative to observed inflow are compared with those of the forecast ensembles. In our results, the bias and RMSE of average precipitation in the forecast ensemble are larger than in past data, whereas the average inflow in the forecast ensemble is smaller than in past data. These results should be treated as reference data for applying long-term forecasts to water management, given the limited number of forecasts available for verification and the differences between the Andong-dam basin and the forecast regions. This research is significant in that it suggests a method for applying probabilistic information on climate variables from long-term forecasts to water management in Korea. The original output of the climate model that produces the long-term probabilistic forecasts should in future be verified directly as input to a water budget model, so that a more scientific response to the uncertainty of climate change in water management can be achieved.
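
    The 'abcd' water budget model named above is a standard monthly water-balance model (Thomas, 1981); below is a minimal sketch of its commonly cited formulation, with illustrative parameter values rather than anything calibrated to the Andong-dam basin:

        import numpy as np

        def abcd(P, PET, a=0.98, b=400.0, c=0.3, d=0.1, S0=100.0, G0=50.0):
            """Monthly 'abcd' water-balance model (Thomas, 1981), as commonly
            formulated. P and PET in mm/month; returns streamflow (mm/month).
            a: runoff propensity (0-1], b: soil capacity, c: recharge fraction,
            d: groundwater recession. Values here are illustrative only."""
            S, G, Q = S0, G0, []
            for p, pet in zip(P, PET):
                W = p + S                                   # available water
                y = (W + b) / (2 * a) - np.sqrt(
                    ((W + b) / (2 * a)) ** 2 - W * b / a)   # ET opportunity
                S = y * np.exp(-pet / b)                    # end-of-month soil moisture
                avail = W - y                               # water not evaporated/stored
                G = (G + c * avail) / (1 + d)               # groundwater storage
                Q.append((1 - c) * avail + d * G)           # direct runoff + baseflow
            return np.array(Q)

        # One synthetic precipitation ensemble member through the model:
        rng = np.random.default_rng(0)
        P = rng.gamma(2.0, 40.0, 12)                        # mm/month, illustrative
        PET = np.full(12, 60.0)
        print(abcd(P, PET).round(1))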

  11. Communicating uncertainty: managing the inherent probabilistic character of hazard estimates

    NASA Astrophysics Data System (ADS)

    Albarello, Dario

    2013-04-01

    Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena largely exceeding the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios for future occurrences are possible (plausible), and the contribution of scientific knowledge (based on the available understanding of underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other terms, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, however, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition between science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., imposing expensive forms of risk-reducing activities): in these cases the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decision maker (be it the single citizen or the legitimate expression of a larger community) to choose among several possibilities (characterized, however, by different levels of likelihood). In many cases, this can be uncomfortable, and the temptation is strong to delegate these decisions to the scientific counterpart. This "transfer" from the genuinely political field to an improper scientific context is also facilitated by the lack of a widespread culture of "probability" outside the scientific community (and in many cases inside it as well). This is partially the effect of the generalized adoption (by media and scientific

  12. A risk-based approach to managing active pharmaceutical ingredients in manufacturing effluent.

    PubMed

    Caldwell, Daniel J; Mertens, Birgit; Kappler, Kelly; Senac, Thomas; Journel, Romain; Wilson, Peter; Meyerhoff, Roger D; Parke, Neil J; Mastrocco, Frank; Mattson, Bengt; Murray-Smith, Richard; Dolan, David G; Straub, Jürg Oliver; Wiedemann, Michael; Hartmann, Andreas; Finan, Douglas S

    2016-04-01

    The present study describes guidance intended to assist pharmaceutical manufacturers in assessing, mitigating, and managing the potential environmental impacts of active pharmaceutical ingredients (APIs) in wastewater from manufacturing operations, including those from external suppliers. The tools are not a substitute for compliance with local regulatory requirements but rather are intended to help manufacturers achieve the general standard of "no discharge of APIs in toxic amounts." The approaches detailed in the present study identify practices for assessing potential environmental risks from APIs in manufacturing effluent and outline measures that can be used to reduce the risk, including selective application of available treatment technologies. These measures either are commonly employed within the industry or have been implemented to a more limited extent based on local circumstances. Much of the material is based on company experience and case studies discussed at an industry workshop held on this topic. PMID:26183919

  13. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
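
    A toy example of components 1 and 2 above: two binary variables joined by local factors whose product gives the joint distribution, with a posterior obtained by conditioning on an observation. The variable names and numbers are invented for illustration:

        import numpy as np

        # Tiny probabilistic graph: Rain -> Streamflow, both binary.
        # Factor 1: prior over rain; Factor 2: streamflow given rain.
        p_rain = np.array([0.7, 0.3])                 # P(rain = no, yes)
        p_flow_given_rain = np.array([[0.9, 0.1],     # P(flow = low, high | no rain)
                                      [0.2, 0.8]])    # P(flow = low, high | rain)

        # Multiplying the local factors yields the joint distribution:
        joint = p_rain[:, None] * p_flow_given_rain   # shape (rain, flow)

        # Condition on observing high flow: posterior over rain via Bayes' rule.
        obs_high = joint[:, 1]
        posterior_rain = obs_high / obs_high.sum()
        print("P(rain | high flow) =", posterior_rain[1].round(3))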

  14. A Risk-Based Approach to Manage Nutrient Contamination From Household Wastewater

    NASA Astrophysics Data System (ADS)

    Gold, A. J.; Sims, J. T.

    2001-05-01

    Nutrients originating from decentralized wastewater treatment systems (DWTS) can pose a risk to human and ecosystem health. Assessing the likelihood and magnitude of this risk is a formidable and complex challenge. However, a properly constructed risk assessment is essential if we are to design and implement practices for DWTS that minimize the impacts of nutrients on our environment. To do this successfully, we must carefully consider: (i) the specific risks posed by nutrients emitted by DWTS and the sensitivity of humans and ecosystems to these risks; (ii) the pathways by which nutrients move from DWTS to the sectors of the environment where the risk will occur (most often ground and surface waters); (iii) the micro and macro-scale processes that affect the transport and transformations of nutrients once they are emitted from the DWTS and how this in turn affects risk; and (iv) the effects of current or alternative DWTS design and management practices on nutrient transport and subsequent risks to humans and ecosystems. In this paper we examine the risks of nutrients from DWTS to human and ecosystem health at both the micro- and macro-level spatial scales. We focus primarily on the factors that control the movement of N and P from DWTS to ground and surface waters and the research needs related to controlling nonpoint source nutrient pollution from DWTS. At the micro-scale, the exposure pathways include the system and the immediate surroundings, i.e., the subsurface environment near the DWTS. The exposed individual or ecosystem at the micro-scale can be a household well, lake, stream or estuary that borders an individual wastewater treatment system. At the macro-level, our focus is at the aquifer and watershed scale and the risks posed to downstream ecosystems and water users by nonpoint source pollution of these waters by nutrients from DWTS. We analyze what is known about the effectiveness of current designs at mitigating these risks and our ability to predict

  15. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.

  16. A two-stage inexact joint-probabilistic programming method for air quality management under uncertainty.

    PubMed

    Lv, Y; Huang, G H; Li, Y P; Yang, Z F; Sun, W

    2011-03-01

    A two-stage inexact joint-probabilistic programming (TIJP) method is developed for planning a regional air quality management system with multiple pollutants and multiple sources. The TIJP method incorporates the techniques of two-stage stochastic programming, joint-probabilistic constraint programming, and interval mathematical programming, so that uncertainties expressed as probability distributions and interval values can be addressed. Moreover, it can not only examine the risk of violating joint-probability constraints, but also account for economic penalties as corrective measures against any infeasibility. The developed TIJP method is applied to a case study of a regional air pollution control problem, where the air quality index (AQI) is introduced for evaluation of the integrated air quality management system associated with multiple pollutants. Joint probabilities arise in the environmental constraints for the AQI, such that individual probabilistic constraints for each pollutant can be efficiently incorporated within the TIJP model. The results indicate that useful solutions for air quality management practices have been generated; they can help decision makers to identify desired pollution abatement strategies with minimized system cost and maximized environmental efficiency. PMID:21067860

  17. Towards risk-based drought management in the Netherlands: making water supply levels transparent to water users

    NASA Astrophysics Data System (ADS)

    Ter Maat, Judith; Mens, Marjolein; Van Vuren, Saskia; Van der Vat, Marnix

    2016-04-01

    Within the project Improving Predictions and Management of Hydrological Extremes (IMPREX), running from 2016 to 2019, a consortium of the Dutch research institute Deltares and the Dutch water management consultant HKV will design and build a tool to support quantitative risk-informed decision-making for fresh water management in the Netherlands, in particular the decision on water supply service levels. The research will be conducted in collaboration with the Dutch Ministry for Infrastructure and Environment, the Freshwater Supply Programme Office, the Dutch governmental organisation responsible for water management (Rijkswaterstaat), the Foundation for Applied Water Research (STOWA, the knowledge centre of the water boards), and a number of water boards. In the session we will present the conceptual framework for a risk-based approach to water shortage management and share thoughts on how the proposed tool can be applied in the Dutch water management context.

  18. Mobile human network management and recommendation by probabilistic social mining.

    PubMed

    Min, Jun-Ki; Cho, Sung-Bae

    2011-06-01

    Recently, inferring or sharing mobile contexts has been actively investigated as cell phones have become more than communication devices. However, most efforts have focused on utilizing contexts in social network services, while means of mining or managing the human network itself have barely been considered. In this paper, the SmartPhonebook, which mines users' social connections to manage their relationships by reasoning about social and personal contexts, is presented. It works like an artificial assistant that recommends candidate callees whom the user would probably like to contact in a given situation. Moreover, it visualizes social contexts such as closeness and relationships with others in order to let users know their social situations. The proposed method infers social contexts from contact patterns, while it extracts personal contexts, such as the user's emotional states and behaviors, from mobile logs. Here, Bayesian networks are exploited to handle the uncertainties in the mobile environment. The proposed system has been implemented on the MS Windows Mobile 2003 SE platform on the Samsung SPH-M4650 smartphone and has been tested on real-world data. The experimental results showed that the system provides an efficient and informative way for mobile social networking. PMID:21172755

  19. Use of probabilistic risk assessment (PRA) in expert systems to advise nuclear plant operators and managers

    SciTech Connect

    Uhrig, R.E.

    1988-01-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors, and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts or on knowledge that has been codified in publications, books, or regulations to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant, performed previously, is used to assess the safety status of nuclear power plants and to make recommendations to plant personnel. 5 refs., 1 fig., 2 tabs.

  20. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; Vesely, William; Youngblood, Robert

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  1. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming unit (CFU) counts are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
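
    The MPN-as-MLE idea lends itself to a compact worked example: under the standard Poisson assumption, a tube receiving volume v is positive with probability 1 - exp(-lambda*v), and lambda is chosen to maximize the likelihood of the observed positives. The dilution design below (5 tubes each at 10, 1, and 0.1 mL) is a common textbook layout, not data from the study:

        import numpy as np
        from scipy.optimize import minimize_scalar

        def mpn(positive, tubes, volumes):
            """Maximum likelihood estimate of concentration (organisms/mL) from
            a serial dilution: at dilution i, tubes[i] tubes each receive
            volumes[i] mL, and positive[i] of them show growth."""
            positive, tubes, volumes = map(np.asarray, (positive, tubes, volumes))

            def neg_log_lik(lam):
                p = 1.0 - np.exp(-lam * volumes)          # P(tube positive)
                return -np.sum(positive * np.log(p)
                               + (tubes - positive) * (-lam * volumes))

            res = minimize_scalar(neg_log_lik, bounds=(1e-9, 1e4), method="bounded")
            return res.x

        print(f"MPN ~ {mpn([5, 3, 1], [5, 5, 5], [10.0, 1.0, 0.1]):.2f} per mL")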

  2. Probabilistic scenario-based water resource planning and management: A case study in the Yellow River Basin, China

    NASA Astrophysics Data System (ADS)

    Dong, C.; Schoups, G.; van de Giesen, N.

    2012-04-01

    Water resource planning and management is subject to large uncertainties with respect to the impacts of climate change and socio-economic development on water systems. In order to deal with these uncertainties, probabilistic climate and socio-economic scenarios were developed based on the Principle of Maximum Entropy, as defined within information theory, and used as inputs to hydrological models to construct probabilistic water scenarios using Monte Carlo simulation. Probabilistic scenarios provide more explicit information than equally likely scenarios for decision-making in water resource management. A case study was developed for the Yellow River Basin, China, where future water availability and water demand are affected by both climate change and socio-economic development. Climate scenarios of future precipitation and temperature were developed based on the results of multiple global climate models, and socio-economic scenarios were downscaled from existing large-scale scenarios. Probability distributions were assigned to these scenarios to explicitly represent a full set of future possibilities. The probabilistic climate scenarios were used as input to a rainfall-runoff model to simulate future river discharge, and the socio-economic scenarios were used to calculate water demand. A full set of possible future water supply-demand scenarios and their associated probability distributions were generated. This set can feed further analysis of the future water balance, which can be used as a basis for planning and managing water resources in the Yellow River Basin. Key words: probabilistic scenarios, climate change, socio-economic development, water management
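
    A minimal sketch of one simple instance of the Principle of Maximum Entropy invoked above: assigning maximum-entropy probabilities to a discrete scenario set under a single expected-value constraint. The scenario values and the constraint are invented for illustration:

        import numpy as np
        from scipy.optimize import minimize

        def maxent_probs(values, target_mean):
            """Maximum-entropy probabilities over discrete scenarios whose
            expected value is constrained to equal target_mean. With no
            constraint the answer is uniform; the constraint tilts it."""
            n = len(values)

            def neg_entropy(p):
                return np.sum(p * np.log(p + 1e-12))

            cons = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},
                    {"type": "eq", "fun": lambda p: p @ values - target_mean})
            res = minimize(neg_entropy, np.full(n, 1.0 / n),
                           bounds=[(1e-9, 1.0)] * n, constraints=cons)
            return res.x

        # Three hypothetical precipitation-change scenarios (% change):
        values = np.array([-10.0, 0.0, 10.0])
        print(maxent_probs(values, target_mean=2.0).round(3))  # skews wetter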

  3. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.

  4. Spatial probabilistic multi-criteria decision making for assessment of flood management alternatives

    NASA Astrophysics Data System (ADS)

    Ahmadisharaf, Ebrahim; Kalyanapu, Alfred J.; Chung, Eun-Sung

    2016-02-01

    Flood management alternatives are often evaluated on the basis of flood parameters such as depth and velocity. As these parameters are uncertain, so is the evaluation of the alternatives. It is thus important to incorporate the uncertainty of flood parameters into decision making frameworks. This research develops a spatial probabilistic multi-criteria decision making (SPMCDM) framework to demonstrate the impact of design rainfall uncertainty on the evaluation of flood management alternatives. The framework employs a probabilistic rainfall-runoff transformation model, a two-dimensional flood model and a spatial MCDM technique. Thereby, the uncertainty of decision making can be determined alongside the best alternative. A probability-based map is produced to show the discrete probability distribution function (PDF) of selecting each competing alternative; overall, the best alternative at each grid cell is the one at the mode of this PDF. This framework is demonstrated on the Swannanoa River watershed in North Carolina, USA and its results are compared to those of the deterministic approach. While the deterministic framework fails to provide the uncertainty of selecting an alternative, the SPMCDM framework showed that, overall, selection of flood management alternatives in the watershed is "moderately uncertain". Moreover, three comparison metrics, the F fit measure, the κ statistic, and the Spearman rank correlation coefficient (ρ), are computed to compare the results of the two approaches. An F fit measure of 62.6%, κ statistics of 15.4-45.0%, and a spatial mean ρ value of 0.48 imply a significant difference in decision making when design rainfall uncertainty is incorporated through the presented SPMCDM framework. The SPMCDM framework can help decision makers understand the uncertainty in selection of flood management alternatives.
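
    A toy version of the probability-based map construction described above: sample the uncertain input, score each alternative per grid cell, and report the discrete PDF of "being selected" and its mode. The grid size, number of alternatives, and score distributions are fabricated:

        import numpy as np

        rng = np.random.default_rng(5)
        n_samples, n_cells, n_alts = 1000, 100, 3

        # Hypothetical per-cell damage scores for 3 alternatives, with spread
        # induced by design-rainfall uncertainty (lower score = better).
        base = rng.uniform(0.3, 0.7, size=(n_cells, n_alts))
        scores = base[None, :, :] + rng.normal(0, 0.15, size=(n_samples, n_cells, n_alts))

        best = scores.argmin(axis=2)      # best alternative per sample and cell
        # Discrete PDF of selection per cell, and its mode:
        pdf = np.stack([(best == k).mean(axis=0) for k in range(n_alts)], axis=1)
        mode_alt = pdf.argmax(axis=1)     # overall-best alternative per cell
        certainty = pdf.max(axis=1)       # how decisive each cell's choice is
        print("cells with selection probability < 0.5:", (certainty < 0.5).sum())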

  5. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.

  6. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. PMID:23777564
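
    To make the "risk at the moment of consumption" idea concrete, here is a heavily simplified sketch of one end-of-chain step: log-linear growth during cold storage followed by the exponential dose-response model P(illness) = 1 - exp(-r * dose), which is commonly used for L. monocytogenes. Every parameter value below is an illustrative placeholder, not a validated estimate or a value from this article:

        import numpy as np

        def listeriosis_risk(c0_log, egr, days, serving_g, r=1e-12, n=100000, seed=9):
            """Toy QMRA tail: grow an initial contamination during storage,
            then apply an exponential dose-response at consumption.
            c0_log: mean log10 CFU/g at packaging; egr: log10 growth per day."""
            rng = np.random.default_rng(seed)
            c0 = rng.normal(c0_log, 0.5, n)        # variability between products
            c_end = c0 + egr * days                # log-linear growth, no lag phase
            dose = (10.0 ** c_end) * serving_g     # CFU ingested per serving
            return np.mean(1.0 - np.exp(-r * dose))

        print(f"risk per serving ~ {listeriosis_risk(-2.0, 0.2, 14, 25.0):.2e}")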

  7. Risk-Based Information to Support the Evaluation of Management Options for Cesium and Strontium Capsules at the Hanford Site

    SciTech Connect

    MacDonell, M.; Peterson, J.; Picel, M.; Douglas Hildebrand, R.

    2008-07-01

    Evaluations are under way to support U.S. Department of Energy decisions on how to manage cesium and strontium capsules currently in storage at the Hanford site. Health-based exposure limits for drinking water, oral toxicity data, and environmental fate information were combined in an initial evaluation to frame performance targets for managing chemicals and radionuclides that could leach from the capsules and migrate to groundwater over time. More than 50 relevant benchmarks were identified for 15 of the 17 contaminants in the study set. Of those multiple benchmarks, EPA limits for drinking water served as the main basis for the leachate performance targets. For the remaining two contaminants, stable cesium and zirconium, preliminary indicators were derived from a limited review of toxicity data. Thus, preliminary candidate concentrations were identified for the full study set to support the ongoing evaluation of capsule management options. In summary: In an earlier scoping study, three radionuclides and eight chemicals were identified as contaminants of interest for leachate from cesium and strontium capsules stored at the Hanford site. To frame management options for these capsules, it is assumed that contaminants will leach to groundwater, which will serve as a drinking water source in the long-term future. Before developing performance targets for the initial set of contaminants, a combined fate and toxicity evaluation was conducted to determine if any others should be added to account for decay or fate products and chemical toxicity. From this review, the list was expanded to produce a final study set of 17 contaminants. Established exposure limits and toxicity data were then reviewed and integrated to develop candidate health-based concentrations to frame performance targets for assessing options for long-term capsule management. This review of more than a dozen different benchmarks and toxicity sources translated to hundreds of individual data checks to support the

  8. Inexact joint-probabilistic stochastic programming for water resources management under uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Y. P.; Huang, G. H.

    2010-11-01

    In this study, an inexact two-stage integer program with joint-probabilistic constraint (ITIP-JPC) is developed for supporting water resources management under uncertainty. This method can tackle uncertainties expressed as joint probabilities and interval values, and can reflect the reliability of satisfying (or the risk of violating) system constraints under uncertain events and/or parameters. Moreover, it can be used for analysing various policy scenarios that are associated with different levels of economic consequences when the pre-regulated targets are violated. The developed ITIP-JPC is applied to a case study of water resources allocation within a multi-stream, multi-reservoir and multi-user context, where joint probabilities exist in both water availabilities and storage capacities. The results indicate that reasonable solutions have been generated for both binary and continuous variables. They can help generate desired policies for water allocation and flood diversion with a maximized economic benefit and a minimized system-disruption risk.

  9. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real time; any faults in software or hardware, or in their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an arithmetic circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
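
    A minimal sketch of the reasoning step at the heart of the approach: a tiny Bayesian network over software health with two noisy sensors, with the posterior computed by direct enumeration (a compiled arithmetic circuit evaluates the same quantity within embedded constraints). The structure, sensor names, and probabilities are invented for illustration:

        import numpy as np

        # Tiny health model: H = software healthy? Two sensors (watchdog timing,
        # output-range check) are noisy indicators of H.
        p_h = np.array([0.99, 0.01])                   # P(healthy, faulty)
        p_s1 = np.array([[0.98, 0.02],                 # P(s1 = ok, alarm | healthy)
                         [0.15, 0.85]])                # P(s1 = ok, alarm | faulty)
        p_s2 = np.array([[0.95, 0.05],
                         [0.30, 0.70]])

        def posterior_fault(s1, s2):
            """P(faulty | sensor readings) by enumeration over H."""
            unnorm = p_h * p_s1[:, s1] * p_s2[:, s2]
            return unnorm[1] / unnorm.sum()

        print(f"P(fault | both alarms) = {posterior_fault(1, 1):.3f}")
        print(f"P(fault | one alarm)   = {posterior_fault(1, 0):.3f}")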

  10. Groundwater contamination from waste management sites: The interaction between risk-based engineering design and regulatory policy: 1. Methodology

    NASA Astrophysics Data System (ADS)

    Massmann, Joel; Freeze, R. Allan

    1987-02-01

    This paper puts in place a risk-cost-benefit analysis for waste management facilities that explicitly recognizes the adversarial relationship that exists in a regulated market economy between the owner/operator of a waste management facility and the government regulatory agency under whose terms the facility must be licensed. The risk-cost-benefit analysis is set up from the perspective of the owner/operator. It can be used directly by the owner/operator to assess alternative design strategies. It can also be used by the regulatory agency to assess alternative regulatory policy, but only in an indirect manner, by examining the response of an owner/operator to the stimuli of various policies. The objective function is couched in terms of a discounted stream of benefits, costs, and risks over an engineering time horizon. Benefits are in the form of revenues for services provided; costs are those of construction and operation of the facility. Risk is defined as the cost associated with the probability of failure, with failure defined as the occurrence of a groundwater contamination event that violates the licensing requirements established for the facility. Failure requires a breach of the containment structure and contaminant migration through the hydrogeological environment to a compliance surface. The probability of failure can be estimated on the basis of reliability theory for the breach of containment and with a Monte-Carlo finite-element simulation for the advective contaminant transport. In the hydrogeological environment the hydraulic conductivity values are defined stochastically. The probability of failure is reduced by the presence of a monitoring network operated by the owner/operator and located between the source and the regulatory compliance surface. The level of reduction in the probability of failure depends on the probability of detection of the monitoring network, which can be calculated from the stochastic contaminant transport simulations. While
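
    The shape of the owner/operator's objective function described above invites a small worked example: discounted benefits minus costs minus risk, with risk defined as the probability of failure times its cost. All numbers below are invented placeholders:

        import numpy as np

        def owner_objective(benefits, costs, p_fail, c_fail, rate):
            """Discounted risk-cost-benefit objective from the owner/operator's
            perspective: sum over t of [B_t - C_t - Pf_t * Cf] / (1 + i)^t."""
            t = np.arange(len(benefits))
            discount = (1.0 + rate) ** t
            return np.sum((benefits - costs - p_fail * c_fail) / discount)

        # Hypothetical 20-year horizon: constant revenue, slowly rising failure
        # probability (as might come from Monte Carlo transport simulations),
        # failure cost dominated by penalties and remediation.
        T = 20
        benefits = np.full(T, 5.0e6)                 # $/yr
        costs = np.full(T, 2.0e6)                    # $/yr
        p_fail = np.linspace(1e-4, 5e-3, T)          # per-year failure probability
        print(f"NPV = ${owner_objective(benefits, costs, p_fail, 5.0e7, 0.05):,.0f}")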

  11. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25 km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model (GCM) simulations, downscaled at 25 km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of spatio-temporally consistent weather sequences that respect the boundary forcings (ocean state, greenhouse gases, and solar forcing). The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its
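
    A minimal sketch of the event-extraction step, assuming synthetic ensemble data in place of the weather@home output; the threshold and run-length choices are illustrative only:

        # Extract dry "events" from an ensemble of monthly precipitation series
        # by thresholding; synthetic gamma-distributed data stand in for the runs.
        import numpy as np

        rng = np.random.default_rng(0)
        ensemble = rng.gamma(shape=2.0, scale=30.0, size=(1000, 1200))

        def drought_events(series, threshold, min_len=3):
            """Return (start, length) of runs below threshold lasting >= min_len steps."""
            dry = series < threshold
            events, start = [], None
            for t, d in enumerate(dry):
                if d and start is None:
                    start = t
                elif not d and start is not None:
                    if t - start >= min_len:
                        events.append((start, t - start))
                    start = None
            if start is not None and len(dry) - start >= min_len:
                events.append((start, len(dry) - start))
            return events

        thr = np.percentile(ensemble, 20)   # e.g. 20th-percentile precipitation
        all_events = [ev for member in ensemble for ev in drought_events(member, thr)]
        print(len(all_events), "synthetic drought events")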

  12. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. The performance
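
    A minimal sketch of a single-period allocation under a probabilistic forecast, assuming invented users, willingness-to-pay values and a lognormal forecast ensemble (this is not the Jaguaribe model):

        # Maximize willingness-to-pay-weighted allocations subject to a supply
        # quantile drawn from the probabilistic forecast ensemble.
        import numpy as np
        from scipy.optimize import linprog

        wtp    = np.array([5.0, 3.0, 1.5])      # $/unit for users A, B, C
        demand = np.array([40.0, 60.0, 80.0])   # requested quantities
        forecast = np.random.default_rng(1).lognormal(mean=4.5, sigma=0.3, size=5000)
        supply = np.quantile(forecast, 0.10)    # allocate at 90% reliability

        res = linprog(
            c=-wtp,                               # linprog minimizes, so negate benefits
            A_ub=np.ones((1, 3)), b_ub=[supply],  # total allocation <= reliable supply
            bounds=[(0, d) for d in demand],
        )
        print(supply, res.x)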

  13. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  14. Use Of Probabilistic Risk Assessment (PRA) In Expert Systems To Advise Nuclear Plant Operators And Managers

    NASA Astrophysics Data System (ADS)

    Uhrig, Robert E.

    1988-03-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts or knowledge that has been codified in publications, books, or regulations to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. Nuclear power plants have many redundant systems and can continue to operate when one or more of these systems is disabled or removed from service for maintenance or testing. PRAs provide a means of evaluating the risk to the public associated with the operation of nuclear power plants with components or systems out of service. While the choice of the "source term" and methodology in a PRA may influence the absolute probability and consequences of a core melt, the ratio of two PRA calculations for two configurations of the same plant, carried out on a consistent basis, can readily identify the increase in risk associated with going from one configuration to the other. PRISIM, a personal computer program to calculate the ratio of core melt probabilities described above (based on previously performed PRAs), has been developed under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC). When one or several components are removed from service, PRISIM calculates the ratio of the core melt probabilities. The inference engine of the expert system then uses this ratio and a constant risk criterion, along with information from its knowledge base (which includes information from the PRA), to advise plant personnel as to what action, if any, should be taken.
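
    A toy illustration of the risk-ratio idea (not PRISIM itself), using a rare-event minimal-cut-set approximation with invented component unavailabilities:

        import math

        unavail = {"pump_A": 1e-2, "pump_B": 1e-2, "diesel": 5e-3}
        cut_sets = [("pump_A", "pump_B"), ("pump_A", "diesel")]

        def core_melt_freq(q):
            # rare-event approximation: sum over minimal cut-set probabilities
            return sum(math.prod(q[c] for c in cs) for cs in cut_sets)

        base = core_melt_freq(unavail)
        maintenance = dict(unavail, pump_A=1.0)       # pump A out of service
        print(core_melt_freq(maintenance) / base)     # ratio of core melt probabilities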

  15. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J.

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of the cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for the management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking: a structured explicit decision analysis, a heuristic approach based on the order of importance of the criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to
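
    As a minimal sketch of the structured explicit ranking idea, with invented alternatives, weights and criterion values:

        # Score remedial alternatives on normalized cost and residual-risk criteria.
        import numpy as np

        alts = ["no action", "pump-and-treat", "containment"]
        cost          = np.array([0.0, 4.0e6, 2.5e6])   # $
        residual_risk = np.array([1e-3, 1e-5, 1e-4])    # individual lifetime risk

        def normalize(x):
            # lower is better for both criteria; map to [0, 1] "badness"
            return (x - x.min()) / (x.max() - x.min())

        weights = {"cost": 0.4, "risk": 0.6}
        score = weights["cost"] * normalize(cost) + weights["risk"] * normalize(residual_risk)
        for a, s in sorted(zip(alts, score), key=lambda p: p[1]):
            print(f"{a:15s} {s:.3f}")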

  16. Risk-based and deterministic regulation

    SciTech Connect

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect the public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  17. Effectiveness of chemical amendments for stabilisation of lead and antimony in risk-based land management of soils of shooting ranges.

    PubMed

    Sanderson, Peter; Naidu, Ravi; Bolan, Nanthi

    2015-06-01

    This study aims to examine the effectiveness of amendments for risk-based land management of shooting range soils and to explore the effectiveness of amendments applied to sites with differing soil physicochemical parameters. A series of amendments with differing mechanisms for stabilisation were applied to four shooting range soils and aged for 1 year. Chemical stabilisation was monitored by pore water extraction, the toxicity characteristic leaching procedure (TCLP) and the physiologically based extraction test (PBET) over 1 year. The performance of amendments when applied in conditions reflecting field application did not match their performance in the batch studies. Pore water-extractable metals were not greatly affected by amendment addition. TCLP-extractable Pb was reduced significantly by amendments, particularly lime and magnesium oxide. Antimony leaching was reduced by red mud but mobilised by some of the other amendments. Bioaccessible Pb measured by PBET shows that bioaccessible Pb increased with time after an initial decrease due to the presence of metallic fragments in the soil. Amendments were able to reduce bioaccessible Pb by up to 50%. Bioaccessible Sb was not readily reduced by soil amendments. Soil amendments were not equally effective across the four soils. PMID:23807560

  18. State Assistance with Risk-Based Data Management: Inventory and needs assessment of 25 state Class II Underground Injection Control programs. Phase 1

    SciTech Connect

    Not Available

    1992-07-01

    As discussed in Section I of the attached report, state agencies must decide where to direct their limited resources in an effort to make optimum use of their available manpower and address those areas that pose the greatest risk to valuable drinking water sources. The Underground Injection Practices Research Foundation (UIPRF) proposed a risk-based data management system (RBDMS) to provide states with the information they need to effectively utilize staff resources, provide dependable documentation to justify program planning, and enhance environmental protection capabilities. The UIPRF structured its approach regarding environmental risk management to include data and information from production, injection, and inactive wells in its RBDMS project. Data from each of these well types are critical to the complete statistical evaluation of environmental risk and selected automated functions. This comprehensive approach allows state Underground Injection Control (UIC) programs to effectively evaluate the risk of contaminating underground sources of drinking water, while alleviating the additional work and associated problems that often arise when separate databases are used. CH2M Hill and Digital Design Group, through a DOE grant to the UIPRF, completed an inventory and needs assessment of 25 state Class II UIC programs. The states selected for participation by the UIPRF were generally chosen based on interest and whether an active Class II injection well program was in place. The inventory and needs assessment provided an effective means of collecting and analyzing the interest, commitment, design requirements, utilization, and potential benefits of implementing an RBDMS in individual state UIC programs. Personal contacts were made with representatives from each state to discuss the applicability of an RBDMS in their respective state.

  19. The probabilistic seismic loss model as a tool for portfolio management: the case of Maghreb.

    NASA Astrophysics Data System (ADS)

    Pousse, Guillaume; Lorenzo, Francisco; Stejskal, Vladimir

    2010-05-01

    Although the property insurance market in Maghreb countries does not systematically purchase earthquake cover, Impact Forecasting is developing a new loss model for the calculation of probabilistic seismic risk. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A set of damage functions is then used to convert the modelled ground-motion severity into monetary losses. We aim to highlight risk assessment challenges, especially in countries where reliable data are difficult to obtain. The loss model estimates the risk and allows further risk transfer strategies to be discussed.
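
    A hedged sketch of such a Monte Carlo loss chain, with invented event rates, a toy attenuation relation and a toy damage function standing in for the model's components:

        # Sample events, attenuate ground motion to a site, apply a damage
        # function, and read off an exceedance curve. All numbers illustrative.
        import numpy as np

        rng = np.random.default_rng(3)
        n_years, rate = 100_000, 0.2                  # synthetic event catalogue
        n_events = rng.poisson(rate, n_years)

        annual_loss = np.zeros(n_years)
        for y in range(n_years):
            mags = rng.uniform(5.0, 7.5, n_events[y])
            gm  = np.exp(-3.5 + 0.8 * mags + rng.normal(0, 0.5, mags.size))  # toy attenuation (arbitrary units)
            mdr = 1.0 / (1.0 + np.exp(-(gm - 0.4) / 0.1))                    # toy mean damage ratio
            annual_loss[y] = (mdr * 1e6).sum()        # exposed value: 1 M per event footprint

        for rp in (10, 100, 475):
            print(f"{rp}-year loss: {np.quantile(annual_loss, 1 - 1/rp):,.0f}")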

  20. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and they, as well as the joint estimate of flood risk, are affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the simulated flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
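
    A minimal sketch of the bivariate step, sampling correlated peak-volume pairs from a Gaussian copula with Gumbel margins; the dependence and marginal parameters are illustrative, not fitted values from the study:

        import numpy as np
        from scipy.stats import norm, gumbel_r

        rng = np.random.default_rng(4)
        rho = 0.7                                    # peak-volume dependence
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
        u = norm.cdf(z)                              # uniform margins, copula dependence kept

        peak   = gumbel_r.ppf(u[:, 0], loc=150.0, scale=60.0)   # m3/s
        volume = gumbel_r.ppf(u[:, 1], loc=12.0,  scale=5.0)    # hm3
        print(np.corrcoef(peak, volume)[0, 1])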

  1. Risk-based configuration control: Application of PSA in improving technical specifications and operational safety

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.

    1992-11-01

    Risk-based configuration control is the management of component configurations using a risk perspective to control risk and assure safety. A configuration, as used here, is a set of component operability statuses that define the state of a nuclear power plant. If the component configurations that have high risk implications do not occur, then the risk from the operation of nuclear power plants would be minimal. The control of component configurations, i.e., the management of component statuses, to minimize the risk from components being unavailable, becomes difficult, because the status of a standby safety system component is often not apparent unless it is tested. Controlling plant configuration from a risk-perspective can provide more direct risk control and also more operational flexibility by allowing looser controls in areas unimportant to risk. Risk-based configuration control approaches can be used to replace parts of nuclear power plant Technical Specifications. With the advances in probabilistic safety assessment (PSA) technology, such approaches to improve Technical Specifications and operational safety are feasible. In this paper, we present an analysis of configuration risks, and a framework for risk-based configuration control to achieve the desired control of risk-significant configurations during plant operation.
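
    As a toy numerical illustration of configuration-risk control (all numbers invented, not from the paper), one can bound the time spent in a degraded configuration by an incremental risk budget:

        # Risk-based allowed configuration time: keep the incremental core
        # damage probability of a configuration below a fixed budget.
        baseline_cdf = 5e-5    # core damage frequency, all equipment available (/yr)
        config_cdf   = 4e-4    # conditional CDF with a safety train out of service (/yr)
        risk_budget  = 1e-6    # acceptable incremental core damage probability

        allowed_hours = risk_budget / (config_cdf - baseline_cdf) * 8760.0
        print(f"allowed configuration time ~ {allowed_hours:.0f} h")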

  2. Risk-based configuration control: Application of PSA in improving technical specifications and operational safety

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Vesely, W.E.

    1992-01-01

    Risk-based configuration control is the management of component configurations using a risk perspective to control risk and assure safety. A configuration, as used here, is a set of component operability statuses that define the state of a nuclear power plant. If the component configurations that have high risk implications do not occur, then the risk from the operation of nuclear power plants would be minimal. The control of component configurations, i.e., the management of component statuses, to minimize the risk from components being unavailable, becomes difficult, because the status of a standby safety system component is often not apparent unless it is tested. Controlling plant configuration from a risk-perspective can provide more direct risk control and also more operational flexibility by allowing looser controls in areas unimportant to risk. Risk-based configuration control approaches can be used to replace parts of nuclear power plant Technical Specifications. With the advances in probabilistic safety assessment (PSA) technology, such approaches to improve Technical Specifications and operational safety are feasible. In this paper, we present an analysis of configuration risks, and a framework for risk-based configuration control to achieve the desired control of risk-significant configurations during plant operation.

  3. The future of host cell protein (HCP) identification during process development and manufacturing linked to a risk-based management for their control.

    PubMed

    Bracewell, Daniel G; Francis, Richard; Smales, C Mark

    2015-09-01

    The use of biological systems to synthesize complex therapeutic products has been a remarkable success. However, during product development, great attention must be devoted to defining acceptable levels of impurities that derive from that biological system; heading this list are host cell proteins (HCPs). Recent advances in proteomic analytics have shown how diverse this class of impurities is; as such knowledge and capability grow, inevitable questions have arisen about how thorough current approaches to measuring HCPs are. The fundamental issue is how to adequately measure (and in turn monitor and control) such a large number of protein species (potentially thousands of components) to ensure safe and efficacious products. A rather elegant solution is to use an immunoassay (enzyme-linked immunosorbent assay [ELISA]) based on polyclonal antibodies raised to the host cell (biological system) used to synthesize a particular therapeutic product. However, the measurement is entirely dependent on the antibody serum used, which dictates the sensitivity of the assay and the degree of coverage of the HCP spectrum. It provides one summed analog value for HCP amount; a positive if all HCP components can be considered equal, a negative in the more likely event that greater risk is associated with certain components of the HCP proteome. In a thorough risk-based approach, one would wish to be able to account for this. These issues have led to the investigation of orthogonal analytical methods; most prominently mass spectrometry. These techniques can potentially both identify and quantify HCPs. The ability to measure and monitor thousands of proteins proportionally increases the amount of data acquired. Significant benefits exist if the information can be used to determine critical HCPs and thereby create an improved basis for risk management. We describe a nascent approach to risk assessment of HCPs based upon such data, drawing attention to timeliness in relation to biosimilar

  4. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine.

  5. Coastal cliff recession: the use of probabilistic prediction methods

    NASA Astrophysics Data System (ADS)

    Lee, E. M.; Hall, J. W.; Meadowcroft, I. C.

    2001-10-01

    A range of probabilistic methods for predicting coastal cliff recession is introduced, providing a means of demonstrating the potential variability in such predictions. These methods form the basis for risk-based land-use planning, cliff management and engineering decision-making. Examples of probabilistic models are presented for a number of different cliff settings: the simulation of recession on eroding cliffs; the use of historical records and statistical experiments to model the behaviour of cliffs affected by rare, episodic landslide events; the adaptation of an event tree approach to assess the probability of failure of protected cliffs, taking into account the residual life of the existing defences; and the evaluation of the probability of landslide reactivation in areas of pre-existing landslide systems. These methods are based on a geomorphological assessment of the episodic nature of the recession process, together with historical records.

  6. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airline operations, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling into the developed model. The proposed methodology is outlined in two stages: a fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand over the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling can affect the operational profit and fleet management decisions of the airline to varying degrees.

  7. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides, for nuclear power plants which have been subjected to a probabilistic risk assessment (PRA), have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  8. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  9. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  10. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  11. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  12. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  13. A risk-based focused decision-management approach for justifying characterization of Hanford tank waste. June 1996, Revision 1; April 1997, Revision 2

    SciTech Connect

    Colson, S.D.; Gephart, R.E.; Hunter, V.L.; Janata, J.; Morgan, L.G.

    1997-12-31

    This report describes a disciplined, risk-based decision-making approach for determining characterization needs and resolving safety issues during the storage and remediation of radioactive waste stored in Hanford tanks. The strategy recommended uses interactive problem evaluation and decision analysis methods commonly used in industry to solve problems under conditions of uncertainty (i.e., lack of perfect knowledge). It acknowledges that problem resolution comes through both the application of high-quality science and human decisions based upon preferences and sometimes hard-to-compare choices. It recognizes that to firmly resolve a safety problem, the controlling waste characteristics and chemical phenomena must be measurable or estimated to an acceptable level of confidence tailored to the decision being made.

  14. Improving nutrient management practices in agriculture: The role of risk-based beliefs in understanding farmers' attitudes toward taking additional action

    NASA Astrophysics Data System (ADS)

    Wilson, Robyn S.; Howard, Gregory; Burnett, Elizabeth A.

    2014-08-01

    A recent increase in the amount of dissolved reactive phosphorus (DRP) entering the western Lake Erie basin is likely due to increased spring storm events in combination with issues related to fertilizer application and timing. These factors in combination with warmer lake temperatures have amplified the spread of toxic algal blooms. We assessed the attitudes of farmers in northwest Ohio toward taking at least one additional action to reduce nutrient loss on their farm. Specifically, we (1) identified to what extent farm and farmer characteristics (e.g., age, gross farm sales) as well as risk-based beliefs (e.g., efficacy, risk perception) influenced attitudes, and (2) assessed how these characteristics and beliefs differ in their predictive ability based on unobservable latent classes of farmers. Risk perception, or a belief that negative impacts to profit and water quality from nutrient loss were likely, was the most consistent predictor of farmer attitudes. Response efficacy, or a belief that taking action on one's farm made a difference, was found to significantly influence attitudes, although this belief was particularly salient for the minority class of farmers who were older and more motivated by profit. Communication efforts should focus on the negative impacts of nutrient loss to both the farm (i.e., profit) and the natural environment (i.e., water quality) to raise individual perceived risk among the majority, while the minority need higher perceived efficacy or more specific information about the economic effectiveness of particular recommended practices.

  15. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km²) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  16. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, this session explores the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  17. A generic probabilistic framework for structural health prognostics and uncertainty management

    NASA Astrophysics Data System (ADS)

    Wang, Pingfeng; Youn, Byeng D.; Hu, Chao

    2012-04-01

    Structural health prognostics can be broadly applied to various engineered artifacts in an engineered system. However, techniques and methodologies for health prognostics become application-specific. This study thus aims at formulating a generic framework of structural health prognostics, which is composed of four core elements: (i) a generic health index system with synthesized health index (SHI), (ii) a generic offline learning scheme using the sparse Bayes learning (SBL) technique, (iii) a generic online prediction scheme using the similarity-based interpolation (SBI), and (iv) an uncertainty propagation map for the prognostic uncertainty management. The SHI enables the use of heterogeneous sensory signals; the sparseness feature employing only a few neighboring kernel functions enables the real-time prediction of remaining useful lives (RULs) regardless of data size; the SBI predicts the RULs with the background health knowledge obtained under uncertain manufacturing and operation conditions; and the uncertainty propagation map enables the predicted RULs to be loaded with their statistical characteristics. The proposed generic framework of structural health prognostics is thus applicable to different engineered systems and its effectiveness is demonstrated with two case studies.
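
    A minimal sketch of the similarity-based interpolation (SBI) idea, assuming a synthetic library of degradation slopes and observed RULs (not the authors' implementation):

        # Predict a unit's RUL as a kernel-weighted average of the RULs of
        # historical units with similar health-index trajectories.
        import numpy as np

        rng = np.random.default_rng(5)
        lib_slope = rng.uniform(0.5, 2.0, 50)            # historical degradation slopes
        lib_rul   = 100.0 / lib_slope + rng.normal(0, 2, 50)  # their observed RULs

        def predict_rul(slope, bandwidth=0.2):
            w = np.exp(-((lib_slope - slope) / bandwidth) ** 2)  # Gaussian kernel weights
            return float(np.sum(w * lib_rul) / np.sum(w))

        print(predict_rul(1.0))   # ~100 time units for a unit degrading at slope 1.0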

  18. Converting probabilistic tree species range shift projections into meaningful classes for management.

    PubMed

    Hanewinkel, Marc; Cullmann, Dominik A; Michiels, Hans-Gerd; Kändler, Gerald

    2014-02-15

    The paper deals with the management problem of how to decide on tree species suitability under changing environmental conditions. It presents an algorithm that classifies the output of a range shift model for major tree species in Europe into multiple classes that can be linked to qualities characterizing the ecological niche of the species. The classes, i) Core distribution area, ii) Extended distribution area, iii) Occasional occurrence area, and iv) No occurrence area, are first theoretically developed and then statistically described. The classes are interpreted from an ecological point of view using criteria like population structure, competitive strength, site spectrum and vulnerability to biotic hazards. The functioning of the algorithm is demonstrated using the example of a generalized linear model that was fitted to a pan-European dataset of presence/absence of major tree species with downscaled climate data from a General Circulation Model (GCM). Applications of the algorithm to tree species suitability classification on a European and regional level are shown. The thresholds used by the algorithm are precision-based and include Cohen's Kappa. A validation of the algorithm using an independent dataset of the German National Forest Inventory shows good accordance of the statistically derived classes with ecological traits for Norway spruce, while the differentiation, especially between core and extended distribution for European beech, which is in the centre of its natural range in this area, is less accurate. We hypothesize that for species in the core of their range, regional factors like forest history superimpose climatic factors. Problems of uncertainty arising from potentially applying a multitude of modelling approaches and/or climate realizations within the range shift model are discussed, and a way to deal with the uncertainty by revealing the decision maker's underlying attitude towards risk is proposed. PMID:24486469

  19. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  20. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  1. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  2. Overview of the co-ordinated risk-based approach to science and management response and recovery for the 2012 eruptions of Tongariro volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, G. E.; Keys, H. J. R.; Procter, J. N.; Deligne, N. I.

    2014-10-01

    Tongariro volcano, New Zealand, lies wholly within the Tongariro National Park (TNP), one of New Zealand's major tourist destinations. Two small eruptions of the Te Maari vents on the northern flanks of Tongariro on 6 August 2012 and 21 November 2012 each produced a small ash cloud to < 8 km height accompanied by pyroclastic density currents and ballistic projectiles. The most popular day hike in New Zealand, the Tongariro Alpine Crossing (TAC), runs within 2 km of the Te Maari vents. The larger of the two eruptions (6 August 2012) severely impacted the TAC and resulted in its closure, affecting the local economy and potentially influencing national tourism. In this paper, we document the science and risk management response to the eruption, and detail how quantitative risk assessments were applied in a rapidly evolving situation to inform robust decision-making about when the TAC would be re-opened. The volcanologist and risk manager partnership highlights the value of open communication between scientists and stakeholders during a response to, and subsequent recovery from, a volcanic eruption.

  3. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    SciTech Connect

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.

  4. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subject to recurring floods with numerous consequences. From the perspective of preventing and managing this risk, the local authorities, in partnership with multidisciplinary researchers, have developed since 1988 a database built by the field teams, which specifically identifies all floods (place, date, impacts, damage, etc.). First, this historical database is compared to two other databases, those of the emergency services and of the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two is not negligible and provides a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration and depth of precipitated water). No clear relationship emerges between the number of floods and the rainfall characteristics, owing to the diversity of land uses and their permeability and to the types of local sewer network and urban water management. Finally, floods recorded in the database are compared spatially, using a GIS, with flooding simulated by sewer network modeling (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flood risk.

  5. Risk-based system refinement

    SciTech Connect

    Winter, V.L.; Berg, R.S.; Dalton, L.J.

    1998-06-01

    When designing a high consequence system, considerable care should be taken to ensure that the system cannot easily be placed into a high consequence failure state. A formal system design process should include a model that explicitly shows the complete state space of the system (including failure states) as well as those events (e.g., abnormal environmental conditions, component failures, etc.) that can cause a system to enter a failure state. In this paper the authors present such a model and formally develop a notion of risk-based refinement with respect to the model.

  6. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  7. Risk Based Security Management at Research Reactors

    SciTech Connect

    Ek, David R.

    2015-09-01

    This presentation provides background on what led to the international emphasis on nuclear security and describes how nuclear security is effectively implemented so as to preserve the societal benefits of nuclear and radioactive materials.

  8. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
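
    As a minimal sketch of the kind of PEA calculation reviewed here, a Monte Carlo average daily dose using the standard ADD formula; the distribution choices are illustrative, not from the paper:

        # ADD = C*IR*EF*ED / (BW*AT), with distributions for variability.
        import numpy as np

        rng = np.random.default_rng(6)
        n  = 100_000
        C  = rng.lognormal(np.log(2.0), 0.5, n)    # mg/L in drinking water
        IR = rng.normal(2.0, 0.5, n).clip(0.5)     # L/day intake
        EF, ED = 350.0, 30.0                       # days/yr, years
        BW = rng.normal(70.0, 12.0, n).clip(40)    # kg body weight
        AT = ED * 365.0                            # days (noncancer averaging time)

        add = C * IR * (EF * ED) / (BW * AT)
        print(np.percentile(add, [50, 95]))        # median and 95th-percentile dose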

  9. The Evidence for a Risk-Based Approach to Australian Higher Education Regulation and Quality Assurance

    ERIC Educational Resources Information Center

    Edwards, Fleur

    2012-01-01

    This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…

  10. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismically resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
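
    A toy sketch of the Green's-function summation described above, with synthetic unit-slip waveforms standing in for the precomputed tsunami Green's functions:

        # Coastal waveform for an arbitrary slip distribution as a slip-weighted
        # sum of precomputed unit-slip subfault waveforms (all data synthetic).
        import numpy as np

        rng = np.random.default_rng(7)
        n_subfaults, n_t = 20, 512
        t = np.linspace(0, 4 * 3600, n_t)                 # 4 h of signal, seconds
        arrivals = rng.uniform(600, 7200, n_subfaults)    # per-subfault arrival times
        lag = t[None, :] - arrivals[:, None]
        unit_wave = np.sin(2 * np.pi * lag / 1800.0) * (lag >= 0) * np.exp(-lag / 3600.0)

        slip = rng.gamma(2.0, 1.5, n_subfaults)           # one earthquake scenario (m)
        waveform = slip @ unit_wave                       # linear superposition
        print(f"peak amplitude: {np.abs(waveform).max():.2f} (arbitrary units)")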

  11. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application, probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.

  12. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering. PMID:25500464
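
    As a toy version of the trade-off in this formulation (enumeration in place of the paper's linear program; all numbers invented):

        # Choose a land-use mix on 10 ha that maximizes development benefit
        # minus expected flood damage; more channel area lowers flood probability.
        from itertools import product

        benefit = {"residential": 10.0, "commercial": 14.0}   # $/ha/yr
        damage  = {"residential": 30.0, "commercial": 45.0}   # $/ha if flooded
        p_flood = {0: 0.10, 1: 0.04, 2: 0.01}                 # annual flood prob. vs channel ha

        best = None
        for res, com in product(range(11), repeat=2):         # residential, commercial (ha)
            chan = 10 - res - com                             # remainder left as flood channel
            if chan < 0:
                continue
            p = p_flood[min(chan, 2)]
            net = (res * benefit["residential"] + com * benefit["commercial"]
                   - p * (res * damage["residential"] + com * damage["commercial"]))
            if best is None or net > best[0]:
                best = (net, res, com, chan)
        print(best)   # (net benefit, residential ha, commercial ha, channel ha)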

  13. Risk Based Inspection Pilot Study of Ignalina Nuclear Power Plant, Unit 2

    SciTech Connect

    Brickstad, Bjorn; Letzter, Adam; Klimasauskas, Arturas; Alzbutas, Robertas; Nedzinskas, Linas; Kopustinskas, Vytis

    2002-07-01

    A project with the acronym IRBIS (Ignalina Risk Based Inspection pilot Study) has been performed with the objective of carrying out a quantitative risk analysis of a total of 1240 stainless steel welds in Ignalina Nuclear Power Plant, Unit 2 (INPP-2). The damage mechanism is IGSCC, and the failure probabilities are quantified by using probabilistic fracture mechanics. The conditional core damage probabilities are taken from the plant PSA. (authors)

  14. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains; however, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making somewhat more complicated. One decision-support technique is the cost-loss approach, a risk-based method that determines whether or not to issue a warning or implement mitigation measures. With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it is motivated by economic values alone and is relatively static (a yes/no decision without further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with these situations and responses were analysed and potentially applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking large decisions with larger consequences all at once, the idea is that actions and decisions are cut up into smaller stages, and the final decision to implement is made based on the economic costs of the decisions and measures and the reduced effect of flooding. The more lead-time there is in
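
    The cost-loss rule itself is a one-liner; here is a minimal sketch with invented cost and loss figures.

    ```python
    def issue_warning(cost, loss_reduction, flood_probability):
        """Cost-loss rule: act when C/L <= p, i.e. when the expected avoided
        damage p*L is at least the cost C of responding."""
        return cost / loss_reduction <= flood_probability

    # A 20k response that prevents 200k of damage pays off once p >= 0.10:
    for p in (0.05, 0.10, 0.30):
        print(p, issue_warning(20_000, 200_000, p))
    ```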

  15. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
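
    The optimization the abstract mentions can be illustrated with a toy budget-constrained selection of risk-mitigation actions. The greedy benefit/cost heuristic and all figures below are stand-ins for illustration, not DDP's actual algorithm or data.

    ```python
    # Choose mitigation actions that maximize expected risk reduction
    # within a budget, using a greedy benefit-per-cost heuristic.
    actions = [
        # (name, cost, expected reduction in expected-loss units)
        ("add design reviews",      30.0, 120.0),
        ("unit-test critical core", 50.0, 200.0),
        ("prototype interface",     80.0, 180.0),
        ("formal spec of protocol", 60.0,  90.0),
    ]
    budget = 120.0

    chosen, spent = [], 0.0
    for name, cost, benefit in sorted(actions, key=lambda a: a[2] / a[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost

    print(chosen, spent)
    ```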

  16. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs on society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage the search for cost-effective preventive measures, and emphasize the importance of including the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills, as well as to other fields that need to cope with the challenging combination of low probabilities, high losses, and major uncertainties. PMID:25983196
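
    A stripped-down Monte Carlo version of the cost-benefit comparison might look as follows; the spill probability, cost distributions, and measure effectiveness are invented placeholders rather than the Bayesian network's estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000  # Monte Carlo samples of one planning period

    # Illustrative placeholder distributions, not the study's estimates:
    spill = rng.random(n) < 0.02                            # large spill occurs?
    combat = rng.lognormal(mean=17.0, sigma=0.6, size=n)    # open-sea combating cost
    cleanup = rng.lognormal(mean=17.5, sigma=0.8, size=n)   # shoreline clean-up + waste treatment
    env_loss = rng.lognormal(mean=16.5, sigma=1.0, size=n)  # lost environmental/recreational value

    baseline = np.where(spill, combat + cleanup + env_loss, 0.0)

    # A preventive measure that halves the spill probability, at an annualized cost:
    measure_cost = 2.0e6
    still_spills = spill & (rng.random(n) < 0.5)
    with_measure = np.where(still_spills, combat + cleanup + env_loss, 0.0)

    avoided = baseline.mean() - with_measure.mean()
    print(f"expected damage avoided: {avoided:,.0f}")
    print(f"net benefit of measure:  {avoided - measure_cost:,.0f}")
    ```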

  17. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology. PMID:26922500
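
    The α-cut discretization at the heart of the approach is easy to illustrate: each fuzzy parameter becomes an interval at a chosen α level, and a vertex-style evaluation propagates the intervals through the objective. The triangular fuzzy numbers and the toy objective below are assumptions for illustration only.

    ```python
    from itertools import product

    def alpha_cut(tfn, alpha):
        """Interval of a triangular fuzzy number (a, b, c) at a given alpha level."""
        a, b, c = tfn
        return (a + alpha * (b - a), c - alpha * (c - b))

    # Two fuzzy parameters of a toy net-benefit function f = p1*x - p2*x**2:
    p1 = (4.0, 5.0, 6.5)     # fuzzy unit benefit
    p2 = (0.01, 0.02, 0.03)  # fuzzy unit cost
    x = 100.0                # fixed water allocation, for illustration

    for alpha in (0.0, 0.5, 1.0):
        intervals = [alpha_cut(p, alpha) for p in (p1, p2)]
        # Vertex evaluation: the objective at every combination of endpoints.
        values = [v1 * x - v2 * x**2 for v1, v2 in product(*intervals)]
        print(alpha, (min(values), max(values)))
    ```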

  18. Risk based inspection for atmospheric storage tank

    NASA Astrophysics Data System (ADS)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack on a metallic material resulting from its reaction with the environment. It causes atmospheric storage tank leakage, material loss, environmental pollution, and equipment failure; it shortens the life of process equipment and ultimately causes financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant when operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries is given prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion of the equipment is discussed in terms of its likelihood and its consequences. The corrosion risk analysis outcome is used to formulate a Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on 'High Risk' and 'Medium Risk' items and less on the 'Low Risk' shell. Risk categories of the evaluated equipment are illustrated through the case study outcomes.
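
    The semi-quantitative screening step amounts to placing each item on a likelihood-consequence matrix. The sketch below uses invented category boundaries and tank parts, not the API 581 values.

    ```python
    # Semi-quantitative risk matrix in the spirit of RBI screening;
    # scores and boundaries are illustrative placeholders.
    def risk_category(likelihood, consequence):
        score = likelihood * consequence  # both on a 1..5 scale
        if score >= 15:
            return "High Risk"            # inspect first, shortest interval
        if score >= 6:
            return "Medium Risk"
        return "Low Risk"                 # minimal inspection effort

    tank_parts = {"shell course 1": (4, 4), "roof": (2, 3),
                  "bottom plate": (5, 3), "nozzle N3": (1, 2)}

    for part, (l, c) in tank_parts.items():
        print(f"{part:15s} -> {risk_category(l, c)}")
    ```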

  19. Potential advantages associated with implementing a risk-based inspection program by a nuclear facility

    NASA Astrophysics Data System (ADS)

    McNeill, Alexander, III; Balkey, Kenneth R.

    1995-05-01

    The current inservice inspection activities at a U.S. nuclear facility are based upon the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI. The Code selects examination locations based upon sampling criteria that include component geometry, stress, and usage, among others. This can result in a significant number of required examinations. As a result of regulatory action, each nuclear facility has conducted probabilistic risk assessments (PRA) or individual plant examinations (IPE), producing plant-specific risk-based information. Several initiatives have been introduced to apply this new plant risk information. Among these initiatives is risk-based inservice inspection. A code case has been introduced for piping inspections based upon this new risk-based technology. This effort, brought forward to the ASME Section XI Code committee, was initiated and championed by the ASME Research Task Force on Risk-Based Inspection Guidelines -- LWR Nuclear Power Plant Application. Preliminary assessments associated with the code case have revealed that a risk-based inservice inspection program offers potential advantages with regard to the number of examinations, risk, personnel exposure, and cost.

  20. A probabilistic and multi-objective conceptual design methodology for the evaluation of thermal management systems on air-breathing hypersonic vehicles

    NASA Astrophysics Data System (ADS)

    Ordaz, Irian

    This thesis addresses the challenges associated with thermal management system (TMS) evaluation and selection in the conceptual design of hypersonic, air-breathing vehicles with sustained cruise. The proposed methodology identifies analysis tools and techniques which allow the proper investigation of the design space for various thermal management technologies. The design space exploration environment and the alternative multi-objective decision-making technique, defined as Pareto-based Joint Probability Decision Making (PJPDM), are based on the approximation of 3-D Pareto frontiers and probabilistic technology effectiveness maps. These are generated through the evaluation of a Pareto fitness function and Monte Carlo analysis. In contrast to Joint Probability Decision Making (JPDM), the proposed PJPDM technique does not require preemptive knowledge of weighting factors for competing objectives or goal constraints, which can introduce bias into the final solution. Preemptive bias in a complex problem can degrade the overall capabilities of the final design. The implementation of PJPDM in this thesis eliminates the need for the numerical optimizer that is required with JPDM in order to improve upon a solution. In addition, a physics-based formulation is presented for the quantification of TMS safety effectiveness corresponding to debris impact/damage, and how it can be applied towards risk mitigation. Lastly, a formulation loosely based on non-preemptive goal programming with equally weighted deviations is provided for the resolution of the inverse design space. This key step helps link vehicle capabilities to TMS technology subsystems in a top-down design approach. The methodology provides the designer with more knowledge up front to help make proper engineering decisions and assumptions in the conceptual design phase regarding which technologies show the greatest promise, and how to guide future technology research.

  1. Risk-based inservice testing program modifications at Palo Verde nuclear generating station

    SciTech Connect

    Knauf, S.; Lindenlaub, B.; Linthicum, R.

    1996-12-01

    Arizona Public Service Company (APS) is investigating changes to the Palo Verde Inservice Testing (IST) Program that are intended to result in the reduction of the required test frequency for various valves in the American Society of Mechanical Engineers (ASME) Section XI IST program. The analytical techniques employed to select candidate valves and to demonstrate that these frequency reductions are acceptable are risk based. The results of the Palo Verde probabilistic risk assessment (PRA), updated in June 1994, and the risk-significance determination performed as part of the implementation efforts for 10 CFR 50.65 (the maintenance rule) were used to select candidate valves for extended test intervals. Additional component-level evaluations were conducted by an 'expert panel.' The decision to pursue these changes was facilitated by the ASME Risk-Based Inservice Testing Research Task Force, in which Palo Verde is participating as a pilot plant. The NRC's increasing acceptance of cost-beneficial licensing actions and risk-based submittals also provided incentive to seek these changes. Arizona Public Service is pursuing the risk-based IST program modification in order to reduce the unnecessary regulatory burden of the IST program through qualitative and quantitative analysis consistent with maintaining a high level of plant safety. The objectives of this project at Palo Verde are as follows: (1) Apply risk-based technologies to IST components to determine their risk significance (i.e., high or low). (2) Apply a combination of deterministic and risk-based methods to determine appropriate testing requirements for IST components, including improvement of testing methods and frequency intervals for high-risk-significant components. (3) Apply risk-based technologies to high-risk-significant components identified by the 'expert panel' and outside of the IST program to determine whether additional testing requirements are appropriate.

  2. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenge presented in using a risk-based decision-making process is to integrate risk assessment methods with cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  3. Probabilistic comparison of alternative characterization technologies at the Fernald Uranium-in-Soils Integrated Demonstration Project

    SciTech Connect

    Rautman, C.A.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.; Kaplan, P.G.

    1993-12-31

    The performance of four alternative characterization technologies proposed for use in characterization of surficial uranium contamination in soil at the Incinerator and Drum Baling Areas at the Fernald Environmental Management Project in southwestern Ohio has been evaluated using a probabilistic, risk-based decision-analysis methodology. The basis of comparison is to minimize a computed total cost for environmental cleanup. This total-cost-based approach provides a framework for evaluating the trade-offs among remedial investigation, the remedial design, and the risk of regulatory penalties. The approach explicitly recognizes the value of information provided by remedial investigation; additional measurements are only valuable to the extent that the information they provide reduces total cost.
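
    The value-of-information trade-off can be sketched numerically: sampling costs rise with the number of measurements, while estimation uncertainty, and hence over-remediation, shrinks. Every unit cost and the uncertainty model below are invented placeholders, not Fernald data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def expected_total_cost(n_samples, n_mc=200_000):
        """Total cost = investigation + remediation + expected penalty."""
        sampling_cost = 4_000.0 * n_samples
        sigma = 5_000.0 / np.sqrt(n_samples)  # estimate tightens with more samples
        true_volume = 20_000.0                # m^3 contaminated (unknown in reality)
        est = rng.normal(true_volume, sigma, n_mc)
        design = est + 1.645 * sigma          # remediate with a 95% safety margin
        remediation = 50.0 * design
        penalty = np.where(design < true_volume, 1.0e6, 0.0)  # fine if missed
        return sampling_cost + np.mean(remediation + penalty)

    for n in (4, 16, 64):                     # more sampling is not always cheaper
        print(n, f"{expected_total_cost(n):,.0f}")
    ```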

  4. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.

  5. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  6. Challenges in using probabilistic climate change information for impact assessments: an example from the water sector.

    PubMed

    New, Mark; Lopez, Ana; Dessai, Suraje; Wilby, Rob

    2007-08-15

    Climate change impacts and adaptation assessments have traditionally adopted a scenario-based approach, which precludes an assessment of the relative risks of particular adaptation options. Probabilistic impact assessments, especially if based on a thorough analysis of the uncertainty in an impact forecast system, enable adoption of a risk-based assessment framework. However, probabilistic impacts information is conditional and will change over time. We explore the implications of a probabilistic end-to-end risk-based framework for climate impacts assessment, using the example of water resources in the Thames River, UK. We show that a probabilistic approach provides more informative results that enable the potential risk of impacts to be quantified, but that details of the risks are dependent on the approach used in the analysis. PMID:17569650

  7. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction (lifing) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
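
    The quantity such a program ultimately reports, a probability of failure, can be illustrated with brute-force Monte Carlo on a toy limit state (NESSUS itself uses far more efficient methods, such as the advanced mean value method). The distributions below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    # Toy limit state g = R - S: failure when demand S exceeds capacity R.
    R = rng.normal(loc=500.0, scale=40.0, size=n)             # strength, MPa
    S = rng.lognormal(mean=np.log(350.0), sigma=0.1, size=n)  # stress, MPa

    pf = np.mean(R - S < 0.0)
    print(f"estimated probability of failure: {pf:.2e}")
    ```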

  9. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 12, Banks and Banking (2011 ed., vol. 6); … CORPORATION FUNDING AND FISCAL AFFAIRS; Risk-Based Capital Requirements; § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk; …

  10. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 12, Banks and Banking (2010 ed., vol. 6); … CORPORATION FUNDING AND FISCAL AFFAIRS; Risk-Based Capital Requirements; § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk; …

  11. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 13, Business Credit and Assistance (2010 ed., vol. 1); SMALL BUSINESS ADMINISTRATION; BUSINESS LOANS; Risk-Based Lender Oversight; Supervision; § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based …

  12. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 13, Business Credit and Assistance (2013 ed., vol. 1); SMALL BUSINESS ADMINISTRATION; BUSINESS LOANS; Risk-Based Lender Oversight; Supervision; § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based …

  13. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 12, Banks and Banking (2012 ed., vol. 7); … CORPORATION FUNDING AND FISCAL AFFAIRS; Risk-Based Capital Requirements; § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk; …

  14. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 12, Banks and Banking (2013 ed., vol. 7); … CORPORATION FUNDING AND FISCAL AFFAIRS; Risk-Based Capital Requirements; § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk; …

  15. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 13, Business Credit and Assistance (2014 ed., vol. 1); SMALL BUSINESS ADMINISTRATION; BUSINESS LOANS; Risk-Based Lender Oversight; Supervision; § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based …

  16. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 13, Business Credit and Assistance (2011 ed., vol. 1); SMALL BUSINESS ADMINISTRATION; BUSINESS LOANS; Risk-Based Lender Oversight; Supervision; § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based …

  17. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 12, Banks and Banking (2014 ed., vol. 7); … CORPORATION FUNDING AND FISCAL AFFAIRS; Risk-Based Capital Requirements; § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk; …

  18. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 13, Business Credit and Assistance (2012 ed., vol. 1); SMALL BUSINESS ADMINISTRATION; BUSINESS LOANS; Risk-Based Lender Oversight; Supervision; § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based …

  19. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to purely hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood-prone areas, and assets at risk in the protected areas is rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two-dimensional flood spreading simulation, and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • location, timing and probability of failure of defined sections of the flood defence line; • flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow; • impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or the economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? This presentation outlines the additional required workflows towards risk-based flood
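
    The probabilistic failure analysis of defence sections can be sketched with fragility curves: each section gets a probability of failure as a function of forecast water level. The lognormal fragility form and all parameter values below are assumptions for illustration, not FEWS-Risk data.

    ```python
    import math

    def failure_probability(water_level_m, median_m, beta):
        """Lognormal fragility curve: probability that a flood defence
        section fails at the given water level."""
        z = (math.log(water_level_m) - math.log(median_m)) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    sections = {"dike section A": (5.2, 0.15), "dike section B": (4.6, 0.25)}
    forecast_level = 4.8  # m, e.g. one ensemble member's peak water level

    for name, (median, beta) in sections.items():
        print(f"{name}: P(fail) = {failure_probability(forecast_level, median, beta):.2f}")
    ```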

  20. Risk based tiered approach (RBTA(SM)) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis that is applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTA(SM)) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTA(SM) in pollution prevention alternative selection required higher-tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTA(SM) for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTA(SM) is considered along with product performance, regulatory, and cost information by risk managers downselecting alternatives for implementation or further analysis. PMID:9433667

  1. 12 CFR 1750.13 - Risk-based capital level computation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 12, Banks and Banking (2013 ed.); OFFICE OF FEDERAL HOUSING ENTERPRISE OVERSIGHT, DEPARTMENT OF HOUSING …; § 1750.13 Risk-based capital level computation. … Enterprise at least quarterly by applying the risk-based capital test described in appendix A to this … section to the Enterprise. (2) Management and Operations Risk. To provide for management and operations …

  2. Application of risk-based methods to inservice inspection of piping systems

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; Oswald, E.; West, R.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice inspection (ISI) programs for piping systems in nuclear power plants. This paper discusses a pilot application of these methods to the inservice inspection of piping systems of Northeast Utilities Millstone Unit 3 nuclear power station. This demonstration study, which has been sponsored by the Westinghouse Owners Group (WOG), applies probabilistic safety assessment (PSA) models that have already been developed to meet regulatory requirements for an individual plant examination (IPE). The approach calculates the relative importance for each component within the systems of interest. This risk-importance is based on the frequency of core damage resulting from the structural failure of the component. The process inductively determines the effects that such failures have on the desired operational characteristics of the system being analyzed. Structural reliability/risk assessment (SRRA) models based on probabilistic structural mechanics methods are used to estimate failure probabilities for important components. Locations within a system with varying failure probabilities can be defined to focus ISI resources. This paper will discuss the above process and results to show that application of risk-based methods in the development of ISI programs can potentially result in significant savings while maintaining a high level of safety.
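
    The component-ranking step can be illustrated by recomputing core damage frequency with each component assumed failed and sorting by the resulting ratio, in the spirit of a risk achievement worth measure. All component names and frequencies below are invented placeholders, not results from the Millstone study.

    ```python
    # Rank components by how much an assumed structural failure raises
    # core damage frequency (CDF); values are illustrative placeholders.
    baseline_cdf = 2.0e-5  # per reactor-year

    cdf_given_failed = {   # conditional CDF with the component failed
        "RHR suction line weld": 9.0e-4,
        "feedwater pipe elbow":  6.0e-5,
        "service water header":  3.0e-4,
    }

    for comp, cdf in sorted(cdf_given_failed.items(),
                            key=lambda kv: kv[1], reverse=True):
        ratio = cdf / baseline_cdf  # risk achievement worth-style ratio
        print(f"{comp:25s} conditional CDF {cdf:.1e}  ratio {ratio:6.1f}")
    ```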

  3. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically, the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  4. Expert system development for probabilistic load simulation

    NASA Technical Reports Server (NTRS)

    Ho, H.; Newell, J. F.

    1991-01-01

    A knowledge-based system, LDEXPT, using the intelligent data base paradigm was developed for the Composite Load Spectra (CLS) project to simulate the probabilistic loads of a space propulsion system. The knowledge base approach provides a systematic framework for organizing the load information and facilitates the coupling of numerical processing and symbolic (information) processing. It provides an incremental development environment for building generic probabilistic load models and bookkeeping the associated load information. A large volume of load data is stored in the data base and can be retrieved and updated by a built-in data base management system. The data base system standardizes the data storage and retrieval procedures. It helps maintain data integrity and avoid data redundancy. The intelligent data base paradigm provides ways to build expert system rules for shallow and deep reasoning and thus provides expert knowledge to help users obtain the required probabilistic load spectra.

  5. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  6. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
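
    The weight calculation and the Bayes conversion the article walks through fit in a few lines. The m- and u-probabilities, the fields, and the prior below are illustrative assumptions.

    ```python
    import math

    # Fellegi-Sunter style parameters: m = P(field agrees | true match),
    # u = P(field agrees | non-match). Values are illustrative.
    fields = {
        "surname":    {"m": 0.95, "u": 0.01},
        "birth_year": {"m": 0.98, "u": 0.05},
        "postcode":   {"m": 0.90, "u": 0.02},
    }

    def match_weight(agreements):
        """Total match weight: sum of log2 likelihood ratios per field."""
        w = 0.0
        for f, p in fields.items():
            if agreements[f]:
                w += math.log2(p["m"] / p["u"])
            else:
                w += math.log2((1 - p["m"]) / (1 - p["u"]))
        return w

    def posterior(weight, prior_match_prob):
        """Convert a match weight into P(match | evidence) via Bayes theorem."""
        prior_odds = prior_match_prob / (1 - prior_match_prob)
        odds = prior_odds * 2.0 ** weight  # weight is a log2 likelihood ratio
        return odds / (1 + odds)

    w = match_weight({"surname": True, "birth_year": True, "postcode": False})
    print(w, posterior(w, prior_match_prob=1e-4))
    ```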

  7. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss which uses probabilistic techniques. The RSL (Receive Signal Level) is the factor which can evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to obtain the parameters of the distribution. This probabilistic solution gives us better measurement of the performance factors. In addition, it gives the probabilistic optimization of strategies such as the number of cells, cell location, capacity of cells, range of cells, and so on. In particular, the probabilistic optimization techniques by themselves can be applied to real-world problems such as computer networking, human resources, and manufacturing processes.
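
    A common way to cash out the CLT argument is to treat the RSL in dB as Gaussian (lognormal shadowing) and read coverage off the normal tail; the sketch below does exactly that, with invented link parameters.

    ```python
    import math

    def p_rsl_above(threshold_dbm, mean_rsl_dbm, sigma_db):
        """Probability the receive signal level exceeds a threshold, with
        RSL modelled as Gaussian in dB (lognormal shadowing via the CLT)."""
        z = (threshold_dbm - mean_rsl_dbm) / sigma_db
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Illustrative link: -85 dBm mean RSL, 8 dB shadowing, -95 dBm threshold.
    print(f"coverage probability: {p_rsl_above(-95.0, -85.0, 8.0):.3f}")
    ```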

  8. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  9. Performance- and risk-based regulation

    SciTech Connect

    Sauter, G.D.

    1994-12-31

    Risk-based regulation (RBR) and performance-based regulation (PBR) are two relatively new concepts for the regulation of nuclear reactor power plants by the U.S. Nuclear Regulatory Commission (NRC). Although RBR and PBR are often considered to be somewhat equivalent, they, in fact, address two fundamentally different regulatory questions. To fruitfully discuss these two concepts, it is important to recognize what each entails. This paper identifies those two fundamental questions and discusses how they are addressed by RBR and PBR.

  10. Risk-based monitored natural attenuation--a case study.

    PubMed

    Khan, F I; Husain, T

    2001-08-17

    The term "monitored natural attenuation" (MNA) refers to a reliance on natural attenuation (NA) processes for remediation through the careful monitoring of the behavior of a contaminant source in time and space domains. In recent years, policymakers are shifting to a risk-based approach where site characteristics are measured against the potential risk to human health and the environment, and site management strategies are prioritized to be commensurate with that risk. Risk-based corrective action (RBCA), a concept developed by the American Society for Testing Materials (ASTM), was the first indication of how this approach could be used in the development of remediation strategies. This paper, which links ASTM's RBCA approach with MNA, develops a systematic working methodology for a risk-based site evaluation and remediation through NA. The methodology is comprised of seven steps, with the first five steps intended to evaluate site characteristics and the feasibility of NA. If NA is effective, then the last two steps will guide the development of a long-term monitoring plan and approval for a site closure. This methodology is used to evaluate a site contaminated with oil from a pipeline spill. The case study concluded that the site has the requisite characteristics for NA, but it would take more than 80 years for attenuation of xylene and ethylbenzene, as these chemicals appear in the pure phase. If fast remediation is sought, then efforts should be made to remove the contaminant from the soil. Initially, the site posed a serious risk to both on-site and off-site receptors, but it becomes acceptable after 20 years, as the plume is diluted and drifts from its source of origin. PMID:11489527

  11. Probabilistic Approaches: Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process through composite mechanics, and structural component. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90 percent of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits widespread scatter at 90 percent cyclic-stress to static-strength ratios.

  12. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.

  13. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  14. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits widespread scatter at 90% cyclic-stress to static-strength ratios.

  15. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and use risk as a tradeable resource to make robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early-phase design.

  16. Probabilistic composite analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.

    1991-01-01

    Formal procedures are described which are used to computationally simulate the probabilistic behavior of composite structures. The computational simulation starts with the uncertainties associated with all aspects of a composite structure (constituents, fabrication, assembling, etc.) and encompasses all aspects of composite behavior (micromechanics, macromechanics, combined stress failure, laminate theory, structural response, and tailoring/optimization). Typical cases are included to illustrate the formal procedure for computational simulation. The collective results of the sample cases demonstrate that uncertainties in composite behavior and structural response can be probabilistically quantified.

  17. Risk-based targeting: A new approach in environmental protection

    SciTech Connect

    Fox, C.A.

    1995-12-31

    Risk-based targeting has recently emerged as an effective tool to help prioritize efforts to identify and manage geographic areas, chemicals, facilities, and agricultural activities that cause the most environmental degradation. This paper focuses on how the Environmental Protection Agency (EPA) has recently used risk-based targeting to identify and screen Federal, industrial, commercial and municipal facilities which contribute to probable human health (fish consumption advisories and contaminated fish tissue) and aquatic life (contaminated sediments) impacts. Preliminary results identified several hundred potential contributors of problem chemicals to probable impacts within the same river reach in 1991--93. Analysis by industry sector showed that the majority of the facilities identified were publicly owned treatment works (POTWs), in addition to industry organic and inorganic chemical manufacturers, petroleum refineries, and electric services, coatings, engravings, and allied services, among others. Both compliant and non-compliant potentially contributing facilities were identified to some extent in all EPA regions. Additional results identifying possible linkages of other pollutant sources to probable impacts, as well as estimation of potential exposure of these contaminants to minority and/or poverty populations are also presented. Out of these analyses, a number of short and long-term strategies are being developed that EPA may use to reduce loadings of problem contaminants to impacted waterbodies.

  18. Study of operational risk-based configuration control

    SciTech Connect

    Vesely, W E; Samanta, P K; Kim, I S

    1991-08-01

    This report studies aspects of a risk-based configuration control system to detect and control plant configurations from a risk perspective. Configuration control, as the term is used here, is the management of component configurations to achieve specific objectives. One important objective is to control risk and safety. Another is to operate efficiently and make effective use of available resources. PSA-based evaluations are performed to study the contributions of configurations to core-melt frequency and core-melt probability for two plants. Some equipment configurations can cause a large core-melt frequency, and a number of such configurations are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small, and the core-melt probability contributions are also generally small. The insights from this evaluation are used to develop the framework for an effective risk-based configuration control system. The focal points of such a system and the requirements for tools development for implementing the system are defined. The requirements of risk models needed for the system, and the uses of plant-specific data, are also discussed. 18 refs., 25 figs., 10 tabs.

  19. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  20. Probabilistic, Multidimensional Unfolding Analysis

    ERIC Educational Resources Information Center

    Zinnes, Joseph L.; Griggs, Richard A.

    1974-01-01

    Probabilistic assumptions are added to single and multidimensional versions of the Coombs unfolding model for preferential choice (Coombs, 1950) and practical ways of obtaining maximum likelihood estimates of the scale parameters and goodness-of-fit tests of the model are presented. A Monte Carlo experiment is discussed. (Author/RC)

  1. Concepts for risk-based surveillance in the field of veterinary medicine and veterinary public health: Review of current approaches

    PubMed Central

    Stärk, Katharina DC; Regula, Gertraud; Hernandez, Jorge; Knopf, Lea; Fuchs, Klemens; Morris, Roger S; Davies, Peter

    2006-01-01

    Background Emerging animal and zoonotic diseases and increasing international trade have resulted in an increased demand for veterinary surveillance systems. However, the human and financial resources available to support government veterinary services are becoming increasingly limited in many countries worldwide. Intuitively, issues that present higher risks merit higher priority for surveillance resources, as the investments will yield higher benefit-cost ratios. The rapid rate of acceptance of this core concept of risk-based surveillance has outpaced the development of its theoretical and practical bases. Discussion The principal objectives of risk-based veterinary surveillance are to identify surveillance needs to protect the health of livestock and consumers, to set priorities, and to allocate resources effectively and efficiently. An important goal is to achieve a higher benefit-cost ratio with existing or reduced resources. We propose to define risk-based surveillance systems as those that apply risk assessment methods in different steps of traditional surveillance design for early detection and management of diseases or hazards. In risk-based designs, the public health, economic, and trade consequences of diseases play an important role in the selection of diseases or hazards. Furthermore, certain strata of the population of interest have a higher probability of being sampled for detection of diseases or hazards. Evaluation of risk-based surveillance systems should demonstrate that their efficacy is equal to or higher than that of traditional systems, while their efficiency (benefit-cost ratio) is higher. Summary Risk-based surveillance considerations are useful to support both strategic and operational decision making. This article highlights applications of risk-based surveillance systems in the veterinary field including food safety. Examples are provided for risk-based hazard selection, risk-based selection of sampling strata as

  2. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In the proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD) the dialogue is encoded in a deterministic way. (2) The pre-shared secret key between the two communicants can be reused without any security loophole. (3) Each dialogue in the proposed PAQD can be exchanged with only one step of quantum communication and one step of classical communication, whereas in the state-of-the-art AQD protocols both communicants have to run a QKD protocol for each dialogue, and each dialogue requires multiple quantum as well as classical communication steps. (4) Furthermore, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and other well-known attacks.

  3. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  4. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  5. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  6. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. There are several types of fatigue that must be considered in the design, including low cycle, high cycle, and combined fatigue under different cyclic loading conditions - for example, mechanical, thermal, and erosive. The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design; however, it is time consuming, costly, and in general must be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring, and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope local material-point damage up through the scales all the way to the structural component, and to probabilistically decompose structural loads and boundary conditions down through the scales all the way to the material point. Additional features include a multifactor interaction model that probabilistically describes the evolution of material properties and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility, and inclusiveness of computational simulation versus testing are discussed, and guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  7. Probabilistic simple splicing systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2014-06-01

    A splicing system, one of the early theoretical models for DNA computing, was introduced by Head in 1987. Splicing systems are based on the splicing operation which, informally, cuts two strings of DNA molecules at specific recognition sites and attaches the prefix of the first string to the suffix of the second string, and the prefix of the second string to the suffix of the first string, thus yielding new strings. For a specific type of splicing system, namely the simple splicing system, the recognition sites are the same for both strings of DNA molecules. It is known that splicing systems with finite sets of axioms and splicing rules generate only regular languages. Hence, different types of restrictions have been considered for splicing systems in order to increase their computational power. Recently, probabilistic splicing systems have been introduced, where probabilities are initially associated with the axioms and the probabilities of the generated strings are computed from the probabilities of the initial strings. In this paper, some properties of probabilistic simple splicing systems are investigated. We prove that probabilistic simple splicing systems can also increase the computational power of the splicing languages generated.
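
    A minimal sketch of one probabilistic splicing step may help fix ideas: both strings are cut at the same recognition site, and the generated string inherits the product of its parents' probabilities. The axioms, site, and probabilities below are illustrative, not drawn from the paper.

    ```python
    # Minimal sketch of one probabilistic simple splicing step (axioms and
    # probabilities are illustrative). In a simple splicing system both strings
    # are cut at the same recognition site; the generated string inherits the
    # product of its parents' probabilities.
    def splice(x, px, y, py, site):
        """Cut x and y after the first occurrence of `site` and cross the
        parts: returns prefix of x + suffix of y with probability px*py."""
        i = x.index(site) + len(site)
        j = y.index(site) + len(site)
        return x[:i] + y[j:], px * py

    axioms = {"aacgttt": 0.6, "ggcgaa": 0.4}
    (w1, p1), (w2, p2) = axioms.items()
    word, prob = splice(w1, p1, w2, p2, site="cg")
    print(word, prob)   # -> 'aacgaa' 0.24
    ```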

  8. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle): cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences, such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
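
    The sketch below illustrates the MAIMS effect in a Monte Carlo cost roll-up: each element's cost is a three-parameter Weibull draw, but the realized cost never falls below the allocated budget, so underruns are never returned to the project. All numbers are hypothetical, the elements are sampled independently (the paper additionally correlates them), and this is not the authors' elicitation or @Risk implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # MAIMS-aware Monte Carlo cost roll-up (all numbers hypothetical).
    # Each element is a three-parameter Weibull: location + scale*Weibull(shape).
    # Under "Money Allocated Is Money Spent", an element's underrun is not
    # returned, so its realized cost is max(sampled cost, allocated budget).
    elements = [  # (location, scale, shape, allocated budget), in $M
        (2.0, 1.5, 1.8, 3.0),
        (5.0, 2.5, 2.2, 6.5),
        (1.0, 0.8, 1.5, 1.5),
    ]
    N = 100_000
    total = np.zeros(N)
    for loc, scale, shape, budget in elements:
        cost = loc + scale * rng.weibull(shape, N)   # independent draws here;
        total += np.maximum(cost, budget)            # the paper also correlates
    print(f"P50 = {np.percentile(total, 50):.2f} $M, "
          f"P80 = {np.percentile(total, 80):.2f} $M")
    ```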

  9. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  10. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining the benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications show the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
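
    A back-of-envelope sketch of the underlying risk arithmetic: for a standby component, lengthening the surveillance test interval raises its time-average unavailability, and the core-damage-frequency change can be approximated through an importance measure. The numbers and the simple q(T) model below are illustrative placeholders, far simpler than what SOCRATES actually computes.

    ```python
    # Sketch of how a surveillance test interval (STI) change shifts risk, in
    # the spirit of (but much simpler than) SOCRATES. For a standby component
    # with failure rate lam, time-average unavailability is roughly
    # q(T) ~ rho + lam*T/2 for test interval T; the core-damage-frequency
    # change is approximated with the component's Birnbaum importance.
    LAM = 1e-5       # standby failure rate per hour (hypothetical)
    RHO = 1e-3       # per-demand failure probability (hypothetical)
    BIRNBAUM = 4e-5  # dCDF/dq for this component, per year (hypothetical)

    def unavailability(T_hours):
        return RHO + LAM * T_hours / 2.0

    q_old = unavailability(720)    # monthly testing
    q_new = unavailability(2190)   # quarterly testing
    print(f"delta CDF ~ {BIRNBAUM * (q_new - q_old):.2e} per year")
    ```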

  11. Risk-Based Data Management System design specifications and implementation plan for the Alaska Oil and Gas Conservation Commission; the Mississippi State Oil and Gas Board; the Montana Board of Oil and Gas Conservation; and the Nebraska Oil and Gas Conservation Commission

    SciTech Connect

    Not Available

    1993-09-01

    The purpose of this document is to present design specifications and an implementation schedule for the development and implementation of Risk Based Data Management Systems (RBDMSs) in the states of Alaska, Mississippi, Montana, and Nebraska. The document presents detailed design information, including a description of the system database structure, the data dictionary, data entry and inquiry screen layouts, specifications for standard reports that will be produced by the system, functions and capabilities (including environmental risk analyses), and table relationships for each database table within the system. This design information provides a comprehensive blueprint of the system to be developed and presents the necessary detailed information for system development and implementation. A proposed schedule for development and implementation is also presented. The schedule presents timeframes for the development of system modules, training, implementation, and providing assistance to the states with data conversion from existing systems. However, the schedule will vary depending upon the timing of funding allocations from the United States Department of Energy (DOE) for the development and implementation phase of the project. For planning purposes, the schedule assumes that the development and implementation phase will commence November 1, 1993, somewhat later than originally anticipated.

  12. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially where more formal workflows are used. In this paper we focus on probabilistic workflows and show algorithms for time computations in them. With activity times modeled more precisely, we can improve work cooperation and the analysis of cooperation, including simulation and visualization.
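
    A Monte Carlo sketch of such a time computation is shown below for a hypothetical four-activity workflow with one probabilistic branch; the durations and branch probability are invented for illustration and are not taken from the paper.

    ```python
    import random

    random.seed(1)

    # Monte Carlo timing of a small probabilistic workflow (numbers are
    # hypothetical). After activity A the flow branches: with p=0.7 it runs
    # B then D, with p=0.3 it reworks via C and then runs D.
    DUR = {"A": (2, 4), "B": (1, 3), "C": (4, 6), "D": (1, 2)}  # uniform, hours

    def run_once():
        t = random.uniform(*DUR["A"])
        if random.random() < 0.7:
            t += random.uniform(*DUR["B"])
        else:
            t += random.uniform(*DUR["C"])
        return t + random.uniform(*DUR["D"])

    samples = sorted(run_once() for _ in range(50_000))
    print(f"mean = {sum(samples)/len(samples):.2f} h, "
          f"p90 = {samples[int(0.9*len(samples))]:.2f} h")
    ```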

  13. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  14. Probabilistic analysis of mechanical systems

    SciTech Connect

    Priddy, T.G.; Paez, T.L.; Veers, P.S.

    1993-09-01

    This paper proposes a framework for the comprehensive analysis of complex problems in probabilistic structural mechanics. Tools that can be used to accurately estimate the probabilistic behavior of mechanical systems are discussed, and some of the techniques proposed in the paper are developed and used in the solution of a problem in nonlinear structural dynamics.

  15. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge remains the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case - connecting the probability of a configuration in the stationary distribution to its number of zero-one borders - the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
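
    The long-run behavior described above is easy to probe empirically. The sketch below runs a synchronous probabilistic automaton on a ring (a majority rule with a small flip probability, an illustrative choice rather than the article's exact rule) and summarizes stationary behavior by the long-run density of ones.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A synchronous probabilistic cellular automaton on a ring. Each cell
    # copies the majority of its 3-cell neighborhood, but with probability EPS
    # it flips instead; EPS > 0 makes the chain ergodic, so we summarize the
    # stationary distribution by the empirical density of ones.
    N, EPS, STEPS = 64, 0.05, 5000
    state = rng.integers(0, 2, N)
    densities = []
    for _ in range(STEPS):
        neigh = np.roll(state, 1) + state + np.roll(state, -1)
        majority = (neigh >= 2).astype(int)
        flips = rng.random(N) < EPS
        state = np.where(flips, 1 - majority, majority)   # synchronous update
        densities.append(state.mean())
    print(f"long-run mean density ~ {np.mean(densities[1000:]):.3f}")
    ```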

  16. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space, and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first-order logic to reason about situations involving uncertainty.

  17. Probabilistic Finite Element: Variational Theory

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.

    1985-01-01

    The goal of this research is to provide techniques which are cost-effective and enable the engineer to evaluate the effect of uncertainties in complex finite element models. Embedding the probabilistic aspects in a variational formulation is a natural approach. In addition, a variational approach to probabilistic finite elements enables them to be incorporated within standard finite element methodologies; therefore, once the procedures are developed, they can easily be adapted to existing general-purpose programs. Furthermore, the variational basis for these methods enables them to be adapted to a wide variety of structural elements and provides a consistent basis for incorporating probabilistic features in many aspects of the structural problem. Completed tasks include the theoretical development of probabilistic variational equations for structural dynamics, the development of efficient numerical algorithms for probabilistic sensitivity displacement and stress analysis, and the integration of these methodologies into a pilot computer code.

  18. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    NASA Astrophysics Data System (ADS)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S. closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee process (NRC-NUREG/CR-6372) concepts, evaluating and integrating the

  19. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality, so air quality monitoring must target the credible residual risks. Constraints on monitoring devices are severe. Monitoring must transition from archival to real-time, on-board measurement, and data must be provided to the crew in a form they can interpret. Dust management and monitoring may be a major concern for exploration-class missions.

  20. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems, and a hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed for local conditions with the aim of downgrading heavy warnings. This study takes Taihu Basin as an example. After application and verification of the forewarning model against actual and simulated data on water pollution risk from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, and that it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk. PMID:24194413
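
    The decision core of such a model is the maximum-posterior rule, sketched below with a discrete set of warning levels and invented prior and likelihood numbers (the actual model derives these from hydrological simulation and screened indexes).

    ```python
    import numpy as np

    # Maximum-posterior warning rule (all numbers hypothetical).
    # Levels: 0 = none, 1 = light, 2 = heavy, 3 = severe warning.
    prior = np.array([0.40, 0.30, 0.20, 0.10])   # e.g. historical frequencies
    # likelihood[k, j] = P(observed index class j | true warning level k)
    likelihood = np.array([
        [0.70, 0.20, 0.08, 0.02],
        [0.20, 0.50, 0.20, 0.10],
        [0.05, 0.25, 0.50, 0.20],
        [0.02, 0.08, 0.30, 0.60],
    ])
    observed = 3                                  # simulated index: worst class
    posterior = prior * likelihood[:, observed]
    posterior /= posterior.sum()
    print("posterior:", posterior.round(3), "-> level", int(posterior.argmax()))
    ```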

  1. Risk-based inspection of pressurizer surge lines

    NASA Astrophysics Data System (ADS)

    Shah, Nitin J.; Dwivedy, Keshab K.

    1996-11-01

    The Reactor Coolant System (RCS) piping of a pressurized water reactor (PWR) plant is probably the best in terms of resistance to known degradation mechanisms of passive components. However, a failure in the RCS piping is extremely important in terms of safety and economic significance. Therefore, an effective management tool is needed to mitigate the potential effects of degradation due to aging or other effects, such that plant reliability and availability are not affected. Currently, the RCS piping of all US PWR plants is subjected to inservice inspection (ISI) based upon certain deterministic criteria set by the ASME code and the NRC regulatory guide. Even though the history of large RCS piping has not shown any degradation, the ISI continues at many locations at great expense to the plant owners, whereas there may be only a few locations of relatively high vulnerability. A risk-based ISI can provide an alternative and cost-effective solution in this situation. The pressurizer surge line is a unique segment of the RCS which is subjected to significant transient loadings due to stratification and striping during normal heatup and cooldown; therefore, the surge line is considered for illustration. Examples of structural reliability studies of pressurizer surge lines in four PWR units are presented in this paper to demonstrate the possible reduction of ISI and significant cost savings without reduction of plant safety or reliability.

  2. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements; cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip-band decohesion model is used to determine the crack nucleation life and size, a crack tip opening displacement model is used to determine the small-crack growth life and size, and the Paris law is used to determine the long-crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
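
    A compressed sketch of the Monte Carlo roll-up is given below: total life is sampled as a nucleation life plus a long-crack life from a closed-form Paris-law integration. The small-crack stage and the micromechanical models are collapsed into crude lognormal stand-ins, and all inputs are illustrative rather than the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Monte Carlo fatigue-life roll-up (illustrative inputs, not the paper's
    # micromechanical models). Long-crack life integrates the Paris law
    # da/dN = C*(dK)^m with dK = ds*sqrt(pi*a) in closed form.
    N_SIM, DS = 20_000, 200.0                        # stress range, MPa
    C = 10 ** rng.normal(-11.8, 0.2, N_SIM)          # Paris coefficient scatter
    m = 3.0
    a_i = rng.lognormal(np.log(2e-5), 0.3, N_SIM)    # initial crack size (m)
    a_f = 5e-3                                       # final crack size (m)
    # Nucleation life: a crude lognormal stand-in for the slip-band model.
    n_nucleation = rng.lognormal(np.log(4e4), 0.5, N_SIM)

    def paris_cycles(C, m, ds, a0, af):
        k = C * (ds * np.sqrt(np.pi)) ** m
        e = 1.0 - m / 2.0
        return (af ** e - a0 ** e) / (k * e)

    total_life = n_nucleation + paris_cycles(C, m, DS, a_i, a_f)
    print(f"median life ~ {np.median(total_life):.2e} cycles, "
          f"B10 ~ {np.percentile(total_life, 10):.2e}")
    ```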

  3. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    This paper proposes a novel probabilistic potential function neural network classifier algorithm to deal with classes that are multi-modally distributed and formed from sets of disjoint pattern clusters. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and pseudocode is presented. A simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented; the benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease, and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better on a subset of problems for which other neural classifiers perform relatively poorly.
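
    For orientation, the sketch below implements a generic potential-function (Parzen/PNN-style) classifier, the family the proposed algorithm belongs to, though not the paper's specific algorithm: each class score is a sum of Gaussian kernels centered on that class's training patterns, which copes naturally with multi-modal classes made of disjoint clusters.

    ```python
    import numpy as np

    # Generic potential-function (Parzen/PNN-style) probabilistic classifier.
    # Class scores are kernel sums over that class's training patterns, so a
    # class formed from disjoint clusters is handled without any unimodality
    # assumption. Data and bandwidth below are illustrative.
    def pnn_predict(X_train, y_train, x, sigma=0.5):
        scores = {}
        for c in np.unique(y_train):
            Xc = X_train[y_train == c]
            d2 = ((Xc - x) ** 2).sum(axis=1)
            scores[c] = np.exp(-d2 / (2 * sigma**2)).mean()
        total = sum(scores.values())
        return {c: s / total for c, s in scores.items()}

    X = np.array([[0, 0], [0.2, 0.1], [3, 3], [3.1, 2.9], [0, 3]])
    y = np.array([0, 0, 0, 0, 1])   # class 0 is bimodal (two disjoint clusters)
    print(pnn_predict(X, y, np.array([2.8, 3.0])))
    ```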

  4. Probabilistic fracture finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-01-01

    Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probabilistic Finite Element Method (PFEM), which is based on second-moment analysis, has proved to be a promising, practical approach for handling problems with uncertainties. As the PFEM provides a powerful computational tool to determine the first and second moments of random parameters, the second-moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.

  5. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.

  6. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  7. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al., submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models, and on the other hand by comparing the results with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto, A.; Kreibich, H.; Merz, B.; Schröter, K. (submitted): Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
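
    The mechanism for getting a loss distribution out of bagged trees can be sketched generically: train an ensemble on bootstrap samples and read the spread of per-tree predictions as the predictive distribution. The data and predictors below are synthetic stand-ins, not BT-FLEMO or its training data.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(2)

    # Synthetic sketch of the bagging-decision-tree idea behind probabilistic
    # loss models: the spread of per-tree predictions yields a distribution of
    # relative loss rather than a point value.
    n = 600
    X = np.column_stack([
        rng.uniform(0, 3, n),     # water depth (m), hypothetical predictor
        rng.integers(0, 2, n),    # precaution indicator, hypothetical
    ])
    y = (0.3 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 0.05, n)).clip(0, 1)

    model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                             random_state=0).fit(X, y)
    x_new = np.array([[1.8, 0]])
    per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
    print(f"loss: median {np.median(per_tree):.2f}, 90% band "
          f"[{np.percentile(per_tree, 5):.2f}, {np.percentile(per_tree, 95):.2f}]")
    ```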

  8. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  9. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  10. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.
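
    One of the efficient methods alluded to can be illustrated with a first-order second-moment (FOSM) sketch, which propagates input means and covariances through a linearization of the response; the example response and all numbers are hypothetical, and this is not the paper's specific formulation.

    ```python
    import numpy as np

    # First-order second-moment (FOSM) sketch: propagate the mean and
    # covariance of random inputs X through a response g(X) by linearizing at
    # the mean -- a cheap probabilistic-analysis building block.
    def fosm(g, mu, cov, rel=1e-6):
        mu, cov = np.asarray(mu, float), np.asarray(cov, float)
        g0 = g(mu)
        grad = np.zeros(len(mu))
        for i in range(len(mu)):
            step = rel * max(1.0, abs(mu[i]))   # relative finite-difference step
            x = mu.copy(); x[i] += step
            grad[i] = (g(x) - g0) / step
        return g0, grad @ cov @ grad            # approx. mean and variance

    # Example: cantilever tip displacement g = P*L^3/(3*E*I) with uncertain,
    # uncorrelated load P and modulus E (numbers hypothetical).
    L_, I_ = 2.0, 8e-6
    g = lambda x: x[0] * L_**3 / (3.0 * x[1] * I_)
    mean, var = fosm(g, mu=[1.0e4, 2.1e11], cov=[[1.0e6, 0.0], [0.0, 1.0e20]])
    print(f"tip: mean = {mean*1e3:.2f} mm, std = {np.sqrt(var)*1e3:.2f} mm")
    ```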

  11. Do probabilistic forecasts lead to better decisions? Try it yourself!

    NASA Astrophysics Data System (ADS)

    van Andel, S. J.; Pappenberger, F.; Ramos, M. H.; Thielen, J.

    2012-04-01

    The last decade has seen much research into producing and increasing the reliability of probabilistic hydro-meteorological forecasts, following the promise that, armed with information about uncertainty, people would make better risk-based decisions. In recent years, therefore, research and operational developments have also started to pay attention to ways of communicating probabilistic forecasts to decision makers. Communicating uncertain forecasts includes preparing tools and products for visualization, but also understanding how forecasters perceive and use uncertain information in real time for decision-making. The question of proper communication has by no means been conclusively answered, nor has the question of improved decision making. In this presentation we will engage in a small but exciting live experiment in which several cases of flood forecasts and a multiple choice of actions will be presented to participants, who will act as decision makers. Answers will be collected and analyzed directly. Results will be presented and discussed together with the participants of the session to see if indeed we make better decisions on the basis of probabilistic forecasts.

  12. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
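
    The EVT calibration idea can be sketched generically: fit a Weibull to the tail of positive-class scores near the decision boundary and read class-inclusion probability off its CDF. The sketch below is a stand-in under those assumptions with synthetic scores, not the published PI-SVM/W-SVM fitting procedure.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)

    # Generic EVT-style score calibration for open set rejection: fit a
    # Weibull to the low tail of positive-class decision scores (the region
    # near the boundary), then read class-inclusion probability off its CDF.
    pos_scores = rng.normal(2.0, 0.7, 500)           # synthetic positive scores
    tail = np.sort(pos_scores)[:50]                  # lowest scores: the tail
    shape, loc, scale = stats.weibull_min.fit(tail)  # 3-parameter Weibull fit

    def p_inclusion(score):
        return stats.weibull_min.cdf(score, shape, loc=loc, scale=scale)

    for s in (0.2, 1.0, 2.5):
        print(f"score {s:.1f} -> P(known class) ~ {p_inclusion(s):.2f}")
    ```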

  13. Application of risk-based methods to optimize inspection planning for regulatory activities at nuclear power plants

    SciTech Connect

    Wong, S.M.; Higgins, J.C.; Martinez-Guridi, G.

    1995-07-01

    As part of regulatory oversight requirements, the U.S. Nuclear Regulatory Commission (USNRC) staff conducts inspection activities to assess operational safety performance in nuclear power plants. Currently, guidance for these inspections is provided by procedures in the NRC Inspection Manual and by the issuance of Temporary Instructions defining the objectives and scope of the inspection effort. In several studies sponsored by the USNRC over the last few years, Brookhaven National Laboratory (BNL) has developed and applied methodologies for providing risk-based inspection guidance for the safety assessments of nuclear power plant systems. One recent methodology integrates insights from existing Probabilistic Risk Assessment (PRA) studies and Individual Plant Evaluations (IPE) with information from operating experience reviews for consideration in inspection planning for either multi-disciplinary team inspections or individual inspections. In recent studies at BNL, a risk-based methodology was developed to optimize inspection planning for regulatory activities at nuclear power plants. This methodology integrates risk-based insights from the plant configuration risk profile and risk information found in existing PRA/IPE studies.

  14. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  15. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  16. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  17. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform especially decision makers in the insurance industry, the administration, and politicians on potential consequences and are the basis for appropriate risk management strategies. Thereby, results (i) with an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides this analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  18. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations features that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle system. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  19. Risky business: the risk-based, risk-sharing capitated HMO.

    PubMed

    Kazahaya, G I

    1986-08-01

    Hospitals are encountering a new type of HMO--the risk-based, risk-sharing capitated HMO. This new HMO arrangement redefines the roles of the hospital, the physicians, and the HMO plan involved. Instead of placing the HMO at risk, the hospital and physicians are now financially responsible for services covered under the HMO plan. The capitated HMO is reduced to a third-party payer, serving as a broker between subscribers and providers. In this first of two articles on capitated HMOs, the risk-based, risk-sharing capitated HMO and its relationships to hospitals and physicians are defined. The second article will take this definition and apply it to managing, monitoring, and reporting on these types of programs from an accounting perspective. PMID:10277301

  20. Orbitofrontal or accumbens dopamine depletion does not affect risk-based decision making in rats.

    PubMed

    Mai, Bettina; Hauber, Wolfgang

    2015-09-01

    Considerable evidence has implicated dopamine (DA) signals in target regions of midbrain DA neurons such as the medial prefrontal cortex or the core region of the nucleus accumbens in controlling risk-based decision-making. However, to date little is known about the contribution of DA in the orbitofrontal cortex (OFC) and the medial shell region of the nucleus accumbens (AcbS) to risk-based decision-making. Here we examined in rats the effects of 6-hydroxydopamine-induced DA depletions of the OFC and AcbS on risky choice using an instrumental two-lever choice task that requires the assessment of fixed within-session reward probabilities that were shifted across subsequent sessions, i.e., rats had to choose between two levers, a small/certain lever that delivered one certain food reward (one pellet at p = 1) and a large/risky lever that delivered a larger uncertain food reward with decreasing probabilities across subsequent sessions (four pellets at p = 0.75, 0.5, 0.25, 0.125, 0.0625). Results show that systemic administration of amphetamine or cocaine increased the preference for the large/risky lever. Results further demonstrate that, like sham controls, rats with OFC or AcbS DA depletion were sensitive to changes in probabilities for obtaining the large/risky reward across sessions and displayed probabilistic discounting. These findings point to the view that the basal capacity to evaluate the magnitude and likelihood of rewards associated with alternative courses of action as well as long-term changes of reward probabilities does not rely on DA input to the AcbS or OFC. PMID:25860659

  1. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents the Kennedy Space Center Independent Assessment team's work on three assessments for the Ground Systems Development and Operations (GSDO) Program, completed to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability-versus-time graphs from the first two assessments, there was a soft knee in the figure-of-merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability-versus-time graphs that were in line with aerospace industry norms.
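
    The shape of such survivability-versus-time curves can be reproduced with a toy model, sketched below: each hazard scenario carries a probability and a distribution of time until conditions become non-survivable, and survivability at a candidate egress time is the probability-weighted chance that that time is beaten. All scenario inputs are invented, not GSDO data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Survivability-versus-time sketch (hypothetical inputs). Each scenario
    # has a probability p_i (given an emergency) and a distribution of time
    # T_i until conditions become non-survivable; for egress time t,
    # survivability = sum_i p_i * P(T_i > t). Instantaneous events
    # (probability 0.2 here) contribute no survivable time, capping the curve.
    scenarios = [                     # (p_i, lognormal params of T_i, minutes)
        (0.5, (np.log(12.0), 0.4)),   # slowly developing hazard: median 12 min
        (0.3, (np.log(6.0), 0.5)),    # faster-developing hazard: median 6 min
    ]
    t = np.linspace(0.0, 20.0, 201)   # candidate egress times, minutes
    surv = np.zeros_like(t)
    for p, (mu, sig) in scenarios:
        T = rng.lognormal(mu, sig, 100_000)
        surv += p * np.array([(T > ti).mean() for ti in t])
    for ti, s in zip(t[::40], surv[::40]):
        print(f"egress time {ti:5.1f} min -> survivability {s:.2f}")
    ```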

  2. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

    The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel for capturing an HDR scene by using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains much more scene detail than any of the captured SDR images. Moreover, the proposed scheme also functions as the tone mapping of an HDR image to an SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches. PMID:21609883

  3. A risk-based approach to scheduling audits.

    PubMed

    Rönninger, Stephan; Holmes, Malcolm

    2009-01-01

    The manufacture and supply of pharmaceutical products can be a very complex operation. Companies may purchase a wide variety of materials, from active pharmaceutical ingredients to packaging materials, from "in company" suppliers or from third parties. They may also purchase or contract a number of services such as analysis, data management, audit, among others. It is very important that these materials and services are of the requisite quality in order that patient safety and company reputation are adequately protected. Such quality requirements are ongoing throughout the product life cycle. In recent years, assurance of quality has been derived via audit of the supplier or service provider and by using periodic audits, for example, annually or at least once every 5 years. In the past, companies may have used an audit only for what they considered to be "key" materials or services and used testing on receipt, for example, as their quality assurance measure for "less important" supplies. Such approaches changed as a result of pressure from both internal sources and regulators to the time-driven audit for all suppliers and service providers. Companies recognised that eventually they would be responsible for the quality of the supplied product or service and audit, although providing only a "snapshot in time" seemed a convenient way of demonstrating that they were meeting their obligations. Problems, however, still occur with the supplied product or service and will usually be more frequent from certain suppliers. Additionally, some third-party suppliers will no longer accept routine audits from individual companies, as the overall audit load can exceed one external audit per working day. Consequently a different model is needed for assessing supplier quality. This paper presents a risk-based approach to creating an audit plan and for scheduling the frequency and depth of such audits. The approach is based on the principles and process of the Quality Risk Management

  4. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as the net present value distributions over various forecast horizons.
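
    The financial-risk component lends itself to a compact Monte Carlo illustration: sample uncertain construction costs and yearly net cash flows, discount them, and read risk measures off the resulting NPV distribution. All figures and distributions below are invented for illustration and are unrelated to the CCSI tool suite.

      import numpy as np

      rng = np.random.default_rng(7)
      n, years, rate = 100_000, 20, 0.08

      # Illustrative distributions (all values hypothetical): an uncertain
      # up-front plant cost and uncertain yearly net operating cash flows.
      capex = rng.normal(500e6, 60e6, n)
      net_cash = rng.normal(65e6, 15e6, (n, years))

      discount = (1.0 + rate) ** -np.arange(1, years + 1)
      npv = net_cash @ discount - capex            # NPV distribution over n futures

      print(f"mean NPV  = {npv.mean()/1e6:8.1f} M$")
      print(f"P(NPV<0)  = {(npv < 0).mean():.3f}")
      print(f"5th pct   = {np.percentile(npv, 5)/1e6:8.1f} M$")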

  5. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay space cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
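
    At the member level, the probabilistic buckling check can be sketched as follows, assuming a pinned-pinned Euler column and illustrative scatter in the primitive variables (modulus, section, length, and axial force); none of the numbers come from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # Primitive variables with hypothetical scatter.
      E = rng.normal(70e9, 3.5e9, n)       # Young's modulus, Pa
      I = rng.normal(8.0e-8, 4.0e-9, n)    # second moment of area, m^4
      L = rng.normal(1.5, 0.01, n)         # member length, m
      P = rng.normal(18e3, 2.5e3, n)       # axial member force, N

      # Euler buckling load for a pinned-pinned member.
      P_cr = np.pi**2 * E * I / L**2
      print("P(member buckles) =", (P > P_cr).mean())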

  6. Rule Learning with Probabilistic Smoothing

    NASA Astrophysics Data System (ADS)

    Costa, Gianni; Guarascio, Massimo; Manco, Giuseppe; Ortale, Riccardo; Ritacco, Ettore

    A hierarchical classification framework is proposed for discriminating rare classes in imprecise domains, characterized by rarity (of both classes and cases), noise and low class separability. The devised framework couples the rules of a rule-based classifier with as many local probabilistic generative models. These are trained over the coverage of the corresponding rules to better catch those globally rare cases/classes that become less rare in the coverage. Two novel schemes for tightly integrating rule-based and probabilistic classification are introduced, that classify unlabeled cases by considering multiple classifier rules as well as their local probabilistic counterparts. An intensive evaluation shows that the proposed framework is competitive and often superior in accuracy w.r.t. established competitors, while overcoming them in dealing with rare classes.

  7. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  8. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and by sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
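
    Under the time-independent (Poissonian) hazard assumption used in such assessments, an annual exceedance rate (or return period) converts to a probability of experiencing at least one event over an exposure window, as in this short sketch; the return periods chosen here are illustrative.

      import math

      # P(at least one event in T years) = 1 - exp(-lam * T), lam = 1 / return period.
      def prob_exceed(return_period_years, exposure_years):
          lam = 1.0 / return_period_years
          return 1.0 - math.exp(-lam * exposure_years)

      print(prob_exceed(500, 50))   # ~0.095: a "500-year" tsunami over 50 years
      print(prob_exceed(100, 1))    # ~0.010: annual probability of a 100-year event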

  9. Application of the risk-based strategy to the Hanford tank waste organic-nitrate safety issue

    SciTech Connect

    Hunter, V.L.; Colson, S.D.; Ferryman, T.; Gephart, R.E.; Heasler, P.; Scheele, R.D.

    1997-12-01

    This report describes the results from application of the Risk-Based Decision Management Approach for Justifying Characterization of Hanford Tank Waste to the organic-nitrate safety issue in Hanford single-shell tanks (SSTs). Existing chemical and physical models were used, taking advantage of the most current (mid-1997) sampling and analysis data. The purpose of this study is to make specific recommendations for planning characterization to help ensure the safety of each SST as it relates to the organic-nitrate safety issue. An additional objective is to demonstrate the viability of the Risk-Based Strategy for addressing Hanford tank waste safety issues.

  10. Probabilistic framework for network partition

    NASA Astrophysics Data System (ADS)

    Li, Tiejun; Liu, Jian; E, Weinan

    2009-08-01

    Given a large and complex network, we would like to find the partition of this network into a small number of clusters. This question has been addressed in many different ways. In a previous paper, we proposed a deterministic framework for an optimal partition of a network as well as the associated algorithms. In this paper, we extend this framework to a probabilistic setting, in which each node has a certain probability of belonging to a certain cluster. Two classes of numerical algorithms for such a probabilistic network partition are presented and tested. Application to three representative examples is discussed.

  11. Probabilistic coding of quantum states

    SciTech Connect

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-07-15

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding.

  12. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  13. Role of Context in Risk-Based Reasoning

    ERIC Educational Resources Information Center

    Pratt, Dave; Ainley, Janet; Kent, Phillip; Levinson, Ralph; Yogui, Cristina; Kapadia, Ramesh

    2011-01-01

    In this article we report the influence of contextual factors on mathematics and science teachers' reasoning in risk-based decision-making. We examine previous research that presents judgments of risk as being subjectively influenced by contextual factors and other research that explores the role of context in mathematical problem-solving. Our own…

  14. How Should Risk-Based Regulation Reflect Current Public Opinion?

    PubMed

    Pollock, Christopher John

    2016-08-01

    Risk-based regulation of novel agricultural products, with public choice manifested via traceability and labelling, is a more effective approach than the use of regulatory processes to reflect public concerns, which may not always be supported by evidence. PMID:27266813

  15. Development of a risk-based approach to Hanford Site cleanup

    SciTech Connect

    Hesser, W.A.; Daling, P.M.; Baynes, P.A.

    1995-06-01

    In response to a request from Mr. Thomas Grumbly, Assistant Secretary of Energy for Environmental Management, the Hanford Site contractors developed a conceptual set of risk-based cleanup strategies that (1) protect the public, workers, and environment from unacceptable risks; (2) are executable technically; and (3) fit within an expected annual funding profile of 1.05 billion dollars. These strategies were developed because (1) the US Department of Energy and Hanford Site budgets are being reduced, (2) stakeholders are dissatisfied with the perceived rate of cleanup, (3) the US Congress and the US Department of Energy are increasingly focusing on risk and risk-reduction activities, (4) the present strategy is not integrated across the Site and is inconsistent in its treatment of similar hazards, (5) the present cleanup strategy is not cost-effective from a risk-reduction or future land use perspective, and (6) the milestones and activities in the Tri-Party Agreement cannot be achieved with an anticipated funding of 1.05 billion dollars annually. The risk-based strategies described herein were developed through a systems analysis approach that (1) analyzed the cleanup mission; (2) identified cleanup objectives, including risk reduction, land use, and mortgage reduction; (3) analyzed the existing baseline cleanup strategy from a cost and risk perspective; (4) developed alternatives for accomplishing the cleanup mission; (5) compared those alternatives against cleanup objectives; and (6) produced conclusions and recommendations regarding the current strategy and potential risk-based strategies.

  16. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.
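
    For the AOT topic, the core quantity can be sketched as the incremental conditional core damage probability (ICCDP) accrued while equipment is out of service; the frequencies and outage duration below are illustrative, not values from the handbook.

      # ICCDP = (conditional CDF with the component out - baseline CDF) x outage time.
      def iccdp(cdf_conditional, cdf_baseline, outage_hours):
          """CDFs in events per reactor-year; outage time converted to years."""
          return (cdf_conditional - cdf_baseline) * (outage_hours / 8760.0)

      # Hypothetical plant figures, purely for illustration.
      risk = iccdp(cdf_conditional=4.0e-4, cdf_baseline=5.0e-5, outage_hours=72)
      print(f"ICCDP for a 72 h AOT: {risk:.2e}")   # ~2.9e-6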

  17. Auxiliary feedwater system risk-based inspection guide for the South Texas Project nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1993-12-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. South Texas Project was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the South Texas Project plant.

  18. Auxiliary feedwater system risk-based inspection guide for the H. B. Robinson nuclear power plant

    SciTech Connect

    Moffitt, N.E.; Lloyd, R.C.; Gore, B.F.; Vo, T.V.; Garner, L.W.

    1993-08-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. H. B. Robinson was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the H. B. Robinson plant.

  19. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant.

  20. Auxiliary feedwater system risk-based inspection guide for the J. M. Farley Nuclear Power Plant

    SciTech Connect

    Vo, T.V.; Pugh, R.; Gore, B.F.; Harrison, D.G.

    1990-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. J. M. Farley was selected as the second plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the J. M. Farley plant. 23 refs., 1 fig., 1 tab.

  1. Auxiliary feedwater system risk-based inspection guide for the Ginna Nuclear Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.; Moffitt, N.E.

    1991-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Ginna was selected as the eighth plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Ginna plant. 23 refs., 1 fig., 1 tab.

  2. Auxiliary feedwater system risk-based inspection guide for the Byron and Braidwood nuclear power plants

    SciTech Connect

    Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1991-07-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Byron and Braidwood were selected for the fourth study in this program. The product of this effort is a prioritized listing of AFW failures which have occurred at the plants and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Byron/Braidwood plants. 23 refs., 1 fig., 1 tab.

  3. Risk-based inspection priorities for PWR high-pressure injection system components

    SciTech Connect

    Vo, T.V.; Simonen, F.A.; Phan, H.K.

    1993-01-01

    Under U.S. Nuclear Regulatory Commission sponsorship, Pacific Northwest Laboratory developed a risk-based method that can be used to establish in-service inspection priorities for nuclear power plant components. The overall goal of this effort was to develop technical bases for improvements of inspection plans and to provide recommendations for revisions of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code, Sec. XI. The developed method used results of probabilistic risk assessment in combination with the failure modes and effects analysis (FMEA) technique to establish in-service inspection priorities for systems and components. The Surry nuclear power station, unit 1 (Surry-1) was selected for study. Inspection priorities for several pressure boundary systems at Surry-1 were determined in the early phase of the project. To complete the study, the remaining safety systems, plus balance of plant, have been analyzed; one of these is the high-pressure injection (HPI) system. This paper presents the results of inspection priorities for the HPI system.
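
    A toy version of this kind of prioritization ranks components by the product of failure probability and the conditional core damage probability given failure, both of which would come from the PRA and FMEA; the component names and numbers here are hypothetical.

      # Illustrative FMEA-style risk ranking for in-service inspection priorities.
      components = {
          "HPI pump discharge weld": (1.0e-4, 2.0e-3),   # (P_fail/yr, CCDP given failure)
          "HPI header branch line":  (5.0e-5, 8.0e-3),
          "injection check valve":   (2.0e-4, 1.0e-4),
      }
      ranked = sorted(components.items(),
                      key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
      for name, (p_fail, ccdp) in ranked:
          print(f"{name:26s} risk = {p_fail * ccdp:.2e} /yr")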

  4. Auxiliary feedwater system risk-based inspection guide for the North Anna nuclear power plants

    SciTech Connect

    Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1992-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. North Anna was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the North Anna plant.

  5. Auxiliary feedwater system risk-based inspection guide for the Point Beach nuclear power plant

    SciTech Connect

    Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.; Vehec, T.A.

    1993-02-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Point Beach was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Point Beach plant.

  6. Probabilistic forecasts based on radar rainfall uncertainty

    NASA Astrophysics Data System (ADS)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used to propagate the uncertainty and to assess the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and, ultimately, sewer flow predictions. For this purpose, events representative of different precipitation types (i.e., stratiform/convective) and significant at the urban catchment scale (i.e., in terms of sewer overflow within the urban drainage system) have been selected. As high spatial and temporal resolution is required of forecasts used in urban areas [9-11], the probabilistic nowcasts are produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes the radar error into account to characterize the forecast uncertainty, with consequent potential benefits for the management of urban systems. It also allows a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.

  7. Probabilistic Route Selection Algorithm for IP Traceback

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Bin; Jung, Jae-Il

    DoS (Denial of Service) and DDoS (Distributed DoS) attacks are a major threat and among the most difficult network attacks to defend against. Moreover, it is very difficult to find the real origin of an attack because DoS/DDoS attackers use spoofed IP addresses. To solve this problem, we propose a probabilistic route selection traceback algorithm, namely PRST, to trace the attacker's real origin. This algorithm uses two types of packets: an agent packet and a reply agent packet. The agent packet is used to find the attacker's real origin, and the reply agent packet is used to notify the victim that the agent packet has reached the edge router of the attacker. After an attack occurs, the victim generates the agent packet and sends it to the victim's edge router. Upon receiving the agent packet, the attacker's edge router generates the reply agent packet and sends it to the victim. Both the agent packet and the reply agent packet are forwarded by routers according to a probabilistic packet forwarding table (PPFT). The PRST algorithm runs on the distributed routers, and the PPFT is stored and managed by the routers. We validate the PRST algorithm by using a mathematical approach based on the Poisson distribution.

  8. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  9. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber/matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be any of the four modes.

  10. Probabilistic Techniques for Phrase Extraction.

    ERIC Educational Resources Information Center

    Feng, Fangfang; Croft, W. Bruce

    2001-01-01

    This study proposes a probabilistic model for automatically extracting English noun phrases for indexing or information retrieval. The technique is based on a Markov model, whose initial parameters are estimated by a phrase lookup program with a phrase dictionary, then optimized by a set of maximum entropy parameters. (Author/LRW)

  11. Designing Probabilistic Tasks for Kindergartners

    ERIC Educational Resources Information Center

    Skoumpourdi, Chrysanthi; Kafoussi, Sonia; Tatsis, Konstantinos

    2009-01-01

    Recent research suggests that children could be engaged in probability tasks at an early age and task characteristics seem to play an important role in the way children perceive an activity. To this direction in the present article we investigate the role of some basic characteristics of probabilistic tasks in their design and implementation. In…

  12. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  13. 12 CFR 1022.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Content, form, and timing of risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing...

  14. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  15. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  16. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  17. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false How to report your risk-based capital... report your risk-based capital determination. (a) Your risk-based capital report must contain at least... in written instructions to you. (b) You must submit each risk-based capital report in such format...

  18. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  19. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  20. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  1. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  2. Fuzzy logic and a risk-based graded approach for developing S/RIDs: An introduction

    SciTech Connect

    Wayland, J.R.

    1996-01-01

    A Standards/Requirements Identification Document (S/RID) is the set of expressed performance expectations, or standards, for a facility. Critical to the development of integrated standards-based management is the identification of a necessary and sufficient set of standards from a selected set of standards/requirements. There is a need for a formal, rigorous selection process for the S/RIDs. This is the first of three reports that develop a fuzzy logic selection process. In this report the fundamentals of fuzzy logic are discussed as they apply to a risk-based graded approach.
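
    To give a flavor of the machinery developed in the report series, the sketch below grades a normalized risk score into overlapping linguistic categories using triangular membership functions; the category breakpoints are illustrative assumptions, not the report's.

      # Minimal fuzzy-membership sketch for grading a requirement's risk.
      def triangular(x, a, b, c):
          """Membership rising from a to a peak at b, falling to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def grade(risk):   # risk score normalized to [0, 1]
          return {
              "low":    triangular(risk, -0.4, 0.0, 0.4),
              "medium": triangular(risk, 0.2, 0.5, 0.8),
              "high":   triangular(risk, 0.6, 1.0, 1.4),
          }

      print(grade(0.65))   # partial membership in both "medium" and "high"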

  3. Comprehensive Risk-Based Diagnostically Driven Treatment Planning: Developing Sequentially Generated Treatment.

    PubMed

    Kois, Dean E; Kois, John C

    2015-07-01

    The clinical example presented in this article demonstrates a risk-based, diagnostically driven treatment planning approach by focusing on four key categories: periodontal, biomechanical, functional, and dentofacial. In addition, our unique approach allowed the comprehensive clinical management of a patient with complex restorative needs. A full-mouth rehabilitation was completed sequentially without sacrificing the amount of dentistry necessary to restore health, comfort, function, and esthetics. The result exceeded the patient's expectations and was made financially possible by extending treatment over numerous years. PMID:26140967

  4. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU project Lolight - development of a low-cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step, the correlation coefficients between the first and the second density field are computed. To this end, the first field is shifted by all physically allowed shifting vectors. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the time difference between the two density fields. To eliminate ambiguities in the determination of directions and velocities, the so-called Random Walker of the Monte Carlo process is used. Using this method, a grid point is selected at random. Moreover, one vector out of all predefined shifting vectors is suggested - also at random, but with a probability that is related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted. Otherwise it is discarded. This process is repeated until the change of the cost function falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. Knowing the direction and speed of thunderstorm
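
    The correlation-plus-Monte-Carlo idea can be illustrated on toy density fields as below; the proposal distribution and the strictly greedy acceptance rule are simplified stand-ins for the Random Walker described above, and the fields and candidate vectors are invented.

      import numpy as np

      rng = np.random.default_rng(3)

      # Two toy "lightning density" fields some minutes apart: a blob moving 2 cells east.
      f1 = np.zeros((20, 20)); f1[8:12, 4:8] = 1.0
      f2 = np.roll(f1, 2, axis=1)

      def corr(a, b):
          a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
          return float(a @ b / np.sqrt((a @ a) * (b @ b)))

      # Correlation of f2 with f1 shifted by every physically allowed vector.
      candidates = [(di, dj) for di in range(-3, 4) for dj in range(-3, 4)]
      c = np.array([corr(np.roll(np.roll(f1, di, 0), dj, 1), f2)
                    for di, dj in candidates])

      # Monte Carlo step: propose vectors with probability tied to the correlation,
      # keep a proposal whenever it lowers the cost (here simply 1 - correlation).
      p = np.maximum(c, 0); p /= p.sum()
      best = 0
      for _ in range(500):
          k = rng.choice(len(candidates), p=p)
          if 1 - c[k] < 1 - c[best]:
              best = k
      print("estimated cell motion (rows, cols):", candidates[best])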

  5. Probabilistic Fatigue And Flaw-Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Newlin, Laura; Ebbeler, Donald; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    Probabilistic Failure Assessment for Fatigue and Flaw Propagation (PFAFAT II) is a software package that applies probabilistic failure-assessment (PFA) methodology to model the flaw-propagation and low-cycle-fatigue failure modes of structural components. It comprises one program for performing probabilistic crack-growth analysis and two programs for performing probabilistic low-cycle-fatigue analysis. These programs perform probabilistic fatigue and crack-propagation analysis by means of Monte Carlo simulation. PFAFAT II is an extension of, rather than a replacement for, the PFAFAT software (NPO-18965). Written in FORTRAN 77.

  6. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
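
    For reference, one common statement of the underlying deterministic three-field functional for linear elastostatics (sign conventions vary across texts; this form is an assumption, not quoted from the paper) is

      \Pi_{HW}(u, \varepsilon, \sigma)
          = \int_{\Omega} \left[ \tfrac{1}{2}\, \varepsilon : C : \varepsilon
            - \sigma : \left( \varepsilon - \nabla^{s} u \right)
            - b \cdot u \right] \mathrm{d}\Omega
          - \int_{\Gamma_t} \bar{t} \cdot u \, \mathrm{d}\Gamma ,

    whose stationarity with respect to sigma, epsilon, and u recovers compatibility, the constitutive law, and equilibrium, respectively. In the PHWVP, quantities such as C, b, the boundary data, and the domain itself are treated as random variables or fields.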

  7. Risk based ISI application to a boiling water reactor

    SciTech Connect

    Smith, A.; Dimitrijevic, V.B.; O'Regan, P.J.

    1996-12-01

    The ASME Section XI Working Group on Implementation of Risk-Based Examination produced a code case to define risk-based selection rules that could be used for In-Service Inspection (ISI) of Class 1, 2, and 3 piping. To provide guidelines for practical implementation of the code case, EPRI sponsored work to develop evaluation procedures and criteria. As part of an EPRI sponsored pilot study, these procedures have been applied to a BWR plant. Piping within the scope of the existing Section XI program has been analyzed. The results of this effort indicate that implementation of RBISI programs can significantly reduce the cost and radiation exposure associated with in-service inspections. The revised program was compared to the previous program and a net gain in safety benefit was demonstrated.

  8. Risk-based decision making for terrorism applications.

    PubMed

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application. PMID:19187486
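
    The multiattribute utility machinery behind such a ranking can be sketched with an additive value model; the attributes, weights, single-attribute utilities, and portfolios below are invented for illustration and are not ARDA's.

      # Toy multiattribute utility ranking of mitigation portfolios.
      weights = {"casualties averted": 0.5, "mission continuity": 0.3, "cost": 0.2}

      # Single-attribute utilities on [0, 1]; "cost" is already encoded so higher = better.
      portfolios = {
          "barriers + sensors":  {"casualties averted": 0.8, "mission continuity": 0.6, "cost": 0.3},
          "patrols only":        {"casualties averted": 0.4, "mission continuity": 0.5, "cost": 0.9},
          "hardened structures": {"casualties averted": 0.7, "mission continuity": 0.9, "cost": 0.1},
      }

      def utility(scores):
          # Additive form U = sum_i w_i * u_i, valid under mutual utility independence.
          return sum(weights[a] * u for a, u in scores.items())

      for name, scores in sorted(portfolios.items(), key=lambda kv: -utility(kv[1])):
          print(f"{name:20s} U = {utility(scores):.2f}")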

  9. Coupling risk-based remediation with innovative technology

    SciTech Connect

    Goodheart, G.F.; Teaf, C.M.; Manning, M.J.

    1998-05-01

    Tiered risk-based cleanup approaches have been effectively used at petroleum sites, pesticide sites, and other commercial/industrial facilities. For example, the Illinois Environmental Protection Agency (IEPA) has promulgated guidance for a Tiered Approach to Corrective Action Objectives (TACO) to establish site-specific remediation goals for contaminated soil and groundwater. As in the case of many other state programs, TACO is designed to provide adequate protection of human health and the environment based on the potential risks posed by site conditions. It also incorporates site-related information that may allow more cost-effective remediation. IEPA developed TACO to provide flexibility to site owners/operators when formulating site-specific remediation activities, as well as to hasten property redevelopment and return sites to more productive use. Where appropriate, risk-based cleanup objectives as set by TACO-type programs may be coupled with innovative remediation technologies such as air sparging, bioremediation, and soil washing.

  10. Protecting the Smart Grid: A Risk Based Approach

    SciTech Connect

    Clements, Samuel L.; Kirkham, Harold; Elizondo, Marcelo A.; Lu, Shuai

    2011-10-10

    This paper describes a risk-based approach to security that has been used for years in protecting physical assets and shows how it could be modified to help secure the digital aspects of the smart grid and control systems in general. One way the smart grid has been said to be vulnerable is that mass load fluctuations could be created by quickly turning large numbers of smart meters off and on. We investigate the plausibility of this scenario.

  11. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
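
    A minimal version of this optimization, with a toy stage-frequency curve and a toy through-seepage fragility that weakens with crown width, might look as follows; every coefficient is an illustrative assumption, not a value from the study.

      import numpy as np

      stages = np.linspace(0.0, 12.0, 600)          # flood stage, m
      ds = stages[1] - stages[0]
      pdf = np.exp(-stages / 1.2)
      pdf /= pdf.sum() * ds                         # normalized annual stage density
      DAMAGE = 20e6                                 # loss if the levee fails, $

      def p_fail(stage, H, W):
          # Overtopping is certain above the crest; through-seepage fragility
          # grows with water load and shrinks with crown width (toy model).
          seep = np.clip((stage / H) ** 2 * np.exp(-W / 10.0), 0.0, 1.0)
          return np.where(stage >= H, 1.0, seep)

      def annual_total_cost(H, W):
          ead = (p_fail(stages, H, W) * DAMAGE * pdf).sum() * ds
          capital = (0.4e6 * H + 0.12e6 * W) * 0.05  # annualized at 5%
          return ead + capital

      grid = [(H, W, annual_total_cost(H, W))
              for H in np.arange(2.0, 10.01, 0.25)
              for W in np.arange(2.0, 30.01, 1.0)]
      H, W, cost = min(grid, key=lambda t: t[2])
      print(f"optimal height {H:.2f} m, crown {W:.1f} m, total ${cost/1e6:.2f}M/yr")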

  12. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze decision making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision making modes, the cost of decision making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
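
    The contrast between the Nash equilibrium and the social planner's solution can be sketched with alternating best responses under a caricatured externality (a higher opposite levee pushes failure probability onto your bank); all cost and fragility numbers are invented.

      import math

      heights = [round(0.1 * i, 1) for i in range(61)]      # 0.0 .. 6.0 m

      def own_cost(h_mine, h_other):
          capital = 5.0e3 * h_mine                          # annualized construction
          # Toy externality: the higher the opposite levee, the more failure
          # probability is pushed onto this bank.
          p_fail = math.exp(-h_mine) * (1.0 + 0.3 * h_other)
          return capital + p_fail * 2.0e5                   # plus expected damage

      hA = hB = 0.0
      for _ in range(50):                                   # alternating best responses
          hA = min(heights, key=lambda h: own_cost(h, hB))
          hB = min(heights, key=lambda h: own_cost(h, hA))
      print(f"Nash equilibrium heights: A = {hA} m, B = {hB} m")

      # The social planner minimizes the two banks' combined cost instead.
      a, b = min(((a, b) for a in heights for b in heights),
                 key=lambda p: own_cost(p[0], p[1]) + own_cost(p[1], p[0]))
      print(f"planner's heights:        A = {a} m, B = {b} m")

    With these toy numbers the Nash heights come out slightly above the planner's, illustrating the over-investment that motivates the compensation schemes discussed in the abstract.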

  13. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  14. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  15. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
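
    A sketch of the Monte Carlo variant of such an analysis: sample the uncertain parameters of a second-order system, assemble the state matrix, and count draws whose eigenvalues cross into the right half-plane. The rotor-like cross-coupled stiffness model and all numbers are illustrative, and the paper's fast probability integration method is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 20_000
      unstable = 0
      for _ in range(n):
          c = rng.normal(8.0, 1.5)         # direct damping, N s/m
          k = rng.normal(1.0e4, 5.0e2)     # direct stiffness, N/m
          q = rng.normal(600.0, 150.0)     # cross-coupled stiffness, N/m
          M = np.eye(2)                    # unit modal mass
          C = np.eye(2) * c
          K = np.array([[k, q], [-q, k]])  # the skew term can destabilize the whirl
          # First-order state matrix of M q'' + C q' + K q = 0.
          A = np.block([[np.zeros((2, 2)), np.eye(2)],
                        [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
          if np.linalg.eigvals(A).real.max() > 0.0:
              unstable += 1
      print("P(instability) ~", unstable / n)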

  16. Probabilistic Cloning and Quantum Computation

    NASA Astrophysics Data System (ADS)

    Gao, Ting; Yan, Feng-Li; Wang, Zhi-Xi

    2004-06-01

    We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which the cloning offers an advantage which cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in the states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.

  17. Risk-based analyses in support of California hazardous site remediation

    SciTech Connect

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups, or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  18. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1, ..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
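
    The radix-expansion construction is easy to simulate for the simplest case, a one-state automaton emitting letter 1 with probability p; the sketch below estimates F(0.5), which for this automaton equals the probability 1 - p that the first letter is 0.

      import numpy as np

      rng = np.random.default_rng(11)

      def sample_X(p, bits=40, n=200_000):
          # Emit `bits` letters, each 1 with probability p, and read the word
          # as a radix-2 expansion X(w) = sum_i b_i 2**-i (truncated at `bits`).
          b = (rng.random((n, bits)) < p).astype(float)
          return b @ (0.5 ** np.arange(1, bits + 1))

      for p in (0.5, 0.3):
          x = sample_X(p)
          print(f"p = {p}: empirical F(0.5) = {(x < 0.5).mean():.3f} (exact {1 - p})")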

  19. A pilot application of risk-based methods to establish in-service inspection priorities for nuclear components at Surry Unit 1 Nuclear Power Station

    SciTech Connect

    Vo, T.; Gore, B.; Simonen, F.; Doctor, S.

    1994-08-01

    As part of the Nondestructive Evaluation Reliability Program sponsored by the US Nuclear Regulatory Commission, the Pacific Northwest Laboratory is developing a method that uses risk-based approaches to establish in-service inspection plans for nuclear power plant components. This method uses probabilistic risk assessment (PRA) results and Failure Modes and Effects Analysis (FMEA) techniques to identify and prioritize the most risk-important systems and components for inspection. The Surry Nuclear Power Station Unit 1 was selected for pilot applications of this method. The specific systems addressed in this report are the reactor pressure vessel, the reactor coolant, the low-pressure injection, and the auxiliary feedwater. The results provide a risk-based ranking of components within these systems and relate the target risk to target failure probability values for individual components. These results will be used to guide the development of improved inspection plans for nuclear power plants. To develop inspection plans, the acceptable level of risk from structural failure for important systems and components will be apportioned as a small fraction (e.g., 5%) of the total PRA-estimated risk for core damage. This process will determine target (acceptable) risk and target failure probability values for individual components. Inspection requirements will be set at levels that assure acceptable failure probabilities are maintained.

  20. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function(Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function(Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature, but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.

  1. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  2. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are the properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure is located. The paper focuses on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to the newly developed probabilistic method Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
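
    The sketch below illustrates the flavor of such a non-simulation calculation under strong simplifying assumptions (it is not the DOProC implementation): two input variables are discretized into histograms and the failure probability P(R - E < 0) is obtained by direct summation over the failure domain.

    ```python
    import numpy as np

    # Hypothetical resistance R and load effect E on a common grid.
    x = np.linspace(0.0, 60.0, 601)

    def discretized_normal(mean, std):
        pdf = np.exp(-0.5 * ((x - mean) / std) ** 2)
        return pdf / pdf.sum()        # probability mass on the grid

    p_resistance = discretized_normal(40.0, 4.0)   # invented parameters
    p_load = discretized_normal(25.0, 5.0)         # invented parameters

    # Direct integration, no sampling: P_f = sum_e P(E = e) * P(R < e).
    cdf_resistance = np.cumsum(p_resistance)
    p_failure = float(np.sum(p_load * cdf_resistance))
    print(f"P_f ~ {p_failure:.3e}")
    ```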

  3. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at NASA-Lewis, and consists of five program elements: (1) probabilistic loads, (2) probabilistic finite element analysis, (3) probabilistic material behavior, (4) assessment of reliability and risk, and (5) probabilistic structural performance evaluation. Attention is given to quantification of the effects of uncertainties for several variables on High Pressure Fuel Turbopump blade temperature, pressure, and torque of the Space Shuttle Main Engine; the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; evaluation of the failure probability; reliability and risk-cost assessment; and an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
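
    As a generic illustration of element (2), the cumulative distribution function of a response variable can be evaluated by Monte Carlo propagation of assumed uncertainties in primitive variables. The variables, distributions, and response model below are all hypothetical, not the NASA-Lewis codes.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Hypothetical primitive variables (normalized units):
    load = rng.normal(1.00, 0.08, n)          # thermal/mechanical loading
    thickness = rng.normal(1.00, 0.05, n)     # wall thickness
    conductivity = rng.lognormal(0.0, 0.10, n)

    # Hypothetical response model: temperature rise grows with load and
    # falls with thickness and conductivity.
    response = load ** 1.5 / (thickness * conductivity)

    # Empirical CDF of the response, and a failure probability vs. a limit:
    for level in (1.0, 1.2, 1.4):
        print(f"P(response <= {level}) = {np.mean(response <= level):.4f}")
    print(f"P(response > 1.4) = {np.mean(response > 1.4):.2e}")
    ```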

  4. Investigation of probabilistic optimization for tomotherapy.

    PubMed

    Kissick, Michael W; Mackie, Thomas R; Flynn, Ryan T; Mo, Xiaohu; Campos, David D; Yan, Yue; Zhao, Donghui

    2012-01-01

    This work builds on a suite of studies related to the 'interplay', or lack thereof, for respiratory motion with helical tomotherapy (HT). It helps explain why HT treatments without active motion management had clinical outcomes that matched positive expectations. An analytical calculation is performed to illuminate the frequency range for which interplay-type dose errors could occur. Then, an experiment is performed that completes the suite of tests. The experiment shows the potential for a stable motion probability distribution function (PDF) with HT and respiratory motion. This PDF enables one to use a motion-robust or probabilistic optimization to intrinsically include respiratory motion in the treatment planning. The reason why HT is robust to respiratory motion is related to the beam modulation sampling of the tumor motion. Because active tracking-based motion management is more complicated for a variety of reasons, HT optimization that is robust to motion is a useful alternative for the many patients who cannot benefit from active motion management. PMID:22955654
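
    A minimal one-dimensional toy, assuming a stable motion PDF and a Gaussian motion model (neither taken from the paper), shows how the expected delivered dose follows from convolving the static dose with the motion PDF -- the quantity a probabilistic optimizer would work with.

    ```python
    import numpy as np

    z = np.linspace(-5.0, 5.0, 1001)   # position along the motion axis (cm)
    dz = z[1] - z[0]

    # Idealized static dose: uniform over a 4 cm target, zero elsewhere.
    static_dose = np.where(np.abs(z) <= 2.0, 1.0, 0.0)

    # Hypothetical respiratory motion PDF (more dwell time near exhale):
    pdf = np.exp(-0.5 * ((z - 0.8) / 0.6) ** 2)
    pdf /= pdf.sum() * dz              # normalize so it integrates to 1

    # Expected delivered dose = static dose convolved with the motion PDF:
    expected_dose = np.convolve(static_dose, pdf, mode="same") * dz
    print(f"dose at target center: {expected_dose[len(z) // 2]:.3f} (static: 1.000)")
    ```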

  5. GIS Development of Probabilistic Tsunami Hazard Maps

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Geist, E. L.; Venturato, A. J.

    2004-12-01

    Probabilistic tsunami hazard mapping is best performed using geographic information systems (GIS), where multiple model-based inundation maps can be combined according to assigned probabilities. To test these techniques, hazard mapping is performed at Seaside, Oregon, the site of a pilot study that is part of the Federal Emergency Management Agency's (FEMA) effort to modernize its Flood Insurance Rate Maps (FIRMs). Because of the application of the study to FIRMs, we focus on developing aggregate hazard values (e.g., inundation area, flow depth) for the 1% and 0.2% annual probability events, otherwise known as the 100-year and 500-year floods. Both far-field and local tsunami sources are considered, each with assigned probability parameters. For an assumed time-independent (Poissonian) model, the only probability parameter needed is the mean inter-event time of the source under consideration. For a time-dependent model, the probability parameters include the time to the last event, the mean inter-event time, and a measure of recurrence aperiodicity. The main input for the model consists of far-field and local inundation maps, which represent maximum inundation values on land modeled for different combinations of earthquake magnitude and distance to earthquake source. The maps are rendered as raster grids, which lend themselves to algebraic functions as numerical arrays. One approach to determine the 100-year or 500-year inundation line is to calculate the maximum spatial extent of the input inundation maps. Alternatively, probabilistic flow depths can be determined by estimating a frequency-flow depth regression relationship for all of the layers at any given spatial point and interpolating the 100-year or 500-year value. The flow depths and accompanying inundation lines will be provided as map data layers reflecting the impact of tsunamis on the process of modernizing the FEMA Flood Insurance Rate Maps. In addition, this type of analysis can be expanded to other
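
    For the time-independent case, the sketch below shows how a single grid point's annual exceedance probability could be assembled from per-source mean inter-event times under a Poissonian model. The source list and depths are hypothetical, not the Seaside inputs.

    ```python
    import math

    # (maximum modeled flow depth in m, mean inter-event time in yr)
    # for the sources affecting one grid point -- invented values.
    sources = [(0.5, 50.0), (1.2, 150.0), (2.8, 500.0), (4.0, 2500.0)]

    def exceedance_rate(depth):
        """Annual rate of any source producing a depth above 'depth'."""
        return sum(1.0 / tau for d, tau in sources if d > depth)

    def annual_exceedance_prob(depth):
        # Poisson: P(at least one exceedance in 1 yr) = 1 - exp(-rate).
        return 1.0 - math.exp(-exceedance_rate(depth))

    # Scan depths to bracket the 1% (100-year) annual probability level:
    for depth in (0.0, 0.5, 1.2, 2.8, 4.0):
        print(f"depth > {depth:.1f} m: P = {annual_exceedance_prob(depth):.4f}")
    ```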

  6. DEVELOPMENT OF RISK-BASED AND TECHNOLOGY-INDEPENDENT SAFETY CRITERIA FOR GENERATION IV SYSTEMS

    SciTech Connect

    William E. Kastenberg; Edward Blandford; Lance Kim

    2009-03-31

    This project has developed quantitative safety goals for Generation IV (Gen IV) nuclear energy systems. These safety goals are risk based and technology independent. The foundations for a new approach to risk analysis have been developed, along with a new operational definition of risk. This project has furthered the current state of the art by developing quantitative safety goals for both Gen IV reactors and for the overall Gen IV nuclear fuel cycle. The risk analysis approach developed will quantify performance measures, characterize uncertainty, and address a more comprehensive view of safety as it relates to the overall system. Appropriate safety criteria are necessary to manage risk in a prudent and cost-effective manner. This study is also important for government agencies responsible for managing, reviewing, and approving advanced reactor systems, because they are charged with assuring the health and safety of the public.

  7. Application impact analysis: a risk-based approach to business continuity and disaster recovery.

    PubMed

    Epstein, Beth; Khan, Dawn Christine

    2014-01-01

    There are many possible disruptions that can occur in business, and business continuity is easily overlooked or under-planned; doing it well requires time, understanding, and careful planning. Business continuity management is far more than producing a document and declaring business continuity success. What is the recipe for businesses to achieve continuity management success? Application Impact Analysis (AIA) is a method for understanding a business's unique attributes. The AIA cycle involves a risk-based approach to understanding business priority, considering aspects such as financial, operational, service structure, contractual/legal, and brand. The output of this analysis provides a construct for viewing data, evaluating impact, and delivering results in support of an approved valuation of recovery time objectives (RTOs). PMID:24578024

  8. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. PMID:24290823
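
    A small synthetic sketch of this kind of risk-based selection (invented data and coefficients, not the Dutch study's): fit a logistic model of audit rejection on bulk-milk indicators, rank farms by predicted risk, and audit the top fraction.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000
    scc = rng.lognormal(5.0, 0.4, n)   # somatic cell count proxy (hypothetical)
    tbc = rng.lognormal(2.5, 0.6, n)   # total bacterial count proxy (hypothetical)

    # Invented "true" relationship, used only to generate synthetic outcomes:
    logit = -6.0 + 0.015 * scc + 0.10 * tbc
    rejected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    # Fit the selection model on the milk-quality indicators:
    X = np.column_stack([scc, tbc])
    model = LogisticRegression(max_iter=1000).fit(X, rejected)

    # Risk-based selection: audit the riskiest 20% of farms and measure the
    # share of all rejections captured (random selection would capture ~20%).
    risk = model.predict_proba(X)[:, 1]
    top20 = risk >= np.quantile(risk, 0.80)
    share = rejected[top20].sum() / rejected.sum()
    print(f"20% riskiest farms capture {share:.0%} of rejections")
    ```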

  9. Application of risk-based methods to inservice testing of check valves

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; McAllister, W.J.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice testing (IST) programs for pumps and valves in nuclear steam supply systems. This paper discusses a pilot application of these methods to the inservice testing of check valves in the emergency core cooling system of Georgia Power's Vogtle nuclear power station. The results of the probabilistic safety assessment (PSA) are used to divide the check valves into risk-significant and less-risk-significant groups. This information is reviewed by a plant expert panel along with the consideration of appropriate deterministic insights to finally categorize the check valves into more safety-significant and less safety-significant component groups. All of the more safety-significant check valves are further evaluated in detail using a failure modes and causes analysis (FMCA) to assist in defining effective IST strategies. A template has been designed to evaluate how effective current and emerging tests for check valves are in detecting failures or in finding significant conditions that are precursors to failure for the likely failure causes. This information is then used to design and evaluate appropriate IST strategies that consider both the test method and frequency. A few of the less safety-significant check valves are also evaluated using this process since differences exist in check valve design, function, and operating conditions. Appropriate test strategies are selected for each check valve that has been evaluated based on safety and cost considerations. Test strategies are inferred from this information for the other check valves based on similar check valve conditions. Sensitivity studies are performed using the PSA model to arrive at an overall IST program that maintains or enhances safety at the lowest achievable cost.

  10. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Thompson, Julie; Leclaire, Rene; Edward, Bryan; Jones, Edward

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
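
    A highly simplified sketch of cross-domain coupling under invented distributions (not the PNNL model set): maturity samples shift achievable capture performance, which in turn drives profitability, so uncertainty propagates across the domains.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Domain 1: technology maturity on a 0..1 scale (hypothetical prior).
    maturity = rng.beta(4, 2, n)

    # Domain 2: capture performance improves with maturity (invented model).
    capture = np.clip(0.70 + 0.25 * maturity + rng.normal(0, 0.03, n), 0.0, 0.99)

    # Domain 3: profitability per tonne of CO2 (invented economics).
    credit = 60.0                           # $/t captured
    cost = 45.0 + 25.0 * (1.0 - maturity)   # immature technology costs more
    profit = capture * credit - cost

    print(f"P(unprofitable) = {np.mean(profit < 0):.2%}")
    print(f"median profit   = {np.median(profit):.1f} $/t")
    ```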

  11. Probabilistic Analysis of Ground-Holding Strategies

    NASA Technical Reports Server (NTRS)

    Sheel, Minakshi

    1997-01-01

    The Ground-Holding Policy Problem (GHPP) has become a matter of great interest in recent years because of the high cost incurred by aircraft suffering from delays. Ground-holding keeps a flight on the ground at the departure airport if it is known that it will be unable to land at the arrival airport. The GHPP is to determine how many flights should be held on the ground before take-off, and for how long, in order to minimize the cost of delays. When the uncertainty associated with airport landing capacity is considered, the GHPP becomes complicated. A decision support system that incorporates this uncertainty, solves the GHPP quickly, and gives good results would be of great help to air traffic management. The purpose of this thesis is to modify and analyze a probabilistic ground-holding algorithm by applying it to two common cases of capacity reduction. A graphical user interface was developed, and sensitivity analysis was performed on the algorithm to see how it might be implemented in practice. The sensitivity analysis showed that the algorithm was very sensitive to the number of probabilistic capacity scenarios used and to the cost ratio of air delay to ground delay. The algorithm was not particularly sensitive to the number of periods into which the time horizon was divided. In terms of cost savings, a ground-holding policy was most beneficial when demand greatly exceeded airport capacity. When compared to other air traffic flow strategies, the ground-holding algorithm performed the best and was the most consistent under various situations. The algorithm can solve large problems quickly and efficiently on a personal computer.
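
    The toy below captures the core trade-off under stated assumptions (a single period, invented costs and capacity scenarios; not the thesis algorithm): choose the number of held flights that minimizes expected total delay cost, given that air delay is costlier than ground delay.

    ```python
    GROUND_COST = 1.0    # cost per flight held on the ground (normalized)
    AIR_COST = 3.0       # cost per flight absorbing airborne delay

    DEMAND = 60          # flights scheduled to arrive in the period
    # (probability, landing capacity) scenarios -- hypothetical forecast:
    SCENARIOS = [(0.3, 60), (0.5, 45), (0.2, 30)]

    def expected_cost(held):
        cost = GROUND_COST * held
        for prob, capacity in SCENARIOS:
            airborne_overflow = max(DEMAND - held - capacity, 0)
            cost += prob * AIR_COST * airborne_overflow
        return cost

    best = min(range(DEMAND + 1), key=expected_cost)
    print(f"hold {best} flights; expected cost {expected_cost(best):.1f}")
    ```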

  12. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) quantification of the effects of the uncertainties of several factors on the High Pressure Fuel Turbopump (HPFTP) blade temperature, pressure, and torque; (2) evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  13. Probabilistic cloning of three nonorthogonal states

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Rui, Pinshu; Yang, Qun; Zhao, Yan; Zhang, Ziyun

    2015-04-01

    We study the probabilistic cloning of three nonorthogonal states with equal success probabilities. For simplicity, we assume that the three states belong to a special set. The analytical form of the maximal success probability for probabilistic cloning is calculated. With the maximal success probability, we deduce the explicit form of the probabilistic quantum cloning machine. In the case of cloning, we obtain the unambiguous form of the unitary operation. It is demonstrated that the upper bound for the probabilistic quantum cloning machine given in (Qiu in J Phys A 35:6931, 2002) can be reached only if the three states are equidistant.

  14. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Content, form, and timing of risk-based pricing... Risk-Based Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice required by § 222.72(a) or (c) must include: (i)...

  15. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  16. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)
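
    A minimal worked example of the two building blocks, with hypothetical numbers: a fault tree for a redundant cooling function (AND of two pumps, OR a common-cause failure) feeding an event-tree-style sequence frequency.

    ```python
    # All probabilities and frequencies below are invented for illustration.

    # Fault tree: cooling fails if BOTH pumps fail independently (AND),
    # OR a common-cause event disables them together.
    P_PUMP = 1e-2
    P_COMMON_CAUSE = 5e-4
    p_cooling_fails = P_PUMP * P_PUMP + P_COMMON_CAUSE  # rare-event approximation

    # Event tree: initiating-event frequency times failure of successive
    # mitigating systems gives the accident-sequence frequency.
    FREQ_INITIATOR = 0.1      # initiating events per year
    P_BACKUP_FAILS = 1e-3
    core_damage_freq = FREQ_INITIATOR * p_cooling_fails * P_BACKUP_FAILS

    print(f"cooling failure probability ~ {p_cooling_fails:.2e}")
    print(f"sequence frequency ~ {core_damage_freq:.2e} per year")
    ```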

  17. Probabilistic approach to EMP assessment

    SciTech Connect

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  18. 16 CFR 640.3 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 16 Commercial Practices 1 2012-01-01 2012-01-01 false General requirements for risk-based pricing... DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.3 General requirements for risk-based...

  19. 16 CFR 640.3 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 16 Commercial Practices 1 2013-01-01 2013-01-01 false General requirements for risk-based pricing... DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.3 General requirements for risk-based...

  20. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  1. 16 CFR 640.3 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 16 Commercial Practices 1 2011-01-01 2011-01-01 false General requirements for risk-based pricing... DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.3 General requirements for risk-based...

  2. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 12 Banks and Banking 3 2013-01-01 2013-01-01 false General requirements for risk-based pricing... Risk-Based Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general....

  3. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 12 Banks and Banking 3 2014-01-01 2014-01-01 false General requirements for risk-based pricing... Risk-Based Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general....

  4. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  5. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  6. 16 CFR 640.3 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... application, or in response to a solicitation under 12 CFR 226.5a, and more than a single possible purchase... 16 Commercial Practices 1 2014-01-01 2014-01-01 false General requirements for risk-based pricing... DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.3 General requirements for risk-based...

  7. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... report the risk-based capital level. (a) You must file a risk-based capital report with us each time you determine your risk-based capital level as required by § 652.80. (b) You must also report to us at once if you identify in the interim between quarterly or more frequent reports to us that you are not...

  8. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... report the risk-based capital level. (a) You must file a risk-based capital report with us each time you determine your risk-based capital level as required by § 652.80. (b) You must also report to us at once if you identify in the interim between quarterly or more frequent reports to us that you are not...

  9. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... report the risk-based capital level. (a) You must file a risk-based capital report with us each time you determine your risk-based capital level as required by § 652.80. (b) You must also report to us at once if you identify in the interim between quarterly or more frequent reports to us that you are not...

  10. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based assessments... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  11. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based assessments... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  12. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... report the risk-based capital level. (a) You must file a risk-based capital report with us each time you determine your risk-based capital level as required by § 652.80. (b) You must also report to us at once if you identify in the interim between quarterly or more frequent reports to us that you are not...

  13. 12 CFR 652.80 - When you must determine the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... your risk-based capital level at any time. (c) If you anticipate entering into any new business... 12 Banks and Banking 6 2010-01-01 2010-01-01 false When you must determine the risk-based capital... AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.80 When...

  14. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., including any business policy decisions or other assumptions made in implementing the risk-based capital... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How to report your risk-based capital... AGRICULTURAL MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.90 How...

  15. 12 CFR 652.75 - Your responsibility for determining the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the risk-based capital stress test and must be able to determine your risk-based capital level at any...-based capital level. 652.75 Section 652.75 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT... Requirements § 652.75 Your responsibility for determining the risk-based capital level. (a) You must...

  16. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  17. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  18. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  19. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  20. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-Based Capital Test Methodology and... OVERSIGHT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SAFETY AND SOUNDNESS CAPITAL Risk-Based Capital Pt. 1750, Subpt. B, App. A Appendix A to Subpart B of Part 1750—Risk-Based Capital Test Methodology...

  1. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-Based Capital Test Methodology and... OVERSIGHT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SAFETY AND SOUNDNESS CAPITAL Risk-Based Capital Pt. 1750, Subpt. B, App. A Appendix A to Subpart B of Part 1750—Risk-Based Capital Test Methodology...

  2. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Risk-Based Capital Guidelines; Market Risk B... ADEQUACY STANDARDS Pt. 3, App. B Appendix B to Part 3—Risk-Based Capital Guidelines; Market Risk Section... Application of the Market Risk Capital Rule Section 4Adjustments to the Risk-Based Capital Ratio...

  3. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  4. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

Boyer, Roger L.; Hamlin, Teri L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for Shuttle design and operation. PRA is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? And what are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the Shuttle are reviewed.

  5. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  6. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume that the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which reduces the computational complexity. PMID:26180842

  7. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume that the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which reduces the computational complexity. PMID:26180842
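
    A toy weighted biobjective 0-1 program in the spirit of the model (three invented flights, two routes, one capacity constraint; not the paper's formulation or data), using the PuLP modeling library:

    ```python
    from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

    FLIGHTS = ["F1", "F2", "F3"]
    ROUTES = [0, 1]                  # 0 = short route through weather, 1 = detour
    delay = {("F1", 0): 0, ("F1", 1): 12, ("F2", 0): 0, ("F2", 1): 9,
             ("F3", 0): 0, ("F3", 1): 15}       # minutes (hypothetical)
    risk = {("F1", 0): 0.6, ("F1", 1): 0.1, ("F2", 0): 0.5, ("F2", 1): 0.1,
            ("F3", 0): 0.7, ("F3", 1): 0.2}     # operational risk (hypothetical)
    W_DELAY, W_RISK = 1.0, 30.0                 # objective weights
    CAPACITY = 1                                # weather-hit sector capacity

    prob = LpProblem("satnfo_toy", LpMinimize)
    x = {(f, r): LpVariable(f"x_{f}_{r}", cat=LpBinary)
         for f in FLIGHTS for r in ROUTES}
    # Weighted biobjective: delay cost plus weighted operational risk.
    prob += lpSum(x[f, r] * (W_DELAY * delay[f, r] + W_RISK * risk[f, r])
                  for f in FLIGHTS for r in ROUTES)
    for f in FLIGHTS:
        prob += lpSum(x[f, r] for r in ROUTES) == 1      # one route per flight
    prob += lpSum(x[f, 0] for f in FLIGHTS) <= CAPACITY  # sector capacity limit
    prob.solve()
    for f in FLIGHTS:
        route = next(r for r in ROUTES if x[f, r].value() > 0.5)
        print(f"{f} -> route {route}")
    ```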

  8. Risk-based selection of SSCs at Peach Bottom

    SciTech Connect

    Krueger, G.A.; Marie, A.J. )

    1993-01-01

    The purpose of identifying risk-significant systems, structures, and components (SSCs) that are within the scope of the maintenance rule is to bring a higher level of attention to a subset of those SSCs. These risk-significant SSCs will have specific performance criteria established for them, and failure to meet these performance criteria will result in establishing goals to ensure the necessary improvement in performance. The Peach Bottom individual plant examination (IPE) results were used to provide insights for the verification of proposed probabilistic risk assessment (PRA) methods set forth in the Industry Maintenance Guidelines for Implementation of the Maintenance Rule. The objective of reviewing the methods for selecting SSCs that are considered risk significant was to ensure that the methods used are logical, reproducible, and can be consistently applied.

  9. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  10. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  11. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  12. Compression of Probabilistic XML Documents

    NASA Astrophysics Data System (ADS)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query, and manipulate data that contains uncertainty are receiving increasing research interest. Such uncertain DBMSs (UDBMSs) can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.

  13. Risk-based assessment of the surety of information systems

    SciTech Connect

    Jansma, R.; Fletcher, S.; Halbgewachs, R.; Lim, J.; Murphy, M.; Sands, P.; Wyss, G.

    1995-03-01

    Correct operation of an information system requires a balance of "surety" domains -- access control (confidentiality), integrity, utility, availability, and safety. However, traditional approaches provide little help on how to systematically analyze and balance the combined impact of surety requirements on a system. The key to achieving information system surety is identifying, prioritizing, and mitigating the sources of risk that may lead to system failure. Consequently, the authors propose a risk assessment methodology that provides a framework to guide the analyst in identifying and prioritizing sources of risk and selecting mitigation techniques. The framework leads the analyst to develop a risk-based system model for balancing the surety requirements and quantifying the effectiveness and combined impact of the mitigation techniques. Such a model allows the information system designer to make informed trade-offs based on the most effective risk-reduction measures.

  14. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has sufficient implications on the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politicians on potential consequences and are the basis for appropriate risk management strategies. Thereby, results (i) on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first analyses are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee accordant reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  15. Risk-based analytical method transfer: application to large multi-product transfers.

    PubMed

    Raska, Christina S; Bennett, Tony S; Goodberlet, Scott A

    2010-07-15

    As pharmaceutical companies adapt their business models, a new approach to analytical method transfer is needed to efficiently handle transfers of multiple products, associated with situations such as site consolidations/closures. Using the principles of risk management, a risk-based method transfer approach is described, which defines appropriate transfer activities based on a risk assessment of the methods and experience of the receiving unit. A key step in the process is detailed knowledge transfer from the transferring unit to the receiving unit. The amount of transfer testing required can be streamlined or eliminated on the basis of a number of factors, including method capability, receiving unit familiarity, and method past performance. PMID:20557030

  16. Homeland security R&D roadmapping : risk-based methodological options.

    SciTech Connect

    Brandt, Larry D.

    2008-12-01

    The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

  17. Environmental restoration risk-based prioritization work package planning and risk ranking methodology. Revision 2

    SciTech Connect

    Dail, J.L.; Nanstad, L.D.; White, R.K.

    1995-06-01

    This document presents the risk-based prioritization methodology developed to evaluate and rank Environmental Restoration (ER) work packages at the five US Department of Energy, Oak Ridge Field Office (DOE-ORO) sites [i.e., Oak Ridge K-25 Site (K-25), Portsmouth Gaseous Diffusion Plant (PORTS), Paducah Gaseous Diffusion Plant (PGDP), Oak Ridge National Laboratory (ORNL), and the Oak Ridge Y-12 Plant (Y-12)], the ER Off-site Program, and Central ER. This prioritization methodology was developed to support the increased rigor and formality of work planning in the overall conduct of operations within the DOE-ORO ER Program. Prioritization is conducted as an integral component of the fiscal ER funding cycle to establish program budget priorities. The purpose of the ER risk-based prioritization methodology is to provide ER management with the tools and processes needed to evaluate, compare, prioritize, and justify fiscal budget decisions for a diverse set of remedial action, decontamination and decommissioning, and waste management activities. The methodology provides the ER Program with a framework for (1) organizing information about identified DOE-ORO environmental problems, (2) generating qualitative assessments of the long- and short-term risks posed by DOE-ORO environmental problems, and (3) evaluating the benefits associated with candidate work packages designed to reduce those risks. Prioritization is conducted to rank ER work packages on the basis of the overall value (e.g., risk reduction, stakeholder confidence) each package provides to the ER Program. Application of the methodology yields individual work package "scores" and rankings that are used to develop fiscal budget requests. This document presents the technical basis for the decision support tools and process.

  18. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  19. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
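
    Method (b) reduces to a standard closed form: for an S-N curve N * S**b = A and Rayleigh-distributed stress amplitudes with rms value sigma, the integral form of Miner's rule gives D = (n_total / A) * (sqrt(2) * sigma)**b * Gamma(1 + b/2). The sketch below evaluates it; the formula is the standard narrow-band result, but the parameter values are invented.

    ```python
    import math

    def rayleigh_miner_damage(n_total, sigma, b, A):
        """Expected Miner damage for Rayleigh amplitudes and S-N curve N*S^b = A."""
        return (n_total / A) * (math.sqrt(2.0) * sigma) ** b * math.gamma(1.0 + b / 2.0)

    # Hypothetical component: 1e7 cycles at 20 MPa rms, S-N exponent b = 4.
    D = rayleigh_miner_damage(n_total=1e7, sigma=20.0, b=4.0, A=1e14)
    print(f"damage D = {D:.3f}; life = {1 / D:.1f} x the applied duration")
    ```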

  20. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  1. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing the projected system state can be simplified in some cases. The system compares several common approximation methods, along with the novel method, for an iterative repair planner in both over-constrained and lightly constrained domains. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels, but diminishes as the system is allowed to accept higher risk levels.
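
    A minimal example of the parametric idea, assuming independent Gaussian activity durations (the distributions and numbers are invented, not the planner's internals): the probability of a deadline violation falls out analytically, because sums of independent normals are normal.

    ```python
    import math

    # Hypothetical plan: three sequential activities, (mean, std) in minutes.
    activities = [(10.0, 2.0), (5.0, 1.0), (8.0, 3.0)]
    mean = sum(m for m, s in activities)
    std = math.sqrt(sum(s * s for m, s in activities))

    DEADLINE = 30.0
    z = (DEADLINE - mean) / std
    # P(total duration > deadline) from the normal CDF:
    p_violation = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
    print(f"P(miss {DEADLINE}-min deadline) = {p_violation:.3f}")
    ```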

  2. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  3. Treatment of uncertainties in risk-based regulation

    SciTech Connect

    Camp, A.L.

    1994-12-31

    Over the past several years, we have begun to fully appreciate the value of probabilistic risk assessment (PRA) as a decision support tool. Risk assessment has allowed us to focus on problems of real importance and provided discipline to prioritization processes. Many government agencies now consider risk assessment a normal and essential element of their decision making. Due to the inherent uncertainty in the results, use of absolute PRA numbers to make specific decisions was not encouraged by risk assessment experts in the past. The current trend, however, is to make more and more use of the absolute numbers and (in some cases) to use them directly to make regulatory decisions. Whether or not we approve of this trend, it will almost certainly continue. Therefore, it is incumbent upon risk assessment experts to provide decision-makers with properly characterized information. This characterization should include a clear presentation of the uncertainties involved in the risk estimates. Uncertainty analysis is often considered undesirable by decision-makers because it muddies the waters of an otherwise clear-cut decision.

  4. Risk-based methods applicable to ranking conceptual designs

    SciTech Connect

    Breeding, R.J.; Ortiz, K.; Ringland, J.T.; Lim, J.J.

    1993-11-01

    In Genichi Taguchi's latest book on quality engineering, an emphasis is placed on robust design processes in which quality engineering techniques are brought "upstream," that is, utilized as early as possible, preferably in the conceptual design stage. This approach was used in a study of possible future safety system designs for weapons. As an experiment, a method was developed for using probabilistic risk analysis (PRA) techniques to rank conceptual designs for performance against a safety metric, for ultimate incorporation into a Pugh matrix evaluation. This represents a high-level UW application of PRA methods to weapons. As with most conceptual designs, details of the implementation were not yet developed; many of the components had never been built, let alone tested. Therefore, our application of risk assessment methods was forced to be at such a high level that the entire evaluation could be performed on a spreadsheet. Nonetheless, the method produced numerical estimates of safety in a manner that was consistent, reproducible, and scrutable. The results enabled us to rank designs to identify areas where returns on research efforts would be the greatest. The numerical estimates were calibrated against what is achievable by current weapon safety systems. The use of expert judgement is inescapable, but these judgements are explicit, and the method is easily implemented in a spreadsheet computer program.

  5. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery. PMID:26017444

  6. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  7. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  8. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  9. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future, with the objective of preventing the highest-ranking human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights, as well as from relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  10. Probabilistic assessments of climate change impacts on durum wheat in the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Ferrise, R.; Moriondo, M.; Bindi, M.

    2011-05-01

    Recently, the availability of multi-model ensemble prediction methods has permitted a shift from a scenario-based approach to a risk-based approach in assessing the effects of climate change. This provides more useful information to decision-makers who need probability estimates to assess the seriousness of the projected impacts. In this study, a probabilistic framework for evaluating the risk of durum wheat yield shortfall over the Mediterranean Basin has been exploited. An artificial neural network, trained to emulate the outputs of a process-based crop growth model, has been adopted to create yield response surfaces which are then overlaid with probabilistic projections of future temperature and precipitation changes in order to estimate probabilistic projections of future yields. The risk is calculated as the relative frequency of projected yields below a selected threshold. In contrast to previous studies, which suggest that the beneficial effects of elevated atmospheric CO2 concentration over the next few decades would outweigh the detrimental effects of the early stages of climatic warming and drying, the results of this study are of greater concern.

  11. Developing a risk-based air quality health index

    NASA Astrophysics Data System (ADS)

    Wong, Tze Wai; Tam, Wilson Wai San; Yu, Ignatius Tak Sun; Lau, Alexis Kai Hon; Pang, Sik Wing; Wong, Andromeda H. S.

    2013-09-01

    We developed a risk-based, multi-pollutant air quality health index (AQHI) reporting system in Hong Kong, based on the Canadian approach. We performed time series studies to obtain the relative risks of hospital admissions for respiratory and cardiovascular diseases associated with four air pollutants: sulphur dioxide, nitrogen dioxide, ozone, and particulate matter with an aerodynamic diameter less than 10 μm (PM10). We then calculated the sum of excess risks of the hospital admissions associated with these air pollutants. The cut-off points of the summed excess risk, for the issuance of different health warnings, were based on the concentrations of these pollutants recommended as short-term Air Quality Guidelines by the World Health Organization. The excess risks were adjusted downwards for young children and the elderly. Health risk was grouped into five categories and sub-divided into eleven bands, with equal increments in excess risk from band 1 up to band 10 (the 11th band is 'band 10+'). We developed health warning messages for the general public, including at-risk groups: young children, the elderly, and people with pre-existing cardiac or respiratory diseases. The new system addressed two major shortcomings of the current standard-based system; namely, the time lag between a sudden rise in air pollutant concentrations and the issue of a health warning, and the reliance on one dominant pollutant to calculate the index. Hence, the AQHI represents an improvement over Hong Kong's existing air pollution index.
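
    The summed-excess-risk construction can be sketched as follows; the beta coefficients, concentrations, and band width below are invented placeholders, not the values derived in the study.

        # Hypothetical sketch of a summed-excess-risk index; coefficients
        # and band edges are placeholders, not the study's values.
        import math

        BETAS = {"SO2": 0.0004, "NO2": 0.0005, "O3": 0.0006, "PM10": 0.0003}

        def excess_risk(conc):  # conc: pollutant -> ug/m3 (assumed averaging time)
            # percent excess risk from a log-linear concentration-response model
            return sum(100.0 * (math.exp(b * conc[p]) - 1.0)
                       for p, b in BETAS.items())

        def band(er, step=1.9):  # equal increments up to band 10, then "10+"
            b = int(er // step) + 1
            return "10+" if b > 10 else str(b)

        print(band(excess_risk({"SO2": 20, "NO2": 80, "O3": 60, "PM10": 50})))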

  12. Risk Classification and Risk-based Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D; sometimes terms such as "Class D minus" are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion of risk-based safety and mission assurance at GSFC.

  13. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research will instead extract, from a group of n customers, the historical data for the sum insured s_i of the i-th customer together with the amount paid y_ij and the amount a_ij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (y_i,j+1, a_i,j+1) as dependent on the present-year value (y_ij, a_ij) and the sum insured s_i via a conditional distribution derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claim liabilities well.
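
    A sketch of how aggregate claims liabilities might be simulated from such a model. The paper's conditional distribution comes from a multivariate power-normal mixture; the lognormal stand-in below is purely illustrative.

        # Sketch: roll the (paid, reported-not-paid) pair forward year by
        # year and accumulate a distribution of aggregate liabilities.
        import random

        def step(y, a, s):
            # Toy conditional draw for (y_{j+1}, a_{j+1}) given (y_j, a_j, s);
            # the real model uses a multivariate power-normal mixture.
            y_next = min(y * random.lognormvariate(-0.3, 0.2), s)  # capped at sum insured
            a_next = max(a - y_next, 0.0) * random.lognormvariate(0.0, 0.1)
            return y_next, a_next

        def liability(y0, a0, s, years=6):
            y, a, total = y0, a0, 0.0
            for _ in range(years):
                y, a = step(y, a, s)
                total += y
            return total + a

        sims = sorted(liability(10.0, 40.0, s=100.0) for _ in range(10000))
        print("95% upper prediction bound:", sims[9500])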

  14. Selecting a risk-based tool to aid in decision making

    SciTech Connect

    Bendure, A.O.

    1995-03-01

    Selecting a risk-based tool to aid in decision making is as much of a challenge as properly using the tool once it has been selected. Failure to consider customer and stakeholder requirements and the technical bases and differences among risk-based decision-making tools will produce confounding and/or politically unacceptable results when the tool is used. Selecting a risk-based decision-making tool must therefore be undertaken with the same, if not greater, rigor than the use of the tool once it is selected. This paper presents a process for selecting a risk-based tool appropriate to a set of prioritization or resource allocation tasks, discusses the results of applying the process to four risk-based decision-making tools, and identifies the "musts" for successful selection and implementation of a risk-based tool to aid in decision making.

  15. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
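
    In the same spirit (a hedged sketch, not the program described), a Monte Carlo life model can pair a Paris-law crack-growth calculation with sampled flaw-size and material distributions to estimate the risk of failure within a return-to-service interval. All constants below are placeholders.

        # Illustrative Monte Carlo life model combining fracture mechanics
        # (Paris law) with probabilistic inputs; values are placeholders.
        import math, random

        def cycles_to_failure(a0, C, m, dS=300.0, a_crit=5e-3):
            # Closed-form integration of da/dN = C * (dS*sqrt(pi*a))^m, m != 2
            k = C * (dS * math.sqrt(math.pi)) ** m
            p = 1.0 - m / 2.0
            return (a_crit ** p - a0 ** p) / (k * p)

        def failure_prob(interval, trials=2000):
            fails = 0
            for _ in range(trials):
                a0 = random.lognormvariate(math.log(2e-4), 0.3)  # initial flaw, m
                C = random.lognormvariate(math.log(1e-12), 0.2)  # Paris constant
                fails += cycles_to_failure(a0, C, m=3.0) < interval
            return fails / trials

        print(failure_prob(interval=500000))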

  16. Probabilistic model better defines development well risks

    SciTech Connect

    Connolly, M.R.

    1996-10-14

    Probabilistic techniques to compare and rank projects, such as the drilling of development wells, often are more representative than decision tree or deterministic approaches. As opposed to traditional deterministic methods, probabilistic analysis gives decision-makers ranges of outcomes with associated probabilities of occurrence. This article analyzes the drilling of a hypothetical development well with actual field data (such as stabilized initial rates, production declines, and gas/oil ratios) to calculate probabilistic reserves and production flow streams. Analog operating data were included to build distributions for capital and operating costs. Economics from the Monte Carlo simulation include probabilistic production flow streams and cost distributions. Results include single-parameter distributions (reserves, net present value, and profitability index) and time-function distributions (annual production and net cash flow).
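
    A compact sketch of the kind of Monte Carlo economics described, assuming an exponential decline curve and invented distributions for initial rate, decline, price, and cost.

        # Sketch: Monte Carlo economics for a development well with an
        # exponential decline curve; all inputs are made up.
        import random

        def npv_one_trial(rate=0.10):
            qi = random.lognormvariate(6.0, 0.4)        # initial rate, bbl/d
            decline = random.uniform(0.15, 0.35)        # fraction per year
            capex = random.triangular(2e6, 4e6, 3e6)    # drilling cost, $
            price, opex = 60.0, 12.0                    # $/bbl, assumed flat
            npv, q = -capex, qi
            for year in range(1, 21):
                npv += q * 365.0 * (price - opex) / (1 + rate) ** year
                q *= 1.0 - decline
            return npv

        trials = sorted(npv_one_trial() for _ in range(5000))
        p10, p50, p90 = trials[500], trials[2500], trials[4500]
        print(f"NPV 10th/50th/90th percentile: {p10:,.0f} / {p50:,.0f} / {p90:,.0f}")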

  17. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  18. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  19. A Probabilistic Formulation for Hausdorff Matching

    NASA Technical Reports Server (NTRS)

    Olson, Clark F.

    1998-01-01

    Matching images based on a Hausdorff measure has become popular for computer vision applications. In this paper, we develop a probabilistic formulation for Hausdorff matching in terms of maximum likelihood estimation.
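
    One plausible (hypothetical) reading of a likelihood-based Hausdorff match: each model point contributes its distance to the nearest image point, scored under a robust inlier/outlier mixture density. This illustrates the idea only; it is not the paper's formulation.

        # Sketch of a likelihood-style Hausdorff match score; the mixture
        # density and parameters are assumptions for illustration.
        import math

        def match_loglik(model_pts, image_pts, sigma=2.0, outlier=1e-3):
            ll = 0.0
            for mx, my in model_pts:
                d2 = min((mx - ix) ** 2 + (my - iy) ** 2 for ix, iy in image_pts)
                inlier = math.exp(-d2 / (2 * sigma ** 2)) / (2 * math.pi * sigma ** 2)
                ll += math.log(inlier + outlier)   # robust inlier/outlier mixture
            return ll

        print(match_loglik([(0, 0), (1, 1)], [(0.2, 0.1), (1.3, 0.8), (9, 9)]))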

  20. Emulation for probabilistic weather forecasting

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Barillec, Remi

    2010-05-01

    Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. We present the application of emulators to probabilistic weather
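
    A minimal emulator sketch in the framework described: fit a Gaussian process to a handful of simulator runs, then query the emulator (with uncertainty) at unobserved inputs far more cheaply than re-running the simulator. The "simulator" below is a cheap stand-in, not a weather model, and the kernel choice is an assumption.

        # Minimal Gaussian-process emulator sketch with a toy simulator.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def simulator(x):                       # "expensive" model, faked here
            return np.sin(3 * x) + 0.5 * x

        X_train = np.linspace(0, 2, 8).reshape(-1, 1)   # 8 training runs
        y_train = simulator(X_train).ravel()

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-6)
        gp.fit(X_train, y_train)

        X_new = np.array([[0.7], [1.9]])
        mean, std = gp.predict(X_new, return_std=True)  # prediction + uncertainty
        print(mean, std)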

  1. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Content, form, and timing of risk-based pricing... Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice required by § 222.72(a) or (c) must include: (i) A statement that...

  2. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Stress Test

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-Based Capital Stress Test A Appendix A to... MORTGAGE CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements Pt. 652, Subpt. B, App. A Appendix A to Subpart B of Part 652— Risk-Based Capital Stress Test 1.0Introduction. 2.0Credit Risk....

  3. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
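
    A toy sketch of the time-line mechanism (not the IMM architecture itself): events sit in a queue ordered by time, and rules let earlier events change what is pushed later, so outcomes depend on history as well as on probabilities. Event names, rates, and rules below are hypothetical.

        # Toy dynamic-PRA sketch: a time-ordered event queue with
        # history-dependent rules; all events and rates are invented.
        import heapq, random

        def one_mission(duration=180.0, supplies=2):
            # seed the time line with first-occurrence times (days)
            queue = [(random.expovariate(1 / 400.0), "illness"),
                     (random.expovariate(1 / 900.0), "injury")]
            heapq.heapify(queue)
            log = []
            while queue:
                day, event = heapq.heappop(queue)
                if day > duration:
                    break
                log.append((day, event))
                supplies -= 1                  # treating an event consumes the kit
                if event == "illness":         # rule: illness raises recurrence risk
                    heapq.heappush(queue, (day + random.expovariate(1 / 200.0), "illness"))
                if supplies == 0:              # rule: empty kit forces evacuation
                    log.append((day, "evacuation"))
                    break
            return log

        runs = [one_mission() for _ in range(10000)]
        p_evac = sum(any(e == "evacuation" for _, e in r) for r in runs) / len(runs)
        print("P(evacuation) ~", p_evac)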

  4. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  5. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  6. Simulating spatial and temporal varying CO2 signals from sources at the seafloor to help designing risk-based monitoring programs

    NASA Astrophysics Data System (ADS)

    Ali, Alfatih; Frøysa, Håvard G.; Avlesen, Helge; Alendal, Guttorm

    2016-01-01

    Risk-based monitoring requires quantification of the probability that the design detects potentially adverse events. A component in designing the monitoring program will be to predict the varying signal caused by an event, here the detection of a gas seep through the seafloor from an unknown location. The Bergen Ocean Model (BOM) is used to simulate dispersion of CO2 leaking from different locations in the North Sea, focusing on temporal and spatial variability of the CO2 concentration. It is shown that the statistical footprint depends on the seep location and that this will have to be accounted for in designing a network of sensors with the highest probability of detecting a seep. As a consequence, heterogeneous probabilistic predictions of CO2 footprints should be made available to subsea geological CO2 storage projects in order to meet regulations.

  7. Risk-based decision support tools: protecting rail-centered transit corridors from cascading effects.

    PubMed

    Greenberg, Michael R; Lowrie, Karen; Mayer, Henry; Altiok, Tayfur

    2011-12-01

    We consider the value of decision support tools for passenger rail system managers. First, we call for models that follow events along main rail lines and then into the surrounding environment, where they can cascade onto connected light rail, bus, auto, truck, and other transport modes. Second, we suggest that both probabilistic risk assessment (PRA-based) and agent-based models have a role to play at different scales of analysis and for different kinds of risks. Third, we argue that economic impact tools need more systematic evaluation. Fourth, we note that developers of decision support tools must balance their desire for theoretical elegance, and a tendency to focus only on high-consequence events, against decision-makers' mistrust of complex tools that they and their staff cannot manage and incorporate into routine operations, as well as against the high costs of developing, updating, and applying decision support tools to transport systems undergoing budget cuts and worker and service reductions. PMID:21564145

  8. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.

  9. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  10. Risk-based inspection in ASME Section XI

    SciTech Connect

    Lance, J.J.

    1996-12-01

    By 1970, the first edition of ASME Code Section XI, Inservice Inspection of Nuclear Reactor Coolant Systems, had been published. From its inception, the Section XI inservice inspection scope was based on a fundamental risk-based selection process: the scope included components where the consequences of a pressure boundary failure would be high. Once the consequence-significant system boundaries were established, inspections would then be performed at the locations believed to be most susceptible to service-induced failure. Current Section XI requirements specify that inspection locations be selected on the basis of the peak stress and fatigue usage values contained in the Design Reports. These original stress calculations were designed to qualify a design and assure that the plant would provide reliable service throughout its design life. For the most part, however, the fatigue usage values in these reports do not provide an accurate measure of service life, and as service history has demonstrated, the use of Design Report stresses and fatigue usage values can be misleading. The Section XI ISI requirements have always been intended to focus inspections on those locations in the plant that pose the greatest risk to reactor safety; this fundamental principle has guided Section XI since its inception. Today, however, utility resources are limited. The move in many states to deregulate utilities, and growing competition from independent power producers, is challenging Owners to reduce operating and maintenance cost without sacrificing safety. Risk-based inspection programs should allow plants to focus limited resources on those locations where damage mechanisms are active and consequences are high, providing for efficient use of plant resources and improved safety.

  11. Probabilistic safety assessment for high-level waste tanks at Hanford

    SciTech Connect

    Sullivan, L.H.; MacFarlane, D.R.; Stack, D.W.

    1996-12-31

    Los Alamos National Laboratory has performed a comprehensive probabilistic safety assessment (PSA), including consideration of external events, for the 18 tank farms at the Hanford Tank Farm (HTF). This work was sponsored by the Department of Energy/Environmental Restoration and Waste Management Division (DOE/EM).

  12. MOND using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raut, Usha

    2009-05-01

    MOND has been proposed as a viable alternative to the dark matter hypothesis. In the original MOND formulation [1], a modification of Newtonian dynamics was brought about by postulating new equations of particle motion at extremely low accelerations, as a possible explanation for the flat rotation curves of spiral galaxies. In this paper, we attempt a different approach to modifying the usual force laws by trying to link gravity with the probabilistic aspects of quantum mechanics [2]. To achieve this, one starts by replacing the classical notion of a continuous distance between two elementary particles with a statistical probability function, π. The gravitational force between two elementary particles can then be interpreted in terms of the probability of interaction between them. We attempt to show that such a modified gravitational force would fall off much more slowly than the usual inverse square law predicts, leading to revised MOND equations. In the limit that the statistical aggregate of the probabilities becomes equal to the usual inverse-square-law force, we recover Newtonian/Einstein gravity. [1] Milgrom, M. 1983, ApJ, 270, 365. [2] Goradia, S. 2002, arXiv.org/pdf/physics/0210040

  13. Advanced probabilistic method of development

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1987-01-01

    Advanced structural reliability methods are utilized on the Probabilistic Structural Analysis Methods (PSAM) project to provide a tool for analysis and design of space propulsion system hardware. The role of the effort at the University of Arizona is to provide reliability technology support to this project. PSAM computer programs will provide a design tool for analyzing uncertainty associated with thermal and mechanical loading, material behavior, geometry, and the analysis methods used. Specifically, reliability methods are employed to perform sensitivity analyses, to establish the distribution of a critical response variable (e.g., stress, deflection), to perform reliability assessment, and ultimately to produce a design which will minimize cost and/or weight. Uncertainties in the design factors of space propulsion hardware are described by probability models constructed using statistical analysis of data. Statistical methods are employed to produce a probability model, i.e., a statistical synthesis or summary of each design variable in a format suitable for reliability analysis and ultimately, design decisions.

  14. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. PECo committed to the U.S. Nuclear Regulatory Commission (NRC) to continue the PRA program prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. The PRA familiarization training is designed to be attended once by all nuclear group personnel so that they understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training and are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are the comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, the risk equation and risk summation, and fault trees and event trees. Building on these concepts, PRA insights and applications are then described, tailored to the audience.

  15. Dynamical systems probabilistic risk assessment.

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce the inherent conservatism built into regulatory metrics (e.g., allowable operating conditions and technical specifications) by quantifying both the total risk profile and the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time-horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  16. Representation of probabilistic scientific knowledge

    PubMed Central

    2013-01-01

    The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at https://github.com/larisa-soldatova/HELO PMID:23734675

  17. Probabilistic Description of Stellar Ensembles

    NASA Astrophysics Data System (ADS)

    Cerviño, Miguel

    I describe the modeling of stellar ensembles in terms of probability distributions. This modeling is primarily characterized by the number of stars included in the considered resolution element, whatever its physical (stellar cluster) or artificial (pixel/IFU) nature. It provides a solution of the direct problem of probabilistically characterizing the observables of stellar ensembles as a function of their physical properties. In addition, this characterization implies that intensive properties (like color indices) are intrinsically biased observables, although the bias decreases as the number of stars in the resolution element increases. In the case of a low number of stars in the resolution element (N < 10^5), the distributions of intensive and extensive observables follow nontrivial probability distributions. Such a situation can be computed by means of Monte Carlo simulations to which data mining techniques can be applied. Regarding the inverse problem of obtaining physical parameters from observational data, I show how some of the scatter in the data provides valuable physical information, since it is related to the system size (and the number of stars in the resolution element). Making use of such information, however, requires iterative procedures in the data analysis.

  18. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
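
    The Poisson-Binomial verification idea can be sketched directly: given forecast probabilities p_i for n independent events, the count of events that occur follows a Poisson-Binomial distribution, and an observed count that is improbable under that distribution is evidence against reliability. The forecast probabilities below are invented.

        # Sketch of reliability testing with the Poisson-Binomial
        # distribution; the exact PMF is built by dynamic programming.
        def poisson_binomial_pmf(ps):
            pmf = [1.0]                       # P(K = k) over k, updated per event
            for p in ps:
                nxt = [0.0] * (len(pmf) + 1)
                for k, w in enumerate(pmf):
                    nxt[k] += w * (1 - p)
                    nxt[k + 1] += w * p
                pmf = nxt
            return pmf

        forecasts = [0.8, 0.6, 0.9, 0.3, 0.7, 0.5, 0.85, 0.4]
        observed = 3                          # events that actually occurred
        pmf = poisson_binomial_pmf(forecasts)
        # exact-test p-value: total mass of outcomes no likelier than observed
        p_value = sum(w for w in pmf if w <= pmf[observed])
        print(f"P(K = {observed}) = {pmf[observed]:.3f}, p-value = {p_value:.3f}")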

  19. 12 CFR 1022.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... direct-mail offer or a take-one application, or in response to a solicitation under 12 CFR 1026.60, and... 12 Banks and Banking 8 2012-01-01 2012-01-01 false General requirements for risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.72 General requirements...

  20. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... direct-mail offer or a take-one application, or in response to a solicitation under 12 CFR 226.5a, and... 12 Banks and Banking 3 2011-01-01 2011-01-01 false General requirements for risk-based pricing... Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general. Except as...

  1. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... direct-mail offer or a take-one application, or in response to a solicitation under 12 CFR 226.5a, and... 12 Banks and Banking 3 2012-01-01 2012-01-01 false General requirements for risk-based pricing... Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general. Except as...

  2. 12 CFR 1022.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... direct-mail offer or a take-one application, or in response to a solicitation under 12 CFR 1026.60, and... 12 Banks and Banking 8 2013-01-01 2013-01-01 false General requirements for risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.72 General requirements...

  3. 12 CFR 1022.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... direct-mail offer or a take-one application, or in response to a solicitation under 12 CFR 1026.60, and... 12 Banks and Banking 8 2014-01-01 2014-01-01 false General requirements for risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.72 General requirements...

  4. 12 CFR 390.466 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Risk-based capital credit risk-weight categories. 390.466 Section 390.466 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY REGULATIONS TRANSFERRED FROM THE OFFICE OF THRIFT SUPERVISION Capital § 390.466 Risk-based capital credit...

  5. 12 CFR 956.4 - Risk-based capital requirement for investments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement for investments... OFF-BALANCE SHEET ITEMS FEDERAL HOME LOAN BANK INVESTMENTS § 956.4 Risk-based capital requirement for... the investments multiplied by: (a) A factor associated with the credit rating of the investments...

  6. 12 CFR Appendix A to Part 3 - Risk-Based Capital Guidelines

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Risk-Based Capital Guidelines A Appendix A to Part 3 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY MINIMUM CAPITAL RATIOS; ISSUANCE OF DIRECTIVES Pt. 3, App. A Appendix A to Part 3—Risk-Based Capital Guidelines Section 1. Purpose, Applicability of Guidelines,...

  7. 12 CFR 702.103 - Applicability of risk-based net worth requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Applicability of risk-based net worth... AFFECTING CREDIT UNIONS PROMPT CORRECTIVE ACTION Net Worth Classification § 702.103 Applicability of risk-based net worth requirement. For purposes of § 702.102, a credit union is defined as “complex” and...

  8. 12 CFR 702.103 - Applicability of risk-based net worth requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AFFECTING CREDIT UNIONS PROMPT CORRECTIVE ACTION Net Worth Classification § 702.103 Applicability of risk... risk-based net worth requirement is applicable only if the credit union meets both of the following... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Applicability of risk-based net...

  9. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... You will perform the risk-based capital stress test as described in summary form below and as...) Data requirements. You will use the following data to implement the risk-based capital stress test. (1... capital stress test. (2) You will use Call Report data as the basis for Corporation data over the...

  10. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Risk-based capital requirement for acquired... ASSETS AND OFF-BALANCE SHEET ITEMS ACQUIRED MEMBER ASSETS § 955.6 Risk-based capital requirement for... losses as support for the credit risk of all AMA estimated by the Bank to represent a credit risk that...

  11. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Risk-based capital requirement for acquired... ASSETS AND OFF-BALANCE SHEET ITEMS ACQUIRED MEMBER ASSETS § 955.6 Risk-based capital requirement for... losses as support for the credit risk of all AMA estimated by the Bank to represent a credit risk that...

  12. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement for acquired... ASSETS AND OFF-BALANCE SHEET ITEMS ACQUIRED MEMBER ASSETS § 955.6 Risk-based capital requirement for... losses as support for the credit risk of all AMA estimated by the Bank to represent a credit risk that...

  13. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-based capital requirement for acquired... ASSETS AND OFF-BALANCE SHEET ITEMS ACQUIRED MEMBER ASSETS § 955.6 Risk-based capital requirement for... losses as support for the credit risk of all AMA estimated by the Bank to represent a credit risk that...

  14. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Prepayment of quarterly risk-based assessments. 327.12 Section 327.12 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  15. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Prepayment of quarterly risk-based assessments. 327.12 Section 327.12 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  16. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Prepayment of quarterly risk-based assessments. 327.12 Section 327.12 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  17. 12 CFR Appendix A to Part 3 - Risk-Based Capital Guidelines

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Risk-Based Capital Guidelines A Appendix A to Part 3 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY MINIMUM CAPITAL RATIOS; ISSUANCE OF DIRECTIVES Pt. 3, App. A Appendix A to Part 3—Risk-Based Capital Guidelines Section 1. Purpose, Applicability of Guidelines,...

  18. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. The rosetting assay is commonly used to determine the interaction of Fc γ-receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10^-5 μm^4 for the FcγRIII-IgG interaction, 4.66 × 10^-3 μm^4 for the P-selectin-PSGL-1 interaction, and 0.94 × 10^-3 μm^4 for the L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
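
    A sketch of the model used in reverse, assuming the stated proportionality takes the form <n> = m_r * m_l * (Ac*Ka), with site densities in μm^-2 and the apparent 2-D affinity Ac*Ka in μm^4. The counts and densities below are invented.

        # Sketch: estimate the apparent 2-D affinity from the mean rosette
        # size, assuming <n> = m_r * m_l * (Ac*Ka); inputs are invented.
        def apparent_affinity(rosette_sizes, m_receptor, m_ligand):
            mean_size = sum(rosette_sizes) / len(rosette_sizes)
            # <n> = m_r * m_l * (Ac*Ka)  =>  Ac*Ka = <n> / (m_r * m_l)
            return mean_size / (m_receptor * m_ligand)

        sizes = [2, 0, 1, 3, 1, 2, 0, 4, 1, 2]            # RBCs per target cell
        AcKa = apparent_affinity(sizes, m_receptor=50.0,  # sites per um^2
                                 m_ligand=30.0)
        print(f"apparent 2-D affinity ~ {AcKa:.2e} um^4")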

  19. Neural representation of probabilistic information.

    PubMed

    Barber, M J; Clark, J W; Anderson, C H

    2003-08-01

    It has been proposed that populations of neurons process information in terms of probability density functions (PDFs) of analog variables. Such analog variables range, for example, from target luminance and depth on the sensory interface to eye position and joint angles on the motor output side. The requirement that analog variables must be processed leads inevitably to a probabilistic description, while the limited precision and lifetime of the neuronal processing units lead naturally to a population representation of information. We show how a time-dependent probability density ρ(x; t) over a variable x, residing in a specified function space of dimension D, may be decoded from the neuronal activities in a population as a linear combination of certain decoding functions φ_i(x), with coefficients given by the N firing rates a_i(t) (generally with D < N). We show how the neuronal encoding process may be described by projecting a set of complementary encoding functions φ̂_i(x) on the probability density ρ(x; t), and passing the result through a rectifying nonlinear activation function. We show how both encoders φ̂_i(x) and decoders φ_i(x) may be determined by minimizing cost functions that quantify the inaccuracy of the representation. Expressing a given computation in terms of manipulation and transformation of probabilities, we show how this representation leads to a neural circuit that can carry out the required computation within a consistent Bayesian framework, with the synaptic weights being explicitly generated in terms of encoders, decoders, conditional probabilities, and priors. PMID:14511515
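
    The linear decoding step can be sketched as follows, with Gaussian bumps standing in for the decoding functions φ_i(x); the basis and firing rates below are illustrative assumptions, not the paper's.

        # Sketch: reconstruct rho(x) as a linear combination of decoding
        # functions weighted by firing rates (Gaussian basis, made-up rates).
        import numpy as np

        x = np.linspace(-1, 1, 201)
        centers = np.linspace(-1, 1, 10)                          # N = 10 neurons
        phi = np.exp(-(x[None, :] - centers[:, None]) ** 2 / 0.05)  # decoders
        rates = np.array([0.1, 0.5, 2.0, 3.5, 2.2, 0.7, 0.2, 0.1, 0.0, 0.0])

        rho = rates @ phi                                  # linear decode
        rho = np.clip(rho, 0, None)                        # keep density nonnegative
        rho /= np.trapz(rho, x)                            # normalize to a PDF
        print("decoded mean:", np.trapz(x * rho, x))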

  20. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  1. Learning probabilistic document template models via interaction

    NASA Astrophysics Data System (ADS)

    Ahmadullin, Ildus; Damera-Venkata, Niranjan

    2013-03-01

    Document aesthetics measures are key to automated document composition. Recently we presented a probabilistic document model (PDM), a micro-model for document aesthetics based on probabilistic modeling of designer choice in document design. The PDM comes with efficient layout synthesis algorithms once the aesthetic model is defined. A key element of this approach is an aesthetic prior on the template parameters that encodes aesthetic preferences. Previously, the parameters of this prior had to be chosen empirically by designers. In this work we show how probabilistic template models (and hence the PDM cost function) can be learnt directly by observing a designer making design choices while composing sample documents. From such training data, our learning approach can learn a quality measure that mimics some of the design tradeoffs a designer makes in practice.
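
    A minimal sketch of the learning idea, under the strong simplifying assumption that the aesthetic prior is a Gaussian over two hypothetical template parameters, fit by maximum likelihood to parameters recovered from designer-composed pages; the parameter names and data are invented for illustration.

```python
import numpy as np

# Hypothetical template parameters recovered from designer-composed pages:
# columns are [margin_fraction, image_scale] (invented names).
observed = np.array([
    [0.050, 0.90],
    [0.045, 0.85],
    [0.060, 0.95],
    [0.055, 0.88],
])

mu = observed.mean(axis=0)              # learned prior mean
cov = np.cov(observed, rowvar=False)    # learned prior covariance

def log_prior(theta):
    """Log-density (up to a constant) of a candidate layout under the prior."""
    d = theta - mu
    return -0.5 * d @ np.linalg.solve(cov, d)

# Rank two candidate layouts by learned aesthetic preference.
print(log_prior(np.array([0.052, 0.89])))   # close to the observed style
print(log_prior(np.array([0.150, 0.50])))   # far from the observed style
```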

  2. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architectures are preferable to shared-memory architectures for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
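
    Since Monte Carlo fatigue sampling is embarrassingly parallel, the distributed strategy the paper investigates can be illustrated with a few worker processes. This sketch uses a Basquin-type S-N life relation with invented material constants and distributions; it shows the parallelization pattern, not the paper's code.

```python
import numpy as np
from multiprocessing import Pool

def fatigue_life_batch(args):
    """One worker's Monte Carlo batch: Basquin life N = A * S**(-m)."""
    seed, n = args
    rng = np.random.default_rng(seed)
    stress = rng.lognormal(np.log(300.0), 0.1, n)   # stress amplitude [MPa]
    A = rng.lognormal(np.log(1e12), 0.2, n)         # Basquin coefficient
    return A * stress ** (-3.0)                     # cycles to failure

if __name__ == "__main__":
    batches = [(seed, 250_000) for seed in range(8)]
    with Pool(processes=8) as pool:                 # distributed across workers
        lives = np.concatenate(pool.map(fatigue_life_batch, batches))
    print("P(failure before 2e4 cycles):", np.mean(lives < 2e4))
```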

  3. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities that were affected by the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via comparison with eight other damage models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties in damage estimation remain high. Thus, a significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
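
    A minimal sketch of the bagging-decision-tree idea behind BT-FLEMO, using scikit-learn on synthetic data: each bootstrapped tree yields one loss estimate, and the spread across trees supplies the probability distribution of damage. Predictors, data, and coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0.0, 3.0, n),     # water depth [m]
    rng.uniform(0.0, 72.0, n),    # inundation duration [h]
    rng.integers(1, 4, n),        # building quality class
])
# Synthetic relative loss standing in for survey data.
y = np.clip(0.2 * X[:, 0] + 0.002 * X[:, 1] - 0.03 * X[:, 2]
            + rng.normal(0.0, 0.05, n), 0.0, 1.0)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                         random_state=0).fit(X, y)

# One prediction per tree for a land-use unit -> empirical loss distribution.
unit = np.array([[1.5, 24.0, 2.0]])
per_tree = np.array([tree.predict(unit)[0] for tree in model.estimators_])
print("mean loss:", round(per_tree.mean(), 3),
      "90% interval:", np.quantile(per_tree, [0.05, 0.95]).round(3))
```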

  4. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but these efforts have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
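
    A minimal sketch of the Monte Carlo PTHA logic: draw earthquakes from an assumed source rate and a truncated Gutenberg-Richter magnitude distribution, map each event to a coastal tsunami height with aleatory scatter, and convert the synthetic catalog into annual exceedance probabilities. All rates and the height relation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
years = 100_000                  # length of the synthetic catalog
annual_rate = 0.05               # events/year for a single source zone (assumed)

n_events = rng.poisson(annual_rate * years)
# Truncated Gutenberg-Richter magnitudes (b = 1) between Mw 7.0 and 9.0.
u = rng.random(n_events)
mw = 7.0 - np.log10(1.0 - u * (1.0 - 10.0 ** -(9.0 - 7.0)))
# Crude magnitude-to-coastal-height relation with lognormal aleatory scatter.
height = 10.0 ** (0.5 * (mw - 7.0)) * rng.lognormal(0.0, 0.5, n_events)

for h in (0.5, 3.0):
    rate = np.count_nonzero(height > h) / years
    print(f"annual P(height > {h} m) = {1.0 - np.exp(-rate):.4f}")
```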

  5. Uncertainty in Coastal Inundation Mapping: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Leon, J. X.; Callaghan, D. P.; Heuvelink, G.; Mills, M.; Phinn, S. R.

    2014-12-01

    Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly as extreme high sea levels and associated erosion are forecast to increase in magnitude. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and spatial analysis propagate into the inundation mapping. Error propagation within spatial modelling can be appropriately analysed using, for example, a probabilistic framework based on geostatistical simulations. Geostatistical modelling takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The aim of this study was to produce probability maps that incorporate the impacts of spatially variable and spatially correlated elevation errors in high-resolution DEMs combined with sea level rise uncertainties. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Sea level rise uncertainty was non-parametrically modelled using 1000 Monte Carlo estimations, which were processed to provide the probability density function numerically. The sea level rise uncertainties were modelled using a Weibull distribution with 0.95 scale and 2.2 shape parameters. These uncertainties were combined through addition (i.e., assuming they are independent), which, when working with probability density functions, requires a convolution. This probabilistic approach can be used in a risk-aversive decision making process by planning for
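
    A minimal sketch of the probabilistic mapping idea: spatially correlated DEM errors (a smoothed Gaussian field standing in for full sequential Gaussian simulation) are combined with sea level rise draws from the Weibull distribution quoted above (scale 0.95, shape 2.2). The toy DEM and error magnitudes are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
ny, nx = 100, 100
dem = np.tile(np.linspace(0.0, 4.0, nx), (ny, 1))   # toy coastal ramp [m]

n_sim = 500
flooded = np.zeros((ny, nx))
for _ in range(n_sim):
    # Correlated elevation error: smoothed white noise, rescaled to an
    # assumed 0.15 m standard deviation (stand-in for sequential Gaussian
    # simulation conditioned on land cover and terrain).
    err = gaussian_filter(rng.standard_normal((ny, nx)), sigma=5)
    err *= 0.15 / err.std()
    slr = rng.weibull(2.2) * 0.95    # SLR draw [m]: Weibull(shape 2.2, scale 0.95)
    flooded += (dem + err) < slr

prob_map = flooded / n_sim           # per-cell probability of inundation
print("fraction of cells with P(flooded) > 0.5:", np.mean(prob_map > 0.5))
```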

  6. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime, and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database, which can be constructed from plant-specific operational experience and test and maintenance reports, to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, a well-known Markov chain Monte Carlo (MCMC) method. This multiple hypothesis testing
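
    A minimal sketch of the particle-filtering idea with proposal injection: a parameter jumps abruptly from nominal to a degraded value, and a bootstrap particle filter tracks it from noisy measurements, with a fraction of particles injected from a list of plausible degradation modes (standing in for the reliability-database proposal densities) to counter sample impoverishment. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_p = 100, 2000
true = np.where(np.arange(T) < 50, 1.0, 0.6)     # abrupt degradation at t = 50
obs = true + rng.normal(0.0, 0.05, T)            # noisy measurements

proposal_modes = np.array([0.8, 0.6, 0.4])       # plausible degraded values
particles = 1.0 + rng.normal(0.0, 0.02, n_p)     # start at nominal

for t in range(T):
    particles += rng.normal(0.0, 0.005, n_p)     # small random-walk evolution
    # Inject 5% of particles from the degradation-mode proposal density.
    k = n_p // 20
    idx = rng.choice(n_p, size=k, replace=False)
    particles[idx] = rng.choice(proposal_modes, size=k) + rng.normal(0.0, 0.02, k)
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.05) ** 2)   # Gaussian likelihood
    w /= w.sum()
    particles = particles[rng.choice(n_p, size=n_p, p=w)]   # resample
    if t in (49, 55, 99):
        print(f"t={t}: estimate={particles.mean():.3f}, true={true[t]:.1f}")
```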

  7. Hanford Mission Plan risk-based prioritization methodologies

    SciTech Connect

    Hesser, W.A.; Madden, M.S.; Pyron, N.M.; Butcher, J.L.

    1994-08-01

    Sites across the US Department of Energy (DOE) complex recognize the critical need for a systematic method for prioritizing among their work scope activities. At the Hanford Site, Pacific Northwest Laboratory and Westinghouse Hanford Company (WHC) conducted preliminary research into techniques to meet this need and assist managers in making financial resource allocation decisions. This research is a subtask of the risk management task of the Hanford Mission Plan as described in the WHC Integrated Planning Work Breakdown Structure 1.8.2 Fiscal Year 1994 Work Plan. The research team investigated prioritization techniques used at other DOE sites and compared them with the Priority Planning Grid (PPG), a tool used at Hanford. The authors concluded that the PPG could be used for prioritization of resource allocation, but it needed to be revised to better reflect the Site's priorities and objectives. The revised PPG was tested with three Hanford programs, the PPG was modified, and updated procedures were prepared.

  8. An evaluation of the role of risk-based decision-making in a former manufactured gas plant site remediation.

    PubMed

    Vyas, Vikram M; Gochfeld, Michael G; Georgopoulos, Panos G; Lioy, Paul J; Sussman, Nancy R

    2006-02-01

    Environmental remediation decisions are driven by the need to minimize human health and ecological risks posed by environmental releases. The Risk Assessment Guidance for Superfund Sites enunciates the principles of exposure and risk assessment that are to be used for reaching remediation decisions for sites under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). Experience with remediation management under CERCLA has led to recognition of some crucial infirmities in the processes for managing remediation: cleanup management policies are ad hoc in character, mandates and practices are strongly conservative, and contaminant risk management occurs in an artificially narrow context. The purpose of this case study is to show how a policy of risk-based decision-making was used to avoid customary pitfalls in site remediation. This case study describes the risk-based decision-making process in a remedial action program at a former manufactured gas plant site that successfully achieved timely and effective cleanup. The remediation process operated outside the confines of the CERCLA process under an administrative consent order between the utility and the New Jersey Department of Environmental Protection. A residential use end state was negotiated as part of this agreement. The attendant uncertainties, complications, and unexpected contingencies were overcome by using the likely exposures associated with the desired end state to structure all of the remediation management decisions and by collecting site-specific information from the very outset to obtain a detailed and realistic characterization of human health risks that needed to be mitigated. The lessons from this case study are generalizable to more complicated remediation cases, when supported by correspondingly sophisticated technical approaches. PMID:16570377

  9. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application of the Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) families of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components is described. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have typically been used in past deterministic analysis studies.

  10. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    1999-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.
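
    A minimal sketch of the approach: propagate assumed uncertainty distributions on component efficiencies through a grossly simplified performance relation and read off the CDF. The toy SFC model and distributions are stand-ins for the full cycle analysis.

```python
import numpy as np

# Toy stand-in: SFC worsens as component efficiencies fall.
rng = np.random.default_rng(4)
n = 100_000
eta_comp = rng.normal(0.88, 0.01, n)      # compressor efficiency
eta_turb = rng.normal(0.90, 0.01, n)      # turbine efficiency
sfc = 0.50 / (eta_comp * eta_turb)        # toy SFC [lbm/hr/lbf]

sfc_sorted = np.sort(sfc)
for q in (0.05, 0.50, 0.95):
    print(f"SFC at the {q:.0%} CDF level: {sfc_sorted[int(q * n)]:.4f}")
```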

  11. Finite element methods in probabilistic mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing Kam; Mani, A.; Belytschko, Ted

    1987-01-01

    Probabilistic methods, synthesizing the power of finite element methods with second-order perturbation techniques, are formulated for linear and nonlinear problems. Random material properties, geometric properties, and loads can be incorporated in these methods, in terms of their fundamental statistics. By construction, these methods are applicable when the scale of randomness is not too large and when the probability density functions have decaying tails. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainty. Applications showing the effects of combined random fields and cyclic loading/stress reversal are studied and compared with Monte Carlo simulation results.
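
    A minimal sketch of second-order perturbation propagation for a scalar response of random inputs, the core idea behind this probabilistic finite element formulation, here applied to a cantilever tip deflection chosen purely for illustration.

```python
import math

# Response g(E, P): tip deflection of a cantilever (illustrative choice).
E_mean, E_sd = 200e9, 10e9        # Young's modulus [Pa]
P_mean, P_sd = 1e4, 1e3           # tip load [N]
L, I = 2.0, 8e-6                  # length [m], second moment of area [m^4]

def g(E, P):
    return P * L**3 / (3 * E * I)

h = 1e-4
# First derivatives by central differences.
gE = (g(E_mean * (1 + h), P_mean) - g(E_mean * (1 - h), P_mean)) / (2 * h * E_mean)
gP = (g(E_mean, P_mean * (1 + h)) - g(E_mean, P_mean * (1 - h))) / (2 * h * P_mean)
# Second derivative in E (g is linear in P, so gPP = 0).
gEE = (g(E_mean * (1 + h), P_mean) - 2 * g(E_mean, P_mean)
       + g(E_mean * (1 - h), P_mean)) / (h * E_mean) ** 2

# Second-order mean correction; first-order variance (independent inputs).
mean2 = g(E_mean, P_mean) + 0.5 * gEE * E_sd**2
var1 = (gE * E_sd) ** 2 + (gP * P_sd) ** 2
print(f"mean deflection ~ {mean2:.4e} m, std ~ {math.sqrt(var1):.4e} m")
```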

  12. HFIR vessel probabilistic fracture mechanics analysis

    SciTech Connect

    Cheverton, R.D.; Dickson, T.L.

    1997-01-01

    The life of the High Flux Isotope Reactor (HFIR) pressure vessel is limited by a radiation-induced reduction in the material's fracture toughness. Hydrostatic proof testing and probabilistic fracture mechanics analyses are being used to meet the intent of the ASME Code, while extending the life of the vessel well beyond its original design value. The most recent probabilistic evaluation is more precise and accounts for the effects of gamma as well as neutron radiation embrittlement. This analysis confirms the earlier estimates of a permissible vessel lifetime of at least 50 EFPY (100 MW).

  13. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  14. Is risk-based regulation feasible? The case of polybrominated diphenyl ethers (PBDEs).

    PubMed

    MacGillivray, Brian Hector; Alcock, Ruth E; Busby, Jerry

    2011-02-01

    The polybrominated diphenyl ethers (PBDEs) are a class of brominated flame retardants used extensively in an array of textiles and plastics. Initially viewed as inert and nontoxic, in recent years an emerging body of science has cast doubt on this perception. Consequently, the compounds have drawn sustained government, media, and lobby group focus in the United States and Europe, yet have taken contrasting trajectories in different risk regulation regimes. We present a longitudinal analysis of these pathways, examining the actions of legislatures, executives, courts, scientists, and pressure groups. We show that the emergence and resolution of PBDEs as a risk issue was strongly shaped by path dependency, political entrainment (inter-institutional conflict unrelated to PBDEs), and partisan lawmaking. This raises the question of whether risk-based principles are capable of being the foundation on which managing the potential for harm can be based--even when that harm is associated with specific objects like flame-retardant chemicals. We conclude by reflecting on the difficult normative issues that are raised. PMID:20880219

  15. Risk-Based Decision Process for Accelerated Closure of a Nuclear Weapons Facility

    SciTech Connect

    Butler, L.; Norland, R. L.; DiSalvo, R.; Anderson, M.

    2003-02-25

    Nearly 40 years of nuclear weapons production at the Rocky Flats Environmental Technology Site (RFETS or Site) resulted in contamination of soil and underground systems and structures with hazardous substances, including plutonium, uranium and hazardous waste constituents. The Site was placed on the National Priority List in 1989. There are more than 370 Individual Hazardous Substance Sites (IHSSs) at RFETS. Accelerated cleanup and closure of RFETS is being achieved through implementation and refinement of a regulatory framework that fosters programmatic and technical innovations: (1) extensive use of "accelerated actions" to remediate IHSSs, (2) development of a risk-based screening process that triggers and helps define the scope of accelerated actions consistent with the final remedial action objectives for the Site, (3) use of field instrumentation for real time data collection, (4) a data management system that renders near real time field data assessment, and (5) a regulatory agency consultative process to facilitate timely decisions. This paper presents the process and interim results for these aspects of the accelerated closure program applied to Environmental Restoration activities at the Site.

  16. Risk based in vitro performance assessment of extended release abuse deterrent formulations.

    PubMed

    Xu, Xiaoming; Gupta, Abhay; Al-Ghabeish, Manar; Calderon, Silvia N; Khan, Mansoor A

    2016-03-16

    High-strength extended release opioid products, which are indispensable tools in the management of pain, are associated with serious risks of unintentional and potentially fatal overdose, as well as of misuse and abuse that might lead to addiction. The issue of drug abuse becomes increasingly prominent when the dosage forms can be readily manipulated to release a high amount of opioid, or when the drug can be extracted from certain products with common solvents. One approach to deter opioid drug abuse is by providing novel abuse deterrent formulations (ADF), with properties that may be viewed as barriers to abuse of the product. However, unlike regular extended release formulations, assessment of ADF technologies is challenging, in part due to the great variety of formulation designs available to achieve deterrence of abuse by oral, parenteral, nasal and respiratory routes. With limited prior history or literature information, and lack of compendial standards, evaluation and regulatory approval of these novel drug products become increasingly difficult. The present article describes a risk-based standardized in-vitro approach that can be utilized in general evaluation of abuse deterrent features for all ADF products. PMID:26784976

  17. A Risk-based Model Predictive Control Approach to Adaptive Interventions in Behavioral Health.

    PubMed

    Zafra-Cabeza, Ascensión; Rivera, Daniel E; Collins, Linda M; Ridao, Miguel A; Camacho, Eduardo F

    2011-07-01

    This paper examines how control engineering and risk management techniques can be applied in the field of behavioral health through their use in the design and implementation of adaptive behavioral interventions. Adaptive interventions are gaining increasing acceptance as a means to improve prevention and treatment of chronic, relapsing disorders, such as abuse of alcohol, tobacco, and other drugs, mental illness, and obesity. A risk-based Model Predictive Control (MPC) algorithm is developed for a hypothetical intervention inspired by Fast Track, a real-life program whose long-term goal is the prevention of conduct disorders in at-risk children. The MPC-based algorithm decides on the appropriate frequency of counselor home visits, mentoring sessions, and the availability of after-school recreation activities by relying on a model that includes identifiable risks, their costs, and the cost/benefit assessment of mitigating actions. MPC is particularly suited for the problem because of its constraint-handling capabilities, and its ability to scale to interventions involving multiple tailoring variables. By systematically accounting for risks and adapting treatment components over time, an MPC approach as described in this paper can increase intervention effectiveness and adherence while reducing waste, resulting in advantages over conventional fixed treatment. A series of simulations are conducted under varying conditions to demonstrate the effectiveness of the algorithm. PMID:21643450
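
    A minimal receding-horizon sketch of the risk-based MPC idea: at each review, enumerate dosage sequences over a short horizon, simulate a toy linear risk model, and apply the first move of the cheapest sequence that keeps predicted risk under a cap. The dynamics, costs, and constraint are invented; this is not the Fast Track model.

```python
import numpy as np
from itertools import product

a, b = 0.95, 0.08          # risk persistence and per-dose treatment effect
doses = (0, 1, 2)          # e.g., home-visit frequency levels
horizon, cap = 3, 0.5      # lookahead steps, maximum acceptable risk
dose_cost, penalty = 1.0, 10.0

def plan(risk):
    """Enumerate dose sequences over the horizon; return the first move
    of the sequence with the lowest cost (dose cost + risk-cap penalty)."""
    best_cost, best_u0 = np.inf, 0
    for seq in product(doses, repeat=horizon):
        r, cost = risk, 0.0
        for u in seq:
            r = a * r - b * u                     # predicted risk evolution
            cost += dose_cost * u + penalty * max(r - cap, 0.0)
        if cost < best_cost:
            best_cost, best_u0 = cost, seq[0]
    return best_u0

rng = np.random.default_rng(5)
risk = 0.9
for t in range(12):
    u = plan(risk)                                # receding-horizon decision
    risk = max(0.0, a * risk - b * u + rng.normal(0, 0.02))  # "true" dynamics
    print(f"t={t}: dose={u}, risk={risk:.2f}")
```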

  18. Risk-based decision-making framework for the selection of sediment dredging option.

    PubMed

    Manap, Norpadzlihatun; Voulvoulis, Nikolaos

    2014-10-15

    The aim of this study was to develop a risk-based decision-making framework for the selection of a sediment dredging option. The newly integrated, holistic, staged framework is described using case studies. The first stage feeds historical dredging monitoring data and data on contamination levels in environmental media into Ecological Risk Assessment phases, which have been altered for benefits in cost, time and simplicity. The next stage describes how Multi-Criteria Decision Analysis (MCDA) can be used to analyze and prioritize dredging areas based on environmental, socio-economic and managerial criteria. The results from MCDA are integrated into the Ecological Risk Assessment to characterize the degree of contamination in the prioritized areas. The last stage uses these findings, analyzed with MCDA, to identify the best sediment dredging option, accounting for the economic, environmental and technical aspects of dredging, which is beneficial for the dredging and sediment management industries. PMID:25108801
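
    A minimal sketch of the MCDA stage: candidate dredging areas are scored against environmental, socio-economic, and managerial criteria with a weighted sum after min-max normalization. Criteria values and weights are invented for illustration.

```python
import numpy as np

areas = ["A", "B", "C"]
# Columns: contamination level, cost [M$], community impact (lower is better).
scores = np.array([
    [0.8, 2.0, 0.3],
    [0.5, 1.0, 0.6],
    [0.9, 3.5, 0.2],
])
benefit = np.array([True, False, False])   # only contamination is "higher = riskier"
weights = np.array([0.5, 0.3, 0.2])        # stakeholder-assigned importance

norm = (scores - scores.min(axis=0)) / np.ptp(scores, axis=0)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]   # invert cost-type criteria
priority = norm @ weights
for area, score in sorted(zip(areas, priority), key=lambda t: -t[1]):
    print(area, round(score, 3))
```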

  19. Impact of Probabilistic Weather on Flight Routing Decisions

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Sridhar, Banavar; Mulfinger, Daniel

    2006-01-01

    Flight delays in the United States have been found to increase year after year, along with the increase in air traffic. During the four-month period from May through August of 2005, weather-related delays accounted for roughly 70% of all reported delays. The current weather prediction in the tactical (within 2 hours) timeframe is at manageable levels; however, the state of forecasting weather for the strategic (2-6 hours) timeframe is still not dependable for long-term planning. In the absence of reliable severe weather forecasts, decision-making for flights longer than two hours is challenging. This paper presents an approach for using probabilistic weather prediction in Traffic Flow Management, and a general method for using this prediction to estimate expected values of flight length and delays in the National Airspace System (NAS). The current state-of-the-art convective weather forecasting is employed to aid decision makers in arriving at decisions for traffic flow and flight planning. The six-agency effort working on the Next Generation Air Transportation System (NGATS) has considered weather-assimilated decision-making as one of the principal foci out of a list of eight. The Weather Integrated Product Team has considered integrated weather information and improved aviation weather forecasts as two of the main efforts (Ref. 1, 2). Recently, research has focused on the concept of operations for strategic traffic flow management (Ref. 3) and how weather data can be integrated for improved decision-making for efficient traffic management initiatives (Ref. 4, 5). An overview of the weather data needs and benefits of various participants in the air traffic system, along with available products, can be found in Ref. 6. Previous work related to the use of weather data in identifying and categorizing pilot intrusions into severe weather regions (Ref. 7, 8) has demonstrated a need for better forecasting in the strategic planning timeframes and moving towards a

  20. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    Analyses of various NASA Langley Research Center and other-center projects were attempted, to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and the usage of PRA techniques were reviewed for large projects. It was found that both civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the cold war and the Challenger disaster, the ingrained anti-PRA culture has proved hard to overcome. Examples of skepticism against the use of risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques are much broader than just the traditional safety-related areas of application. The time has come to begin to uniformly apply these techniques. A risk-based system can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, from pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.

  1. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  2. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  3. A risk-based approach to robotic mission requirements

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Bourke, Roger D.

    1992-01-01

    A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.

  4. Risk-based assessment of the surety of information systems

    SciTech Connect

    Jansma, R.M.; Fletcher, S.K.; Murphy, M.D.; Lim, J.J.; Wyss, G.D.

    1996-07-01

    When software is used in safety-critical, security-critical, or mission-critical situations, it is imperative to understand and manage the risks involved. A risk assessment methodology and toolset have been developed which are specific to software systems and address a broad range of risks including security, safety, and correct operation. A unique aspect of this methodology is the use of a modeling technique that captures interactions and tradeoffs among risk mitigators. This paper describes the concepts and components of the methodology and presents its application to example systems.

  5. Risk-Based Disposal Plan for PCB Paint in the TRA Fluorinel Dissolution Process Mockup and Gamma Facilities Canal

    SciTech Connect

    R. A. Montgomery

    2008-05-01

    This Toxic Substances Control Act Risk-Based Polychlorinated Biphenyl Disposal plan was developed for the Test Reactor Area Fluorinel Dissolution Process Mockup and Gamma Facilities Waste System, located in Building TRA-641 at the Reactor Technology Complex, Idaho National Laboratory Site, to address painted surfaces in the empty canal under 40 CFR 761.62(c) for paint, and under 40 CFR 761.61(c) for PCBs that may have penetrated into the concrete. The canal walls and floor will be painted with two coats of contrasting non-PCB paint and labeled as PCB. The canal is covered with open decking; the access grate is locked shut and signed to indicate PCB contamination in the canal. Access to the canal will require facility manager permission. Protective equipment for personnel and equipment entering the canal will be required. Waste from the canal, generated during ultimate Decontamination and Decommissioning, shall be managed and disposed as PCB Bulk Product Waste.

  6. Holistic risk-based environmental decision making: a Native perspective.

    PubMed Central

    Arquette, Mary; Cole, Maxine; Cook, Katsi; LaFrance, Brenda; Peters, Margaret; Ransom, James; Sargent, Elvera; Smoke, Vivian; Stairs, Arlene

    2002-01-01

    Native American Nations have become increasingly concerned about the impacts of toxic substances. Although risk assessment and risk management processes have been used by government agencies to help estimate and manage risks associated with exposure to toxicants, these tools have many inadequacies and as a result have not served Native people well. In addition, resources have not always been adequate to address the concerns of Native Nations, and involvement of Native decision makers on a government-to-government basis in discussions regarding risk has only recently become common. Finally, because the definitions of health used by Native people are strikingly different from that of risk assessors, there is also a need to expand current definitions and incorporate traditional knowledge into decision making. Examples are discussed from the First Environment Restoration Initiative, a project that is working to address toxicant issues facing the Mohawk territory of Akwesasne. This project is developing a community-defined model in which health is protected at the same time that traditional cultural practices, which have long been the key to individual and community health, are maintained and restored. PMID:11929736

  7. Holistic risk-based environmental decision making: a Native perspective.

    PubMed

    Arquette, Mary; Cole, Maxine; Cook, Katsi; LaFrance, Brenda; Peters, Margaret; Ransom, James; Sargent, Elvera; Smoke, Vivian; Stairs, Arlene

    2002-04-01

    Native American Nations have become increasingly concerned about the impacts of toxic substances. Although risk assessment and risk management processes have been used by government agencies to help estimate and manage risks associated with exposure to toxicants, these tools have many inadequacies and as a result have not served Native people well. In addition, resources have not always been adequate to address the concerns of Native Nations, and involvement of Native decision makers on a government-to-government basis in discussions regarding risk has only recently become common. Finally, because the definitions of health used by Native people are strikingly different from that of risk assessors, there is also a need to expand current definitions and incorporate traditional knowledge into decision making. Examples are discussed from the First Environment Restoration Initiative, a project that is working to address toxicant issues facing the Mohawk territory of Akwesasne. This project is developing a community-defined model in which health is protected at the same time that traditional cultural practices, which have long been the key to individual and community health, are maintained and restored. PMID:11929736

  8. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk- and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
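
    A minimal example of the kind of Bayesian data assessment the handbook covers: a conjugate Beta prior on a component failure probability updated with observed demand data. The prior and counts are illustrative.

```python
from scipy import stats

prior_a, prior_b = 1.0, 99.0        # Beta prior: mean failure probability ~0.01
failures, demands = 2, 300          # observed performance data

posterior = stats.beta(prior_a + failures, prior_b + demands - failures)
print("posterior mean:", round(float(posterior.mean()), 5))
print("90% credible interval:", posterior.ppf([0.05, 0.95]))
```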

  9. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), composed largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, phi. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model to fit the historical database of measurements of protons with energy > 30 MeV (Phi30). The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPE and GCR is in support of mission design and operational planning to manage radiation risks for space exploration.
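
    A minimal sketch of the nonhomogeneous Poisson idea: with an SPE intensity lambda(t) that peaks near solar maximum, the probability of at least one event in a mission window is 1 - exp(-integral of lambda(t) dt). The sinusoidal intensity below is an invented stand-in for the fitted model.

```python
import numpy as np

cycle_years = 11.0

def spe_rate(t):
    """Assumed SPE intensity [events/year]; t in years from solar minimum."""
    return 2.0 + 4.0 * (1.0 - np.cos(2.0 * np.pi * t / cycle_years))

def p_at_least_one(t0, duration, n_steps=1000):
    t = np.linspace(t0, t0 + duration, n_steps)
    lam = np.trapz(spe_rate(t), t)      # expected number of events in window
    return 1.0 - np.exp(-lam)

print("180-day window near solar minimum:", round(p_at_least_one(0.0, 0.5), 3))
print("180-day window near solar maximum:", round(p_at_least_one(5.5, 0.5), 3))
```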

  10. Translating Ensemble Weather Forecasts into Probabilistic User-Relevant Information

    NASA Astrophysics Data System (ADS)

    Steiner, Matthias; Sharman, Robert; Hopson, Thomas; Liu, Yubao; Chapman, Michael

    2010-05-01

    Weather-related decisions increasingly rely on probabilistic information as a means of assessing the risk of one potential outcome over another. Ensemble forecasting is one of the key approaches to capturing the uncertainty of weather forecasting. Moreover, in the future decision makers will rely on tools that fully integrate weather information into the decision making process. Through these decision support tools, weather information will be translated into impact information. This presentation will highlight the translation of gridded ensemble weather forecasts into probabilistic user-relevant information. Examples will be discussed that relate to the management of air traffic, noise and pollution dispersion, missile trajectory prediction, water resources and flooding, wind energy production, and road maintenance. The primary take-home message from these examples will be that weather forecasts have to be tailored with a specific user perspective in mind, rather than a "one size fits all" approach in which a standard forecast product gets thrown over the fence and the user has to figure out what to do with it.
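
    A minimal sketch of the translation step: an ensemble becomes an exceedance probability for a user's threshold, which feeds a classic cost/loss decision rule (protect when p exceeds C/L). All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
ensemble = rng.gamma(shape=2.0, scale=5.0, size=51)   # e.g., rainfall [mm]

threshold = 20.0                    # user-critical rainfall
p = np.mean(ensemble > threshold)   # exceedance probability from the ensemble

cost, loss = 1.0, 8.0               # cost of protecting vs loss if caught out
print(f"P(rain > {threshold} mm) = {p:.2f}")
print("decision:", "protect" if p > cost / loss else "do not protect")
```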

  11. Application of probabilistic ordinal optimization concepts to a continuous-variable probabilistic optimization problem.

    SciTech Connect

    Romero, Vicente Jose; Ayon, Douglas V.; Chen, Chun-Hung

    2003-09-01

    A very general and robust approach to solving optimization problems involving probabilistic uncertainty is through the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on crisp quantification of the alternatives. Thus, we simply ask the question: 'Is that alternative better or worse than this one?' to some level of statistical confidence we require, not: 'HOW MUCH better or worse is that alternative to this one?'. In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages versus non-ordinal approaches for optimization under uncertainty.
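
    A minimal sketch of the ordinal comparison step: decide whether design B beats design A by paired Monte Carlo sampling of a noisy objective, asking only which is better, never by how much. The response function and noise model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def response(design, n):
    """Noisy objective (smaller is better); two additive noise terms play
    the role of the two uncertain variables mentioned above."""
    x, y = design
    noise = rng.normal(0, 0.3, n) + rng.normal(0, 0.2, n)
    return (x - 1.0) ** 2 + (y + 0.5) ** 2 + noise

n = 2000
design_a, design_b = (0.2, 0.3), (0.8, -0.4)
p_b_better = np.mean(response(design_b, n) < response(design_a, n))
required = 0.95
print(f"P(B better than A) ~ {p_b_better:.3f}",
      "-> accept move" if p_b_better > required else "-> keep sampling")
```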

  12. Dynamic fluctuations in dopamine efflux in the prefrontal cortex and nucleus accumbens during risk-based decision making.

    PubMed

    St Onge, Jennifer R; Ahn, Soyon; Phillips, Anthony G; Floresco, Stan B

    2012-11-21

    Mesocorticolimbic dopamine (DA) has been implicated in cost/benefit decision making about risks and rewards. The prefrontal cortex (PFC) and nucleus accumbens (NAc) are two DA terminal regions that contribute to decision making in distinct manners. However, how fluctuations of tonic DA levels may relate to different aspects of decision making remains to be determined. The present study measured DA efflux in the PFC and NAc with microdialysis in well trained rats performing a probabilistic discounting task. Selection of a small/certain option always delivered one pellet, whereas another, large/risky option yielded four pellets, with probabilities that decreased (100-12.5%) or increased (12.5-100%) across four blocks of trials. Yoked-reward groups were also included to control for reward delivery. PFC DA efflux during decision making decreased or increased over a session, corresponding to changes in large/risky reward probabilities. Similar profiles were observed from yoked-rewarded rats, suggesting that fluctuations in PFC DA reflect changes in the relative rate of reward received. NAc DA efflux also showed decreasing/increasing trends over the session during both tasks. However, DA efflux was higher during decision making on free- versus forced-choice trials and during periods of greater reward uncertainty. Moreover, changes in NAc DA closely tracked shifts in choice biases. These data reveal dynamic and dissociable fluctuations in PFC and NAc DA transmission associated with different aspects of risk-based decision making. PFC DA may signal changes in reward availability that facilitates modification of choice biases, whereas NAc DA encodes integrated signals about reward rates, uncertainty, and choice, reflecting implementation of decision policies. PMID:23175840

  13. Probabilistic Shock Initiation Thresholds and QMU Applications

    SciTech Connect

    Hrousis, C A; Gresshoff, M; Overturf, G E

    2009-04-10

    The Probabilistic Threshold Criterion (PTC) Project at LLNL develops phenomenological criteria for establishing margin of safety or performance margin on high explosive (HE) initiation in the high-speed impact regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and to provide a probabilistic response based on test data. Recent work includes application of the PTC methodology to safety assessments involving a donor charge detonation and the need for assessment of a nearby acceptor charge's response, as well as flyer-acceptor configurations, with and without barriers. Results to date are in agreement with other less formal assessment protocols, and indicate a promising use for PTC-based assessments. In particular, there is interest in this approach because it supports the Quantified Margins and Uncertainties (QMU) framework for establishing confidence in the performance and/or safety of an HE system.

  14. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  15. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
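
    A minimal Monte Carlo sketch in the spirit of the paper: sample the fiber volume fraction and constituent moduli, push them through the longitudinal rule of mixtures for a graphite/epoxy ply, and examine correlations between inputs and response, echoing the regression study. Nominal values and scatter are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
Vf = np.clip(rng.normal(0.60, 0.03, n), 0.0, 1.0)   # fiber volume fraction
Ef = rng.normal(230e9, 10e9, n)                     # fiber modulus [Pa]
Em = rng.normal(3.5e9, 0.3e9, n)                    # matrix modulus [Pa]

E11 = Vf * Ef + (1.0 - Vf) * Em                     # longitudinal rule of mixtures
print(f"E11 mean = {E11.mean() / 1e9:.1f} GPa, CoV = {E11.std() / E11.mean():.3f}")
for name, v in (("Vf", Vf), ("Ef", Ef), ("Em", Em)):
    print(name, "correlation with E11:", round(float(np.corrcoef(v, E11)[0, 1]), 3))
```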

  16. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  17. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  18. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  19. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
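
    A minimal Monte Carlo analogue of the benchmark: the CDF of the radial stress at mid-wall of a thick-walled cylinder under normally distributed internal pressure and Weibull-distributed yield stress, mirroring the input distributions named above. For brevity the stress uses the elastic Lame solution and a Tresca flag marks samples where plasticity would matter; the paper's analysis treats plasticity fully.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200_000
a_r, b_r, r = 0.05, 0.10, 0.075         # inner, outer, evaluation radius [m]

p = rng.normal(200e6, 30e6, n)          # internal pressure [Pa], normal
sy = 300e6 * rng.weibull(10.0, n)       # yield stress [Pa], Weibull

c = p * a_r**2 / (b_r**2 - a_r**2)
sig_r = c * (1 - b_r**2 / r**2)         # Lame radial stress (compressive)
sig_t = c * (1 + b_r**2 / r**2)         # Lame hoop stress
yielding = (sig_t - sig_r) > sy         # Tresca onset flag (plasticity omitted)

q = np.quantile(sig_r, [0.05, 0.50, 0.95])
print("radial stress CDF quantiles [MPa]:", np.round(q / 1e6, 1))
print("fraction of samples beyond the elastic range:", yielding.mean())
```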

  20. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  1. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure | surface) based on knowledge of P(surface,…

  2. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  3. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and in industry. At present, however, these analyses are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher-efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle

  4. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  5. 12 CFR 167.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Risk-based capital credit risk-weight categories. 167.6 Section 167.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 167.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  6. 12 CFR 167.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Risk-based capital credit risk-weight categories. 167.6 Section 167.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 167.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  7. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Risk-based capital credit risk-weight categories. 567.6 Section 567.6 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  8. 12 CFR 955.6 - Risk-based capital requirement for acquired member assets.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Title 12, Banks and Banking; Federal Housing Finance Board; Federal Home Loan Bank Assets and Off-Balance Sheet Items: Acquired Member Assets. § 955.6 Risk-based capital requirement for acquired member assets. (a) General. Each...

  9. 12 CFR Appendix A to Part 325 - Statement of Policy on Risk-Based Capital

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 12, Banks and Banking; Federal Deposit Insurance Corporation; Regulations and Statements of General Policy: Capital Maintenance. Appendix A to Part 325: Statement of Policy on Risk-Based Capital. Capital adequacy is one...

  10. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Stress Test

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 12, Banks and Banking. Appendix A to Subpart B of Part 652: Risk-Based Capital Stress Test. 1.0 Introduction. 2.0 Credit Risk. ... in the Stress Test. 3.0 Interest Rate Risk. 3.1 Process for Calculating the Interest Rate Movement...

  11. Probabilistic exposure risk assessment with advective-dispersive well vulnerability criteria

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, Rainer; Nowak, Wolfgang; Helmig, Rainer

    2012-02-01

    Time-related advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. According to current water safety plans, advanced risk management schemes are needed to better control and monitor all possible hazards within catchments. The goal of this work is to cast the four advective-dispersive intrinsic well vulnerability criteria by Frind et al. [1] into a probabilistic risk assessment framework. These criteria are: (i) arrival time, (ii) level of peak concentration, (iii) time until first arrival of critical concentrations, and (iv) exposure time. Our probabilistic framework yields catchment-wide maps of the probability of failing to comply with these criteria. This provides indispensable information for catchment managers to perform probabilistic exposure risk assessment and thus improves the basis for risk-informed well-head management. We resolve heterogeneity with high-resolution Monte Carlo simulations and use a new reverse formulation of temporal moment transport equations to keep computational costs low. Our method is independent of dimensionality and boundary conditions, and can account for arbitrary sources of uncertainty. It can be coupled with any method for conditioning on available data. For simplicity, we demonstrate the concept on a 2D example that includes conditioning on synthetic data.
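
    A minimal sketch of the Monte Carlo step, assuming synthetic arrival-time maps stand in for the reverse temporal-moment transport solves: it estimates, cell by cell, the probability of violating the arrival-time criterion (criterion i). All fields and thresholds below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_real, nx, ny = 500, 40, 30   # Monte Carlo realizations, grid dimensions
    t_crit = 50.0                  # days: minimum acceptable arrival time

    # Stand-in for the heterogeneous transport solves: each realization
    # yields a map of arrival times at the well from every grid cell.
    base = np.linspace(10.0, 200.0, nx)[None, :, None]
    arrival = base * rng.lognormal(mean=0.0, sigma=0.4, size=(n_real, nx, ny))

    # Per-cell probability of non-compliance, P[t_arrival < t_crit],
    # estimated as a Monte Carlo frequency over realizations.
    p_violate = (arrival < t_crit).mean(axis=0)    # shape (nx, ny)

    print("max non-compliance probability:", p_violate.max().round(3))
    print("cells above 5% probability    :", int((p_violate > 0.05).sum()))
    ```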

  12. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
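
    The sketch below illustrates the general maximum-entropy recipe the abstract describes, not the paper's actual feature functions or data: fit weights so that the model's feature expectations match those of observed trajectories, then sample an ensemble from the fitted distribution. The toy trajectory space and features are invented.

    ```python
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(2)

    # Toy trajectory space: sequences of 5 discrete heading changes in
    # {-1, 0, +1} (3^5 = 243 candidates).
    cand = np.array(list(product([-1, 0, 1], repeat=5)), dtype=float)

    def features(t):
        # two hypothetical feature functions: net turn and turn "energy"
        return np.stack([t.sum(axis=1), (t ** 2).sum(axis=1)], axis=1)

    F = features(cand)                                  # (243, 2)

    # "Observed" trajectories: mostly-straight candidates as stand-ins
    # for real radar tracks.
    straightest = np.argsort((cand ** 2).sum(axis=1))[:40]
    obs = cand[rng.choice(straightest, size=200)]
    f_emp = features(obs).mean(axis=0)                  # empirical means

    # Maximum-entropy model p(x) ~ exp(lam . f(x)); gradient ascent drives
    # model feature expectations toward the empirical ones.
    lam = np.zeros(2)
    for _ in range(2000):
        logits = F @ lam
        p = np.exp(logits - logits.max())
        p /= p.sum()
        lam += 0.05 * (f_emp - p @ F)

    # Sample a new ensemble reflecting the learned variability in intent.
    ensemble = cand[rng.choice(len(cand), size=5, p=p)]
    print("lambda =", lam)
    print("sampled trajectory:", ensemble[0])
    ```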

  13. How probabilistic risk assessment can mislead terrorism risk analysts.

    PubMed

    Brown, Gerald G; Cox, Louis Anthony Tony

    2011-02-01

    Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems-in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do. PMID:20846169

  14. Probabilistic Structural Health Monitoring of the Orbiter Wing Leading Edge

    NASA Technical Reports Server (NTRS)

    Yap, Keng C.; Macias, Jesus; Kaouk, Mohamed; Gafka, Tammy L.; Kerr, Justin H.

    2011-01-01

    A structural health monitoring (SHM) system can contribute to the risk management of a structure operating under hazardous conditions. An example is the Wing Leading Edge Impact Detection System (WLEIDS) that monitors the debris hazards to the Space Shuttle Orbiter's Reinforced Carbon-Carbon (RCC) panels. Since Return-to-Flight (RTF) after the Columbia accident, WLEIDS was developed and subsequently deployed on board the Orbiter to detect ascent and on-orbit debris impacts, so as to support the assessment of wing leading edge structural integrity prior to Orbiter re-entry. As SHM is inherently an inverse problem, the analyses involved, including those performed for WLEIDS, tend to be associated with significant uncertainty. The use of probabilistic approaches to handle the uncertainty has resulted in the successful implementation of many development and application milestones.

  15. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.
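
    To make the Bayesian-network idea concrete, here is a toy two-fault diagnosis computed by exact enumeration. The network structure and probabilities are illustrative stand-ins, not the ProDiagnose/ADAPT models (which are compiled to arithmetic circuits for speed).

    ```python
    from itertools import product

    # Toy two-fault, one-sensor network in the spirit of power-system
    # diagnosis. All numbers are invented for illustration.
    P_batt_fault = 0.01     # prior: battery failed
    P_sens_fault = 0.02     # prior: voltage sensor failed

    def p_low_reading(batt_fault, sens_fault):
        """P(sensor reports low voltage | fault states)."""
        if batt_fault and not sens_fault:
            return 0.95
        if sens_fault:
            return 0.50     # a failed sensor reads essentially at random
        return 0.01         # healthy system: rare false alarm

    # Exact inference by enumeration: P(battery fault | low reading).
    num = den = 0.0
    for b, s in product([True, False], repeat=2):
        pb = P_batt_fault if b else 1 - P_batt_fault
        ps = P_sens_fault if s else 1 - P_sens_fault
        joint = pb * ps * p_low_reading(b, s)
        den += joint
        if b:
            num += joint

    print(f"P(battery fault | low reading) = {num / den:.3f}")
    ```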

  16. Risk-Based Ranking Experiences for Cold War Legacy Facilities in the United States

    SciTech Connect

    Droppo, James G.

    2003-05-01

    Over the past two decades, a number of government agencies in the United States have faced increasing public scrutiny for their efforts to address the wide range of potential environmental issues related to Cold War legacies. Risk-based ranking was selected as a means of defining the relative importance of issues. Ambitious facility-wide risk-based ranking applications were undertaken. However, although facility-wide risk-based ranking efforts can build invaluable understanding of the potential issues related to Cold War legacies, conducting such efforts is difficult because of the potentially enormous scope and the potentially strong institutional barriers. The U.S. experience is that such efforts are worth undertaking to start building a knowledge base and infrastructure that are based on a thorough understanding of risk. In both the East and the West, the legacy of the Cold War includes a wide range of potential environmental issues associated with large industrial complexes of weapon production facilities. The responsible agencies or ministries are required to make decisions that could benefit greatly from information on the relative importance of these potential issues. Facility-wide risk-based ranking of potential health and environmental issues is one means to help these decision makers. The initial U.S. risk-based ranking applications described in this chapter were “ground-breaking” in that they defined new methodologies and approaches to meet the challenges. Many of these approaches fit the designation of a population-centred risk assessment. These U.S. activities parallel efforts that are just beginning for similar facilities in the countries of the former Soviet Union. As described below, conducting a facility-wide risk-based ranking has special challenges and potential pitfalls. Little guidance exists to conduct major risk-based rankings. For those considering undertaking such efforts, the material contained in this chapter should be useful

  17. A generic risk-based surveying method for invading plant pathogens.

    PubMed

    Parnell, S; Gottwald, T R; Riley, T; van den Bosch, F

    2014-06-01

    Invasive plant pathogens are increasing with international trade and travel, with damaging environmental and economic consequences. Recent examples include tree diseases such as sudden oak death in the Western United States and ash dieback in Europe. To control an invading pathogen it is crucial that newly infected sites are quickly detected so that measures can be implemented to control the epidemic. However, since sampling resources are often limited, not all locations can be inspected and locations must be prioritized for surveying. Existing approaches to achieve this are often species-specific and rely on detailed data collection and parameterization, which is difficult, especially when new arrivals are unanticipated. Consequently, regulatory sampling responses are often ad hoc and developed without due consideration of epidemiology, leading to the suboptimal deployment of expensive sampling resources. We introduce a flexible risk-based sampling method that is pathogen generic and enables available information to be utilized to develop epidemiologically informed sampling programs for virtually any biologically relevant plant pathogen. By targeting risk we aim to inform sampling schemes that identify high-impact locations that can be subsequently treated in order to reduce inoculum in the landscape. This "damage limitation" is often the initial management objective following the first discovery of a new invader. Risk at each location is determined by the product of the basic reproductive number (R0), as a measure of local epidemic size, and the probability of infection. We illustrate how the risk estimates can be used to prioritize a survey by weighting a random sample so that the highest-risk locations have the highest probability of selection. We demonstrate and test the method using a high-quality spatially and temporally resolved data set on Huanglongbing disease (HLB) in Florida, USA. We show that even when available epidemiological information is relatively
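
    A sketch of the risk-weighting step under stated assumptions: synthetic per-location R0 values and infection probabilities are combined into risk scores that weight a random sample, so that high-risk locations are most likely to be selected. The distributions and budget are invented, not the HLB data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical per-location inputs: local basic reproductive number R0
    # and probability of infection.
    n_loc = 1000
    R0    = rng.gamma(shape=2.0, scale=1.5, size=n_loc)
    p_inf = rng.beta(a=1.0, b=20.0, size=n_loc)

    risk = R0 * p_inf              # risk = local epidemic size x P(infected)
    weights = risk / risk.sum()

    # Weighted random sample of locations to survey: selection probability
    # proportional to risk, as the paper proposes.
    budget = 50
    survey = rng.choice(n_loc, size=budget, replace=False, p=weights)
    print("mean risk of sampled sites:", risk[survey].mean().round(3))
    print("mean risk of all sites    :", risk.mean().round(3))
    ```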

  18. EXAMPLE OF A RISK BASED DISPOSAL APPROVAL SOLIDIFICATION OF HANFORD SITE TRANSURANIC (TRU) WASTE

    SciTech Connect

    PRIGNANO AL

    2007-11-14

    The Hanford Site requested, and the U.S. Environmental Protection Agency (EPA) Region 10 approved, a Toxic Substances Control Act of 1976 (TSCA) risk-based disposal approval (RBDA) for solidifying approximately four cubic meters of waste from a specific area of the K East Basin: the North Loadout Pit (NLOP). The NLOP waste is a highly radioactive sludge that contained polychlorinated biphenyls (PCBs) regulated under TSCA. The prescribed disposal method for liquid PCB waste under TSCA regulations is either thermal treatment or decontamination. Due to the radioactive nature of the waste, however, neither thermal treatment nor decontamination was a viable option. As a result, the proposed treatment consisted of solidifying the material to comply with waste acceptance criteria at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico, or possibly the Environmental Restoration Disposal Facility at the Hanford Site, depending on the resulting transuranic (TRU) content of the stabilized waste. The RBDA evaluated environmental risks associated with potential airborne PCBs. In addition, the RBDA made use of waste management controls already in place at the treatment unit. The treatment unit, the T Plant Complex, is a Resource Conservation and Recovery Act of 1976 (RCRA)-permitted facility used for storing and treating radioactive waste. The EPA found that the proposed activities did not pose an unreasonable risk to human health or the environment. Treatment took place from October 26, 2005, to June 9, 2006, and 332 208-liter (55-gallon) containers of solidified waste were produced. All treated drums assayed to date are TRU and will be disposed at WIPP.

  19. Example of a Risk-Based Disposal Approval: Solidification of Hanford Site Transuranic Waste

    SciTech Connect

    Barnes, B.M.; Hyatt, J.E.; Martin, P.W.; Prignano, A.L.

    2008-07-01

    The Hanford Site requested, and the U.S. Environmental Protection Agency (EPA) Region 10 approved, a Toxic Substances Control Act of 1976 (TSCA) risk-based disposal approval (RBDA) for solidifying approximately four cubic meters of waste from a specific area of the K East Basin: the North Loadout Pit (NLOP). The NLOP waste is a highly radioactive sludge that contained polychlorinated biphenyls (PCBs) regulated under TSCA. The prescribed disposal method for liquid PCB waste under TSCA regulations is either thermal treatment or decontamination. Due to the radioactive nature of the waste, however, neither thermal treatment nor decontamination was a viable option. As a result, the proposed treatment consisted of solidifying the material to comply with waste acceptance criteria at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico, or possibly the Environmental Restoration Disposal Facility at the Hanford Site, depending on the resulting transuranic (TRU) content of the stabilized waste. The RBDA evaluated environmental risks associated with potential airborne PCBs. In addition, the RBDA made use of waste management controls already in place at the treatment unit. The treatment unit, the T Plant Complex, is a Resource Conservation and Recovery Act of 1976 (RCRA)-permitted facility used for storing and treating radioactive waste. The EPA found that the proposed activities did not pose an unreasonable risk to human health or the environment. Treatment took place from October 26, 2005, to June 9, 2006, and 332 208-liter (55-gallon) containers of solidified waste were produced. All treated drums assayed to date are TRU and will be disposed at WIPP. (authors)

  20. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and associated software are presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way while, at the same time, minimizing accident risks. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models, and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST), but it has been designed with the objective of being generic enough to allow extension to process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk, and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion, and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels, and the data are inherited or defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing, and active reports with drill-in capability.
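
    A deliberately simplified RBI-style calculation for a single tank floor: probability of failure grows with time under an uncertain corrosion rate, risk is PoF times CoF, and the next inspection is due when risk crosses a target. All rates, costs, and thresholds below are invented, not DNV or API model values.

    ```python
    import numpy as np

    thickness0 = 12.0      # mm, nominal floor plate
    t_min      = 6.0       # mm, minimum allowable thickness
    rate_mean, rate_sd = 0.15, 0.05   # mm/year corrosion, uncertain

    rng = np.random.default_rng(4)
    rates = rng.normal(rate_mean, rate_sd, 20_000).clip(min=0)
    years = np.arange(1, 41)

    # Probability of failure vs time: P[remaining thickness < t_min].
    pof = np.array([(thickness0 - rates * y < t_min).mean() for y in years])

    cof = 2.5e6            # consequence of failure (e.g. cleanup + outage)
    risk = pof * cof       # risk = PoF x CoF, per the methodology

    risk_target = 1e4      # acceptable risk level (assumed)
    next_inspection = years[np.argmax(risk > risk_target)]  # first crossing
    print(f"inspect before year {next_inspection}")
    ```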

  1. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributable to a reactive process, or due to a clonal disorder. This distinction is important, as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing, with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, and complications of thrombocytosis, with a special focus on thrombotic risk, as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  2. Probabilistic Safety Assessment of Tehran Research Reactor

    SciTech Connect

    Hosseini, Seyed Mohammad Hadi; Nematollahi, Mohammad Reza; Sepanloo, Kamran

    2004-07-01

    Probabilistic Safety Assessment (PSA) application is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this paper the application of Probabilistic Safety Assessment to the Tehran Research Reactor (TRR) is presented. The level 1 PSA application involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability analysis, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from the selected results. Quantification of the constructed models is done using the SAPHIRE software. This study shows that the obtained core damage frequency for the Tehran Research Reactor (8.368E-6 per year) comfortably meets the IAEA criterion for existing nuclear power plants (1E-4). Nevertheless, safety improvement suggestions are offered to reduce the frequency of the most probable accidents. (authors)
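
    A minimal event-tree quantification in the spirit of a level-1 PSA: core damage frequency is the sum over accident sequences of the initiator frequency times the branch failure probabilities. The initiator and branch values below are illustrative only, not TRR data.

    ```python
    # Illustrative single-initiator event tree (invented numbers).
    init_freq = 0.01          # loss-of-flow initiator, events per year

    P_scram_fails   = 1e-4    # protection system fails on demand
    P_cooling_fails = 1e-3    # emergency cooling fails given scram success

    # Core-damage sequences for this initiator:
    seq1 = init_freq * (1 - P_scram_fails) * P_cooling_fails  # cooling fails
    seq2 = init_freq * P_scram_fails                          # scram fails

    cdf = seq1 + seq2
    print(f"core damage frequency contribution = {cdf:.2e} per year")
    ```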

  3. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.

  4. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computation tool for characterizing, by statistical description, the probabilistic structural response due to stochastic environments. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is a fast probability integration module by which a cumulative distribution function or a probability density function is calculated.

  5. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled with finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  6. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  7. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into monetary loss.

  8. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  9. Significance testing as perverse probabilistic reasoning

    PubMed Central

    2011-01-01

    Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference. PMID:21356064
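
    The probabilistic logic at issue is the gap between P(data | hypothesis) and P(hypothesis | data). The standard diagnostic-test illustration below, with invented numbers, shows how a test with a low false-positive rate can still yield mostly false positives when the base rate is low; the same base-rate reasoning applies to interpreting "significant" results.

    ```python
    # Why P(result | hypothesis) is not P(hypothesis | result):
    # the classic base-rate effect, computed with Bayes' rule.
    prevalence  = 0.01    # P(disease) in the tested population
    sensitivity = 0.95    # P(test positive | disease)
    specificity = 0.95    # P(test negative | no disease)

    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_pos

    # Despite a mere 5% false-positive rate, most positives are false:
    print(f"P(disease | positive test) = {ppv:.2f}")   # about 0.16
    ```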

  10. Generalized probabilistic scale space for image restoration.

    PubMed

    Wong, Alexander; Mishra, Akshaya K

    2010-10-01

    A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation. PMID:20421184

  11. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  12. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled with finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  13. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  14. A hydrometeorological approach for probabilistic flood forecast

    NASA Astrophysics Data System (ADS)

    Siccardi, F.; Boni, G.; Ferraris, L.; Rudari, R.

    2005-03-01

    We propose a new methodology for evaluating predictive cumulative distribution functions (CDFs) of ground effects for flood forecasting in mountainous environments. The methodology is based on the proper nesting of models suitable for probabilistic meteorological forecasting, downscaling of rainfall, and hydrological modeling, in order to provide a probabilistic prediction of the ground effects of heavy rainfall events. Different ways of nesting are defined as a function of the ratios among three typical scales: the scales at which rainfall processes are satisfactorily represented by meteorological models, the scales of the hydrological processes, and the scales of the social response. Two examples of the application of the methodology at different hydrological scales are presented. Predictive CDFs are evaluated, and the motivations that lead to different paths for CDF derivation are highlighted.

  15. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.
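
    The Sagan-Coleman bookkeeping can be sketched as a risk budget: keep the program-level probability of contaminating the planet under an agreed ceiling, and allocate allowable per-mission contamination probabilities from it (for small probabilities, the total is approximately the sum of the per-mission values). The ceiling, mission counts, and budget split below are illustrative assumptions, not historical quarantine figures.

    ```python
    # Allocate per-mission contamination allowances from a program ceiling.
    P_total_limit = 1e-3          # assumed ceiling on P(contamination)

    missions = {"orbiter": 4, "lander": 2}      # planned mission counts
    share    = {"orbiter": 0.2, "lander": 0.8}  # assumed risk-budget split

    for kind, n in missions.items():
        p_allowed = share[kind] * P_total_limit / n
        print(f"{kind}: per-mission P(contamination) <= {p_allowed:.1e}")
    ```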

  16. Probabilistic methods for robot motion determination

    NASA Technical Reports Server (NTRS)

    Balaram, J.

    1989-01-01

    The outline of a research effort in probabilistic approaches to robot motion determination is presented. A heuristic path planner that has met with considerable success in the JPL telerobot testbed is presented together with an interpretation of its performance. The relevance of techniques from stochastic geometry and stochastic diffusion is addressed, as is the possibility of using sensor data directly in the motion determination process.

  17. Fibre tracking: probabilistic approach and preliminary results

    NASA Astrophysics Data System (ADS)

    Torresin, A.; Moscato, A.; Minella, M.; Cardinale, F.; Minati, L.; Aquino, D.

    2009-01-01

    The aim of this work is to gain preliminary experience with probabilistic tractography. We performed fibre reconstruction for three tracts of interest with data obtained from two MR imaging units equipped with different gradient systems. An acquisition protocol optimization was necessary in order to obtain a good trade-off between image quality and data collection time. Possible solutions to acquisition and processing problems are discussed. Future developments and possible applications in neurosurgery are also suggested.

  18. Probabilistic Synthesis of Personal-Style Handwriting

    NASA Astrophysics Data System (ADS)

    Choi, Hyunil; Kim, Jin Hyung

    The goal of personal-style handwriting synthesis is to produce texts in the same style as an individual writer by analyzing the writer's samples of handwriting. The difficulty of handwriting synthesis is that the output should have the characteristics of the person's handwriting as well as looking natural, based on a limited number of available examples. We develop a synthesis algorithm which produces handwriting that exhibits naturalness based on the probabilistic character model.

  19. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.

  20. Transitions in a probabilistic interface growth model

    NASA Astrophysics Data System (ADS)

    Alves, S. G.; Moreira, J. G.

    2011-04-01

    We study a generalization of the Wolf-Villain (WV) interface growth model based on a probabilistic growth rule. In the WV model, particles are randomly deposited onto a substrate and subsequently move to a nearby position where the binding is strongest. We introduce a growth probability proportional to a power ν of the number n_i of bindings at site i: p_i ∝ n_i^ν.

  1. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
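
    A sketch of the sampling idea under stated assumptions; this is not the pfff tool's actual format or parameters. It hashes a fixed number of short chunks at seeded pseudo-random offsets, plus the file length, so the cost is flat in file size rather than proportional to it.

    ```python
    import hashlib
    import os
    import random

    def sampled_fingerprint(path, n_samples=1024, chunk=64, seed=0):
        """Hash n_samples short chunks at reproducible random offsets
        instead of reading the whole file (illustrative sketch only)."""
        size = os.path.getsize(path)
        rng = random.Random(seed)          # fixed seed: offsets reproducible
        h = hashlib.sha256()
        h.update(size.to_bytes(8, "little"))   # include length in the digest
        with open(path, "rb") as f:
            for _ in range(n_samples):
                off = rng.randrange(max(size - chunk, 1))
                f.seek(off)
                h.update(f.read(chunk))
        return h.hexdigest()

    # print(sampled_fingerprint("large_dataset.bin"))  # hypothetical file
    ```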

  2. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim, and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  3. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts. PMID:26130236

  4. Probabilistic Slow Features for Behavior Analysis.

    PubMed

    Zafeiriou, Lazaros; Nicolaou, Mihalis A; Zafeiriou, Stefanos; Nikitidis, Symeon; Pantic, Maja

    2016-05-01

    A recently introduced latent feature learning technique for time-varying dynamic phenomena analysis is the so-called slow feature analysis (SFA). SFA is a deterministic component analysis technique for multidimensional sequences that, by minimizing the variance of the first-order time derivative approximation of the latent variables, finds uncorrelated projections that extract slowly varying features ordered by their temporal consistency and constancy. In this paper, we propose a number of extensions in both the deterministic and the probabilistic SFA optimization frameworks. In particular, we derive a novel deterministic SFA algorithm that is able to identify linear projections that extract the common slowest varying features of two or more sequences. In addition, we propose an expectation maximization (EM) algorithm to perform inference in a probabilistic formulation of SFA and similarly extend it in order to handle two and more time-varying data sequences. Moreover, we demonstrate that the probabilistic SFA (EM-SFA) algorithm that discovers the common slowest varying latent space of multiple sequences can be combined with dynamic time warping techniques for robust sequence time-alignment. The proposed SFA algorithms were applied for facial behavior analysis, demonstrating their usefulness and appropriateness for this task. PMID:26068878

  5. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. PMID:26215051

  6. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition under which probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  7. Probabilistic drought classification using gamma mixture models

    NASA Astrophysics Data System (ADS)

    Mallya, Ganeshchandra; Tripathi, Shivam; Govindaraju, Rao S.

    2015-07-01

    Drought severity is commonly reported using drought classes obtained by assigning pre-defined thresholds on drought indices. Current drought classification methods ignore modeling uncertainties and provide discrete drought classification. However, the users of drought classification are often interested in knowing the inherent uncertainties in classification so that they can make informed decisions. Recent studies have used hidden Markov models (HMM) for quantifying uncertainties in drought classification. The HMM method conceptualizes drought classes as distinct hydrological states that are not observed (hidden) but affect observed hydrological variables. The number of drought classes or hidden states in the model is pre-specified, which can sometimes result in a model over-specification problem. This study proposes an alternate method for probabilistic drought classification where the number of states in the model is determined by the data. The proposed method adapts the Standardized Precipitation Index (SPI) methodology of drought classification by employing a gamma mixture model (Gamma-MM) in a Bayesian framework. The method alleviates the problem of choosing a suitable distribution for fitting data in SPI analysis, quantifies modeling uncertainties, and propagates them for probabilistic drought classification. The method is tested on rainfall data over India. Comparison of the results with standard SPI shows important differences, particularly when SPI assumptions on data distribution are violated. Further, the new method is simpler and more parsimonious than the HMM-based drought classification method and can be a viable alternative for probabilistic drought classification.
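
    To illustrate probabilistic (rather than hard) classification, the sketch below assumes an already-fitted two-component gamma mixture and reports posterior class probabilities for an observation. The weights and gamma parameters are hypothetical, not values fitted to the Indian rainfall data.

    ```python
    import numpy as np
    from scipy.stats import gamma

    # Assumed, pre-fitted two-component gamma mixture over a rainfall index.
    weights = np.array([0.3, 0.7])    # mixture weights: dry, wet
    shapes  = np.array([1.5, 6.0])    # gamma shape parameters
    scales  = np.array([20.0, 25.0])  # gamma scale parameters (mm)

    def class_probs(rain_mm):
        """Posterior P(class | observation) via Bayes' rule on the mixture."""
        like = weights * gamma.pdf(rain_mm, a=shapes, scale=scales)
        return like / like.sum()

    for x in (15.0, 60.0, 150.0):
        p_dry, p_wet = class_probs(x)
        print(f"rain={x:5.0f} mm  P(dry)={p_dry:.2f}  P(wet)={p_wet:.2f}")
    ```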

  8. Probabilistic functional tractography of the human cortex.

    PubMed

    David, Olivier; Job, Anne-Sophie; De Palma, Luca; Hoffmann, Dominique; Minotti, Lorella; Kahane, Philippe

    2013-10-15

    Single-pulse direct electrical stimulation of cortical regions in patients suffering from focal drug-resistant epilepsy who are explored using intracranial electrodes induces cortico-cortical potentials that can be used to infer functional and anatomical connectivity. Here, we describe a neuroimaging framework that allows development of a new probabilistic atlas of functional tractography of the human cortex from those responses. This atlas is unique because it allows inference in vivo of the directionality and latency of cortico-cortical connectivity, which are still largely unknown at the human brain level. In this technical note, we include 1535 stimulation runs performed in 35 adult patients. We use a case of frontal lobe epilepsy to illustrate the asymmetrical connectivity between the posterior hippocampal gyrus and the orbitofrontal cortex. In addition, as a proof of concept for group studies, we study the probabilistic functional tractography between the posterior superior temporal gyrus and the inferior frontal gyrus. In the near future, the atlas database will be continuously increased, and the methods will be improved in parallel, for more accurate estimation of features of interest. Generated probabilistic maps will be freely distributed to the community because they provide critical information for further understanding and modelling of large-scale brain networks. PMID:23707583

  9. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  10. Policy-driven development of cost-effective, risk-based surveillance strategies.

    PubMed

    Reist, M; Jemmi, T; Stärk, K D C

    2012-07-01

    Animal health and residue surveillance verifies the good health status of the animal population, thereby supporting international free trade of animals and animal products. However, active surveillance is costly and time-consuming. The development of cost-effective tools for animal health and food hazard surveillance is therefore a priority for decision-makers in the field of veterinary public health. The assumption of this paper is that outcome-based formulation of standards, legislation leaving room for risk-based approaches and close collaboration and a mutual understanding and exchange between scientists and policy makers are essential for cost-effective surveillance. We illustrate this using the following examples: (i) a risk-based sample size calculation for surveys to substantiate freedom from diseases/infection, (ii) a cost-effective national surveillance system for Bluetongue using scenario tree modelling and (iii) a framework for risk-based residue monitoring. Surveys to substantiate freedom from infectious bovine rhinotracheitis and enzootic bovine leucosis between 2002 and 2009 saved over 6 million € by applying a risk-based sample size calculation approach, and by taking into account prior information from repeated surveys. An open, progressive policy making process stimulates research and science to develop risk-based and cost-efficient survey methodologies. Early involvement of policy makers in scientific developments facilitates implementation of new findings and full exploitation of benefits for producers and consumers. PMID:22265642
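
    Example (i) can be made concrete with the standard freedom-from-infection sample-size formula: the smallest n such that, at the design prevalence, all n samples testing negative would be improbable if infection were present. The design prevalences and sensitivity below are illustrative assumptions; a risk-based design raises the design prevalence in high-risk strata, cutting the sample needed there.

    ```python
    import math

    def freedom_sample_size(design_prev, sensitivity=1.0, alpha=0.05):
        """Smallest n with P(all n samples negative | prevalence >=
        design_prev) <= alpha (large-population approximation)."""
        p_detect_one = design_prev * sensitivity
        return math.ceil(math.log(alpha) / math.log(1.0 - p_detect_one))

    # Substantiate freedom at 0.2% design prevalence, 95% confidence:
    print(freedom_sample_size(0.002))       # about 1497 animals
    # Risk-based variant: a high-risk stratum at 5x the design prevalence
    # needs far fewer samples to reach the same confidence.
    print(freedom_sample_size(0.002 * 5))   # about 299 animals
    ```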

  11. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. PMID:26033352
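
    The portfolio analogy in miniature: closed-form minimum-variance weights versus the equal-allocation heuristic, on an invented covariance matrix for three producers. With a known covariance the optimizer wins by construction; the paper's point is that with noisily estimated covariances the "optimal" weights can underperform the simple heuristic out of sample.

    ```python
    import numpy as np

    # Invented covariance of lot-level risk outcomes for three producers.
    cov = np.array([[0.040, 0.006, 0.002],
                    [0.006, 0.090, 0.010],
                    [0.002, 0.010, 0.160]])

    ones = np.ones(3)
    w_mv = np.linalg.solve(cov, ones)
    w_mv /= w_mv.sum()        # closed-form minimum-variance weights
    w_eq = ones / 3           # the simple equal-allocation heuristic

    for name, w in (("min-variance", w_mv), ("equal", w_eq)):
        print(f"{name:13s} weights={np.round(w, 3)} "
              f"portfolio var={w @ cov @ w:.4f}")
    ```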

  12. Probabilistic structural analysis methods for critical SSME propulsion components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. Progress in the development of generic probabilistic models for various individual loads, which consist of a steady-state load, a periodic load, a random load, and a spike, is discussed. The capabilities of the Numerical Evaluation of Stochastic Structures Under Stress finite element code, designed for probabilistic structural analysis of the SSME, are examined. Variational principles for formulating probabilistic finite elements, and a structural analysis for evaluating the effects of geometric and material property tolerances on the structural response of turbopump blades, are being designed.

  13. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  14. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  15. A risk-based approach to cost-benefit analysis of software safety activities

    SciTech Connect

    Fortier, S.C.; Michael, J.B.

    1993-01-01

    Assumptions about the economics of making a system safe are usually not explicitly stated in industrial and software models of safety-critical systems. These assumptions span a wide spectrum of economic tradeoffs with respect to resources expended to make a system safe. The missing component in these models that is necessary for capturing the effect of economic tradeoffs is risk. A qualitative risk-based software safety model is proposed that combines features of industrial and software systems safety models. The risk-based model provides decision makers with a basis for performing cost-benefit analyses of software safety-related activities.
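
    The economic tradeoff the model captures reduces, in its simplest form, to expected-risk arithmetic. A hedged sketch with invented probabilities, consequences, and activity costs:

```python
# Risk-based cost-benefit screening of candidate software safety activities.
# risk = P(hazard) * consequence cost; benefit = risk removed. All numbers invented.
activities = [
    # (name, cost, P(hazard) before, P(hazard) after, consequence cost)
    ("formal code review",       40_000, 1e-3, 4e-4, 50_000_000),
    ("fault-tree analysis",      25_000, 1e-3, 7e-4, 50_000_000),
    ("added runtime assertions", 10_000, 1e-3, 9e-4, 50_000_000),
]

for name, cost, p_before, p_after, consequence in activities:
    risk_reduction = (p_before - p_after) * consequence  # expected loss avoided
    ratio = risk_reduction / cost
    print(f"{name:26s} benefit ${risk_reduction:>9,.0f}  cost ${cost:>7,}  B/C {ratio:5.1f}")
```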

  16. Developing and evaluating distributions for probabilistic human exposure assessments

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.

    2002-08-01

    This report describes research carried out at the Lawrence Berkeley National Laboratory (LBNL) to assist the U.S. Environmental Protection Agency (EPA) in developing a consistent yet flexible approach for evaluating the inputs to probabilistic risk assessments. The U.S. EPA Office of Emergency and Remedial Response (OERR) recently released Volume 3 Part A of Risk Assessment Guidance for Superfund (RAGS), as an update to the existing two-volume set of RAGS. The update provides policy and technical guidance on performing probabilistic risk assessment (PRA). Consequently, EPA risk managers and decision-makers need to review and evaluate the adequacy of PRAs for supporting regulatory decisions. A critical part of evaluating a PRA is the problem of judging the adequacy of the input distributions used in the PRA. Although the overarching theme of this report is the need to improve the ease and consistency of the regulatory review process, the specific objectives are presented in two parts. The objective of Part 1 is to develop a consistent yet flexible process for evaluating distributions in a PRA by identifying the critical attributes of an exposure factor distribution and discussing how these attributes relate to the task-specific adequacy of the input. This objective is carried out with emphasis on the perspective of a risk manager or decision-maker. The proposed evaluation procedure provides consistency to the review process without a loss of flexibility. As a result, the approach described in Part 1 provides an opportunity to apply a single review framework for all EPA regions and yet provide the regional risk manager with the flexibility to deal with site- and case-specific issues in the PRA process. However, as the number of inputs to a PRA increases, so does the complexity of the process for calculating, communicating and managing risk. As a result, there is increasing effort required of both the risk professionals performing the analysis and the risk manager
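
    As one concrete example of such an attribute check, a reviewer might test a proposed exposure-factor distribution against the underlying survey data. The sketch below uses synthetic data and an assumed lognormal candidate; it is not the RAGS procedure itself:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic "survey" data for an exposure factor, e.g. daily water intake (L/day).
data = rng.lognormal(mean=0.5, sigma=0.4, size=200)

# Fit a candidate lognormal distribution and test its adequacy.
shape, loc, scale = stats.lognorm.fit(data, floc=0)
# Note: using parameters fitted from the same data makes the KS test approximate.
ks_stat, p_value = stats.kstest(data, "lognorm", args=(shape, loc, scale))

print(f"fitted median: {scale:.2f} L/day, GSD: {np.exp(shape):.2f}")
print(f"KS statistic: {ks_stat:.3f}, p-value: {p_value:.3f}")
# Reviewers would also check tails and representativeness, not just one statistic.
print(f"95th percentile (fitted): {stats.lognorm.ppf(0.95, shape, loc, scale):.2f} L/day")
```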

  17. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in large-scale, regional maps. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear-wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
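
    The rigid sliding-block calculation at the core of the method can be sketched compactly: the block accumulates displacement whenever ground acceleration exceeds its yield acceleration. The ground motion and yield value below are invented placeholders for the real hazard-curve and soil-strength inputs:

```python
import numpy as np

def newmark_displacement(accel, dt, a_yield):
    """Rigid sliding-block (Newmark) displacement for one acceleration history [m/s^2]."""
    vel, disp = 0.0, 0.0
    for a in accel:
        if vel > 0.0 or a > a_yield:     # block is sliding, or starts to slide
            vel += (a - a_yield) * dt    # relative acceleration drives sliding
            vel = max(vel, 0.0)          # sliding stops; block cannot move upslope
            disp += vel * dt
    return disp

# Synthetic pulse-like ground motion (placeholder for a record scaled to a PGA bin).
dt = 0.01
t = np.arange(0.0, 10.0, dt)
accel = 3.0 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)  # m/s^2

d = newmark_displacement(accel, dt, a_yield=0.8)
print(f"permanent displacement: {d:.3f} m")
# Mapping use: repeat per PGA bin, weight by each bin's annual probability from the
# hazard curve, and report P(D > threshold) per pixel.
```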

  18. Probabilistic alternatives to Bayesianism: the case of explanationism

    PubMed Central

    Douven, Igor; Schupbach, Jonah N.

    2015-01-01

    There has been a probabilistic turn in contemporary cognitive science. Far and away, most of the work in this vein is Bayesian, at least in name. Coinciding with this development, philosophers have increasingly promoted Bayesianism as the best normative account of how humans ought to reason. In this paper, we make a push for exploring the probabilistic terrain outside of Bayesianism. Non-Bayesian, but still probabilistic, theories provide plausible competitors both to descriptive and normative Bayesian accounts. We argue for this general idea via recent work on explanationist models of updating, which are fundamentally probabilistic but assign a substantial, non-Bayesian role to explanatory considerations. PMID:25964769
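
    One way such explanationist updating rules have been formalized in this literature is as Bayesian conditionalization plus a small bonus awarded to the best-explaining hypothesis before renormalization; the sketch below assumes that formulation, an arbitrary bonus value, and likelihood as a proxy for explanatory goodness:

```python
import numpy as np

def bayes_update(prior, likelihood):
    post = prior * likelihood
    return post / post.sum()

def explanationist_update(prior, likelihood, bonus=0.1):
    """Bayes plus a bonus to the best-explaining hypothesis (proxied by likelihood)."""
    post = prior * likelihood
    post[np.argmax(likelihood)] += bonus  # reward for explanatory goodness
    return post / post.sum()

# Two hypotheses about a coin: fair vs. heads-biased; evidence: a head is observed.
prior = np.array([0.5, 0.5])
likelihood = np.array([0.5, 0.8])         # P(heads | hypothesis)

print("Bayesian:       ", bayes_update(prior, likelihood))
print("Explanationist: ", explanationist_update(prior, likelihood))
```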

  19. Applications of probabilistic peak-shaving technique in generation planning

    SciTech Connect

    Malik, A.S.; Cory, B.J.; Wijayatunga, P.D.C.

    1999-11-01

    This paper presents two novel applications of the probabilistic peak-shaving technique in generation planning: to simulate, efficiently and accurately, multiple limited-energy units probabilistically in the equivalent load duration curve method, and to simulate efficiently the candidate plants whose different configurations are tested to find the least-cost generation expansion plan. The applications of the technique are demonstrated with the help of two hand-calculation examples. An efficient algorithm is also presented to simulate multiple limited-energy units probabilistically, for different hydrological conditions, in a generation mix of hydro-thermal units within a probabilistic production costing framework.
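
    The equivalent load duration curve (ELDC) framework these applications build on can be sketched with the classic Baleriaux/Booth convolution step: each unit's forced-outage rate is folded into the curve seen by the units dispatched after it. The curve shape and unit data below are invented:

```python
import numpy as np

step = 10                                    # MW per grid point
loads = np.arange(0, 1201, step)
F = np.clip(1.2 - loads / 800.0, 0.0, 1.0)   # illustrative LDC: F(x) = P(load >= x)

T = 8760.0                                   # hours in the study period
units = [(300, 0.05), (200, 0.08)]           # (capacity MW, forced-outage rate), merit order
F_eq = F.copy()
x_load = 0                                   # loading point (grid index) of the next unit

for cap, q in units:
    n = cap // step
    p = 1.0 - q                              # availability
    # Expected energy served: availability * area under the equivalent curve it shaves.
    energy = p * T * F_eq[x_load:x_load + n].sum() * step  # MWh
    print(f"unit {cap} MW: expected energy {energy / 1e3:.1f} GWh")
    # Convolve this unit's outages into the curve seen by later units:
    # F_eq(x) <- p * F_eq(x - C) + q * F_eq(x), with F_eq = 1 below x = 0.
    shifted = np.concatenate([np.ones(n), F_eq[:-n]])
    F_eq = p * shifted + q * F_eq
    x_load += n

lolp = F_eq[x_load]                          # P(load exceeds total available capacity)
print(f"LOLP after dispatching both units: {lolp:.4f}")
```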

  1. Nuclear power and probabilistic safety assessment (PSA): past through future applications

    NASA Astrophysics Data System (ADS)

    Stamatelatos, M. G.; Moieni, P.; Everline, C. J.

    1995-03-01

    Nuclear power reactor safety in the United States is about to enter a new era -- an era of risk-based management and risk-based regulation. First, there was the age of 'prescribed safety assessment,' during which a series of design-basis accidents in eight categories of severity, or classes, were postulated and analyzed. Toward the end of that era, it was recognized that 'Class 9,' or 'beyond design basis,' accidents would need special attention because of the potentially severe health and financial consequences of these accidents. The accident at Three Mile Island showed that sequences of low-consequence, high-frequency events and human errors can be much more risk dominant than the Class 9 accidents. A different form of safety assessment, PSA, emerged and began to gain ground against the deterministic safety establishment. Eventually, this led to the current regulatory requirements for individual plant examinations (IPEs). The IPEs can serve as a basis for risk-based regulation and management, a concept that may ultimately transform the U.S. regulatory process from its traditional deterministic foundations to a process predicated upon PSA. Beyond the possibility of a regulatory environment predicated upon PSA lies the possibility of using PSA as the foundation for managing daily nuclear power plant operations.

  2. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
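
    For a flavor of the fast probability integration idea next to brute force: with a linear limit state and independent normal variables, the first-order reliability estimate is exact and can be checked against Monte Carlo. The strength and stress numbers below are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Limit state g = R - S (strength minus stress); failure when g < 0. Values invented.
mu_R, sig_R = 500.0, 40.0
mu_S, sig_S = 350.0, 50.0

# First-order reliability: for linear g with independent normals, beta is exact.
beta = (mu_R - mu_S) / np.hypot(sig_R, sig_S)
pf_form = stats.norm.cdf(-beta)

# Monte Carlo check.
n = 1_000_000
g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
pf_mc = (g < 0).mean()

print(f"reliability index beta = {beta:.2f}")
print(f"P(failure): first-order {pf_form:.2e}, Monte Carlo {pf_mc:.2e}")
```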

  3. 77 FR 61446 - Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... COMMISSION Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors..., ``Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors.'' DATES: Submit comments by... No. ML081430087) concerning the review of probabilistic risk assessment (PRA) information and...

  4. Probabilistic design analysis using Composite Loads Spectra (CLS) coupled with Probabilistic Structural Analysis Methodologies (PSAM)

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H.

    1989-01-01

    Composite loads spectra (CLS) were applied to generate probabilistic loads for use in the PSAM Nonlinear Evaluation of Stochastic Structures Under Stress (NESSUS) finite element code. The CLS approach allows for quantifying loads as mean values and distributions around a central value rather than the maximum or enveloped values typically used in deterministic analysis. NESSUS uses these loads to determine mean and perturbation responses. These results are probabilistically evaluated with the distributional information from CLS using a fast probabilistic integration (FPI) technique to define response distributions. The main example discussed describes a method of obtaining load descriptions and the stress response of the second-stage turbine blade of the Space Shuttle Main Engine (SSME) high-pressure fuel turbopump (HPFTP). Additional information is presented on the ongoing analysis of the high-pressure oxidizer turbopump (HPOTP) discharge duct, where probabilistic dynamic loads have been generated and are in the process of being used for dynamic analysis. Example comparisons of load analysis and engine data are furnished for partial verification and/or justification of the methodology.

  5. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. To develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments over Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments using global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in surface air temperature across the MRI-AGCM physics ensemble was within one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation across the CMIP3 models. Models that show similar reproducibility of the present climate show different future climate changes, and we could not find clear relationships between the present climate and future climate change for temperature and precipitation. We develop a new method to produce probabilistic climate change scenario information by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble climate change scenario experiments. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
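
    The Krishnamurti-style weighting amounts to regressing observations on the ensemble members over a training period and applying the fitted weights to new forecasts. A minimal sketch with synthetic data (all values invented):

```python
import numpy as np

rng = np.random.default_rng(4)
n_train, n_models = 120, 5                  # months of training data, ensemble size
truth = rng.normal(15.0, 2.0, n_train)      # synthetic observed temperature (deg C)
# Each model = truth plus its own bias and noise (illustrative skill differences).
bias = rng.uniform(-2.0, 2.0, n_models)
noise = rng.uniform(0.5, 3.0, n_models)
models = truth[:, None] + bias + noise * rng.standard_normal((n_train, n_models))

# Multiple linear regression of observations on members (with intercept).
X = np.column_stack([np.ones(n_train), models])
coef, *_ = np.linalg.lstsq(X, truth, rcond=None)
print("intercept and member weights:", np.round(coef, 3))

# Apply the trained weights to a new (out-of-sample) ensemble forecast.
new_members = 16.0 + bias + noise * rng.standard_normal(n_models)
weighted = coef[0] + new_members @ coef[1:]
print(f"equal-weight mean: {new_members.mean():.2f}, regression-weighted: {weighted:.2f}")
```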

  6. Probabilistic remote state preparation by W states

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Ming; Wang, Yu-Zhu

    2004-02-01

    In this paper we consider a scheme for probabilistic remote state preparation of a general qubit by using W states. The scheme consists of the sender, Alice, and two remote receivers, Bob and Carol. Alice performs a projective measurement on her qubit in the basis spanned by the state she wants to prepare and its orthocomplement. This allows either Bob or Carol to reconstruct the state with finite success probability. It is shown that for some special ensembles of qubits, the remote state preparation scheme requires only two classical bits, unlike the case in the scheme of quantum teleportation, where three classical bits are needed.

  7. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Moreover, we will show that statistical
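
    The time-lagged approach simply pools the newest ensemble run with earlier runs still valid at the same time, enlarging the ensemble at no extra computational cost. A schematic sketch with invented members:

```python
import numpy as np

rng = np.random.default_rng(5)
n_members = 20
# Synthetic precipitation members (mm) from three successive EPS starts, all valid
# at the same forecast time; distributions and values are invented for illustration.
run_latest = rng.gamma(shape=1.2, scale=3.0, size=n_members)
run_minus3h = rng.gamma(shape=1.2, scale=3.5, size=n_members)
run_minus6h = rng.gamma(shape=1.2, scale=4.0, size=n_members)

lagged = np.concatenate([run_latest, run_minus3h, run_minus6h])  # 60-member ensemble

threshold = 5.0  # mm
for name, ens in [("latest 20", run_latest), ("time-lagged 60", lagged)]:
    p = (ens > threshold).mean()
    print(f"{name:15s} P(precip > {threshold} mm) = {p:.2f}, spread = {ens.std():.2f}")
```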

  8. Automatic probabilistic knowledge acquisition from data

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1986-01-01

    A computer program for extracting significant correlations of attributes from masses of data is outlined. This information can then be used to develop a knowledge base for a probabilistic expert system. The method determines the best estimate of joint probabilities of attributes from data put into contingency table form. A major output from the program is a general formula for calculating any probability relation associated with the data. These probability relations can be utilized to form IF-THEN rules with associated probability, useful for expert systems.
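
    The contingency-table step is straightforward to sketch: normalize counts into joint probabilities, then read conditionals off the joint table as rule strengths. The table below is invented:

```python
import numpy as np

# Invented 2x2 contingency table of counts: rows = "engine hot" (no/yes),
# columns = "sensor alarm" (no/yes).
counts = np.array([[70.0, 10.0],
                   [5.0, 15.0]])

joint = counts / counts.sum()            # best estimate of the joint probabilities
p_hot = joint[1].sum()                   # marginal P(hot)
p_alarm_given_hot = joint[1, 1] / p_hot  # conditional read off the joint table

print(f"P(hot) = {p_hot:.2f}")
print(f"IF engine hot THEN sensor alarm  (p = {p_alarm_given_hot:.2f})")
# Any probability relation over these attributes follows from `joint` the same way.
```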

  9. Bayesian Probabilistic Projection of International Migration.

    PubMed

    Azose, Jonathan J; Raftery, Adrian E

    2015-10-01

    We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model. PMID:26358699
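
    One simple way to impose the zero-global-net-migration constraint on sampled trajectories is to subtract the population-weighted global mean rate from every country's rate in each sample; whether this matches the authors' exact adjustment is an assumption. A sketch with invented inputs:

```python
import numpy as np

rng = np.random.default_rng(6)
n_samples, n_countries = 1000, 4
pop = np.array([50e6, 120e6, 8e6, 30e6])  # invented populations

# Sampled net migration rates (per capita) for one future period.
rates = rng.normal(loc=[0.002, -0.001, 0.004, 0.0], scale=0.002,
                   size=(n_samples, n_countries))

# Enforce zero global net migration: population-weighted rates must sum to zero.
w = pop / pop.sum()
adjustment = rates @ w                   # global net rate in each sample
rates_adj = rates - adjustment[:, None]  # subtract it from every country

net_counts = (rates_adj * pop).sum(axis=1)
print(f"max |global net migrants| after adjustment: {np.abs(net_counts).max():.2e}")
```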

  10. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. PMID:24209920
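
    The role of the gain parameter is visible in a toy survival model: if the hazard is a gain g times a dose-dependent risk integrand r(t), then P(DCS) = 1 - exp(-g ∫ r dt), and g follows in closed form from an observed incidence rather than from iterative fitting. The integrand shape and incidence below are invented:

```python
import numpy as np

# Toy hazard: h(t) = g * r(t), with r(t) an invented supersaturation-like integrand.
t = np.linspace(0.0, 120.0, 1201)  # minutes after surfacing
dt = t[1] - t[0]
r = np.exp(-t / 30.0)              # invented risk integrand shape
integral_r = r.sum() * dt          # simple Riemann approximation of the integral

# P(DCS) = 1 - exp(-g * integral_r)  =>  g = -ln(1 - P) / integral_r  (closed form).
observed_incidence = 0.05          # invented: 5% DCS rate in the dive data
g = -np.log1p(-observed_incidence) / integral_r
print(f"analytic gain g = {g:.4e} per minute-unit of r")

# Check: plugging g back reproduces the incidence.
p = 1.0 - np.exp(-g * integral_r)
print(f"recovered P(DCS) = {p:.3f}")
```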

  11. Forum discussion on probabilistic structural analysis methods

    SciTech Connect

    Rodriguez, E.A.; Girrens, S.P.

    2000-10-01

    The use of Probabilistic Structural Analysis Methods (PSAM) has received much attention over the past several decades due in part to enhanced reliability theories, computational capabilities, and efficient algorithms. The need for this development was already present and waiting at the doorstep. Automotive design and manufacturing has been greatly enhanced because of PSAM and reliability methods, including reliability-based optimization. This demand was also present in the US Department of Energy (DOE) weapons laboratories in support of the overarching national security responsibility of maintaining the nation's nuclear stockpile in a safe and reliable state.

  12. A periodic probabilistic photonic cluster state generator

    NASA Astrophysics Data System (ADS)

    Fanto, Michael L.; Smith, A. Matthew; Alsing, Paul M.; Tison, Christopher C.; Preble, Stefan F.; Lott, Gordon E.; Osman, Joseph M.; Szep, Attila; Kim, Richard S.

    2014-10-01

    The research detailed in this paper describes a Periodic Cluster State Generator (PCSG) consisting of a monolithic integrated waveguide device that employs four-wave mixing, an array of probabilistic photon guns, single-mode sequential entanglers, and an array of controllable entangling gates between modes to create arbitrary cluster states. Utilizing the PCSG, one is able to produce a cluster state with nearest-neighbor entanglement in the form of a linear or square lattice. Cluster state resources of this type have been proven to be able to perform universal quantum computation.

  13. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model, which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category, is defined. The model includes an algorithm for lateral ride comfort limits.

  14. Probabilistic sensitivity analysis in health economics.

    PubMed

    Baio, Gianluca; Dawid, A Philip

    2015-12-01

    Health economic evaluations have recently become an important part of the clinical and medical research process and have built upon more advanced statistical decision-theoretic foundations. In some contexts, it is officially required that uncertainty about both parameters and observable variables be properly taken into account, increasingly often by means of Bayesian methods. Among these, probabilistic sensitivity analysis has assumed a predominant role. The objective of this article is to review the problem of health economic assessment from the standpoint of Bayesian statistical decision theory with particular attention to the philosophy underlying the procedures for sensitivity analysis. PMID:21930515
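
    A minimal probabilistic sensitivity analysis in this decision-theoretic spirit: sample uncertain inputs, propagate them to costs and effects, and report the probability each option maximizes net benefit across willingness-to-pay values (a cost-effectiveness acceptability curve). All inputs are invented:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
# Invented uncertain inputs for two treatments: cost (GBP) and effect (QALYs).
cost = np.stack([rng.normal(1_000, 100, n), rng.normal(1_600, 150, n)])
qaly = np.stack([rng.normal(0.70, 0.05, n), rng.normal(0.75, 0.05, n)])

for wtp in (5_000, 10_000, 20_000, 30_000):            # willingness to pay per QALY
    net_benefit = wtp * qaly - cost                    # monetary net benefit per draw
    p_best = (net_benefit.argmax(axis=0) == 1).mean()  # P(treatment 2 is optimal)
    print(f"WTP {wtp:>6}: P(treatment 2 optimal) = {p_best:.2f}")
```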

  15. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... in 16 CFR Part 698, Appendix B. Appropriate use of Model Form B-1 is deemed to comply with the... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of...

  16. 12 CFR Appendix A to Part 3 - Risk-Based Capital Guidelines

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of numerous factors, including those listed in 12 CFR 3.10. With respect to the consideration of... with the provisions of 12 CFR part 3. (2) Effective December 31, 1990, these risk-based capital... assets ratios as required by 12 CFR part 3, and should begin preparing for the implementation of...

  17. 12 CFR 390.466 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Corporation Improvement Act of 1991 (12 U.S.C. 4401-4407), or Regulation EE (12 CFR part 231). (3) If the... Administration in 13 CFR 121 pursuant to 15 U.S.C. 632. (ii) Capital requirement. Notwithstanding any other... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Risk-based capital credit...

  18. 12 CFR 390.466 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Corporation Improvement Act of 1991 (12 U.S.C. 4401-4407), or Regulation EE (12 CFR part 231). (3) If the... Administration in 13 CFR 121 pursuant to 15 U.S.C. 632. (ii) Capital requirement. Notwithstanding any other... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Risk-based capital credit...

  19. 76 FR 39885 - Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... SECURITY Coast Guard Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs) AGENCY... Drilling Units (MODUs). This policy letter announces changes to the Coast Guard's system used to prioritize inspections of foreign-flagged MODUs. DATES: This policy will become effective on July 7, 2011....

  20. 12 CFR Appendix A to Part 3 - Risk-Based Capital Guidelines

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of numerous factors, including those listed in 12 CFR 3.10. With respect to the consideration of... with the provisions of 12 CFR part 3. (2) Effective December 31, 1990, these risk-based capital... assets ratios as required by 12 CFR part 3, and should begin preparing for the implementation of...